What kind of bread should you eat?

A sceptical approach to commonly held attitudes and beliefs

I took part in a fascinating discussion at a conference a few weeks ago. The conference theme was ‘What shall we eat?’, with the parallel theme ‘How shall we grow our food?’ The speakers were excellent. One of the workshops I attended explored attitudes towards food in relation to production methods.

During the workshop, the facilitators handed around two bread samples from a white sliced loaf and an artisan loaf. The overwhelming consensus was that the artisan bread was healthier – a ‘no brainer’ as far as healthy food choices are concerned. Who with any sense would choose a white ‘factory’ bread over the artisan option? It got me thinking about people’s values regarding food. What drives our choices?

[Image: artisan bread]
Spoilt for choice

These are some of the attitudes shared in the workshop:

  • Artisan bread is more filling and satisfying
  • It is more nutritious
  • It is better for the environment
  • Buying it supports small producers
  • It contains natural, chemical-free ingredients

I felt increasingly uncomfortable. How much choice do most people really have? For me, these attitudes smacked of food snobbery. (TV presenter Gregg Wallace gives his own take on food snobbery.) Do ‘natural’ food campaigners sometimes lose sight of the overwhelming social and financial issues that restrict food and lifestyle choices? But more importantly, is artisan bread really that much better for us?

Is artisan bread more nutritious?

As a result of this discomfort, I felt compelled to check out the bread facts. To what extent is artisan bread healthier than mass-produced sliced white bread? Let’s start with the nutritional content: protein, carbohydrate, fat, energy and dietary fibre. In the cold light of day, there is actually very little difference in the macronutrient content of white bread and artisan bread.

[Image: comparison of bread nutrients]
Figures derived from http://badges.myfitnesspal.com/food/calories/93175308 and https://www.gov.uk/government/publications/composition-of-foods-integrated-dataset-cofid

So, what about the micronutrients? A scan of a range of sources reveals that the differences in the levels of thiamin, niacin, riboflavin, iron and calcium are negligible, although levels may be slightly lower in white bread.

Is artisan bread easier to digest?

In my investigation of bread and health, I found an article by Jamie Oliver. In it, he states that ‘Artisan bread is actually easier to digest, because the enzymes have had time to begin breaking down the gluten in the flour while fermenting’. This is news to me, and I want to find out more about the breakdown of gluten in the bread-making process. And what about the claim that this makes the bread more easily digestible? I need to find out!

It’s quite hard to get beyond the rhetoric about gluten. If you believe the vast majority of web chatter, gluten is a very bad thing indeed. I accept that evidence is building of adverse human responses to gluten, and there are plenty of science-based articles pointing to its dangers. Yet even articles that reference their sources sometimes misinterpret the science, and I’m looking for the bare facts.

I feel a bit more comfortable with the standard of this article written by Jake New, as it draws directly from original science and doesn’t make over-inflated health claims. Jake explains how the gluten proteins can be incompletely broken down. Some people develop an inflammatory response to the resulting polypeptides (partially digested proteins). I’m still no further in discovering whether the preparation method really does make the gluten in one bread easier to digest than another. What a pity Jamie Oliver did not give a reference for this claim! The only way I know to resolve this is to read the scientific papers, particularly any systematic reviews of the research on gluten and health.

According to a systematic review by Smith et al. (2015), gluten makes up about half the protein in wheat. Importantly, the authors noted that research into gluten digestion is conducted in test tubes, so it is not particularly straightforward to extrapolate the findings to what actually happens inside your stomach and intestines. That aside, in their experiments they found that baking markedly reduced the digestibility of the gluten proteins compared with flour. They also found that the gluten proteins in bread are hardly broken down at all by the stomach enzyme pepsin, but that the enzymes normally present in the small intestine were effective in digesting gluten.

I cannot find anything discussing the baking method with respect to gluten digestibility, although I do feel more knowledgeable about gluten composition and digestion.

This New Zealand site created by the Baking Industry Research Trust explains a number of useful facts about bread and gluten. It confirms what I thought – that fermentation involves the breakdown of starches to produce carbon dioxide gas and alcohol. It also explains that the proteins gliadin and glutenin combine during the bread-making process to form gluten, which does not otherwise exist in this form. I have found no sources explaining how fermentation might break down gluten. According to another source, the fermentation process also produces lactic and acetic acids, which add to the flavour.

To summarise, this is what I have discovered so far:

Wheat does not contain gluten – it contains two proteins, gliadin and glutenin, that combine during the bread-making process to form gluten.

The part of gluten that causes allergic responses and intolerance is gliadin, which is released when gluten is partially digested back into glutenin and gliadin.

The gluten in bread is much more difficult for human digestive enzymes to break down than the separate proteins found in flour. This seems to happen in the duodenum rather than in the stomach.

In artisan baking, the gluten undergoes repeated, slow cycles of stretching and relaxing, during which time the starches and sugars in the flour ferment into carbon dioxide, alcohol, and lactic and acetic acids. The liquid by-products contribute to the tastiness of artisan breads. It isn’t clear where the claims about gluten becoming more digestible originate.

Does artisan bread contain fewer harmful chemicals?

The UK Flour Advisory Bureau provides a helpful guide to the additions commonly made to flour. For bread, these might augment the enzymes that are already naturally present. To improve the texture and structure, manufacturers add vitamin C and sometimes the amino acid cysteine (a necessary component of our body proteins that can be made by our livers). All flours except wholegrain are also fortified with B vitamins and calcium. So far, I’m not worried about the chemicals in mass-produced bread.

But what about pesticides? This is a complex topic and one to put aside for a follow-up blog article!

What kind of bread will I eat?

I have reassured myself that as far as nutrition is concerned, the type of bread I choose makes little difference. It’s far more important to consider diet as a whole. There will be times when I want to splash out on an artisan bread as a treat, and also when I want to support my local breadmakers. What bread choices do you make and why?

Apprenticeships in England: recent trends

I’ve noticed a lot of excitement around degree level apprenticeships lately. Perhaps this is fuelled by the relentless rise in university tuition fees in England. I decided to look into some of the figures and came up with this infographic. I hope you like it!

[Infographic: apprenticeship trends]

The fragile frontiers of trust in healthcare

Have you ever been in a situation of vulnerability, perhaps as a hospital patient, in which you felt able to trust some clinical staff more than others? As I have previously studied trust in the nursing context (see http://oro.open.ac.uk/29954/), I have decided to explore whether there are any recognisable patterns at the personal level in feelings of trust or distrust. What leaps out at me more than anything else is my observation that the issue of trust doesn’t become relevant until the moment you encounter someone you feel you can’t trust entirely.

As this is a sensitive issue, I am keeping my personal sources private. And I should make clear that this blogpost, being drawn from highly personal experiences, cannot claim to be generalisable. Despite these caveats, I hope that my observations find resonance with readers.

I have tried to find useful starting points in the trust literature. The most helpful source I have found is an academic paper by Liz Bell and Anita Duffy published in 2009 in the British Journal of Nursing.

They identified the following four characteristics of trust in a nursing context:

  • Expectation of competence
  • Goodwill of others
  • Fragility/vulnerability
  • Element of risk.

I’ll organise my thoughts around this list.

Expectation of competence

It is not easy for patients to know whether the people caring for them are competent. One patient might be impressed that a care worker is wearing gloves to do finger-prick tests, and another might be very concerned to notice that the gloves are not changed between patients.

Healthcare Support Workers (HCSWs) and housekeeping staff are the people who have the most sustained contact with patients, maintaining the routine ‘servicing’ work. Between them, they make sure the beds, bodies, floors and so on are clean. They serve up food and drink. They carry out the routine patient observations of ‘vital signs’, enquire whether patients have any pain, and record food and drink intake and bowel movements. In a recent hospital stay, I was surprised that the only person who asked me how I was feeling ‘in myself’ was a doctor.

From the vantage point of a patient, it is hard to know whether or not someone is good at their job. So much of the work happens out of sight. With the divisions of labour on a hospital ward, it is difficult even to know what people’s jobs are. How do you know if someone is stepping outside the boundaries of their competence? Some members of the housekeeping staff might be very keen to engage the patients in conversation about their conditions and their anxieties, and others quietly get on with cleaning tasks. Which is more competent? Can you trust a housekeeper to handle your anxieties sensitively as much as you can a nurse?

Much of this expectation of competence is an expression of hope. Expecting competence and goodwill from those who care for you is based on hope for a good recovery and/or the best possible care at a time of need.

Goodwill of others

Do healthcare staff have the best interests of their patients at heart? Are they diligent? Do they have the personal capacity to care?

Body language

Why would a patient sense a lack of trust in someone who, as far as they can tell, is perfectly competent? Part of the answer here is that healthcare is not just about getting things done, although this is of course vital. Trust is important because people are not simply machines to be serviced and repaired. Trust also hinges on attitude. Body language could give clues.

Some people look as though everything is just too much effort. Perhaps they are preoccupied with their own tiredness, discomfort, or boredom. If they are in a caring role in these circumstances, it can be difficult for a patient to feel confident about their capacity to care.

By contrast, some people exude a kind of flamboyance clearly aimed at cheering patients and colleagues with their sunny and congenial disposition. Whilst it’s reassuring to know that people are putting positive energy into their work, patients may be left wondering whether this behaviour is masking a lack of knowledge or a fundamental lack of confidence. Most concerning – is this flamboyance a sign of over-preoccupation with ‘jollying people along’ at the expense of the more sober work of caring for people who are in a vulnerable position? Now, I suddenly realise this perceived capacity to care could be at the root of trust.

Diligence and capacity to care

I am starting to gain a better insight into the elusive nature of trust. Patients need to know that a person who has a key role to play in their care is taking a solicitous and diligent approach to their work. A telling proxy for care and diligence is hand hygiene. Who do you trust more? A flamboyant, high-octane worker who does not change gloves between patients, or a very reserved care worker with a downtrodden air about him, going quietly about his work and sanitising his hands in a well-rehearsed fashion at all the appropriate times? Perhaps the more concrete concerns about cross-infection can override the initial impact of body language.

The strange thing is, if you trust someone intuitively, you probably wouldn’t bother to monitor their hand hygiene. I remember the badges that were all the rage just a little while ago – the ones that stated ‘Clean hands? It’s OK to ask’. Whatever happened to them?

If body language sparks distrust, I know that trust is repairable. The person just needs to show that the patient’s recovery, dignity, or relief of suffering is their prime concern as they carry out their work. Patients don’t want to feel that they are inconveniencing the staff. On the whole, they want to be ‘good’ patients – not excessively demanding, grateful for the care and thankful that the NHS still exists. In return, they want the staff to see them as individuals.

Fragility/vulnerability

To encourage a trusting relationship with patients, staff need to let down their defences just a little.

#HelloMyNameIs

Imagine I’m lying in a hospital bed. You appear at my side. I’ve never seen you before, I don’t recognise your uniform, and I don’t know on what basis I can trust you. Tell me who you are when you first come to my bedside. What is your name? What is your role? Are you qualified for the tasks you are performing? Are you responsible for my care today, or are you just performing the one task? It makes a difference. Without this knowledge, I am left wondering. I don’t understand how the team fits together or how the communication is working behind the scenes. I don’t know whether I should trust you or not.

Does someone have my back?

Now I feel I’m really getting somewhere in my efforts to understand the trust or distrust felt by hospital inpatients. They want to know that someone has their back. If things aren’t getting done properly, they want to know that someone in a position of authority will notice. They need to know where the accountability lies. A clear sense of leadership is missing sometimes. Patients need to know who is in charge. Managers and leaders need to be making some visible effort to communicate with patients person to person.

If I said to a nurse that I am expecting a blood sample to be taken today and am concerned that it hasn’t happened, it is only mildly reassuring to be told that the phlebotomist will turn up at some point if a sample has been requested. I want the nurse to check. Perhaps I misunderstood what the doctor said yesterday. Taking the initiative at this one-to-one level would go a long way in restoring trust. Patients are afraid of appearing too bossy, too interfering, too self-important. They want nurses to be able to ‘read’ situations and respond appropriately.

Will you respond appropriately in an emergency?

It takes a position of vulnerability to raise your sensibilities regarding trust. If you’re not feeling vulnerable, trust isn’t important. Sudden life-threatening events expose vulnerability at its most extreme. If a patient observes a care worker not responding with sufficient urgency, either to a patient’s call for help, or a dangerous change in vital signs, any trust they have in that worker is likely to crumble.

Element of risk

There is an element of risk on both sides of a relationship based on trust. From a worker’s point of view, letting down one’s defences could open the way to overfamiliarity. Worse, giving away ‘too much information’ could lead to misinterpretation on the part of a patient. Healthcare staff are constantly aware of the risk of litigation. Taking time to follow up the concerns of one patient will have to be balanced with competing priorities – something has to give.

From the viewpoint of a patient, their very vulnerability adds an element of risk. For example, they have to trust nurses to administer medications correctly. Any routine medications they could normally manage themselves are now in the hands of someone else. Hospital-acquired infections loom large as constant threats. Invasive treatments such as surgery or the insertion of needles, cannulas and tubes carry risk. Sick patients are at risk of developing pressure ulcers. I could go on.

Patients want to trust

I’ll conclude by drawing together the strands I have woven here. Patients want to trust healthcare staff. It goes with the hope they hold at a time of extreme need: hoping for recovery, or simply for tender loving care. Body language gives important signals, but perhaps the more concrete demonstrations of diligence and capacity to care hold sway. The vulnerability of patients can make them look for signs that someone has their back. They need to know that there is a trustworthy figure who is accountable. They need to see that staff respond appropriately in an emergency. Healthcare is a risky business, and risk is present on both sides. Everyone, including patients, needs to pull together to minimise these risks. I hope the steady, caring, trustworthy people get the recognition they deserve.

Prototyping for eHealth

I have never used eHealth apps myself. However, I am in the middle of a course on FutureLearn that is all about developing eHealth. This means I am able to participate in discussions with people who do have experience of using eHealth apps. The course is great for me. I’ve always been curious about the processes involved in developing apps for mobile phones, and now is my chance to have a go.

Currently, I am at the stage of designing a ‘lo-fi’ prototype for an eHealth app. Prototypes of this sort don’t need to be technical in any way. In fact, we are being encouraged to make them on paper or on a computer screen. At the same time as taking this course, I am also working on a literature review about diabetes prevention. So what better topic for my first design than a Type 2 diabetes prevention app?

Developing the concept

Actually, I decided it might be more realistic to create a diabetes manager app. At least there is a clear target audience. After all, I am finding from the literature review that it can be very challenging to engage people to look after their health “just in case” they develop a preventable disease.

So far, I am discovering that an important key to a good design is to make the app visually appealing. The information needs to be concise, and people need to be enticed in with simple images, videos and clear navigation. Underneath all of that, the needs and requirements of the target audience need to be established. People need a very good reason to use it.

Understanding user requirements

Why would someone with diabetes want an app? I can only guess that people who use apps are already committed in some way. They are motivated to take good care of themselves and their condition. And, they are probably the kind of people who are interested in making technology work for them. I will have to leave to another day the problem of how to reach people who do not fit into these two categories.

I’m interested in how healthcare professionals and service users can work together to manage long-term conditions. There’s huge scope for technology to play a supporting role here. I like the idea of technology levelling the playing field between clinician and patient. People need to be empowered to take control.

For instance, when you attend a consultation with a clinician, they will have all the data about you available at their fingertips. But you don’t have the data in front of you in the same way. As a patient, you are aware of the time pressures, and you’ve been thinking for weeks about all the things you wanted to talk about. If you’re anything like me, you come armed with a set of questions, but don’t ask them all, because as you get to the end of the list it all starts to feel a little trivial. Perhaps those last two questions aren’t really very important. You feel you are already taking up too much of their time. That’s my reality, anyway.

When I first started to design my diabetes manager app, I had these experiences in mind. I was also considering a case study in which a clinician had to input a lot of patient self-report data by hand during a consultation. The challenge was to design an eHealth solution that would solve some of the problems for both clinician and patient.

Prototyping

At first, I felt completely overwhelmed by the task. But then I thought I should really just have a go. I tried to analyse the situation from both perspectives. I tried to apply the design principles. I also decided to have a go at using Balsamiq software.

Balsamiq is a software environment that allows you to do rapid mockups. It claims to reproduce the experience of sketching on a whiteboard. I always find a little bit of technology helps me to be creative and overcome the inertia of getting started on something that feels difficult.

Imagine Ravi, who has type 2 diabetes. He’s really keen to manage it better, and would like to have a better relationship with the diabetes nurse specialist whom he sees about once every couple of months.

So I thought: what if Ravi had a really clever glucose meter that transmitted data to a phone app, and the app transmitted this data to a central patient record? Then I thought: what if everything else, like the exercise diary, food diary and weight measurements, could also be sent by the app to the central record? Ravi could agree with the nurse specialist the kinds of things to keep a record of.
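To make that data flow a little more concrete, here is a minimal sketch in Python of how the app might structure what it collects and forwards. Everything in it is my own invention for illustration – the names (GlucoseReading, DiaryEntry, sync_to_central_record), the fields and the example values are hypothetical, and nothing is drawn from a real patient record system or from Balsamiq.

```python
from dataclasses import dataclass, asdict
from datetime import datetime
from typing import List, Optional

# Hypothetical records the app might capture locally before forwarding
# them to a central patient record. Names, fields and values are illustrative only.

@dataclass
class GlucoseReading:
    taken_at: datetime
    mmol_per_litre: float           # value received from the 'clever' glucose meter

@dataclass
class DiaryEntry:
    recorded_at: datetime
    category: str                   # e.g. "exercise", "food", "weight"
    detail: str                     # free text agreed with the nurse specialist
    value: Optional[float] = None   # optional number, e.g. weight in kg

def sync_to_central_record(patient_id: str,
                           readings: List[GlucoseReading],
                           entries: List[DiaryEntry]) -> dict:
    """Bundle everything the app has logged into a single payload.

    In a real system this payload would be sent securely to the central
    record; here it is simply returned so it can be inspected.
    """
    return {
        "patient_id": patient_id,
        "glucose_readings": [asdict(r) for r in readings],
        "diary_entries": [asdict(e) for e in entries],
    }

# Example: one meter reading and one food diary entry for Ravi (invented values)
payload = sync_to_central_record(
    "ravi-001",
    [GlucoseReading(datetime(2017, 3, 1, 8, 0), 6.2)],
    [DiaryEntry(datetime(2017, 3, 1, 13, 0), "food", "lentil soup and a wholemeal roll")],
)
print(payload["glucose_readings"][0]["mmol_per_litre"])
```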

I started to sketch out the home screen of the app. I focused on using photographs for visual appeal and interest, and kept the text on this screen to a minimum. Alongside the exercise diary and food diary, I decided that Ravi might want some information as an aide-memoire of exercises or types of physical activity he can do, as well as the kinds of foods he should be encouraged to eat or avoid.

Here is a screenshot of the prototype I came up with after about an hour of fiddling around with Balsamiq. This, of course, is only the first step. Each link on the homepage opens up a range of possibilities. How will the exercise and food tips be presented? Should I use video, and how much? How can I produce a blood glucose graph? And so on.

[Image: screenshot of the diabetes manager app prototype]
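On the question of how a blood glucose graph might be produced, here is a first thought as a minimal sketch using Python and matplotlib. It is not part of the Balsamiq prototype, and the readings and the shaded ‘target band’ are invented placeholders purely for illustration.

```python
from datetime import datetime
import matplotlib.pyplot as plt

# Invented readings for one day, purely for illustration
times = [datetime(2017, 3, 1, h) for h in (7, 10, 13, 16, 19, 22)]
glucose_mmol_l = [5.8, 7.4, 6.1, 8.2, 6.9, 6.0]

plt.figure(figsize=(6, 3))
plt.plot(times, glucose_mmol_l, marker="o")
plt.axhspan(4.0, 7.0, alpha=0.2, label="illustrative target band")  # placeholder range, not clinical advice
plt.ylabel("Blood glucose (mmol/L)")
plt.xlabel("Time of day")
plt.title("One day of readings (illustrative data)")
plt.legend()
plt.tight_layout()
plt.show()
```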

Learning with others and making connections

Learning as part of a large group has its benefits here. I’m able to pick up ideas from my peers. For example, wouldn’t it be great to incorporate reminders to exercise, measure blood glucose, or take medication? One of my fellow students included mood snapshots – how am I feeling now? What a great idea! Sending alerts to the clinician if things are going a bit haywire would also be very useful.

Then I remembered from my work on the literature review that waist measurement can be very significant in people with diabetes. Again, someone on the course had included this in their app. Aiming to reduce your waist measurement can have a huge impact on your metabolism. Controlling weight and waist size can even mean that a person no longer needs further interventions to control their diabetes.

Once I have finished work on the literature review, and once I have moved on to usability testing on the course, I may have more to say. If you would like some help to manage your diabetes, would you use this sort of thing? What apps are you already using?

Blisters and spoons

How dealing with severe hand eczema is a bit like juggling the ‘spoons’ in spoon theory.

I’m the kind of person who is always in a rush. I have a certain number of things to get done in a day, and I dart between one thing and the next. I hate it when I have to stop to search for a piece of essential equipment. I’m always racing against the clock. I squeeze as much as I can into a day.

All this changed about four years ago. Something happened that meant I had to slow right down. It wasn’t multiple sclerosis or chronic fatigue syndrome, lupus or cancer. It wasn’t any of those high-profile diseases that are well known for making people reappraise their whole lives. It was eczema.

I know that a lot of people have eczema. And that many parents are grappling right now with children who are terribly debilitated by it. I have had eczema in different forms all through my life, although there have been periods when it hasn’t really bothered me very much. Four years ago, the eczema appeared on my hands. This wasn’t the first time I had had eczema on my hands, but this time it was different.

[Image: itchy palms]

Whereas before the eczema was on the sides of my fingers, on the backs of my hands, knuckles, and on my wrists, this time it appeared on the palms of my hands and on the surfaces of my fingers that I use to touch things. That changed everything. Suddenly, I experienced pain and soreness every time I touched anything. The inevitable itching that accompanied the eczema was all the more intense for being located on the parts of my body that were particularly dense with nerve endings. It seemed it could only get worse. Dense clusters of itchy blisters eventually gave way to peeling skin. Skin peeled and peeled, eventually leaving a fragile and terribly thin layer of parchment-like skin.

Now I lived in a very different world. I started wearing cotton gloves to protect my hands. It wasn’t long before I started turning the gloves inside out so that the seams pointed outwards; I couldn’t even bear to have the seams pressing against my skin. When I needed to touch something wet, I had a choice: I could either remove the gloves, get my hands wet, dry them and put the gloves back on, or I could don waterproof gloves over the top so that I didn’t have to remove the cotton gloves. As my skin flaked and peeled, the razor-sharp edges would snag on the fibres and made it increasingly difficult to keep taking the cotton gloves on and off, so increasingly I opted to wear waterproof gloves over the top whenever I wanted to do anything wet. Wet hands were bad news anyway, as my skin always became more sensitive after getting wet.

Working in the kitchen was the most trying. Peeling and chopping vegetables was a challenge. Carrots and potatoes slipped through my fingers and sometimes shot across the kitchen. My disposable vinyl gloves usually managed to get nicked or torn. This meant my cotton gloves would get wet, so I would have to change into a clean, dry pair before I could continue. Forging ahead with wet gloves usually stored up trouble for later, as my skin would never forgive me for leaving it wet for any length of time. Wearing waterproof gloves did not guarantee dry hands, even if they didn’t leak: eventually perspiration would build up inside them, which meant my hands would get damp after a certain length of time anyway.

I entered the world of having to make a decision every time I began a new task. How much do I fear damp hands at this moment? Which gloves should I put on? What can I achieve before my hands start to sweat inside these gloves? Cotton gloves alone were also pretty useless for more heavy-duty work such as going shopping and driving a car, not to mention gardening. Cotton gloves are not designed for such wear and tear. They rub against the surfaces of your hand, get grubby very quickly, and look rather shabby. And of course they are not waterproof which means that going out in the rain with them makes me feel anxious and a bit silly.

I managed to find some variations on gloves by searching the internet. I invested in cycle gloves and hockey gloves that I could wear over the top of my cotton gloves. They offered firmer support and some grip, as well as being tough enough to manage driving a car, handling shopping bags and trolleys, freezer goods, tins, packets, and so on. Since those early days of my adventures with gloves, I have accumulated quite a store of them. I have a whole drawer in my bedroom dedicated to gloves. Next to my skin I now wear cotton, viscose, silk, and a whole range of highly engineered moisture-wicking fabrics. I have discovered external coatings on gloves that allow me to grip the credit cards in my purse. Some gloves will allow me to use my smartphone; others won’t.

[Image: using a smartphone camera while wearing gloves]

My main motivation for writing this blog post was not so much to discuss gloves, but to talk about spoon theory. Spoon theory is a neat metaphor for expressing the amount of energy you have to get through a day – the possibilities available to you. You start the day with a certain number of spoons, and once you have spent them all you are out of choices. I feel very much like that when my hands are bad. I can only achieve so much in one day. If I do housework, I can’t do the garden. I can’t go shopping AND prepare a meal. If I hang out the washing, get it in again later and put it away, I might not be able to read a book that evening. I might not put the washing away the same day, though, as it’s difficult for me to work out whether or not it is dry. Depending on where the blisters are on my fingers, reading a book might be off the agenda anyway.
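For anyone who likes the idea spelled out, here is a toy sketch of spoon theory in Python. The daily budget and the ‘cost’ of each task are numbers I have made up purely to show how quickly the choices run out – they are not measurements of anything.

```python
# A toy model of spoon theory: a fixed daily budget of 'spoons',
# each task costs some of them, and once they are gone, they are gone.
# All numbers are invented for illustration.

DAILY_SPOONS = 10

task_costs = {
    "housework": 6,
    "gardening": 6,
    "shopping": 5,
    "prepare a meal": 6,
    "deal with the washing": 3,
    "read a book in the evening": 2,
}

def plan_day(tasks, budget=DAILY_SPOONS):
    """Attempt tasks in order, reporting which ones fit today's spoon budget."""
    remaining = budget
    for task in tasks:
        cost = task_costs[task]
        if cost <= remaining:
            remaining -= cost
            print(f"Did '{task}' ({cost} spoons), {remaining} left")
        else:
            print(f"No spoons left for '{task}' (needs {cost}, only {remaining} left)")
    return remaining

# Shopping uses up half the budget, so preparing a meal no longer fits:
plan_day(["shopping", "prepare a meal", "read a book in the evening"])
```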

Just by seeing the links between my own situation and spoon theory, I feel connected to a wider community. However, I can’t help feeling a bit like an impostor. I haven’t seen anyone else relate hand eczema to spoon theory. How can I think that what I have is equivalent to some much more serious chronic diseases? Nevertheless, I’m starting to see how it can be a useful way of explaining to other people the wide ranging impact of having severe hand eczema.

Related

The Spoon Theory by Christine Miserandino http://www.butyoudontlooksick.com/articles/written-by-christine/the-spoon-theory/

What is the Spoon Theory? http://thespoontheory.tumblr.com/post/44757754831/faq

Severe hand eczema: Major new clinical trial compares treatments “head to head”. http://www.bbc.co.uk/programmes/p03tqv2c

Ten layers of expertise in developing distance learning

[Image: mountains, woodland and scrub]

Well-written and well-designed distance learning materials aim to provide learners with an interesting and seamless experience. If they are composed well, they seem to capture the audience effortlessly. Slips in coherence, even small ones, can draw attention to the material itself, rather than keep the learner engaged and absorbed in the narrative.

A major problem for authors, especially those new to this genre, is that it looks much easier than it actually is. Even for seasoned Open University academics, distance-learning materials undergo a multitude of iterations. Here, I explain about ten layers of expertise involved in developing these materials.

1. Curriculum knowledge
2. Knowledge of academic levels
3. Knowledge of assessment
4. Knowledge of the audience and all stakeholders
5. Subject knowledge
6. Skills knowledge
7. Pedagogical knowledge
8. Design knowledge
9. Knowledge about online and print formats
10. Knowledge about correct use of written English

1. Curriculum knowledge

As a consultant, one of the first things I do is to locate a set of learning outcomes, along with a broad outline of the intended subject content and skills development. It also helps if the assessment points are mapped against the intended content and skills. I like to get my ducks lined up ready! Seeing how a course or module fits within the broader offering of a qualification is also a helpful grounding exercise. What has already been taught, and what is to come next?

2. Knowledge of academic levels

The academic level of a course will guide the demands placed upon learners. Elements such as the complexity of ideas, the length and difficulty of readings, expectations for originality of learning outputs, ability to collaborate, maturity of approach and learner autonomy will all depend on the level. Gaining this knowledge requires reference to national frameworks of academic levels.

3. Knowledge of assessment

Assessment has close links with curriculum and levels. Are you applying the right method of assessment for the desired learning outcome? Distance learning can mean some limitations on the mode of assessment – giving a presentation may not be practical, for example. Skills in writing assessment can range from essay questions and guidance to interactive computer-marked assessments, project assignments, and exam questions. Knowledge of how to minimise plagiarism is also important here.

4. Knowledge of the audience and all stakeholders

Who are your students? Are they going to be putting their new knowledge into practice straight away? Will they be supported by a tutor? How much time will the tutor contract involve? Who else has stakes in your course? For this last question, consider professional groups and employers, and even service users. Think globally if appropriate. Consider asking for feedback from all potential stakeholders at the development stage.

5. Subject knowledge

Many academic authors focus on their subject expertise, and so they should. Finding ways of presenting your expert knowledge to ‘invisible’ learners can be a great challenge. All of the content needs to be relevant to the wider narrative of a course, so paying attention to the sequence is vitally important. You also need to ensure that you support any claims you make with robust evidence – after all, you expect your students to do this!

6. Skills knowledge

How are your students going to engage with the subject? What skills do they need? How can you help them develop practical and academic skills? Thinking about these questions can guide your writing and assist you in constructing your students’ learning activity. All too often, academics overlook the skills required for reading academic texts, navigating online databases, using search engines, engaging with audio-visual resources, engaging in social media, and writing in one’s own words.

7. Pedagogical knowledge

How can you teach an invisible audience who will be interacting with the learning materials at some point in the future? How can you engage their curiosity? The magic of learning occurs at the intersection between the distance learning materials and the engaged learner. If you get all the ingredients right, students will follow your instructions – read/watch/discuss/find/make notes – and bring their own life experience to bear on making this process and its outputs meaningful.

8. Design knowledge

Bringing together all the elements of a distance-learning course can require complex design skills. Visual design (logos, icons, images, page layout, online navigation) and learning design (e.g. combining assimilative, productive, cooperative and interactive activities) are both important here. Good assessment design is also crucial for helping learners make sense of the learning journey.

9. Knowledge about online and print formats

You need to adapt your approach, depending on whether you are writing for print or writing for the computer or smartphone screen. Attention spans will be shorter on-screen, so limit long expanses of text to print resources. Of course, the boundaries are blurring with the burgeoning of tablets designed for reading books and long documents. Methods of annotating have also expanded with the introduction of new technologies. Keep up to date! In online formats, the reader can easily follow an embedded hyperlink, whereas in print it will be difficult for the reader to move seamlessly to a website. If your online materials are easy to update, make sure that any time-sensitive content is presented in this format rather than in print.

10. Knowledge about correct use of written English

You want your learners to develop good writing habits, so make sure you are modelling a good example. Paying attention to clear, plain language will also mean that learners are more likely to understand what you are saying, and reduce the risk of excluding people who are non-native English speakers. Correct use of English is also about ensuring you use non-discriminatory language. The input of skilled editors is invaluable! If you are writing your materials in any other language, the principles still apply!

No wonder great quality distance learning is highly valued and sometimes envied. It is not something to be rattled out over a weekend!

Unravelling digital health literacy

When I come across the phrase ‘Digital health literacy’, I feel uneasy. There seems to be ambiguity here and I often wonder if I’m the only one who senses it.

Finally, I have decided to explore whether there is a distinction between the three potential meanings of the phrase:

  1. Is it about ‘digital information literacy’ in the world of health?
  2. Or perhaps it is about health literacy in the digital age (digital ‘health literacy’)?
  3. Or is it about people’s ability to engage with digital health (‘digital health’ literacy)?

Or are they all the same phenomenon? To understand the knot I’ve created for myself, I’ll begin by disentangling these terms.

What is literacy?

[Image: literacy word cloud]
Image source: http://dpcdsb-literacy.wikispaces.com/

‘Literacy’ is a widely-used term, especially amongst educators. From nurseries to universities, literacy looms as a set of skills and competencies that need mastering.

UNESCO recognises the complexity of meanings the word represents. At a basic level, literacy is about simply being able to read and write. But it is also about being educated and knowledgeable, including knowing how to access, engage with, and share knowledge. Amongst academics, the main buzzword is ‘information literacy’. In this context, people develop the skills to interpret information sources, making ‘informed judgments’. They also learn how to produce information in their own right. Armed with these skills, people are empowered to make critical decisions about key aspects of their lives, including their own health.

Temporarily discarding the ‘digital’ label, this seems a good point at which to consider ‘health literacy’. According to the World Health Organisation (2015), health literacy refers to ‘the cognitive and social skills which determine the motivation and ability of individuals to gain access to, understand and use information in ways which promote and maintain good health.’ This definition of health literacy is very close to the idea of information literacy (as defined above), applied to the health domain.

And so, to digital ‘health literacy’. The European Commission’s definition of ‘digital health literacy’ looks very much like a ‘digital’ extension of ‘health literacy’: ‘the ability to seek, find, understand and appraise health information from electronic sources and apply the knowledge gained to addressing or solving a health problem’. A scan of publicly available literature on digital health literacy reveals a common, almost exclusive, focus on using the internet to search for health-related information. This observation is unnerving because it seems to be side-stepping the elephant in my room, which is the one obsessed with digital health.

[Image: elephant in the room labelled ‘digital health’]

Digital health involves the application of digital tools (e.g. smartphones, wireless sensors, apps, and social networking) to monitor and help to maintain health. It draws on advances in genomics and mobile technologies to individualise healthcare interventions as well as to understand population health. Self-monitoring becomes a central feature of digital health tools.

In this world of digital health, consumers have access to raw data about their own bodies and they need to develop a new set of literacies around reading and interpreting this information. I was encouraged to see that a summary of ‘digital ambitions’ for healthcare in Wales included developing capability in both staff and patients to engage with smartphones and wearable devices, as well as online records. Engaging with digital health helps to meet the ambitions of providing increasingly personalised care.

This wider application of the term ‘digital health literacy’ can encompass a diverse range of electronic information sources. Greater use of technology-based health tools would open the way to the Internet of Things, as well as to the pre-digested information found on the internet. The burgeoning of digital health technologies is no less a challenge for healthcare staff than it is for the general public.

Returning to the puzzle I set for myself, it seems that the first two meanings are practically the same. Most common understandings of ‘digital health literacy’ are about digital ‘health information literacy’, where the focus is mainly on the ability to engage with health-related texts on the internet. This is different to the third meaning, in which the focus is on ‘digital health’ interventions. I would like to see ‘digital health literacy’ represent the broadest range of digital tools, data and information, to keep pace with advances in technology.