Gloves that are meant for wearing

I have to admit, I’m obsessed with gloves. I don’t leave the house without a pair, I have a drawer by the bed dedicated to them, and they can be found scattered all around our home. The washing line regularly sports a glove or two… or eight… or twelve.

It all started with a GP consultation for hand eczema, in which the doctor suggested that cotton gloves would be useful to protect my hands when they were very sore. I immediately had visions of a pair of cotton gloves I thought I’d seen on the shelf in the local Boots store. I went off confidently to equip myself. After an extended search of the shelves, I eventually purchased two pairs of cotton gloves – one to wear, and one to wash. Each pair of gloves came in a little cardboard box. The gloves were wide and baggy, and did not cover my wrists. If I wore them underneath washing up gloves, they would end up stuck inside the outer gloves when I removed them. Once a glove became wet, it was useless. Two pairs weren’t enough, of course, and I could see it would become expensive to buy gloves one pair at a time.

The search for cotton gloves

I decided that cotton gloves could be helpful if only I could find some that covered my wrists and fitted better. Surely that wasn’t too difficult? After all, there must be thousands of people like me who need to wear cotton gloves to protect their hands. I tried every shop in town that had the faintest chance of selling gloves of any kind. I failed dismally. At that point, I turned to the internet. After many hours of research, my first online purchase was a batch of 10 ‘cotton fourchette’ gloves at 50p each, similar to the ones in this image:

Cotton gloves, one turned inside out

They were fine, sort of. My sore hands made me increasingly reliant on the gloves. I wore them for driving, shopping, handling most objects, and using my computer. It was okay, but they slipped around on my hands when driving and quickly became grubby when doing anything. It was very challenging to get my credit card out of my purse; touchscreens on ATMs, for example, were impossible to use. My ability to type with any accuracy took a nosedive. I could remove my gloves, of course, when using a touch screen or handling cards, but then I was exposing my sore fingers to painful experiences, and leaving a trail of Vaseline.

Then, one terrible morning, the day of a really important meeting I was chairing, I got up to find that two of my fingers were weeping sticky tissue fluid. In my panic to get to work, I donned a pair of cotton gloves and found a pair of cycling gloves that had been meant as a present the previous Christmas. My fingers were so sore and drippy there was no way I could actually touch anything, even with the cotton gloves. I squeezed a cycling glove over the worst hand and, feeling protected, I drove to work. At home at the end of the day, I was still wearing the cycling glove on the badly affected hand. Fluid had soaked through into a large damp patch. I eased off the outer glove and discovered that the cotton glove was completely stuck to my fingers. Even worse, loose threads from the seams were embedded in my raw flesh. After that day, I always wore my cotton gloves inside out.

Wearing cotton gloves is never a good look, unless you are a museum curator. Worse still is cotton gloves with the seams hanging out. Nevertheless, I championed this look for about two years at work. For a while, I tried cotton stockinette gloves, after buying a pack of 24. However, they turned out to be uncomfortable: where the edges were sewn together, the fabric was less stretchy and tended to dig in. Once my fingers began poking through the ends, I gladly ditched each glove.

Outer layers are important too

There followed a frenzy of internet glove shopping: pure silk liner gloves, cycle gloves, hockey gloves, assembly grip gloves, leather gloves. I needed a soft absorbent surface next to my skin and a tough outer layer that would allow me to feel protected while generally negotiating the wider world. Preferably, the outer layer would also provide sufficient grip and dexterity to allow me to handle money and cards without removing them and risking the aggravation to my skin caused by constant whisking on and off. It was also helpful if the outer glove was resistant to rain. Going out in the rain wearing cotton gloves is, frankly, absurd.

For wet work in the kitchen, decorating and some gardening jobs (and being allergic to rubber), I purchased several boxes of vinyl gloves from a company that mainly sells to businesses. Buying in bulk was much cheaper than picking up the small packages available in supermarkets. I learnt that powder-free is best, as long as I wear them over a liner. The downside of disposable vinyl gloves is that they are easily nicked and split when chopping or peeling vegetables. Also, be prepared for onions and potatoes to fly out of your hands at any moment! Any waterproof glove causes a build-up of moisture against the skin, so it’s best to limit wear to short periods.

Vinyl glove worn over a Skinnies glove for wet work

I was starting to feel better equipped with the outer layers. However, I was increasingly disillusioned with the cotton gloves. Even if they were long enough to cover wrists, in practice they rode or curled up. As soon as they became damp (e.g. when gardening), they held the moisture against my skin. I was also desperate to find gloves without seams. Not much to ask, surely? As the eczema had settled on my fingertips, I needed to solve the problem of typing, and seams hanging out hampered me greatly. Also, at night the outward facing seams were very good for scratching and damaging all the other itchy skin. And wouldn’t it be good if I could use touchscreens with my gloves on!

Using touchscreens

Hours … and days of internet research later, I received an assortment of packages through the post. The photos below of just a few of the gloves show the outcome of months of hard wear!

I was particularly excited by two of my acquisitions, which would make it possible for me to use a touchscreen. First up, Thermocool seamless gloves with silver filament. They were amazing to use with a touchscreen, except that on occasion I couldn’t achieve quite enough precision with my smartphone. With wear, the silver filaments began fluffing up, which further reduced their effectiveness. As the gloves wore further, the tiny silver threads began to irritate the sensitive skin on my fingers.

Well-worn Thermocool gloves with silver filament

The second pair I could use with a touchscreen were these coffee-coloured ones, which had conductive stitching on the thumbs and first fingers. They were a cotton and modal blend, and very soft. I wore them all summer to keep the sun off and enable me to touch and handle things. The seams were relatively unobtrusive.

Touchscreen-enabled Summer gloves

Moisture wicking

A moisture-wicking fabric without seams is something I now consider essential for a base layer, although there is very little choice when purchasing seamless gloves. Moisture wicking occurs when moisture is drawn away from the skin by capillary action. A useful feature of many synthetic fibres and fabrics designed for wicking is that they tend not to cling to the skin or feel soggy, as cotton does. Cotton can absorb a lot of water, but the water tends to remain held against the skin. This can be bad news for eczema sufferers like me whose eczema is triggered or worsened by soggy skin. I wish dermatologists weren’t quite so evangelical about cotton.

My current situation is that I have discovered Skinnies gloves, and favour them for being seamless and having good moisture-wicking performance. If I wear them underneath my gardening gloves, for instance, my hands stay noticeably drier for longer than with cotton. Skinnies are made of 86% viscose, 11% nylon and 3% elastane. They cover my wrists well. I can tuck my sleeves into them if I am gardening and want to keep irritants and the sun off my arms. They are sufficiently unobtrusive to wear for typing for short periods, although not perfect by any stretch of the imagination. I can’t use a touchscreen when wearing them, but I do now own a touchscreen stylus pen.

‘Skinnies’ gloves

When out and about, my other mainstay is Foxgloves. These are great for rummaging around in a handbag, removing cash and cards from my purse, and doing most things normal people take for granted in their everyday lives. Made from Supplex® nylon and elastane, they last and last, and wash well. Designed by a gardener who had hand eczema, they are sturdy and keep a lot of particles and the sun off the skin. At the same time, they are soft and breathable. The seams are bearable most of the time, and when my fingers are particularly sore, I wear my Foxgloves over the top of Skinnies. I have a black pair, which comes in useful for funerals and other formal occasions. I really couldn’t function without my Foxgloves.


I have travelled a long and difficult journey in my efforts to live with often debilitating hand eczema. I would like dermatologists to wake up to the fact that there is a range of novel fabrics available as alternatives to cotton. Cotton is not the answer to everything.


Histamine – the enemy within for allergy sufferers


You probably know someone who suffers from allergies that result in hayfever, asthma or urticaria. It could be you. The recent heatwave in the UK has caused huge discomfort for many such allergy sufferers. A period of rain allowing lush grass growth, followed by the heat, has given rise to a perfect storm of airborne pollen. Antihistamines are the main pharmaceutical remedy for allergies, and I imagine they are flying off the supermarket shelves as I write. Cetirizine, loratadine and acrivastine are types of antihistamine readily available in the UK.

But have you ever wondered why ‘anti-histamines’? Why is histamine such bad news for allergy sufferers, and why do our bodies produce it if it wreaks such havoc? Well, actually, our bodies need histamine for normal functioning. Histamine plays an essential role in protecting the skin, airways and digestive tract from invaders such as parasites, bacteria and viruses. It also acts as a chemical messenger in the brain, helping to keep us alert. Additionally, histamine helps to stimulate the release of stomach acid. Histamine’s role in fighting the invasion of harmful organisms and initiating tissue responses to damage is what affects so many allergy sufferers.

Airborne grass pollen

Mediator of protection and healing

If you cut your finger or inhale a virus into your air passages, your body needs to supply extra blood to those areas to bring in the necessary ‘weapons’ and healing substances. Release of histamine, which is stored in ‘mast cells’ in the skin and mucous membranes, makes the tiny blood vessels dilate, allowing more blood into the area. The walls of the tiniest blood vessels, the capillaries, become more permeable to allow protective white blood cells and fluid to pass into the tissues. They become ‘leaky’. This leakiness means that tissues swell with extra fluid and blood cells. In the skin, this appears as swelling and redness. In the airways, there is congestion and watery mucus appears. Itching and sneezing can occur from stimulation of nerve endings.

Histamine is a small molecule made of three nitrogen, five carbon and nine hydrogen atoms.
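As an aside (my own illustration, not from any source above), you can turn those atom counts into a molar mass with a few lines of Python, which shows just how tiny histamine is compared with, say, an antibody protein of around 150,000 g/mol:

```python
# Back-of-the-envelope check: histamine's molar mass from its atom counts.
# Standard atomic masses in g/mol (approximate values).
ATOMIC_MASS = {"C": 12.011, "H": 1.008, "N": 14.007}

def molar_mass(formula):
    """Sum each element's atomic mass multiplied by its atom count."""
    return sum(ATOMIC_MASS[element] * count for element, count in formula.items())

# Histamine: five carbon, nine hydrogen and three nitrogen atoms (C5H9N3).
histamine = {"C": 5, "H": 9, "N": 3}
print(round(molar_mass(histamine), 2))  # 111.15 (g/mol)
```

Small molecules like this pass through capillary walls far more readily than large proteins do, which is part of why histamine release has such rapid, widespread effects.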

Harmless substances taken for pathogens

When someone has an allergic reaction, their body interprets harmless substances such as pollen as being pathogenic. At some point in the past, their immune system has become sensitised. The hygiene hypothesis proposes that children in the developed world have insufficient exposure to a wide range of harmful organisms, and a relatively ‘idle’ immune system targets harmless substances instead. Allergic reactions are most closely aligned with the mechanisms for fighting parasitic invasions, in which there is a massive release of histamine in the affected tissues.

In allergic individuals, then, histamine feels like the enemy within. The itching, soreness, sneezing, coughing and wheezing evoked by histamine release cause extensive misery, and in extreme cases an allergic reaction can be fatal.


If you have a severe allergy to wasp or bee stings, certain medications, or certain foods, exposure may cause anaphylaxis. Overwhelming levels of histamine are released into many parts of the body within a very short time. The skin breaks out in urticaria, the mouth and throat swell, and the airways constrict. Rapid leakage of fluid from the capillaries into these body tissues can cause a fall in blood pressure, leading to fainting. All these effects combine into anaphylactic shock, which is life-threatening. The main lifesaver is an injection of adrenaline, which counteracts the most damaging effects of histamine. You may be interested in this informative video featuring the Anaphylaxis Campaign’s Professor John Warner OBE:

Mast cells

A discussion of hayfever feels quite mundane after contemplating anaphylaxis! It is worth taking a closer look at the mast cells before a final thought about antihistamines. Mast cells are made receptive to allergens (which act as antigens – something that triggers an immune response) by antibodies attaching themselves to the cell. Once triggered, the mast cell releases histamine into the surrounding tissues. Most of the time, this is a local response in the air passages or a certain place on the skin. In anaphylaxis, mast cells are triggered on a large scale.

A mast cell releasing histamine (diagrammatic)

The blood-brain barrier

You remember I mentioned that histamine acts on the brain to help us stay alert? Presumably, then, antihistamines interfere with this? I remember as a child feeling privileged that my GP took the trouble to explain to me why the antihistamines I was taking made me feel sleepy. Luckily for those hayfever sufferers who drive, operate machinery or sit exams, pharmacologists have developed modern antihistamines that minimise the effects on the brain. They have taken advantage of something known as the ‘blood-brain barrier’, identified in 1913. Modern antihistamines are barely able to pass from the blood into the brain through the blood-brain barrier, and therefore have a negligible effect on a person’s wakefulness.

It’s quite a simple idea to grasp, although very complicated in detail, and still the subject of intense research. In short, the blood-brain barrier is a unique physical and chemical structure in the body that keeps the majority of the substances carried by the blood out of the brain. It will allow, for example, glucose, oxygen, amino acids, hormones and anaesthetics to pass through, but not antibodies, toxins and bacteria. Although alcohol can be considered a toxin, it is small enough to slip through the blood-brain barrier. If scientists could prevent alcohol passing into our brains, I wonder if this would increase or decrease our consumption?

I wish all you allergy sufferers some respite from the sneezing, itching and wheezing this summer!

A grass head laden with pollen

What kind of bread should you eat?

A sceptical approach to commonly held attitudes and beliefs

I took part in a fascinating discussion at a conference a few weeks ago. The conference theme was ‘What shall we eat?’ with the parallel theme ‘how shall we grow our food?’ The speakers were excellent. One of the workshops I attended was exploring attitudes towards food, in relation to production methods.

During the workshop, the facilitators handed around two bread samples from a white sliced loaf and an artisan loaf. The overwhelming consensus was that the artisan bread was healthier – a ‘no brainer’ as far as healthy food choices are concerned. Who with any sense would choose a white ‘factory’ bread over the artisan option? It got me thinking about people’s values regarding food. What drives our choices?

Spoilt for choice

These are some of the attitudes shared in the workshop:

  • Artisan bread is more filling and satisfying
  • It is more nutritious
  • It is better for the environment
  • Buying it supports small producers
  • It contains natural, chemical-free ingredients

I felt increasingly uncomfortable. How much choice do most people really have? For me, these attitudes smacked of food snobbery. (TV presenter Gregg Wallace gives his own take on food snobbery.) Do ‘natural’ food campaigners sometimes lose sight of the overwhelming social and financial issues that restrict food and lifestyle choices? But more importantly, is artisan bread really that much better for us?

Is artisan bread more nutritious?

As a result of this discomfort, I felt compelled to check out the bread facts. To what extent is artisan bread healthier than mass-produced sliced white bread? Let’s start with the nutritional content: protein, carbohydrate, fat, energy and dietary fibre. In the cold light of day, there is actually very little difference in the macronutrient content of white bread and artisan bread.

Figures derived from and

So, what about the micronutrients? A scan of a range of sources reveals that the differences in the levels of thiamin, niacin, riboflavin, iron and calcium are negligible, although they might be slightly lower in white bread.

Is artisan bread easier to digest?

In my investigation of bread and health, I found an article by Jamie Oliver. In it, he states that ‘Artisan bread is actually easier to digest, because the enzymes have had time to begin breaking down the gluten in the flour while fermenting’. This is news to me, and I want to find out more about the breakdown of gluten in the bread-making process. And does fermentation really make the bread more easily digestible? I need to find out!

It’s quite hard to get beyond the rhetoric about gluten. If you believe the vast majority of web chatter, gluten is a very bad thing indeed. I accept that evidence is building of adverse human responses to gluten, and there are plenty of science-based articles pointing to its dangers. But even where these articles reference their sources, there seems to be some misinterpretation of the science, and I’m looking for the bare facts.

I feel a bit more comfortable with the standard of this article written by Jake New, as it draws directly from original science and doesn’t make over-inflated health claims. Jake explains how the gluten proteins can be incompletely broken down. Some people develop an inflammatory response to the resulting polypeptides (partially digested proteins). I’m still no further in discovering whether the preparation method really does make the gluten in one bread easier to digest than another. What a pity Jamie Oliver did not give a reference for this claim! The only way I know to resolve this is to read the scientific papers, and particularly any systematic reviews of the research on gluten and health.

According to a systematic review by Smith et al (2015), about half the protein in wheat comprises gluten. Importantly, the authors noted that research into gluten digestion is conducted in test tubes, so it is not straightforward to extrapolate the findings to what actually happens inside your stomach and intestines. This aside, in their experiments, they found that baking markedly reduced the digestibility of the gluten proteins, compared with flour. They also found that the gluten proteins in bread are hardly broken down at all by the stomach enzyme pepsin, but that the enzymes normally present in the small intestine were effective in digesting gluten.

I cannot find anything discussing the baking method with respect to gluten digestibility, although I do feel more knowledgeable about gluten composition and digestion.

This New Zealand site created by the Baking Industry Research Trust explains a number of useful facts about bread and gluten. It explains that the proteins gliadin and glutenin combine during the bread-making process to form gluten, which does not otherwise exist in this form. It also confirms what I thought – that fermentation involves the breakdown of starches to produce carbon dioxide gas and alcohol. I have found no sources explaining how fermentation might break down gluten. According to another source, the fermentation process also produces lactic and acetic acids, which add to the flavour.
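To make the starch-to-gas step concrete, the overall yeast fermentation of a simple sugar such as glucose is the familiar balanced equation:

C₆H₁₂O₆ → 2 C₂H₅OH + 2 CO₂

That is, each glucose molecule yields two molecules of alcohol (ethanol) and two of carbon dioxide. Notice that gluten proteins do not appear anywhere in this reaction, which fits with my failure to find sources explaining how fermentation might break gluten down.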

To summarise, this is what I have discovered so far:

Wheat does not contain gluten – it contains two proteins, gliadin and glutenin, that combine during the bread-making process to form gluten.

The part of gluten that causes allergic responses and intolerance is gliadin, which is released when the gluten is partially digested back into glutenin and gliadin.

The gluten in bread is much more difficult for human digestive enzymes to break down than the separate proteins found in flour. This seems to happen in the duodenum rather than in the stomach.

In artisan baking, the gluten undergoes repeated, slow cycles of stretching and relaxing, during which time the starches and sugars in the flour ferment into carbon dioxide, alcohol, and lactic and acetic acids. The liquid by-products contribute to the tastiness of artisan breads. It isn’t clear where the claims about the gluten becoming more digestible originate from.

Does artisan bread contain fewer harmful chemicals?

The UK Flour Advisory Bureau provides a helpful guide to the additives commonly added to flour. For bread, these might augment the enzymes that are already naturally present. To improve the texture and structure, manufacturers add vitamin C and sometimes the amino acid cysteine (a necessary component of our body proteins, which can be made by our livers). All flours except wholegrain are also fortified with B vitamins and calcium. So far, I’m not worried about the chemicals in mass-produced bread.

But what about pesticides? This is a complex topic and one to put aside for a follow-up blog article!

What kind of bread will I eat?

I have reassured myself that as far as nutrition is concerned, the type of bread I choose makes little difference. It’s far more important to consider diet as a whole. There will be times when I want to splash out on an artisan bread as a treat, and also when I want to support my local breadmakers. What bread choices do you make and why?

The fragile frontiers of trust in healthcare

Have you ever been in a situation of vulnerability, perhaps as a hospital patient, in which you felt able to trust some clinical staff more than others? As I have previously studied trust in the nursing context, I have decided to explore whether there are any recognisable patterns at the personal level in feelings of trust or distrust. What leaps out at me more than anything else is my observation that the issue of trust doesn’t become relevant until the moment you encounter someone you feel you can’t trust entirely.

As this is a sensitive issue, I am keeping my personal sources private. And I should make clear that this blogpost, being drawn from highly personal experiences, cannot claim to be generalisable. Despite these caveats, I hope that my observations find resonance with readers.

I have tried to find useful starting points in the trust literature. The most helpful source I have found is an academic paper by Liz Bell and Anita Duffy published in 2009 in the British Journal of Nursing.

They identified the following four characteristics of trust in a nursing context:

  • Expectation of competence
  • Goodwill of others
  • Fragility/vulnerability
  • Element of risk.

I’ll organise my thoughts around this list.

Expectation of competence

It is not easy for patients to know whether the people caring for them are competent. One patient might be impressed that a care worker is wearing gloves to do finger-prick tests, and another might be very concerned to notice that the gloves are not changed between patients.

Healthcare Support Workers (HCSWs) and housekeeping staff are the people who have the most sustained contact with patients, maintaining the routine ‘servicing’ work. Between them, they make sure the beds, bodies, floors and so on, are clean. They serve up food and drink. They maintain the routine patient observations of ‘vital signs’, enquire whether patients have any pain, and record food and drink intake and bowel movements. In a recent hospital stay, I was surprised that the only person who asked me how I was feeling ‘in myself’ was a doctor.

From the vantage point of a patient, it is hard to know whether or not someone is good at their job. So much of the work happens out of sight. With the divisions of labour on a hospital ward, it is difficult even to know what people’s jobs are. How do you know if someone is stepping outside the boundaries of their competence? Some members of the housekeeping staff might be very keen to engage the patients in conversation about their conditions and their anxieties, and others quietly get on with cleaning tasks. Which is more competent? Can you trust a housekeeper to handle your anxieties sensitively as much as you can a nurse?

Much of this expectation of competence is an expression of hope. Expecting competence and goodwill from those who care for you is based on hope for a good recovery and/or the best possible care at a time of need.

Goodwill of others

Do healthcare staff have the best interests of their patients at heart? Are they diligent? Do they have the personal capacity to care?

Body language

Why would a patient sense a lack of trust in someone who, as far as they can tell, is perfectly competent? Part of the answer here is that healthcare is not just about getting things done, although this is of course vital. Trust is important because people are not simply machines to be serviced and repaired. Trust also hinges on attitude. Body language could give clues.

Some people look as though everything is just too much effort. Perhaps they are preoccupied with their own tiredness, discomfort, or boredom. If they are in a caring role in these circumstances, it can be difficult for a patient to feel confident about their capacity to care.

By contrast, some people exude a kind of flamboyance clearly aimed at cheering patients and colleagues with their sunny and congenial disposition. Whilst it’s reassuring to know that people are putting positive energy into their work, patients may be left wondering whether this behaviour is masking a lack of knowledge or a fundamental lack of confidence. Most concerning – is this flamboyance a sign of over-preoccupation with ‘jollying people along’ at the expense of the more sober work of caring for people who are in a vulnerable position? Now, I suddenly realise this perceived capacity to care could be at the root of trust.


Diligence and capacity to care

I am starting to gain some insight into the elusive nature of trust. Patients need to know that a person who has a key role to play in their care is taking a solicitous and diligent approach to their work. A telling proxy for care and diligence is hand hygiene. Who do you trust more? A flamboyant, high-octane worker who does not change gloves between patients, or a very reserved care worker with a downtrodden air about him, going quietly about his work, sanitising his hands in a well-rehearsed fashion at all the appropriate times? Perhaps the more concrete concerns about cross-infection can override the initial impact of body language.

The strange thing is, if you trust someone intuitively, you probably wouldn’t bother to monitor their hand hygiene. I remember the badges that were all the rage just a little while ago – the ones that stated ‘Clean hands? It’s OK to ask’. Whatever happened to them?

If body language sparks distrust, I know that trust is repairable. The person just needs to show that the patient’s recovery, dignity, or relief of suffering is their prime concern as they carry out their work. Patients don’t want to feel that they are inconveniencing the staff. On the whole, they want to be ‘good’ patients – not excessively demanding, grateful for the care and thankful that the NHS still exists. In return, they want the staff to see them as individuals.


To encourage a trusting relationship with patients, staff need to let down their defences just a little.


Imagine I’m lying in a hospital bed. You appear at my side. I’ve never seen you before, I don’t recognise your uniform, and I don’t know on what basis I can trust you. Tell me who you are when you first come to my bedside. What is your name? What is your role? Are you qualified for the tasks you are performing? Are you responsible for my care today, or are you just performing this one task? It makes a difference. Without this knowledge, I am left wondering. I don’t understand how the team fits together or how the communication works behind the scenes. I don’t know whether I should trust you or not.

Does someone have my back?

Now I feel I’m really getting somewhere in my efforts to understand the trust or distrust felt by hospital inpatients. They want to know that someone has their back. If things aren’t getting done properly, they want to know that someone in a position of authority will notice. They need to know where the accountability lies. A clear sense of leadership is missing sometimes. Patients need to know who is in charge. Managers and leaders need to be making some visible effort to communicate with patients person to person.

If I said to a nurse that I am expecting a blood sample to be taken today and am concerned that it hasn’t happened, it is only mildly reassuring to be told that the phlebotomist will turn up at some point if they want the blood. I want the nurse to check. Perhaps I misunderstood what the doctor said yesterday. Taking initiative at this one-to-one level would go a long way in restoring trust. Patients are afraid of appearing too bossy, too interfering, too self-important. They want nurses to be able to ‘read’ situations and respond appropriately.

Will you respond appropriately in an emergency?

It takes a position of vulnerability to raise your sensibilities regarding trust. If you’re not feeling vulnerable, trust isn’t important. Sudden life-threatening events expose vulnerability at its most extreme. If a patient observes a care worker not responding with sufficient urgency, either to a patient’s call for help, or a dangerous change in vital signs, any trust they have in that worker is likely to crumble.

Element of risk

There is an element of risk on both sides of a relationship based on trust. From a worker’s point of view, letting down one’s defences could open the way to overfamiliarity. Worse, giving away ‘too much information’ could lead to misinterpretation on the part of a patient. Healthcare staff are constantly aware of the risk of litigation. Taking time to follow up the concerns of one patient will have to be balanced with competing priorities – something has to give.

From the viewpoint of a patient, their very vulnerability adds an element of risk. For example, they have to trust nurses to administer medications correctly. Any routine medications they could normally manage themselves are now in the hands of someone else. Hospital-acquired infections loom large as constant threats. Invasive treatments such as surgery or the insertion of needles, cannulas and tubes carry risk. Sick patients are at risk of developing pressure ulcers. I could go on.

Patients want to trust

I’ll conclude by drawing together the strands I have woven here. Patients want to trust healthcare staff. It goes with the hope they hold at a time of extreme need: hoping for recovery, or simply tender loving care. Body language gives important signals, but perhaps the more concrete demonstrations of diligence and capacity to care hold sway. The vulnerability of patients can make them look for signs that someone has their back. They need to know that there is a trustworthy figure who is accountable. They need to see that staff respond appropriately in an emergency. Healthcare is a risky business. Risk is present on both sides. Everyone, including patients, needs to pull together to minimise these risks. I hope the steady, caring, trustworthy people get the recognition they deserve.

Prototyping for eHealth

I have never used eHealth apps myself. However, I am in the middle of a course on FutureLearn that is all about developing eHealth. This means I am able to participate in discussions with people who do have experience of using eHealth apps. The course is great for me. I’ve always been curious about the processes involved in developing apps for mobile phones, and now is my chance to have a go.

Currently, I am at the stage of designing a ‘lo-fi’ prototype for an eHealth app. Prototypes of this sort don’t need to be technical in any way. In fact, we are being encouraged to make them on paper or on a computer screen. At the same time as taking this course, I am also working on a literature review about diabetes prevention. So what better topic for my first design than a Type 2 diabetes prevention app?

Developing the concept

Actually, I decided it might be more realistic to create a diabetes manager app. At least there is a clear target audience. After all, I am finding from the literature review that it can be very challenging to engage people to look after their health “just in case” they develop a preventable disease.

So far, I am discovering that an important key to a good design is to make the app visually appealing. The information needs to be concise, and people need to be enticed in with simple images, videos and clear navigation. Underneath all of that, the needs and requirements of the target audience need to be established. People need a very good reason to use it.

Understanding user requirements

Why would someone with diabetes want an app? I can only guess that people who use apps are already committed in some way. They are motivated to take good care of themselves and their condition. And, they are probably the kind of people who are interested in making technology work for them. I will have to leave to another day the problem of how to reach people who do not fit into these two categories.

I’m interested in how healthcare professionals and service users can work together to manage long-term conditions. There’s a huge scope for technology to play a supporting role here. I like the idea of technology levelling out the playing field between clinician and patient. People need to be empowered to take control.

For instance, when you attend a consultation with a clinician, they will have all the data about you available at their fingertips. But you don’t have the data in front of you in the same way. As a patient, you are aware of the time pressures, and you’ve been thinking for weeks about all the things you wanted to talk about. If you’re anything like me, you come armed with a set of questions, but don’t ask them all because, as you get to the end of the list, it all starts to feel a little trivial. Perhaps those last two questions aren’t really very important. You feel you are already taking up too much of their time. That’s my reality, anyway.

When I first started to design my diabetes manager app, I had these experiences in mind. I was also considering a case study in which a clinician had to input a lot of patient self-report data by hand during a consultation. The challenge was to design an eHealth solution that would solve some of the problems for both clinician and patient.


At first, I felt completely overwhelmed by the task. But then I thought I should really just have a go. I tried to analyse the situation from both perspectives. I tried to apply the design principles. I also decided to have a go at using the Balsamiq software.

Balsamiq is a software environment that allows you to do rapid mockups. It claims to reproduce the experience of sketching on a whiteboard. I always find a little bit of technology helps me to be creative and overcome the inertia of getting started on something that feels difficult.

Imagine Ravi, who has type 2 diabetes. He’s really keen to manage it better, and would like to have a better relationship with the diabetic nurse specialist he sees about once every couple of months.

So I thought: what if Ravi had a really clever glucose meter that transmitted data to a phone app, and the app transmitted this data to a central patient record? Then I thought: what if everything else, like the exercise diary, food diary and weight measurements, could also be sent by the app to the central record? Ravi could agree with the nurse specialist what kinds of things to keep a record of.
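To make the idea concrete for myself, the data flow can be sketched in a few lines of Python. This is only a toy sketch: the names, fields and structure are entirely my own invention, not any real app’s or device’s API.

```python
from dataclasses import dataclass, field
from datetime import datetime

# A hypothetical reading that Ravi's devices and diaries might produce.
@dataclass
class Reading:
    kind: str        # e.g. "blood_glucose", "weight", "exercise", "food"
    value: str       # the measurement or diary entry
    taken_at: datetime = field(default_factory=datetime.now)

# A stand-in for the phone app: it collects readings, but forwards
# only the kinds of data Ravi and his nurse specialist have agreed on.
class DiabetesApp:
    def __init__(self, shared_kinds):
        self.shared_kinds = set(shared_kinds)
        self.outbox = []

    def record(self, reading):
        # Anything not on the agreed list simply stays on the phone.
        if reading.kind in self.shared_kinds:
            self.outbox.append(reading)

    def sync_to_central_record(self, central_record):
        # Push queued readings to the central patient record.
        central_record.extend(self.outbox)
        sent = len(self.outbox)
        self.outbox = []
        return sent

central_record = []
app = DiabetesApp(shared_kinds=["blood_glucose", "weight"])
app.record(Reading("blood_glucose", "6.2 mmol/L"))
app.record(Reading("food", "two slices of toast"))  # not shared, stays private
app.record(Reading("weight", "82 kg"))
print(app.sync_to_central_record(central_record))   # 2 readings sent
```

The design point the sketch is trying to capture is that only the kinds of reading Ravi has explicitly agreed to share ever leave the phone; everything else stays private.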

I started to sketch out the home screen of the app. I majored on using photographs for visual appeal and interest. I had very little text on this screen. Alongside the exercise diary and food diary, I decided that Ravi might want some information as an aide memoire of exercises or types of physical activity he can do as well as the kind of foods he should be encouraged to eat or avoid.

Here is a screenshot of the prototype I came up with after about an hour of fiddling around with Balsamiq. This, of course, is only the first step. Each link on the homepage opens up a range of possibilities. How will the exercise and food tips be presented? Should I use video, and how much? How can I produce a blood glucose graph? And so on.


Learning with others and making connections

Learning as part of a large group has its benefits here. I’m able to pick up ideas from my peers. For example, wouldn’t it be great to incorporate reminders to exercise, measure blood glucose, or take medication? One of my fellow students included mood snapshots – how am I feeling now? What a great idea! Sending alerts to the clinician if things are going a bit haywire would also be very useful.

Then I remembered from my work on the literature review that waist measurement can be very significant in people with diabetes. Again, someone on the course had included this in their app. Aiming to reduce your waist measurement can have a huge impact on your metabolism. Controlling weight and waist size can even mean that a person no longer needs further interventions to control diabetes.

Once I have finished work on the literature review, and once I have moved on to usability testing on the course, I may have more to say. If you wanted some help to manage your diabetes, would you use this sort of thing? What apps are you already using?

Blisters and spoons

How dealing with severe hand eczema is a bit like juggling the ‘spoons’ in spoon theory.

I’m the kind of person who is always in a rush. I have a certain number of things to get done in a day, and I dart between one thing and the next. I hate it when I have to stop to search for a piece of essential equipment. I’m always racing against the clock. I squeeze as much as I can into a day.

All this changed about four years ago. Something happened that meant I had to slow right down. It wasn’t multiple sclerosis or chronic fatigue syndrome, lupus or cancer. It wasn’t any of those high-profile diseases that are well known for making people reappraise their whole lives. It was eczema.

I know that a lot of people have eczema. And that many parents are grappling right now with children who are terribly debilitated by it. I have had eczema in different forms all through my life, although there have been periods when it hasn’t really bothered me very much. Four years ago, the eczema appeared on my hands. This wasn’t the first time I had had eczema on my hands, but this time it was different.


Whereas before the eczema was on the sides of my fingers, on the backs of my hands, knuckles, and on my wrists, this time it appeared on the palms of my hands and on the surfaces of my fingers that I use to touch things. That changed everything. Suddenly, I experienced pain and soreness every time I touched anything. The inevitable itching that accompanied the eczema was all the more intense for being located on the parts of my body that were particularly dense with nerve endings. It seemed it could only get worse. Dense clusters of itchy blisters eventually gave way to peeling skin. Skin peeled and peeled, eventually leaving a fragile and terribly thin layer of parchment-like skin.

Now I lived in a very different world. I started wearing cotton gloves to protect my hands. It wasn’t long before I started turning the gloves inside out so that the seams were pointing outwards. I couldn’t even bear to have the seams pressing against my skin. When I needed to touch something wet, I had a choice. I could either remove the gloves, get my hands wet, dry them and put the gloves back on, or I could don waterproof gloves over the top so that I didn’t have to remove the cotton gloves. As my skin flaked and peeled, the razor-sharp edges would snag on the fibres, making it increasingly difficult to keep taking the cotton gloves on and off, so increasingly I opted to wear waterproof gloves over the top whenever I wanted to do anything wet. Wet hands were bad news anyway, as my skin always became more sensitive after getting wet.

Working in the kitchen was the most trying. Peeling and chopping vegetables was a challenge. Carrots and potatoes slipped through my fingers and sometimes shot across the kitchen. My disposable vinyl gloves usually managed to get nicked or torn. This meant my cotton gloves would get wet, so I would have to change into a clean, dry pair before I could continue. Forging ahead with wet gloves usually stored up trouble for later, as my skin would never forgive me for leaving it wet for any length of time. Wearing waterproof gloves did not guarantee dry hands, even if they didn’t leak. Eventually perspiration would build up inside waterproof gloves, which meant that my hands would get damp after a certain length of time anyway.

I entered the world of having to make a decision every time I began a new task. How much do I fear damp hands at this moment? Which gloves should I put on? What can I achieve before my hands start to sweat inside these gloves? Cotton gloves alone were also pretty useless for more heavy-duty work such as going shopping and driving a car, not to mention gardening. Cotton gloves are not designed for such wear and tear. They rub against the surfaces of your hand, get grubby very quickly, and look rather shabby. And of course they are not waterproof which means that going out in the rain with them makes me feel anxious and a bit silly.

I managed to find some variations on gloves by searching the internet. I invested in cycle gloves and hockey gloves that I could wear over the top of my cotton gloves. They offered some firmer support and also some grip, as well as being tough enough to manage driving a car, handling shopping bags and trolleys, freezer goods, tins, packets, and so on. Since those early days of my adventures with gloves, I have accumulated quite a store of them. I have a whole drawer in my bedroom dedicated to gloves. Next to my skin I now wear cotton, viscose, silk, and a whole range of highly engineered moisture-wicking fabrics. I have discovered external coatings on gloves that allow me to grip the credit cards in my purse. Some gloves will allow me to use my smartphone, others won’t.


My main motivation for writing this blog post was not so much to discuss gloves, but to talk about spoon theory. Spoon theory is a neat metaphor for expressing the amount of energy you have to get through a day – the possibilities available to you. You start the day with a certain number of spoons, and once you have spent them all you are out of choices. I feel very much like that when my hands are bad. I can only achieve so much in one day. If I do house work, I can’t do the garden. I can’t go shopping AND prepare a meal. If I hang out the washing, get it in again later and put it away, I might not be able to read a book that evening. I might not put the washing away the same day, though, as it’s difficult for me to work out whether or not it is dry. Depending on where the blisters are on my fingers, reading a book might be off the agenda anyway.
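The trade-offs above can be spelled out as a simple budget. The spoon count and task costs below are invented purely for illustration; in reality they vary from day to day with how bad the blisters are:

```python
# A toy illustration of spoon theory: each task costs spoons,
# and once the day's budget is spent, nothing else is possible.
DAILY_SPOONS = 12

# Hypothetical costs, invented for this sketch.
task_costs = {
    "housework": 5,
    "gardening": 5,
    "shopping": 4,
    "prepare a meal": 4,
    "hang out washing": 2,
    "read a book": 2,
}

def plan_day(tasks, budget=DAILY_SPOONS):
    """Attempt tasks in order; skip anything the remaining budget can't cover."""
    done = []
    for task in tasks:
        cost = task_costs[task]
        if cost <= budget:
            budget -= cost
            done.append(task)
    return done, budget

# Shopping AND preparing a meal leave no room for housework too.
done, left = plan_day(["housework", "shopping", "prepare a meal", "read a book"])
print(done, left)  # ['housework', 'shopping', 'read a book'] 1
```

The point of the metaphor survives the simplification: every task drains the same single budget, so choosing one thing is always also choosing against another.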

Just by seeing the links between my own situation and spoon theory, I feel connected to a wider community. However, I can’t help feeling a bit like an impostor. I haven’t seen anyone else relate hand eczema to spoon theory. How can I think that what I have is equivalent to some much more serious chronic diseases? Nevertheless, I’m starting to see how it can be a useful way of explaining to other people the wide ranging impact of having severe hand eczema.


The Spoon Theory by Christine Miserandino

What is the Spoon Theory?

Severe hand eczema: Major new clinical trial compares treatments “head to head”.

Unravelling digital health literacy

When I come across the phrase ‘digital health literacy’, I feel uneasy. There seems to be ambiguity here, and I often wonder if I’m the only one who senses it.

Finally, I have decided to explore whether there is a distinction between three potential meanings of the phrase:

  1. Is it about ‘digital information literacy’ in the world of health?
  2. Or perhaps we are talking about health literacy in the digital age (digital ‘health literacy’)?
  3. Or is it about people’s ability to engage with digital health (‘digital health’ literacy)?

Or are they the same phenomena? To understand the knot I’ve created for myself, I’ll begin by disentangling these terms.

What is Literacy?

Literacy word cloud

‘Literacy’ is a widely-used term, especially amongst educators. From nurseries to universities, literacy looms as a set of skills and competencies that need mastering.

UNESCO recognises the complexity of meanings the word represents. At a basic level, literacy is about simply being able to read and write. But it is also about being educated and knowledgeable, including knowing how to access, engage with, and share knowledge. Amongst academics, the main buzzword is ‘information literacy’. In this context, people develop the skills to interpret information sources, making ‘informed judgments’. They also learn how to produce information in their own right. Armed with these skills, people are empowered to make critical decisions about key aspects of their lives, including their own health.

Temporarily discarding the ‘digital’ label, this seems a good point at which to consider ‘health literacy’. According to the World Health Organisation (2015), health literacy refers to ‘the cognitive and social skills which determine the motivation and ability of individuals to gain access to, understand and use information in ways which promote and maintain good health.’ This definition of health literacy is very close to the idea of information literacy (as defined above), applied to the health domain.

And so, to digital ‘health literacy’. The European Commission’s definition of ‘digital health literacy’ looks very much like a ‘digital’ extension of ‘health literacy’: ‘the ability to seek, find, understand and appraise health information from electronic sources and apply the knowledge gained to addressing or solving a health problem’. A scan of publicly available literature on digital health literacy reveals a common, almost exclusive, focus on using the internet to search for health-related information. This observation is unnerving because it seems to be side-stepping the elephant in my room, which is the one obsessed with digital health.

Image of elephant in room labelled with ‘digital health’

Digital health involves the application of digital tools (e.g. smartphones, wireless sensors, apps, and social networking) to monitor and help to maintain health. It draws on advances in genomics and mobile technologies to individualise healthcare interventions as well as to understand population health. Self-monitoring becomes a central feature of digital health tools.

In this world of digital health, consumers have access to raw data about their own bodies and they need to develop a new set of literacies around reading and interpreting this information. I was encouraged to see that a summary of ‘digital ambitions’ for healthcare in Wales included developing capability in both staff and patients to engage with smartphones and wearable devices, as well as online records. Engaging with digital health helps to meet the ambitions of providing increasingly personalised care.

This wider application of the term ‘digital health literacy’ can encompass a diverse range of electronic information sources. Greater use of technology-based health tools would open the way for the Internet of Things, as well as the pre-digested information found on the internet. The burgeoning of digital health technologies is no less a challenge for healthcare staff than it is for the general public.

Returning to the puzzle I set for myself, it seems that the first two meanings are practically the same. Most common understandings of ‘digital health literacy’ are about digital ‘health information literacy’, where the focus is mainly on the ability to engage with health-related texts on the internet. This is different to the third meaning, in which the focus is on ‘digital health’ interventions. I would like to see ‘digital health literacy’ represent the broadest range of digital tools, data and information, to keep pace with advances in technology.


How bad can foot eczema be?

I remember hearing a woman interviewed on the radio a few years ago. She was claiming income support (welfare) and lamented that she was unable to attend job interviews because she was allergic to shoes. This was at a time when the press were vilifying any unemployed people who appeared ‘undeserving’ of welfare payments. In the prevailing mood, it was difficult to accept the foot condition as a valid excuse.

At the time, even though I had already succumbed to severe hand eczema, foot eczema was just an abstract idea, something for my imagination. I felt thankful it was only on my hands, bad as that was. But I did develop eczema on my feet, only a matter of months after that radio interview. And, as it turned out, I had become allergic to my shoes.

Foot eczema

It was bad. The itching was intense. Tense blisters appeared, making it impossible to put shoes on. I had to sleep with my feet hanging over the edge of the mattress, as I couldn’t stand even the weight of my feet on the bed.  I was thankful that I was able to work at home for several days, as it was impossible for me to leave the house.

Eventually, I had some patch testing. It revealed that I was allergic to two rubber accelerator chemicals (vulcanisers), which commonly occur in the rubber found in the soles of shoes. I still remember studying the vulcanisation (cross-linking) of rubber in A-level chemistry. These substances can also be found in the adhesives used in joining leather. As shoes become worn in and are exposed to moisture, the chemicals leach out and come into contact with the skin. Step number one – know your enemy!

Translating chemical names into which shoes you can wear is something else entirely. Shoes don’t come labelled conveniently with things such as ‘contains thiuram chemicals’ or ‘contains carbamates’. The dermatology advice was to ‘wear all-leather shoes with no inner sole (like moccasins), plastic shoes or wooden clogs. If you have difficulty acquiring shoes without rubber insoles, remove the insoles before wearing and replace with those cut from piano felt, cork, or plastic.’ Try going into a shoe shop and asking what their insoles are made of! Try removing the insoles from women’s shoes.  As I suspected that simply sweating was also a likely factor in my eczema, I was keen to avoid plastic shoes too. The advice was also to discard old socks, as they can harbour the harmful chemicals.

I bought some wooden clogs online, and discarded all my socks. Hobbling around with my sore, bare feet (not yet able to go sock shopping) in wooden clogs, I began my campaign to find wearable shoes.

I contacted Satra Technology, a company based in Kettering, UK, known as a ‘leading technical authority for footwear and leather’. They advised me on alternatives to leather or plastic soles. Interestingly, thermoplastic rubber (TR) and crepe are not vulcanised, so they were safe. Also, I could look out for polyvinyl chloride (PVC), polyurethane (PU) and ethylene vinyl acetate (EVA) soles. Try going into a shoe shop and asking what their soles are made of.

Some shoe retailers were more helpful than others. I was very relieved to find that my recently acquired Trespass walking boots could get the thumbs up – the only rubber component was the outer sole – the midsole and insoles were fine. I discovered that many shoe manufacturers source the leathers and various components from all over the world. It was almost impossible for them to vouch for the exact materials used, especially the glues. Even craft shoemakers relied on glues/cements to hold their leathers together.

The only assurance I could obtain on leather adhesives was with Gore-Tex. Clarks informed me that glues were not used in joining the uppers and linings of Gore-Tex shoes. I was also very thankful when I found Easy Wellies, who made it extremely easy to search their stock for PVC boots and garden shoes.

I eventually had to give up on asking about the cements in leather shoes. My life was slipping away week by week, with one manufacturer after another unable to give assurances on this. I was becoming tired of existing in PVC wellies and wooden clogs (and my hefty walking boots). Step number two – avoiding the enemy – was proving very hard.

These days, my first stop for shoes is Hotter. Many, but not all, of their styles use polyurethane for soling. Yes! Quite a few of their styles contain elastic (made with rubber?), so I avoid those. Their customer services department is very helpful and will bend over backwards to research the various materials in their shoes. I buy their leather shoes on the understanding that one day I might have a reaction to the leather cement, although apparently they use it sparingly. I just try not to sweat.

Is ageing inevitable?

I started writing this article when ageing was again in the news, this time in connection with shift workers. We have mounting evidence that working night shifts can accelerate ageing and decrease longevity. But what is ageing exactly? And can we protect ourselves against it?


Although we can see and experience getting old as a process of ‘slowing down’, becoming forgetful, and the accumulation of skin wrinkles and grey hair, it takes study at the cellular level to appreciate what is actually happening. Studying the internal workings of cells reveals unimaginable complexity and the inevitability of the ageing process. We will be hard pushed to come up with any elixir for longevity.

‘Longevity’ simply means a long life. The longest life on record is that of a French woman, Jeanne Louise Calment (1875-1997), who lived to the age of 122. The biological world describes longevity as a phenotype – a set of observable characteristics of an individual resulting from the interaction of their genes with the environment. You might ask, then, how is the interaction between genes and environment affecting our longevity and causing us to age?

Our genes are strung together on chromosomes inside each of our body’s cells. Humans have 46 chromosomes in each cell. During growth and renewal, certain cells divide to produce new cells – across a lifetime, this averages around 10 million cell divisions every second. The genetic code perpetuates by processes of chromosome replication, followed by orderly separation of the duplicates in the creation of a new cell. We tend to think that each cell division faithfully reproduces and passes on the genetic blueprint, but this is far from true.


Take aneuploidy, for example. Chromosome separation sometimes goes wrong during cell division. The result of this is that some new cells may have more or fewer than 46 chromosomes. Cells can usually limp along in these cases, but they have lost their vitality – in other words, they show signs of ageing.


Mutations are probably the genetic errors with which we are most familiar. Simply put, they are random changes occurring in the genetic code within a chromosome affecting a single gene or a larger piece of the chromosome.

Telomere shortening

Image of chromosome, with telomeres indicated pink

Another mechanism involved in cellular ageing is telomere shortening. Telomeres are specialised pieces of DNA that cap the ends of all chromosomes. Without a telomere, the integrity of a chromosome is severely threatened: the chromosome ends may join together to form loops, and the chromosome may have serious difficulty replicating at all. Critically, each time a cell divides, the chromosomes naturally lose a fragment of telomere – they progressively shorten until they eventually reach the end of the line.


Epigenetics is a fascinating topic involving the study of gene expression. Certain structures within a cell are able to ‘switch’ genes on and off. Epigenetic mechanisms are what make different body tissues do their specialised jobs – all our body cells contain the same genes, but only some are turned on. In a chromosome, proteins known as histones form packaging material that helps to condense over 2 metres of DNA to fit into the nucleus of each of our cells. Subtle changes in these histones can interfere with their control of gene expression.

The effects of all this interference with the expression of the genetic code in our cells accumulate over time. The structure and function of the molecules that maintain the cell deteriorate with ageing, leading to a decline in the function of the cell. As the number of declining cells increases, so our bodies age more. This fits Edward Masoro’s classic definition of ageing as deterioration with advancing age, which increases vulnerability to biological challenges and hinders an individual’s ability to survive. This interpretation of ageing is also known as senescence.

Returning to the night-shift workers, the point is made that sleep is essential for enabling our body systems to replenish themselves. Upsetting normal wake-sleep patterns seems to trigger harmful processes – perhaps we do not handle sugar, stress, or appetite quite so well, for example. Toxins might not be cleared up quite as efficiently as normal.

Whether or not we work shifts, our bodies are continually mopping up harmful chemicals and dealing with the effects of bombardment by solar and other background radiation. Harmful chemicals can be external pollutants or by-products of metabolism. The more we can do to maintain our bodies in a healthy state, the better equipped we are to fend off these threats.  In the long run, though, it seems we have little defence against advancing senescence.

I gleaned much of this information through studying a free MOOC on FutureLearn: ‘Why do we age? The molecular mechanisms of ageing’. A very steep learning curve, but well worth the effort!

Do anti-ageing diets really work?

Three years ago this month, Dr Michael Mosley demonstrated ‘the power of intermittent fasting’ on the BBC programme Horizon.  He based his argument for this regime on evidence that had been mounting for some time that calorie restriction can prolong life.

However, many scientists challenge these assumptions, partly because of a lack of consistency between the various studies on monkeys, mice, rats, and even fruit flies. A review published in 2014 recognised that calorie restriction diets might be inadvertently correcting pre-existing imbalances in nutritional intake in the laboratory animals.

A team of researchers in Sydney took a different approach. They already knew that individuals who were deficient in certain growth factors did not suffer from cancer or diabetes, both of which are associated with the ageing process. They also knew that production of these cellular growth factors required particular amino acids. As amino acids are the building blocks of protein, it made sense to explore the effects of differing amounts of protein in the human diet. They analysed the proportion of protein in people’s diets, drawing on an existing US national dataset. They also had access to health and mortality information about the people in their sample.

The Sydney team discovered something quite remarkable. Among the 50-65 age group, high animal (not plant) protein intake was associated with shorter lives. The high-protein group were almost four times more likely to die from cancer when compared with the low-protein group.

For those aged 66 plus, however, the tables turned. For this older group, longevity was associated with high protein intake. Those with a high protein intake were far less likely to develop cancer than those on low protein diets.

These are early days yet, and it is likely to be some time before any clear dietary recommendations can emerge. Much of the current advice for slowing the ageing process is based on having a good intake of antioxidants, dietary fibre and omega-3 fatty acids.

Antioxidants are purported to help moderate DNA damage – genetic mutations and chromosome damage – which can build over time and gradually disable more and more cells. Dietary fibre helps to moderate things such as the sugar and fats in our blood, as well as helping to maintain a healthy bowel. Omega-3 fatty acids are ‘good’ fats, for which many unproved claims are made, and even the case for promoting heart health is debated.

Does improving health through diet increase longevity? Having healthy heart and bowels may not protect us from the inevitable march of cumulative DNA and chromosome damage. Even the link between free radicals and antioxidants may not lengthen life.

It seems we have a long way to go before we can stop ageing in its tracks, but I suspect that many of us would opt for a moderately long and healthy life rather than simply a very long life.
