
Drugs to make you smart

For as long as it has existed, the human race has strived to make itself better, to improve upon its natural ability and to push its boundaries. The Olympic games display impressive feats of human physical endurance, strength and skill. The Guinness Book of World Records celebrates some of the more ‘niche’ (yet no less impressive) human abilities, such as holding 43 snails on the face at once or squirting milk from the eye over impressive distances. These may not be particularly useful skills to have, but the collection of records in the Guinness book is still a demonstration of how we endeavour to succeed, to improve, and to be the best.

So what about our brains? Can we make them better, faster, smarter?

There is a group of drugs, known as ‘Nootropics’ or ‘Cognitive Enhancers,’ that is used for just this purpose. These drugs act by changing the regulation of signalling systems within the brain – that is, they alter how brain cells communicate with each other, thereby subtly altering brain function. A cognitive enhancer aims to improve cognition – the brain’s ability to think, make decisions, learn, remember and solve problems. All are essential abilities for living independently, holding down a job, and succeeding at school.


Various cognitive enhancers have been around for decades, and new ones are being developed all the time. However, these drugs are created with the aim of treating psychiatric and neurological problems, where poor cognition is a symptom or a side effect of the illness. But do they work on healthy brains? Can we use drugs to push our cognitive abilities above and beyond our usual boundaries?

Can we make ourselves cleverer – *ahem* – I mean, more clever?

Well, it may be possible, although it’s not all that clear, and definitely not that simple. The use of cognitive enhancers in a healthy population is a relatively new consideration, so research into the effect of these drugs is very much in its early stages, and there is almost no data on the long-term effects of regularly taking such enhancers.

It’s also important to bear in mind that there is no single ‘wonder pill’ that can make someone smarter. Rather, there are many, many different drugs that affect slightly different, overlapping systems in the brain. Each one may therefore improve a particular element of cognition, which in turn means an individual is better able to learn, and as a result will be smarter. For example, some drugs will improve attention, others will affect memory, and others will increase alertness. By taking a cognitive enhancer, you will not suddenly be able to answer all of the questions on University Challenge.

So here is a summary of some of the most common cognitive enhancers currently being used:

ATTENTION (Ritalin/Atomoxetine)

One of the illnesses most commonly treated with cognitive enhancers is ADHD (attention deficit hyperactivity disorder), which is characterised by a short attention span, hyperactivity and impulsiveness – symptoms that cognitive enhancing drugs are thought to be useful in treating. Two of the most common drugs used for ADHD are Ritalin (Methylphenidate) and Atomoxetine.

Ritalin and Atomoxetine both increase noradrenaline and dopamine in the brain. Noradrenaline and dopamine are chemicals that send signals between brain cells, and are therefore known as ‘neurotransmitters.’ In ADHD, there is a reduction of both of these neurotransmitters, suggesting that communication within the brain is not efficient. By increasing the level of these neurotransmitters, Ritalin and Atomoxetine improve communication between brain cells, resulting in better alertness and attention.

So what about in a healthy individual without ADHD? Can these drugs further enhance cognition above and beyond what can be achieved with hard work alone? Many people believe so – ADHD-associated drugs are commonly found on university campuses, particularly in the USA, where they have been illegally purchased by desperate students trying to improve their exam performance. But does it work?

The answer isn’t exactly clear. Several studies have indicated that taking Ritalin or Atomoxetine can be beneficial in healthy adults – they can increase accuracy and performance on various tasks that require good attention and memory. But the size of the effect seems to be fairly modest, and it appears to depend on the individual’s natural ability in the first place – those who had poor attention and memory to begin with saw an improvement in their performance after taking Ritalin, but there was no benefit for those who already performed well. Another dopamine enhancer, Bromocriptine (used for the treatment of Parkinson’s disease), actually lowered the performance of individuals who had initially performed well.


MEMORY (Aricept)

Cognitive enhancing drugs are also commonly used by those suffering from neurodegenerative diseases, such as Alzheimer’s or Parkinson’s disease. Both affect cognition and memory, and while there is no cure for either condition, cognitive enhancing drugs may delay or slow down the progression of the cognitive symptoms. Aricept (Donepezil) is commonly used for Alzheimer’s disease. It increases the levels of another neurotransmitter – acetylcholine – by stopping it from being broken down and recycled in the brain.

Most people notice and complain of a worsening memory as they get older, so a memory-enhancing drug such as Aricept is going to be of interest to many people, not just those suffering from dementia. As such, research has begun to investigate how Aricept may affect healthy individuals. It has been found to enhance pilot performance after flight simulation training, although a review of multiple studies of Aricept found the evidence for its ability to enhance memory unconvincing, with several studies finding no effect, or even an impairment of cognitive ability, following treatment.

ALERTNESS (Modafinil)

Modafinil (Provigil) is a treatment for narcolepsy and sleep apnea, and enhances cognition by increasing alertness and wakefulness. Apparently it is commonly used by individuals in high-stress professions, or those that involve long hours and shift work, to help them stay awake – doctors, military personnel and academics, for example. A study of British universities indicated that its use is pretty high among the undergraduate population too, and in 2013 it was described as the ‘drug du jour’ to aid studying, despite being a prescription-only medication. Exactly how Modafinil affects the brain isn’t fully understood – it is thought to alter similar systems to Ritalin and Atomoxetine, but it has also been associated with multiple other neurotransmitters and systems.

Modafinil has been shown to improve cognitive function in male volunteers by enhancing alertness and attention paid to the tasks they were given, and by inhibiting quick, impulsive responses. The volunteers also reported feeling more alert and energetic after taking the drug. However, other studies have found that similar to the ADHD drugs, Modafinil has a greater effect on those with a poor initial performance, and may be of limited use to those with a high cognitive ability.


Overall, there is some tentative evidence that cognitive enhancing drugs typically used for neurological problems could have some benefit in the healthy population. So what should stop you from grabbing a big ol’ box of pills to improve your performance at school or work? Well, lots of things, actually – here are just a few:

  • A big one is that no one knows the long-term effects of taking any of these substances – the majority of studies investigate the effects of a single dose, or of treatment over just a few weeks.
  • It is also important to bear in mind that these drugs are likely to have undesirable side effects. When they are used in people with a neurological disease, the therapeutic effect of the drug is judged to outweigh the discomfort of the side effects. As the current evidence points to very modest effects in healthy people, the balance between the benefits and the risks may no longer be favourable.
  • It isn’t really understood how they work in healthy people, and the effects differ from person to person. Most of the studies I came across while researching this post pointed out that the effects of cognitive enhancers were very variable between different people. This may be down to an individual’s brain chemistry, gender or genetics, but currently it isn’t possible to predict whether or how a cognitive enhancer will work in any one person.
  • There’s a big ethical debate about whether the use of cognitive enhancers is OK. Is it cheating? How different is it from using caffeine? Would people feel coerced or pressured into taking them in order to ‘keep up’? Does it undermine the value of hard work?

I don’t know the answers to the ethical questions – there are strong arguments on both sides. Nevertheless, the use of cognitive enhancers is an interesting and divisive debate, and looks to be a growing field of research. The current consensus appears to be that these drugs may become useful tools for healthy individuals in the future, but at the moment their benefits and effects are questionable.

Personally, I’m happy to continue to celebrate if/when I manage to answer just a single question on that darn University Challenge.

The Biocheminist.

Cake on the Brain

I love cake.

I mean, I really love cake. I exist in a constant battle between my desire for its delicious spongy goodness, and for maintaining a healthy BMI.

So of course, when considering what I would write about for my next blog post, cake was on my mind.

On my mind? Cake on my mind – that’s cake on the brain – that’s cake and neuroscience! Hurrah! I can write about cake!

It turned out that this was easier said than done. A quick Google of ‘brain’ and ‘cake’ turned up a huge range of impressively brain-shaped desserts, but that wasn’t quite what I was after. So I turned to a database of scientific publications – a tool I typically use during my working day, with precise and sensible search parameters, to discover the latest work on a very specific topic. Not this time! I repeated my ‘cake’ and ‘brain’ search, and here are three things that I found:

 

  1. Oreos are practically Cocaine

Well, sort of. But not really. Oreos aren’t my delicious treat of choice, but it turns out that rats are pretty keen on them. A study identified which rats had a strong preference for Oreos and which didn’t, and then compared how much cocaine each group of rats gave to itself.

Now, rats that take cocaine aren’t dropped into a seedy nightclub, and don’t have teeny tiny credit cards or rolled up notes. Instead, a small tube (cannula) is implanted into their brain and is attached to a small pump. Whenever the rat presses a lever, this pump releases a controlled dose of cocaine through the tube and into their brain. The rats rather like this, and will learn that pressing the lever = getting high.
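To make that lever-press contingency concrete, here is a minimal, made-up Python sketch of a self-administration session – the press probabilities, trial counts and the idea of modelling it this way are my own invented illustration, not anything taken from the study itself.

```python
import random

# Toy sketch of a self-administration session: while the drug is available,
# every lever press delivers a dose through the pump; during 'extinction'
# the pump is switched off. All numbers are invented for illustration.

def run_session(press_probability, drug_available, n_trials=100):
    """Count lever presses and doses delivered over one session."""
    presses, doses = 0, 0
    for _ in range(n_trials):
        if random.random() < press_probability:   # the rat chooses to press
            presses += 1
            if drug_available:                     # the pump releases a dose via the cannula
                doses += 1
    return presses, doses

# A rat that finds the drug especially rewarding keeps pressing even during
# extinction, when pressing no longer delivers anything.
print(run_session(press_probability=0.8, drug_available=True))    # acquisition
print(run_session(press_probability=0.6, drug_available=False))   # extinction
```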

It turns out that the rats that loved the Oreos were also pretty darn keen on the coke. While both groups of rats learnt to press the lever, the Oreo-rats were slower to stop pressing it when the cocaine reward was stopped, and were much more enthusiastic in their pressing when the drug was reinstated. The study also found that presenting the rats with an Oreo before their stint in the coke-box temporarily boosted their lever pressing.

There was no such effect when rats were given rice cakes instead of biscuits (sorry, my American friends – ‘cookies’).


This suggests that Oreos (or any other delicious treat) are likely to work on the same pathways in our brains as drugs do – namely, those associated with motivation and reward. When we eat something delicious, we feel good due to the activation of ‘reward pathways’ in our brain – this feeling reinforces that behaviour so we are likely to do it again to get back that good feeling. Drugs hijack the same pathways. So when rats have a strong preference for Oreos, they may be more sensitive to how rewarding they are. This then makes them vulnerable to doing other things that will give a similar rewarding feeling. Such as cocaine.

It’s important to bear in mind that although drugs and Oreos work on roughly the same reward and motivation pathways in our brains, they have vastly different effects on cells, addiction and health.

Take cake, not coke.

 

  2. Having that ice cream now will ruin your dessert.

…Or at least how much you enjoy it. In the previous section, I mentioned the brain’s reward pathway. Part of this pathway is regulated by Dopamine – a chemical released in the brain that transmits signals between cells (a.k.a. a Neurotransmitter). Increased Dopamine signalling in the brain typically means something good has happened and you are being rewarded for it with some good feelings. So in the previous example, Oreos/drugs = lots of dopamine release. Much of this dopamine activity occurs in an area of the brain called the Striatum.

Another group of researchers gave people an fMRI scan (functional magnetic resonance imaging – a way of measuring which areas of the brain are most active, based on the level of oxygen-rich blood being delivered to those areas) while they were given either an ice-cream-based milkshake or an ever-so-appetising ‘tasteless wash’ to drink. Having a delicious creamy milkshake fed to you through a tube whilst lying in a big magnetic cylinder was apparently rewarding, and there was increased activation of the striatum in the people that drank the milkshake.

*However*, the extent of the activation was lower in people who regularly ate ice cream, meaning that their experience of drinking a milkshake was much less rewarding – it just didn’t feel as good. The implication is that repeatedly eating a particular type of food will reduce the Dopamine response in your Striatum, possibly leading to overeating of that foodstuff to try and get the same great feeling as the first time it touched your tongue.

So I must hold back on the cake, or I just won’t appreciate it as much.

 

  3. Cavemen are responsible for ruining your diet

That’s enough about reward – we know eating delicious cake makes us happy; that’s why it’s described as delicious! I’ve never heard the phrases ‘delicious rice cake’ or ‘delicious tasteless wash.’

So how about cake and attention?

It has previously been shown that when people are hungry, they are more likely to pay attention to food-related words, and to recall more food-related items in a memory test. A study in 2010 used a test called the ‘Emotional Blink of Attention (EBA).’ This is based on a test where people are told to look out for a particular image (or ‘Target’), such as a landscape scene, while they are shown several different images in quick succession. However, if another image that is likely to provoke an emotional response is shown immediately before the target image they are looking out for, then they will pay more attention to the emotional image and will be less likely to notice the target that is shown immediately after. Images causing EBA have typically been related to violence, gore or sex.
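For a concrete picture of how such a trial is put together, here is a minimal, made-up Python sketch of the EBA set-up – the stream length and detection probabilities are invented for illustration and are not taken from the 2010 study.

```python
import random

# Toy sketch of one 'Emotional Blink of Attention' trial: a rapid stream of
# pictures contains one target (e.g. a landscape). On some trials an emotional
# distractor (here, cake shown to a hungry viewer) appears immediately before
# the target, making the target harder to spot. All values are invented.

def run_trial(cake_before_target):
    stream = ["neutral"] * 8
    target_position = random.randint(2, 7)
    stream[target_position] = "TARGET"
    if cake_before_target:
        stream[target_position - 1] = "CAKE"   # distractor shown right before the target
    # Attention captured by the distractor lowers the chance of noticing the target.
    p_detect = 0.5 if cake_before_target else 0.9
    return random.random() < p_detect

hits_with_cake = sum(run_trial(True) for _ in range(1000))
hits_without_cake = sum(run_trial(False) for _ in range(1000))
print(hits_with_cake, hits_without_cake)   # fewer targets spotted when cake precedes them
```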


When people were hungry, the EBA effect was seen when a picture of cake was shown immediately before the target image. This was even the case when people were offered a monetary reward to ignore the pictures of food. They just couldn’t help paying attention to the cake!

Being able to automatically pay attention to food-related things when hungry is likely to be something that helped our ancestors adapt to their environment and be successful at hunting and staying alive. Now, however, food-related things are everywhere, but this survival response remains. That has big consequences for dieters who are frequently hungry – being hungry means you are more likely to notice that vending machine, the café over there and the charity cake sale a couple of floors up at work. Couple this with an urge for a tasty Dopamine response and now it makes sense why sticking to a diet can be so tough!

So basically – cake is like a drug that we just can’t ignore. So why fight it? Have a slice – just not too often!

 

The Biocheminist

 

Cat-calling and Mental Health

It would be difficult to find anyone who hasn’t at least heard about, if not watched, the now viral New York street harassment video (if you haven’t seen it, you can watch it here).

It summarises an all-too-familiar experience that most women have faced at least once in their lives – and I mean MOST, as a staggering 98% of women surveyed in 2008 reported that they had experienced cat-calling and harassment. The video has caused an intense internet debate; alongside the majority outcry condemning the behaviour of the cat-callers and demanding change to this all-too-common occurrence, there have also been more negative responses, including defences of the men involved and violent threats directed towards the subject of the video.

While a lot of the debate has centred on the acceptability and frequency of these behaviours, and how it can best be tackled, less attention has been given to the psychological effects of experiencing cat-calling and sexual harassment, and their impact on mental health.

So I did some digging. 

While there is a wealth of scientific literature investigating the effects of sexual harassment at home or in the workplace on mental health, the investigation of the effects of street harassment or cat-calling (referred to in these studies as ‘stranger harassment’) is a relatively new development. This came as a surprise to me, as studies dating as far back as 1978 found that women felt unsafe in a variety of social contexts, and a Canadian study in 2000 identified that stranger harassment reduced feelings of safety to a greater degree than harassment by known acquaintances. To put it more simply, harassment by strangers makes women feel even less safe and more scared than harassment by a known individual at work or at home.

Sexual harassment has been associated with nausea, sleeplessness, anxiety and depression. However, the literature focuses on two main components that may affect mental health:

  1. Stress

Arguably the main risk of stranger harassment to mental health is its effect as a chronic stressor – a stressor is any environmental or external event that causes stress to an individual, and it becomes chronic when it is experienced repeatedly over time. For example, an individual may receive one cat-call on their walk to work. In isolation, this could be an unpleasant and mildly stressful event, or may not have any bearing on that person’s day. However, should that experience of a mild stressor occur every day for months or years, then it becomes a chronic source of stress that can negatively impact mental health.

How does stress affect mental health?

One of the most studied outcomes of chronic stress is depression (which is also one of the reported outcomes of harassment). In fact, a popular mouse model of depression is called the ‘Chronic Unexpected Stress’ (CUS) model, which is created by exposing mice to…well…chronic unexpected stress. This includes social stress (such as overcrowding or isolation) and predatory stress (the scent or presence of a predator). It is such a popular model of depression because chronic psychological stress reliably causes anxiety- and depression-like behaviours in these mice.

Predatory stress increased inflammation in several brain areas in these mice – inflammation is the body’s response to threat, and in the short term it protects cells from harm. However, if inflammation is present for a long time, it can start to cause damage. Increased inflammation in the brain has been found in, and may exacerbate, both Alzheimer’s disease and depression. Studies in humans have also identified damage to the structure and communication networks of the brain as a result of chronic stress, which can have a negative effect on learning, memory and mood.

So it isn’t really such a leap to imagine that the fear or threat felt following harassment, and the powerlessness over its occurrence, could become a chronic stressor. It can also arguably be equated with the ‘predatory stress’ used in mice. In a study that focused on the workplace, an association between harassment and poor mental health was identified: specifically, individuals who experienced sexual harassment early on in their careers were more likely to be depressed later in life. This was the case for both men and women.

  2. Objectification

Objectification is a societal issue that reaches beyond just cat-calling, but its role in stranger harassment has been investigated. The theory of self-objectification in the psychological literature says that when a person is sexually harassed by a stranger, they feel objectified. This causes ‘self-surveillance’ – they come to view themselves as the stranger views them, usually as a sexualised object, with their worth determined by how they feel they are viewed by others. In other words, they are ‘self-objectifying.’ This self-objectification has been found to have multiple negative effects on mental health, and has been associated with an increased prevalence of eating disorders, depression and substance abuse.

However, science hasn’t always been able to carry out this kind of study without bias and sexism.

Several studies that I have come across appear to lay responsibility for the effects of harassment on mental health and well-being on the women who have been targeted, rather than on the individuals who commit the harassment. After associating harassment and self-objectification with negative mental health and psychological consequences, it has been recommended that women be educated in better coping strategies, so that they become more resilient to the inevitable objectifying experiences, as a way to prevent mental health problems. It is this attitude – that cat-calling/street harassment/stranger harassment is a ‘normal’ experience that should just be put up with – which has allowed it to remain a prevalent and distressing problem in society.

Despite cat-calling and street harassment having been identified as an issue for at least the past 14 years, there has been no reduction in the number of women experiencing it, and very little attention has been given to the serious effects these experiences may have on mental health. The scientific community has not escaped bias in this area, although it has identified the association between harassment, stress and depression, and recognised that frequent harassment may have a substantial psychological effect. As the impact of harassment on mental health gains more attention, scientists are beginning to investigate more thoroughly, including the negative effects that witnessing sexism has on bystanders, and why some men do it in the first place.

There is still a long way to go – both scientifically and socially. But with cat-calling and harassment carrying such strong risks to mental health, perhaps they should be considered a form of psychological assault.

For more information about cat-calling and harassment, and how it is being tackled, visit:

http://www.stopstreetharassment.org

http://www.ihollaback.org

The Biocheminist

You can’t spell ‘Love’ without ‘Vole’ – The Neurobiology of Love.

Love. A source of great joy and agonising pain (wait, didn’t I also say that about western blots…?). When we talk about love, we talk about the heart – love is heart-warming, losing a love is heart-breaking, and you should enter relationships with your heart, not your head!

Nope! Sorry! I don’t want to break any hearts with this, but love is ALL in your head.

A lot of studies have been carried out where the brain has been scanned (or imaged) while individuals are looking at particular photos or carrying out activities and tasks – this is called fMRI (functional magnetic resonance imaging). This method can detect areas of the brain that are receiving higher blood flow, and are therefore likely to be more active. Using this method, it has been discovered that the areas of the brain responsible for regulating your temperature overlap with the areas of the brain associated with social warmth, defined in the study as the feeling of being loved and connected to other people (Inagaki & Eisenberger 2013). These areas were the Ventral Striatum and the Middle Insula (see pictures below for an idea of where these are). The association between physical temperature and feelings of love went so far that when people in the study held a warm object, they reported stronger feelings of social warmth, and those that read meaningful and loving messages from friends and family reported the room as feeling warmer.

Here is a brain – my brain – with rough areas associated with love & reward drawn on. Left image from the side, right image from above.

Why would this be? How is that useful?

The authors suggest that it could be learnt from birth – many behaviours used to soothe a baby and show it love, such as rocking and being held, occur in close proximity to another person and subsequently cause a rise in temperature. We therefore learn that warmth is associated with being loved and cared for. And no one can deny that a warm hug (or Welsh ‘cwtch’) from a loved one feels pretty darn good!

The same brain areas identified in that study have also shown greater activation when people rate themselves as close to their romantic partner, and this was associated with longer relationship length. In fact, a lot of brain areas have been linked to feelings of romantic love, and many of these, such as the Hippocampus and Nucleus Accumbens (see previous picture!), are part of the reward system in the brain. The reward system is the network in our brains that makes us feel pleasure and happiness (a reward), often in response to a particular event or behaviour. Activation of this system makes us try to repeat the action that led to its activation in the first place, therefore resulting in another reward feeling – if spending time with a particular person activates our reward system, then we strive to see them again.
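If it helps to see that loop written out, here is a deliberately crude Python sketch of ‘rewarded actions get repeated’ – the actions, reward values and update rule are all invented for illustration, and real reward circuitry is of course far more complicated.

```python
import random

# Toy sketch of the reward loop: actions that trigger the reward system become
# more likely to be chosen again. Everything here is made up for illustration.

preferences = {"see partner": 1.0, "stay home": 1.0}

def choose(prefs):
    """Pick an action with probability proportional to its current preference."""
    r = random.uniform(0, sum(prefs.values()))
    for action, weight in prefs.items():
        r -= weight
        if r <= 0:
            return action
    return action

for _ in range(50):
    action = choose(preferences)
    reward = 1.0 if action == "see partner" else 0.1   # seeing the partner feels rewarding
    preferences[action] += reward                       # rewarded actions are repeated more

print(preferences)   # 'see partner' ends up strongly preferred
```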

Dopamine is the signalling molecule that works within this reward system.

Prairie voles (super cute voles from North America) are the most frequently studied animal in the neurobiology of love – this is because they form monogamous relationships. Voles that had more receptors for dopamine (parts of the cell that detect the presence of dopamine, allowing the cell to respond to it) showed more monogamous behaviour. This suggests that increased activity in the brain’s reward system may improve the longevity and fidelity of individuals in a relationship, because being with their partner feels particularly rewarding.


Oxytocin has the reputation of being The Love Drug.

Oxytocin is a neuropeptide – which means it is a molecule used by brain cells to communicate with each other, although oxytocin is also capable of working as a hormone around the body. Oxytocin is well known for being associated with pregnancy and lactation, but its effects are much broader than that! It can also stimulate social behaviour, such as increasing trust and empathy. Looking back to those adorable voles, monogamous animals had more oxytocin receptors in the Frontal Cortex, Nucleus Accumbens and Striatum, which are the same areas that show increased activity in humans who are shown a picture of their partner. A release of oxytocin in the brain during mating was essential for bonding to a partner in voles.

Enough of voles – in humans, oxytocin is increased by hugs, social support, massages and orgasm.

In fact, when heterosexual male subjects were given oxytocin intranasally (up their nose!), they rated their partner’s face as more attractive than other women’s faces, and showed increased activation of their brain reward systems. The author stated that oxytocin could ‘improve the reward value’ of the subjects’ partners…which is oh so romantic(!) A sniff of oxytocin in females improved their ability to determine the emotion felt by another person when shown just the eye region of their face – this is called the ‘reading the mind in the eyes test,’ or the slightly snappier ‘RMET’.


As well as oxytocin, multiple other hormones have been implicated in the neurobiology of love – testosterone, cortisol and dopamine have all been identified as contributing to either the longevity or the demise of romantic relationships. Cortisol is a hormone associated with the response to stress, and high levels in couples during an argument were associated with increased hostility and relationship breakup, particularly if levels were high in both individuals. High levels of oxytocin, on the other hand, were associated with increased empathy. High levels of testosterone are associated with competitiveness rather than stability and trust – testosterone is much higher in single men than in men in relationships, who no longer need to compete with other males for a partner.

You can be ‘Crazy in Love’ – Beyoncé was right!

The early stage of a new relationship is considered to be a separate phase that creates different and unique responses in the brain. There is a dramatic increase in the love drug, oxytocin, which in turn increases the activation of dopamine-related brain areas. When these areas are so strongly activated, large areas of the cortex experience a reduction in activity, which means that we lose our ability for rational judgement – which is an effect many of us may have observed in our friends in a new relationship! The activation of the dopamine reward system may also make us temporarily ‘addicted’ to our new beau, as their ‘reward value’ is through the roof! It is thought that this early and temporary addiction serves the purpose of keeping us around that person for long enough to form a meaningful attachment.

So yes, it might all be in your head and love might make you crazy, but it’s also a real biological phenomenon. And don’t forget to be romantic – let your significant other know that they have a high reward value, then give ‘em a cwtch.

 

The Biocheminist

4 Common Brain Myths…Explained!

Morgan Freeman has been really getting on my nerves this week. I’ll be sitting in my lounge going about my evening, and suddenly his smooth baritone voice will interrupt my thoughts/dinner/internet browsing with the statement:

‘It is estimated that us human beings only use 10% of our brain’s capacity. Imagine if we could access 100%!’  

I start internally screaming. The belief that we only use 10% of our brains is one of the most pervasive and prevalent neuroscience myths, but it’s just not true! Of course, I understand that this statement is part of a Hollywood film script and will typically be considered as fiction, but it has prompted me to address just a few of the most common misconceptions about the brain for this week’s post:

 

  1. We only use 10% of our brains.


Nope. No. Nu-uh. Not true. We use the whole 100% of our brain. If we used only 10%, then we’d perhaps expect injuries caused by physical trauma, stroke or disease to have little or no effect, unless they hit the ‘functional’ 10%. In reality, loss or damage to even small areas of the brain can greatly, sometimes devastatingly, affect a person’s life and how they function. From an evolutionary point of view, it would make no sense for us to develop such large and complicated organs in our heads if the majority of it is useless. Modern imaging techniques have also confirmed that there is activity throughout the entire brain, even while we sleep.

So if there isn’t even a grain of truth to this statement, where did it come from, and why do people still believe it?

I haven’t been able to find a single definitive origin for this belief. However, it may have resulted from very early experiments in animals, where it was found that simple tasks could still be completed after damaging pretty large areas of the brain, leading some to believe that there was a lot of redundant stuff in our heads. Alternatively, it has been claimed that the statement ‘We are making use of only a small part of our possible mental and physical resources,’ made by the prominent psychologist William James in 1908, may have initiated the idea that we are not making full use of our brains. Misinterpretation and misquoting over the years gradually morphed this into the ‘10%’ myth we know (and loathe) today!

But why do we still believe it? Well, it seems to have become ingrained in popular culture – it is repeated so commonly and mentioned so frequently in passing (à la Morgan Freeman), as well as in advertisements for self-improvement (e.g. brain training), that it may simply be accepted as true.

 

  2. Fish have a three-second memory


Again, no! Fish are more complicated and highly developed creatures than previously thought. A 2009 study demonstrated that not only could fish associate a specific noise with feeding time (in the same way as Pavlov’s famous bell = food experiment in dogs), but that this association was remembered three months later. There has also been a study demonstrating that fish may have emotional states: they can learn that one particular environment is associated with a food reward and another with being chased by a net – when given the choice, the fish spent more time in the environment with the food reward, and avoided the scary side! Even more astonishingly, fish have been found to be able to learn the difference between blues and classical music, and can even classify music they have never heard before into one of these two categories. And these aren’t the only examples of fantastic fish memory – the internet is full of different experiments and studies about fish learning and memory – take a look!

But if fish are so clever, where did the three-second myth come from?

There doesn’t appear to be a definitive source for this myth, but it has been suggested that it may have arisen to justify the small bowls that goldfish are commonly kept in – if they have such short memories, then they can’t get bored, and there’s no need to feel guilty! If anyone has a better answer for where this myth began, please let me know!

 

  3. Drinking alcohol kills your brain cells


If this were entirely true, then very few Freshers would ever manage to graduate from their undergraduate degrees (or indeed make it past Freshers’ week). It appears that in the short term, acute alcohol intake (i.e. binge drinking) will not kill your brain cells – it alters how brain cells communicate with each other, causes dehydration and reduces glucose metabolism (their use of energy), but these functions can be restored following a period of abstinence. A study in 1993 also failed to find any difference between the number of neurons in the brains of alcoholics and non-alcoholics.

However, there is some more recent evidence that chronic alcohol abuse may lead to neurodegeneration, although exactly how this happens isn’t completely understood – it may be a combination of cell death and dysfunction. Excessive consumption of alcohol over a long period of time can also indirectly lead to the death of brain cells: chronic alcoholism may cause severe vitamin B deficiency, which is the cause of Korsakoff’s syndrome, a form of dementia associated with memory loss and confusion.

As we have a surviving and intelligent graduate population, where did the belief that alcohol kills brain cells come from?

Most likely, this belief has quite simply arisen from the ridiculous behaviour, slowed cognition and terrible decisions exhibited by drunk people, combined with the agony of a really bad hangover!

 

  4. Vaccinations cause Autism

NO. NO. I cannot stress this enough – THIS IS NOT TRUE. When deciding which myths I was going to include in this post, I chose to tackle this one as its propagation has been so damaging to children’s health – in fact it has been described as ‘the most damaging medical hoax of the last 100 years.’ 

There have been (and still are) several theories put forward to explain why the MMR (measles, mumps and rubella) vaccination would cause Autism – these have included the overwhelming of an infant’s immune system, the inclusion of toxic ingredients in the vaccine, and the vaccine damaging the intestinal walls, allowing infection by disease-causing proteins. All of these have subsequently been tested, and none of them are true.

There have now been multiple studies that have demonstrated no causal link between the MMR vaccination and the development of Autism or other Autism spectrum disorders. This paper by Gerber & Offit (2009) summarises the work and reviews the evidence, and an analysis released this year re-states the fact that there is no association between the MMR and Autism.

But if there is absolutely no link between Autism and the MMR, why do so many people believe it, and why is it still going strong?

In contrast to the other three myths I write about here, there is one very definite source for this belief. It started with a fraudulent paper published in a medical journal in 1998 by author Andrew Wakefield – he noticed that a few children coincidentally exhibited some autistic-like behaviour shortly after receiving the MMR vaccination, and that they also had intestinal troubles. However, only 12 children were studied, and behavioural symptoms of Autism tend to appear around the same age as the MMR is typically given, so it is no surprise that these two events appeared close together in these children. Furthermore, there were multiple issues with the work – in 2010 the paper was fully retracted, and in 2011 a description of the fraudulent activity was published in the British Medical Journal.

But why do people still believe that there is a link? Parents are understandably very protective of their children and do not want to cause them any harm, and the doubt and fear over vaccines stirred up by the original paper is still enough to stop some parents from getting their child vaccinated – in these cases, it may be a lack of information about where the risks and dangers really lie that affects their decision. There was also a strong anti-vaccination movement that existed prior to the fraudulent paper, and the findings from the paper have been used as ‘evidence’ for that movement and have strengthened its position, despite evidence to the contrary.

 

Are there any other myths about the brain that you are interested in knowing more about? Is there something you’ve heard in the office that you think might not be true? Just ask in the comments section below!

 

The Biocheminist

 

References & for more, see:               

10% myth:

http://www.scientificamerican.com/article/do-people-only-use-10-percent-of-their-brains/

http://web.archive.org/web/20060402235936/http://brainconnection.com/topics/?main=fa/brain-myth

 

Fish myth:

http://spot.humaneresearch.org/content/use-conditioned-place-preferenceavoidance-tests-assess-affective-states-fish

http://link.springer.com.abc.cardiff.ac.uk/article/10.1007/s10071-007-0103-6

http://picovolt.com/ava/fish/music-carp.pdf

http://mentalfloss.com/article/24763/do-fish-really-have-three-second-memory

 

Alcohol myth:

http://www.independent.co.uk/news/uk/doubts-on-alcohol-link-to-dead-cells-1503669.html

http://science.howstuffworks.com/life/inside-the-mind/human-brain/10-brain-myths9.htm

http://gizmodo.com/drinking-alcohol-doesnt-actually-kill-brain-cells-1498785941

 

Vaccine myth:

http://www.ncbi.nlm.nih.gov/pubmed/21917556

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2908388/?log$..

http://www.sciencedirect.com/science/article/pii/S0264410X14006367

http://www.bmj.com/content/342/bmj.c7452.full

 

 

 

Why don’t we take mental health seriously?

Mental health and psychological well-being do not, and never have, received the same level of consideration and respect (not to mention funding) as physical health and fitness. Why?

Is it because there is still a strong stigma attached to having a mental health disorder? Do we not consider neurological diseases to be as ‘real’?

Regardless of the reasons, these damaging attitudes directly impact upon how and if people choose to seek help, making negative opinions and attitudes about mental health potentially dangerous to many individuals.

As up to 1/4 of us may experience some kind of mental health problem within our lifetimes, can we really afford to give it less attention and take it less seriously than our physical health?

And where do our negative attitudes come from? Why isn’t mental health considered as serious or as important as physical health?

I believe that one powerful aspect is the way that we refer to mental health problems in our everyday language. Many people will have used the phrase ‘I’m so depressed today’ when they’ve had a bad day, a break-up or are feeling a bit low. It’s a word that has lost its real impact within society. So when someone is actually suffering from depression, our understanding of how this might feel for them is derived from our day-to-day experience of the word – that is to say, feeling a bit sad and crappy – rather than from knowledge of what depression really is. To give perhaps the biggest understatement of the year, depression is not just feeling a bit sad and crappy. It is a severe and debilitating condition with a neurological cause, and can be treated with behavioural therapies and medication. A similar thing happens in reference to OCD (obsessive compulsive disorder) – I’ve known many people to say ‘Oh my gosh, I’m just so OCD!’ because they like to have their shoes in a neat line, or their desk tidy. As is the case for depression, this vastly underestimates the clinical condition, in which an individual may spend several hours arranging those shoes because of the crippling anxiety that something bad will happen if they get it wrong.


The media is also often less than helpful.

The way mental health is addressed in the media is often inaccurate or sensationalised, or it is just completely ignored, depending on the story being run. For example, the media has had a particularly damaging effect on how people with schizophrenia are perceived; schizophrenia is an incredibly complex disorder with a huge spectrum of associated symptoms, ranging from hallucinations and involuntary muscle movements to complete loss of emotion and movement. However, schizophrenia tends only to be mentioned in the media when a violent crime has been committed by someone suffering from the disorder. While in rare cases the disorder may result in violent behaviour, the emphasis placed on the disorder when reporting the crime gives the impression that schizophrenic = murderer. And why wouldn’t it? When is schizophrenia ever discussed otherwise? There is an interesting article explaining the association between schizophrenia and murder here, which puts things in perspective when they state “you would be just as likely to be murdered by the American who lives next door, as by the man with schizophrenia opposite and 3.5 times more likely be murdered by the Russian chap who lives down the road.”

We aren’t mind readers, and so the experience of mental health issues is likely to be a mystery to most of us.

We have all felt physical pain – perhaps a strained muscle, a stomach ache or a broken bone – and therefore understand the agony and severity of someone else suffering from these ailments. Trouble with mental health, on the other hand, can be a much more personal, isolating experience that is difficult to communicate, and isn’t always obvious to other people. Because of this, we also tend to forget that neurological diseases have a biological cause – there may be cell death, unbalanced signalling and disordered cell communication, to name just a few potential problems. Scientists are dedicated to discovering and understanding the genes and processes that are responsible for mental health problems.

Mental health problems are therefore not something that someone chooses to experience, or something they can just ‘snap out of’ – in the same way that someone who breaks their leg can’t just ‘run it off.’

The stigma attached to mental health disorders, resulting from these attitudes and a lack of understanding, leads those who suffer to feel ashamed, which leads to secrecy, which leads to further stigma and misunderstanding. Individuals will refuse to seek help, or will be totally unaware that help is available.

So let’s compare the current attitudes to mental health with a physical problem:

If a woman were to feel a lump in her breast and suspect she may have developed breast cancer, her reaction would be to visit a GP and get the problem treated. She would be far less likely to ‘see how it goes,’ ‘try to snap out of it,’ or just forget about it. She would be unlikely to decline medication because she thinks she can sort it out by herself, or because she worries what her friends would think, or fears that she would lose her job.

However, these are exactly the responses that someone who fears they may be suffering from depression, OCD, anxiety or schizophrenia – all of which can destroy lives as much as cancer can – is more likely to have.

Put this way, doesn’t it seem ridiculous that we as a society don’t discuss mental health more openly and give it the same attention, respect and care as physical health?

 

How do you think mental health problems are treated in society? What are your attitudes to mental health? Have you experienced any discrimination because of a mental health problem? And importantly, how can we fix it? Comment below!

The Biocheminist

Using Animals for Neuroscience Research

It’s a very touchy subject for some people, and as it was reported last week that the number of animal experiments being carried out has increased this year, I decided I would try to discuss some of the reasons behind using animals in research.

I’m not an expert in animal research, but I have some experience of carrying out animal experiments, have worked in the animal labs and work in a department that is actively engaged in using animal models of disease. So, unsurprisingly, I am not against the use of animals in scientific research (cosmetic and household testing is a completely different matter that I won’t be getting into here). However, I recognise, accept and agree that it is an uncomfortable idea. Pair that discomfort with the apparent secrecy surrounding many scientific research labs, and it’s easy to see why animal research, and the people who conduct it, are treated with suspicion.

If they are hiding something, then there must be something awful to hide, right?

From my experience, and from knowing the people who work with animals in research, this is really quite far from the truth. The animals most commonly used in research are, as everyone knows, rats and mice. This is because they are relatively small mammals that are capable of learning, and have some similarities to human biology. The animals in the labs are cared for round the clock by qualified technicians, many of whom used to be veterinary nurses, and a vet is always on call and makes regular visits. On top of this, the use of animals in the UK is under the strictest regulations in the world to ensure the highest level of animal welfare, and regular visits by inspectors ensure that this is the case.


But, while the animals are very well cared for, they are ultimately used for research purposes, and that’s the uncomfortable bit. Within neuroscience research, this may include deliberate injuries to the brain (carried out under strict and sterile surgical conditions), treatment with new or experimental drugs, or may consist purely of ‘behavioural tests’ – where the animal learns associations between signals and a reward (think Pavlov and his dogs) or where to go in a maze to get some food. In addition, the majority of mice used in research have been altered or bred to have a particular genetic mutation that mimics the mutations found in human diseases. This creates an ‘animal model’ of a disease, so that how a disease progresses over time can be investigated, or new experimental treatments for symptoms can be tested.

Without prior knowledge of the importance and usefulness of these experiments, or the details of how the animals are treated and cared for during the process, it’s easy to think that this is simply cruelty, carried out by evil scientists who are merely satisfying their curiosity.

This opinion may arise from a lack of information, and from details of outdated studies. Before writing this post, I took a look at some popular anti-vivisectionist (anti-animal testing) websites to see the other side of the argument. While much of the information given on these websites was incorrect or outdated (particularly with regard to housing conditions – on one webpage, all the references were at least 14 years old), there is one important argument that I wanted to address.

The most common argument is that it is pointless to use animals because they do not reflect the human condition, so anything that is carried out in mice is unlikely to work in humans and is a waste of time.

Well, there are two sides to this argument.

Yes, obviously humans are different to rats and mice. Their brains are different, and many drugs that have shown promising results in animals have failed in human clinical testing. But not all drugs have. Animal models have allowed us to uncover an abundance of essential information about many neurological diseases that couldn’t have been identified in humans, and have pushed forward our understanding of disease and improved research in the process. They are an essential part of investigating the effects of genetic manipulations, as well as the basic biology of disease, and are a starting point for developing future life-saving treatments.

But, that being said, I don’t think there is a single scientist who would say that animal models are perfect.

So why do we have to use animals?

Well, a big reason is the lack of an alternative that can provide us with as much information. While it is argued that human cell models, including stem cells, are more relevant to medical research (they are human, after all), the problem – particularly with the brain and in neuroscience – is one of connectivity. The brain is made up of many different cell types, all of which are responsible for different functions. But they are all connected and they communicate with each other, which means that they can affect and alter each other. This complexity has not been replicated in a petri dish of cells, but it is present in an animal brain. Something that may work in one cell model won’t necessarily work in a different cell model, never mind in a full working brain – whether it is human or not. Even if we could one day grow an entire functioning brain just from cells in the lab, would it have consciousness? And if it was a conscious human brain, then experimenting on it becomes ethically questionable (at the very least!).


Cells also can’t tell us the behaviour that may result from an experiment – for example, a drug to improve the symptoms of Alzheimer’s disease may have the expected biochemical effects in a cell model, but does it actually improve memory? And are there any side effects? Another drug predicted to treat Parkinson’s disease may not change the cell as anticipated, but would it still give back some movement control?

The usefulness of animal models is frequently discussed within the scientific community, and the general consensus tends to be that although they are not the perfect solution, animal models need to remain an integral part of scientific research until appropriate and accurate alternatives can be developed. Unfortunately, we aren’t there yet. In the meantime, improved communication between scientists and the public may ease some of the tension and suspicion about what goes on in animal labs, and the fear of being targeted or attacked for working in animal research. It may or may not change any minds, but both sides will at least be better informed.

What are your views on scientific animal research? Is there anything you wish you knew more about? I’d love to hear your comments!

 

The Biocheminist

Alzheimer’s Disease – Explained!

Alzheimer’s Disease (named after Alois Alzheimer, who first described the disease in a lecture in 1906) is the most common form of dementia. But what is dementia? ‘Dementia’ describes a collection of symptoms, including difficulty with problem solving, thinking and language, but it is most often associated with memory loss. There are many different types of dementia, but as Alzheimer’s Disease accounts for 50-60% of all cases, it is arguably the biggest problem and commands the most attention (and research funding!).

Alzheimer’s Disease is a growing problem because of our aging population. The symptoms of Alzheimer’s Disease tend to develop after 65 years of age – and we didn’t use to live that long! With better medical care and individuals living way past their 80th birthdays, the number of people who are old enough to develop Alzheimer’s Disease is growing. It’s an awful and distressing condition, both for those suffering from it and for those caring for afflicted family members or friends.

Most people have heard of Alzheimer’s Disease, and most people know it causes memory loss. But what actually causes Alzheimer’s Disease? What is it??

In the Brain

In Alzheimer’s Disease, cells in the part of the brain called the ‘Hippocampus’ begin to die. The Hippocampus is so called because the person who identified it thought it looked like a sea horse (in Greek, ‘hippo’ = horse, ‘kampus’ = sea monster). They were wrong, but the name stuck. The hippocampus is responsible for forming our memories, so damage to this area explains the memory loss in Alzheimer’s disease. As the disease progresses, cell death spreads to other parts of the brain, and can affect other functions such as problem solving and language.


So why do the cells die?

There appears to be an accumulation of two different proteins within the Alzheimer’s Disease brain. These are ‘Beta-Amyloid,’ which clumps together to form ‘plaques,’ and ‘Tau,’ which forms ‘tangles.’ However, the association between these and Alzheimer’s Disease is not completely clear. Many people develop plaques and tangles in old age, but have no memory problems. On the other hand, some individuals with very severe Alzheimer’s Disease may not have very severe plaques and tangles. Nevertheless, the presence of a lot of Beta-Amyloid and Tau in brain cells interferes with how cells normally function and communicate with each other. If brain cells can no longer communicate with each other, memories can no longer be formed and the cells will die. Attempts are therefore being made to reduce Beta-Amyloid plaques and Tau tangles as part of Alzheimer’s Disease research.

Genes

So there are plaques and tangles in the Alzheimer’s Disease brain, and brain cells die, but why?

I described in a previous post what genes are and why they are important to research (https://thebiocheminist.wordpress.com/2014/06/23/why-do-scientists-want-my-dna/). Genes act like templates for cells to make proteins, and these proteins can also control the function of other proteins. Therefore changes in genes may underlie the differences we see in Beta-Amyloid and Tau proteins.

Alzheimer’s Disease is not caused by a change in a single gene. It seems as though changes in several different genes all contribute to the increased risk of developing Alzheimer’s Disease, but by no means does carrying changes in these genes mean that you will definitely develop Alzheimer’s Disease.

However, the genes that have so far been associated with Alzheimer’s Disease are:

– APP (‘Amyloid Precursor Protein’) – helps brain cells communicate with each other. Changes in APP therefore mean that brain cells communicate less effectively.

– Presenilins 1 & 2 – break down proteins in cells that are no longer needed, including Beta-Amyloid. If Beta-Amyloid is not effectively disposed of, it may form plaques.

– APOE (‘Apolipoprotein E’) – transports fats around cells, and also helps break down Beta-Amyloid. Some versions of APOE are less efficient at breaking down Beta-Amyloid.

Environment

So if Alzheimer’s Disease is not entirely down to our genes, then our environment and lifestyles are also likely to play a role. However, bear in mind that, like genes, environmental and lifestyle factors have only been associated with an increased risk of developing Alzheimer’s Disease; they are not in themselves a cause.

Obesity in mid-life is associated with an increased risk of developing Alzheimer’s Disease, as is sleep deprivation, although neither seems to affect the levels of Beta-Amyloid or Tau found in the brain. On the other hand, physical exercise may reduce the levels of Tau in the brain, and so may lower the risk of Alzheimer’s Disease. And good news for coffee drinkers! – regular caffeine intake might reduce the risk of developing Alzheimer’s Disease and delay when symptoms develop.

However, it is difficult to prove whether environmental and lifestyle factors actually make a difference to the development of Alzheimer’s Disease, as such studies often rely on questionnaires or surveys and then look for a correlation between the answers and the development of the disease. Many studies have also only been carried out in rats or mice so far, and their findings might not prove as important in humans. So all of the things commonly reported in the media as ‘causing’ or ‘curing’ Alzheimer’s Disease (and/or cancer, usually!) should be taken with a pinch of salt.

But there is hope!

There have not been any recent major breakthroughs in developing a treatment for Alzheimer’s Disease, but this reflects the complexity of the problem, not the quality of the work or the effort being put into finding a cure. There are also several options available for slowing the progression of the disease, so it is definitely worth raising any concerns you may have about yourself or a loved one with a GP. Below are links to some fantastic charities and associations that can provide more information on Alzheimer’s Disease and dementia, or that you can donate to should you want to help fund the research effort (I am nodding enthusiastically!).

The Biocheminist

 

http://www.alzheimers.org.uk

http://www.alz.co.uk

http://www.alzheimersresearchuk.org

Why do scientists want my DNA?

I came across an article on the BBC news website this week discussing the pressure some individuals in Iceland are under to donate samples of their DNA to the private genetic research company deCODE (http://www.bbc.co.uk/news/magazine-27903831). The UK has an equivalent charitable organisation that collects several different types of biological samples from individuals over time: the UK Biobank (links to both deCODE’s and UK Biobank’s webpages are at the end of this post).

So why do scientists want your DNA, and what’s so special about the Icelandic variety?


DNA (or deoxyribonucleic acid, if you want to impress your friends) acts like the instruction manual for the cells in your body, and therefore dictates how they behave, grow and function. Different cell types in your body (for example, a skin cell vs. a heart cell) ‘read’ from different ‘chapters’ of this manual, depending on what their job is (e.g. to produce skin pigment vs. rhythmically contract to make a heartbeat).

Keeping with the instruction manual metaphor …

DNA (the manual) is made up of genes (individual instructions). Every individual person has a different version of the manual, due to having slightly different versions of the instructions. The end result is ultimately the same (a human being!), but with a few variations. Some of these variations are obvious – such as hair or eye colour, height or nose shape. Other variations are less obvious, and may make a person more susceptible, or resistant to developing a particular disease or illness.

There may also be a typo in some of the instructions. Sometimes these don’t matter – the instruction can still be read and has exactly the same result as if there were no typo at all. In other cases, the typo changes the meaning of the instruction, so the cell doesn’t quite work properly, which may contribute to a disease. These typos are referred to as ‘Single Nucleotide Polymorphisms’ (or SNPs).
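
To make the ‘typo’ idea a little more concrete, here is a tiny, purely illustrative Python sketch. The three-letter ‘words’ are a hand-picked fragment of the real genetic code, but nothing here is analysis code a geneticist would actually run:

# A purely illustrative 'typo' example - NOT real analysis code.
# A tiny, hand-picked fragment of the genetic code: three-letter DNA 'words'
# and the protein building block each one stands for.
CODON_MEANINGS = {
    "GAG": "Glutamate",
    "GAA": "Glutamate",  # different spelling, same meaning
    "GTG": "Valine",     # one letter different, different meaning
}

def read_instruction(codon):
    """'Read' a three-letter piece of an instruction into its building block."""
    return CODON_MEANINGS.get(codon, "unknown")

print(read_instruction("GAG"))  # Glutamate - the original instruction
print(read_instruction("GAA"))  # Glutamate - a typo that doesn't change the meaning
print(read_instruction("GTG"))  # Valine    - a typo that changes the meaning

The second kind of typo is the one that can matter: in real DNA, a single-letter change of exactly this sort (swapping a Glutamate for a Valine in the gene for haemoglobin) is what causes sickle cell disease.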

In genetic research, the particular version of an instruction/gene you have is called a ‘Gene Variant.’

It is rarely the case that one particular gene variant causes a disease 100% of the time. It is more common for several different gene variants to be ‘associated’ with a disease – that is to say, if you carry one or more of those variants, you are more likely to get that particular disease. But whether you actually do depends on many other things, such as your other genes and your diet and lifestyle.

But finding these ‘risk’ gene variants isn’t very straightforward.


And that’s why geneticists (gene scientists) want your DNA!!

It is now possible to scan the whole of an individual’s DNA (speed-read the manual, if you want to keep the metaphor) relatively quickly, and this method is being used to find the genetic variants that might increase the risk of disease. However, as well as these important risk gene variants, human DNA is full of useless noise! Because a particular variant won’t ‘cause’ a disease 100% of the time, DNA from huge numbers of individuals needs to be read to sift out the important stuff from the noise.
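
To give a rough flavour of that sifting, here is a small, hypothetical Python sketch with entirely made-up numbers – real studies use proper statistical tests, corrections for chance findings, and far larger samples:

# A rough sketch of the 'sifting', with completely made-up numbers.
def how_much_more_common(cases_with, cases_without, controls_with, controls_without):
    """How much more common a variant is in people with the disease
    than in people without it (a simple odds ratio)."""
    return (cases_with / cases_without) / (controls_with / controls_without)

# Invented counts for two variants, each checked in 2,000 people with a disease
# and 2,000 people without it.
print(how_much_more_common(300, 1700, 150, 1850))  # about 2.2  - worth a closer look
print(how_much_more_common(210, 1790, 200, 1800))  # about 1.06 - probably just noise

A variant that turns up noticeably more often in the disease group is a candidate ‘risk’ variant – but with thousands of variants and plenty of chance fluctuation, you need an awful lot of DNA to tell the signal from the noise.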

So this takes us back to Iceland –

Genetic ‘noise’ can come from mixtures of different ethnicities and backgrounds; populations from different parts of the world typically have different genetic ‘signatures,’ which can contribute to differences in disease rates as well as to more obvious features such as appearance. With immigration and international mobility, it is difficult to find a large group of people with genetic backgrounds similar enough to reduce the noise and make the risk gene variants easier to find. Iceland, with its small and historically isolated population, is exactly such a group – and that is what makes Icelandic DNA particularly useful for research.

So what is the point of finding risk variants if they don’t necessarily cause the disease?

Well, the eventual aim is to identify the combinations of risk variants and lifestyle factors that can accurately predict whether someone will develop a particular disease. A recent high-profile example of the usefulness of this knowledge is Angelina Jolie’s choice to have a mastectomy to reduce her chances of developing breast cancer, because she carries a variant of the BRCA1 gene that greatly increases the likelihood of developing the disease. Such interventions will be more complicated for neuroscience (removing the brain isn’t really an option), but identifying important genes can also focus the research done in labs, making it as targeted and specific as possible and increasing the chances of finding a cure.

The ultimate (and very cool) goal is to be able to accurately predict the likelihood that any individual will develop a disease based on their genes and lifestyle, and to tailor a treatment to their own personal genetic makeup. Although I should point out that that’s a very long way off, and there is a whole minefield of ethical issues around whether people wish to share their DNA, how it will be stored, and whether people want to know if they will develop a disease or not… perhaps the subject of another post.
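
To give a flavour of what ‘combining genes and lifestyle’ might eventually look like, here is a deliberately simplistic, hypothetical sketch – every factor name and number below is invented, and real risk prediction is far more sophisticated than multiplying a few nudges together:

# A deliberately simplistic sketch of combining factors into one risk estimate.
# Every name and number below is invented.
RISK_NUDGES = {
    "risk_variant_A": 1.5,    # carrying this gene variant nudges risk up
    "risk_variant_B": 1.2,
    "mid_life_obesity": 1.4,  # lifestyle factors nudge it too
    "regular_exercise": 0.8,  # ...some of them downwards
}

def relative_risk(factors):
    """Start at the population average (1.0) and apply each nudge in turn."""
    risk = 1.0
    for factor in factors:
        risk *= RISK_NUDGES.get(factor, 1.0)  # unknown factors change nothing
    return risk

print(relative_risk(["risk_variant_A", "regular_exercise"]))  # 1.2 - slightly above average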

I encourage you to take a look at the UK Biobank website for more information and perhaps even consider getting involved in some research!

 

The Biocheminist

 

www.deCODE.com

www.UKBiobank.ac.uk

 

So you’re a neuroscientist? Oh….

Ah! A phrase I hear quite often. On more than one occasion it’s been misheard as ‘Euroscientist’ (“Into politics are you?”) and even ‘Urinescientist’ (“Eurgh! Why??”).

However, the title of this post accurately describes the beginning – and end – of the majority of conversations I have had with strangers, acquaintances and, occasionally, family members. It’s an unusual, unfamiliar subject that can feel pretty unapproachable … therefore it’s intimidating. I understand that. People don’t seem to know how to respond to it; there’s no frame of reference, nothing to say.

And I believe that the fault lies entirely with Science.

It’s not exactly accessible – it’s a career only open to those with undergraduate degrees who then complete a PhD, perhaps with a Master’s degree in between. And even then, only those with significant amounts of luck and/or talent will go on to have successful careers.

It’s also not obvious what these academic types do all day, holed up in their universities, providing the occasional sound bite for documentaries, an interesting news story or their expert analysis on that day’s episode of Big Brother.

And on top of the mystery of what we do all day, the media seems to report both a new cure and a new cause for Alzheimer’s disease or cancer every week! Of course that’s going to raise suspicion, and that suspicion breeds mistrust of where the information came from – and that information came from scientists!

Therefore, scientists cannot be trusted!


Well, that’s not exactly true – the real information is often exaggerated and sensationalised to make it more interesting. No one wants to hear ‘cautious optimism’ – they want results! Cures!

So, given all of these misunderstandings, as well as cuts to essential funding and growing controversies over research methods (animal testing and stem cells, to name just two of the big guns), only recently has it been acknowledged that perhaps, perhaps, Science should reach out and interact with the public to explain what we do and why it’s worth it. There is currently a huge drive for ‘Public Engagement’ in UK universities – a push to explain why what we do is worthwhile and how it benefits society, to encourage children to get interested and active in science and, importantly, to prove that we’re not all weird reclusive nerds!

I have started this blog because I want to explain to my friends and family what the hell it is that I do all day. I also want to share the bigger picture, dispel some of the myths and mystery about scientific research, as well as discuss its weaknesses and flaws. Even if you hated GCSE biology and chemistry, I reckon you’ll find some of it pretty awesome.

 

The Biocheminist

 

 

I’d love to hear your opinions on how you perceive science and scientific researchers – comment below! Also if you have any questions about any biology/chemistry/neuroscience-related stories that have been in the news or that you’ve wondered about – ask me in the comments box, and I’ll do my best to answer!

 

 
