Did you see the autopsy of the incredibly well-preserved woolly mammoth last week??
I particularly enjoyed watching a group of scientists totally nerding out over a 40,000-year-old frozen corpse – especially that one guy who was so darned determined to find a nice bit of mammoth poo! *Spoiler alert* – he finds some!
There wasn’t the same enthusiastic jumping-up-and-down excitement that followed the recent meteor landing, but I suppose there was more risk of a terrible mess.
During the programme, they mentioned a technique called ‘CRISPR’ (pronounced ‘crisper’ – as we always aim to have our chips, or ‘fries’ to Americans) as a way to change or edit elephant DNA to be more similar to woolly mammoth DNA. And it’s not just mammoth cloners who are excited about this technique – the whole of genetic and molecular science is silently and carefully jumping up and down about it.
That’s because it’s pretty cool. It’s exciting because it replaces older technologies that take a lot longer, cost a lot more and are much more complicated. CRISPR can therefore potentially reduce the time needed for gene editing from years down to months, at a fraction of the price. Although the actual practicalities of getting it to work are pretty fiddly, the general protocol is relatively straightforward.
So what is CRISPR?
CRISPR is not just a spelling mistake, it’s an acronym. It stands for ‘Clustered Regularly Interspaced Short Palindromic Repeats.’ Or put in simpler terms: short repeated sequences of DNA that read the same forwards and backwards, which tend to group together and have similar sized spaces between them. But ‘SRSODTRTSFABWTTGTAHSSSBT’ isn’t so pithy.
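(A small aside for the pedants: in molecular biology, a ‘palindromic’ DNA sequence isn’t quite a word-style palindrome – it’s a sequence that reads the same as its own reverse complement, because DNA strands pair A with T and C with G. Here’s a toy Python sketch of that check; the example sequences are just illustrations, though ‘GAATTC’ is a classic real palindromic site.)

```python
# Toy check for a biological palindrome: a DNA sequence that equals
# its own reverse complement (A pairs with T, C pairs with G).
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def reverse_complement(seq):
    return "".join(COMPLEMENT[base] for base in reversed(seq))

def is_palindromic(seq):
    return seq == reverse_complement(seq)

# 'GAATTC' (the EcoRI recognition site) is a well-known palindromic sequence:
print(is_palindromic("GAATTC"))  # True
print(is_palindromic("GATTTC"))  # False
```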
CRISPR DNA is found in bacteria and acts as their main form of defence against foreign DNA, such as from a virus. CRISPR RNA (crRNA) locates and attaches to foreign DNA with a complementary sequence of nucleotides (see Express Yourself! for more on this). Once the crRNA has identified the intruder, it signals to a special enzyme called ‘Cas9’, which can then cut (or ‘cleave’) the foreign DNA, leaving it inactive and harmless. This kind of enzyme is known as an endonuclease. The CRISPR/Cas9 system acts similarly to our immune system, in that it can remember previous infections in order to protect against future ones. The DNA sequence of an offending virus is assimilated into the CRISPR DNA so that it may be quickly identified, attacked and neutralised if it ever has the tenacity to attack again!
How is that going to bring back woolly mammoths? And why is it such a big deal?
Well, because molecular geneticists have been able to hijack this system so that it can cleave and edit the bits of DNA that they are interested in. The CRISPR/Cas9 system can be isolated from bacteria, and expressed in other cell types – such as elephant cells, or human brain cells. This means that scientists can fiddle about with different genes, and see how that changes the way cells function.
For example, imagine that a cell is expressing a gene associated with a particular disease. Stopping that gene from working – or ‘silencing’ it – may be an effective therapy or cure for that disease. To do this, scientists engineer a portion of the CRISPR DNA so that it recognises the gene they want to attack – in the same way that CRISPR would recognise previous viruses. This means that the crRNA will track down the target gene and signal to Cas9 to cut it, stopping it from working. Voila! The gene is silenced.
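That search-and-cut step can be sketched as a toy Python model. To be clear, this is a cartoon, not real bioinformatics: the sequences are made up, and the ‘cut in the middle of the target’ rule is a simplification (real Cas9 cuts at a defined position near the target).

```python
# A very simplified model of CRISPR targeting: the guide sequence finds
# its complementary match in a DNA strand, and 'Cas9' cuts at that site.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(seq):
    return "".join(COMPLEMENT[base] for base in seq)

def find_and_cut(genome, guide):
    """Return the two fragments left after the cut, or None if no match."""
    target = complement(guide)      # crRNA binds its complementary sequence
    position = genome.find(target)
    if position == -1:
        return None                 # no matching site: nothing gets cut
    cut_site = position + len(target) // 2
    return genome[:cut_site], genome[cut_site:]

# A hypothetical mini-'genome' containing the target site 'TTGGCC':
fragments = find_and_cut("AATTGGCCAA", guide="AACCGG")
print(fragments)  # ('AATTG', 'GCCAA')
```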
Or imagine it this way – a sniffer dog has been given the scent of a criminal they need to track down.
The scent acts as its guide to find the target, just as the DNA sequence from an interesting gene acts as a guide for the CRISPR system. The dog can then bring the criminal down, and CRISPR cuts the gene.
So how about gene editing rather than silencing? To do this, scientists hijack a different cellular function called ‘homologous recombination.’ Homologous recombination is a process by which a broken piece of DNA can be repaired using a near-identical fragment of DNA as a template. In order to edit genes, new DNA ‘templates’ are manufactured in the lab with the required edit, and then added to the cell. That means that once CRISPR/Cas9 cuts the DNA, there will be lots of near-identical templates available for cells to use to repair themselves – editing their gene in the process.
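Continuing the cartoon model from before: after the cut, the cell repairs the break from the lab-made template, which installs the edit. A toy sketch (all sequences here are invented for illustration):

```python
# Toy sketch of gene editing via homologous recombination: 'Cas9' breaks
# the target site, and the cell repairs the break using a lab-made
# template carrying the desired edit.
def edit_gene(genome, target, template):
    """Replace the target site with the edited template sequence."""
    position = genome.find(target)
    if position == -1:
        return genome                   # no target found: genome unchanged
    # The break is repaired from the template, installing the edit:
    return genome[:position] + template + genome[position + len(target):]

original = "AATTGGCCAA"
edited = edit_gene(original, target="TTGGCC", template="TTGACC")
print(edited)  # AATTGACCAA – one base changed, G -> A
```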
If we take this back to the sniffer dog example, the homologous recombination process would be like putting a skin graft on a nasty bite the dog inflicts on the criminal – it’s not exactly the same as the original skin, but it’s close enough for a repair. Perhaps while they’re at it, the police add an electronic ankle tag so they can easily find the ‘edited’ criminal – this can also be done in cells by adding a fluorescent tag to the CRISPR/Cas9 system so it’s easy to see which cells have or haven’t been edited down a microscope.
And that’s how they might genetically engineer a mammoth! Or treat particular diseases! Or prevent inherited genetic conditions! The range of possibilities is vast, but the technology is still very much in its infancy and needs a lot of fiddling with.
However – the quick, easy and reliable editing of the human genome as a method for treating and curing disease is currently molecular genetics’ enthusiastic hunt for a woolly mammoth turd. We just need to keep on digging.
It would be difficult to find anyone who hasn’t at least heard about, if not watched, the now viral New York street harassment video (if you haven’t seen it, you can watch it here).
It summarises an all too familiar experience that most women have faced at least once in their lives – and I mean MOST – as a staggering 98% of women surveyed in 2008 reported that they had experienced cat-calling and harassment. The video has caused an intense internet debate; as well as the majority outcry condemning the behaviour of the cat-callers and demands to change this all-too-common occurrence, there have also been more negative responses including the defence of the men involved and violent threats directed towards the subject of the video.
While a lot of the debate has centred on the acceptability and frequency of these behaviours, and how it can best be tackled, less attention has been given to the psychological effects of experiencing cat-calling and sexual harassment, and their impact on mental health.
So I did some digging.
While there is a wealth of scientific literature investigating the effects of sexual harassment at home or in the workplace on mental health, the investigation of the effects of street harassment or cat-calling (referred to in these studies as ‘stranger harassment’) is a relatively new development. This came as a surprise to me, as studies dating as far back as 1978 found that women felt unsafe in a variety of social contexts, and a Canadian study in 2000 identified that stranger harassment reduced feelings of safety to a greater degree than harassment by known acquaintances. To put it more simply, harassment by strangers makes women feel even less safe and more scared than harassment by a known individual at work or at home.
Sexual harassment has been associated with nausea, sleeplessness, anxiety and depression. However, the literature focuses on two main components that may affect mental health:
Arguably the main risk of stranger harassment to mental health is its effect as a chronic stressor – a stressor can be any environmental or external event that causes stress to an individual, which becomes chronic when it is experienced on multiple occasions over time. For example, an individual may receive one cat-call on their walk to work. In isolation, this could be an unpleasant and mildly stressful event, or may not have any bearing on that person’s day. However, should that experience of a mild stressor occur every day for months or years, then it becomes a chronic source of stress that can negatively impact mental health.
How does stress affect mental health?
One of the most studied outcomes of chronic stress is depression (which is also one of the reported outcomes of harassment). In fact, a popular mouse model of depression is called the ‘Chronic Unexpected Stress’ (CUS) model, which is created by exposing mice to… well… chronic unexpected stress. This includes social stress (such as overcrowding or isolation) and predatory stress (the scent or presence of a predator). It is such a popular model for depression because chronic psychological stress effectively and reliably causes anxiety and depression-like behaviours in these mice.
Predatory stress increased inflammation in several brain areas in these mice. Inflammation is the body’s response to threat, and in the short term it protects cells from harm; however, if inflammation is present for a long time, it can start to cause damage. Increased inflammation in the brain has been found in, and may exacerbate, Alzheimer’s disease and depression. Studies in humans have also identified damage to the structure and communication networks of the brain as a result of chronic stress, which can have a negative effect on learning, memory and mood.
So it isn’t really such a leap to imagine that the fear or threat felt following harassment, and the powerlessness over its occurrence could become a chronic stressor. It can also arguably be equated with the ‘predatory stress’ used in mice. In a study that focused on the workplace, an association between harassment and poor mental health was identified. Specifically, individuals who experienced sexual harassment early on in their careers were more likely to be depressed later in life. This was the case for both men and women.
Objectification is a societal issue that reaches beyond just cat-calling, but its role in stranger harassment has been investigated. The theory of self-objectification in the psychological literature says that when a person is sexually harassed by a stranger, they feel objectified. This causes ‘self-surveillance’ – they come to view themselves as the stranger views them: usually as a sexualised object, with their worth determined by how they feel they are viewed by others. In other words, they ‘self-objectify.’ Self-objectification has been found to have multiple negative effects on mental health, and has been associated with increased prevalence of eating disorders, depression and substance abuse.
However science hasn’t always been able to carry out this kind of study without bias and sexism.
Several studies that I have come across appear to lay responsibility for the effects of harassment on mental health and well-being on the women who have been targeted, rather than on the individuals who commit the harassment. After associating harassment and self-objectification with negative mental health and psychological consequences, it has been recommended that women should be educated in better coping strategies, so that they become more resilient to the inevitable objectifying experiences, as a way to prevent mental health problems. It is this attitude – that cat-calling/street harassment/stranger harassment is a ‘normal’ experience that should just be put up with – which has allowed it to remain a prevalent and distressing problem in society.
Despite cat-calling and street harassment having been identified as an issue for at least the past 14 years, there has been no reduction in the number of women experiencing it, and there has been very little attention given to the serious effects these experiences may have on mental health. The scientific community has not escaped without bias in this area, although it has identified the association between harassment, stress and depression, and recognised that there may be a substantial psychological effect of frequent harassment. As the role of harassment on mental health gains more attention, scientists are beginning to investigate more thoroughly; including the negative effects witnessing sexism has on bystanders and some investigation into why some men do it.
There is still a long way to go – both scientifically and socially. But with cat-calling and harassment carrying such strong risks to mental health, perhaps they should be considered as a psychological assault.
Love. A source of great joy and agonising pain (wait, didn’t I also say that about western blots…?). When we talk about love, we talk about the heart – love is heart-warming, losing a love is heart-breaking, you should enter relationships based on your heart, not with your head!
Nope! Sorry! I don’t want to break any hearts with this, but love is ALL in your head.
A lot of studies have been carried out where the brain has been scanned (or imaged) while individuals look at particular photos or carry out activities and tasks – this is called fMRI (functional magnetic resonance imaging). This method detects areas of the brain that are receiving higher blood flow, and are therefore likely to be more active. Using this method, it has been discovered that the areas of the brain responsible for regulating your temperature overlap with the areas of the brain associated with social warmth, defined in the study as the feeling of being loved and connected to other people (Inagaki & Eisenberger 2013). These areas were the Ventral Striatum and the Middle Insula (see pictures below for an idea of where these are). The association between physical temperature and feelings of love was so strong that when people in the study held a warm object, they reported stronger feelings of social warmth, and those who read meaningful and loving messages from friends and family reported the room as feeling warmer.
Here is a brain – my brain – with rough areas associated with love & reward drawn on. Left image from the side, right image from above.
Why would this be? How is that useful?
The authors suggest that it could be learnt from birth – many behaviours used to soothe a baby and show it love, such as rocking and being held, occur in close proximity to another person and subsequently cause a rise in temperature. We therefore learn that warmth is associated with being loved and cared for. And no one can deny that a warm hug (or Welsh ‘cwtch’) from a loved one feels pretty darn good!
The same brain areas identified in that study have also shown greater activation when people rate themselves as close to their romantic partner, and this was associated with longer relationship length. In fact, a lot of brain areas have been linked to feelings of romantic love, and many of these, such as the Hippocampus and Nucleus Accumbens (see previous picture!), are part of the reward system in the brain. The reward system is the network in our brains that makes us feel pleasure and happiness (a reward), often in response to a particular event or behaviour. Activation of this system makes us try to repeat the action that led to its activation in the first place, therefore resulting in another reward feeling – if spending time with a particular person activates our reward system, then we strive to see them again.
Dopamine is the signalling molecule that works within this reward system.
Prairie voles (super cute voles from North America) are the most frequently studied animal in the neurobiology of love – this is because they form monogamous relationships. Voles that had more receptors for dopamine (parts of the cell that are able to detect the presence of dopamine, allowing the cells to respond to it) showed more monogamous behaviour. This suggests that increased activity in the brain’s reward system may improve the longevity and fidelity of individuals in a relationship, because being with their partner feels particularly rewarding.
Oxytocin has the reputation of being The Love Drug.
Oxytocin is a neuropeptide – which means it is a molecule that is used by brain cells to communicate with each other, although oxytocin is also capable of working as a hormone around the body. Oxytocin is well known as being associated with pregnancy and lactation, but its effects are much broader than that! It can also stimulate social behaviour, such as increasing trust and empathy. Looking back to those adorable voles, monogamous animals had more oxytocin receptors in the Frontal Cortex, Nucleus Accumbens and Striatum, which are the same areas that show increased activity in humans when shown a picture of their partner. A release of oxytocin in the brain during mating was essential for the important bonding to a partner in voles.
Enough of voles – in humans, oxytocin is increased by hugs, social support, massages and orgasm.
In fact, when heterosexual male subjects were given oxytocin intranasally (up their nose!), they rated their partner’s face as more attractive than other women’s faces, and showed increased activation of their brain reward systems. The author stated that oxytocin could ‘improve the reward value’ of the subject’s partners…which is oh so romantic(!) A sniff of oxytocin in females improved their ability to determine the emotion felt by another person when just shown the eye region of their face – this is called the ‘reading the mind in the eyes test,’ or the slightly snappier ‘RMET‘.
As well as oxytocin, multiple other hormones have been implicated in the neurobiology of love – testosterone, cortisol and dopamine have all been identified as contributing to either the longevity or demise of romantic relationships. Cortisol is a hormone associated with the stress response: high levels in couples during an argument were associated with increased hostility and relationship breakup, particularly if levels were high in both individuals. High levels of oxytocin, on the other hand, were associated with increased empathy. High levels of testosterone are associated with competitiveness rather than stability and trust – testosterone is much higher in single men than in men in relationships, who no longer need to compete with other males for a partner.
You can be ‘Crazy in Love’ – Beyoncé was right!
The early stage of a new relationship is considered to be a separate phase that creates different and unique responses in the brain. There is a dramatic increase in the love drug, oxytocin, which in turn increases the activation of dopamine-related brain areas. When these areas are so strongly activated, large areas of the cortex experience a reduction in activity, which means that we lose our ability for rational judgement – which is an effect many of us may have observed in our friends in a new relationship! The activation of the dopamine reward system may also make us temporarily ‘addicted’ to our new beau, as their ‘reward value’ is through the roof! It is thought that this early and temporary addiction serves the purpose of keeping us around that person for long enough to form a meaningful attachment.
So yes, it might all be in your head and love might make you crazy, but it’s also a real biological phenomenon. And don’t forget to be romantic – let your significant other know that they have a high reward value, then give ‘em a cwtch.
Morgan Freeman has been really getting on my nerves this week. I’ll be sitting in my lounge going about my evening, and suddenly his smooth baritone voice will interrupt my thoughts/dinner/internet browsing with the statement:
‘It is estimated that us human beings only use 10% of our brain’s capacity. Imagine if we could access 100%!’
I start internally screaming. The belief that we only use 10% of our brains is one of the most pervasive and prevalent neuroscience myths, but it’s just not true! Of course, I understand that this statement is part of a Hollywood film script and will typically be considered as fiction, but it has prompted me to address just a few of the most common misconceptions about the brain for this week’s post:
- We only use 10% of our brains.
Nope. No. Nu-uh. Not true. We use the whole 100% of our brain. If we used only 10%, then we’d perhaps expect injuries caused by physical trauma, stroke or disease to have little or no effect, unless they hit the ‘functional’ 10%. In reality, loss or damage to even small areas of the brain can greatly, sometimes devastatingly, affect a person’s life and how they function. From an evolutionary point of view, it would make no sense for us to develop such large and complicated organs in our heads if the majority of it is useless. Modern imaging techniques have also confirmed that there is activity throughout the entire brain, even while we sleep.
So if there isn’t even a grain of truth to this statement, where did it come from, and why do people still believe it?
I haven’t been able to find a single definitive origin for this belief. However, it may have resulted from very early experiments in animals, in which it was found that simple tasks could still be completed after damaging pretty large areas of their brains, leading some to believe that there was a lot of redundant stuff in our heads. Alternatively, it has also been claimed that the statement ‘We are making use of only a small part of our possible mental and physical resources’ by prominent psychologist William James in 1908 may have initiated the idea that we are not making full use of our brains. Misinterpretation and misquoting over the years gradually morphed into the ‘10%’ myth we know (and loathe) today!
But why do we still believe it? Well it seems to have become ingrained in popular culture – it is repeated so commonly and mentioned so frequently in passing (à la Morgan Freeman), as well as in advertisements for self-improvement (e.g. brain-training) that it may simply be accepted as true.
- Fish have a three-second memory
Again, no! Fish are more complicated and highly developed creatures than previously thought. A 2009 study demonstrated that not only could fish associate a specific noise with feeding time (in the same way as Pavlov’s famous bell = food experiment in dogs), but that this association was remembered three months later. Another study has demonstrated that fish may have emotional states: they can learn that a particular environment is associated with a food reward, and another with being chased by a net – when given the choice, fish spent more time in the environment with the food reward, and avoided the scary side! Even more astonishingly, fish have been found to be able to learn the difference between blues and classical music, and can even classify music they have never heard before into one of these two categories. And these aren’t the only examples of fantastic fish memory – the internet is full of different experiments and studies about fish learning and memory – take a look!
But if fish are so clever, where did the three-second myth come from?
There doesn’t appear to be a definitive source for this myth, but it has been suggested that it may have arisen to justify the small bowls that goldfish are commonly kept in – if they have such short memories, then they can’t get bored, and there’s no need to feel guilty! If anyone has a better answer for where this myth began, please let me know!
- Drinking alcohol kills your brain cells
If this were entirely true, then there would be very few Freshers who would ever manage to graduate from their undergraduate degrees (or indeed make it past Freshers’ week). It appears that in the short term, acute alcohol intake (i.e. binge drinking) will not kill your brain cells – it alters how brain cells communicate with each other, causes dehydration and reduces glucose metabolism (their use of energy), but these functions can be restored following a period of abstinence. A study in 1993 also failed to find any difference between the number of neurons in the brains of alcoholics and non-alcoholics.
However, there is some more recent evidence that chronic alcohol abuse may lead to neurodegeneration, although exactly how this happens isn’t completely understood – it may be a combination of cell death and dysfunction. What’s more, excessive consumption of alcohol over a long period of time can indirectly lead to the death of brain cells: chronic alcoholism may cause severe vitamin B1 (thiamine) deficiency, which is the cause of Korsakoff’s disease, a form of dementia associated with memory loss and confusion.
As we have a surviving and intelligent graduate population, where did the belief that alcohol kills brain cells come from?
Most likely, this belief has quite simply arisen from the ridiculous behaviour, slowed cognition and terrible decisions exhibited by drunk people, combined with the agony of a really bad hangover!
- Vaccinations cause Autism
NO. NO. I cannot stress this enough – THIS IS NOT TRUE. When deciding which myths I was going to include in this post, I chose to tackle this one as its propagation has been so damaging to children’s health – in fact it has been described as ‘the most damaging medical hoax of the last 100 years.’
There have been (and still are) several theories put forward to describe why the MMR (measles, mumps and rubella) vaccination would cause Autism – these have included the overwhelming of an infant’s immune system, the inclusion of toxic ingredients in the vaccine, and the vaccine causing damage to the intestinal walls, allowing harmful proteins to enter the body. All of these have subsequently been tested, and none of them are true.
There have now been multiple studies that have demonstrated no causal link between the MMR vaccination and the development of Autism or other Autism spectrum disorders. This paper by Gerber & Offit (2009) summarises the work and reviews the evidence, and an analysis released this year re-states the fact that there is no association between the MMR and Autism.
But if there is absolutely no link between Autism and the MMR, why do so many people believe it, and why is it still going strong?
In contrast to the other three myths I write about here, there is one very definite source for this belief. It started with a fraudulent paper published in a medical journal in 1998 by author Andrew Wakefield – he noticed that a few children coincidentally exhibited some autistic-like behaviour shortly after receiving the MMR vaccination, and that they also had intestinal troubles. However, only 12 children were studied, and behavioural symptoms of Autism tend to appear around the same age as the MMR is typically given, so it is no surprise that these two events appeared close together in these children. Furthermore, there were multiple issues with the work – in 2010 the paper was fully retracted, and in 2011 a description of the fraudulent activity was published in the British Medical Journal.
But why do people still believe that there is a link? Parents are understandably very protective of their children and do not want to cause them any harm, and the doubt and fear over vaccines that occurred in response to the original paper is still enough to stop some parents from getting their child vaccinated – in these cases, it may be a lack of information about where the risks and dangers really lie that affects their decision. There is also a strong anti-vaccination movement that existed prior to the fraudulent paper, so the findings from this paper have been used as ‘evidence’ for that movement and strengthened their position, despite evidence to the contrary.
Are there any other myths about the brain that you are interested in knowing more about? Is there something you’ve heard in the office that you think might not be true? Just ask in the comments section below!
I have a brand new niece. Two, actually – although the other one has a full year’s life experience over the freshest one. Not having ever ‘known’ any other babies before, my two nieces have provided an amusing (and responsibility-free!) experience in human development and learning. As I specialise in the degeneration of the brain and when it goes wrong, it’s been pretty fun and interesting to see those two little girls develop and begin to understand the world.
So I thought I’d write a post about one of the things I’ve recently learned about baby development – they LOVE looking at black and white things. This is apparently a very well-known fact to new parents, but why do babies like looking at black and white things so much, and how does it affect their brain?
I didn’t know, so I have tried to find out!
A LOT is known about vision and the development of the visual system (by other people, not by me), so I have tried to really trim it down to specifically answer just the black and white question…
First things first…
At the back of your eye is your retina, which is the area that detects light and conveys information about that light down the optic nerve, which connects your eyes to your brain. The information then travels to the visual cortex – the region of your brain at the back of your head. It is here that we make sense of what the light entering our eyes means, and turn that information into what we perceive, or ‘see.’ This is our visual system.
We require visual input (i.e. we need light and to be able to see things) for our brains to develop the ability to discriminate between objects and to tell the difference between colours and shapes. In the womb, there is no such input, so at birth the visual system is not yet fully developed.
It makes sense to assume that as a baby’s brain is less developed than an adult’s, it would have fewer brain cells with fewer connections between them. However, this is not the case. In the visual system, babies have an excess of brain cells with many more connections. This is because during development in the womb, brain cells in the visual system randomly fire (send signals) all over the place – this allows the brain cells to start growing, and prepares the visual system for that magical day when it gets to experience light.
However, there is not yet any structure or order to the system, and there would be a lot of meaningless communication (or noise) between cells, so it isn’t very efficient. The experience of light and vision in the first few months after birth are crucial for the strengthening of particular connections in the visual system and for the formation of a proper, organised structure. The connections and cells that are not used are lost (or pruned).
The development of the visual system in these first few months and years is known as the ‘critical period’ – this was described by Nobel Prize winners Hubel and Wiesel in the 1960s. During this period, if there is no visual experience, the visual system will not develop properly, and normal sight will never be achieved.
So babies don’t have a well-developed visual system, and they require visual experiences to fine-tune their vision for normal sight.
But why do they prefer black and white?
Well, it seems that babies pay more attention to things that cause greater brain activation than things that cause less brain activation.
I mentioned the organisation and structure of the visual system before – different brain cells respond best to different visual stimuli, and brain cells that respond to similar things tend to group together. For example, some brain cells will have their strongest response when a vertical line is seen. The response of the cell will decrease as the line moves away from vertical, and will not respond at all to a horizontal line. However, another cell will prefer horizontal lines and will respond best to these. As clear, well defined vertical and horizontal lines elicit the strongest cellular reactions, these are initially given the most attention.
Similar preferences exist for colour saturation, hue and contrast. Black and white patterns have the greatest level of contrast, so will cause the largest response in the visual system – as lines become blurred together to form grey, there would be reduced contrast, which causes less of a response, so it is harder to discriminate and is therefore less interesting! The experience of seeing sharp lines and edges helps to shape the visual system and form important connections that will form the basis for understanding more complex shapes.
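To give a feel for why black and white wins, here’s a toy Python sketch using the Michelson contrast – a standard measure of contrast in vision science. The stripe patterns below are made-up numbers, with 0 as black and 1 as white:

```python
# Michelson contrast: (max - min) / (max + min), ranging from 0 (uniform
# grey, no contrast at all) to 1 (pure black next to pure white).
def michelson_contrast(luminances):
    brightest, darkest = max(luminances), min(luminances)
    if brightest + darkest == 0:
        return 0.0
    return (brightest - darkest) / (brightest + darkest)

black_and_white_stripes = [0.0, 1.0, 0.0, 1.0]   # maximum possible contrast
grey_stripes = [0.4, 0.6, 0.4, 0.6]              # blurred-together greys

print(michelson_contrast(black_and_white_stripes))  # 1.0
print(michelson_contrast(grey_stripes))             # roughly 0.2
```

So a black-and-white pattern drives the biggest possible response, while the greyish pattern sits much lower – consistent with why the high-contrast pattern holds a baby’s attention.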
On top of this, the retina is still developing up until 3-4 months of age, and the cells that detect colour mature more slowly than those that detect light intensity or brightness. This means that babies are less able to tell the difference between colours in the first few months of life. As a result, a pattern of red and green is much less interesting than a pattern of black and white, which looks more like the presence and absence of light.
So that was a whistle-stop tour of my basic understanding of the development of the visual system!
My niece likes to look at black and white pictures because that’s what her visual system responds to best, and this kind of stimulation is helping to form strong, meaningful connections between her brain cells that lead to the development of her visual system.
If you have any questions, feel free to ask – but it will take me a while to respond as I’d have to look it up!
Also feel free to ask me about any neuroscience or biochemistry topic you are interested in or want to understand more about, and I’ll try to respond with a post about it!
PCR (see what I did there?) isn’t as terrifying as the title of this post suggests – although it has been known to induce screams of frustration in poor hard working students and researchers!
So what is PCR?
The ‘Polymerase Chain Reaction’ (don’t worry – the meaning of this will become completely clear!) is one of the most commonly used lab techniques. Briefly, it is a technique to ‘bulk up’ (or ‘amplify’) a certain piece of DNA that you are interested in from a sample of mixed bits of DNA. This makes it easier to find the interesting piece among all the boring bits you aren’t interested in.
There are several reasons why you might want to do this, but most frequently it is used to check whether a certain piece of DNA is present in your sample.
Why would that be useful?
- You would want to do this if you have tried to change the DNA in a cell line or animal model and need to check whether it worked.
- Alternatively, PCR can be used to see if there are any natural mutations between, say, the DNA from ‘healthy’ people and the DNA from a group of people with a particular disease.
- Along the same lines as above, it can be used to test for genetic diseases (where the cause of the disease is known to be in the DNA and is passed down through families).
- Outside the research lab and hospitals, it is also the technique used for paternity testing (as appears frequently on The Jeremy Kyle Show etc.) and in forensic science (e.g. as seen in CSI) – in these cases, two DNA samples are compared for their similarity to each other, or in the case of forensic science, it can also create a much larger sample for testing from an initially very small trace of DNA left at a crime scene.
Finding what you want in the DN-hAystack! (LOL)
But what actually is the polymerase chain reaction?
It’s a tricky one to explain as there are several stages, so first I’ll note the two important things you need to start with, followed by the process itself. A word of warning – it’s one of those things where looking at the pictures really helps!
You need to:
- Collect your DNA sample
This might be from a cell line, from a lab rat sample or from a sample taken from a patient or volunteer. Typically, the DNA is then ‘extracted’ from the cells (as animal/human samples will consist of cells – which is also where DNA is stored). By extracting the DNA and getting rid of the other bits of the cell, you should get a ‘cleaner’ sample that is less likely to fail during the PCR. This sample is called the ‘DNA template.’
- Prepare your primers
DNA is made up of a chain of ‘bases’ (or ‘nucleotides’) – Cytosine, Guanine, Adenine and Thymine (C, G, A and T), which pair together to form a double-stranded DNA helix. In order for the PCR technique to amplify the part of DNA you are interested in, you need to tell it what part of the DNA to pay attention to. This is done with ‘primers’ (as they prime the reaction). Primers are short, single-stranded chains of bases that are designed to match (‘complement’) the sequence of bases on the interesting region of DNA.
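To see what ‘complementary’ means in practice, here is a minimal Python sketch (the 10-base target sequence is made up purely for illustration). It works out the ‘reverse complement’ of a stretch of DNA – the sequence you would read on the opposite strand, which is what a reverse primer has to match:

```python
# In DNA, A always pairs with T, and C always pairs with G
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def reverse_complement(seq):
    """Return the sequence of the opposite DNA strand.
    The other strand runs in the opposite direction, so we
    reverse the sequence and swap each base for its partner."""
    return "".join(COMPLEMENT[base] for base in reversed(seq))

# A made-up 10-base region of interest:
target = "ATGCCTGAAC"
print(reverse_complement(target))  # GTTCAGGCAT
```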
Steps for PCR:
- Denaturation
This is basically just separating the two strands of DNA from each other to form single strands. This is so that the primers are able to pair with their complementary sequence on the DNA strand. Denaturation is done by briefly heating the DNA to 94-98˚C.
- Annealing
The temperature is dropped to 50-65˚C to allow the primers to pair (‘anneal’) with the DNA.
- Extension
An enzyme called DNA polymerase (hence the ‘P’ in ‘PCR’!) recognises the primer-DNA pair, and recruits spare bases/nucleotides from the surrounding solution (these are added by the researcher, along with the DNA polymerase).
The DNA polymerase is typically taken from a bacterium called Thermus aquaticus and is referred to as ‘Taq’ – this is used because it can withstand the high temperature used during denaturation, whereas polymerases from most other sources would break down and stop working.
The polymerase then synthesises a new strand of DNA that matches the original strand of DNA. Primers are designed to match both strands of DNA (as the sequence of the second strand will be reversed compared to the first), so during the extension phase, both the ‘forward’ and ‘reverse’ strands of DNA are synthesised to form a copy of double stranded DNA.
- And repeat!
The previous three stages are repeated 20-40 times. Each time they are repeated, the amount of DNA is doubled, so there is an exponential increase in the number of copies of the DNA sequence you are interested in (until the spare nucleotides and DNA polymerase run out).
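That repeated doubling is exactly why PCR is so powerful. Assuming perfect efficiency (real reactions fall a bit short of a true doubling every cycle), a quick back-of-the-envelope calculation in Python shows how fast the copies pile up:

```python
def pcr_copies(starting_copies, cycles):
    """Number of DNA copies after a given number of PCR cycles,
    assuming the DNA doubles perfectly every cycle."""
    return starting_copies * 2 ** cycles

# Starting from a single double-stranded template molecule:
for cycles in (10, 20, 30):
    print(cycles, "cycles:", pcr_copies(1, cycles), "copies")
# 10 cycles:         1,024 copies
# 20 cycles:     1,048,576 copies
# 30 cycles: 1,073,741,824 copies - over a billion!
```

This exponential growth is why even the tiny trace of DNA left at a crime scene can be amplified into enough material to test.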
Now you have loads of a specific sequence of DNA! Yay!
The resulting DNA can be passed through a gel and separated by size in the same manner as the proteins described in ‘Western Whats?‘ A difference in size between samples indicates a difference in the DNA sequence, and therefore a potential mutation, or mismatch between samples. Large amounts of amplified DNA are also required for other biochemical techniques such as sequencing (which reads the whole sequence of the DNA strand one nucleotide at a time) or for inserting into the DNA of another organism, such as yeast or bacteria (with the purpose of seeing the effect this region of DNA may have on a cell).
So that seems pretty straightforward, right?
Well, yes, PCR is one of those things that can be very easy – but only when it works! Unfortunately, every stage of PCR is very sensitive to disruption, and many different sequences of DNA will need slightly different conditions for the PCR to work. For example, both too much and too little template DNA can completely ruin a PCR, and if the primers are not specific enough to the region of interest, they can pair with the wrong section of your template DNA and cause all kinds of rubbish to be amplified!
It therefore takes an experienced/skilled/lucky researcher to get a perfect PCR first time; otherwise you start to hear those screams….
Did this post make sense to non-scientists? I’d love some feedback on how understandable my posts are and if I’m managing to explain biochemistry and neuroscience to you!
Is there more about PCR you would like to know? Or are there any other lab techniques or neurological diseases you’d like to learn more about? Comment below!
I love coffee. It’s delicious, aromatic and warming. And it comes in so many varieties! Lattes, Americanos, Iced…Alcoholic! But most importantly, it provides that much appreciated, and sometimes essential, pick-me-up when the day is beginning to drag.
And I’m not alone.
The component of coffee that gives us that ‘buzz’ is caffeine, and caffeine is the world’s most popular psychoactive drug. ‘Psychoactive’ describes drugs that act on the brain to change mental function – caffeine is a stimulant, causing a temporary increase in mental function by activating cells in the brain. When coffee is consumed, it can therefore increase cognition (e.g. problem solving, alertness and attention) as well as memory.
The effects of coffee might also be long-term – there is increasing evidence that regular consumption of moderate amounts of caffeine (3-5 cups of coffee a day) over many years can prevent age-related cognitive decline, and as I mentioned in my previous post Alzheimer’s Disease – Explained!, there is also some evidence that suggests caffeine can reduce the risk, or delay the onset of, Alzheimer’s Disease.
So what is happening in the brain when you drink a cup of delicious coffee?
Our brain cells have many different ‘detectors’ or ‘receptors’ that are specific to particular signals in the brain, so that they know how to react in response to different situations. One of these signals (or ’neurotransmitters’) is a chemical called Adenosine. When Adenosine attaches to the receptors that are designed to detect it (unsurprisingly called ‘Adenosine receptors’), activity within the cell is inhibited.
Inhibiting cell function in the brain sounds like it might be a bad thing, but actually activity in the brain needs to be ordered and carefully regulated in order to communicate meaningful information. That means that there must be a balance between stimulation and inhibition.
Caffeine is also able to attach to Adenosine receptors, but as it is not Adenosine, it does not have the same effect on the cell. Because caffeine is attached to the receptors, it is ‘blocking’ (or ‘antagonising’) the attachment of Adenosine. As the inhibitory Adenosine signal can no longer be detected, this causes increased cell stimulation and activity.
This increased activity has been found to be present in the parts of the brain that have been linked with cognition and attention, and may be why we may experience clearer thinking and feel more awake after a strong cup of the good stuff.
Blocking Adenosine receptors also causes an increase in another neurotransmitter, Dopamine. Dopamine acts as a reward system in the brain, so when it is increased we feel pleasure and are compelled to repeat the behaviour that gave us that good feeling (i.e. running down to our favourite café and sipping on a steaming cup of java).
But it’s not all good news
So it sounds like coffee, and importantly caffeine, is some kind of wonder drug that makes us happier and cures all ailments! Right?
Well, not exactly. While there is a lot of evidence for the positive effects of both caffeine and coffee, much of this research has been carried out in rats and mice as it is difficult, and ethically questionable, to manipulate the amount of coffee a group of people drink daily over the course of their adult lives. Simply, we just don’t know enough – how much is a useful amount of caffeine? How much is too much (caffeine overdose does exist and can be very dangerous)? When is the best time to start drinking it?
Too much caffeine early in life can also be damaging; caffeine easily crosses the placenta, where it can have large effects on the developing foetus, which cannot yet efficiently process it. Even in teenagers the brain is still developing, and too much caffeine has been found to increase anxiety and change the number of Adenosine receptors found in the brain (albeit in rodents).
However, with all its cognitive perks in healthy adults, and the joy that comes from drinking it, it looks like coffee is going to be enjoyed in its many delightful forms, and will stay one of the most popular drugs in the world!
Alzheimer’s Disease (named after Alois Alzheimer, who first described the disease in a lecture in 1906) is the most common form of dementia. But what is dementia? ‘Dementia’ is used to describe a collection of symptoms including difficulty with problem solving, thinking and language, but is most often associated with memory loss. There are many different types of dementia, but as Alzheimer’s Disease accounts for 50-60% of all cases of dementia, it is arguably the biggest problem and commands the most attention (and research funding!).
Alzheimer’s Disease is a growing problem because of our ageing population. The symptoms of Alzheimer’s Disease tend to start to develop after 65 years of age – and we didn’t use to live that long! With better medical care and individuals living way past their 80th birthdays, the number of people that are old enough to develop Alzheimer’s Disease is growing. It’s an awful and distressing condition both for those suffering from it and for those caring for afflicted family members or friends.
Most people have heard of Alzheimer’s Disease, and most people know it causes memory loss. But what actually causes Alzheimer’s Disease? What is it??
In the Brain
In Alzheimer’s Disease, cells in the part of the brain called the ‘Hippocampus’ begin to die. The Hippocampus is so called because the person who identified it thought it looked like a sea horse (in Greek, ‘hippos’ = horse, ‘kampos’ = sea monster). They were wrong, but the name stuck. The hippocampus is responsible for forming our memories, so damage to this area explains the memory loss in Alzheimer’s Disease. As the disease progresses, cell death spreads to other parts of the brain, and can affect other functions such as problem solving and language.
So why do the cells die?
There appears to be an accumulation of two different proteins within the Alzheimer’s Disease brain. These are ‘Beta-Amyloid,’ which clumps together to form ‘plaques,’ and ‘Tau,’ which forms ‘tangles.’ However, the association between these and Alzheimer’s Disease is not completely clear. Many people develop plaques and tangles in old age, but have no memory problems. On the other hand, some individuals with very severe Alzheimer’s Disease may not have very severe plaques and tangles. Nevertheless, the presence of a lot of Beta-Amyloid and Tau in brain cells interferes with how cells normally function and communicate with each other. If brain cells can no longer communicate with each other, memories can no longer be formed and the cells will die. Attempts are therefore being made to reduce Beta-Amyloid plaques and Tau tangles as a part of Alzheimer’s Disease research.
So there are plaques and tangles in the Alzheimer’s Disease brain, and brain cells die, but why?
I described in a previous post what genes are and why they are important to research (https://thebiocheminist.wordpress.com/2014/06/23/why-do-scientists-want-my-dna/). Genes act like templates for cells to make proteins, and these proteins can also control the function of other proteins. Therefore changes in genes may underlie the differences we see in Beta-Amyloid and Tau proteins.
Alzheimer’s Disease is not caused by a change in a single gene. It seems as though changes in several different genes all contribute to the increased risk of developing Alzheimer’s Disease, but by no means does carrying changes in these genes mean that you will definitely develop Alzheimer’s Disease.
However, the genes that have so far been associated with Alzheimer’s Disease are:
– APP (‘Amyloid Precursor Protein’) – helps brain cells communicate with each other. Changes in APP therefore mean that brain cells communicate less effectively.
– Presenilins 1 & 2 – break down proteins in cells that are no longer needed, including Beta-Amyloid. If Beta-Amyloid is not effectively disposed of, it may form plaques.
– APOE (‘Apolipoprotein E’) – transports fats around cells, and also helps break down Beta-Amyloid. Some versions of APOE are less efficient at breaking down Beta-Amyloid.
So if the cause of Alzheimer’s Disease is not entirely due to our genes, then our environment and lifestyles are also likely to play a role. However, bear in mind that like genes, environmental and lifestyle factors have only been associated with increasing the risk of developing Alzheimer’s Disease, and are not in themselves a cause.
Obesity in mid-life increases the risk of developing Alzheimer’s Disease, as does sleep deprivation, although they don’t seem to affect the levels of Beta-Amyloid or Tau found in brain cells. On the other hand, physical exercise may reduce the levels of Tau in the brain and therefore reduce the risk of Alzheimer’s Disease. And good news for coffee drinkers! – regular caffeine intake might reduce the risk of developing Alzheimer’s Disease and delay when symptoms may develop.
However, it is difficult to prove whether environmental and lifestyle factors actually make a difference to the development of Alzheimer’s Disease, as these studies often rely on questionnaires or surveys, followed by a correlation between participants’ answers and the development of disease. Many studies have also only been carried out in rats or mice so far, and might not prove to be as important in humans. Therefore all of these things that are commonly reported in the media as ‘causing’ or ‘curing’ Alzheimer’s Disease (and/or cancer, usually!) should be taken with a pinch of salt.
But there is hope!
There have not been any recent major breakthroughs in developing a treatment for Alzheimer’s Disease, but this reflects the complexity of the problem, not the quality of the work or effort being put into finding a cure. There are also several options available for slowing the progression of the disease, so it is definitely worth raising any concerns you may have either about yourself or a loved one with a GP. Below are some links to some fantastic charities and associations that can provide more information on Alzheimer’s Disease and dementia, or that you can donate to should you want to help fund the research effort (I am nodding enthusiastically!).