
The HIV-positive Brain in the ART Age

Ines has completed a research project in computational neuroscience, focusing on learning impairments in Parkinson’s disease. Her previous research experience during her BSc included studies on rodent models of ADHD (University of Manchester) and Alzheimer’s disease (Institute for Functional Genomics, Montpellier, France). Her interest in writing about HIV stems from the fact that she was born and grew up in Mozambique, a country where HIV/AIDS continues to be a significant health threat, affecting over 10% of the population, and where over 800,000 people living with HIV are on ART.

On the 5th of June 1981, the cases of five patients infected with what would later come to be known as the human immunodeficiency virus (HIV) were reported for the first time in Los Angeles, USA (CDC, 1981). In the years that followed, the virus rapidly spread across the globe, with a peak of new infections (3.5 million) recorded in 1997 (Roser and Ritchie, 2018). By 2016, there were approximately 36.7 million people living with HIV worldwide (UNAIDS, 2017).

The human immunodeficiency virus is transmitted through contact with bodily fluids. Although sexual transmission is the most common form, HIV can also be passed on through contact with infected blood or by vertical transmission. The latter refers to the case of mothers transmitting the virus to their children during pregnancy, childbirth or breastfeeding (Levy, 1993; Roser and Ritchie, 2018).

HIV belongs to a subgroup of viruses called lentiviruses. The term lentivirus comes from the Latin lentus, which means slow, owing to its long incubation period in the infected organism before any overt manifestation of disease is detectable (Haase, 1986). Due to this defining characteristic, HIV infection is initially asymptomatic, but if left untreated, it may have notable negative effects and eventually lead to death (Moir et al., 2011). Although the viral infection in itself is not usually the direct cause of death, HIV suppresses the immune system, making patients particularly vulnerable to certain cancers and other infections, while being unable to naturally fight them. This state of immunosuppression and the spectrum of potentially fatal symptoms that accompany it are collectively termed Acquired Immune Deficiency Syndrome (AIDS) (Levy, 1993; Weiss, 1993; Naif, 2013). Over 70% of HIV infection cases have been reported in sub-Saharan Africa, where, to date, AIDS ranks as one of the major causes of death among adults (Roser and Ritchie, 2018).

Where the virus meets the brain

There is currently no vaccine or cure for HIV infection. Nonetheless, the last 37 years have seen major advances in the understanding of its pathophysiology as well as the wide range of clinical manifestations it may have. Among them, several neurological complications have been identified (Saloner and Cysique, 2017; Grant et al., 1987). These complications can be broadly categorised into two classes: primary and secondary. Primary complications arise as a direct effect of HIV replication in central nervous system cells, the most notable example being HIV-associated neurocognitive disorder (HAND). Secondary complications, by contrast, result from the previously mentioned opportunistic infections that occur when patients are immunosuppressed (Kolson, 2017).

Evidence suggests that HIV enters the central nervous system within the first few weeks post-infection, an early stage in the course of the disease (Abidin et al., 2018). As the virus replicates, it may trigger a cascade of events at the molecular level that lead to chronic inflammation in the brain (Kolson, 2017). Consequently, the central nervous system may undergo primary damage that is only detected much later, when patients present with neuropsychological symptoms (Ellis et al., 2007; Abidin et al., 2018).

One of the most severe forms of cognitive impairment observed among patients is HIV-associated dementia (HAD) (Saylor et al., 2016; Grant et al., 1987; Ellis et al., 1997). In the early years of the AIDS epidemic, this was a common feature of a patient’s clinical presentation and was often fatal; it remained so until the development of antiretroviral drugs in the mid-1990s (Saylor et al., 2016).

The impact of ART

Since the introduction of antiretroviral therapy (ART), there has been a major shift in prognoses for patients living with HIV as well as in the manifestation of the virus in the brain. Briefly, antiretroviral drugs are a group of pharmacological agents that target different parts of the HIV life cycle. ART is not capable of eradicating the virus, but it is able to inhibit its replication, thus reducing or entirely preventing immunosuppression (Ellis et al., 2007). In the present day, an HIV-positive diagnosis in areas where ART is readily available no longer necessarily translates to imminent death or even to the development of AIDS (McArthur et al., 2003). Instead, HIV infection can become a chronic condition that allows patients to have close to normal life expectancies, especially when treatment is started early (Wandeler et al., 2016; Saylor et al., 2016). With regard to the brain, in patient populations undergoing ART, the incidence of severe neurocognitive problems such as HAD and that of secondary neurological complications has been shown to decrease drastically (Saylor et al., 2016).

Nevertheless, there are limitations to the therapeutic effect of ART. A neurologically relevant example of this is the finding that the blood-brain barrier (BBB) restricts the entry of certain antiretroviral drugs into the central nervous system (Langford et al., 2006). The blood-brain barrier separates neurons from circulating blood. Its permeability is highly selective, allowing for tight control of the material that is exchanged across it (Zlokovic, 2008). However, it has been hypothesised that by hindering the penetration of certain antiretroviral drugs, the BBB allows the activity of HIV to partially persist in the brain (Ellis et al., 2007). In accordance with this hypothesis, over 50% of the HIV-positive population is estimated to be susceptible to HIV-associated neurocognitive disorders, though these are often milder forms of HAND than those observed pre-ART (Abidin et al., 2018; Chakradhar, 2018). However, it can also be hypothesised that, independent of ART penetration, such cognitive problems may be long-term sequelae of primary insults (Ellis et al., 2007).

Furthermore, the consequences of long-term administration of ART are currently not fully understood, and it has been suggested that it could also play a role in neuronal damage (Kolson, 2017). Nevertheless, whatever the exact causes may be, mild forms of HAND are undeniably common among patients and can be detrimental to their quality of life (Saylor et al., 2016; Abidin et al., 2018). With the expanding availability of ART, the importance of addressing such issues has gained increased recognition, given that HIV positivity has become a long-term status for many of those affected (Wing, 2016; Chakradhar, 2018).

The aftermath of survival: from development to ageing 

The fact that ART allows HIV-positive individuals to live much longer than they could at the beginning of the AIDS epidemic has set several new challenges for neuroscience research, as there are still many gaps in the understanding of the long-term effects of HIV infection (Wing, 2016).

On one end of the age spectrum, there are increasing survival rates among children and adolescents who initiated ART soon after vertical transmission, creating a window of opportunity to investigate the role of HIV infection in the developing brain (Li et al., 2018; de Martino et al., 2000). Further, this group constitutes a new population of individuals, many of whom are now adults, who have been on ART for their entire lives.

On the other end, with the continuous (though decreasing) occurrence of new HIV infections and a reduction in deaths due to AIDS (UNAIDS, 2017), there is a growing HIV-positive population that is ageing (Wing, 2016). In fact, a study based on people living with HIV in the Netherlands projected that by 2030, almost 75% of this population would be composed of people aged 50 or older (Smit et al., 2015). This raises interest in the interaction between HIV infection and ageing-associated neurocognitive processes.

In 2016, Turner and colleagues reported the first case of an individual with both HIV and Alzheimer’s disease, the most common cause of dementia (Turner et al., 2016). As more and more of these cases appear, it is important to find the means to distinguish between Alzheimer’s and HAND symptoms as well as to identify any links that may exist between them (Chakradhar, 2018). In recent years, it has often been debated whether HIV infection might accelerate ageing (Chakradhar, 2018; Deeks et al., 2013). Given that age is the strongest risk factor for Alzheimer’s disease, accelerated ageing could constitute a link between the two conditions. Furthermore, a study in mouse-derived neurons found that ART may contribute to the deposition of amyloid beta, a protein whose accumulation is a hallmark of Alzheimer’s disease pathology (Giunta et al., 2011). Taken together, these ideas build a case for the possibility of interactions between age, HIV and ART.

Future prospects: where to look for clues?  

One of the main aims for the future of neuro-HIV research is to find differences, be they anatomical, functional or molecular, between healthy and HIV-positive individuals that might help detect and/or predict the occurrence of neurological problems at early stages (Clifford, 2017). Such characteristic traits of pathological processes are known as biomarkers. The use of tools to visualise the brain in vivo (neuroimaging) has been an increasingly popular approach to solving this problem. For example, a few recent studies have applied neuroimaging techniques to the investigation of anatomical and functional changes in the brains of young people infected with HIV (Ackermann et al., 2016; Randall et al., 2017; Toich et al., 2017). Others have looked into characteristics that may distinguish the effects of the virus from processes associated with ageing or non-HIV-related neurodegeneration (Zhang et al., 2016; Cole et al., 2017). Further, a study by Wiesman et al. (2018) specifically investigated potential imaging-based biomarkers of HAND. These studies have highlighted specific alterations in grey matter volume, white matter integrity and network connectivity. In time, such findings may begin to shape an HIV-specific neurocognitive profile. Ideally, this will improve prognosis for patients and help define new therapeutic approaches that might enhance the quality of life of those living with HIV.




Abidin AZ, Dsouza AM, Nagarajan MB, Wang L, Qiu X, Schifitto G and Wismuller A (2018) Alteration of brain network topology in HIV-associated neurocognitive disorder: A novel functional connectivity perspective. Neuroimage: Clinical 17:768-777.

Ackermann C, Andronikou S, Saleh MG, Laughton B, Alhamud AA, van der Kouwe A, Kidd M, Cotton MF and Meintjes EM (2016) Early antiretroviral therapy in HIV-infected children is associated with diffuse white matter structural abnormality and corpus callosum sparing. American Journal of Neuroradiology 37(12):2363-9.

CDC (1981) Pneumocystis Pneumonia — Los Angeles. Morbidity and Mortality Weekly Report 30:250-2.

Chakradhar S (2018) A tale of two diseases: Aging HIV patients inspire a closer look at Alzheimer’s disease. Nature Medicine 24(4):376.

Clifford DB (2017) HIV Associated Neurocognitive Disorder. Current Opinion in Infectious Diseases 30(1):117-22.

Cole JH, Underwood J, Caan MW, De Francesco D, van Zoest RA, Leech R, Wit FW, Portegies P, Geurtsen GJ, Schmand BA, Schim van der Loeff MF, Franceschi C, Sabin CA, Majoie CB, Winston A, Reiss P and Sharp DJ (2017) Increased brain-predicted aging in treated HIV disease. Neurology 88(14):1349-1357.

de Martino M, Tovo PA, Balducci M, Galli L, Gabiano C, Rezza G and Pezzotti P (2000) Reduction in mortality with availability of antiretroviral therapy for children with perinatal HIV-1 infection. The Journal of the American Medical Association 284(2): 190-7.

Deeks SG, Lewin SR and Havlir DV (2013) The end of AIDS: HIV infection as a chronic disease. The Lancet 382(9903):1525-33.

Ellis R, Langford D and Masliah E (2007) HIV and antiretroviral therapy in the brain: neuronal injury and repair. Nature Reviews Neuroscience 8(1):33.

Ellis RJ, Deutsch R, Heaton RK, Marcotte TD, McCutchan JA, Nelson JA, Abramson I, Thal LJ, Atkinson JH, Wallace MR and Grant I (1997) Neurocognitive impairment is an independent risk factor for death in HIV infection. Archives of Neurology 54(4):416-24.

Giunta B, Ehrhart J, Obregon DF, Lam L, Le L, Jin J, Fernandez F, Tan J and Shytle RD (2011) Antiretroviral medications disrupt microglial phagocytosis of beta-amyloid and increase its production by neurons: implications for HIV-associated neurocognitive disorders. Molecular Brain 4(1):23.

Grant I, Atkinson JH, Hesselink JR, Kennedy CJ, Richman DD, Spector SA and McCutchan JA (1987) Evidence for early central nervous system involvement in the acquired immunodeficiency syndrome (AIDS) and other human immunodeficiency virus (HIV) infections. Annals of Internal Medicine 107(6):828-36.

Haase AT (1986) Pathogenesis of lentivirus infections. Nature 322(6075):130.

Kolson D (2017) Neurologic Complications in Persons with HIV Infection in the Era of Antiretroviral Therapy. Topics in Antiviral Medicine 25(3): 97-101.

Langford D, Marquie-Beck J, de Almeida S, Lazzaretto D, Letendre S, Grant I, McCutchan JA, Masliah E and Ellis RJ (2006) Relationship of antiretroviral treatment to postmortem brain tissue viral load in human immunodeficiency virus-infected patients. Journal of Neurovirology 12(2): 100-7.

Levy JA (1993) Pathogenesis of human immunodeficiency virus infection. Microbiological Reviews 57(1):183-289.

Li J, Gao L, Wen Z, Zhang J, Wang P, Tu N, Lei H, Lin F, Gui XE and Wu G (2018) Structural Covariance of Gray Matter Volume in HIV Vertically Infected Adolescents. Scientific Reports 8(1):1182.

McArthur JC, Haughey N, Gartner S, Conant K, Pardo C, Nath A and Sacktor N (2003) Human immunodeficiency virus-associated dementia: an evolving disease. Journal of Neurovirology 9(2):205-21.

Moir S, Chun TW and Fauci AS (2011) Pathogenic mechanisms of HIV disease. Annual Review of Pathology 6:223-48.

Naif HM (2013) Pathogenesis of HIV Infection. Infectious Disease Reports 5(Suppl 1):e6.

Randall SR, Warton CMR, Holmes MJ, Cotton MF, Laughton B, van der Kouwe AJW and Meintjes EM (2017) Larger Subcortical Gray Matter Structures and Smaller Corpora Callosa at Age 5 Years in HIV Infected Children on Early ART. Frontiers in Neuroanatomy 11:95.

Roser M and Ritchie H (2018) HIV/AIDS. Available at: (Accessed: 22/05/2018).

Saloner R and Cysique LA (2017) HIV-Associated Neurocognitive Disorders: A Global Perspective. Journal of the International Neuropsychological Society 9:860-9.

Saylor D, Dickens AM, Sacktor N, Haughey N, Slusher B, Pletnikov M, Mankowski JL, Brown A, Volsky DJ and McArthur JC (2016) HIV-associated neurocognitive disorder — pathogenesis and prospects for treatment. Nature Reviews Neurology 12(4):234.

Smit M, Brinkman K, Geerlings S, Smit C, Thyagarajan K, Sighem A, de Wolf F and Hallett TB (2015) Future challenges for clinical care of an ageing population infected with HIV: a modelling study. The Lancet Infectious Diseases 15(7):810-8.

Toich JTF, Taylor PA, Holmes MJ, Gohel S, Cotton MF, Dobbels E, Laughton B, Little F, van der Kouwe AJW, Biswal B and Meintjes EM (2017) Functional Connectivity Alterations between Networks and Associations with Infant Immune Health within Networks in HIV Infected Children on Early Treatment: A Study at 7 Years. Frontiers in Human Neuroscience 11:635.

Turner RS, Chadwick M, Horton WA, Simon GL, Jiang X and Esposito G (2016) An individual with human immunodeficiency virus, dementia, and central nervous system amyloid deposition. Alzheimer’s & Dementia 4:1-5.

UNAIDS (2017) UNAIDS DATA 2017. Available at: (Accessed: 20/05/2017).

Wandeler G, Johnson LF and Egger M (2016) Trends in life expectancy of HIV-positive adults on ART across the globe: comparisons with general population. Current Opinion in HIV and AIDS 11(5):492-500.

Weiss RA (1993) How does HIV cause AIDS? Science 260(5112):1273-9.

Wiesman AI, O’Neill J, Mills MS, Robertson KR, Fox HS, Swindells S and Wilson TW (2018) Aberrant occipital dynamics differentiate HIV-infected patients with and without cognitive impairment. Brain 141(6):1678-1690.

Wing EJ (2016) HIV and aging. International Journal of Infectious Diseases 53:61-68.

World Economic Forum (2018) HIV/AIDS is no longer the leading cause of death in Africa. Available at: (Accessed: 27/05/2018).

Zhang Y, Kwon D, Esmaeili-Firidouni P, Pfefferbaum A, Sullivan EV, Javitz H, Valcour V and Pohl KM (2016) Extracting patterns of morphometry distinguishing HIV associated neurodegeneration from mild cognitive impairment via group cardinality constrained classification. Human Brain Mapping 37(12):4523-4538.

Zlokovic BV (2008) The Blood-Brain Barrier in Health and Chronic Neurodegenerative Disorders. Neuron 57(2):178-201.

Interview with Professor Anne Lingford-Hughes: Neuroscience, Psychiatry and Passion

Professor Anne Lingford-Hughes is Head of the Centre for Psychiatry and Professor of Addiction Biology at Imperial College London. She is also a Consultant Psychiatrist at Central and North West London NHS Foundation Trust, with a particular interest in pharmacological treatments for alcohol problems and other substance addictions. Her research has focused on using neuroimaging and neuropharmacological challenges to characterise the neurobiology of addiction.

The key aims of this new blog are to enhance the curriculum and innovate pedagogy, highlight the contribution of women in academia within and outside the College, and engage and inspire society. The founder and editor of the blog is Dr Stefano Sandrone, Teaching Fellow within the Faculty of Medicine, and the contributors are Imperial’s MSc Translational Neuroscience students.

Swetha Umashankar) What inspired you to choose scientific research as a career? 

At school I liked all science subjects, but particularly biology and chemistry. I also liked doing projects, since you were allowed to explore subjects in more depth. Projects also helped to satisfy my curiosity about things that I could not find the answer to! There are no scientists in my family, so my teachers at school and my tutor at university were very important in helping me develop my career.

Lucía Luengo Gutierrez) What difficulties did you have to deal with when you decided to start a career in science?

I do not really remember any particular difficulties. It was helpful being able to move within the UK and to the USA for my post-doc since I had no particular ties to one place. I saw for some colleagues that it was much harder to take up opportunities since they had family and financial responsibilities.

Swetha Umashankar) How would you describe your career trajectory so far?

Hard one! I think most people think they can always do better or differently? You can always find people you regarded as contemporaries at the start of your career who have done better or worse than you. On the whole I am happy with where I am and what I want to do next.

Caroline Schaufelberger) Was there a specific event or reason that directed your research interests towards addiction?

The lab I joined for my post-doc was studying the GABA-A receptor. Amongst the range of modulators we studied, alcohol was known to exert some of its effects through this receptor. When I returned to research after completing my clinical training, I joined a group whose primary focus was on schizophrenia. At that time, evidence had been growing for a potential role for the GABA-A receptor in schizophrenia. Having failed to get a fellowship to support an imaging study of the GABA-A receptor in schizophrenia, I reapplied to study alcoholism – and got funded. At that time very few people were looking at the biology of alcoholism using brain imaging, so it was quite novel.

Claudia Ghezzou) Throughout your career, what motivations and incentives have kept you focused on your impact as a researcher, regardless of the difficulty of the process or discouraging results?

Being in the clinic, where there is obvious unmet need, continues to motivate me to keep going – particularly as addicts are a vulnerable, stigmatised and often marginalised group within society. Being a member of a good team who can celebrate and commiserate is crucial. Support and understanding from my family has also been crucial.

Pavlina Pavlidi) Did you experience any conflicts between your career and personal life choices?

All the time! It is a constant juggling act, particularly as for the last 10 years I have lived away from my family during the week, so I try to limit activities which impact on time with them at weekends. Modern technology has been helpful – so even if I am not at the dinner table with them in person, I can join via Skype. On the other side, they have had some amazing experiences when they have joined me on a ‘work trip’, so they have benefited as well.

Shinil Raina) What gaps do you think exist between neurobiological and clinical research in the field of addiction?

Gaps still exist, though having to argue that there is a neurobiology to addiction is less of an issue now compared with 20 years ago. Compared with other areas of psychiatry, we know quite a bit about the brain in addiction, and many of the medications used came from knowledge about pharmacology and neurobiology. I think many people are aware of this; however, most people who work in the addiction field have no training in this area. This means that the number of clinical researchers in this area is limited – this needs to change if we are to improve prevention and treatment. So to me the gap is about the lack of people who are trained in research and who work clinically, rather than a gap in a particular area.

Caroline Schaufelberger) To what extent do you think that the research you are doing in addiction will have an impact on the societal understanding of addiction?

I hope that by people understanding the role of the brain that addiction is destigmatised, that it is no longer seen as due to someone ‘lacking moral fibre’.

Ryan Dowsell) What’s the most ridiculous scientific report you’ve seen in the media?

Another hard question! No one report comes to mind but I do find it intriguing how the media can report that ‘alcohol is bad for you’ and ‘alcohol is good for you’ without any sense that they are being inconsistent or informing the reader how to interpret research.

Shinil Raina) Why do you think that, in the current political climate, some politicians are against advancements in science?

I am not sure it is just current – it has always been this way. I am not sure they are against scientific advances as such, but more concerned about how much they will cost financially and whether they will be popular with voters. As one elected official was quoted as saying when asked about cuts to the budget for addiction services: no ‘man on the street’ in the run-up to an election has ever asked me to give more money to addicts. Another factor is that in the UK (I do not know about elsewhere) very few politicians are scientists, and they also like yes/no – whereas we speak in terms of probability.

Jessica Hain) Are there any areas in addiction research which you would like to see explored in the future?

I think we need more in the area of opiate addiction, where we have not really seen a step-change in what we can offer for decades – we need new treatments to help those struggling with street heroin and also those whose use of opioid analgesics has escalated uncontrollably.

Leire Melgosa) What is your advice to young scientists (i.e. a hard truth or something we tend to worry about but should not)?

Jessica Hain) Do you have any advice for neuroscience students?

Same answer to both – if you are interested in a topic or technique, join a group working in this area and use it like an apprenticeship. I think it is also important to have people around you that you enjoy working with to mentor and help you to keep going towards your goal. Neuroscience is a great area to be in and there is always something new and exciting happening.

Interview with Professor Helga Nowotny: ‘In basic or fundamental research one does not know what one will find, yet it is precisely this uncertainty that is attractive’


Professor Helga Nowotny is Former President of the European Research Council (ERC) and one of its Founding Members. She is Professor Emerita of Science and Technology Studies at ETH Zurich and a Visiting Professor at Nanyang Technological University.


Kausar Raheel) You have held different positions throughout your career. What is the most rewarding aspect of your job? What drives you?

Being able to follow what I like to do: to be curious about the world, people, ideas…and passionate in exploring how we might improve the human condition.

Kausar Raheel) How did you come to be passionate about what you do?

How does one become what one is? A mixture of luck with what nature, parents and our early social environment have endowed us with and what we were able to make of it. I feel very humble about what I have been able to achieve and privileged in being able to do what I strongly care about.

Kausar Raheel) What do you wish to see more from female scientists?

More self-confidence and persistence. Don’t let anyone discourage you – women can make it!

Kausar Raheel and Esther Awodipe) What are your aspirations and advice for young female scientists?

My answer is very simple and always the same: choose your partner well! Why? To embark on a scientific career and, even more importantly, to sustain it, you will need the full support of your partner. As probably neither you nor your partner already has experience in juggling the precarious life-work (im)balance that awaits you in science, you might want to involve a mentor to discuss with both of you what this entails. Don’t fall into illusions – life is sufficiently hard, but life in science is also immensely rewarding.

Esther Awodipe) Have you experienced any challenges working in STEM as a woman? If so, how did you overcome these challenges?

My first challenge came when I applied for my first job with a professor who knew my work well. He frankly declared that he prefers a man for the position – in those days this could still be stated openly! I asked him why and he had rational arguments: sooner or later I would get married, have kids and his investment into my training would be lost as I might leave the job or be less committed. In the end, we agreed: if he finds a man whose overall qualifications are better than mine, the job should go to him. Well, I got the job! But this is not the end of the story. After two years I got married and left with my husband to New York where I started a new career. The cunning of reason was on my side after all.

Esther Awodipe) Who is your biggest inspiration and why?

Although there are many people who do great things, I had to find my own way.

Esther Awodipe) How do you motivate yourself and stay motivated?

Motivation was never a problem for me. As I said before, I feel privileged that I can do what I like to do. But I had also had to work hard to get there.

Esther Awodipe) How do you manage your work-life balance?

Taking a break when it feels necessary, getting enough sleep, keeping fit, laughing…

Ines Das Neves) Based on your experience in several areas, but particularly through your work in the European Research Council, what, to you, are the major problems with the way peer review processes are currently conducted or set up to work? Do you think there is room for much change within the frameworks for research funding and policy that are presently in place?

We worked hard at the European Research Council to set up a peer review culture unlike any other. Not that we changed the procedure, but we made huge efforts to find the best scientific minds to serve in our evaluation panels, people who were broad in their scientific knowledge and open-minded, having good judgement and looking for scientific excellence only. But nobody is exempt from unconscious bias. We carefully monitored what we observed, giving feedback to panel members and showing them examples of where unconscious bias might have occurred.

The general problem is that the peer-review system is under immense pressure. It is bursting at the seams. Funding agencies then resort to formal mechanisms, i.e. indicators of various kinds and quantitative measures. While this alleviates stress to the system, it can never fully replace a carefully calibrated peer-review system with good human judgment.

The solution is to differentiate much better: which funding streams can rely on light reviewing, perhaps with very simple indicators? Which funding streams need a (perhaps quantitative) preselection, followed by careful interviews with the pre-selected candidates? When should reviewers read the publications of the candidates, and when is it sufficient to look only at the publication list? Various other solutions are being discussed right now, but the most important thing is not to rely on ‘one size fits all’.

Ines Das Neves) Why do you believe that embracing uncertainty is important to scientific research and the way its findings are presented to the general public?

As I wrote in ‘The Cunning of Uncertainty’, science and scientists thrive on the cusp of uncertainty. In basic or fundamental research one does not know what one will find, yet it is precisely this uncertainty that is attractive, pushing scientists into the yet unknown. Speaking to the public, one has to have the courage to admit that absolute certainty does not exist and that we all have to learn to live with probabilities. We have to communicate better that scientific knowledge is always preliminary in the sense that it will be replaced by more and better knowledge. We also have to say loudly and clearly that scientific findings hold under certain conditions that can be spelled out. There is no simple and unconditional ‘yes’ or ‘no’ to many of the questions that preoccupy the public. But we can and should explain better how science works and how scientific findings are arrived at. It can be done!

Ines Das Neves) You have recently written the book An Orderly Mess. What drove you to write about this particular subject?

This little book consists of two essays. ‘Revisiting Eigenzeit’ is based on an inaugural lecture I was invited to give for a big research project at the Haus der Kulturen in Berlin. It is an analysis of what has changed in our experience of time since I first published my book on time (the German title was ‘Eigenzeit’) some thirty years ago. The Central European University Press (yes, from the embattled Central European University in Budapest that is under threat of being closed) wanted to translate and publish it. They asked me to write a second essay, and I thought that the topic of ‘messiness’ cuts across societal concerns and can be found in science as well. In a sense, it follows the cross-cutting theme of uncertainty. Messiness alerts us to the dynamics of order and disorder and our involvement in generating both, but also to how to deal with it.

Carolina Beppi) The educational system can be considered a 'positive constraint' on our creativity, providing us with the scientific knowledge and tools that spur our creative ideas. On the other hand, the educational system represents a 'negative constraint', limiting our research to a corrupted 'school of thinking', where scientific discovery and advancement are limited by the demands and interests of the capitalist economy and market. What 'revolutionary action' should we enact, in your opinion, as socially aware scientists?

I don't think that 'revolutionary action' is anywhere on the horizon these days. We speak about 'responsible research and innovation', of more sharing, better cooperation and open science. The point is that greater awareness is needed of the many complex interrelationships between science and society in a global and globalising world. We need to become more aware of our interdependencies, but also of our ability to act. We are neither the pawns of history, nor its masters. But we often have greater freedom to act than we may think. I would like to see more young scientists become what I call 'competent rebels'.

Carolina Beppi) Successful industries are now applying machine learning classification and prediction techniques to consumer data to guide their market strategies and maximise their outcomes. Only those industries that successfully integrate this software will survive in the economy. At that point, economic power will simply become a matter of 'who owns the biggest data'. To what extent do you agree with this statement?

This is too deterministic for me. All big technological paradigm changes have produced monopolies. They have to be broken up and regulated by the state. This is the point where we are now with Google, Amazon, Facebook and the like. I admit that it is more complicated now, as the state has partly been overtaken by markets and we have to factor in the global world. But I am convinced that regulation of the 'big data' world will happen.

Carolina Beppi) Algorithms that extend predictions to the aggregate level, the increasing accumulation of big data, and developments in risk management cannot protect against 'the unpredictable' – the 2008 financial crisis is one exemplification. Do you believe the development and integration of AI will soon lead to new worldwide economic and social risks in the context of both employment and security?

The risks are there, undoubtedly.

Carolina Beppi) Rita Levi-Montalcini was one of the greatest scientific minds of the 20th century. She believed that 'Women have always had to fight doubly. They always had to carry two weights, the private and the social. Women are the vertebral column of society'. What are your thoughts in this regard?

Rita Levi-Montalcini was right, but we have to challenge men more and better to share the burden. With humanity stumbling into an uncertain and artificial future in which humans and non-human artefacts will mix and mingle in unforeseen ways, new forms of living together will evolve. Perhaps much of the aggressive misogynistic behaviour that we see today is the last rear-guard fight of a patriarchy that will no longer have a place in the future. Let us remember: whatever life forms evolve depends also on us. My hope is that they will be more inclusive and based on mutual respect.

Interview with Professor Jackie de Belleroche: ‘Just be yourself, follow your dream’

Professor Jackie de Belleroche leads a research group at Imperial College London with a strong commitment to investigating Amyotrophic Lateral Sclerosis (ALS) through molecular genetics, expression profiling in the spinal cord, and the use of experimental models to develop new therapeutic approaches.

The key aims of this new blog are to enhance the curriculum and innovate pedagogy, to highlight the contribution of women in academia within and outside the College, and to engage and inspire society. The founder and editor of the blog is Dr Stefano Sandrone, Teaching Fellow within the Faculty of Medicine, and the contributors are Imperial's MSc Translational Neuroscience students.

Leire Melgosa) What made you choose science and the area of research you work in?

I was always deeply fascinated by Molecular Biology from DNA to medical research.

Caroline Schaufelberger) What were the main considerations you made when taking your career forward in research rather than in a clinical environment?

I was lucky that my early research was always relevant to neurological disorders such as epilepsy and stroke. This background helped me to gain a lectureship, a joint appointment in Biochemistry and Neurology, which provided the perfect environment to pursue research in clinical Neuroscience.

Leire Melgosa) What has been the most challenging part of your career, and the most satisfying one, so far?

In common with many others, the most challenging stage was finding an established position that would enable me to develop as an independent researcher. For me, this came with the award of a Mental Health Foundation Fellowship. Research is full of surprises, but each new discovery, however small, is very satisfying. However, even more than this, is to see the success of members of my research group in setting up their own research teams and establishing themselves in their various chosen careers.

Ryan Dowsell) How realistic is it to think that gene mutations play a much more significant role in motor neurone disease (ALS) than we currently believe?

Understanding how gene mutations cause disease in families has had a phenomenal effect on our ability to define the processes that underpin disease, which also occur in sporadic cases and are therefore targets for therapeutic intervention. This is, of course, only part of the story: ALS is an adult-onset disorder, the effects of ageing play an important role, and an increasing number of DNA variants are being discovered that modify gene function and act as risk factors. Overall, multiple factors will undoubtedly contribute to the final evolution of disease, e.g. age at onset and duration of disease.

Claudia Ghezzou) Throughout your career, what have been the motivations and incentives that have kept you focused on your impact as a researcher, regardless of the difficulty of the process or discouraging results from the research carried out?

I have always been committed to finding out the basis of disease, in both neurodegenerative and psychiatric disorders. This is never-ending: whatever the setbacks, the challenge is always there to find a way forward. Discouraging results may even provide a greater depth of understanding and lead to a more fruitful approach.

Shinil Raina) How has your experience as a woman in science changed since when you started as an undergrad?

Jessica Hain) Did you encounter any difficulties in your pursuit to be a female scientist, and do you think attitudes have changed over the years? If so, to what extent?

Of course attitudes have changed monumentally, which is a major achievement, but there is still room for improvement. As a scientist, I did not think of myself as being different for being female. Travelling to international conferences, where female representation amongst the speakers is low, just makes you realise how much has been achieved.

Caroline Schaufelberger) What do you think are the main challenges for women in science and what has your experience taught you that would aid the next generation tackle these challenges?

Danielle Kurtin) How do you balance work as well as motherhood?

These are of course relevant to all walks of life. The framing of your question is absolutely correct in addressing the issue of balancing work and motherhood. Once you return from maternity leave you need to find a balance, allocating your time as equally as possible between your work and your family. There are many options available for childcare, and universities like Imperial have many facilities and other types of support, which all help to make life easier at what would otherwise be a challenging time. My daughters greatly benefitted from attending the crèche and nursery, which I highly recommend, but many other options are available.

Ryan Dowsell) How has working at Imperial College London helped to shape your career as a woman in science?

Imperial is a very dynamic and supportive place to work, whether male or female.

Caroline Schaufelberger) In our MSc cohort especially, there is a higher percentage of women than men. What do you see as the importance of more women entering a career in science?

Ryan Dowsell) Throughout my undergraduate and postgraduate studies, I've noticed that my classes have always included more female than male students. From your experience, what do you think is causing the gender disparity between the roots of STEM and managerial positions?

Across UK universities as a whole, there are similar gender differences in particular subjects: more females in biological subjects, more males in engineering. Medicine and chemistry are more evenly balanced. The disparity may be rooted historically in traditional attitudes, but there have been marked changes in some subjects, such as Medicine, which has reached equality between males and females compared with a few decades ago. I am encouraged to see more women embarking on careers in the sciences, where there will be increasing opportunities available.

Jessica Hain) Do you have any advice for neuroscience students?

If you have a passion for a subject, you should pursue it.

Leire Melgosa) Do you have specific advice for young female scientists (e.g. a hard truth, or something we tend to worry about but should not)?

Just be yourself, follow your dream. There are always difficult times, whether male or female, but there is always a way through.


Photography credit: Jackie King

Cognition, depression and cognitive changes in Major Depressive Disorder: a brief consideration

Carolina received her Psychology, Clinical and Cognitive Neuroscience bachelor's degree from Royal Holloway, University of London, and has recently obtained an MSc in Translational Neuroscience from Imperial College London: 'Cognition is extremely fascinating. It is my belief that understanding how we learn is the key to self-actualisation.'


Cognition is a term that refers to all the mental processes involved in interpreting the environment, including overt behaviours and emotional states, by means of sensory experience (Neisser, 2014). For example, while watching a rugby match, we may interpret physical contact between two players as painful based on the visual information of bleeding and the emotional reaction of a cry. The mental processes supporting the production of deliberate actions are also components of cognition (Neisser, 2014). Reaching for an object, for instance, requires focused attention on the target and online adjustments of the body's position in space relative to the object.

Cognitive abilities are a key determinant of quality of life. Reports suggest that cognitively skilled children obtain more prestigious occupations later in life (Cheng & Furnham, 2012), and a significant portion of the variability in academic achievement in adulthood is explained by general cognitive ability (Rohde & Thompson, 2007). In the elderly population, good cognitive functioning is associated with more satisfactory interpersonal relationships (Watanabe et al., 2016), engagement in social activities, hobbies, sports and volunteering, and generally more diversified daily activities (Fu et al., 2018).

Ageing is accompanied by a cognitive deterioration that affects fluid and crystallised abilities differently (Cattell, 1943). The former comprise the reasoning skills employed to solve mental operations and tasks (Murman, 2015), such as inferring the fastest bus route to a specific city district. Crystallised abilities instead constitute the cumulative knowledge (i.e. facts) and experience acquired over time (Murman, 2015). In 2009, a large cross-sectional study revealed that while fluid skills steadily deteriorate from late adolescence, crystallised abilities increase with age, reaching a plateau in the 60s (Salthouse, 2009).

Cognition is, in part, genetically determined (Hill et al., 2014). However, a wide range of modifiable environmental factors can 'accelerate' or 'attenuate' cognitive decline. Interpersonal relationships contribute to maintaining cognitive health by promoting social identification and support (Haslam et al., 2016). Furthermore, cognitive functioning is supported by higher levels of education, socioeconomic status and intellectual activity (Parisi et al., 2012). Conversely, various environmental risk factors can negatively affect cognition, such as smoking (Waisman Campos et al., 2016), a poor diet (Shatenstein et al., 2012) and low levels of physical activity (Kelly et al., 2014).

Cognitive functions can also be affected by depression. Major depressive disorder (MDD) is a common psychiatric disorder that negatively affects mood, thoughts and behaviours, compromising the ability to function both at work and at home. It is characterised by symptoms including low mood, loss of interest, fatigue, appetite changes, suicidal thoughts, difficulty thinking and irritability, lasting for a period of at least 2 weeks (American Psychiatric Association, 2013). Major depression is one of the most disabling illnesses: it contributes to early mortality, being implicated in some 60% of suicides worldwide (World Health Organization, 2009), and it carries elevated annual pharmaceutical costs (Kessler, 2012), thus constituting a major global social and economic concern.

The onset of MDD typically occurs in late adolescence, and it is more prevalent among women than men (Hasin et al., 2018). It affects ~6.7% of individuals annually, and ~16.6% of the population experience it at least once in their life (Kessler et al., 2005). Importantly, depression not only represents a risk factor for accelerated cognitive decline, but also affects cognitive functioning in everyday life. Depressed individuals are more likely than healthy controls to 'misinterpret' neutral faces as expressing anger (Watters & Williams, 2011). This 'tendency' towards negative emotions in depression is moreover accompanied by a lower 'reactivity' to positive stimuli, whereby depressed individuals rate hedonic pictures as less pleasant and less arousing than healthy subjects do (Sloan et al., 1997).

Furthermore, depression is generally linked to deficits of memory (Burt et al., 1995), including decreased recognition and retrieval of previously learned words (Brand et al., 1992); of visuomotor ability, such as a reduced horizontal eye-movement span (Deijen et al., 1993); and of mental flexibility (Airaksinen et al., 2004), language fluency (Reischies & Neu, 2000), abstract reasoning (Naismith et al., 2003) and decision-making, with an increased risk of impulsively engaging in pleasurable activities with harmful consequences (Chamberlain & Sahakian, 2006).

What is the impact of existent therapies on depression and cognition?

Recently, the effects of three different antidepressant drugs were tested on patients with MDD and diminished functioning in seven cognitive domains, including attention, verbal memory, mental flexibility, decision-making and information processing (Shilyansky et al., 2016). The results showed that while all the antidepressants succeeded in improving mood-related symptoms, their effectiveness on cognition was limited to only two domains: mental flexibility and executive functions, namely organising and planning skills. Talking treatments, such as cognitive behavioural therapy, have been shown to improve the clinical symptoms of depression (Twomey et al., 2014), although their impact on cognitive dysfunction is yet to be explored.

This is an exciting research avenue for the future: further investigations are required to delineate the distinct components of cognition, to elucidate the nature and extent of the cognitive changes in depression, and to develop novel approaches to address specific deficits.


Airaksinen, E, Larsson, M, Lundberg, I and Forsell, Y (2004) Cognitive functions in depressive disorders: evidence from a population-based study. Psychological Medicine 34(1): 83-91.

American Psychiatric Association (2013) Diagnostic and statistical manual of mental disorders (DSM-5). American Psychiatric Publishing.

Brand, AN, Jolles, J and Gispen-de Wied, C (1992) Recall and recognition memory deficits in depression. Journal of Affective Disorders 25(1): 77-86.

Burt, DB, Zembar, MJ and Niederehe, G (1995) Depression and memory impairment: a meta-analysis of the association, its pattern, and specificity. Psychological Bulletin 117(2): 285.

Cattell, RB (1943) The measurement of adult intelligence. Psychological Bulletin 40(3): 153.

Chamberlain, SR and Sahakian, BJ (2006) The neuropsychology of mood disorders. Current Psychiatry Reports 8(6): 458-463.

Cheng, H and Furnham, A (2012) Childhood cognitive ability, education, and personality traits predict attainment in adult occupational prestige over 17 years. Journal of Vocational Behavior 81(2): 218-226.

Deijen, JB, Orlebeke, JF and Rijsdijk, FV (1993) Effect of depression on psychomotor skills, eye movements and recognition-memory. Journal of Affective Disorders 29(1): 33-40.

Fu, C, Li, Z and Mao, Z (2018) Association between social activities and cognitive function among the elderly in China: a cross-sectional study. International Journal of Environmental Research and Public Health 15(2): 231.

Hasin, DS, Sarvet, AL, Meyers, JL, Saha, TD, Ruan, WJ, Stohl, M and Grant, BF (2018) Epidemiology of adult DSM-5 major depressive disorder and its specifiers in the United States. JAMA Psychiatry 75(4): 336-346.

Haslam, C, Cruwys, T, Milne, M, Kan, CH and Haslam, SA (2016) Group ties protect cognitive health by promoting social identification and social support. Journal of Aging and Health 28(2): 244-266.

Hill, WD, Davies, G, Van De Lagemaat, LN, Christoforou, A, Marioni, RE, Fernandes, CPD, Liewald, DC, Croning, MD, Payton, A, Craig, LC and Whalley, LJ (2014) Human cognitive ability is influenced by genetic variation in components of postsynaptic signalling complexes assembled by NMDA receptors and MAGUK proteins. Translational Psychiatry 4(1): e341.

Kelly, ME, Loughrey, D, Lawlor, BA, Robertson, IH, Walsh, C and Brennan, S (2014) The impact of cognitive training and mental stimulation on cognitive and everyday functioning of healthy older adults: a systematic review and meta-analysis. Ageing Research Reviews 15: 28-43.

Kessler, RC (2012) The costs of depression. Psychiatric Clinics of North America 35(1): 1-14.

Kessler, RC, Berglund, P, Demler, O, Jin, R, Merikangas, KR and Walters, EE (2005) Lifetime prevalence and age-of-onset distributions of DSM-IV disorders in the National Comorbidity Survey Replication. Archives of General Psychiatry 62(6): 593-602.

Leykin, Y, Roberts, CS and DeRubeis, RJ (2011) Decision-making and depressive symptomatology. Cognitive Therapy and Research 35(4): 333-341.

Murman, DL (2015) The impact of age on cognition. Seminars in Hearing 36(3): 111. Thieme Medical Publishers.

Naismith, SL, Hickie, IB, Turner, K, Little, CL, Winter, V, Ward, PB, Wilhelm, K, Mitchell, P and Parker, G (2003) Neuropsychological performance in patients with depression is associated with clinical, etiological and genetic risk factors. Journal of Clinical and Experimental Neuropsychology 25(6): 866-877.

Neisser, U (2014) Cognitive psychology: Classic edition. Psychology Press.

Parisi, JM, Rebok, GW, Xue, QL, Fried, LP, Seeman, TE, Tanner, EK, Gruenewald, TL, Frick, KD and Carlson, MC (2012) The role of education and intellectual activity on cognition. Education 15: 19.

Reischies, FM and Neu, P (2000) Comorbidity of mild cognitive disorder and depression – a neuropsychological analysis. European Archives of Psychiatry and Clinical Neuroscience 250(4): 186-19.

Rohde, TE and Thompson, LA (2007) Predicting academic achievement with cognitive ability. Intelligence 35(1): 83-92.

Salthouse, TA (2009) Decomposing age correlations on neuropsychological and cognitive variables. Journal of the International Neuropsychological Society 15(5): 650-661.

Shatenstein, B, Ferland, G, Belleville, S, Gray-Donald, K, Kergoat, MJ, Morais, J, Gaudreau, P, Payette, H and Greenwood, C (2012) Diet quality and cognition among older adults from the NuAge study. Experimental Gerontology 47(5): 353-360.

Shilyansky, C, Williams, LM, Gyurak, A, Harris, A, Usherwood, T and Etkin, A (2016) Effect of antidepressant treatment on cognitive impairments associated with depression: a randomised longitudinal study. The Lancet Psychiatry 3(5): 425-435.

Twomey, C, O'Reilly, G and Byrne, M (2014) Effectiveness of cognitive behavioural therapy for anxiety and depression in primary care: a meta-analysis. Family Practice 32(1): 3-15.

Waisman Campos, M, Serebrisky, D and Castaldelli-Maia, JM (2016) Smoking and cognition. Current Drug Abuse Reviews 9(2): 76-79.

Watanabe, K, Tanaka, E, Watanabe, T, Chen, W, Wu, B, Ito, S, Okumura, R and Anme, T (2016) Association between social relationships and cognitive function among the elderly. Public Health Research 6(2): 59-63.

Watters, AJ and Williams, LM (2011) Negative biases and risk for depression; integrating self-report and emotion task markers. Depression and Anxiety 28(8): 703-718.

World Health Organization (2009) Depression: A global public health concern. Accessed 28 May 2018.


Picture credits: Wellcome Collection

Neuroscience behind that perfect cup of morning coffee

Yi-Ting Wang (Tina) is a prospective neuroscientist at the starting point of her scientific career. She uses her little grey cells trying to solve some of the 21st century's most fascinating problems at Imperial College London. 'Like the entomologist in search of colourful butterflies, my attention has chased in the gardens of the grey matter cells with delicate and elegant shapes, the mysterious butterflies of the soul, whose beating of wings may one day reveal to us the secrets of the mind.' – Santiago Ramón y Cajal

If you are an avid coffee drinker like me, the best way to kick off a day is brewing that one tasty cup of sweet-smelling morning coffee, or grabbing one at your favourite coffee shop on the way to work. During the day, you probably need a few more cups to keep you going. The active ingredient in coffee is caffeine, the most widely used psychoactive drug in the world. How does the whole process actually work, and how does caffeine affect our brain? Let's reveal the secret of this magical brain fuel from a neuroscientific perspective.

How does caffeine act on our brain?

It's normal to grow tired as the day progresses because our brains naturally secrete a molecule called adenosine. Briefly, adenosine influences attention, alertness and sleep. Adenosine builds up in our brain, and when it reaches a certain level, our body knows it's bedtime. In simple words, caffeine hijacks this system by competing with adenosine for its receptors. By blocking the action of adenosine, we end up feeling more alert and awake (Ribeiro and Sebastião, 2010). It is worth noting that a considerable amount of research reports that coffee can improve cognitive performance and decrease the risk of neurodegenerative diseases such as Parkinson's disease (PD) and Alzheimer's disease (AD).

A coffee a day keeps the doctor away?

Three different epidemiological studies performed in Spain (Jimenez-Jimenez et al., 1992), Germany (Hellenbrand et al., 1996) and Sweden (Fall et al., 1999) reported an inverse, dose-responsive relationship between coffee consumption and the risk of developing PD. Two meta-analyses also showed that the risk of developing PD decreased by 31% (Hernan et al., 2002) and 25% (Costa et al., 2010) respectively in coffee drinkers compared to non-coffee drinkers. However, a more recent case-control study suggested only a weak inverse association between coffee intake and the risk of PD (van der Mark et al., 2014). Though the debate is ongoing, experimental studies have identified a possible mechanism behind caffeine's potential preventative role in the development of PD.

Classically, the primary pathology of PD involves the degeneration of dopaminergic neurons that originate in the substantia nigra and project to the striatum, a principal component of the basal ganglia. Common PD symptoms such as slow movement, tremors and rigidity result from cell death in the basal ganglia and their connecting pathways. Low doses of caffeine have been shown to mainly antagonise adenosine A2A receptors. The blockade of A2A receptors stimulates dopaminergic D2 receptors and, as a result, increases motor activity and improves motor deficits in PD models (Fenu and Morelli, 1998; Kuwana et al., 1999).

A wealth of studies suggests that regular and moderate coffee intake over a lifetime reduces the risk of developing AD. A study published in 2012 gathered preclinical and clinical evidence and found a protective role of caffeine against AD: caffeine can reduce the risk of dementia, or delay its onset, an effect particularly evident in patients with mild cognitive impairment (Cao et al., 2012). Among the most prominent studies, a case-control study showed that caffeine consumption was inversely associated with AD development (Maia and de Mendonça, 2002). In the CAIDE study, 1,409 elderly people were analysed after a 21-year follow-up. Coffee consumption in midlife was shown to decrease the risk of AD and dementia, with the lowest risk (a 65% decrease) found in people who drank 3-5 cups/day (Eskelinen et al., 2009). Animal studies have helped identify the possible mechanisms behind coffee's effects on AD risk. Dr Gary Arendash and colleagues found that caffeine improved the learning and memory ability of transgenic mice and reduced the concentration of β-amyloid and presenilin in the hippocampus, the main brain structure involved in memory (Arendash et al., 2006). Caffeine has also been shown to reduce inflammatory mediators, which is another possible explanation of why it could ameliorate AD progression (Arendash et al., 2009; Cao et al., 2009).

How much coffee can we drink?

We have discussed some effects of caffeine on the brain based on different research findings. However, most of the evidence so far, for both the benefits and the adverse effects of caffeine, derives from observational studies, which means we cannot draw any conclusions about caffeine's causal effect on brain function; this awaits confirmation from further randomised controlled trials. Over the last decade, health authorities around the world have concluded that coffee/caffeine consumption is not harmful at levels of 300-500 mg daily (around 3-5 cups of coffee) (Nehlig, 2016). Back to the question: how much can we drink? Just remember that moderation is the key, and be a happy coffee drinker!



Arendash GW, Mori T, Cao C, Mamcarz M, Runfeldt M, Dickson A, Rezai-Zadeh K, Tane J, Citron BA, Lin X, Echeverria V, Potter H (2009) Caffeine reverses cognitive impairment and decreases brain amyloid-β levels in aged Alzheimer's disease mice. Journal of Alzheimer's Disease 17(3):661-80.

Arendash GW, Schleif W, Rezai-Zadeh K, Jackson EK, Zacharia LC, Cracchiolo JR, Shippy D, Tan J (2006) Caffeine protects Alzheimer's mice against cognitive impairment and reduces brain β-amyloid production. Neuroscience 142(4):941-52.

Cao C, Cirrito JR, Lin X, Wang L, Verges DK, Dickson A, Mamcarz M, Zhang C, Mori T, Arendash GW, Holtzman DM, Potter H (2009) Caffeine suppresses amyloid-β levels in plasma and brain of Alzheimer's disease transgenic mice. Journal of Alzheimer's Disease 17(3):681-97.

Cao C, Loewenstein DA, Lin X, Zhang C, Wang L, Duara R, Wu Y, Giannini A, Bai G, Cai J, Greig M, Schofield E, Ashok R, Small B, Potter H, Arendash GW (2012) High blood caffeine levels in MCI linked to lack of progression to dementia. Journal of Alzheimer's Disease 30(3):559-72.

Costa J, Lunet N, Santos C, Santos J, Vaz-Carneiro A (2010) Caffeine exposure and the risk of Parkinson's disease: a systematic review and meta-analysis of observational studies. Journal of Alzheimer's Disease 20 Suppl 1:S221-38.

Eskelinen MH, Ngandu T, Tuomilehto J, Soininen H, Kivipelto M (2009) Midlife coffee and tea drinking and the risk of late-life dementia: a population-based CAIDE study. Journal of Alzheimer's Disease 16(1):85-91.

Fall PA, Fredrikson M, Axelson O, Granérus AK (1999) Nutritional and occupational factors influencing the risk of Parkinson's disease: a case-control study in southeastern Sweden. Movement Disorders 14(1):28-37.

Fenu S and Morelli M (1998) Motor stimulant effects of caffeine in 6-hydroxydopamine-lesioned rats are dependent on previous stimulation of dopamine receptors: a different role of D1 and D2 receptors. The European Journal of Neuroscience 10(5):1878-84.

Hellenbrand W, Seidler A, Boeing H, Robra BP, Vieregge P, Nischan P, Joerg J, Oertel WH, Schneider E, Ulm G (1996) Diet and Parkinson's disease. I: A possible role for the past intake of specific foods. Results from a self-administered food-frequency questionnaire in a case-control study. Neurology 47(3):636-43.

Hernán MA, Takkouche B, Caamaño-Isorna F, Gestal-Otero JJ (2002) A meta-analysis of coffee drinking, cigarette smoking, and the risk of Parkinson's disease. Annals of Neurology 52(3):276-84.

Jimenez-Jimenez FJ, Mateo D, Giménez-Roldan S (1992) Premorbid smoking, alcohol consumption, and coffee drinking habits in Parkinson's disease: a case-control study. Movement Disorders 7(4):339-44.

Kuwana Y, Shiozaki S, Kanda T, Kurokawa M, Koga K, Ochi M, Ikeda K, Kase H, Jackson MJ, Smith LA, Pearce RK, Jenner PG (1999) Antiparkinsonian activity of adenosine A2A antagonists in experimental models. Advances in Neurology 80:121-3.

Maia L and de Mendonça A (2002) Does caffeine intake protect from Alzheimer's disease? European Journal of Neurology 9(4):377-82.

Nehlig A (2016) Effects of coffee/caffeine on brain health and disease: What should I tell my patients? Practical Neurology 16(2):89-95.

Ribeiro JA and Sebastião AM (2010) Caffeine and adenosine. Journal of Alzheimer's Disease 20 Suppl 1:S3-15.

van der Mark M, Nijssen PC, Vlaanderen J, Huss A, Mulleners WM, Sas AM, van Laar T, Kromhout H, Vermeulen R (2014) A Case-Control Study of the Protective Effect of Alcohol, Coffee, and Cigarette Consumption on Parkinson Disease Risk: Time-Since-Cessation Modifies the Effect of Tobacco Smoking. PLoS One 9(4):e95297.

Advanced reading

Fredholm BB, Bättig K, Holmén J, Nehlig A, Zvartau EE (1999) Actions of caffeine in the brain with special reference to factors that contribute to its widespread use. Pharmacological Reviews 51(1):83-133.

Bridging the gap between neuroscience and spirituality: some notes on religion and mental health

Shivani graduated from the University of Hertfordshire with a BSc (Hons) in Biomedical Science and is currently studying for an MSc in Translational Neuroscience at Imperial College London.

A hallmark of human life, religion is a crucial component of all cultures, although its biological basis has been fiercely debated (Bulbulia, 2004; Kapogiannis et al., 2014; Kapogiannis et al., 2009). The Oxford English Dictionary defines religion as the action of believing in, abiding by and revering God or a superhuman power (Murray et al., 1961; Ashbrook et al., 1997). It involves social beliefs and traditions associated with ritual and ceremony (Simon et al., 2010; Ashbrook et al., 1997). Belonging to a religion provides support through various means, including, though not limited to, creating peace, purpose, self-confidence and a positive self-image, and increasing acceptance and resilience (Moreira-Almeida et al., 2006). Furthermore, it comprises a set of guidelines that enable better control and management of stress and strain (Behere et al., 2013). Studies have also demonstrated that certain aspects of religion relate to specific networks in the brain, which suggests that religious beliefs are embedded within neural networks (Kapogiannis et al., 2009).

Reports indicate a relationship between religiousness and mental health and well-being. For example, a review of the empirical literature found a correlation between religious commitment and mental health (Gartner et al., 1991). The rate of suicide was indeed negatively correlated with the degree of religiosity, with church attendance associated with lower suicide rates (Myers and Diener, 1995). Faith provided widowed women with a greater sense of joy, mothers of disabled children were less susceptible to depression, and people remained satisfied and content during serious illness, bereavement, unemployment or divorce (Myers and Diener, 1995). Maintaining firm faith in one's religion has also been associated with higher self-esteem (Myers and Diener, 1995). Religion thus appears to be one important route to a sense of well-being, which can be linked to several variables such as meaning, life satisfaction and purpose (Behere et al., 2013). Therefore, religion may enhance a sense of an empowered and efficacious self (Gartner et al., 1991; Bailey, 1997; Behere et al., 2013).

Belonging to a religious group also has several advantages. These include social cohesion, in which members are part of a loving and caring environment (Behere et al., 2013). For instance, the Bochasanwasi Shri Akshar Purushottam Swaminarayan Sanstha (BAPS), a branch of Hinduism, adopts the concepts of samp, suradhbhav and ekta (unity, fraternity and solidarity), which are encouraged by their present guru and president, His Holiness Mahant Swami Maharaj, to maintain continuity in relationships with friends, families and other devotees (Paramtattvadas, 2017). Additionally, Jewish Care, a health and social care provider for the Jewish community in the UK, aims to have a positive influence on Jewish people with mental health problems, dementia and disabilities (Shaw, 2018). Islamic values and beliefs are also better suited to the treatment of Muslims with mental illnesses, where the incorporation of Islamic beliefs supports drug adherence (Sabry and Vohra, 2013). Consequently, religion might promote well-being by offering a 'platform' that provides meaning, direction and personal identity. Interestingly, religion may also be relevant to some aspects of the treatment of schizophrenia, a severe psychiatric disorder characterised by hallucinations, delusions, impaired motivation, social withdrawal and cognitive dysfunction (Mohr and Huguelet, 2004; Owen, Sawa and Mortensen, 2016), and it may help enhance coping and foster recovery (Mohr and Huguelet, 2004).

Several religious practices have been shown to be effective in managing mental health; meditation is perhaps the most commonly studied. During meditation, there is a strong interaction between different regions of the brain implicated in self-monitoring and cognitive control (Brewer et al., 2011). The practice can also stimulate changes in personality, reduce tension and anxiety, and stabilise emotions (Behere et al., 2013), as well as ameliorate panic attacks, anxiety disorder, depression, drug use, insomnia and stress (Behere et al., 2013; Grant, 2013).

In conclusion, it is unfortunate that this field has not been studied more extensively in recent years. Religion and neuroscience are two distinct entities, yet research strongly suggests a relationship between them. It will therefore be interesting to elucidate the effect of a religious lifestyle on precise molecular and neuronal interactions in the brain. Future work should determine the effects of religion, if any, on the diseased brain, in order to decipher and fully harness its potential.



Ashbrook J, Albright C, Harrington A (1997). The humanizing brain. Cleveland, Ohio: The Pilgrim Press.

Behere P, Das A, Yadav R, Behere A (2013). Religion and mental health. Indian Journal of Psychiatry 55(6):187.

Brewer J, Worhunsky P, Gray J, Tang Y, Weber J, Kober H (2011). Meditation experience is associated with differences in default mode network activity and connectivity. Proceedings of the National Academy of Sciences 108(50):20254-59.

Bulbulia J (2004). The cognitive and evolutionary psychology of religion. Biology & Philosophy 19(5):655-686.

Gartner J, Larson D, Allen G (1991). Religious Commitment and Mental Health: A Review of the Empirical Literature. Journal of Psychology and Theology 19(1):6-25.

Grant J (2013). Meditative analgesia: the current state of the field. Annals of the New York Academy of Sciences 1307(1):55-63.

Kapogiannis D, Barbey A, Su M, Zamboni G, Krueger F, Grafman J (2009). Cognitive and neural foundations of religious belief. Proceedings of the National Academy of Sciences 106(12):4876-81.

Kapogiannis D, Deshpande G, Krueger F, Thornburg M and Grafman J (2014). Brain Networks Shaping Religious Belief. Brain Connectivity 4(1):70-9.

Kevern P (2013). Can Cognitive Science Rescue 'Spiritual Care' from a Metaphysical Backwater? Journal for the Study of Spirituality 3(1):8-17.

Mohr S and Huguelet P (2004). The relationship between schizophrenia and religion and its implications for care. Swiss Medical Weekly 134(25-26):369-76.

Moreira-Almeida A, Lotufo Neto F, Koenig HG (2006). Religiousness and mental health: A review. Revista Brasileira de Psiquiatria 28(3):242-250.

Murray J, Bradley H, Craigie W and Onions C (1961). The Oxford English dictionary. Oxford: Clarendon Press.

Myers D and Diener E (1995). Who Is Happy? Psychological Science 6(1):10-19.

Owen M, Sawa A, Mortensen P (2016). Schizophrenia. The Lancet 388(10039):86-97.

Paramtattvadas S (2017). An Introduction to Swaminarayan Hindu Theology. Cambridge, UK: Cambridge University Press.

Sabry W, Vohra A (2013). Role of Islam in the management of Psychiatric disorders. Indian Journal of Psychiatry 55(6):205.

Shaw T (2018). Your Local Jewish Care. London, UK.

Interview with Dr Magdalena Skipper, new Editor-in-Chief of Nature

Kety Alania and Ines Das Neves) You are the first woman to become Editor-in-Chief of Nature in its 149-year history. What does this mean to you personally?

It is interesting actually, at some level it means very little. When you think about the role, my gender should not matter. I am in the role because of what I have to offer, both as a scientist first and now as an editor. My professional setting should be gender-neutral. Of course, in general, there are fewer women and less diversity in leadership roles and roles of influence: for these reasons it is important that I am a woman in this role.

As it happens, I am the first woman. What is worth remembering is that although Nature has been around for almost 150 years, I am only the 8th Editor-in-Chief. My predecessor, the outgoing Editor-in-Chief Sir Philip Campbell, was appointed 22 years ago. When you go back to the previous appointments, unfortunately, there were fewer women in positions that would have allowed them to compete for the role then, both from the perspective of personal engagement, education, and so on. Fortunately, things are changing, and this is important. Overall, I rarely think of myself as a role model in the true sense of the word, but if I can be a role model for young women, that is of course a delightful side effect of my appointment to this role.

Esther Awodipe) Have you experienced any challenges as a woman in science? And if so, how did you overcome these challenges?

I have to be honest and say that I personally have not, at least I do not believe that I have experienced any specific challenges, which of course is a delightful thing for me to be able to say. I have been fortunate enough to have had, since the very beginning, the kind of mentors and teachers who looked at me as a person in a gender-neutral way, and then encouraged me in the direction I wanted to proceed. That has always been the case: when I was at school, at the University of Nottingham, where I had fantastic tutors in the Genetics department, and similarly during my PhD and my postdoc.

The biggest challenge we all have to face is the decision on how to divide the time between the professional engagement (and science is a very jealous profession, to which you have to devote so much time) and personal life. Early career stages are also times during which one thinks about a family; while these considerations are equally important for men and for women, for biological reasons women tend to be more intensively engaged, at least for a period of time.

Kety Alania) Have you seen a change in recent years in terms of the number of women competing for and being appointed to leadership roles within or outside academic publishing? Also, do you think we can do more about that?

In general, we do see many more women who are much more visible in roles of responsibility and senior roles in academia, in publishing and in industry (or applied research, if you like), which is, of course, wonderful to see. It is just the right thing to be experiencing: there is no reason why they should not be there. There never was, but it is great that this is no longer so emphasised.

Publishing is a very interesting example. In my experience (and I have been working in science publishing for eighteen years now), it has always been quite well balanced. If anything, I would say that it is almost skewed towards women: many professional editors tend to be female rather than male.

Actually, here within Springer Nature some of the most senior roles on the editorial side are filled by women. The editorial director is a woman. The Nature editorial director is a woman. Many of the Chief Editors of Nature Research titles are women. My previous role, Editor-in-Chief of Nature Communications, was filled by me, a woman, until a few days ago. In this particular industry, women 'drive', and this is certainly recognised (and rightly so).

Ines Das Neves) Considering your background in genetics, what drew you to a career in publishing and editing, and away from a career in research?

My answer is actually agnostic of my specific background. Having spoken to many colleagues of mine over the years, the reason why I left the bench and active research and chose publishing actually resonates with many of them. The fact that I was a genetics researcher before is not related to that decision. I did a PhD and a postdoc. At some point during my postdoc years, I began to think about how I could most fruitfully contribute to the advancement of science. My postdoc was such that I would not have been in a position to start an independent career after one postdoc, so the plan would then have been to go on to a second postdoc. While I enjoyed research very much, I decided that I was probably not the person who would make the strongest contribution to the advancement of science through research. And yet I have really loved science ever since I was a child. I have always found scientific discoveries fascinating, and knowing how the world around me works fascinated me more than anything else. So I did not want to lose this kind of contact with scientific research.

Having considered a number of possibilities, I thought that an editorial career could be 'the' thing for me. At that time I began to look for editorial opportunities, and I saw a job advert for an Associate Editor at Nature Reviews Genetics, very much within the area in which I specialised. What happened next was this: the journal, like all Nature Reviews journals do, sent me a set of tests to be done at home. The tests consisted of a series of tasks you would do if appointed to the role. One of them was, for example, looking at a submitted review article to see whether it could be edited to improve the story flow. Another was to come up with a number of commissions, to effectively create a table of contents for a future issue of the journal, and to select some recently published research articles to write a short 'research highlight' about, and explain the choice. I had no previous experience in writing or editing. But as it happened, I enjoyed the test I did at home so much that by the time I came back for the interview I was obviously imbued with so much enthusiasm that, as I now joke, I almost bullied them into giving me the job.

For me, moving from research to being an editor was, in many ways, a perfect transition.

Mash Coomaraswamy) Do you feel like more needs to be done to open pathways for a more equal gender representation in the biological sciences?

Definitely we do need to encourage girls and young women, and this effort has to start very early on, at school or pre-school. We, as a society, anywhere, continue to have pre-set ideas of what girls and boys are like, and we reward our children for different things depending on their gender. The stereotyping unfortunately begins very early on. I do think we have to place more emphasis on equality from the very early days. The responsibility is not only on the educators, but on the whole of society, whether we are talking about our own children or the children of others.

Certainly this effort needs to continue through the schooling years, at university at undergraduate and postgraduate levels, during postdoc years and so on. I see no distinction between biological sciences, physical sciences or engineering. Right across the board, agnostic of the topic, we should encourage those who are interested in the study, regardless of their gender.

Esther Awodipe) How do you manage your work-life balance? 

With difficulty! (she smiles) I have never really made a complete distinction between work and life. Work is part of life for all of us; it is more of a continuum. I find what I do absolutely fascinating. Having said that, I do other things in life which have nothing to do with work. For example, I particularly enjoy pottery. One has to make an effort to switch off. Another thing I try to do as much as possible is not to check my emails when I am on holiday. It is for everyone's sake, really. For most of us, balancing leisure and work requires some active management of time, but I believe it is very important that we have leisure time and can switch off completely.

Esther Awodipe) Who is your biggest science inspiration?

I am inspired by a number of characters, not necessarily within science. But let's start with scientists.

My earliest inspiration, when I was still a child, was Isaac Newton. My parents bought me a book specifically written for young readers, a book about Newton's life. Partly thanks to how the author put the story across and explained the different contributions that Newton made and how they came about in his life, I was absolutely fascinated by the dedication to discovery and knowledge, and it may be that this is why I became a scientist. Although I do not remember a very clear connection, now, with the perspective of time, I think that was an early inspiration for me.

I have recently acquired a new inspiration: Alexander von Humboldt, somebody I did not know much about until recently. The remarkable thing about von Humboldt was that he was a bit of a Renaissance man and possibly the first environmental scientist. He was a multi-disciplinarian but, most importantly, somebody who was not afraid to go out there, empirically test ideas and collect data. He spent months and months travelling around the South American jungle, completely disregarding the associated discomforts, because he was so interested in collecting data. Importantly, he was a very progressive thinker. One of the things he encountered during his travels was slavery, and he spoke very openly against slavery and human inequality. Apparently, this was one of the few things he disagreed with Thomas Jefferson about.

Another person I was very inspired by, if I am allowed an inspiration from outside science, was a woman called Rosita Forbes. She was not a scientist, but she was an incredible woman. During the early 1930s she spent quite some time travelling in Central Asia, by herself and propelled by her own curiosity. At that time Central Asia was politically quite sensitive. She wrote the most wonderful diaries about her travels, in which she described her amazement at the cultural variety of Central Asia: it is an amazing chronicle. She was not formally trained as a scientist, but her writing tends towards social anthropology. I think the reason I am inspired by her is that in that climate, when women did not travel and did not necessarily have an existence independent of their male companions, she just went forth to a part of the world that nobody knew very much about (and which was associated with many different perils) and brought back incredibly rich stories; I find that very inspiring.

Sarah Lee) Over the past years working at Nature, what do you think has been the greatest change (development) with regard to publishing scientific data?

Certainly data availability, and being able to link scientific papers increasingly and effectively to the real underlying data. The problem is far from solved: a great challenge for publishers is to link the data even better with the results discussed in the paper and, ideally, to provide the data exactly as associated with each figure and each graph in a published paper. Publishers spend a lot of time thinking about how we can best do this. The ability to directly link and lead the readers of a paper, and of a story, to the dataset would allow them to replicate (at least in silico) how the data were analysed and how a particular conclusion was reached.

Sarah Lee) What has been the highlight of your career at Nature so far, from the very first job interview to nowadays?

What I am most excited by is something I was involved in when I was the genetics and genomics editor at Nature: the ENCODE project.

ENCODE is the Encyclopedia of DNA Elements. The Consortium had been working for years, producing a very large body of work, some of which had been published previously. But at some point a very significant bulk of information was ready to be published in a number of papers that were going to come out simultaneously. The dataset and the analyses were so rich that it was quite tricky to choose the story to tell. If you imagine the project as a landscape, writing papers to convey certain messages only allows you to discover certain aspects of that landscape. But there are other paths through which you can explore a landscape.

Together with the authors, we proposed to create what we called 'threads'. Threads were composed of sections from the already accepted papers: sections of text, figures or tables. Threads orthogonal to the existing papers revealed additional insights into the 'ENCODE landscape'. The reason I find it particularly exciting is that not only was it an added way of enriching the data and the analyses, but it was also a new thing in publishing: nobody had ever done this before. We did this not just with the papers we published in Nature, but together with papers published in other journals, such as Genome Biology and Genome Research, which belonged to other publishers. For me it was a wonderful opportunity, but a number of things had to align: we had amazingly rich datasets, which were ready to be published at the same time; we had interested and willing authors who took an extremely active part in the whole process; and we had the goodwill of the different publishers (Genome Biology is now part of Springer Nature, but at that time it was published by BMC, and Genome Research was published by Cold Spring Harbor Laboratory). A wonderful opportunity! And the effort was well rewarded by the response from the community: I had much positive feedback on this initiative.

Ines Das Neves) Do you think there should be any major shifts in the way Nature and scientific publishing in general work?

I certainly can see the need for some shifts. Although it is not an entirely new initiative, we have made a lot of effort towards rewarding and surfacing efforts to make research reproducible and robust. Again, much work remains to be done.

Another important focus is transparency. Transparency of research goes hand in hand with reproducibility. That said, in my view publishers and editors, who demand transparency from researchers, should themselves be more transparent about their own practices. We at Nature have begun this journey already: we try to surface exactly what it is that editors do, how we consider submissions and how peer review is conducted. For example, within the journal I led until a few days ago, Nature Communications, with the authors' agreement, we publish the referees' reports: we call it 'transparent peer review'.

It is important to add much more transparency to the whole scientific discussion that surrounds the publication of a paper. These are just a couple of examples of where I think we should be increasingly moving, and certainly that I would like to champion.

Alexandra Rother) What is in your opinion the best strategy to make sure that papers present reproducible results? What is your opinion on this on both sides, editors and scientists?

I would say to the scientists that a lot of reproducibility uncertainty arises from unclear reporting of how research was actually done. It is not that there is something fundamentally wrong with the way something was done; simply, insufficient detail is provided. In some sense, the most transparent approach would be for all researchers to switch to electronic notebooks and publish them as part of, or as a substitute for, the methods section of their papers. This would lead, by definition, to 100% transparency, although I am not entirely sure whether that would necessarily be helpful to all readers.

In the meantime, there are a number of platforms and venues available to researchers to publish their protocols in detail, to deposit their data and to describe their datasets in a specific way. For example, a number of journals are specifically dedicated to publishing descriptors of datasets. Springer Nature publishes one of them, called Scientific Data, but there are also others. It publishes an article type called 'Data Descriptors'. There are no results or conclusions there, but a description of a dataset published alongside, though not necessarily coinciding with, the analyses and results that emerge from it; the latter are published elsewhere. Of course, the beauty of spending more time describing the dataset is that others can re-use it. There are already a number of tools that scientists could take better advantage of and which can be linked to the papers themselves.

Mash Coomaraswamy) What advice would you give to those hoping to enter university or those unsure about pursuing a PhD and further study in science? 

The advice to those entering university: open your mind and enjoy learning about the most wonderful planet (and beyond) and the various aspects of whatever you have chosen to study. I remember being very excited when I was an undergraduate: learning about the mysteries of life. It sounds perhaps cheesy, but that really was the case. Uncovering how things work. Very importantly, uncovering what it is that we do not know yet.

I should say that I am still very excited by many of these questions today, so you do not have to stop being excited by them when you leave university. But I do think it is a wonderful time when you are a student. My advice would be: do not constrain yourself with other things, just immerse yourself in this sort of sea of knowledge as a student.

Moving on to the PhD and beyond, I would say the following: if you are thinking of doing a PhD because you cannot think of anything else to do, you probably should not do it. A PhD is hard work: you will have to dedicate yourself to a specific problem (to the exclusion of many other things) and be persistent. Most of the time experiments do not work, and you will encounter a lot of failure. But if you are genuinely interested in what you do, if you are curious and if you want to make a contribution to research, then the PhD and the postdoc are absolutely wonderful times to be exploring and making your own contribution.

If, after your PhD, you find that you are happy with your contribution but are not sure what to do next, and the path of least resistance seems to be a postdoc, I would say: think again, there are many other things you can do after a PhD in science. You are incredibly well qualified for all sorts of different roles in society: outside academia, in industry, in publishing, or in other aspects of academia. The somewhat clichéd question 'where do you see yourself in five years' time?' is useful to consider, despite being a cliché. Even if you do not know exactly what you want to do, perhaps you know what you do not want to do; that is already quite informative.

Alexandra Rother) What tips would you give to someone wishing to pursue a career in scientific writing/editing?

Scientific writing and scientific editing are actually quite separate. For scientific editing, there is no specific training or preparation. What you need is an open mind and the ability to consider and discuss a broad range of topics, broader than just your own area of research. If you are interested in pursuing this career, what really helps is talking to an editor. From my own experience, I can tell you that editors are very happy to talk about their careers. Editors attend scientific conferences and will often come and visit your university or institute. Why not invite an editor to come over and give a talk, or just have a lunch session with you and your peers, so that you can find out what specific advice they have for you?

Scientific writing usually requires a little more preparation. Some people are natural writers but, as with any skill, writing requires practice. If you do have an opportunity to write, whether for a blog or a newspaper (it can be a student or university newspaper or any other), then do so. Sometimes newspapers look for science writers as interns. There are also specific courses: Master's degrees or summer courses that will teach you the fundamentals of science writing or give you tips on how to do it.