Author: Azeem Majeed

I am Professor of Primary Care and Public Health, and Head of the Department of Primary Care & Public Health at Imperial College London. I am also involved in postgraduate education and training in both general practice and public health, and I am the Course Director of the Imperial College Master of Public Health (MPH) programme.

Writing Your Student Essays and Dissertations: Some Tips on How to Do It Well

It’s that time of year when students are starting to enrol in higher education courses at universities and colleges. Every year, when marking essays and dissertations, I encounter numerous errors in students’ writing.

What are these errors, and how can you avoid them to make your dissertation more readable? Here are my top 10 tips for improving your academic writing:

1. Plan Your Outline

Most importantly, spend time planning the outline of your essay or dissertation. For dissertations, this means thinking about chapter headings and subsections for each chapter. Decide on the key tables, figures, and graphs you need to include to complement the main text. These visual elements should add value; they shouldn’t merely repeat what’s already said but should provide a different perspective or clearer illustration of your points.

2. Avoid Complexity

Many students assume that longer words are “more scientific” and thus preferable to shorter ones. For example, they might use “perspiration” instead of “sweat” or “haemorrhage” instead of “bleed.” Imagine if Winston Churchill had written his speeches in this “more scientific” manner.

3. Use Short Sentences

Shorter sentences are easier to read and help to ensure the examiner doesn’t miss the key points you’re trying to make. The same applies to paragraphs—don’t make them too long and look for natural breaks to start a new paragraph.

4. Choose Active Voice

Use active voice rather than passive voice in your text. For example, say, “I reviewed the literature,” rather than, “The literature was reviewed by me.” Active voice is easier to read, more direct, and makes it clear that you carried out the work.

5. Eliminate Superfluous Words

For instance, “based on” is better than “on the basis of,” and “even though” is preferable to “despite the fact that.” Eliminating unnecessary words gives you more room to present your work and helps you stay within the word count.

6. Use Clear Language

Use clear and professional language, and avoid clichés and colloquial expressions. These are seldom used in scientific writing and can be difficult for some examiners, especially non-native English speakers, to understand.

7. Master the Basics Early

When writing your dissertation, it’s not the time to be learning spelling, punctuation, and grammar. Most educational institutions offer writing assistance. Take these courses early in your programme and invest in a good grammar and writing style guide.

8. Practice Scientific Writing

Many journals offer the opportunity to respond online to their articles. Use this opportunity to improve your critical thinking and argumentation skills. Working in a writing group can also be beneficial, as peer feedback can help you refine your work. There are also many guides available to help you improve your writing.

9. Study Good Examples

Read examples of excellent scientific writing to inspire your own work. For instance, consider reading “From Creation to Chaos: Classic Writings in Science” by Bernard Dixon.

10. Proofread Thoroughly

Before final submission of your work to the examiners, thoroughly check your spelling, punctuation, and grammar. You will be surprised how many errors can be easily caught with the spell and grammar check functions in word processing software.

Decoding Risk in Clinical & Public Health Practice: Absolute vs Relative Risk Reduction

What is the difference between Absolute Risk Reduction (ARR) and Relative Risk Reduction (RRR)? This is a common question from students and clinicians. Understanding these concepts is crucial for interpreting research findings, especially in clinical and public health settings.

Absolute Risk Reduction (ARR) refers to the difference in outcomes between a control group and a treated group in a clinical trial or a public health study.

Formula: ARR = CER – EER

Where: CER is the Control Event Rate (rate of event in the control group) and EER is the Experimental Event Rate (rate of event in the experimental group).

Example: Imagine a trial in which 10% of patients in the control group have an adverse event, and only 5% in the treatment group experience the same.

ARR = 10% – 5% = 5%

This means that the drug reduces the absolute risk of an adverse event by 5 percentage points. Since the Number Needed to Treat (NNT) is 1/ARR, 1/0.05 = 20 people need to be treated to prevent one event.

Relative Risk Reduction (RRR) is the proportional reduction in outcomes between the treated and untreated groups. It’s a way to contextualize the effectiveness of a treatment by considering the baseline risk.

Formula: RRR = [(CER – EER) / CER] × 100

Example: Continuing with the same drug trial, RRR = [(10% – 5%) / 10%] × 100 = 50%

Interpretation: The drug reduces the relative risk of an adverse event by 50% compared to the control group.
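The calculations above can be sketched in a few lines of Python; the function name and structure here are illustrative, not from the original article:

```python
def risk_metrics(cer, eer):
    """Compute ARR, RRR (%) and NNT from control and experimental event rates.

    cer and eer are proportions, e.g. 0.10 for a 10% event rate.
    """
    arr = cer - eer                       # Absolute Risk Reduction
    rrr = (cer - eer) / cer * 100         # Relative Risk Reduction, as a percentage
    nnt = 1 / arr if arr > 0 else None    # Number Needed to Treat (undefined if ARR <= 0)
    return arr, rrr, nnt

# Worked example from the text: 10% event rate in controls, 5% in the treated group
arr, rrr, nnt = risk_metrics(0.10, 0.05)
print(f"ARR = {arr:.0%}, RRR = {rrr:.0f}%, NNT = {nnt:.0f}")
```

Running this reproduces the figures in the example: an ARR of 5 percentage points, an RRR of 50%, and an NNT of 20.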

Key Differences between ARR and RRR

  1. Context: ARR gives you the actual change in risk, which is straightforward and easily interpretable. RRR puts this change in the context of the baseline risk, making the treatment appear more effective than it may actually be.
  2. Impact: ARR is more useful for understanding the individual benefit of an intervention, while RRR is often more impressive for public health interventions where a small absolute change can have a large impact when scaled up.
  3. Communication: RRR is often used in marketing or in the media because it tends to produce a larger, more eye-catching number. However, this can be misleading if not presented alongside the ARR, which provides a more direct measure of an intervention’s effect.
  4. Clinical Relevance: Knowing both ARR and RRR can aid in shared decision-making between clinicians and patients. While RRR can show the effectiveness of a treatment, ARR can guide on how much benefit an individual patient can expect.

By understanding both Absolute Risk Reduction and Relative Risk Reduction, clinicians and public health specialists can better interpret the data from clinical, public health, and epidemiological studies, and subsequently make more informed decisions about treatment options and public health interventions.

The Impact of Virtual Consultations in Primary Care

Virtual consultations have increased in healthcare in recent years, especially since the onset of the COVID-19 pandemic. While telehealth offers many benefits for patients, such as convenience and increased accessibility, questions surrounding its impact on the quality of primary care persist. Our recent systematic review “The Impact of Virtual Consultations on the Quality of Primary Care” offers valuable insights into this timely and topical issue in healthcare delivery.

The primary goal of the study was to evaluate how virtual consultations are influencing the quality of primary care. The study was comprehensive, covering various diseases and utilizing six databases for identifying studies. It employed a rigorous screening process to ensure that only pertinent data was included.

Key Findings

The review included 30 studies comprising 5,469,333 participants. The results were quite revealing:

1. Effectiveness: Virtual consultations were as effective as, or in some cases more effective than, traditional face-to-face consultations for managing certain conditions such as mental illness, smoking, and excessive alcohol consumption.

2. Patient-Centeredness: Four studies showed positive impacts on patient-centeredness, although patients felt a decrease in perceived autonomy support when engaging with healthcare providers virtually.

3. Efficiency: Virtual consultations might reduce waiting times, decrease patient costs, and lead to fewer follow-ups in secondary and tertiary healthcare settings.

4. Patient Safety: Unfortunately, data on the impact of virtual consultations on clinical safety was found to be extremely limited.

5. Equity: The evidence is mixed regarding the equitable use of virtual consultations. They seem to be favoured more by younger, female patients, and disparities were observed among other demographic groups depending on contextual factors.

Areas for Further Research

The study identified several gaps in the existing body of evidence. Specifically, there is a need for more robust data regarding patient safety, equity, and patient-centeredness. The researchers stress the importance of utilizing real-world data and clinical trials to ensure that virtual consultations are both effective and inclusive.

Conclusions

While the systematic review brings optimism about the effectiveness and efficiency of virtual consultations, it also flags important areas where more research is needed. A tailored approach, based on more comprehensive data, is crucial for informing future policies in virtual primary care. By focusing on these areas, healthcare providers and policymakers can aim to offer a more balanced, equitable, and safe healthcare delivery system for patients.

Direct access to cancer diagnostics: the promise and perils of bypassing GPs

The Secretary of State for Health and Social Care, Steve Barclay, has confirmed the UK government is considering plans to allow patients in England to bypass their GP and directly access some diagnostic tests for suspected cancer. The clinical and cost-effectiveness of these new diagnostic pathways must be compared with alternative solutions such as investing more in core NHS general practice services. My article in the British Medical Journal discusses some of the key issues and challenges in implementing this radical new policy.

Electronic health records: The importance of implementation and training

A new article in the British Medical Journal from Carol Chan, Ana Neves and myself looks at the importance of implementation and training in the use of electronic health records (EHRs) in healthcare. The introduction of EHRs has been one of the most significant changes in how healthcare is delivered in recent decades. But while EHRs have brought many benefits to the NHS, for patients and clinicians, they have also created substantial challenges that must be addressed.

Addressing the health needs of refugees and asylum seekers

The health risks to refugees and asylum seekers have become very topical with the identification of Legionella on the Bibby Stockholm barge. Refugees and asylum seekers will often come from countries that have high rates of infections such as tuberculosis and hepatitis B and C (among others).

Refugees and asylum seekers will also often not be vaccinated to UK standards. A comprehensive health screen is essential when they enter the UK to identify and treat any infections they might have (as well as other significant medical problems such as diabetes and mental health issues).

It’s also essential to offer any missing vaccines to bring them in line with UK vaccination standards, address any physical and mental health problems they have, and ensure they have access to good NHS primary care services to deal with new and ongoing medical problems.

Legionella is sometimes found in the water systems of larger buildings, particularly those with older systems where water can pool at the temperature at which Legionella can multiply quickly. Suitable action to deal with the water system is needed when Legionella is identified to reduce the risk of Legionnaires’ disease to people using and living in the affected building.

The poorer the quality and maintenance of the water system, the more likely Legionella will be found and the more difficult it will be to control. Older people, those with chronic lung disease or other serious medical problems such as diabetes, and those with weakened immune systems are at highest risk of serious illness from Legionella.

The accommodation for refugees and asylum seekers can be environments where infections spread rapidly, because these sites are often crowded and the people living in them will often congregate together. This poses a threat to both the health of the residents and the wider community because infectious and parasitic diseases such as Covid-19, gastroenteritis and scabies can then spread quickly among the residents. Appropriate surveillance, medical care and public health interventions are crucial to mitigate these risks.

The Increasing Impact of Heatwaves: A Global Health Challenge

The harsh reality of climate change is becoming increasingly apparent, with extreme temperatures emerging as an increasing global phenomenon. One of the most conspicuous manifestations of this climatic shift is the occurrence of heatwaves. These bouts of extreme heat aren’t just uncomfortable; they also pose significant health risks and can increase death rates, particularly amongst the most vulnerable people in societies, such as the elderly, children, and individuals with pre-existing health conditions.

Heatwaves don’t just affect the health of individuals; they also put enormous strain on healthcare systems. In times of extreme temperatures, the influx of patients seeking medical help for heat-related illnesses increases drastically. Often, other factors linked with extreme heat, like water shortages and poor air quality, exacerbate the situation, leading to an even greater health crisis.

The ability to effectively manage these health threats often comes down to the resources and infrastructure a country has in place. Countries with advanced infrastructures are typically better equipped to handle these challenges. They can provide the necessary healthcare, deploy strategies to keep the population cool, and improve the urban infrastructure to mitigate the impact of high temperatures.

However, for lower-income countries, the picture isn’t as bright. In such countries, which regularly experience high temperatures and have less developed infrastructure, the challenge is significantly more daunting. It’s much more difficult for these nations to provide the level of healthcare required during a heatwave or to put strategies in place to protect the population from the extreme heat.

This makes it even more imperative for such regions to establish robust measures to mitigate the health impacts of climate change and extreme heat. The strategies needed are wide-ranging – from improving their healthcare systems and response to heat-related illnesses, to launching comprehensive climate adaptation and mitigation policies. These actions are not just necessary, they are urgent, because when it comes to heatwaves and the health threats they bring, we are all feeling the impact.

The effects of community interventions on unplanned healthcare use in patients with multimorbidity

Multimorbidity, the coexistence of multiple chronic conditions within an individual, is a growing global health challenge affecting a significant portion of the population. Patients with multimorbidity often face complex healthcare needs, leading to increased unplanned healthcare utilization. In an effort to address this issue, community-based interventions have emerged as potential solutions for providing continued care outside of traditional hospital settings. Our systematic review published in the Journal of the Royal Society of Medicine aims to summarize the impact of these interventions on unplanned healthcare use in patients with multimorbidity.

The Burden of Multimorbidity

With the prevalence of multimorbidity increasing, affecting approximately one-third of the global population, it is crucial to find effective strategies to manage this complex condition. The challenges posed by multimorbidity often result in frequent emergency department visits and hospital admissions, placing a significant strain on healthcare resources.

Community-Based Interventions

Community-based interventions offer a promising approach to address the needs of multimorbid patients. These interventions focus on delivering care in community settings, with an emphasis on education, self-monitoring of symptoms, and regular follow-ups. Additionally, some interventions aim to improve care coordination, advance care planning, and provide palliative care for patients with severe conditions. By implementing these interventions, healthcare providers seek to enhance patient self-management, reduce the burden on emergency departments, and improve overall health outcomes.

Findings from the Systematic Review

Thirteen studies, involving a total of 6148 participants, were included in this systematic review. Notably, all the studies were conducted in high-income settings and primarily focused on elderly people. The primary outcome assessed across all studies was emergency department attendance. The risk of bias was generally low across the included studies.

The results revealed that all 13 studies reported a decrease in emergency department visits following the implementation of community-based interventions. The risk reduction ranged from 0 (95% confidence interval [CI]: –0.37 to 0.37) to 0.735 (95% CI: 0.688–0.785). This suggests that these interventions have the potential to effectively reduce unplanned healthcare usage among patients with multimorbidity.

Challenges and Future Directions

Identifying specific successful components of community interventions proved challenging due to the overlaps between different interventions. However, the overall findings strongly support the integration of community-based approaches into existing healthcare structures. Policymakers should recognize the importance of these interventions and work towards their implementation to alleviate the burden on emergency departments and improve patient outcomes.

Future research must explore the impact of community interventions on a broader range of participants. This will allow for a better understanding of the effectiveness of these interventions in diverse populations and settings. By expanding the scope of research, we can gain deeper insights into the potential benefits of community-based interventions for patients with multimorbidity.

Conclusion

Community-based interventions have shown promise in reducing emergency department visits among patients with multimorbidity. These interventions empower patients to manage their conditions, promote education, and improve care coordination. Policymakers and healthcare providers should recognize the value of these interventions and work towards integrating them into existing healthcare structures. By doing so, we can enhance patient care, reduce healthcare costs, and alleviate the burden on emergency departments. As we move forward, further research is needed to explore the broader impact of community interventions and their potential to improve outcomes for patients with multimorbidity in various contexts.

The Future of the Quality and Outcomes Framework (QOF) in England’s NHS

The Quality and Outcomes Framework (QOF) was introduced in 2004 as part of a new NHS GP contract with the aim of financially rewarding general practices for delivering evidence-based standards of care. While initially unique internationally, the QOF in the UK is now facing uncertainty, with calls to cut it back or abolish it due to various challenges faced by the NHS. In an article published in the journal BJGP Open, Mariam Molokhia and I discuss the role of the QOF in England’s NHS and argue for its importance in improving health outcomes and addressing public health challenges.

The Importance of Comprehensive Health Services

Primary care plays a vital role in providing comprehensive health services, covering both acute and long-term conditions. Beyond immediate patient needs, the focus should be on prevention, early diagnosis, and management of chronic diseases that contribute significantly to ill health, reduced quality of life, and increased NHS workload. Amid the COVID-19 pandemic, urgent care rightfully took precedence, but it is now crucial to restore high-quality care for long-term conditions.

The Role of QOF in Addressing Public Health Challenges

Public health challenges have underscored the importance of the QOF, especially in areas focused on secondary prevention and long-term condition management. Meeting QOF targets for conditions like type 2 diabetes leads to lower mortality rates, reduced emergency hospital admissions, and improved health outcomes. By using the QOF effectively, the NHS can alleviate pressures on other healthcare sectors and improve patient well-being.

Data Measurement and Research 

The QOF also facilitates data collection and measurement of healthcare quality, essential for planning health services, addressing health inequalities, and ensuring efficient use of public investments. The structured data entry required for QOF enables its use for clinical research, as shown during the COVID-19 pandemic. Abolishing or significantly cutting back the QOF would have far-reaching negative consequences, undermining these benefits.

Supporting Primary Care Teams and Addressing Challenges

Rather than discarding the QOF, it is crucial to support primary care teams in delivering structured care while addressing urgent patient needs. Adequate funding, including a review of funding allocation mechanisms, is necessary. Additionally, workforce issues should be addressed, promoting staff retention and expanding recruitment into new primary care roles. Integration of pharmacy and general practice services can also enhance primary care capabilities. Leveraging information technology and the wider primary care team can enable the delivery of QOF elements at scale, streamlining care processes and improving the efficiency of QOF.

Retaining Essential Elements of QOF

While criticisms exist regarding the QOF’s reporting domains and its evaluation of important dimensions of care quality, it is essential to retain its best elements. This includes focusing on early detection and management of long-term conditions while improving support through information technology and the wider primary care team. Recent research from Scotland demonstrates that the elimination of financial incentives can lead to reductions in recorded quality of care, emphasizing the importance of maintaining an effective QOF program.

Conclusion

The Quality and Outcomes Framework (QOF) remains an integral part of England’s NHS. Despite challenges faced by the healthcare system, the QOF’s role in improving health outcomes, addressing public health challenges, and promoting comprehensive care cannot be overlooked. By adequately supporting primary care teams, addressing workforce issues, and using technology and the wider primary care team, the QOF can continue to play a crucial role in reducing health inequalities and improving health outcomes in England.

Tools for measuring individual self-care capability

Our ability to engage in self-care practices plays a crucial role in promoting overall well-being and in preventing and managing non-communicable diseases. To support individuals in assessing their self-care capabilities, many measurement tools have been developed. However, a comprehensive review specifically focusing on non-mono-disease specific self-care measurement tools for adults has been lacking. Our scoping review in the journal BMC Public Health aims to identify and characterise such tools, including their content, structure, and psychometric properties.

Shifting Emphasis and Methodology

The review encompassed a thorough search of Embase, PubMed, PsycINFO, and CINAHL databases, covering a wide range of MeSH terms and keywords from January 1950 to November 2022. The inclusion criteria involved tools that assess health literacy, capability, and performance of general health self-care practices, targeting adults. Tools exclusive to disease management or specific medical settings were excluded. A total of 38 relevant tools, described in 42 primary reference studies, were identified from a pool of 26,304 reports.

A key observation from the descriptive analysis was the temporal shift in emphasis among the identified tools. Initially, there was a stronger focus on rehabilitation-oriented tools, while more recent tools have shown a shift towards prevention-oriented approaches. This reflects a growing recognition of the importance of proactive self-care practices to maintain optimal health and prevent the onset or progression of diseases.

Additionally, the method of administering these tools has evolved over time. Traditional observe-and-interview style methods have given way to self-reporting tools, which empower individuals to actively participate in assessing their own self-care capabilities. This shift in methods recognizes the value of self-awareness and self-reflection as integral components of self-care.

Content Assessment and Limitations

To provide a qualitative assessment of each tool, the review utilized the Seven Pillars of Self-Care framework. This framework encompasses seven domains of self-care: health literacy, self-awareness of physical and mental well-being, self-management of health conditions, physical activity, healthy eating, risk avoidance or mitigation, and good hygiene practices. Surprisingly, only five out of the identified tools incorporated questions that covered all seven pillars of self-care. This finding highlights the need for the development of a comprehensive, validated, and easily accessible tool capable of assessing a wide range of self-care practices.

While this review makes significant strides in identifying and characterizing non-mono-disease specific self-care measurement tools, it does have limitations. For example, the search was limited to specific databases and only included English-language studies. Therefore, some relevant tools and studies in other languages may have been overlooked.

Implications and Future Directions

The findings of this review underscore the importance of enhancing our understanding and assessment of self-care capabilities. By incorporating the Seven Pillars of Self-Care, a comprehensive tool can provide a holistic assessment, allowing for targeted health and social care interventions. Such interventions can empower individuals to improve their self-care practices, thereby promoting better health outcomes and reducing the burden of chronic diseases.

Moving forwards, future research should focus on developing a comprehensive, validated tool that encompasses a broader range of self-care practices. Additionally, efforts should be made to ensure the accessibility and usability of such a tool, considering diverse populations and their unique needs. Collaborative efforts between researchers, healthcare professionals, and technology experts can facilitate the creation of an effective and widely applicable self-care measurement tool.

Conclusion

Self-care is a fundamental aspect of promoting health and well-being across diverse populations. While several disease-specific self-care measurement tools exist, this review highlights the need for a comprehensive, validated, and easily accessible tool that assesses a wide range of self-care practices. By embracing the Seven Pillars of Self-Care framework, we can effectively evaluate individual self-care capabilities, inform targeted interventions, and empower individuals to take an active role in their health and well-being. With continued research and collaboration, we can develop tools that facilitate and support the practice of self-care, ultimately leading to improved health outcomes for individuals and communities alike.