Blog posts

How to successfully supervise your student’s research project

Postgraduate students in universities across the UK will currently be undertaking their summer research projects. How can academics support their students effectively, ensure they have a good learning experience, and help them complete their research projects successfully?

The first meeting with the student sets the foundation for a successful supervisory relationship. It’s essential for academics to establish clear expectations, foster effective communication, and provide the necessary guidance to support the student during their research project.

1. Introduction & Background: Begin the meeting by introducing yourself and providing an overview of your research expertise and experience. Ask the student to introduce themselves and their background, including their research interests and motivations for pursuing the project.

2. Research Project Overview: Provide a detailed overview of the research project, including its objectives, scope, and any specific research questions that need to be addressed. Ensure that the student understands the broader context of the project and its significance in the field.

3. Project Timeline & Deliverables: Discuss the expected timeline for the project, including key milestones and deadlines. Establish a clear understanding of the deliverables expected at each stage, such as literature review, research proposal, data collection, analysis, and thesis writing.

4. Roles & Responsibilities: Clarify the roles and responsibilities of both the student and yourself as the supervisor. Discuss how you will provide guidance, support, and feedback throughout the project. Establish a regular meeting schedule and preferred communication channels.

5. Research Methods: Discuss the proposed research methods and any specific techniques or tools that will be used. Provide guidance on the selection of appropriate research methods and data collection techniques. Address any concerns or questions the student may have.

6. Resources & Support: Inform the student about the resources available to them, such as research materials, databases, software, and equipment. Discuss any potential collaborations, access to lab facilities or data, and funding opportunities that may be relevant to the project.

7. Ethical Considerations: Discuss the importance of ethical conduct in research and ensure that the student is aware of the ethical guidelines and regulations that apply to their project. If applicable, provide guidance on obtaining necessary ethics approvals or permissions.

8. Literature Review: Emphasize the importance of conducting a thorough literature review to understand the existing knowledge in the field. Provide guidance on how to search for relevant literature, critically evaluate papers, and organise the findings.

9. Expectations for the first stage: Discuss the specific tasks or goals that the student should focus on initially. This may include conducting a literature review, refining the research questions, or drafting a research proposal. Set clear expectations for what should be achieved by the next meeting.

10. Questions & Concerns: Encourage the student to ask any questions or express any concerns they may have. Create an open and supportive environment where they feel comfortable discussing their research project and seeking guidance.

11. The evaluation process: Discuss how the student’s work will be evaluated and how they will be graded. Explain what is needed to achieve a good outcome from the assessment by the dissertation markers.

12. Supportive Environment: Create a positive and supportive environment for the student. Let them know that you are there to help them succeed and that you are interested in their work. Be respectful. Listen to the student’s ideas and be open to their suggestions.

Wastewater Surveillance for Covid-19

Wastewater surveillance is a technique that can be used to detect and track the spread of infectious diseases, including Covid-19. Wastewater is a rich source of genetic material from the people who use facilities in locations such as schools. By testing wastewater for the presence of viruses, public health officials can get an early warning of an outbreak before it becomes widespread.

Our recent study published in the journal PLOS One found that wastewater surveillance can be used to detect Covid-19 with high accuracy. The study, which was conducted in England, collected wastewater samples over a period of six months. We found that wastewater samples from areas with high rates of Covid-19 infection had significantly higher levels of SARS-CoV-2 genetic material than samples from areas with low rates of infection.
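
As a simple illustration of the underlying idea (a minimal sketch with made-up figures, not the method or analysis from our paper), one way to examine the relationship between wastewater signals and community infection is to correlate viral RNA concentrations from sampling sites with local case rates:

```python
# Illustrative sketch only, using hypothetical data - not the analysis
# from the PLOS One study described above.
import numpy as np

# SARS-CoV-2 RNA concentration measured at each sampling site
# (gene copies per litre of wastewater; hypothetical values)
rna_copies_per_litre = np.array([1.2e4, 8.5e3, 3.1e5, 2.4e5, 5.0e3, 4.7e5])

# Weekly confirmed cases per 100,000 population in the areas served
# by the same sites (hypothetical values)
cases_per_100k = np.array([55, 40, 310, 260, 25, 420])

# Work on a log scale because viral RNA concentrations span orders of magnitude
log_rna = np.log10(rna_copies_per_litre)

# Pearson correlation between wastewater RNA levels and local case rates
r = np.corrcoef(log_rna, cases_per_100k)[0, 1]
print(f"Correlation between wastewater RNA levels and case rates: r = {r:.2f}")
```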

We also found that wastewater surveillance can be used to track the spread of new variants of SARS-CoV-2. We were able to identify the Alpha and Delta variants in wastewater samples before these variants were detected in clinical samples.

Wastewater surveillance is a valuable tool for public health officials who are working to prevent the spread of Covid-19. It is a cost-effective and efficient way to identify outbreaks early and take steps to mitigate them. In addition to detecting Covid-19, wastewater surveillance can also be used to detect other infectious diseases, such as influenza and norovirus. This versatility makes it useful for both routine public health surveillance and outbreak response.

Wastewater surveillance will become increasingly important for protecting public health. It is a valuable tool that can be used to identify outbreaks early, track the spread of new variants, and monitor the effectiveness of public health interventions.

Strategies and Interventions to Improve Well-Being and Reduce Burnout in Healthcare Professionals

Our recent article in the Journal of Primary Care & Community Health discusses burnout, a psychological response to chronic workplace stress that is particularly common in healthcare workers and which has been made worse by the impact of the Covid-19 pandemic. Burnout is caused by factors such as increasing workload, inadequate support from employers and colleagues, and a stressful work environment. It has negative effects on both patients and healthcare professionals, including reduced patient satisfaction, an increase in medical errors, and decreased quality of care. Addressing burnout requires a multi-pronged approach involving individual and organisational-level strategies.

Managing people’s workload, providing individual-focused interventions like stress management, and offering professional development opportunities can help reduce burnout. Supportive leadership, peer support, and a healthy work-life balance are also important. Organisational culture and leadership play a crucial role in fostering these kinds of supportive work environments. A culture of openness and support without stigma is also essential, as is providing appropriate support programmes rather than relying solely on individual resilience. Ultimately, preventing burnout, and managing it when it does occur, requires collaborative efforts between healthcare systems and individual healthcare professionals.

Electronic health records: Don’t under-estimate the importance of implementation and staff training

One of the most significant changes I have witnessed during my medical career is the introduction of electronic health records (EHRs). While they have brought many benefits to the NHS, patients and clinicians, they have also posed some challenges.

On the positive side, EHRs have made medical records more legible, accessible and secure. Many doctors and patients will remember the era when a patient’s medical record was often “missing” when they attended for an outpatient appointment. This made the management of the patient more difficult, as the clinician attending the patient did not have all the information they needed; the patient usually had to return at a later date, by which time it was hoped their medical records would have been found.

With EHRs, in contrast, clinicians can access patient records from anywhere at any time, which has made it easier to provide care to patients in different locations. EHRs have also made it easier to conduct medical research, as they allow researchers to access large volumes of data in a more streamlined manner. Quality improvement has also been enhanced, as EHRs make it much easier to measure the quality of healthcare and the impact of any interventions and changes to the provision of health services.

However, EHRs have also forced clinicians to modify how they work, which is not always a positive change. The increased use of technology in healthcare, for example, can sometimes result in decreased interaction between clinicians and patients, as the clinician is often focused on reading the EHR and entering new data. In addition, the use of EHRs can be time-consuming, as clinicians have to enter information into the system, which can increase their workload.

Another potential issue with EHRs is the risk of data breaches, which can compromise patient privacy and confidentiality. Cybersecurity is a major concern for healthcare providers, and it is important that they take appropriate measures to protect patient data. We have seen examples in the NHS of significant data breaches which have disrupted the delivery of health services and compromised sensitive patient information. We have also seen examples of major IT failures (for example, during the heatwave in the summer of 2022).

Despite the challenges associated with EHRs, they are here to stay. It is crucial that healthcare providers adapt to this new way of working, but also that the systems are designed in a way that minimises the burden on clinicians while maximising the benefits to healthcare providers and patients. The ongoing development of EHRs and other technological advancements must always prioritise patient care and safety. This means designing IT systems with adequate input from staff and patients; and ensuring that sufficient time and resources are devoted to areas such as implementation and training.

Why the NHS needs to put the joy back into being a doctor

A complaint I often hear from colleagues is that “the NHS has taken the joy out of medicine”. Modern healthcare delivery is increasingly seen by NHS staff and by patients as an industrial-type activity with strict performance targets. This has resulted in many healthcare professionals feeling that they have lost much of the flexibility and autonomy that was once a defining characteristic of their professions.

This feeling can also concern patients, who may feel that they are not receiving the personalised care and attention they need. The focus on targets, metrics and finances can create an environment where patients feel they are being treated as numbers rather than as individuals with unique needs and circumstances.

It is important for politicians, NHS managers and clinicians to acknowledge these concerns and work to address them. While performance targets, metrics and financial monitoring are important tools for measuring the effectiveness of healthcare delivery, they should not be the only focus of the NHS. Healthcare professionals must be given the freedom and flexibility to exercise their judgement and provide personalised care to their patients.

The NHS should also work to ensure that patients are seen as individuals with unique needs and circumstances, rather than simply as numbers on a spreadsheet. This can be achieved by providing adequate resources (both financial and personnel) for the NHS, better training for healthcare professionals, improved communication with patients, and greater emphasis on patient-centred care.

Ultimately, the goal of the NHS should be to provide high-quality, personalised care to all patients. This requires a shift in mindset away from the purely target-driven approach we often see in today’s NHS towards a more holistic approach that prioritises the needs and well-being of patients and healthcare professionals alike.

Uncertainty in public health and clinical medicine

I joined Twitter 10 years ago in May 2013. One of the lessons I’ve learned from social media is that too many people want “certainty”. But in public health and medicine, there often aren’t certainties: just probabilities of certain outcomes, or unknowns due to a lack of evidence. This can be frustrating for people who are looking for clear answers, but science is a process of discovery, and there is always more to learn, either from new research or from summarising and synthesising evidence from current and past research. By looking at the existing evidence, we can make informed decisions about our health and the health of our communities.

Uncertainty is a critical aspect of scientific inquiry and helps researchers refine their understanding of health-related issues over time. Uncertainty can arise due to factors such as incomplete data, limitations in research, or the complexity of the systems being studied. One way to deal with uncertainty is to be open to new information. As new research is conducted, we may learn more about the risks and benefits of different interventions. It is important to be willing to change our minds in light of new evidence.

Uncertainty doesn’t necessarily mean that nothing can be done to address health issues. Rather, it means that we need to rely on the best available evidence and make informed decisions based on that evidence, while recognising that there may still be unknowns and potential risks. Communicating clearly and transparently about the state of evidence, the limitations of that evidence, and the potential implications for health can help build trust and ensure that people have the information they need to make informed decisions about their health.

Finally, we are all in this together. Public health and medicine are complex areas, and we need to work together to find solutions. By working together and gaining public support, we can have a positive effect on the health of our communities.

The academic publication process: how it works

I am sometimes asked by junior researchers or by the public how the publication process for academic articles works. The academic peer review timeline varies depending on the journal, but it typically takes several months (sometimes even longer) from submission to publication.

1. Submission: You submit your paper to the journal. Make sure your paper is well-written, checked for spelling and grammatical errors, follows the journal’s style and formatting requirements, and that you submit your paper to a journal that is a good fit for your work.

2. Initial screening: An editor at the journal reviews your paper to make sure it is within the scope of the journal & meets the journal’s style and formatting requirements. Some articles are rejected at this stage, without external peer review (particularly by larger journals).

3. Peer review: The editor sends your paper to one or more external experts in your field for review. Reviewers are asked to assess the originality, significance, rigour of your research methods, & the validity of your work. They may suggest revisions to your paper or rejection.

4. Initial decision: The editor reviews the reviewers’ comments and decides whether to accept, reject, or revise your paper. Acceptance without any revisions is unusual; generally, the authors have to respond to the comments from the referees and editor, and revise the paper.

5. Revisions: If your paper is accepted with revisions, you will usually be given a deadline to make the necessary changes. When sending back your revised paper, it is also normal practice to send a letter explaining how you have changed the paper in response to the comments.

6. Your response: Respond promptly to reviewer comments. Make sure your revisions are comprehensive and address all of the reviewers’ concerns and any comments from the editor. Be respectful and cooperative with the editor and reviewers.

7. Final decision: Once your paper has been revised, it may be accepted without further changes; you may be asked to revise it again; or it may be rejected. If accepted, the editor will send you a copy of the proofs for your final approval. This is your last chance to make changes.

8. Publication: Once you have approved the proofs, your paper will be published in the journal. Some journals (such as the BMJ) offer readers the opportunity to comment on a paper. It’s important to respond to these comments, which may sometimes highlight problems with your paper.

9. Responding to comments: When responding to comments, aim to be polite and respectful in your reply. Some comments can be constructive and others can be very critical of your paper. This post-publication review of a paper is an important part of the academic publication process.

10. The total time it takes to go through this process can vary from a few months to a year or more. It is important to be patient and to follow the instructions of the editor and reviewers. By doing so, you can increase the chances of your paper being published in a high-quality journal.

Why cost effectiveness analysis is important in public health

Cost-effectiveness analysis (CEA) is a method used in health economics and healthcare planning to compare the costs and benefits of different healthcare interventions. CEA is particularly important in public health because it helps policymakers and healthcare providers to make informed decisions about which interventions to prioritise and invest in.

Vaccination is a good example of why incremental CEA is important. Vaccination programmes can be expensive, and policymakers need to know if the benefits of vaccination outweigh the costs. Incremental CEA can help answer this question by comparing the costs and health outcomes of vaccination to other interventions, or to doing nothing at all.
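
To make the arithmetic concrete, here is a minimal sketch of how an incremental cost-effectiveness ratio (ICER) might be calculated for a hypothetical vaccination programme compared with doing nothing. All figures are illustrative and not taken from any real appraisal:

```python
# Illustrative incremental cost-effectiveness calculation (hypothetical figures).

# Total costs, including programme delivery and downstream treatment costs (GBP)
cost_vaccination = 25_000_000   # cost with the vaccination programme
cost_do_nothing = 10_000_000    # cost without the programme (treatment only)

# Health outcomes, measured in quality-adjusted life years (QALYs)
qalys_vaccination = 13_500
qalys_do_nothing = 11_000

# Incremental cost-effectiveness ratio: extra cost per extra QALY gained
incremental_cost = cost_vaccination - cost_do_nothing
incremental_qalys = qalys_vaccination - qalys_do_nothing
icer = incremental_cost / incremental_qalys

print(f"ICER: £{icer:,.0f} per additional QALY gained")
# Decision makers would compare this figure against a willingness-to-pay
# threshold (in England, around £20,000-£30,000 per QALY is commonly cited).
```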

There are many factors that can affect the cost-effectiveness of a public health intervention. These include the cost of the intervention, the effectiveness of the intervention, and the value of the health outcomes that are achieved (such as a reduction in hospital admissions). The cost of an intervention can vary depending on a number of factors, such as the resources that are needed to implement the intervention, the number of people who are affected by the intervention, and the cost of any associated treatment or care.

The effectiveness of an intervention can also vary depending on the characteristics of the population that is being targeted. In general, public health programmes are more cost-effective in people with a higher risk of poor health outcomes. This is why older people are often targeted. By using incremental CEA, policymakers can identify which public health programmes provide the most health benefits for the lowest cost. They can also use this information to determine the optimal allocation of resources and funding to achieve the best population health outcomes.

Additionally, by comparing the cost-effectiveness of different public health strategies, they can make more informed decisions about which interventions to prioritise and invest in, helping to maximise the overall impact of limited public health resources. As well as vaccination, we can also use CEA to look at other public health programmes such as screening for cancer, interventions to promote healthy diets and increase physical activity, and programmes to support people to quit smoking.

The covid-19 pandemic three years on

In a recent article in the British Medical Journal, I discuss where we are three years after the start of the Covid-19 pandemic in the UK and what broad comments we can make about the UK’s ongoing response to the pandemic.

The UK is certainly in a better place now than it was in the first year of the pandemic; a period when many people became seriously unwell, resulting in significant pressures on the NHS and a very large number of deaths. One positive step is the creation of the UK Health Security Agency. This begins to address the weaknesses that arose in England’s health protection functions following the abolition of the Health Protection Agency in 2013 and is a recognition that the UK needed a government organisation focused on health protection. However, the devolved nature of the UK means that some responsibilities for health protection lie with the UK government, while others lie with the devolved administrations in Wales, Scotland, and Northern Ireland. This does create scope for a fragmented response to the still-ongoing covid-19 pandemic (and future outbreaks of other infectious diseases), as well as the possibility of political tensions between the UK government and the devolved administrations, as we saw at times during the previous three years.

The UK government now views the worst of the pandemic as being over. The UK was among the first countries to start a vaccination programme against covid-19. Vaccination combined with immunity from prior infection has reduced the severity of illness from covid-19 in the UK with deaths and hospital admissions both now at a much lower level than they were in January 2021. The UK is now highly reliant on vaccination to suppress the impact of covid-19 on our society and its impact on the NHS. Maintaining this protection will probably require regular booster vaccinations for the most vulnerable groups in society, such as the elderly, the immunocompromised, and those with significant long term medical problems. Conversely, routine covid-19 vaccination for people under 50 years of age is likely to stop other than for those who are in a high clinical risk group or who are carers.

Take-up of the first two doses of covid-19 vaccination was very high thanks to the positive attitude to vaccination in the UK population and the rapid mobilisation by the NHS of sites for delivering vaccines. However, the most recent booster campaign carried out in the autumn and winter of 2022-23 saw a much lower uptake of vaccination. Addressing vaccine hesitancy, tackling disinformation, and improving confidence in vaccines will remain key aims for the NHS, health professionals and public health agencies in the UK. The risk of a further wave of serious illness from covid-19 remains, either from declining population immunity or from the emergence of a new variant of SARS-CoV-2 that can bypass pre-existing immunity and cause more serious illness than currently circulating variants. Regular vaccination of the most vulnerable groups will help mitigate these risks, as will covid-19 treatments for the groups at highest risk of serious illness.

One area in which the UK excelled during the pandemic was the use of data to monitor the epidemiology of covid-19 and the effectiveness of vaccines. The UK also set up a range of research studies that informed the pandemic response not just in the UK, but globally as well. However, much of this data collection and analysis infrastructure is now being dismantled. This will make the UK much more reliant on conventional methods of measuring the impact of a disease as opposed to using data from the new systems—such as the coronavirus (COVID-19) Infection Survey—established over the last three years. It is essential that the information systems we have in place continue to provide the data needed to monitor covid-19 trends and rapidly identify any resurgence in covid-19.

The UK has spent considerable sums on managing the pandemic and mitigating its impact on the NHS and the economy. As we move forward into the next phase of the pandemic, interventions to manage covid-19 will need to be evaluated through the usual routes used by the NHS; with slower adoption of interventions than we saw earlier in the pandemic—as shown, for example, by NICE refusing to endorse the use of Evusheld. Future pandemic planning will also need to consider the impact of interventions on children. Much of the focus earlier in the pandemic was on protecting older people. But the pandemic also had important impacts on the physical and mental health of children, as well as on their educational and social development, in the UK and elsewhere. The management of people with post covid-19 syndromes (long covid) also remains challenging, with demand far outstripping the supply of services for diagnosis and management.

The NHS in the UK faces many challenges, and investment in interventions for managing covid-19 will need to be compared with investment in interventions for managing other health priorities—such as urgent care, general practice, mental health and cancer—to ensure that maximum population benefit is obtained. For example, decisions about covid-19 vaccine booster programmes will need to examine the incremental cost-effectiveness of vaccination in different population groups to identify priority groups for vaccination, rather than vaccination being made available to all adults. The era of issuing “blank cheques” for tackling covid-19 is now over, and investment in interventions for covid-19 will need to compete with investment in other public health and healthcare services.

How can we improve the quality of data collected in general practice?

The primary purpose of general practice electronic health records (EHRs) is to help staff deliver patient care. In an article published in the British Medical Journal, Lara Shemtob, Thomas Beaney, John Norton and I discuss the need for general practice staff who enter data into electronic health records to be more connected to those using the information in areas such as healthcare planning, research and quality improvement.

Documentation facilitates continuity of care and allows symptoms to be tracked over time. Most information is entered into the electronic record as unstructured free text, particularly during time-pressed consultations. Although free text provides a mostly adequate record of what has taken place in clinical encounters, it is less useful than structured data for NHS management, quality improvement, and research. Furthermore, free text cannot be used to populate problem lists, calculate risk scores, or feed into clinical management prompts in electronic records, all of which facilitate delivery of appropriate care to patients.
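
To illustrate the difference (a minimal, hypothetical sketch; the record structures and the SNOMED CT code are shown for illustration only), compare a free-text consultation note with the same information captured as structured, coded data:

```python
# Hypothetical example contrasting free text with structured, coded data.

# Free text: readable to a clinician, but hard to query reliably at scale
free_text_entry = "Seen re worsening breathlessness, likely asthma, started inhaler."

# Structured entry: the same encounter captured with a coded diagnosis
structured_entry = {
    "date": "2023-05-10",
    "diagnosis_code": "195967001",  # SNOMED CT concept for asthma (illustrative)
    "diagnosis_term": "Asthma",
    "medication": "Salbutamol 100micrograms/dose inhaler",
}

def has_asthma(record: dict) -> bool:
    """A coded field can feed disease registers, prompts and risk scores directly."""
    return record.get("diagnosis_code") == "195967001"

print(has_asthma(structured_entry))  # True
# The free-text note would need natural language processing, with some error,
# before it could be used for the same purposes.
```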

Creating high-quality structured data that can be used for health service planning, quality improvement, or research requires clinical coding systems that are confusing to many clinicians. For example, coding can seem rigid in ascribing concrete labels to symptoms that may be evolving or diagnostically uncertain. It is time-consuming for staff to process external inputs to the electronic record, such as letters from secondary care, and if this is done by administrators, comprehension of clinical information may be a further barrier to high-quality structured data entry.

The content of digital communications such as text messages from patients to clinicians, emails, and e-consultations may also need to be converted to structured data, even if the communication exists in the electronic health record. This all represents additional work for clinicians, with seemingly little direct benefit for patients. As frontline clinical staff are usually not involved in the secondary uses of data, such as health service development and planning, they may not consider the extra work a priority.

To maximise the potential of routinely collected data, we need to connect those entering the data with those using them, also incorporating patients as key beneficiaries. This requires adopting a learning health systems approach to improving health outcomes, which involves patients and clinicians working with researchers to deliver evidence-based change, and making better use of existing technology to improve standardised data input while delivering care.

Data from primary care played a key role in the UK’s Covid-19 pandemic response, as shown in a slide that uses data from a range of sources – including general practice records – to examine the impact of vaccination on hospital admissions for Covid-19 in England.