Some sporadic insights into academia.
Science is Fascinating.
Scientists are slightly peculiar.
Here are the views of one of them.

Monday 4 March 2024

Keeping track of a slippery customer



We are on the brink of something remarkable. The virus that I have been working on for 21 years (and that others have been studying for considerably longer) is about to have not one, but three vaccines. The virus is Respiratory Syncytial Virus (RSV), which is a cause of severe disease at both ends of life – putting both babies and the elderly into hospital. However, as we saw with COVID, getting a vaccine for a respiratory virus is not the end of the journey, merely the end of the beginning.

One of the major challenges with viruses is their tendency to mutate to escape the protective immune response generated by vaccines. This may be particularly problematic for another of the approaches being rolled out to prevent RSV – an antibody called nirsevimab. Antibodies are proteins made by the immune system that can bind other proteins with incredible specificity. In the case of viruses, antibodies bind the surface proteins that viruses use to enter human cells. Nirsevimab targets an RSV protein called F (for Fusion) and has proved highly effective at reducing infection and severe disease.

The worry is that, because nirsevimab and the vaccines all target this RSV F protein, the virus might mutate to escape being seen by the immune system. It is therefore important to keep a close eye on the virus to see if it is changing. This is where our latest paper, “Robust and sensitive amplicon-based whole-genome sequencing assay of respiratory syncytial virus subtype A and B”, comes in. Working with UKHSA (the UK Health Security Agency), the government agency tasked with protecting the nation’s health, we developed a new and improved method of sequencing RSV. This is faster, cheaper and more reliable than the old method. The team sequenced over 1,000 different RSV isolates covering the four-year period 2019-23. Importantly, the approach is now in place ready for the rollout of the vaccines and antibodies, giving health authorities a fighting chance of keeping on top of this important cause of childhood disease.
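
To make the genomic surveillance idea concrete, here is a minimal sketch of the kind of check a monitoring pipeline runs on each new genome: comparing a protein sequence from a new isolate against a reference and flagging any amino-acid changes. This is my own illustration, not code from the paper, and the sequences are placeholders rather than real RSV F data.

```python
# Toy mutation scan: compare a new F-protein sequence against a reference
# and report amino-acid substitutions in the conventional notation
# (reference residue, position, new residue). Sequences are invented.
REFERENCE = "MELLILKANAITTILTAVTFCFASG"
NEW_ISOLATE = "MELLILKTNAITTILAAVTFCFASG"

def substitutions(ref: str, new: str) -> list[str]:
    # Assumes the two sequences are already aligned and of equal length.
    return [
        f"{r}{pos}{n}"
        for pos, (r, n) in enumerate(zip(ref, new), start=1)
        if r != n
    ]

print(substitutions(REFERENCE, NEW_ISOLATE))  # ['A8T', 'T16A']
```

In a real pipeline, the interesting question is whether any flagged position falls within the binding site of nirsevimab or of the vaccine-induced antibodies.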

Tuesday 27 February 2024

Alliterative vaccines for influenza: DNA NA


Despite SARS-CoV-2 taking the limelight for the last four years, influenza virus continues to be a significant threat to human health and wellbeing. The threats it poses fall into three categories: seasonal, pandemic and zoonotic.

1.       Seasonal influenza. Lockdown-type interventions (sometimes called NPIs, or non-pharmaceutical interventions) were highly effective at reducing the spread of not just SARS-CoV-2 (the virus that caused COVID) but other respiratory viruses too. The number of influenza virus infections went right down during 2020 and 2021 – so much so that one of the strains of influenza appears to have disappeared. However, you can’t keep a good (bad) virus down, and by the following winter (2022) flu had bounced back up to its usual levels, causing illness, hospitalisation and death. Influenza virus is at its most serious at different times of the year depending upon where you live, but flu loves the winter – in the southern hemisphere it peaks around June/July; in the northern hemisphere, December is peak flu. These viruses change slowly over time, which is why there is a need for annual boosters.

2.       Pandemic influenza. There is a grinding low level of influenza disease year on year, which causes a catalogue of low-to-middle-grade misery. However, every so often (about once every 20 years) a completely new strain of influenza virus emerges, infecting everyone. Not dissimilar to COVID, a flu pandemic would cause immense disruption and death.

3.       Zoonotic influenza. We think of influenza as something that people get, but really it is an animal disease, particularly of birds. The natural hosts of influenza are ducks; they spread it to chickens, which spread it to pigs and people, and the cycle continues. At the moment there is an unpleasant bird variant of influenza that has even made it to the icy shores of Antarctica, causing distress to penguins.

Vaccination takes the edge off some of these problems, and it is definitely worth getting vaccinated to protect yourself against the most severe disease. But the vaccines could be improved – giving you broader protection for longer. The huge breakthroughs with the RNA vaccines for COVID showed that other, newer platforms enable rapid responses to viral infections.

In our study, “Optimizing a linear ‘Doggybone’ DNA vaccine for influenza virus through the incorporation of DNA targeting sequences and neuraminidase antigen”, we worked with a company called Touchlight Genetics, who have a process for making DNA without all the bother of cells. Their Doggybone DNA (so called because of its shape) can be produced rapidly and in large amounts. This makes it a strong contender for future vaccine programmes, especially against pandemic viruses. However, the Doggybone DNA vaccine platform needs a bit more work to be effective as a human vaccine, which is where our joint project came in.

We looked at two aspects to improve responses. The first was quite technical and involved tweaking the DNA sequence to allow more of it to get to the place it was needed (the nucleus). The second was targeting a different part of the virus. Influenza makes two proteins on its surface: one that it uses to get into cells (haemagglutinin, or HA) and another that it uses to get out of them (neuraminidase, or NA). One thing to remember about the HA protein is that, in spite of publishing papers about it for the last 15 years, I still cannot spell it – putting in too many or too few G’s, T’s or N’s. Most influenza vaccine research targets the HA protein, the idea being that if you stop the virus from ever getting into cells, you stop it in its tracks. However, targeting the NA protein has some advantages – it changes less than HA, so you can possibly increase the breadth of responses. This is helpful because the broader the anti-influenza response, the more protection you have when the virus changes its coat. We explored using the Doggybone DNA to make a vaccine that targeted influenza NA and showed that it could indeed protect against infection and disease.

Overall this work demonstrated that it is possible to further improve a DNA vaccine. By taking this marginal-gains approach, it may be possible to develop influenza vaccines that cover all strains for all people.

Monday 8 January 2024

Don’t worry – be happy

In which I got to say Shit in Times Higher Education (first published there in 2023)

The editor has allowed me 800 words to give you the secret to academic happiness, but I can sum it up in eight: stop giving a shit about every little thing. To be honest, it doesn’t even need the “about every little thing”. But I should probably expand a bit, not least because I get paid by the word.

In case my head of department is reading this and I sound overly nihilistic, I need to provide some clarification. I am not saying “don’t try” and I am not saying you don’t have to work hard – whether we like it or not, academia isn’t a nine-to-five job. What I really mean is stop stressing about the things you cannot control – which, to be honest, is most things. I also mean loosen your attachment to the standard metrics of academic success – “high impact” papers, measures of esteem, fellowships of exclusive organisations. Most of these things have little or no relevance outside the ivory tower – as a fun way to test this, explain to a non-academic friend how you paid £8,490 for the privilege of someone else posting your research data online.

A more grown-up way to put it is to have some perspective, but that way I don’t get to say shit in Times Higher Education.

An important point in the N.G.A.S. philosophy is that it applies predominantly to the higher levels of Maslow’s hierarchy of needs – those of self-esteem and self-actualisation. No amount of not giving a shit is going to help if you are underpaid, overworked and worrying where your next contract is coming from. If you are in this position, you have my utmost sympathy. But if you have survived that stage and are still feeling unfulfilled and miserable then read on.

Much of the current system equates academic happiness with academic success. But this can lead to chasing endpoints for the sake of accolades rather than enjoying the thing itself. The goal should be a well-written paper that, through the effort of yourself and your team, pieces together a story addressing a research question that was important to you. The goal should not be getting it past a specific editor who has a particular target audience in mind. One of the healthier developments in recent years has been the uptake of the Declaration on Research Assessment (DORA) and the move to recognise papers on their own merit, not just for where they are published.

Likewise with funding: write the best grant you possibly can and enjoy the process of thinking up new ideas, but accept that it may not be what the funders are looking for at that time and that you might need to repackage it for somewhere else.

And there are so many things that matter more than papers and grants. Strip away the stuff that is valued collectively by “the system” and focus on the stuff that matters to you. Be that teaching an enjoyable course with engaged students; widening participation in your field; answering a research question or finding the perfect bon mot for your writing. Academia sans merde gives you amazing opportunities to set your own path.

A corollary is to do things outside the academy that give you joy. If all you have in your life is your work, it is much easier for it to overwhelm you when things don’t go according to plan.

Not giving a shit doesn’t mean not caring about others. Another advantage of stepping away from externally defined success is that it is likely to reduce bad behaviour. The zero-sum model of perceived excellence, where only a select few PIs can win and everyone else loses, promotes toxicity. Focussing on actual excellence can be done with others, and everyone can win.

Hopefully, I have persuaded you of the case for caring less. But doing it isn’t as easy as it sounds. Letting go takes commitment. You need to work at it, especially in the face of the little siren voices that say you need more success to be happy.

A particular temptation is comparison. Never give in to this one. If there is one sure-fire way to be miserable in academia, it is to compare yourself to other academics and their externally broadcast achievements. Bear in mind that those broadcasts are about achievements framed in defined (and quite artificial) terms, not about happiness or fulfilment. You can do better by focussing on what matters to you, and the consequences will be of far greater value.

AI and the science writer.

Originally published in Nature: https://www.nature.com/articles/d41586-023-00528-w 

And... the first to openly use AI to write a Nature article 

ChatGPT may not yet give us sparkling prose, and it delivered only “generic fluff” when John Tregoning asked it to write grant-application text. But it can free scientists up to focus on more stimulating writing tasks, he says.

Many of us have already been trying ChatGPT. If you’ve checked science social media recently, it’s likely you’ve already seen many of its writings. In common with many other researchers, I worry about AI replacing me. I’m a vaccine researcher and spend much of my time writing grants, papers and articles about science careers, so I set it the task of writing an opinion piece about the use of AI in grant writing.

In less than a minute, ChatGPT produced the following:

In my opinion, ChatGPT has the potential to revolutionize the process of writing scientific grants. Traditionally, writing a scientific grant has been a time-consuming and often frustrating process. Researchers spend countless hours crafting proposals, only to have them rejected by funding agencies. This can be demoralizing, and it can also be a barrier to progress in scientific research. ChatGPT has the potential to change all of this. By using natural language processing and machine learning, ChatGPT can help researchers write more compelling and effective grant proposals. It can also help reviewers assess grant proposals more efficiently, allowing for a more efficient and fair grant review process. Of course, ChatGPT is not a magic solution to all of the challenges facing scientific research. But it has the potential to make a real difference, and it is worth exploring as a tool for improving the grant writing and review process.

Which, as the journal editor says, is pretty good for an AI, but not good enough to consider publishing as a standalone piece of writing. It is, however, a massive step forwards. The simplicity and utility of platforms such as ChatGPT mean that we’ll see them quickly drifting into everyday use; Microsoft is already working closely with OpenAI, the company that developed ChatGPT – and you may already be using an AI platform to help you with some writing tasks. At the time of writing (February 2023), ChatGPT was freely available, but things are moving extremely fast, so it is hard to know where things are heading.

The text here isn’t my first use of AI for science writing. My first real use of AI chatbots (beyond asking one to write lyrics to a song called ‘Eggy Eggy Woof Woof’ for my daughter) was when I got fed up with writing one part of a grant. I was asked to explain the world-changing ‘impact’ my science would have, if I was lucky enough to receive funding.

So I asked ChatGPT: ‘What impact could vaccine research have?’ and got 250 words of generic fluff. It suggested: reducing the burden of disease, saving lives, improving global health and supporting economic development. None of what it said was in any way original or enormously surprising, but it was an excellent starting point that I could then flesh out with specifics.

So far I’ve found AI to be enormously helpful at generating a lot of the low-level filler documents that otherwise take up time. For example, it can write easy, generic, on-the-one-hand-this-but-on-the-other-that statements about internet usage policy or data management. However, as it’s still early days, much more thought needs to go into what using AI means with regard to plagiarism and the attribution of credit – one issue is that the source documents used to train it are owned by someone.

One example where I have found it extremely useful is writing multiple-choice questions. I found it relatively easy to write the question and the right answer, but coming up with three plausibly wrong answers is tricky. I found that if I prompted ChatGPT with the following: ‘write a multi choice answer about ‘topic of interest’ with 4 answers, and not using all of the above as an option’, it came up with good wrong answers. This was incredibly helpful. And as someone suggested on Twitter – if students are going to use ChatGPT to help them write answers, we might as well use it to set the questions!
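
For anyone who would rather script this than paste prompts into the chat window, here is a minimal sketch, assuming the OpenAI Python client (openai>=1.0) and an API key in the environment; the model name and the draft_mcq helper are my own illustrative choices, not what I actually used (I used the web interface):

```python
from openai import OpenAI  # assumes `pip install openai` and OPENAI_API_KEY set

client = OpenAI()

def draft_mcq(topic: str) -> str:
    # Mirrors the prompt described above; the model name is a placeholder.
    prompt = (
        f"Write a multiple choice question about {topic} with 4 answers, "
        "indicate the correct answer, and do not use 'all of the above' as an option."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(draft_mcq("influenza vaccines"))
```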

As another example, I entered the prompt ‘Write a diversity statement for a grant application’. The output starts:

Diversity is a core value at our organization and we are committed to promoting and maintaining an inclusive and equitable environment for all individuals.

And continues for 169 words that are generic to the point of meaninglessness. Anyone could write this, and it requires no evidence or backing. Instead, it would be better for the form to ask questions that an AI can’t answer, like: “give examples of how your organisation is promoting diversity and evidence of the impact these activities have had.” This could be applied to a whole range of different questions that we are forced to answer.

This made me reflect: if there is a section in a grant application that can be written by an AI, does that section really serve any purpose? If a computer can churn out something deeply generic that still answers the question (more or less), why are we asking people to address the question at all? I think the answer is clear: these sections never really did serve a purpose, and they certainly don’t now. For science and the process of grant writing to be improved, two things need to happen: one, the needless sections (those that can’t pass a writer’s Turing test – in other words, that can’t be recognised as being written by a human) are removed; and two, the sections that remain are changed in scope to be shorter and more action-centred.

For now though, while we are forced to fill in unnecessary boxes on forms, AI offers a way to free up headspace, which should be a good thing. In an article last month about the pace of scientific disruption slowing down (https://www.nature.com/articles/s41586-022-05543-x), one of the suggestions was that academics needed ‘the gift of time’. AI could well give us this gift.

The question then is: how do we use the time given? One comparator is the automatic washing machine as it became universal in the 1970s – it freed up time, which was then filled with other household tasks. The sociologist Joann Vanek argued in 1974 that, in spite of new household devices, there had been no change in the time devoted to housework over the preceding half-century (https://www.jstor.org/stable/24950221). Her argument has been debated since then, but the key question remains: what impact do time-saving devices have? Are we going to fill the time saved by AI with other low-value tasks, or will it free us to be more disruptive in our thinking and doing?

I have some unrealistically high hopes of what AI can deliver. I want low-engagement tasks to take up less of my working day, allowing me to do more of what I need to do to thrive (thinking, writing, discussing science with colleagues). And then, because I don’t have a Sisyphean to-do list, I can go home earlier, having got more of the thinking, writing and discussing done in working hours rather than fitting them around the edges.

We are unlikely to arrive at these sunlit uplands without some disruption. Just as domestic appliances significantly shrank the need for domestic staff, AI is going to change the labour market. For some tasks, AI will replace people. The aim of the game is not to do a job that can be replaced by an AI. To which end, hopefully, I have persuaded you that whilst AI can write, it isn’t immediately going to replace me. I’m not the only one to think this – the songwriter Nick Cave put it much more eloquently here: https://www.theredhandfiles.com/chat-gpt-what-do-you-think/. Even if you’re not convinced that AI won’t make writers obsolete, one piece of good news, in terms of it not immediately replacing me, is that AI isn’t very good at telling jokes – I will leave you with its best effort:

Why was the math book sad?

Because it had too many problems.

Can we save lives by deliberately infecting people?

In early March 2021, in the middle of the COVID-19 pandemic, a surprising-sounding experiment was taking place. Researchers at Imperial College London and Oxford University, in partnership with hVIVO, were deliberately infecting healthy volunteers with SARS-CoV-2. This was in fact the latest in a long line of controlled human infection studies – studies in which volunteers are deliberately infected with a pathogen under tightly controlled conditions.

Deliberate human infection for health benefit goes back a long way – the earliest evidence comes from 10th-century China, where healthy people were deliberately inoculated with smallpox material to make them immune to the disease. This practice continued into the 18th century, when an English doctor, Thomas Dimsdale, deliberately infected Catherine the Great and her son with a very low dose of smallpox virus to protect them against disease.

This idea of infecting people deliberately to protect them from disease led to Edward Jenner’s famous studies inventing the first ever vaccine. Jenner hypothesised that you didn’t need to use material derived from smallpox to be protected; you could use material from a related virus, cowpox. He proved this worked using a human challenge study: he vaccinated James Phipps (his gardener’s son) with cowpox, then deliberately exposed him to smallpox repeatedly, showing that the vaccine worked and that Phipps was immune.

The practice of deliberate infection for scientific benefit really took off after the demonstration by Pasteur, Koch and others that microbes cause disease. In the early 1900s, Walter Reed, the American public health pioneer, was trying to understand where yellow fever came from – he had a suspicion that it came from mosquitoes. This was important because identifying the source could alter behaviour and reduce the incidence of the disease. To test his hypothesis, Reed recruited 11 volunteers to be bitten by mosquitoes that had previously bitten a yellow fever patient; two of the volunteers contracted yellow fever, strongly supporting his idea. One important development in Reed’s infection studies was informed consent: the volunteers were told about the risk to themselves of participation. Sadly, later in the 20th century, some human infection studies entered a darker chapter in which consent was not sought, such as experimentation on prisoners in Nazi Germany and Imperial Japan.

Informed consent is the bedrock upon which all modern research involving volunteers is built, and infection studies are not exempt from that. The landscape of human infection studies has changed dramatically since the middle of the 20th century; now, ensuring the health and safety of participants is of paramount importance, and trials are carefully designed to minimise any potential risks. Studies are only performed following extensive ethical review by an external body; for example, all human infection studies carried out at Imperial College London have ethical approval from the UK Health Research Authority. There is ongoing debate about whether infection studies can ever be ethical, given that they deliberately expose someone to the risk of harm, even when that risk is minimised. However, the studies bring many benefits, and when volunteers understand the risk and choose to participate for the greater good, they can achieve important things.

One of the ways in which deliberate human infection studies are most beneficial is in the testing of vaccines. Vaccines are tested in the same way as any drug: the first studies involve a small number of participants who are closely monitored to check, first and foremost, whether the vaccines are safe. These early studies (called phase I clinical trials) can also tell us whether the vaccine is making an immune response. However, in order to demonstrate that the vaccine can prevent disease, much larger studies are needed. These phase III studies are often very large: the Pfizer COVID trial had 43,548 participants and the Moderna trial 30,420. One of the reasons for these large numbers is the uncertain nature of infection. Even during a pandemic, most people will not be exposed to the infectious agent (particularly if other measures, such as stay-at-home orders and social distancing, are in place). This means that to get statistically meaningful numbers to compare infection rates with and without the vaccine, you need more subjects.

Infectious challenge studies can get around this, especially when the pathogen being tested is rare. One example is typhoid, a bacterial infection that causes diarrhoea in approximately 10-20 million people a year, mostly in low- and middle-income countries. A research team in Oxford gave volunteers a typhoid infection and tracked them until they had clinical symptoms before treating them with antibiotics. Again, pause to think of the volunteers – knowing that you are likely to get a bout of diarrhoea and going ahead for other people’s benefit takes a special mindset. Indeed, without volunteers, modern medicine would falter, so we all owe a large debt of thanks to these selfless individuals. Having shown it was possible to infect people in a controlled way, the group tested whether two new vaccines could reduce disease. They showed that whilst 77% of the volunteers without a vaccine developed typhoid, only 35% of the vaccinated volunteers did. Deliberate infection studies have also been used to support the rollout of vaccines for cholera, malaria and shigella.
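
To put those attack rates in more familiar terms, here is a back-of-envelope sketch of how vaccine efficacy is conventionally estimated from them – my own illustration, not a calculation from the Oxford study itself:

```python
# Vaccine efficacy = 1 - (attack rate in vaccinated / attack rate in unvaccinated),
# using the typhoid challenge figures quoted above.
attack_rate_unvaccinated = 0.77  # 77% of unvaccinated volunteers developed typhoid
attack_rate_vaccinated = 0.35    # 35% of vaccinated volunteers did

efficacy = 1 - attack_rate_vaccinated / attack_rate_unvaccinated
print(f"Estimated vaccine efficacy: {efficacy:.0%}")  # roughly 55%
```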

Another important benefit of deliberate human infection studies is in understanding how specific viruses cause disease and how we can be protected against them. The Common Cold Unit was a British research centre operating on Salisbury Plain between 1946 and 1989. It set out to understand respiratory infections; being somewhat isolated, it was able to look at the transmission of colds by infecting one volunteer and then housing them together with other, uninfected volunteers. It also provided us with important information about the levels of immunity required to protect against influenza. By measuring antibodies in the blood of people before they were infected, it was possible to identify a threshold above which infection was unlikely to occur; this threshold is still in use for the development of influenza vaccines.
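
As a toy illustration of how such a threshold gets used in practice: the cut-off from this body of work is usually quoted as a haemagglutination-inhibition (HAI) titre of 1:40, but treat both the number and the sketch below as my own illustration rather than the Common Cold Unit’s actual analysis.

```python
# Classify volunteers by pre-infection antibody titre against an assumed
# protective threshold (HAI 1:40, expressed as the reciprocal dilution).
PROTECTIVE_TITRE = 40

def likely_protected(hai_titre: int) -> bool:
    return hai_titre >= PROTECTIVE_TITRE

for titre in (10, 40, 160):
    status = "likely protected" if likely_protected(titre) else "susceptible"
    print(f"titre 1:{titre}: {status}")
```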

Some diseases pose more challenges than others when setting up the infection. Whilst respiratory viruses can be grown and dripped into the nose, other infections get into our bodies through a third organism, called a vector. In the case of schistosomiasis (sometimes called bilharzia), the parasites live in snails before infecting people. To help develop drugs and vaccines for this neglected tropical disease, researchers have had to learn snail husbandry!

Returning to the coronavirus infection study: this looks to address both the development of vaccines and our understanding of infection. The earliest results from the study showed that volunteers who had not had COVID before could be infected with an extremely low dose of the virus, which might help to explain why SARS-CoV-2 is so infectious. The study can also inform us more generally about the behaviour of respiratory viruses. These studies are now progressing to help in the design and testing of the next generation of vaccines and drugs. As we have seen in the last two years, infections can be extraordinarily disruptive; studying how they behave, why we get infected and how to prevent this is extremely important – when performed safely and ethically, human infection studies are an important part of our toolkit.

Balance is needed when discussing academic careers

Originally published in Nature: https://www.nature.com/articles/d41586-022-03216-3

Is the stream of negativity around academia putting people off?

It’s no secret that an academic career has many challenges – short-term contracts, low pay, long hours – as well as the uncertain, exploratory nature of science. And they don’t go away with tenure: academic time is increasingly constrained, funding is shrinking, more is expected from less, and management are more remote. As a principal investigator I am constantly juggling, hopping from one uncertainty to the next: yes, I have a job, but I still need to find money to pursue my research and develop the careers of my team.

Like most academics, I have often considered quitting because of these challenges. The three times I came closest were: six months into my doctorate, when absolutely nothing was working; writing up my PhD thesis, which drove me to despair; and the first big grant rejection early in my time as a PI. I still wonder about other career paths. I’ve worked in academia my whole adult life (+/- a period in the army reserves). The little voice suggesting something better definitely gets louder when I am weighed down with admin or stressed about how to keep the lab going.

The structural challenges in academia are not going away any time soon. In the UK, where I work, an increased workload, reduced pensions and the destabilisation of long-term positions are making academia increasingly unattractive as a career. Recent survey data paints a picture of a substantial number of mid-career scientists who are extremely dissatisfied with their career opportunities (https://www.nature.com/articles/d41586-022-01512-6). Data from Advance HE, a British organisation that champions improvements in higher education, suggests a slight decline in postdoc numbers between 2019 and 2021 (https://www.nature.com/articles/d41586-022-02781-x). The net impression is that early-career scientists are being deterred from academic careers.

This drift from basic science begins before students even go to university. There is a lack of understanding in schools as to what scientific careers involve. I only did a science degree because it was what I was best at; I did a PhD because I didn’t really know what to do with my life after my degree.

To improve understanding, increasing numbers of academics are doing outreach programmes in schools to describe the career and the paths into science. But in my experience, enthusiastic high-school biologists are more interested in careers in medicine than in biological research. I can see their reasoning. A career in medicine, as a doctor or a surgeon, has a structure and a job plan that is likely more attractive than ‘scientist’, which, let’s face it, is pretty nebulous even to those of us doing it as a career. Of course, ‘scientist’ is not the same as ‘academic’: academia is the alternative career, with most science postgraduates employed outside universities.

That all said, focussing solely on the negative is, I think, problematic, especially when people are discouraged from trying an academic career at all. We need to celebrate the good parts, and by that I mean not just successes in terms of papers or grants, but the places where academia brings us joy – an experiment that surprisingly worked, a colleague who helped you, a student who got you to look at a problem in a different light, a trainee who flourished.

And there are many good bits to academia. For me, it’s the science, the freedom and the people. And by freedom, I don’t just mean the freedom to research what you want, but also the freedom to choose how you spend your time, be that teaching, researching or writing a book.

These good bits come at a cost, but in the end, nothing of value ever came easy. Academia is hard, there are no two ways about it, but so is working in a biotech, or a charity, or a school, or a hospital, or a publishing house. Jobs outside the academy come with their own list of challenges. These might be softened by increased pay, but the pound of flesh expected in return can be more substantial.

It comes down to making choices. And to make those choices, you need the best, most accurate information. For this I would recommend applying the same scientific method you might apply to the day-to-day workings of your career.

  1. Collect data. You don’t have to love all of the job; I think we all need to make a judgement call: are we happier more days than not, and do the rewards offset the costs? Try doing this systematically – dedicate a period of time to reflecting on what you do and do not like. Or, more simply, just write down a score at the end of each day.
  2. Expand the sample size. Who do you listen to? Social media is notoriously self-repeating, so it may be that you are missing other voices and other opinions: burst your bubble. Talk to others within your department at seminars, or outside it at meetings. Ask them about both the good and bad aspects.
  3. Experiment. There are several schemes that can support a shorter (or longer) placement with another organisation. These may give you a chance to see if the grass really is greener. For more dyed-in-the-wool academics, sabbaticals can perform the same role.

For those of us who are more established, enabling others to make those choices means presenting both sides of the argument. After schoolteachers, academics are the most visible scientists to students and trainees, and therefore have an influential voice. That voice needs both to bemoan the hard parts and to celebrate the good. My constant moaning about paper portals demanding fax numbers and about over-running committees, without telling of the things that bring me joy, paints only one side of the picture. As with most things, academia is a mixed bag, so let’s celebrate the positives in equal measure to bemoaning the negatives.

How I wrote a Pop Science book

Originally published in Nature: https://www.nature.com/articles/d41586-021-02835-6


I have always loved the writing part of my scientific career but had never seen a path to develop it. Beyond the output of research papers in my field of immunology, there was never time to think about a larger piece of work. The cycle of grants and papers didn’t leave much room for anything else. Little did I know, at the start of last year, that all this was going to change: I was about to find myself writing a book about a pandemic, during the pandemic, whilst also working as a medical researcher investigating the pandemic.

I’d not really intended to write a book on infectious diseases, but the closure of my lab in March 2020 left me in need of something other than home-schooling and worrying about my lack of new experimental data. Luckily, I had a plan B. In November 2019 I had been in touch with a literary agent, Caroline Hardman of Hardman Swainson (https://www.hardmanswainson.com/), and tried to pitch her a book about my life in science – after all, I’d already written for online periodicals and started a blog (http://drtregoning.blogspot.com/) that seemed generally well received. Not entirely surprisingly, she said ‘no’, as a book about my career would be of interest solely to me and maybe some members of my family. Caroline had the far better idea of a book about the science behind infectious disease. Then, six months later, fate intervened in the form of the coronavirus, giving me both time to write, by shutting my lab, and additional motivation on the subject matter.

Getting an agent came down to a number of factors, many of which are the same as those required for a scientific career. Persistence: I had been trying to pitch my careers book for 12 months before I spoke to Caroline. Networking: Caroline also represents Professor Dan Davis (University of Manchester), who has written three books and served as an informal writing mentor to me. Luck: the pandemic meant the topic I was best placed to write about (infections) was of wide interest.

As my agent, Caroline helped me develop the overall pitch for the book, getting it into a shape that would be looked at by editors at publishing houses. Pitching a non-fiction book is not dissimilar to a grant submission: you write an abstract of the whole work and a brief outline of the plan, and include some preliminary data (in the form of previous written work and a sample chapter), a short CV, a comparison to the rest of the field and something resembling an impact statement (a summary of who might actually buy this book and why). Caroline then shopped the proposal around to various publishers. This part of the process was definitely familiar from trying (and failing) to get manuscripts into journals; the best part was I didn’t have to do it myself.

The pitch was accepted by Sam Carter, Editorial Director at Oneworld (https://oneworld-publications.com/), in May 2020, with a due date for my first draft of December the same year. Oneworld are an independent publisher with a focus on non-fiction and have also published two Booker prize winners. On their science list, I am in extremely good company: both the Nobel prizewinner Barry Marshall and the former president of the Royal Society Venki Ramakrishnan have published books with them.

But a pitch is not the same as a completed book. Up to May 2020 the process hadn’t been especially demanding of my time, requiring the odd hour here and there, which didn’t really prepare me for what came next. Oneworld gave me a target of 90,000 words, which, even with my experience in academia of working on theses and lengthy papers, seemed a lot. I did some quick calculations: I needed to write around 3,000 words a week, every week, between May and December.

Thus commenced a mad dash against the clock. In an attempt to motivate myself, I used Excel to record words written vs words required. Some weeks I’d be ahead of the curve, many weeks behind it. My mood on a Sunday evening was directly linked to how many words I’d got down on paper the week before. From May to December, Saturday and Sunday mornings would find me in front of the laptop. One of the unexpected side effects was that I found myself unable to read any other books; I just had too many words in my head already. But curiously, having that structure and a forced deadline was in some ways beneficial; there was no time to get bored.
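
For the curious, the tracking logic amounted to something like the sketch below – my own reconstruction for illustration (the real thing was an Excel sheet, and the week-10 figures here are invented):

```python
# Toy version of the words-written-vs-words-required tracker:
# 90,000 words between May and December is ~30 weeks at 3,000 words a week.
TARGET_TOTAL = 90_000
WEEKS = 30
WEEKLY_TARGET = TARGET_TOTAL / WEEKS  # 3,000 words a week

def progress(words_written: int, weeks_elapsed: int) -> str:
    required = WEEKLY_TARGET * weeks_elapsed
    mood = "ahead of the curve" if words_written >= required else "behind it"
    return f"Week {weeks_elapsed}: {words_written:,} written vs {required:,.0f} required ({mood})"

print(progress(25_000, 10))  # Week 10: 25,000 written vs 30,000 required (behind it)
```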

I owe a great debt to my wife, who, on top of her own job, took on an extra burden of childcare on those weekend mornings. Finally, late one night in December, it was done – or at least the first draft was. There were still multiple edits, re-edits, comma splices to unsplice, and proofs and print drafts to be read over the next nine months.

The book itself is an overview of all aspects of infectious disease. It starts with the underpinning ‘ologies – epidemiology, microbiology, immunology – that help us understand infectious diseases, and then moves on to the drugs and vaccines we use to prevent and cure infections. It draws a lot on the subject matter I have been teaching for the last 10 years, and I have filled it with as many facts, anecdotes and curiosities as I could find. With the help of my teenaged son as proof-reader, I have tried to aim it at an audience aged 13 years and upwards.

So, what have I learnt?

  1. It takes a lot of time. If you want to write a book, be sure you have the time, the space and the support to do it. It took me the best part of 12 hours a week for 30 weeks, the equivalent of 12 working weeks. This was on top of the day job and the unbridled joy of home-schooling. Some might say this was quite quick, but it certainly required a lot of extra effort.

  2. Build a portfolio. None of this happened from a standing start; building a portfolio of writing was vital. Evidence that I could string together sentences that others were prepared to read was critical in getting the process moving. It doesn’t all have to happen at once – I started writing when I was 19. If it is a path you are interested in, set up your own blog and speak to editors about writing for them. It is equally important to develop a voice, which comes with time and experience.

  3. Writing a book is really hard. There is a persistent myth that everyone has one book in them – in the same way that nearly everyone has an appendix, and getting either out is not a simple matter. The process of writing took a lot of time and drained a lot of creative energy from other activities.

  4. Learn the craft. Whilst reading widely is important, there are also technical aspects to writing. My shelves are weighed down with books about writing, from the creative side of the process (Stephen King’s On Writing) to the technical (Roy Peter Clark’s Writing Tools). Writing well pays off in other areas: papers and grants are often best presented as stories, for example.

  5. Read widely. Read whatever you can: magazines, journals, high-brow literature, tweeny dystopia. All have value in terms of pacing, words and structures that you can draw upon. They also give you a wellspring of facts to call on.

  6. Word dump. ‘Pantsing’, where you just throw the words onto the page in a rough order and worry about shaping them later, can be an immensely helpful way to start writing. Hemingway reputedly described this process as ‘write drunk, edit sober’. The way the creative bit of my brain works is different to the critical part, and they don’t work well together – this is common for most people. It applies to all writing – theses, papers, applications. Get the words out there and then worry about whether they make sense later.

  7. Persist. The book I wanted to write about my fabulous life in science was rejected multiple times (as was my children’s book about an octopus who only laughs after he gets ten-tickles #spoiler) before the preliminary conversation that led to an entirely different book. As with all things in science, persisting does eventually pay off, but you need to develop a thick skin.

As the book finally reaches physical publication on 14 October 2021, I ask myself: was it worth it? I have no idea how it will affect my ‘career’. In terms of the route-one grants/papers academic track, it may not change that much. However, it will open opportunities for science communication through festivals and talks, which may in turn open up other unanticipated opportunities. It also gave much-needed structure to my pandemic year; writing a book is one of the things I have always wanted to do, and I really enjoyed it. So on balance, yes, it has been worth it – but ultimately judge for yourself: INFECTIOUS is out now.