Some sporadic insights into academia.
Science is Fascinating.
Scientists are slightly peculiar.
Here are the views of one of them.

Thursday, 25 May 2017

What actually prevents viral lung infection?

The protection provided by our immune system against infection is multi-layered. Each individual cell has a degree of self-defence, being able to recognise and kill infectious pathogens; this is called intrinsic immunity. Then there is a rapid response called the innate immune system that recognises infection in general. Finally, there is a pathogen-specific response tailored to each individual virus subtype, called adaptive immunity. Adaptive immunity in turn has several elements: a cellular arm made up of two flavours of T cells (CD4 and CD8), and an antibody arm, which is itself divided into five subtypes based on the structure of the immunoglobulin molecule produced. These are called IgA, IgD, IgE, IgG and IgM. Why they are not called IgA, B, C, D and E is unclear to me, but then again much of immunology nomenclature is opaque (think of the HLA/MHC gene numbering system – or don’t): some might say it is deliberately difficult, to keep out interlopers from other fields.


Whilst we know that these different components exist, what produces them and how they work to kill infections, we don’t have a complete picture of the relative contribution each component makes. Thanks to studies performed in the 1970s at the Common Cold Unit, Porton Down (in the rolling Wiltshire countryside of the UK), we do know that antibodies in the blood protect against influenza infection. In these studies, volunteers were deliberately infected with influenza and the rate of infection was compared with antibody levels in the blood. The researchers found that volunteers whose blood scored greater than 40 on a particular test called haemagglutination inhibition (HAI), which measures the functional activity of antibodies, were significantly less likely to get infected. This benchmark number of 1:40 is now used to assess new vaccines. However, the HAI test only assesses one arm of the immune system – IgG. We were interested in the role of other components.


In order to assess the role of another antibody subtype, IgA, our recently published study went back to human challenge: working with a biotech company, Altimmune, we deliberately infected volunteers with influenza. However, in this study individuals were deliberately selected who had a sub-protective HAI titre. This enabled us to look at the role of other components without the masking effect of blood IgG. Having screened the volunteers to have low levels of functional antibody in the blood, one prediction might be that they should all get infected. However, of the 47 volunteers challenged, 15 had no recoverable virus or symptoms of infection. This suggests that there are indeed other factors that can protect against infection. We measured influenza-specific antibody and found that volunteers with high levels of flu-binding IgA in their nose or their blood produced less virus over the course of the study. This suggests that IgA can also protect against flu.


However, there were volunteers with low IgA and low IgG who didn’t get infected, suggesting that there are additional factors contributing to protection. We have data suggesting that CD8 T cells could also be playing a role. CD8 T cells, also called cytotoxic T cells, work by recognising little bits of virus that are displayed on the surface of infected cells as little flags of infection. Recently, it has been shown that there is a special population of T cells that live in the lungs and are primed to recognise and prevent infections. We found high levels of these cells in the lung after a viral infection (respiratory syncytial virus, RSV, which has a very large burden of disease in children). What was really striking was that by transferring these cells alone from an animal that had been exposed to RSV to one that hadn’t, we could also transfer protection against infection. This means that CD8 T cells are also able to protect against infection; the full study is described in our paper in Mucosal Immunology.

A model: 
So where does this leave us? We think there is a layered defence against infection. IgA, which is mostly found in the upper airway, forms a barrier to the virus getting into cells in the first place. If this barrier is breached, then the IgG prevents the virus from moving from the upper to the lower airway. If the IgG fails to prevent infection of the lungs, CD8 T cells resident in the lungs rapidly kill the infected cells reducing the burden of disease. What this means is that when designing vaccines for these infections, we need to target all three components of the immune response for the best protection.

Friday, 21 April 2017

How to turn 19,000 data points into 1 graph.

Science is stories.

Good stories move science forwards. The stories come from the data, and turning data into a story is a long and iterative process. The more data you have, the longer it can take; as our tools get better at producing more data per sample, it is getting harder to find the story. In our recently published study (Inflammatory Responses to Influenza Vaccination at the Extremes of Age) we measured 27 different mediators after giving 2 different vaccines, 3 times, to 3 different ages of mouse, sampling at 8 timepoints after vaccination with 5 replicate animals at each timepoint, leading to 19,440 data points. This was a tricky knot to unpick.
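If you want to see where that 19,440 comes from, it falls straight out of multiplying the study design together (a quick sanity check, assuming the design is fully crossed and reading "3 times" as three immunisations per schedule):

```python
# Sanity check of the dataset size, assuming a fully crossed design:
# 27 mediators x 2 vaccines x 3 immunisations x 3 age groups
# x 8 timepoints x 5 replicate animals.
mediators = 27
vaccines = 2
immunisations = 3
age_groups = 3
timepoints = 8
replicates = 5

total = mediators * vaccines * immunisations * age_groups * timepoints * replicates
print(total)  # 19440
```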

Inflammatory responses

The aim of the study was to investigate whether age changed the immune response to vaccination. In particular we were interested in whether age affected inflammation after immunisation. Inflammation sounds bad, but we actually need a small amount to kick the immune system and make the vaccine work. We know that vaccines work less well at the extremes of age and wanted to determine whether the initial reaction to the vaccine shaped how well it worked. To investigate the inflammatory response, we used a tool called Luminex. Luminex measures chemical messengers in the blood called cytokines; these chemical messengers recruit cells of the immune system to the site of vaccination, activate them and shape the type of response they generate. However, as mentioned, Luminex generates LOTS of data: 19,440 data points. The first time we had the complete dataset, we had to book a study room to have sufficient space to spread out all the bits of paper with the data on. So how did we move it from there into a story?

Data Compression
It took four things: perseverance, perspective, peer review and bioinformatics.

Perseverance: With any dataset, but large ones in particular, time is the most critical factor in finding the story. You need to spend time with the dataset, getting to know it, formatting and reformatting: sorting by size, by time, alphabetically, into classes of cytokines. Analysis can’t be done piecemeal; several times I would get close to understanding the data but then have to take time off to do something else, and when I came back to the data I would have forgotten the trends I had been close to identifying and had to start from scratch. There were several dead ends and times when I wanted to give up, as there was no discernible pattern in the data.

Perspective: That said, analysis can’t all be done in one sitting. You need time for the subconscious to churn it through, you need to read around the subject to see what other people have seen, you need conversations with colleagues and chance insights when on the loo. The creative process can’t be rushed.

Peer-review: Exposing your precious story to the slings and arrows of outrageous review is often frustrating and can be soul-destroying. However, in this case (and I grudgingly admit quite frequently for other studies) peer review significantly improved the paper. It gave us time and perspective to rethink the conclusions and suggested new ways of analysing and thinking about the dataset.

Bioinformatics: It turns out that, whilst easy and accessible, Excel may not be the most effective tool for looking at big datasets. There are a range of other bioinformatic tools which can help in the analysis. In this case we used principal component analysis (PCA). Now, I have no idea how the maths behind this actually works, but I do know it squishes the many measured variables down into 2, so that you can see broad trends in the data and then go back and look for individual variables of interest.
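For the curious, the squishing can be sketched in a few lines. This is a minimal, hypothetical illustration (random stand-in numbers, not the study's measurements, and the paper's actual software may differ): centre a samples-by-cytokines matrix and project it onto its first two principal components via a singular value decomposition.

```python
import numpy as np

# Hypothetical example: project a (samples x cytokines) matrix onto its
# first two principal components. The data are random stand-ins.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 27))          # 30 samples, 27 cytokines

Xc = X - X.mean(axis=0)                # centre each cytokine column
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                 # each sample's coordinates on PC1, PC2

print(scores.shape)  # (30, 2)
```

Each row of `scores` places one sample in a two-dimensional plot, which is where the broad trends become visible.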

So what did we learn?                        

Having spent time staring at the data, a number of patterns did emerge. First of all, age is a major factor in the inflammatory response to vaccination, with different cytokines being produced in young, adult and elderly animals. Secondly, adjuvants can shape the response. Adjuvants are compounds that improve vaccine efficacy; the addition of an adjuvant called MF59 reduced age-associated differences, inducing higher levels of the cytokines IL-5, G-CSF, KC, and MCP-1. The levels of these four cytokines correlated with the level of antibody produced after vaccination. This is important because it shows that poor responses at the extremes of age can be overcome through the addition of adjuvants; it also gives us some insight into which responses to a vaccine lead to the best results. Taking a complex (and large) dataset and turning it into a story was a lengthy process, but has helped us understand more about the immune response to vaccines.

Tuesday, 28 February 2017

From Great sweetness came forth infection.

Bacteria, like all living things, need food to grow. The bacteria that infect us are no exception to this, and their food source is us! The airways are surprisingly rich in nutrients for bacterial growth: some of this comes from the food we eat (micro-inhalation) and some leaks out from the blood or the cells lining the airways. We know that underlying lung diseases increase the risk of bacterial infection, and we have recently shown that this is related to the levels of glucose in the airways. We think that this works a little like leaving a jam jar open – bacteria will colonise and grow on the available sugar.

New Treatments for Bad Bugs

Antibiotic-resistant bacteria (bacteria that are not killed by antibiotics) are a crisis in global health. If antibiotics stop working, as well as an increase in the severity of currently treatable infections, many of the medical advances of the last 50 years, including surgery and transplantation, also become ineffective. We therefore need new ways of killing bacteria. This could be either by finding drugs that directly attack the bacteria, or by changing strategies.

War on bugs

Our finding that bacteria grow better when sugar is high opens up a new treatment strategy – starving the bug, rather than attacking it. In our recent study, we investigated whether an anti-diabetic drug (Dapagliflozin, made by AstraZeneca) could prevent bacterial lung infection. Treating diabetic mice with Dapagliflozin reduced their blood sugar; critically, it also reduced airway sugar levels. The reduction in airway sugar led to a reduction in bacterial infection in the drug-treated mice. We have seen a similar effect using another anti-diabetic drug – metformin. These studies suggest that reducing blood and lung sugar will reduce the number of infections seen in people with diabetes.

Sunday, 1 January 2017

New Year's Resolution 2017

My first resolution is a work-centric one. It is not dissimilar to the resolution I made in 2016 (and 2015, 2014 and 2013). It is to publish 10 papers in the same year and to get promoted! In some ways, this is the academic equivalent of saying that I will quit smoking and lose 2st (12kg) in weight: it is aspirational, but lacks the detail needed to achieve it.
The second resolution is a political call to arms, to myself and the whole academic community. I think it is fair to say that we, the experts, lost 2016. Somewhere in post-truth politics, our voices stopped being heard. In the next four years, the truths I hold to be self-evident – that vaccines work, evolution happens and the climate is changing – will be under attack and no amount of clever Facebook posts that I make to my like-minded friends will help defend them. I need to come up with better ways to get the message across: fighting rhetoric with reason, fear with facts and populism with pragmatism.
It’s going to be a long year.

This post first appeared on Times Higher Education on the 5th Jan 2017

Saturday, 31 December 2016

Support basic science

The promised injection of £2 billion into the UK science ecosystem is without doubt a good thing. However, there is some uncertainty as to how it will be handed out.
Since this is taxpayers’ money, there needs to be a demonstration that the money has been “well spent”: the big question, then, is what defines well-spent science funding? In the event of the government not opting for the “give it all to John Tregoning” option, I wanted to make a case for the funding of basic science.
Translation versus inspiration
While all science involves repeated testing of ideas, we artificially split the world of scientific effort into two very broad areas: basic science (pure research, learning about stuff for the sake of learning); and translational science (testing things like drugs, chemicals, devices, bridges and computers to improve the quality of human existence).
To those with a commercial mindset, the translational approach has the greater value. You put money in, you get better stuff out. So why invest in pure research?
Essentially, basic science underpins translational research: the ideas about how to make stuff better come out of pure research. Lots of modern engineering depends on us understanding how gravity works, but Newton’s aim wasn’t to put rockets on the moon. While the results are not immediately tangible, basic science underpins technologies that are the foundations of billion-dollar industries – for example cancer immunotherapy, lasers, the internet, GPS, fluorescent and luminescent proteins.
I strongly believe that we need both: funding translational science at the expense of basic science may pay off in the short term, but it damages advances in the long term.
The home of basic research
I also believe that in the current research ecosystem, universities are best placed to deliver the pure research and companies small and large are best placed to develop it into real things.
Companies utilise (and often contribute to) the basic research performed by academia, but rarely initiate basic research programmes themselves, though there are exceptions: the IBM Zurich Research Laboratory (which has gained two Nobel Prizes) has just celebrated its 60th year, and AT&T’s Bell Labs earned eight Nobel Prizes.
If universities are initiating the research, it raises a question about who financially benefits from the basic research, as the money may not seem to come directly back to the originator. But it will trickle back in tax revenue, employment, better medicines, cleaner cars and other indirect benefits.
This is a strength of bringing Innovate UK and RCUK (Research Councils UK) into one umbrella organisation, enabling the flow from academic basic science, to innovation led by small and medium-sized enterprises (i.e. firms with up to 250 employees), to large-company implementation.
The other benefit of basic research is the teaching and training element.
The economy needs people with science backgrounds. A PhD provides the student with very much more than just the ability to move colourless liquids around – it gives them problem-solving, teamwork and analytical skills, tenacity, flexibility and independence. But just as no one expects doctors to train without ever seeing a patient, the best way to learn science is by doing science.
Basic research delivers this apprenticeship in science. To quote the National Science Foundation in the US: “Basic science is a gamble because it deals with the unknown, but a sure thing because it always leads to improvements in knowledge.”
Reap what you sow
The good news is that the public have repeatedly demonstrated support for basic science: a 2014 survey by the British Science Association reported that 8 out of 10 people questioned supported research with no immediate benefit. So please include basic research in the mix – not to the exclusion of work with an immediate pay-off, but as part of a long term strategy to further develop our scientific excellence.
To paraphrase John F. Kennedy: we choose to do the research we do, not because it is easy, but because it is hard; we choose to do basic science because it is there and new hopes for knowledge are there and we are going to climb these mountains.
Surely that is as uplifting a message as we can hope to end 2016 on.
This article was first published on the Times Higher Education Supplement 31/12/16

Monday, 19 December 2016

I need space to breathe, to create

Creativity – probably the best PI skill in the world

What is the most important skill to become a PI? An eye for numbers, an ability to perform repetitive tasks accurately, optimism in the face of relentless failure, the ability to play nicely with others, sheer bloody-mindedness, self-belief? All of these skills will strap you into the driving seat, but once there, you’ll need to press the pedals yourself. The most vital skill is creativity: the ability to see new connections — linking old data in new ways and using what we do know to interpret what we don’t.
Creativity is the most nebulous, ephemeral, and elusive of qualities and often feels at odds with the scientific process, but without creativity, you ain’t going nowhere.
In my experience there’s an arc to developing an idea. It starts with staring in despair at a steaming pile of mismatched data that has recently been deposited onto your desk. After the initial shock, you might begin to see strands of a story coalescing. You start to sew it together, ambitiously demanding new datasets and proposing experiments that will never be undertaken.
Finally, you pull all of the ideas into a shining gem of scientific writing, polished to perfection for your dream journal, only to have it crushed by some faceless, nameless, and possibly soulless reviewer and have to begin again. However, these steps are extremely tricky and involve a lot of tea, pacing round the office and crumpled sheets of paper. Here are some things that may help you to have, and then develop your ideas.

Be receptive

Ideas come at the most inconvenient of times — at 4 in the morning or when you have no access to pen/paper/internet. Accept this and provide yourself with tools to mitigate it: keep a pen beside your bed; use the notes feature on your phone; carry a notebook everywhere.

Stand on the shoulders of giants: read

There are no new ideas. Everything is a development from something else: this makes it both easier and harder. Easier because you can read around and adapt ideas from other disciplines; harder because someone else has no doubt had the same idea, reducing its novelty, impact and therefore marketability.

Follow your dreams

Allow yourself periods of not actively thinking about an idea — when you come back to it the problem will often be clearer. A lot of the heavy lifting can be done by your subconscious; give it time to do the groundwork and feed it by reading around the topic. But try to keep it focussed, as it is prone to drift off to the land of chocolate (mmmm. Chocolate).

Work the problem

“My subconscious is working on my grant” is a great excuse, but doesn’t get you funded — you do actually have to do something. Even if all you have to show for it is a bin full of crumpled paper; sitting, thinking and writing are all needed to add substance to any idea. I’ve spent many mornings going round in circles stuck on a particular issue, but you need to put in those miles in order to achieve breakthroughs. The trickiest part is knowing when to push and when to stop.

Take a break

Even short breaks can help. Archimedes had his Eureka moment in the bath, Newton was chillin’ by a tree when he got beaned by the apple good, and programmers have been communing with rubber ducks for 17 years. The first two probably didn’t happen (and the third, bizarrely, does) but that’s beside the point — stepping away from your desk can often lead to moments of clarity.

Don’t overthink it

Ideas are strange, ephemeral things, and in their earliest stages they are staggeringly easy to destroy: direct scrutiny is the death of creativity. To paraphrase Douglas Adams, the brain just edits them out, like a blind spot: your only hope is to catch them by surprise out of the corner of your eye. There is a difference between coming up with an idea, when you need to be creative, imaginative and think of the big picture; and developing an idea, when you need to be critical, analytical and focussed on the details.

It’s good to talk (but only sometimes)

It can help to discuss your idea with someone else as advice is always valuable, but you need to find the right person. Some people are good at giving unstructured support. Others are more critical, which can make your ideas stronger, but it can also kill them stone dead. Be clear with what you need when approaching someone for advice.
The timing of the discussion is critical. At their inception, when I can’t even find the words to describe the ideas to myself, there is no point trying to describe them to others; I get tongue-tied and frustrated while the person I am talking to just stares, bewildered. As the ideas become more formed, my excitement increases, but they are no less fragile.
When they’re developing, but not complete, the ideas (and I) both need unconditional praise to develop further: detailed questioning can make me doubt my idea, lose enthusiasm and bin the whole thing, including the good bits. Finally, only when fully mature do I feel robust enough for ‘instant feedback’.
The single best thing about academia is that you get to have ideas and test them, no matter how crazy they are. But you must feed the beast: it takes more than one good idea to sustain a career. Yes, a “break-in” idea might get you your first PI job, but you must maintain a stream of ideas at various stages of development, from half-baked plans devised in the pub to rejected grants. So get out there and start thinking.
This article first appeared on Nature Jobs Blog on 19 Dec 2016.
One additional tip that was too risque for the original: 

Have a (mind) dump
Even short breaks can help. Whilst Archimedes had his Eureka moment in the bath, there are other small rooms where water displacement and inspiration are linked. Stepping away from your desk can often lead to moments of clarity.

Friday, 11 November 2016

Take my advice (or don't)

Academia is a complex, challenging, highly competitive career and it is easy to feel lost. In the absence of a simple route from PhD to professor, we are forced to hunt for advice.
The problem is how does one get good advice? There is certainly no shortage of it; advice can be found everywhere, from your mate in a pub beer garden; to colleagues, coaches, mentors and heads of department; through training courses, conferences and lectures; to books and the infinite echo chamber of the internet. Some of it is excellent (may I, ahem, recommend this excellent blog), some of it is excrement.
The problem is not really finding advice, but acting upon the right advice. Sometimes this is because the advice is poor, but more often it is because the recipient is not receptive, owing to hubris, egocentric bias, emotional investment, mistiming, lack of head space, failure to understand or advice saturation.
Here are five scenarios in which advice, however good, may not be acted upon:
Unique and beautiful snowflakes: all of us face different challenges at different times. These challenges are different to those faced by the generation before us (the people we often turn to for advice). The circumstances and career path for me as a lecturer are different to the professors in front of me and the postdocs behind me, leading to a misalignment of advice and problem.
Change sucks: sometimes to act on advice requires change. Change is hard at the best of times; change when it implies you have been doing something wrong, impossible.
You just don’t understand me: incorporation of feedback is inversely proportional to emotional investment. Often the advice sought concerns a piece of work into which you have invested considerable effort, sweat and tears. It is easy to confuse feedback with criticism.
Ostrich approach: additionally, if the advice received about a paper or grant identifies a problem that is difficult to solve, it can be easier not to address it and hope that the reviewers don’t spot the same problem (trust me, they always do).
I just want to be loved: however, sometimes when we say we are looking for feedback, we are actually looking for validation. Honest feedback may be useful in the long run, but when you have hit a wall, there are times when encouragement and support are more valuable.

So how to get more out of advice received?

Respect your elders. The first place most of us look for advice is senior faculty, and there are two good reasons to listen to them. First, academia hasn’t changed that much since monks set up the first universities, so their experience is still relevant. Second, senior faculty sit on the grant panels and promotion boards that you are targeting. They know what works and more importantly what doesn’t work. If they raise red flags about your work, it is likely that their peers, who are evaluating you for real, will raise the same red flags. Don’t ignore feedback identifying problems in your work, however difficult they are to fix.

Role model(s). We all need a role model – someone who has got to where you want to be, in whose footsteps we can tread. Of these people, there will be some people with whom you resonate more, whose advice is phrased in a way that is easier for you to take. Identify them and turn to them more often.

But don’t stop at one – have many role models. The routes to the end are many and varied. Different people have different skills and experience that you can draw upon. Jim Collins, who teaches and writes about leadership, advises establishing your own personal board of directors. I use Peter for politics, Robin for Research, Alan for all matters recruitment, Charlie for choice words of support and Sarah for sense and sensibility (admittedly, I am lucky to have friends whose names conveniently align with their expertise).
They don’t all have to be university-based: people outside academia have useful opinions too.

Negative role model. While there are people with whom you resonate, there are inevitably others with whom you don’t, be it a bad ex-boss, an uncollaborative collaborator or a conniving colleague. Identify patterns of behaviour in these people whom you find loathsome and make an effort to do the opposite.

Be clear what you need. Advice can be great, and there is no shortage of advice or people willing to give it. Don’t be shy about approaching people; everyone likes to give advice. But be clear in your mind when you need brutally honest feedback and when you need a hug. Compartmentalise advisers into those who will give you the unpalatable truth and those who rose-tint your world. And when you do approach someone, be very specific with the questions you ask; if you say “what should I do with my life?” a professor doesn’t know where to begin. If you say “I am considering x or y but not sure how to think about it. I’d love your thoughts”, then it’s easier to engage and be practical.

Stress-test it. Finally, we are scientists: we test hypotheses. Take this approach to advice. The best way to decide whether to follow someone’s advice is to see if it actually works. However, don’t let the adviser know, or they may not be so forthcoming.

This article first appeared on the Times Higher Education website on 10th October 2016