Some sporadic insights into academia.
Science is Fascinating.
Scientists are slightly peculiar.
Here are the views of one of them.

Saturday, 31 December 2016

Support basic science

The promised injection of £2 billion into the UK science ecosystem is without doubt a good thing. However, there is some uncertainty as to how it will be handed out.
Since this is taxpayers’ money, there needs to be a demonstration that the money has been “well spent”: the big question, then, is what defines well-spent science funding? In the event of the government not opting for the “give it all to John Tregoning” option, I wanted to make a case for the funding of basic science.
Translation versus inspiration
While all science involves repeated testing of ideas, we artificially split the world of scientific effort into two very broad areas: basic science (pure research, learning about stuff for the sake of learning); and translational science (testing things like drugs, chemicals, devices, bridges and computers to improve the quality of human existence).
To those with a commercial mindset, the translational approach has the greater value. You put money in, you get better stuff out. So why invest in pure research?
Essentially, basic science underpins translational research: the ideas about how to make stuff better come out of pure research. Lots of modern engineering depends on us understanding how gravity works, but Newton’s aim wasn’t to put rockets on the moon. While the results are not immediately tangible, basic science underpins technologies that are the foundations of billion-dollar industries – for example cancer immunotherapy, lasers, the internet, GPS, fluorescent and luminescent proteins.
I strongly believe that we need both: funding translational science at the expense of basic science may pay off in the short term, but it damages advances in the long term.
The home of basic research
I also believe that in the current research ecosystem, universities are best placed to deliver the pure research and companies small and large are best placed to develop it into real things.
Companies utilise (and often contribute to) the basic research performed by academia, but rarely initiate basic research programmes themselves. There are exceptions: the IBM Zurich Research Laboratory (home to two Nobel prizes) has just celebrated its 60th anniversary, and AT&T's Bell Labs earned eight Nobel prizes.
If universities are initiating the research, it raises a question about who financially benefits from the basic research, as the money may not seem to come directly back to the originator. But it will trickle back in tax revenue, employment, better medicines, cleaner cars and other indirect benefits.
This is a strength of bringing Innovate UK and RCUK (Research Councils UK) under one umbrella organisation, enabling the flow from academic basic science, to innovation led by small and medium enterprises (i.e., firms with up to 250 employees), to large-company implementation.
Teaching
The other benefit of basic research is the teaching and training element.
The economy needs people with science backgrounds. A PhD provides the student with very much more than just the ability to move colourless liquids around – it gives them problem-solving, teamwork and analytical skills, tenacity, flexibility and independence. But just as no one expects doctors to train without ever seeing a patient, the best way to learn science is by doing science.
Basic research delivers this apprenticeship in science. To quote the National Science Foundation in the US: “Basic science is a gamble because it deals with the unknown, but a sure thing because it always leads to improvements in knowledge.”
Reap what you sow
The good news is that the public have repeatedly demonstrated support for basic science: a 2014 survey by the British Science Association reported that 8 out of 10 people questioned supported research with no immediate benefit. So please include basic research in the mix – not to the exclusion of work with an immediate pay-off, but as part of a long-term strategy to further develop our scientific excellence.
To paraphrase John F. Kennedy: we choose to do the research we do, not because it is easy, but because it is hard; we choose to do basic science because it is there and new hopes for knowledge are there and we are going to climb these mountains.
Surely that is as uplifting a message as we can hope to end 2016 on.
This article was first published on the Times Higher Education Supplement 31/12/16

Monday, 19 December 2016

I need space to breathe, to create

Creativity – probably the best PI skill in the world

What is the most important skill to become a PI? An eye for numbers, an ability to perform repetitive tasks accurately, optimism in the face of relentless failure, the ability to play nicely with others, sheer bloody-mindedness, self-belief? All of these skills will strap you into the driving seat but once there, you’ll need to press the pedals yourself. The most vital skill is creativity: the ability to see new connections, linking old data in new ways and using what we do know to interpret what we don’t.
Creativity is the most nebulous, ephemeral, and elusive of qualities and often feels at odds with the scientific process, but without creativity, you ain’t going nowhere.
In my experience there’s an arc to developing an idea. It starts with staring in despair at a steaming pile of mismatched data that has recently been deposited onto your desk. After the initial shock, you might begin to see strands of a story coalescing. You start to sew it together, ambitiously demanding new datasets and proposing experiments that will never be undertaken.
Finally, you pull all of the ideas into a shining gem of scientific writing, polished to perfection for your dream journal, only to have it crushed by some faceless, nameless, and possibly soulless reviewer and have to begin again. However, these steps are extremely tricky and involve a lot of tea, pacing round the office and crumpled sheets of paper. Here are some things that may help you to have, and then develop your ideas.

Be receptive

Ideas come at the most inconvenient of times — at 4 in the morning or when you have no access to pen/paper/internet. Accept this and provide yourself with tools to mitigate it: keep a pen beside your bed; use the notes feature on your phone; carry a notebook everywhere.

Stand on the shoulders of giants: read

There are no new ideas. Everything is a development from something else: this makes it both easier and harder. Easier because you can read around and adapt ideas from other disciplines; harder because someone else has no doubt had the same idea, reducing its novelty, impact and therefore marketability.

Follow your dreams

Allow yourself periods of not actively thinking about an idea — when you come back to it the problem will often be clearer. A lot of the heavy lifting can be done by your subconscious; give it time to do the groundwork and feed it by reading around the topic. But try to keep it focussed, as it is prone to drift off to the land of chocolate (mmmm. Chocolate).

Work the problem

“My subconscious is working on my grant” is a great excuse, but doesn’t get you funded — you do actually have to do something. Even if all you have to show for it is a bin full of crumpled paper, sitting, thinking and writing are all needed to add substance to any idea. I’ve spent many mornings going round in circles stuck on a particular issue, but you need to put in those miles in order to achieve breakthroughs. The trickiest part is knowing when to push and when to stop.

Take a break

Even short breaks can help. Archimedes had his Eureka moment in the bath, Newton was chillin’ by a tree when he got beaned good by the apple, and programmers have been communing with rubber ducks for 17 years. The first two probably didn’t happen (and the third, bizarrely, does) but that’s beside the point — stepping away from your desk can often lead to moments of clarity.

Don’t overthink it

Ideas are strange ephemeral things and in their earliest stages, they are staggeringly easy to destroy: direct scrutiny is the death of creativity. To paraphrase Douglas Adams, the brain just edits them out, like a blind spot: your only hope is to catch them by surprise out of the corner of your eye. There is a difference between coming up with an idea, when you need to be creative, imaginative and think of the big picture; and developing an idea, when you need to be critical, analytical and focussed on the details.

It’s good to talk (but only sometimes)

It can help to discuss your idea with someone else as advice is always valuable, but you need to find the right person. Some people are good at giving unstructured support. Others are more critical, which can make your ideas stronger, but it can also kill them stone dead. Be clear with what you need when approaching someone for advice.
The timing of the discussion is critical. At their inception, when I can’t even find the words to describe the ideas to myself, there is no point trying to describe them to others; I get tongue-tied and frustrated while the person I am talking to just stares, bewildered. As the ideas become more formed, my excitement increases, but they are no less fragile.
When they’re developing, but not complete, the ideas (and I) both need unconditional praise to develop further: detailed questioning can make me doubt my idea, lose enthusiasm and bin the whole thing, including the good bits. Finally, only when fully mature, do I feel robust enough for ‘instant feedback’.
The single best thing about academia is that you get to have ideas and test them, no matter how crazy they are. But you must feed the beast: it takes more than one good idea to sustain a career. Yes, a “break-in” idea might get you your first PI job, but you must maintain a stream of ideas at various stages of development, from half-baked plan devised in the pub to rejected grant. So get out there and start thinking.
This article first appeared on Nature Jobs Blog on 19 Dec 2016.
One additional tip that was too risque for the original: 

Have a (mind) dump
Even short breaks can help. Whilst Archimedes had his Eureka moment in the bath, there are other small rooms where water displacement and inspiration are linked. Stepping away from your desk can often lead to moments of clarity.

Friday, 11 November 2016

Take my advice (or don't)

Academia is a complex, challenging, highly competitive career and it is easy to feel lost. In the absence of a simple route from PhD to professor, we are forced to hunt for advice.
The problem is how does one get good advice? There is certainly no shortage of it; advice can be found everywhere, from your mate in a pub beer garden; to colleagues, coaches, mentors and heads of department; through training courses, conferences and lectures; to books and the infinite echo chamber of the internet. Some of it is excellent (may I, ahem, recommend this excellent blog), some of it is excrement.
The problem is not really finding advice, but acting upon the right advice. Sometimes this is because the advice is poor, but more often it is because the recipient is not receptive, thanks to hubris, egocentric bias, emotional investment, mistiming, lack of head space, failure to understand or advice saturation.
Here are five scenarios in which advice, however good, may not be acted upon:
Unique and beautiful snowflakes: all of us face different challenges at different times. These challenges are different to those faced by the generation before us (the people we often turn to for advice). The circumstances and career path for me as a lecturer are different to the professors in front of me and the postdocs behind me, leading to a misalignment of advice and problem.
Change sucks: sometimes to act on advice requires change. Change is hard at the best of times; change when it implies you have been doing something wrong, impossible.
You just don’t understand me: incorporation of feedback is inversely proportional to emotional investment. Often the advice sought concerns a piece of work into which you have invested considerable effort, sweat and tears. It is easy to confuse feedback with criticism.
Ostrich approach: additionally, if the advice received about a paper or grant identifies a problem that is difficult to solve, it can be easier not to address it and hope that the reviewers don’t spot the same problem (trust me, they always do).
I just want to be loved: however, sometimes when we say we are looking for feedback, we are actually looking for validation. Honest feedback may be useful in the long run, but when you have hit a wall, there are times when encouragement and support are more valuable.

So how to get more out of advice received?

Respect your elders. The first place most of us look for advice is senior faculty, and there are two good reasons to listen to them. First, academia hasn’t changed that much since monks set up the first universities, so their experience is still relevant. Second, senior faculty sit on the grant panels and promotion boards that you are targeting. They know what works and more importantly what doesn’t work. If they raise red flags about your work, it is likely that their peers, who are evaluating you for real, will raise the same red flags. Don’t ignore feedback identifying problems in your work, however difficult they are to fix.

Role model(s). We all need a role model – someone who has got to where you want to be, in whose footsteps you can tread. Of these, there will be some people with whom you resonate more, whose advice is phrased in a way that is easier for you to take. Identify them and turn to them more often.

But don’t stop at one – have many role models. The routes to the end are many and varied. Different people have different skills and experience that you can draw upon. Jim Collins, who teaches and writes about leadership, advises establishing your own personal board of directors. I use Peter for politics, Robin for Research, Alan for all matters recruitment, Charlie for choice words of support and Sarah for sense and sensibility (admittedly, I am lucky to have friends whose names conveniently align with their expertise).
They don’t all have to be university-based: people outside academia have useful opinions too.

Negative role model. While there are people with whom you resonate, there are inevitably others with whom you don’t, be it a bad ex-boss, an uncollaborative collaborator or a conniving colleague. Identify the patterns of behaviour that you find loathsome in these people and make an effort to do the opposite.

Be clear what you need. Advice can be great and there is no shortage of advice or of people willing to give it. Don’t be shy about approaching people; everyone likes to give advice. But be clear in your mind when you need overly honest feedback and when you need a hug. Compartmentalise advisers into those who will give you the unpalatable truth and those who rose-tint your world. And when you do approach someone, be very specific with the questions you ask; if you say “what should I do with my life?” a professor doesn’t know where to begin. If you say “I am considering x or y but not sure how to think about it. I’d love your thoughts”, then it’s easier to engage and be practical.


Stress-test it. Finally, we are scientists, we test hypotheses. Take this approach to advice. The best way to decide whether to follow someone’s advice is to see if it actually works. However, don’t let the adviser know or they may not be so forthcoming.

This article first appeared on the Times Higher Education website on 10th October 2016

Tuesday, 1 November 2016

#realworldacademic

On Saturday 29th October, the Rt Hon Glyn Davies MP declared on Twitter that: “Personally, never thought of academics as 'experts'. No experience of the real world.” This is a widely held but wrong belief, and Twitter was quick to point this out to Glyn Davies and the world. The reasons why fell into three categories – prior experience, the job of a modern academic and the research we do.

The first stereotype is that academics have never left university and are perpetual students, cut off from the “real world” of work outside the ivory tower. But it was clear from Twitter that academics come from a huge range of backgrounds, having worked in environments as diverse as prisons, schools, hospitals, call centres, shops, banks and the armed forces. This work was done either in a previous life, or as a means to subsidise student fees while becoming established, or in several cases as part of the academic role itself – as doctors, dentists, civil engineers, scientific advisors, legal experts and so on.
The second stereotype of academics is that we sit in common rooms smoking pipes, drinking sherry and pondering the nature of life. If only! I’ve written about this before, but the life of a modern academic is extremely varied. As researchers, we are essentially each running an average-sized, not-for-profit small business (an SME) with an annual turnover between £100k and £500k. To sustain that company we need to apply for funding; manage the funds we have; purchase materials and equipment, some of which is extremely specialist, even unique; train and manage staff working with extremely dangerous materials; publicise the current work and plan the next round. As teachers we need to inspire and educate the next generation, in formats ranging from 300 students in one lecture, to small-group tutorials, to practical labs and thesis supervision. On top of this we are expected to help with the administration of large, complex organisations with upwards of 10,000 staff. All of these, I would argue, are skills that are pretty standard in the ‘real world’ outside academia.
The final, pernicious stereotype is that the research we do has no impact on the problems of people who are not academics: essentially, that we are using public money to navel-gaze and show off to other academics for no actual purpose. This is also false. There is an increasing emphasis on the ‘impact’ of our work, and while there are arguments about whether pure research is better than applied and whether the metrics of impact are valid, there are clear examples of how academic research directly impacts the world outside the university.

However the final word needs to go to this gem, for pointing out the irony of an MP critiquing the worldliness or otherwise of academics:

Monday, 24 October 2016

Time for T? The role of T cells in influenza vaccination.

It is approaching flu season again. If you are in any way susceptible to flu – over 65, or with an underlying health condition (asthma, MS, pregnancy) – get the flu vaccine (I have nagged my parents already, but Mum, Dad, if you are reading this, don’t forget). The current flu vaccine works well and will protect you against this year’s flu.

See my coat of many, many colours

But we could do better. Vaccines work by training your body to recognise the coat that surrounds the flu virus. The big problem is that flu mutates, changing its coat and hiding from our immune system. The gene-copying machinery encoded by the influenza virus is inefficient (leaky). Imagine using a photocopier to make repeat copies of the same document, each time using the copy as the template for the next round; over time the quality of the copies declines. Genetic material is copied in the same way, with faults in the copies leading to errors in genes (mutations), most of which are harmless, some harmful and some beneficial – this is the driving engine of evolution. Eukaryotic cells (us) have proofreading built into the gene-copying machinery, so faulty copies are corrected or discarded; viruses like influenza do not have this, so the chance of a faulty copy increases. This means that viruses can mutate and change quickly, leading to the emergence of new viruses with new coats each year, necessitating new vaccines.
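The photocopier analogy is easy to sketch as a toy simulation. The snippet below is purely illustrative – the sequence length, generation count and error rates are invented for the demonstration, not real influenza figures – but it shows why a leaky copier drifts while a proofread one barely moves:

```python
import random

BASES = "ACGT"

def copy_seq(seq, error_rate, rng):
    """Copy a sequence, miscopying each base with the given probability."""
    return "".join(
        rng.choice([b for b in BASES if b != c]) if rng.random() < error_rate else c
        for c in seq
    )

def mutations_after(generations, error_rate, seed=0):
    """Copy a toy 1,000-base gene repeatedly (each copy becomes the next
    template, like the photocopier) and count positions that have drifted."""
    rng = random.Random(seed)
    original = "ACGT" * 250
    seq = original
    for _ in range(generations):
        seq = copy_seq(seq, error_rate, rng)
    return sum(a != b for a, b in zip(original, seq))

# a proofreading copier (low error rate) versus a leaky viral one
print("proofread:", mutations_after(20, 1e-5))
print("leaky:    ", mutations_after(20, 1e-3))
```

After 20 generations the leaky copier has scrambled far more of the "coat" than the proofread one – which is the gap the immune system has to keep chasing.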

Killer cells

A goal of influenza research is to develop vaccines that cover a wider spectrum of viruses – the universal flu vaccine. But to achieve this goal, we need to understand more about the way in which the body fights off flu infection. This is what we set out to do in our recently published paper: “DNA Vaccines Encoding Antigen Targeted to MHC Class II Induce Influenza-Specific CD8+ T Cell Responses, Enabling Faster Resolution of Influenza Disease”. Using unique vaccines from our Norwegian collaborators, Vaccibody, we dissected one aspect of the immune response: the CD8 T cell. These cells are able to sense when other cells have viruses in them and then kill the infected cells. The vaccines are designed to target different types of cells and can be used to alter the flavour of the immune response. Using a CD8 T cell-specific Vaccibody, we showed that vaccines that evoke a T cell response led to a faster resolution of disease – infected animals got better, quicker. This is important because the influenza virus may be less good at escaping CD8 T cells than other parts of the immune response. Based on these studies, we believe that the next generation of influenza vaccines needs to increase the CD8 T cell response. This idea is supported by research performed during the influenza pandemic in 2009: patients who had a functional CD8 response were much less likely to get sick after infection. The more we understand about the immune response, the better vaccines get and the fewer people will get sick from infections.

Thursday, 8 September 2016

Startup Advice

What was the most helpful advice I got when starting my group? There is no answer to that question because, like most of my colleagues, I didn’t get any. I got the keys to the lab, a nice pub lunch, a PC, a small amount of start-up money and a seat in a shared office in which the most commonly used word was “fuck”. But while there is no disgrace in lying for comic effect, I should confess that, in reality, I did get two pieces of advice. One was a not-so-helpful recommendation to never become a PI in the first place because it was hard and getting harder. The other, much more helpful recommendation was to read Kathy Barker’s book At the Helm: Leading Your Laboratory. It is thorough and thought-provoking, and covers the whole spectrum of the academic experience from situations you will have considered to those that you will never have imagined (and hopefully will never be in).
In the absence of any further second-hand advice to pass on, here are the key things I had to learn the hard way:
Learn to say no
New staff represent a brilliant opportunity to offload unpopular lectures, roles on health and safety committees and other rubbish no one else wants to do. Do not unwittingly take on busywork in an attempt to be popular with the cool kids; otherwise you too will end up having to dump it on the next generation.
If you do say yes, do it well
Being a safe pair of hands is a valuable skill. If you can be trusted to deliver something tricky, you will raise your profile in the department. But be aware: competence can lead to an even heavier workload.
Get some top cover
From providing lab space and access to equipment, to mentoring and speaking up for you on promotion committees, you need someone senior to look out for you; find someone simpatico.
Build a brand
A dirty word in academic circles, but important. It’s a big and competitive world, and being known as the expert in a particular area or technique will lead to collaborations and conference invitations.
Recruit the right team (for you)
I am lucky enough to have a fantastic team. But picking the wrong people will lead to a toxic lab culture that will sink you. The first person you recruit sets the tone for the rest of your career. Get experience of interviewing by being on recruitment panels for colleagues. Think very carefully about the process, particularly the questions you ask and what characteristic they actually probe. Then choose your recruit very carefully.
Toughen the heck up
You are going to fail, often. Even well-established PIs fail. It is part of the process. Learn methods to deal with it.
Be a tiger
Remember that you earned this position on your ability; try not to let impostor syndrome overcome you.
Have fun
Academia is tough, but there are good bits: don’t forget to enjoy them.


This first appeared in the Times Higher Education on 8th September 2016

Thursday, 1 September 2016

Cheer up, it's only Brexit


On the slow return from the long vacation, as a community, we academics find ourselves in a bit of a post-referendum pickle (I think it is a reasonable assumption that most academics were anti-Brexit).
Put aside the shock that “not everyone thinks like us”. Regardless of the final outcome of Brexiting, we are in for a time of uncertainty. We can face this uncertainty in two ways: hand-wringing pessimism or fatalistic optimism. In private, I find myself swinging between these viewpoints; but in public, I take on the role of departmental pessimism-eater: for every “woe is us” I respond with “it’ll be OK”; for every cloud a silver lining and for every hand-wring a cheery backslap.
I admit this Pollyanna approach can be quite annoying.

Reasons not to be miserable
The silver linings feel few and far between. However, I take hope from the following:
  • Thirteen non-European Union nations, through one mechanism or another, can access Horizon 2020 funding, suggesting that it will be possible for the UK, too
  • At present, due to government-imposed restrictions, we are basically limited to recruiting from within the EU. In the future, employing non-EU nationals may become easier, broadening the talent pool
  • The arguments for and against the EU were tight, and while to us Remain felt like the least bad option, not everyone voting Leave was a frothing-at-the-mouth Little Englander
  • The pound diving means that grants won in foreign currencies are now worth more, and a possible reduction in house prices and interest rates may allow “generation rent” to get on the property ladder (provided that they haven’t spent their deposit money on Poké Balls).
Ultimately, in the months since the referendum, unless you are a politician, nothing has actually changed: Blackshirts have not been marching in Whitechapel, countries in the EU still need to sell us cheese, wine and fast cars, and as far as I can tell the stock market is at exactly the same place as it was a year ago.

Things can only get worse
Of course all of the above may be bollocks, and Brexit should be seen as another contributory factor to the inevitable decline of British universities. For example, economic uncertainty may affect medium to long-term investment, some British-based scientists have been dropped from Horizon 2020 projects, the economy has slowed down prompting the record low interest rates, and the rhetoric from some EU politicians has been fairly acerbic, prompting fears that the exit process will not be pain-free.
You no doubt have your own personal favourite reason that we are all DOOMED, but I believe that optimism can break the debilitating miasma of gloom (perpetuated by social media) that is hanging over the ivory tower.
Critically, our happiness is – mostly – under our control: we can become happier by doing more of the things that make us happy (reading, exercising, enjoying our jobs, pausing to notice the little things). Moping around and blaming the government/anyone who voted Leave/Donald Trump for all that ails you is, a bit like fast food, satisfying in the short term but liable to leave you bloated and sad.
But don’t just take my word for it; TED talks are littered with talks about the value of positive psychology.

The wind in academia doesn’t blow, it sucks
This optimistic mindset extends beyond the current crise du jour and is a core skill for a better, happier, more productive career.
Academia is characterised by a string of events over which we have little control: student expectations, student realities, grant panels, peer reviews, promotions boards, Tory governments, global recessions, equipment not working, experiments not working, students not working, the lack of a tea room on campus and exponential increases in teaching load because in a moment of weakness you said yes to a pleading colleague.
You can throw your hands in the air, say “this is all shit” and run the clock down to retirement on your ever-decreasing pension pot.
Or you can brush yourself off and start again.

Best foot forward
Do not let each negative comment eat away at you until you become embittered. Although tempting, especially when you are reviewing too soon after your own work has been rejected, poisoning the well for others with angry reviews only adds to a vicious cycle of rejection and recrimination, and ends up making life terrible for everyone.
If you are not concerned with the general mental well-being of the body académique, there is a more selfish reason to be optimistic: pessimism directly affects your ability to succeed. Simplistically, you will never get funded if you don’t apply for grants because “no one ever gets funded”.
But additionally, if you don’t believe in a grant or paper, then why would the reviewer? So, cheer up. It’s a new academic year after all, and we have just had eight blessed weeks with no students to fix all the things that they have broken, catch up on all the paperwork they have generated, and maybe squeeze in some uninterrupted thought before the next intake.

Finally, don’t mourn, organise
I want to clarify something here.
I am not endorsing accepting the current state of affairs and doing nothing. Get out there and do something, anything: join a political party, write to your MP, contact the Commons Science and Technology Committee, demand that your professional body canvasses Parliament, take to the streets and engage with others.
I am, however, endorsing a mindset to deal with the state that we are in. Look for the positives in the situation and do not let things that are out of your control affect your ability to manage the things that are in your control. To paraphrase Reinhold Niebuhr: “Have the strength to accept the things you cannot change, the courage to change the things you can, and the wisdom to tell the difference”.

This article first appeared in the Times Higher Education on 1st September 2016

Wednesday, 29 June 2016

Too much data

A Simple Screening Approach To Prioritize Genes for Functional Analysis Identifies a Role for Interferon Regulatory Factor 7 in the Control of Respiratory Syncytial Virus Disease

Scientists love data. It is like flowers to florists, canvas to artists, money to bankers or ingredients to chefs. It answers the questions we have and sets the direction for new ones. From Mendel and his pea plants to Darwin and his finches, through Rosalind Franklin and her X-ray diffraction images of DNA, to CERN and their atom-smashing tube thingy, the aim of experiments is to generate data to answer questions. We invest considerable time and effort in working out whether the data we have are true and representative of the whole, or a unique subset caused by chance (statistics) or by the way we did the study (experimental design). We often repeat the same experiment multiple times to convince ourselves (and more importantly others) about the validity of our data. Without data, we are just messing around in a white coat.

Too much data

So you would think the more data the better. However, you can have too much of a good thing. Whereas before you would ask whether your treatment increases or decreases a single factor, we can now measure thousands of things in a single experiment, generating huge piles of data (datasets). In biology, methods that generate large datasets are described as ‘omics. This is named after the genome (all the genes that make up an organism). We now have the transcriptome (all the mRNA – the messages that make proteins – at a certain timepoint), the proteome (all the proteins), the metabolome (all the metabolites, the small molecules of metabolism), the microbiome (all of the bacteria on or in the body) and the gnomeome (the number of garden ornaments per square metre). Each technique generates a long list of stuff that goes up or down after a certain treatment. These long lists are where the problems arise, being comprised of genes with weird short names like IFIT1, LILRB4 and IIGP1, many of which have no known function. All of which leads to a mountain of data languishing in the supplemental tables of half-read papers in obscure journals.

Biologist + computer = ???Xxx!!!

The surfeit of data has led to a whole new discipline for interpreting these lists, called bioinformatics. But bioinformatics requires special skills, knowledge of the mythical ‘R’ programming language, access to software tools with laborious jokey names based on forced acronyms like PICRUSt (Phylogenetic Investigation of Communities by Reconstruction of Unobserved States), and time. Faced with these datasets, I get a bit flustered: like many biologists, I type with two fingers, get nervous flushes if someone mentions Linux and can just about use Excel to add two numbers together. This is a problem, because it means there is a wealth of data out there that is inaccessible to me.

Bioinformatics for dummies

I am interested in how the body fights off viral infections in the lungs, particularly a virus called Respiratory Syncytial Virus (RSV). Part of the body’s defences is a family of proteins that restrict viruses’ ability to hijack our cells to make copies of themselves. There are a lot of these proteins, and for many of them we have no idea how they work. A brief look at some of the ‘omics studies reveals long lists of these proteins, with no insight into what they do. There are probably clever, but inaccessible, AI-based algorithms that can search for all the relevant papers and compile them somehow; but I wouldn’t know how to use them or even where to start looking. Instead we used a ‘brute force’ approach, which meant that I/we/Jaq (first author on the paper) sat down and searched for every paper ever published on RSV that contained a big dataset. Having found the papers, we then harvested the gene lists from them. This was not trivial: some of the papers had to be ignored because their data were locked behind paywalls, or they were in Chinese, or they were just rubbish papers, or a combination of the three. But we were left with gene lists from 33 papers, and stuck all the data in a big pile. At this point we employed the services of Derek, a bona fide bioinformatician, who through some computer wizardry wrote us a piece of software called geneIDs (which is freely available here, if you need such a thing), which handily counts and ranks the genes. This gave us a brand new list built from all the other lists (a meta-analysis, in effect), which could then be used as the basis for further analysis. Which we did, and published the results here.
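For the curious, the counting-and-ranking idea is simple enough to sketch in a few lines of Python. This is only an illustration of the principle, not the actual geneIDs code, and the gene lists below are invented:

```python
from collections import Counter

# Each study contributes the list of gene symbols it reported as changed.
# These example lists are made up for illustration.
study_gene_lists = [
    ["IFIT1", "IRF7", "LILRB4"],
    ["IRF7", "IFIT1", "ISG15"],
    ["IRF7", "IIGP1"],
]

# Count how many studies each gene appears in (set() so a gene is
# counted at most once per study).
counts = Counter(gene for genes in study_gene_lists for gene in set(genes))

# Rank genes by how often they recur across studies.
ranked = counts.most_common()
print(ranked)  # IRF7 comes out top, appearing in all three lists
```

Genes that keep turning up across independent studies float to the top of the ranking, which is exactly the kind of prioritised list a biologist can act on.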

More data: better tools

First of all, we compared our computer-generated list to some new data from a clinical study. Children with severe RSV had higher levels of 56% of the genes on our list, supporting the approach by demonstrating that the genes are important during infection. Taking a subset of these genes, we then performed experiments showing that they are able to reduce RSV’s ability to infect cells and animals. In particular, we demonstrated that a gene called IRF7 was central to the anti-RSV response. So ultimately the answer to the question ‘can you have too much data?’ is no, but there is a need for tools to interpret it. In the current study we developed one such tool, which we feel is more accessible to biologists with little to no computer skills.
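The comparison itself boils down to a set overlap: what fraction of the genes on the meta-list were also elevated in the new clinical data? A hypothetical sketch (the gene names and numbers here are invented, not the real study data):

```python
# Genes from the meta-analysis list (invented examples).
meta_list = {"IRF7", "IFIT1", "ISG15", "LILRB4"}

# Genes elevated in children with severe RSV (invented examples).
elevated_in_clinic = {"IRF7", "IFIT1", "OAS1"}

# Fraction of meta-list genes supported by the clinical data.
supported = meta_list & elevated_in_clinic
fraction = len(supported) / len(meta_list)
print(f"{fraction:.0%} of meta-list genes were elevated in the clinic")
```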
 

Monday, 27 June 2016

Sweeter lungs more bugs

Jam jar lungs
Why do some people, for example people with diabetes, get colds more often? We believe we have found a contributing factor – sugar, in particular glucose. Diabetes is defined by elevated blood glucose. Thirteen years ago, Prof Emma Baker and Prof Debbie Baines (at St George’s, University of London) noticed that people with diabetes also have increased airway glucose. Normally, the cells that line the airways pump any glucose that leaks into the lungs back into the blood. In diabetes, there is too much sugar in the blood and the pumps are overwhelmed, leading to a rise in airway glucose. They hypothesised that the increased level of sugar in the lungs would allow more bacteria to grow there – the biological equivalent of leaving a jam jar open!

Diabetes = more lung bacteria

In our latest paper (Increased airway glucose increases airway bacterial load in hyperglycaemia) we set out to test this hypothesis using a number of different techniques. First we looked in hospitalised patients to see if there was a link between glucose and bacterial infection, and there was: patients with high blood sugar were twice as likely to have a bacterial lung infection. We know this thanks to our collaborators, Dr Luke Moore and Professor Alison Holmes, who have been tracking bacterial infections in London hospitals. This kind of study is called an association or correlation study; such studies are very good at showing that one thing is linked to another, but they do not tell you whether the link is causal and, if it is, how (the mechanism, in scientific parlance).

Knockout bugs


In order to understand the how, we investigated how bacteria use glucose in the lung. The way we do this is to delete individual bacterial genes and compare the function of these gene-deleted mutant bacteria to bacteria with all their genes (wild type). We deleted four different genes that, based on their shape and similarities to genes from other bacteria, were predicted to be important for the bug’s ability to use glucose. These studies were performed using a bacterium called Pseudomonas aeruginosa, which, unless you have cystic fibrosis, you’ve probably never heard of, but which causes many cases of pneumonia each year, especially in hospitalised patients. The first step was to demonstrate that deleting the genes affected Pseudomonas’s ability to use glucose to grow. Great news: it did.

Hypothesis - tested


The final step was to link everything – high glucose in the lungs and bacteria – together. We did this using mice with diabetes (yes, they do exist). As seen in people with diabetes, diabetic mice get more severe bacterial lung infections – unless you infect them with bacteria that can’t use glucose. When these mutant bacteria were used, there was no difference in bacterial lung infection between diabetic and non-diabetic mice. Boom, job done.

Drugs for bugs

But why stop there? Understanding the factors that increase infection gives us new ways to fight it. This is particularly important for bacterial infections because our arsenal of antibiotics is rapidly being depleted and we desperately need new treatments. If increased lung glucose increases infection, it follows that drugs that reduce lung glucose should reduce infection. We tested the common anti-diabetic drug metformin: diabetic mice treated with metformin had lower lung glucose and less bacterial infection.

In conclusion, we have linked increased bacterial infection in people with diabetes to the level of glucose in the lungs, and used this finding to test new antibacterial treatments. If you want to read more, the full details are in the paper here.