Second post as I sit and think about my first five years running a lab...
Inevitably, there is going to be something about your lab/research situation that you're not happy with. If these things are within your control, great! You can hopefully fix them and move on. Other institutional things will be out of your immediate control or above your paygrade. With this second set, you can either look for jobs elsewhere or find creative ways to improve your current situation.
One of the most difficult things for me to deal with as a PI at Arizona is that there is no 'real' central Microbiology program. There are numerous smart and talented researchers across the campus who are microbiologists, but it's not as cohesive a unit as other places I've been. Part of this is historical inertia. Part of this is stuff that's above my paygrade. Part of this is that many of us have obligations to other programs on campus and simply can't devote the time that we would like to fostering those relationships. There are only 24 hours in the day, and my loyalty (for lack of a better word) has to be to Plant Sciences first and foremost because that is my home department.
For whatever reason, and I may well be the person to blame for this, I have felt a bit lonely on campus research-wise. Everyone else is doing great things, but I've never truly felt that other research interests on campus significantly overlapped with my own interests in evolutionary microbiology. Sure, I can bounce grant/experiment ideas off of people and receive very useful feedback, but I haven't been able to find a community of researchers on campus to discuss topics like "adaptation", "pleiotropy", "horizontal gene transfer", etc. in general ways. I really enjoy lofty discussions about where the field of experimental evolution is going, but I haven't met anyone else on campus to grab beers with and talk shop. If you're out there, please come find me! On top of all this, many of the microbiology folks associated with the EEB department here have up and left in the last few years.
I didn't appreciate it until recently, but our own research careers are hugely shaped by the environment we are in. My last post was about how my research trajectory changed in the first five years of my lab. I can definitely say that those changes were precipitated by what kinds of scientific interactions were available to me on campus. Lately I've been wondering though, how would my own experiments or grants have changed if there were a couple of more labs on campus generally involved in evolutionary micro (or if I knew about them)? Would I have had a bunch of different collaborations than I do right now? My research ideas would no doubt be shaped again if I changed Universities and joined more of an EEB department, do I want that to happen now?
Without going into details, I think that the problem described above is institutional. Without hiring numerous new PIs, which has been a bit difficult at state universities since 2008, there are only two options to remedy my intellectual withdrawal. I could change institutions, which has its own set of issues, or I could find a way to get the interactions I needed off campus.
A couple of years ago I had to defend my time spent on Twitter to my department head. What I said then, and still say, is that Twitter has been an intellectual lifesaver in addition to any other tangible benefits. I can go there and find papers that I wouldn't have the time to search out otherwise. I can interact with people whose intellectual interests better align with my own, and carry out great (to the extent that any character-limited discussion can be) conversations with people about new results or research trajectories. I've tried to get better grant feedback by posting grants and asking for comments, which hasn't worked quite like I'd hoped but I think was still worthwhile. I connected with a couple of folks who were willing to read over other versions of grants and offer really constructive critiques.
Ecology/evolutionary microbiology seminars on campus are few and far between (though EEB has had some micro people in to give seminars). Sometimes I've been able to invite people to give Plant Sciences seminars, but they have to fill a certain niche for me to feel OK doing that. Given this context, microseminar has been another intellectual lifesaver and has filled one of my on-campus blind spots. Along these same lines, I've participated in Google Hangout journal clubs and am thinking about incorporating those kinds of things into my own lab meetings. It's fun to actually have the person who wrote the paper get in on the discussion, and with the magic of YouTube these discussions are archived for everyone to see. I'm going to try to work both of these activities into my lab meetings next term, because that way the time is already scheduled.
Long story short, there is no perfect situation as far as I can tell, and there may never have been. Some research environments may have fostered new discoveries (e.g., Bell Labs), but there were all sorts of downsides and infighting that happened there too. There are tons of letters (actual letters!) from back in the day between researchers sharing their ideas with each other, and these provide a bit of coloring for how experiments are described in textbooks. I don't think I've ever written a letter to another researcher with pen and paper; however, I've been able to find ways to satisfy some of my intellectual cravings through social media. FOIA requests aside, future historians are going to have a lot of archived tweets/blog posts/videos to sift through to understand how scientific revolutions happened in the 2000s. I think loneliness happens to everyone in this job at some point or another. At least for me, time spent interacting online has helped to quell these feelings a bit, and because of that it's time well spent.
Monday, September 28, 2015
Tuesday, September 22, 2015
The five year plan redux
Rounding into about my fifth year on the job as a PI, I've started to look back and think about how I made it to this point. This will probably be a series of posts as the ideas jump into my head, but today I've been wondering about how a lab's research focus changes.
Five years ago I sat down and thought about a five year plan for my lab. I was just coming off of a postdoc using comparative genomics to study virulence evolution in the plant pathogen Pseudomonas syringae. Given that I was (and still am) in a Plant Sciences department, I thought that it would be a good idea to continue studying virulence in P. syringae, and I knew that there were some interesting/safe results left to mine from my postdoc data. In parallel, I was thinking that I wanted to get back into studying experimental evolution of microbial populations. I actually chose the lab for my postdoc in order to get experience with Pseudomonas with the hope of eventually setting up such systems. For my "riskier" projects I wanted to use experimental evolution to look at the effects of horizontal gene transfer on adaptation.
For two or three years I followed my plan. I've been able to publish a few papers on virulence in P. syringae. I've got my experimental evolution system up and running and have published some of the necessary background work. There remain a bunch of different paths for either of those two projects that I'm excited to follow up on.
The amazing thing to me, though, is that I'm not currently funded to do any of that. I've written numerous grants (20?) for these projects across multiple agencies, but they just haven't been successful. A few of these grants came REEEAAAALLLYYYYY close to funding, but just didn't make the cut. Getting any of these grants would have been great, but I think my research program is actually stronger because of those failures. Sure, every rejection email sucks, but I was constantly evaluating and reevaluating research directions. For the plant pathogen work, there was a lot of competition, and my grants (even though they were solid, I think) just didn't stand out because there were many other labs doing approximately similar things. The experimental evolution work just didn't seem to strike the right chord with the right people, again and again and again.
I've been lucky to get a handful of grants recently, but none of these projects was on my radar five years ago. One of the funded projects started out as a random email question between Betsy Arnold and me in about year 2 of my lab and has blossomed incredibly since then. It's one of those exciting projects where we find a new result every week or so, yet everything about the system remains pretty black-boxish. Another newly funded project started as an observation by my postdoc Kevin Hockett around year 3 of the lab. He started out playing around with diverse strains of P. syringae, seeing how these strains interacted with one another. We kept pushing the genetics of the system because nothing published could explain the results. Turns out we stumbled into a really cool evolutionary story.
The point of this whole post is that I had a plan, but the plan necessarily changed. Since grad school, I've imagined how my research career would look. Never did I think I'd end up in a Plant Sciences department (there are pluses and minuses, but that's a post for another time). The questions I thought my lab would be focusing on have fallen by the wayside. I'm still quite interested in them and have a variety of undergrads plowing ahead, but they aren't at the forefront anymore. The projects that have been successfully funded came together after I spent a couple of years focused on completely different topics. I'm an N of 1, and I have no idea if my story is shared by other researchers, but there are so many posts about how to be a PI that I figured I'd share this data point. I have no clue what the future truly holds, but I'm just going to keep being curious about the world because it's been good to me so far.
Friday, July 24, 2015
A healthy dose of skepticism and the need for editors
As some of you know, I've been engaged in an interesting dialogue as a reviewer for Frontiers recently. This got me thinking about the role of editors in the process of publication, but also about how my own brain interprets experimental data. I was originally going to write a couple of posts, but I think they work together, so now you just get a single, slightly longer post.
Long story short: currently, I disagree with the way that the authors have analyzed their data and am waiting to actually "endorse" the publication at a Frontiers journal. If you snoop around in a couple of months, I'm guessing you'll be able to figure out what I'm talking about, because at Frontiers the reviewers' names are listed openly on the final PDF. This whole process has led me to rethink the way that Frontiers actually performs review (which takes place in an interactive forum where authors respond directly to comments from reviewers). I love the idea of open, non-anonymous review and am strongly in favor of making public a record of review for each paper. For reasons I'll elaborate on below, I think this system is slightly flawed.
Maybe I've just grown cynical over the years, but the first thing I do when I get awesome new data is to question how I screwed up. Was everything randomized? Did the strains get contaminated? Etc., etc. Ideally all of these questions are answered by experimental controls, but I'm good at thinking of extravagant and elaborate ways in which I'm wrong. Nature is often quite good at this too, I've found, although that's the fun of biology (after a period of cursing the sky). Thanks in large part to this self-skepticism, I'm always thinking about the next ways to adequately control experiments, which leads me to wait to pull the trigger on submitting publications. My grad school and PD advisors helped rein in these skeptical tendencies, and the slow-rolling of manuscript submissions, just a bit by pointing out that nothing is ever perfect. The voices are always still there though.
These same tendencies are at work when I'm reviewing other papers. Sometimes things are easy to believe just by comparing summary stats to the reported data, but other times I'd like to see the primary data and dig my hands personally into the underlying statistical model/assumptions until I truly believe it. In many cases I have to actually ask to see this primary data, which is not great, but at least with anonymity I don't worry about directly questioning the authors' abilities. I mean, inherently, if you are asking for primary data because the stats seem wonky, then you're implicitly questioning other people's abilities. When my name is not going to be known, I don't worry as much about the social ramifications of it all and I sleep better at night.
I am way too over-critical of my own experiments. A little bit of skepticism is healthy, but too much self-skepticism as a scientist paralyzes your career. Even as a reviewer I worry about being over-critical and asking for tedious and minuscule changes that might not ultimately matter. When you are knee-deep in reviewing a paper, it's easy to lose sight of the bigger picture. This is where the editor comes in. Each time we review a paper, we make a list of critical and less-than-critical things that need to be "fixed" before publication. Oftentimes the editors will read these lists from every reviewer and distill down the absolute requirements. Editors often have their own impression of what makes a publishable unit (that's for another post, though; suffice it to say that's why direct track 2 submission to PNAS no longer exists). What I've come to think is that editors are absolutely required in the current publishing process. Reviewers and authors are on about the same level in the dynamic, but the editor inherently has an overriding sense of authority in the whole process. They can take reviewers' comments and immediately disregard the ones that aren't critical. They can emphasize to authors exactly what needs to be done. The authority is key because both reviewers and authors are deferential to it. As a reviewer I'm not worried about asking too small a question because 1) everything I write in the review is important to me and 2) I know that the good editors will know when I'm being too specific or nit-picky.
With this Frontiers article, I've had to respond to the authors that "I'd like to see the primary data". Having received many reviews in my career, I know exactly how this comment will be received. When it comes from a reviewer directly, it seems nit-picky and maybe even a bit of a personal affront. If the editor agrees, there is a bit more weight to the comment. It felt weird having to directly tell the authors that I wanted to see their data. They're a good lab, and I worry that their impression of me (since they'll know my name after it's published) will change for the worse. These are things you can't control, but that's how it goes.
For all of you out there who have papers I'll review in the future, know that I'm even harder on myself. I'd like to think self-skepticism is part of what makes me good at my job though.
Wednesday, June 24, 2015
Metagenomics and "A feeling for the organism"
Evelyn Keller's biography of Barbara McClintock is entitled "A feeling for the organism". A few paragraphs in the book's last chapter sum up quite nicely how McClintock viewed the scientific enterprise and discoveries:
"Over and over again, she tells us one must have the time to look, the patience to "hear what the material has to say to you," the openness to "let it come to you." Above all, one must have "a feeling for the organism." One must understand "how it grows, understand its parts, understand when something is going wrong with it. [An organism] isn't just a piece of plastic, it's something that is constantly being affected by the environment, constantly showing attributes or disabilities in its growth. You have to be aware of all of that.... You need to know those plants well enough so that if anything changes, ... you [can] look at the plant and right away you know what this damage you see is from: something that scraped across it or something that bit it or something that the wind did." You need to have a feeling for every individual plant. "No two plants are exactly alike. They're all different, and as a consequence, you have to know that difference," she explains. "I start with the seedling, and I don't want to leave it. I don't feel I really know the story if I don't watch the plant all the way along. So I know every plant in the field. I know them intimately, and I find it a great pleasure to know them." This intimate knowledge, made possible by years of close association with the organism she studies, is a prerequisite for her extraordinary perspicacity. "I have learned so much about the corn plant that when I see things, I can interpret [them] right away." Both literally and figuratively, her "feeling for the organism" has extended her vision.
I agree wholeheartedly. Others may science differently, but I make my living by looking at and studying everything about the bacteria I work with. Going into each experiment I have an idea of what to expect (even if these "experiments" simply involve streaking out cultures from frozen). If you give me a genome of Pseudomonas syringae, I can tell you the main components you'll find. I can tell you how certain strains will grow (or won't), what the colonies will look like, how long they'll take to pop up, what color they'll be. It took me a few years to gather this intuition, but now that it's ingrained I like to think I have an innate sense when something is "off". I liken this to a scientific Spidey-sense. The challenging part is truly knowing when to follow up on these odd results, when to store them away for the future, and when to disregard them as uninteresting.
I was reminded of McClintock's "feeling for the organism" by a couple of stories from metagenomics that have popped up across my feeds. Before I say anything else, I don't intend to denigrate the quality of the science or data underlying these stories by any means. The work is solid; I just think we're starting to find some limitations in the power of "big science", and these holes usually pop up in the discussion sections of papers and press releases. The first of these stories was a tour de force looking at metagenomics of the NYC subway system. The authors reported a variety of interesting results, but the tag line that a lot of news outlets seemed to focus on was the presence of Yersinia pestis (plague) and Bacillus anthracis (anthrax) within the subway system. The limitations of these methods have been hashed out already (here and here), but I want to focus on the inherent lack of "a feeling for the organism" when dealing with metagenomic data. Studies of any open microbial ecosystem are going to find a diversity of taxa. Unless you bring in specialists, there is simply no way to know the ins and outs of each organism. In the case of the NYC subway metagenome, from my interpretation at least, the authors looked at only bits and pieces of the Yersinia and Bacillus genomes without capturing the whole picture. They had to do this because the story was so inherently large that you couldn't possibly investigate everything in depth. However, specialists with "a feeling" for either Yersinia or Bacillus could have provided a viewpoint on which directions (other genes to look at, levels of nucleotide diversity that seem a bit too high) to follow up on to truly demonstrate the presence of these bugs within the subway.
Likewise, one part of the story on urban microbes in this piece caught my eye:
"Rodents are under study, too. White-footed mice (Peromyscus leucopus) in New York City carry more Helicobacter and Atopobium bacteria — associated with stomach ulcers and bacterial vaginosis in humans — than their suburban counterparts"
I worked and slaved over Helicobacter pylori cultures all through grad school. I simultaneously loved and hated that bug. Its finicky growth patterns are the reason I moved over to study the reliably growing Pseudomonas after grad school. Unless something has dramatically changed since I last kept up with the literature (which is completely possible), rodents are terrible hosts for H. pylori strains that cause stomach ulcers (general overview here). You can get a subset of H. pylori strains to grow in mice, but there are often a variety of genetic changes that take place to allow them to adapt (see here). I wouldn't be surprised if there were multiple Helicobacter strains within mice in NYC, but my money says they aren't Helicobacter pylori that could cause stomach ulcers.
These are the tradeoffs that are made when dealing with immense data sets, and I'm not quite sure how to fix this. No one has a feeling for ALL THE MICROBZ. If you have a fun/interesting story on microbiomes that focuses on a couple of taxa in bold, at least try to run your data and ideas past someone who truly has a feeling for these organisms before publishing the paper. If it holds up after that, more power to you.
"Over and over again, she tells us one must have the time to look, the patience to "hear what the material has to say to you," the openness to "let it come to you." Above all, one must have "a feeling for the organism." One must understand "how it grows, understand its parts, understand when something is going wrong with it. [An organism] isn't just a piece of plastic, it's something that is constantly being affected by the environment, constantly showing attributes or disabilities in its growth. You have to be aware of all of that.... You need to know those plants well enough so that if anything changes, ... you [can] look at the plant and right away you know what this damage you see is from-something that scraped across it or something that bit it or something that the wind did." You need to have a feeling for every individual plant. "No two plants are exactly alike. They're all different, and as a consequence, you have to know that difference," she explains. "I start with the seedling, and I don't want to leave it. I don't feel I really know the story ifI don't watch the plant all the way along. So I know every plant in the field. I know them intimately, and I find it a great pleasure to know them." This intimate knowledge, made possible by years of close association with the organism she studies, is a prerequisite for her extraordinary perspicacity. "I have learned so much about the com plant that when I see things, I can interpret [them] right away." Both literally and figuratively, her "feeling for the organism"
I agree wholeheartedly. Others may science differently, but I make my living by looking and studying everything about the bacteria I work with. Going into each experiment I have an idea of what to expect (even if these "experiments" simply involve streaking out cultures from frozen). If you give me a genome of Pseudomonas syringae, I can tell you the main components you'll find . I can tell you how certain strains will grow (or won't), what the colonies will look like, how long they'll take to pop up, what color they'll be. It took me a few years to gather this intuition, but now that it's engrained I like to think I have an innate sense when something is "off". I liken this to a scientific Spidey-sense. The challenging part is truly knowing when to follow up on these odd results, when to store away for the future, and when to disregard them as uninteresting.
I was reminded of McClintock's "feeling for the organism" by a couple of stories from metagenomics that have popped up across my feeds. Before I say anything else, I don't intend to denigrate the quality of the science or data underlying these stories by any means. The work is solid; I just think we're starting to find some limitations in the power of "big science", and these holes usually pop up in the discussion sections of papers and press releases. The first of these stories was a tour de force looking at metagenomics of the NYC subway system. The authors reported a variety of interesting results, but the tag line that a lot of news outlets seemed to focus on was the presence of Yersinia pestis (plague) and Bacillus anthracis (anthrax) within the subway system. The limitations of these methods have been hashed out already (here and here), but I want to focus on the inherent lack of "a feeling for the organism" when dealing with metagenomic data. Studies of any open microbial ecosystem are going to find a diversity of taxa. Unless you bring in specialists, there is simply no way to know the ins and outs of each organism. In the case of the NYC subway metagenome, from my interpretation at least, the authors looked at only bits and pieces of the Yersinia and Bacillus genomes without capturing the whole picture. They had to do this because the story was so inherently large that you couldn't possibly investigate everything in depth. However, specialists with "a feeling" for either Yersinia or Bacillus could have provided a viewpoint on which directions (other genes to look at, levels of nucleotide diversity that seem a bit too high) to follow up on to truly demonstrate the presence of these bugs within the subway.
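To make the "bits and pieces" point concrete: one sanity check a specialist might run is breadth of coverage, i.e. what fraction of a reference genome the reads actually touch, as opposed to how deeply they pile up on a few conserved genes. Here's a rough sketch of the idea; the read intervals and genome size are invented for illustration and have nothing to do with the actual NYC data:

```python
def coverage_breadth(alignments, genome_length):
    """Fraction of reference genome positions covered by at least one read.

    alignments: list of (start, end) half-open intervals of aligned reads.
    """
    covered = [False] * genome_length
    for start, end in alignments:
        for pos in range(start, min(end, genome_length)):
            covered[pos] = True
    return sum(covered) / genome_length

# Toy example: a 1 kb "genome" with 100 reads piling up on one region.
reads = [(100, 200)] * 50 + [(150, 250)] * 50
print(coverage_breadth(reads, 1000))  # 0.15 -- only 15% of the genome touched
```

High depth concentrated on a small slice of the genome, with low breadth overall, is exactly the signature you'd expect if reads really come from a relative that shares a few conserved genes rather than from Y. pestis or B. anthracis itself.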
Likewise, one part of the story on urban microbes in this piece caught my eye:
"Rodents are under study, too. White-footed mice (Peromyscus leucopus) in New York City carry more Helicobacter and Atopobium bacteria — associated with stomach ulcers and bacterial vaginosis in humans — than their suburban counterparts"
I worked and slaved over Helicobacter pylori cultures all through grad school. I simultaneously loved and hated that bug. Its finicky growth patterns are the reason I moved over to study the reliably growing Pseudomonas after grad school. Unless something has dramatically changed since I last kept up with the literature (which is completely possible), rodents are terrible hosts for H. pylori strains that cause stomach ulcers (general overview here). You can get a subset of H. pylori strains to grow in mice, but there are often a variety of genetic changes that take place that allow them to adapt (see here). I wouldn't be surprised if there were multiple Helicobacter strains within mice in NYC, but my money is on the fact that they aren't Helicobacter pylori that could cause stomach ulcers.
These are the tradeoffs that are made when dealing with immense data sets, and I'm not quite sure how to fix this. No one has a feeling for ALL THE MICROBZ. If you have a fun/interesting story on microbiomes that focuses on a couple of taxa, at least try to run your data and ideas past someone who truly has a feeling for these organisms before publishing the paper. If it holds up after that, more power to you.
Friday, June 19, 2015
Navigating the waters of NSF grant submission
*Disclaimer: What follows is a post about structural biases I've perceived within the NSF Biology system. I think these biases are intrinsic, but keep in mind I could be completely wrong (and if you have different views, please feel free to comment). They also aren't inherently bad, nor do they need to be fixed; they just exist based on the pool of reviewers/panelists and the timing of the grant cycles. It's a bit rambling, but I'm hoping to provide at least a slightly useful insight or two.
Even before the new office smell has worn off, and in many cases before you've actually moved into your office, the thoughts of many PIs newly merging onto the tenure track are focused on grant writing. This isn't going to be a post about how to get NSF grants, but more along the lines of "things I've experienced writing grants across panels". Grant writing is truly an art. Something that I didn't truly appreciate before is that, as with any piece of art, each target audience has its own subjective opinions. I've had my lab for 4 1/2 years now and have written grants to DEB, MCB, and IOS. I've been fortunate enough to sit on preproposal and full proposal panels. One of the most difficult ongoing lessons I'm learning is that grants written to each of these are very different beasts.
1) Preproposals change the game. DEB and IOS require preproposals; MCB does not. I'll save most of the comments about pre vs. full proposals for other posts, but suffice it to say that writing a convincing preproposal takes a different skill set than writing a convincing full proposal. Since preproposals don't go out for external review, the fate of your grant is entirely influenced by the composition of the panel. In panels that get a lot of submissions focused on similar systems (at least from what I've seen at IOS, where there are only a handful of well-worn symbiosis models), novelty can be a benefit. If you propose to work with a new/novel system, and the science makes sense, you can get some bonus points if every other grant is focused on model organisms. Furthermore, while there are certainly benefits to working with a model system, it's more likely that someone on the preproposal panel will know the little details and nuances of the organism and can call you out for poor experimental design. On the other hand, if you are proposing to work in a system that no one on the panel truly has expertise in, you had better be able to convince them in four pages that the experiments are feasible. Depending on the overlap of the panel's expertise with your own grant, there could be details missed during the preproposal discussion/reviews, and there will likely be subtle misinterpretations. It's just how it goes and feeds into the noise of the system. These things can be ironed out in the full proposal, though, because those will go out for external review. I get the feeling that DEB grants and review panels have a much higher variance in topic and system than IOS panels. If such a difference truly exists, it definitely adds a new psychological layer into the process.
One last thing to mention with regard to the effects of preproposals: there is likely to be at least a little overlap between reviewers of your successful preproposal and your full proposal. I can't speak for anyone else on this, but when discussing full proposals I remembered the discussions surrounding the preproposals. I remembered perceived weaknesses and strengths, and I tried to see how the authors dealt with these criticisms. I can't help but think that it's a good idea to dedicate some of your full proposal to laying out a response to your preproposal reviews.
2) Timing can matter for CAREER grants, especially since you have a choice about which panel to submit to. Submission of IOS/DEB full proposals occurs in summer and overlaps with CAREER award deadlines. Panels evaluating full proposals for both of these programs will also evaluate CAREER awards at the same time. Given the vetting of ideas that occurs due to preproposals, differences between CAREER grants and full proposals were often pretty glaring. It's also possible that you could have turned your non-invited preproposal into a CAREER grant, and that it would be reviewed by the same panel for both IOS/DEB.
In contrast, the normal deadlines for MCB panels that I've applied to are now in November. Therefore, if I submit a CAREER award to MCB there is no chance that the grant could be reviewed by the same panel that it would be as a regular submission. This matters because I've had some CAREER grants go to what I perceive as weird places at MCB (like Engineering panels) and they get evaluated very differently than they do at the regular November panels. Differences in criteria between regular and CAREER grants aside, the science may be essentially the same in the grants I've submitted but I get a feeling that there is much more variance in the CAREER reviews simply because the panel isn't quite the fit I imagine it to be. I think this also factors in because I'm not convinced that reviews of CAREER grants inform my writing of regular MCB grants (and vice versa), whereas I think you can get more traction out of reviews regardless of grant type at both IOS and DEB.
3) Funding rates are low regardless, but DEB (evolutionary processes at least; I can't speak to the ecology side) feels like an even steeper climb for microbiologists than for other biologists. The first few times that I had grants rejected from DEB, the POs made statements like "you have to convince frog biologists that your work is important". These comments were spot on, and looking back I did a terrible job of describing how my work applied across systems. However, and I could be wrong about this although the few people I've asked back up my intuition, I'm not sure that grants from frog biologists at DEB get the reverse critique of "convincing microbiologists that your work is important". I'm not sure what this means, and certainly some great microbiology work gets funded through DEB, but it feels like there is a slight implicit bias from the reviewers against microbial evolution work at DEB. There are some generally important evolutionary phenomena in bacteria (like rampant horizontal gene transfer) that simply don't apply across systems. Likewise, there are some generally important evolutionary phenomena in eukaryotes (sex ratio biases, diploidy) that don't really cleanly apply to bacteria. Given the broad makeup of review panels at DEB, I think it's just hard to get some types of microbial work funded through there, even though in a world with unlimited funding it's the right place for it. It's possible that the reverse is true at IOS, because most model symbiosis systems involve microbes.
Friday, January 23, 2015
Monday, January 19, 2015
My thoughts on "The type VI secretion system of Vibrio cholerae fosters horizontal gene transfer"
I was on a late Christmas break last week when I caught wind of a newly published study (here) and associated write-ups (best one by Ed Yong here) which suggested that natural transformation and type VI secretion (T6S) are linked in Vibrio cholerae. Given my research interests in microbe-microbe interactions, and my experience studying and writing about natural transformation and evolution, I was naturally intrigued. I was also wary, however, because these kinds of studies tend to oversell correlations and tend towards "just so" stories. Having now read the paper a couple of times, I actually think it's quite a good example of a microbial genetics story, and much less so an evolutionary biology story.
I won't go into the gory details too much, but the authors start out by pointing out that little is known about regulation of the T6S system in Vibrio. The main take-home result of this paper is that the T6S system operons are controlled by TfoX and also by quorum sensing through HapR and QstR. That's a solid story and worthy of publication in a pretty high-tier journal. However, due to a happenstance of history more so than anything else, TfoX is also said to be a master regulator of competence for natural transformation in Vibrio. This association arose because TfoX was originally identified as a regulator of competence in the presence of chitin. In hindsight, maybe TfoX should be referred to as a master regulator of pathways associated with the presence of chitin.
The authors decide to run with this regulatory association between T6S and competence, and test whether killing of cells by T6S facilitates horizontal gene transfer through natural transformation. As a way to suggest that there is a deeper evolutionary link between these processes, the authors set up experiments to demonstrate that genetic exchange dependent on T6S killing can occur. For this experiment, the authors test for the ability of their focal T6S-wielding strain to be transformed by a kanamycin resistance gene integrated into the genome of another Vibrio cholerae strain. Surprise, surprise (/sarcasm), the experiment works and T6S facilitates genetic exchange. <edit, that last sentence and the tone of this section aren't appropriate. I realize this now and apologize>. I say that snarkily because ANY process that releases DNA from cells can facilitate horizontal transfer by natural transformation...heat, lightning, whatever you can imagine. My problem here is that the authors place their finger on the scale to effectively rig an experiment whereby they will get the "sexy" result that would undoubtedly be overspun in press releases. The problem, and this goes for a lot of papers (especially of the microbiome sort), is that just because something is possible under experimental conditions doesn't make that phenomenon evolutionarily relevant. How did the authors bias this experiment, and why am I annoyed enough to hastily craft a blog post?
1) Natural transformation frequency of genomic DNA is highly dependent on the similarity of donor and recipient genomes. Transformation by plasmids is a bit different because these don't require recombination. The authors used two different (but relatively closely related) Vibrio strains, ensuring that recombination could occur. I doubt this experiment would have nearly the same success rate (if any) if different Vibrio species were used as prey. The chances of success fall with genomic divergence from the recipient strain. I have no clue what the spectrum of other bacteria living on planktonic crustaceans that would be killed by Vibrio looks like, but the more diverse they are, the less likely it is that T6S truly affects genetic exchange.
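For what it's worth, this decline with divergence has been measured in other naturally competent bacteria, where recombination frequency has been reported to drop roughly log-linearly with sequence divergence. A back-of-the-envelope sketch of what that implies; the decay constant here is a made-up placeholder, not a measured Vibrio number:

```python
def relative_transformation_rate(divergence, decay_per_percent=0.25):
    """Toy log-linear model: each percent of sequence divergence
    knocks recombination frequency down by a constant factor.
    The decay constant is hypothetical, for illustration only."""
    return 10 ** (-decay_per_percent * divergence * 100)

# A near-identical sister strain (~1% diverged) vs. a different
# Vibrio species (~15% diverged): orders of magnitude apart.
print(relative_transformation_rate(0.01))  # roughly 0.56
print(relative_transformation_rate(0.15))  # roughly 2e-4
```

Even with these generous invented numbers, crossing the species boundary costs you several orders of magnitude in transformation success, which is the crux of the point above.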
2) The type of selection matters. The authors set up the experiment with kanamycin resistance because they can plate out strains onto antibiotics and strongly select for transformants. I'm not critiquing that part, and it's certainly how you'd do the experiment, but I'm not sure that such selective environments are representative of life on crustaceans or in the ocean. For T6S to have evolved to significantly affect genetic exchange requires a constantly changing environment with strong selection pressures whereby prey strains can be more adapted than predator strains. To this point, laboratory experiments have begun to show that natural transformation can increase rates of adaptation, but generally only in "stressful" environments. It's possible that such conditions could consistently arise for Vibrio, but it's a hard sell.
3) Since T6S preferentially targets dissimilar strains, there is a much much much greater chance that transformation of DNA from prey cells would be detrimental than beneficial. Rosie Redfield has already made the case (here and here) that transformation of DNA from closely related strains is likely detrimental, because transformable DNA will contain more deleterious alleles on average than living cells. Additionally, there is always the chance of incorporating alleles that lower the transformation rate and which can't easily be replaced once incorporated. Transformation of DNA from prey cells targeted by T6S systems introduces two related problems. Although the transformable DNA won't inherently contain deleterious mutations (unlike in Rosie's paper, cells here are killed by other cells rather than by deleterious mutations), many of the genes within this pool will be diverged from those in the recipient genome. Therefore, it would be much (much much+++) more likely that predator cells would be transformed by alleles of housekeeping genes that wouldn't function efficiently when placed into a new genomic context than by beneficial genes (here, although see here). Is it likely that Vibrio cells will grow equally well if you replace their copy of rpoD with that of Pseudomonas? Probably not. On average, then, forgive the lack of a mathematical model but I could whip one up if you'd really like, it is probably much easier to lower the fitness of Vibrio through transformation after killing by T6S than to increase it. Added to this, analogous to the alleles that lower competence in Rosie's model, genes that render strains sensitive to killing by T6S will be overrepresented in the transformable DNA pool.
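Actually, since I threatened to whip one up: here's a crude Monte Carlo version of the argument. Every parameter (the frequency of beneficial genes in the prey pool, the sizes of the benefits and costs) is invented purely for illustration:

```python
import random

def mean_fitness_change(n_trials=100_000, p_beneficial=0.01,
                        benefit=0.1, mean_cost=0.05, seed=42):
    """Toy model: a predator cell is transformed by one random piece of
    prey DNA. Rarely it's a beneficial gene; usually it's a diverged
    housekeeping allele carrying some fitness cost. All parameters
    are hypothetical."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        if rng.random() < p_beneficial:
            total += benefit  # rare beneficial gene
        else:
            # diverged housekeeping allele; cost drawn from an exponential
            total -= rng.expovariate(1 / mean_cost)
    return total / n_trials

print(mean_fitness_change())  # negative under these assumptions
```

Unless beneficial genes are common in the prey pool or their benefits are enormous, the expected fitness change comes out negative, consistent with the intuition that T6S-fueled transformation is far more likely to break a genome than to improve it.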
4) Last but not least...I can understand why the authors and press releases would spin the story to suggest a tight evolutionary link between T6S, competence, and genetic exchange. As Rosie has pointed out, it's a much cleaner evolutionary story to think that predator cells are killing prey for nutrition. Also see her comment on Ed's blog post (here). The authors chose to play up the genetic exchange angle rather than test whether DNA from killed cells could be used as a nutrient. They don't even mention that DNA (and proteins, and a bunch of other things from lysed cells) could be used as a nutrient, even though they use the terms predator and prey. Now, to bring everything full circle, TfoX is actually the ortholog of Sxy, the gene in Haemophilus influenzae that Rosie's nutrient research is focused on. C'mon folks, at least acknowledge the literature.
So in conclusion, it's a nice genetics story.