Tuesday, November 13, 2012

Why Next Gen Now?

A confession: I've considered writing this piece for a lot of years now; not quite as many as this space has existed, but many years.  Some ideas get stuck in my head, but I never force them out through my fingers.  Finally, with this one, I will pose the question: Why did "next-generation" sequencing happen when it did?
The next confession is that I don't really plan to answer that question.  It's an interesting question, but I don't have answers to it.  I have some rudimentary notions of where the answers may lie, but it really needs to be explored by serious analysts of science, not an occasional blogger.  Perhaps some of the answers are already out there.  Books such as Kevin Davies' $1000 Genome skirt around the subject, though I suspect the mountains of material that didn't make it into that book would cast some light on the subject.

In addition to being an important question, I think it is one that is important to ask now, before memories of the process are completely jumbled.  I recently got into a minor tussle on LinkedIn over the history of the human genome project; my recollection of certain events was quite different from my correspondent's (or actually, the other way 'round -- I was challenging his version of events).  In a similar vein, I've already had one acquaintance lament that Lynx got too little credit for its contribution to Solexa.  Speaking of poor memories, I recently saw a presentation on the web on Manteia, but couldn't find it again.  (After posting this, I found it!)

In any case, there are a number of key areas which I think should be explored.  There was clearly a major technological shift from capillary fluorescent Sanger sequencing, which was dominant for around a decade and had succeeded very similar slab gel and radioactive Sanger, to a set of radically different technologies.  The 'next generation' label is problematic in the same way 'post-modern' is (what comes next?), but 'second generation' has its issues as well, since then there is the question of what constitutes 'third generation' and so on.  The technologies that first arose in the space had many things in common yet were each distinct, but they were far more like each other than like the Sanger technology that preceded them, in particular because they resolved sequences as spots over time rather than bands on an electrophoretic gel.

One explanation of the rise of that wave of Sanger-successors could be that the right ideas were simply percolating around.  A number of groups, seemingly independently, came up with very similar ideas.  In particular, the ones I am a bit familiar with are Manteia in Switzerland, the work in George Church's lab and the 454 story.  The Church angle is remarkable in a special way, as when I left the lab in late 1996 the talk of new sequencing technologies was around nanopores or highly-multiplexed Maxam-Gilbert sequencing; yet less than 3 years later the first 'polony' paper was published.  I remember some talk of 'sequencing by synthesis' back at some of the mid-90s Hilton Head meetings, though I don't remember any concept of clonal amplification then.

Another class of explanation would be around money.  Manteia went bankrupt, and some of its core technology had come from a prior bankrupt company, Mosaik.  I was approached in 2000 about joining a to-be-formed company to commercialize the polony technology; that company never got off the ground (I find it a bit amazing now that I didn't leap into it; lucky I didn't!).  Somehow, 454 was kept on by Curagen until it was ready to launch.  On the public side, my memory is that even before Celera a conscious decision was made by the public project to deemphasize sequencing technology development; might a new generation of sequencing have arisen sooner if that decision had been different?  While we can't actually hit rewind, it is an interesting point to ponder in the context of public support of new technologies.

Yet another area to explore would be the critical contribution of developments outside the field.  In particular, all the first round of replacement technologies relied critically on imaging and image processing.  Could a technology like 454 have been remotely feasible a decade earlier, given the expense required for the computational support?  How critical were advances in microscopy?  Or were the key advances in the molecular biology?  For example, 454 couldn't have happened without the prior development of pyrosequencing.  

Perhaps another profitable area to explore is the technologies that didn't happen, or didn't quite happen.  Lynx's technology had many aspects of the second generation: clonal populations interrogated serially.  Back in the 1990s it seemed like a logical path was to simply build bigger arrays of even smaller capillaries, or perhaps even lab-on-a-chip Sanger.  Did those approaches die because they were truly dead ends, or were they starved of attention until it was too late?

I think exploring any and all of these could inform future technology development, and in any case would simply be fascinating.  Of course, some of the competing technologies may still arise.  Electron microscopic DNA sequencing was discussed in the 90s, and still remains an area of interest.  I'm starting to be optimistic that some sort of useful nanopore device will hit the market next year, though whose nanopore and how useful (and for what) remain very murky.  Who knows?  Perhaps a microfluidic Sanger chip will storm the market?  

9 comments:

  1. I'm wondering how useful a next-gen sequencing technology producing short reads would have been before 2003 without a finished human reference genome.

    There might be several applications, but the biggest commercial incentive for bringing such a technology to market is perhaps the resequencing of human genomes.

  2. Interesting that Sanger sequencing is still used as the 'gold standard' though - virtually every talk I sat through at ASHG that involved any form of diagnostics was still confirming with Sanger sequencing.

    It will also be interesting to see how the field of bioinformatics will change when new long reads start appearing with, for example, nanopore technology. Will we all be heading back to our Staden packages?

  3. Thanks Keith for an informative post! That Manteia link is fascinating, as that is what saved Solexa from being forced to go the single-molecule route.

    One point of reference was the $38M NHGRI poured into next-generation sequencing for the $100K and $1K genome: http://www.nih.gov/news/pr/oct2004/nhgri-14.htm  There was a lot of work done in the prior years to germinate the idea and garner support within the NIH; it was just a matter of time, money and a lot of effort.

    Another item regarding micro-capillary and other development has to do with the IP landscape. I'm no expert, but freedom to operate often limits how technology gets developed and what path is chosen. I've seen my share of very promising approaches killed off in development due to these kinds of things.

    You hit the nail on the head with your comment on developments outside the field. Way back in 2001-2002 I was asked some very interesting questions by a secretive company called 454 about the magnetic agarose beads I was a product manager for, how they were made, etc. Without magnetic streptavidin beads made uniformly enough and able to generate enough light in a picotiter plate, the 454 method would not work; without a sensitive enough CCD camera, the system would also have failed in development; not to mention the amplification of the signal with PMTs and other electronics.

    So many dependencies...

    Interesting you mention electron microscopy, as I just downloaded ZS Genetics' first paper on their approach, published last month.

    Best -
    Dale

  4. Keith, you forgot to mention the impact of computing power on the development of NGS technologies. Without massive compute power none of what we do today would be possible.

    I wrote a three-part story for BiteSizeBio on the history of DNA sequencing, and the scariest thing for me was whether people who knew the history in better detail would point out horrible inadequacies in my version of events. Illumina would not be here without Solexa, which would not have been available to buy without Manteia, etc, etc, etc. It's like Rock Family Trees.

    You are absolutely right that we should be recording this now. I have a collection of molecular biology "toys", including a 2005 Affy chip still in its packet! Most of this kind of stuff gets used or thrown away. I am putting together a little glass-case museum of my hoard. Perhaps others should do the same.

    PS: Does anyone have a Helicos chip they could send me?

  5. James: Oy! A silly oversight when writing this up; the mental drafts had that as a key point. Without fat, cheap compute, all that data wouldn't be useful.

  6. Can I also predict that Sanger sequencing will cease to be the gold standard? It does identify the four standard bases with the highest accuracy, but not methylated bases or, who knows, other bases the genome contains of which we are not yet even aware.


    Besides, the slowness is fatally prohibitive, and the error rates of next gen can be compensated for by the repeat coverage that is standard.

    Sanger sequencing will be remembered as revolutionary, but there's no room for nostalgia retarding the advancement of science.

  7. Why should Lynx get any credit for the success of Solexa sequencing? Sure, at the time they had sequenced more bases than anyone outside of the HGP and Celera, but their method, completely different and quite convoluted, was severely limited/flawed by back pressure created by stuffing tiny beads into a tube and trying to pass reagent through. It was like a blocked drain. Solexa acquired Lynx as a reverse takeover for the sole purpose of listing Solexa on the NASDAQ. It was a clever move by the then CEO (who should ultimately get the credit for the success of Solexa, if we lived in an honest, egoless world) to list without the high-profile IPO risk.

    True though, Manteia was the missing piece without which the technology may never have happened. It's a good example of what biotechs should do more of, that is, acquire rather than reinvent, as the management team at Solexa did so very well.

  8. With regard to Lynx, the claim made by my source is that new hardware Lynx had prepared just before the acquisition by Solexa ended up forming the basis of the original Solexa instrument.

  9. I'd have to say that's dubious, as the hardware and systems are very different and the hardware eventually commercialised by Illumina differed very little from the plans devised several years earlier.
