Forbes has an article co-written by Matthew Herper and Robert Langreth titled "The Next Big Move For The Smartest Biotech Investor", profiling Randal Kirk. Kirk is described as one of the few billionaires who can attribute that status to biotechnology. Kirk made his money through two companies in the psychiatric drug space: New River Pharmaceuticals developed an ADHD drug (lisdexamfetamine) and was then acquired by Shire, whereas Clinical Data developed an antidepressant (vilazodone) and was then purchased by Forest. A key part of the article profiles Kirk's investment in a little-known and secretive synthetic biology company called "Intrexon".
The title is probably meant to gall; it certainly raises my hackles. The most obvious quibble is that it isn't clear either of the drug development companies was really biotech. Of course, that would require defining biotech, but ideally it would refer to companies which depend heavily on recombinant DNA and related technologies. Such technologies are embedded in virtually all drug development today, but neither of these drugs sounds like it relied much on them. Both drugs are interesting twists on prior approaches (though I'm not enough of a chemist to judge the novelty of vilazodone).
A computational biologist's personal views on new technologies & publications on genomics & proteomics and their impact on drug discovery
Wednesday, February 23, 2011
Sunday, February 20, 2011
Will Cheap Gene Synthesis Squelch Cheaper Gene Synthesis?
Among the vast pile of items which I've meant to write about but which have slipped are a paper last year on gene synthesis and some subsequent announcements about trying to commercialize the method described in that paper. This is an area in which I have past experience, though I would never claim this gives me indisputable authority or omniscience in the matter.
The paper, primarily by scientists from Febit but also with two scientists from Stanford and George Church in the author list, finally describes an interesting approach to dealing with some serious challenges in gene synthesis which substantially increase the costs. By finally, I mean that the idea has certainly been kicking around for a while and was mentioned when I visited Codon Devices in the fall of 2006 looking for employment.
To fill in some background first, gene synthesis is a powerful way to generate DNA constructs which can enable all sorts of experiments. The challenge is that the cost of gene synthesis, currently starting at around $0.40 per base pair for very easy and short stuff (say, less than 2Kb), tends to restrict what you can use it for. I have a project concept right now that would be a slam dunk for gene synthesis -- but not at $0.40/bp (which I think I couldn't even get for the project). Whack that price by a few factors of two and the project becomes reasonable.
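For concreteness, here's a back-of-the-envelope sketch in Python of how a few factor-of-two price cuts change the total bill; the 50 kb project size is purely a hypothetical illustration, not the actual project I have in mind.

# Back-of-the-envelope: total cost of a hypothetical 50 kb synthesis project
# as the per-base price is cut by successive factors of two.
# The 50 kb figure is an illustrative assumption, not from the post.
project_bp = 50_000
price = 0.40  # $/bp, roughly the current entry price for easy genes
for halvings in range(4):
    print(f"${price:.2f}/bp -> ${project_bp * price:,.0f} total")
    price /= 2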
There are many cost components to commercial gene synthesis, and only someone who has carefully looked over the books while wearing a green eyeshade is going to have a proper handle on them. But three of the big expenses are the oligos themselves, the sequencing of constructs to find the correct ones and labor. What the Febit paper does is illustrate a nice way to tackle the first two in a manner that shouldn't require a lot of labor.
The oligo cost is a serious issue. Conventional oligos can be had for around $0.08 a base, or maybe a bit less. However, each base in the final construct requires close to 2 bases in the oligo set. Some design strategies might get this down a bit. Furthermore, conventional columns generate far more oligo than you actually need. An approach which has been published (but not commercialized as far as I know) is to scale down the synthesis using microfluidics. This better matches the amount synthesized to the amount you need, though the length and quality of the oligos need refinement from what was reported in order to be truly useful. Microarrays are a means to synthesize huge numbers of oligos, but their quality also tends to be low and the quantity of each oligo species is much too small without further amplification. Amplification schemes have been worked out, but they add to the processing costs of the oligos.
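As a quick sanity check on why oligos alone set a cost floor at conventional scale, here's a tiny sketch using only the figures above (the roughly 2:1 oligo-to-construct ratio and the ~$0.08/base price):

# Rough oligo-cost floor per base pair of final construct, from the figures above.
oligo_price_per_base = 0.08        # $ per base of conventional column-synthesized oligo
oligo_bases_per_construct_bp = 2.0 # roughly 2 oligo bases needed per final construct bp
sell_price_per_bp = 0.40           # current entry price for easy genes

oligo_cost_per_bp = oligo_price_per_base * oligo_bases_per_construct_bp
print(f"Oligo cost alone: ~${oligo_cost_per_bp:.2f} per construct bp")
print(f"Share of the ${sell_price_per_bp:.2f}/bp sell price: {oligo_cost_per_bp / sell_price_per_bp:.0%}")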
What Febit and company have done is take those microarray-built oligos and screen them using 454 sequencing. The beads containing the amplicons with correct oligos are then plucked out of the 454 flowcell (with 90% success in getting the right bead) and used as starting points.
Now, this has several interesting angles. First, it has been challenging to marry the non-Sanger new sequencing technologies to gene synthesis. The new technologies tend to have short reads, too short to read even a short construct. The new technologies also require library construction and it is difficult to trace a given sequence back to a specific input DNA. In other words, short read technologies are great at reading populations, but not individual wells in a gene synthesis output. Sanger on the other hand, is ill-suited for populations but great for individual clones. One solution to this problem is clever pooling and barcoding strategies, but these necessitate having enough different clones to be worth pooling and barcoding. In other words, second generation sequencing is difficult to adapt to retail gene synthesis, but looks practical for wholesale gene synthesis.
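To make the pooling-and-barcoding workaround concrete, here is a minimal demultiplexing sketch; the barcode sequences, read strings and clone names are all hypothetical, and a real pipeline would read FASTQ and tolerate sequencing errors in the barcode rather than demand exact matches.

# Minimal barcode demultiplexing sketch: assign each read back to the clone (well)
# it came from via an exact-match 6 bp barcode at the 5' end of the read.
# Barcodes, clone names and reads are made-up examples.
barcode_to_clone = {"ACGTAC": "clone_A01", "TTGACC": "clone_B07"}

def demultiplex(reads):
    assigned = {clone: [] for clone in barcode_to_clone.values()}
    unassigned = []
    for read in reads:
        clone = barcode_to_clone.get(read[:6])
        if clone is None:
            unassigned.append(read)
        else:
            assigned[clone].append(read[6:])  # strip the barcode before assembly/QC
    return assigned, unassigned

reads = ["ACGTACGGATCCATGAAA", "TTGACCTCTAGAGTCCCA", "NNNNNNGGGGGTTTT"]
assigned, unassigned = demultiplex(reads)
print({k: len(v) for k, v in assigned.items()}, "unassigned:", len(unassigned))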
Getting the oligos right has important positive side-effects. While the stitching together of oligos into larger fragments (and larger fragments into still larger ones) can generate errors, an awful lot of the problems stem from bad input oligos. Not only can error rates be troublesome, but some of the erroneous sequences may have advantages over the correct ones in later steps. For example, a deleted fragment may PCR more efficiently than the full-length one, and slightly toxic gene products may be disfavored in cloning steps relative to frameshifted versions of the same reading frame. So, by putting the sequencing up front it should be possible to reduce the sequencing needed downstream; even if that downstream sequencing remains Sanger, a lot less of it should be required.
Okay, that's the science. Now some worries about the business. Febit announced in January that they are looking for investors to spin off a new company to commercialize this approach. This makes good business sense, since Febit itself must be encrusted with all sorts of business barnacles, having lurched from one business to another in trying to commercialize their microfluidic microarray system. Previous failed attempts include gene synthesis as well as microarray expression analysis and hybridization capture (I even ran one experiment with their system, whose results certainly didn't argue for them staying in that business!). The press release stated they were hoping to attain pricing in the $0.08 per base range, which would make my current experiment concept feasible. That would be great.
Now, they will need to refine their system and perhaps adapt other sequencers. A 454 Jr would probably not be a difficult adaptation, but moving on to Ion Torrent must be tempting. Getting things to work for one paper and one set of genes is unfortunately different than being able to keep things working over an entire spectrum of customer designs.
Which leads me to where I think they will face a great challenge, though one which I think can be finessed with the proper business approach. They will be bringing to market a methodology whose benefit is cost, but with the caveat that the cost advantage is attained only at sufficient volume. Initially, they will be unable to reliably predict delivery times (as kinks show up). Finally, they are adding some additional processing steps (454 sequencing, bead recovery & oligo recovery from the bead) which may add to turnaround time.
The abyss into which this new company must plunge is a world in which very fast gene synthesis is available from a large number of vendors in the $0.40/bp price range. So, they must find very large customers who are willing to be a bit patient and who can keep the pipeline filled. Such customers do exist, but they aren't always easy to find and pry away from their existing suppliers. In theory, much cheaper synthesis would unleash new orders for projects (such as mine) which are too costly at current prices, but that is always a risky assumption to bank a company on (cf. Codon Devices' gene synthesis business).
It's an alternative route that I predict this NewCo is likely to go down: link up with an established provider in the field. Said provider, through their salespeople and sales software, could offer each customer an option -- I can build your genes for $0.40 a base if you want them fast, or hack that down to $0.10 a base if you can wait. In order to preserve customer satisfaction, that longer timeline would need to include an insurance period to build the genes by the conventional route if the new route fails -- but of course, if you are frequently forced to build $0.40/bp genes for which you charged $0.10/bp, that would be financial suicide.
So, in summary, I think this is a clever idea which needs to be pushed forward. But, after a long gestation in the lab, it faces a very rocky future in the production world. I hope they succeed, because it is not hard to imagine projects I would like to do which would be enabled by such a capability.
Thursday, November 19, 2009
Three Blows Against the Tyranny of Expensive Experiments
Second generation sequencing is great, but one of its major issues so far is that the cost of one experiment is quite steep. Just looking at reagents, going from a ready-to-run library to sequence data is somewhere in the neighborhood of $10K-25K on 454, Illumina, Helicos or SOLiD (I'm willing to take corrections on these values, though they are based on reasonable intelligence). While in theory you can split this cost over multiple experiments by barcoding, that can be very tricky to arrange. Perhaps if core labs would start offering '1 lane of Illumina - Buy It Now!' on eBay the problem could be solved, but finding a spare lane isn't easy.
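As a rough illustration of why barcoding matters, here's the arithmetic using the reagent range quoted above (the sample counts per run are hypothetical):

# Per-sample cost as a function of multiplexing, using the ~$10K-25K reagent
# figure above. The sample counts are illustrative assumptions.
for run_cost in (10_000, 25_000):
    for n_samples in (1, 4, 12, 48):
        print(f"${run_cost:,} run, {n_samples:>2} barcoded samples: "
              f"${run_cost / n_samples:,.0f} per sample")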
This issue manifests itself in other ways. If you are developing new protocols anywhere along the pipeline, your final assay is pretty expensive, making it hard to iterate cheaply. I've heard rumors that even some of the instrument makers feel inhibited in process development. It can also make folks a bit gun-shy; Amanda heard firsthand tonight from someone lamenting a project stymied under such circumstances. Even for routine operations, the methods of QC are pretty inexact, insofar as they don't really test whether the library is any good, just whether some bulk property (size, PCRability, quantity) is within a spec. This huge atomic cost is also a huge barrier to utilization in a clinical setting; does the clinician really want to wait some indefinite amount of time until enough patient samples are queued to make the cost per sample reasonable?
Recently, I've become aware of three hopeful developments on this front. The first is the Polonator, which according to Kevin McCarthy has a consumable cost of only about $500 per run (post library construction). $500 isn't nothing to risk on a crazy idea, but it sure beats $10K. There aren't many Polonators around, but for method development in areas such as targeted capture it would seem like a great choice.
Today, another shoe fell. Roche has announced a smaller version of the 454 system, the GS Junior. While the instrument cost wasn't announced, it will supposedly generate 1/10th as much data (35+Mb from 100K reads with 400 Q20 bases) for the same cost per basepair, suggesting that the reagent cost for a run will be in the neighborhood of $2.5K. Worse than what I described above, but rather intriguing. This is a system that may have a good chance to start making clinical inroads; $2.5K is a bit steep for a diagnostic but not ridiculous -- or you simply need to multiplex fewer samples to get the cost per sample decent. The machine is going to boast 400+bp reads, playing to the current comparative strength of the 454 chemistry. While I doubt anyone would buy such a machine solely as an upfront QC for SOLiD or Illumina, with some clever custom primer design one probably could make libraries usable on 454 plus one other platform.
It's an especially auspicious time for Roche to launch their baby 454, as Pacific Biosciences has released some specs through GenomeWeb's In Sequence, and from what I've been able to scrounge up (I can't quite talk myself into asking for a subscription) this is going to put some real pressure across the market, but particularly on 454. The key specs I can find are a per-run cost of $100, which will get you approximately 25K-30K reads of 1.5Kb each -- or around 45Mb of data. It may also be possible to generate 2X the data for nearly the same cost; apparently the reagents packed with one cell are really good for two runs in series. Each cell takes 10-15 minutes to run (at least in some workflows) and the instrument can be loaded up with 96 of them to be handled serially. This is a similar ballpark to what the GS Junior is being announced with, though with fewer reads but longer read lengths. I haven't been able to find any error rate estimates or the instrument cost. I'll assume, just because it is new and single-molecule, that the error rate will give Roche some breathing room.
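Putting the two announcements side by side, here's a back-of-the-envelope comparison using only the figures quoted above; actual yields and error rates could easily change the picture.

# Rough cost per megabase from the announced specs above (reagents only).
platforms = {
    "GS Junior (est.)":   {"run_cost": 2500, "mb_per_run": 35},
    "PacBio (announced)": {"run_cost": 100,  "mb_per_run": 45},
}
for name, spec in platforms.items():
    print(f"{name:20s} ~${spec['run_cost'] / spec['mb_per_run']:.0f} per Mb")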
But in general, PacBio looks set to really grab the market where long reads, even noisy ones, are valuable. One obvious use case is transcriptome sequencing to find alternative splice forms. Another would be to provide 1.5Kb scaffolds for genome assembly; what I've found also suggests PacBio will offer a 'strobe sequencing' mode which is akin to Helicos' dark filling technology, which is a means to get widely spaced sequence islands. This might provide scaffolding information in much larger fragments. 10Kb? 20Kb? And again, though you probably wouldn't buy the machine just for this, at $100/run it looks like a great way to QC samples going into other systems. Imagine checking a library after initial construction, then after performing hybridization selection and then after another round of selection! After all, the initial PacBio instrument won't be great for really deep sequencing. It appears it would be $5K-10K to get approximately 1X coverage of a mammalian genome -- but likely with a high error rate.
The ability to easily sequence 96 samples at a time (though it isn't clear what sample prep will entail) does suggest some interesting possibilities. For example, one could do long survey sequencing of many bacterial species, with each well yielding 10X coverage of an E. coli-sized genome (a lot of bugs are this size or smaller). The data might be really noisy, but for getting a general lay-of-the-land it could be quite useful -- perhaps the data would be too noisy to tell which genes were actually functional vs. decaying pseudogenes, but you would be able to ask "what is the upper bound on the number of genes of protein family X in genome Y?". If you really need high quality sequence, then a full run (or targeted sequencing) could follow.
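The arithmetic behind those two scenarios, as a sketch (the genome sizes are standard approximations I'm supplying, not from the announcement):

# Coverage arithmetic for the two scenarios above: ~45 Mb and $100 per run.
mb_per_run, cost_per_run = 45, 100

ecoli_mb = 4.6    # approximate E. coli genome size, Mb
print(f"E. coli-sized genome: ~{mb_per_run / ecoli_mb:.0f}x coverage from one $100 run")

mammal_mb = 3000  # approximate mammalian genome size, Mb
runs_needed = mammal_mb / mb_per_run
print(f"Mammalian 1x coverage: ~{runs_needed:.0f} runs, ~${runs_needed * cost_per_run:,.0f}")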
At $100 per experiment, the sagging Sanger market might take another hit. If a quick sample prep to convert plasmids to usable form is released, then ridiculous oversampling (imagine 100K reads on a typical 1.5Kb insert in pUC!) might overcome a high error rate.
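To show why brute-force oversampling can beat a high per-read error rate, here's a toy per-position majority-vote consensus; it's purely illustrative, uses a substitution-only error model, and ignores indels, which real single-molecule consensus callers cannot.

# Toy demonstration: many noisy reads of the same insert, per-position majority vote.
import random
from collections import Counter

random.seed(0)
truth = "".join(random.choice("ACGT") for _ in range(200))  # pretend plasmid insert

def noisy_read(seq, error_rate=0.15):
    # Substitution-only error model; real errors are indel-heavy, so this is a cartoon.
    return "".join(random.choice("ACGT") if random.random() < error_rate else base
                   for base in seq)

reads = [noisy_read(truth) for _ in range(1000)]  # ridiculous oversampling
consensus = "".join(Counter(column).most_common(1)[0][0] for column in zip(*reads))
print("consensus matches truth:", consensus == truth)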
One interesting impediment which PacBio has acknowledged is that they won't be able to ramp up instrument production as quickly as they might like and will be trying to place (ration) instruments strategically. I'm hoping at least one goes to a commercial service provider or a core lab willing to solicit outside business, but I'm not going to count on it.
Will Illumina & Life Technologies (SOLiD) try to create baby sequencers? Illumina does have a scheme to convert their array readers to sequencers, but from what I've seen these aren't expected to save much on reagents. Life does own the VisiGen technology, which is apparently similar to PacBio's but hasn't yet published a real proof-of-concept paper -- at least that I could find; their key patent has issued -- reading material for another night.
Tuesday, November 17, 2009
Decode -- Corpse or Phoenix?
The news that Decode has filed for bankruptcy is a sad milestone in the history of genomics companies. Thus falls either the last or the penultimate of the human gene mapping companies, with everyone else having either disappeared entirely or exited that business. A partial list would include Sequana, Mercator, Myriad, Collaborative Research/Genome Therapeutics, Genaera and (of course) Millennium. I'm sure I'm missing some others. The one possible survivor I can think of is Perlegen, though their website is pretty bare bones, suggesting they have exited as well.
The challenge all of these companies faced, and rarely beat, was how to convert mapping discoveries into a cash stream which could pay for all that mapping. Myriad could be seen as the one success, having generated the controversial BRCA tests from their data, but (I believe) they are no longer actively looking; new tests are in-licensed from academics.
Most other companies shed their genomics efforts as part of becoming product companies; the real money is in therapeutics. Mapping turned out to be such a weak contributor to that value stream. A major problem is that mapping information rarely led to a clear path to a therapeutic; too many targets nicely validated by genetics were complete head-scratchers as to how to create a therapeutic. Not that folks didn't try; Decode even in-licensed a drug and acquired all the pieces for a full drug development capability.
Of course, perhaps Decode's greatest notoriety came from their deCodeMe DTC genetic testing business. Given the competition & controversy in this field, that was unlikely to save them. The Icelandic financial collapse, I think, did them some serious damage as well. That's a reminder that companies, regardless of how they are run, sometimes have their fate channeled by events far beyond their control. A similar instance was the loss of Lion's CFO in the 9/11 attacks; he was soliciting investors at the WTC that day. The 9/11 deflation of the stock market definitely crimped a lot of money-losing biotechs' plans for further fund raising.
Bankruptcies were once very rare for biotech, but quite a few have been announced recently. The old strategy of selling off the company at fire sale prices seems to be less in style these days; assets are now being sold as part of the bankruptcy proceedings. Apparently, this and perhaps other functions will continue. Bankruptcy in this case is a way of shedding incurred obligations viewed as nuisances; anyone betting on another strategy by buying the stock is out of luck.
Personally, I wish that the genetic database and biobanks which deCode have created could be transferred to an appropriate non-profit such as the Sanger. I doubt much of that data will ever be convertible into cash, particularly at the scale most investors are looking for. But a non-profit could extract the useful information and get it published, which was deCode's forte, but I doubt they've mined everything that can be mined.
Tuesday, January 20, 2009
Ah, them gold rush days!
Derek Lowe had a nice piece yesterday looking back on the genomics bubble. I might quibble with his benchmarking of the end of the insanity -- the stock market bubble would not peak until just before the 2000 elections, but it's a fine piece & pretty accurate.
I should know -- I was there. I was more than just there, I was a significant part of it. No, I didn't think it up & I won't try to exaggerate my importance, but I worked for what is perhaps the poster child of genomics excess (and if not that, certainly in the Pantheon of genomanic deities).
When I got to Millennium they were still largely focused on the positional cloning of disease genes. But they had started throwing sequencing capacity at ESTs, small bits of genetic message which serve as toeholds to larger ones. The catch was that the sequencing analysis software had been designed for positional cloning work & not ESTs, and it's a very different ballgame. When sequencing genomic DNA, seeing anything which looked like a gene was interesting. But when sequencing stuff that is almost nothing but genes, the challenge was to sort the wheat from the chaff. Lots of scientists spent mind-numbing hours scanning BLAST reports for things of interest, and often found things. But this is a lousy technique -- not only might eyes glaze over (or neurons croak) from monotony, but a really interesting match might not be obvious -- what if the top hit was "Uncharacterized protein X" but the 3rd match down was "TotalPharmaceuticalGold"? Or worse, what if BLAST couldn't even find a usable match? Plus, was that a match or an identity -- did you find something new or just rediscover a lousy fragment of the old? More mind-numbing staring.
Enter a cocky recent Ph.D. After building up some expertise and some more refined tools (which in their embryonic form nailed me the one gene patent of mine perhaps worth something), I had built a system which churned through all the ESTs and crudely organized them by what made things interesting (and tried to ignore all the boring stuff). Ion channels -- look on this web page. GPCRs -- that's over here. Possible secreted proteins, look at this analysis. Furthermore, it also attempted to amalgamate all the different ESTs into a view which was higher quality, longer and more compact -- and tell you which things were already described as proteins and which might be novel. Plus, more sensitive algorithms than BLAST were used to pull things into families.
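A cartoon of that kind of triage, sketched in Python; the categories, keyword lists and input tuples are all hypothetical, and the real system worked from actual BLAST reports plus far more sensitive family assignments.

# Cartoon of EST triage: bucket sequences by keywords in the description of their
# best BLAST-style hit. Categories, keywords and example hits are hypothetical.
CATEGORIES = {
    "ion_channel": ("ion channel", "potassium channel", "sodium channel"),
    "gpcr": ("g protein-coupled receptor", "7 transmembrane", "olfactory receptor"),
    "possibly_secreted": ("signal peptide", "secreted"),
}

def triage(est_hits):
    """est_hits: iterable of (est_id, best_hit_description) pairs."""
    buckets = {name: [] for name in CATEGORIES}
    buckets["boring_or_unknown"] = []
    for est_id, description in est_hits:
        desc = description.lower()
        for name, keywords in CATEGORIES.items():
            if any(k in desc for k in keywords):
                buckets[name].append(est_id)
                break
        else:
            buckets["boring_or_unknown"].append(est_id)
    return buckets

example_hits = [("EST001", "Putative potassium channel subunit"),
                ("EST002", "Uncharacterized protein X"),
                ("EST003", "Novel G protein-coupled receptor fragment")]
print(triage(example_hits))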
Now in all honesty, it wasn't nearly perfect. Some of the mind-numbing review had shifted to me -- the early versions in particular had every homology approved (and named!) by me. The semi-automatically generated names were ugly. Various EST artifacts could join webs of unrelated genes into a horrible tangle. But, now there could be reviews of consolidated, pre-analyzed data (though also in fairness nobody ever totally trusted it, so the manual sequence-by-sequence reviews often continued).
Of course, if you have a mountain of loot you probably want to protect it. Enter the lawyers. Millennium had always filed on their discoveries; now they had lots of discoveries to protect. But protect from what? Well, the paranoia was a loss of "Freedom to Operate", usually known as FTO. Nobody knew what would stand up as a patent -- but there were instructive examples from the early biotech era of business plans sunk by a loss of FTO -- and expensive lawsuits that clearly marked that loss. So the patenting engine took off -- an expensive insurance policy against an unpredictable future.
Of course, what the lawyers wanted for the filing was as much info as possible -- and the automated analyses provided lots for them. But they had been designed to be viewed in a web browser individually, not printed out en masse. Worse yet, by this time Informatics & Legal were in separate buildings -- one of my least pleasant Millennium memories was trying to script the printing of a raft of analyses on a printer located in the other building. Plus, if there were inventions then somebody had to have invented them -- such as the person who wrote the code to find them & then reviewed the initial output. And so, I started having dates with the paralegals, an hour of hand-cramping signing of document after document. At one point, there were somewhere between 120 and 140 patent applications where I was sole or co-inventor.
This was the late 90's and the hype was getting thick -- we were guilty but so were others. Millennium wasn't a big pusher of high gene counts -- at least in the terms of the day (but that's another whole story), but certainly we started selling all those genes we had & the ones we extrapolated were still out there. A key part of the business model was to sell the genes many times -- if we could sell the same gene to Lilly for cardiovascular & Roche for metabolic and AstraZeneca for inflammation, all the better. Not that anything underhanded went on; we'd present the case to each company & most of the deals had exclusivity only within a therapeutic area.
How much did we believe our own Kool Aid? It varied. There was one day where I got in a blue mood because I convinced myself that once MLNM found all the genes we'd put ourselves out of work! But that was an extreme (and what I hope is the height of my own personal stupidity); most of the time we thought we might be right or we might be overestimating a bunch -- but that our partners were intelligent adults who could make the same calculations. Never did I see an attitude that we were fleecing the suckers.
In particular, I remember one of my colleagues making a comment when the Bayer deal was about to be signed. A premise of that deal was that Millennium would identify proteins which could be easily screened, associate them by multiple means with a plausible role in disease, configure an HTS assay for them -- and then Bayer would quickly get hits from their libraries. Those hits in turn would be used to finish determining whether the protein of interest really played a role in disease. MLNM's (over)confidence in genomics was matched by Bayer's (over)confidence in chemistry. My colleague said it was one thing to think up such an idea -- and another to 'go over the cliff' -- and he was nervously surprised that someone else was joining us. He was one of the most sober-minded fellows around & wasn't making allusions to Bayer being foolhardy -- just that we were both taking the leap together. Alas, I didn't think to laugh & reply "The fall will kill you".
The genomics rush, alas, did not end with a huge rush of new drug candidates. We thought we'd get a huge leap in biology -- and we did, but not as big as we thought. Traditional drug development & biology had cleaned out the easy stuff; there weren't tons of hidden gems. The chemical biology concept pretty much disappeared from the Bayer collaboration -- turned out it was long-and-painful to configure all those assays (though we did get them done).
BUT, I will admit to being only a partially reformed genomics fan. We got oversold, and it hurt. Much effort was wasted, and just think of the savings if the patent office had declared that you had to have actual causal function to patent a gene! But, much of what we proposed doing still is worth doing -- or has been done. In some sense the genomics companies were just too early for their own good (though the late entrants such as DeCode haven't fared much better). There are no genomics companies -- yet genomics is everywhere. Basic biology fueled by the genome or the technologies pushed by genomics permeate the drug industry (based on the 2 large pharmas I interviewed at in the year MLNM laid me off & what I can read; constructive dissent on this point is welcomed). Probably no novel small molecule drug development history will be directly pinned back to a 1990's genomics effort -- but also virtually no drugs going forward will have their development unaffected by the knowledge of the genome. Everything is tangled up & confused & merged.
The genomics gold rush was insane & wasteful -- but they were fun times!
Friday, June 20, 2008
Don't do it Josh!
The Globe this week had a number of articles on the passing of the $1B biotech bill in Massachusetts and the proxy fight for Biogen Idec. But a third item really raised my eyebrows.
Vertex's CEO Joshua Boger announced that Vertex is contemplating moving out of the state. The apparent driver of this is a concern that Vertex might outgrow the Boston area and that now might be the time to move, before the company grows even larger. Previous discussion of moving had produced a striking plan to relocate to the Boston waterfront.
Now, I'll confess a certain personal interest. I'm probably going to be in this area for most of my employment life, so I don't want to see employers leave (I can see Vertex headquarters from my office). Furthermore, I believe big companies like Vertex, BiogenIdec and such have a beneficial effect on their overall corporate neighborhood -- they tend to grow more talent than they need and those persons tend to start new ventures near the old ones.
Which is the point -- people don't really like to move. Yes, some folks will follow their job to the ends of the earth, but a lot of folks won't. So atop the disruption & distraction of moving, a lot of good people will leave in a short timespan. My general prejudice is that planners recognize such costs but then grossly underestimate them.
Why might Vertex be contemplating such a move? The most cynical explanation is to try to extract tax incentives from either Massachusetts or wherever they move to. Such incentives have driven previous moves or new sites, with mixed success. Rhode Island trumpeted extracting Alpha-Beta from Massachusetts, until Alpha-Beta failed in the clinic and disappeared into the dust.
More practically, Boston does have its drawbacks & tradeoffs. Traffic is awful, but that's true of a lot of America. Housing prices are insane. Neither of these encourages new workers. On the other hand, the academic & hospital environment is huge and Boston has a decent transit system, which somewhat offsets the traffic issue. It is striking that so many large biotechs & pharmas have been trying to move into Cambridge/Boston over the last decade or so (Merck, Novartis, Schering, Astra, Amgen, Sanofi-Aventis, etc.).
But in any case, I return to my main argument. I'm sure Vertex could thrive in many places -- Boston is not Mecca, and if they moved they would recover and thrive again -- but after paying a steep price of disruption & lost talent.
Are there other options? One of course is to stick it out in Boston. Another is to have multiple locations, which incurs its own inefficiencies. No solution is perfect. But please leave migrations for the birds!
Thursday, April 10, 2008
Sayonara Millennium?
Boy, if today's news can't break me out of my blogging neglect, then nothing can. Japanese pharma Takeda is buying my old shop at a 50% premium, putting MLNM's share price at a level it hasn't seen since before the Cor merger mistake & its market cap at a level unseen since the genomics bubble.
Reports are still coming in, but apparently Takeda is really buying the company -- it is not a raid for the pipeline assets but an attempt to get more or less the whole enchilada. Retention plans are rumored to be in place & it's claimed Dunsire will be staying on. On the other hand, time will tell if Sidney Street will soon feel like a tepanaki table (at least I got the cuisine right this time!) with the chef twirling a large cleaver. A lot of the key folks from Cor were supposed to drive MLNM forward, but they pretty much all bailed after a while.
Management always wanted to get a Japanese deal going, but nothing ever seemed to go beyond secretive hints. Finally, it comes in and it is the ultimate deal.
Many thoughts spring to mind, and perhaps I'll try to cover some later. But in particular, was this the result of a deliberate selling attempt or just some talks that blossomed? Two years ago MLNM refused to sell to an unnamed suitor (though one friend of mine joked about it with a lawyer at a local biotech & decided he'd love to play poker with the lawyer, given the size of the 'tell'); this time Takeda was apparently welcomed with open arms. It will be interesting to see what the merger materials say about the timeline of the deal.
Another key question is how tightly Takeda will attempt to integrate Millennium. MLNM isn't the same loose place it was when the CEO dressed in drag every October (and just before I got there the high jinks bordered on Animal House), but it still had a soupçon of a laid back atmosphere. Last time I was in the lobby there was a display of each year's T-shirt; not your usual corporate display. I haven't had many dealings with Japanese companies, but this certainly doesn't fit the stereotype. Perhaps Takeda will see the wisdom in a largely hands-off approach, much like Warren Buffett does with his acquisitions -- the parent company funds the subsidiaries but otherwise just acts like a typical board member (though with Buffett, that's still a bit activist). Notable Buffett companies include a prominent local furniture store (where you can go to the movies or try to get free furniture if the Sox sweep the Series again) and the one insurance company unafraid to admit to a reptilian quality.
On the other hand, in theory the greatest value comes from integrating -- cross synergies, reduced duplicative effort, etc. My skepticism of such an approach scales with the distance, both physical & cultural, so I doubt it would work. I've recently heard from a former colleague now in a large multi-national pharma how badly it's integrated, and that's a company which has had years to do so & a common language and nationality.
In any case, I'm sure the weekly sushi day in the cafe will be more popular than ever.
Thursday, December 27, 2007
A Vertex by the sea?
If one thinks like a builder, it is not difficult to scan the prime biotech zone in Cambridge and see it full at some point. I remember bicycling to the Harvard Medical School in the 90's past an empty zone with just a couple of lone buildings; those buildings are now thickly surrounded, save some parkland. There are still some parking lots that might be made over, but in general there isn't a lot of free space left. Some single-story buildings might go (someone must be eyeing the boarded-up saloon up the street from Novartis), but not those close to residential land -- which is a lot of them. Nor is there much room to go up: between a general Cantabrigian disdain for high rises & fire department restrictions on where labs can go, up is not a great option for biotech.
Throughout the zone there are also other uses for what space there is. MIT owns much of the land, and must be wondering whether it will be hemmed in. Urban planning has shifted towards favoring a variety of uses, and so some of the new development in the zone has gone to residences, hotels & shops -- a good thing, too! Hopefully ways will be found to preserve some of the grittier older businesses, the car repair shops & such that are so convenient. But space must be found, or the biotech industry will stagnate.
There is a lot of open space to the far east, where once a large railroad yard sat in the netherlands between Cambridge and Charlestown. New buildings are springing up there & the developers have already advertised in biotech real estate sections.
However, others are thinking of a really big conceptual leap. In a Globe article before Christmas it was revealed that Vertex is contemplating moving their entire operation to new buildings to be constructed at Fan Pier. This is an area just off the center of Boston, on the waterfront, which is becoming a magnet for development. Better road connections thanks to the Big Dig, a new transit line, a new federal courthouse, and the new convention center have drawn other businesses, such as restaurants and hotels.
It's not hard to see the attraction of the place. Walkable to downtown Boston and a short walk to the transit hub (intercity & commuter train, bus, subway) at South Station. Within sight (across the water, traversed by a tunnel) of the airport.
The obvious uses of this space were offices (particularly legal ones; the courthouse is next door) -- but biotech? I wouldn't have thought of it, but somebody did. It's a bold move, one to announce that Vertex has arrived as a FIPCO (Fully Integrated Pharmaceutical Company). The developer has already started acquiring permits for buildings suitable for lab space. And the location has other perks -- nearby Red Line access to the Harvard & MIT campuses, so it's almost like being in Cambridge. The new transit line doesn't yet go many places, but if a proposed tunnel is dug it could connect to the Longwood Hospitals area.
Despite all the hubbub in Cambridge, Boston itself doesn't host much biotech. I think there is some incubator-type space over in Charlestown and maybe some bits elsewhere, but mostly the main city plays a subsidiary role. Remote sites such as a derelict state hospital have sometimes been proposed, but nothing much has happened -- perhaps this could jump-start other unconventional locations for biotech in the Hub of the Universe.
Tuesday, November 20, 2007
Gene Logic successfully repositions, Ore What?
Gene Logic today announced that Pfizer has filed a patent based on a Gene Logic drug repositioning effort. This would appear to be one of the most significant votes of confidence in such efforts by an outside partner.
Drug repositioning is the idea of finding new therapeutic uses for advanced compounds, particularly compounds which are very advanced but failed due to poor efficacy in the originally targeted disease. A number of companies have sprung up in this field -- the two I am most familiar with are Gene Logic and Genstruct -- and at least some large pharmas have in-house programs.
The reality is that many existing drugs ended up in therapeutic areas quite different from those in which they started. Perhaps the most notorious case is Viagra, which was muddling along as an anti-hypertensive until an unusual side effect was spotted. Minoxidil similarly began as an anti-hypertensive until its side effect was noted. The route to some psychiatric medications began with anti-tuberculosis agents and antihistamines. I doubt that's a complete list.
Gene Logic is one of the original cohort of genomics companies and has been through many iterations of business plan. If memory serves, they were one of several companies originally built around a differential display technology, a way of obtaining mRNA signatures for diseases which predated microarrays. Gene Logic later became one of the major players in the toxicogenomics space, and as part of that effort built a large in-house Affy-based microarray effort. They built microarray databases for a number of disease areas (I've used their oncology database), built a sizable bioinformatics effort, and even acquired their own CRO.
However, none of that could quite be converted into a stream of gold, so over the last year or so the whole mess has been deconstructed, leaving behind the drug repositioning business which had begun as a unit of Millennium (which is one reason I'm familiar with it). They'll even be changing their name soon, to Ore Pharmaceuticals (presumably Overburden and Slag, while appropriate for the mining theme, did not last long in the naming queue).
While there is certainly historical precedent for repositioning, the question remains whether companies can make money doing it, and whether those companies will simply be the big pharmas or the gaggle of biotechs chasing after the concept. Depending on the company, some mixture of in vivo models, in vitro models and computational methods is used. One way to think of it is doing drug discovery, but with a compound which already has safety data on it. There is also extensive interest in the concept in the academic sector, which is a very good thing -- many drugs which may be repositionable have little or no patent life left, meaning companies will find it difficult to invest in them with any hope for a return.
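To make the computational flavor concrete, here is a minimal sketch of signature-based repositioning -- purely illustrative, with fabricated numbers and hypothetical disease names, and no claim to represent what Gene Logic, Genstruct or anyone else actually runs. The idea: a drug whose expression signature strongly anti-correlates with a disease signature is a (very rough) hypothesis that the drug might push the diseased state back toward normal.

# Toy signature-based repositioning: rank hypothetical disease signatures
# by how strongly a drug's expression signature anti-correlates with them.
# All values and names below are invented for illustration.
import numpy as np

# Hypothetical log-fold-change signature induced by a drug of interest,
# over the same (made-up) gene set as the disease signatures.
drug_signature = np.array([1.2, -0.8, 0.5, -1.5, 0.3])

disease_signatures = {
    "disease_X": np.array([-1.0, 0.9, -0.4, 1.3, -0.2]),
    "disease_Y": np.array([0.8, -0.5, 0.6, -0.9, 0.1]),
    "disease_Z": np.array([0.1, 0.2, -0.1, 0.0, 0.3]),
}

def pearson(a, b):
    # Plain Pearson correlation between two signatures.
    return float(np.corrcoef(a, b)[0, 1])

# Most negative correlation first: the strongest repositioning hypotheses.
scores = {name: pearson(drug_signature, sig) for name, sig in disease_signatures.items()}
for name, r in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{name}: correlation with drug signature = {r:+.2f}")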
Gene Logic / Ore has one repositioned drug which has gone through clinical trials, GL1001 (née MLN4760). This is a drug originally developed by Millennium as an ACE2 inhibitor. Since I'm among the discoverers of ACE2, I tend to keep an eye on this one. Millennium gave it a whirl in obesity, but now Gene Logic has found a signal in inflammatory bowel disease in animal models.
That Pfizer bothered to file a patent is significant, as it triggered a milestone payment -- amount unspecified, but these are usually something interesting. But that is still a long way from starting a new trial -- that will be the real milestone, and whichever drug repositioning firm can claim that will really be crowing -- that is, until somebody actually gets a drug approved this way.
Tuesday, September 25, 2007
A First Commercial Nanopore Foray?
Today's GenomeWeb carried the news that Sequenom has licensed a bit of nanopore technology with the intent of developing a DNA sequencer with it. The press release teases us with the possibility of sub-kilodollar human genomes.
Nanopores are an approach which has been around for at least a decade-and-a-half -- a postdoc was working on it when I showed up in the Church lab in 1992. The general concept is to observe single nucleic acid molecules traversing through a pore. It's a great concept, but has proven difficult to turn into reality. I'm unaware of a true proof-of-concept publication showing significant sequence reads using nanopores, though I won't claim to have really dug in the literature. Even such an experiment would represent a small step but not an imminent technology -- the first polony sequencing paper was in 1999 and only in the last few years has that approach really been made to work.
Which is one reason I'm a bit apprehensive as to who bought the technology. Sequenom has done interesting things and has a great name (I had independently thought of it before the company formed; if only I had thought to cybersquat!). But, they have had a rough time in the marketplace, and were even threatened with NASDAQ delisting a bit over a year ago. Their stock has climbed from that trough, but they're hardly flush: only $33M in the bank and still burning cash at a furious rate. Can Sequenom really invest what it will take to bring nanopores to an operational state, or will nanopores be stuck with a weak dance partner which steps on its toes? I hope they pull it off, but it's hard to be optimistic.
It would also be nice to learn more about the technology. I found the most recent publication of the group, but it is (alas!) in a non-open access journal (Clinical Chemistry, though oddly Entrez claims it is). I might spring the $15 to read it, but that's not exactly a good habit to get into. The most enticing bit is that the current version apparently relies on generating cleverly-labeled DNA polymers that somehow transfer the original sequence information ("Designed DNA polymers") and then reading the sequence as passage through the nanopore activates the labels. It sounds clever, but moves away from the original vision of really, really long read lengths by reading DNA directly through the nanopore. The question then becomes how accurate that conversion process is and what sorts of artifacts it generates.
Wednesday, September 05, 2007
Iconix bows out
One of the end-of-summer news items is that toxicogenomics firm Iconix will be purchased by Entelos, one of the small group of physiology modeling firms out there. The deal is worth between $14.1M and $39M, dependent on certain milestones.
Toxicogenomics is an area which just hasn't panned out as a business model. This was one of Gene Logic's big pushes, but they now mostly seem to be riding on their drug repositioning business. At least one other company in the area came and went, along with my memory of their name.
Toxicogenomics has a strong appeal. Iconix & Gene Logic at first glance looked very similar: many compounds profiled by microarray in the key toxicology tissues (liver, kidney), with the results digested into predictive algorithms. In concept, you run your own compounds in the same models, profile them, and see which patterns come up. If a nasty red flag goes up, the compound dies early and cheaply.
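In code terms, the core of such a screen is not much more than comparing a new compound's profile to reference signatures from compounds with known toxicity and flagging strong matches. Below is a toy version -- invented numbers, an arbitrary cutoff, and nothing resembling the statistics Iconix or Gene Logic actually used.

# Toy toxicogenomics screen: compare a new compound's liver expression
# profile to reference signatures of known toxicants using cosine
# similarity. All numbers and the cutoff are invented for illustration.
import numpy as np

reference_signatures = {
    "known_hepatotoxicant_1": np.array([2.1, -1.3, 0.9, 1.8, -0.7]),
    "known_hepatotoxicant_2": np.array([1.8, -0.9, 1.1, 1.5, -0.5]),
    "benign_comparator":      np.array([0.1, 0.0, -0.2, 0.2, 0.1]),
}

# Profile of the new compound, run in the same (hypothetical) rat liver model.
query = np.array([1.9, -1.1, 1.0, 1.6, -0.6])

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

RED_FLAG = 0.8  # arbitrary similarity cutoff for this toy example
for name, sig in reference_signatures.items():
    s = cosine(query, sig)
    verdict = "RED FLAG" if s > RED_FLAG else "ok"
    print(f"{name}: similarity = {s:+.2f} ({verdict})")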
Iconix had a nice little roadshow that would stop in Boston every 2 years or so with a mix of academic and industrial folks talking about toxicogenomics & its near cousin genomic-profiling-for-mechanism-of-action-determination (MOAmics?). I went to at least two of them: I was interested & it didn't hurt they were free.
As low as $14M seems pretty cheap, and Gene Logic's downplaying of this business also suggests that the market is not strong for these services. Part of the catch is the size of your database: customers aren't going to like finding ugly effects later, in more expensive systems, that your screen missed. If the Anna Karenina principle extends to toxicology (each unhappy compound is unhappy in its own way, or nearly so), then your database can never be big enough. The rosier view is that you simply get profiles for 'kidney unhappiness' or 'liver unhappiness' which are downstream of the unique insult. In any case, building up big databases of profiles isn't cheap, though that price is falling with various innovations -- so perhaps some of these companies were just too early for their own good.
One of the positions I explored after Millennium had a fair dose of toxicogenomics, suggesting that industry hasn't given up. But it may well be yet another area where Big Pharma doesn't really see the advantage of small biotech in doing it, or perhaps doesn't trust that work to outsiders. Myself, I was involved in a tiny way with one toxicogenomics project at Millennium, which had also decided to mostly go the in-house route (though they did license in the Gene Logic database) -- right before toxicogenomics pretty much disappeared. Actually, that wasn't the first genomics project at Millennium where I arrived just in time for the shutdown -- at least one other project (antibody production) had the same synopsis. Not something I want to think about too hard...
Wednesday, August 08, 2007
Peeking in on the Old Homestead
I had the occasion to walk by 640 Memorial Drive, the building in which I spent half of my Millennium career. It's a grand old building with an interesting history.
640 was originally built by Henry Ford as an automobile assembly plant located close to a major market -- shipping cars from Michigan was proving troublesome and he wanted an alternative. To economize on land, he envisioned a semi-vertical assembly line -- the standard assembly line would be folded into a series of floors. Giant overhead cranes would lift parts and semi-completed assemblies between floors. The scheme proved impractical, and Ford later built a conventional assembly line over in Somerville. The building went through a number of industrial uses, including being a Polaroid camera assembly plant. It was apparently quite an eyesore in the late 80's, but by the time I first noticed it in the mid-90's it had been rehabbed very nicely. The huge bay once ranged by the cranes is now a soaring atrium & the site of the old railyard is parking.
When I interviewed at Millennium in 1996 they occupied the top two floors, and by the time I arrived a portion of the middle (3rd) floor had been taken, plus the mouse facility in the basement. Eventually, another major tenant in the building (who made medical alert bracelet systems) was enticed to vamoose, leaving only a single other tenant (a pathology lab).
Around the time I moved back into 640 in 1999 there was a huge effort to fit out all this space. But within a few years Millennium started its deflation and the parking lot started getting empty again. Eventually, everyone moved out, leaving Millennium with an empty building with a lot of lease left on it.
I peered in a few windows and was surprised to see more occupied than expected. I didn't have time to browse a lot, but while some 1st floor offices were clearly vacant, some of the space on the 2nd and 3rd floors was clearly occupied -- though I think my old haunt wasn't. I know there was at least recently some significant lab space vacant, as Codon took a look at it.
Millennium has, of course, been trying to unload the space ever since they moved out. Because it was lumped into restructuring costs, the space was absolutely off-limits -- even when a major power failure crippled the other buildings, 640 was not even seriously considered -- accounting rules are rules.
Which brings up a question. A major reason for vacating buildings was to save money, and even renting empty space is cheaper than having it occupied (light, heat, security, IT support, etc). But a huge chunk of the cost savings was supposed to come from subletting the space -- a story repeated with other facilities. I wonder how big the gap is (and how fast it is growing) between projected savings and actual ones. Perhaps it's buried in a financial statement somewhere, but it is certainly not a bit of forecasting anybody is going to be crowing about.
Thursday, June 28, 2007
Psst! Hot Stock Tip! This company is going to be average!
The last two days have been active on the NASDAQ for the old stomping grounds. Prior to the trading day yesterday a stock analyst upgraded the stock, and MLNM gained about 6% on the day with a trading volume significantly (but less than 2X) above average volume. Today, the company announced some positive results in front-line multiple myeloma treatment, and the stock again turned over 5M+ shares but just nudged up a bit.
What is more than a little funny about yesterday is what the analyst actually said: instead of 'underperforming' the market, he expected Millennium to "Mkt Perform" -- that's right, that it would be exactly middling, spectacularly average, impressively ordinary. Indeed, he put a target on the stock -- $10, or a bit less than what it was selling for that day. For that he was credited with sparking the spike.
What's even more striking is that the day before another investment house downgraded Millennium from 'Overweight' to 'Equal weight'. Each company picks its own jargon, but this is really agreeing -- they both predict Millennium to do as well as the market. Oy!
Far more likely a cause of the spike was leakage of the impending good myeloma news. I've never looked systematically, but good news in biotech seems to be preceded by trading spikes as much as it is followed by them. Periodically someone is nailed for it (and not just domestic design goddesses), but there is probably a lot of leakage that can never be pursued.
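Looking systematically isn't conceptually hard -- compare trading volume in the days before each announcement to a baseline -- though getting clean data, enough events and a proper event-study design is the real work. A toy version, with entirely fabricated volumes and hypothetical announcement days:

# Toy check for pre-announcement volume spikes. Volumes and event days are
# fabricated; a real analysis needs market data, many events, and proper
# event-study statistics (and ideally abnormal returns, not just volume).
import numpy as np

# Hypothetical daily share volume in millions, oldest to newest.
volume = np.array([1.1, 0.9, 1.2, 1.0, 1.3, 1.1, 4.8, 5.2, 1.2, 1.0,
                   0.9, 1.1, 1.0, 3.9, 6.1, 1.3, 1.2, 1.1, 1.0, 1.2])

announcement_days = [7, 14]  # indices of hypothetical good-news days

baseline = np.median(volume)
for day in announcement_days:
    pre = volume[max(0, day - 2):day]    # the two days before the news
    post = volume[day:day + 2]           # announcement day and the next
    print(f"event at day {day}: baseline {baseline:.1f}M, "
          f"pre-announcement mean {pre.mean():.1f}M, post mean {post.mean():.1f}M")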
I'm sure there are a lot of smart people earning money as stock analysts who carefully consider all the facts and give a well-reasoned opinion free of bias, but they ain't easy to find. For a while I listened to the webcasts of Millennium conference calls, but after a while I realized that (a) no new information came out and (b) some of the questions were too dumb to listen to. Analysts would frequently ask questions whose answer restated what had just been presented, or would ask loaded questions which were completely at odds with the prior presentation. How the senior management answered some of those with a straight face is a testament to their discipline; I would have been lucky to get by with a slight grimace. Some analysts were clearly chummy with company X, and others with company Y, and little could change their minds.
If you look at the whole thing scientifically, the answer is pretty clear: listening to stock analysts is a terrible way to invest. If you want average returns, invest in index funds. If you want to soundly beat the averages, start looking for leprechauns -- their pots of gold are far more plentiful than functional stock picking schemes. Buy a copy of 'A Random Walk Down Wall Street' and sleep easy at night. Yes, there are a few pickers who have done well, but they are so rare they are household names. Plus, there are other challenges: Warren Buffett has an impressive track record, but if he continues it until my retirement his financial longevity will not be the point of amazement.
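The leprechaun point can even be put in numbers: give enough managers a coin to flip and some will 'beat the market' year after year by pure luck. A throwaway simulation (no real data, purely illustrative):

# How many coin-flipping 'stock pickers' beat the market ten years running
# purely by chance? Illustrative only.
import random

random.seed(0)
n_managers = 10_000
n_years = 10

lucky = sum(
    1 for _ in range(n_managers)
    if all(random.random() < 0.5 for _ in range(n_years))
)
# Expectation is n_managers * (1/2)**n_years, i.e. roughly 10 such 'geniuses'.
print(f"{lucky} of {n_managers} managers beat the market {n_years} years straight by luck")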
Disclosure: somewhere in the bank lock box I have a few shares of Millennium left -- I think totaling to about the same as the blue book value on my 11-year old car (though perhaps closer to the eBay value of my used iPod). The fact they are in a bank protected them from the grand post layoff clean out.
Wednesday, June 27, 2007
Roche munches again
Roche is on quite a little acquisition spree in the diagnostics business: first went 454 with its first-to-market sequencing-by-synthesis technology, earlier this month it was DNA microarray manufacturer NimbleGen in another friendly action, and now Roche has launched a hostile bid for immunodiagnostics company Ventana.
Three companies, three technologies with proven or developing relevance to diagnostics. What else might be on the radar? One possibility would be protein microarrays, though there are few players in the functional array space (useful for scanning patient responses) -- but perhaps an antibody capture array company? Not yet a proven technology, but one to watch.
All of these buys have a strong personalized medicine / genomics-driven medicine angle. Ventana makes an assay for HER2 to complement Genentech/Roche's Herceptin (Roche owns a big chunk of Genentech & I think is the ex-US distributor); 454 and NimbleGen are solidly in the genomics arena. Roche already has Affy-based chips out for drug metabolizing enzyme polymorphisms.
Tuesday, May 08, 2007
A Tale of Two Drugs
The BIO convention has unleashed a flurry of opinion items bemoaning the high prices of drugs coming out of biotech. While few are as absurd as the one Derek Lowe skewered today, they are coming from voices taken very seriously. Marcia Angell, former New England Journal of Medicine editor-in-chief, was quoted as saying that biotech companies can charge whatever they want due to holding monopolies on their treatments, whereas today's Globe had an op-ed from Harvard prof Jerry Avorn all but proposing a tax on biotech drugs to be earmarked for NIH funding.
In the end the claim is that drug prices could be much, much lower but real pharmaceutical innovation would be preserved or even enhanced. Angell in particular seems to be fond of the White Queen's habit of believing impossible things; a run of Op-Eds from her this year in the Globe alternately excoriated the pharmaceutical industry for spending R&D dollars on 'me-too' drug development and then celebrated that the availability of multiple closely related compounds enabled large payers to bargain with pharma companies over dispensary prices. Angell also fails to explain why companies which can charge 'whatever they want to' don't charge more -- are they idiots? Or is the world really not so neat and tidy?
I'd like to use the stories of two drugs from my former shop to illustrate how complex reality is. The two drugs are Velcade and MLN-02, both entering Millennium's portfolio through the acquisition of Leukosite. Two or so other drugs from other companies will also make appearances, though that's getting ahead of ourselves.
Velcade is Millennium's biggest drug. Nobody can claim it is 'me too': it is the first proteasome inhibitor ever to enter the clinic, and is still the only approved one. Velcade has been approved to treat two cancers of B-cells, multiple myeloma and mantle cell lymphoma. That short list is not for lack of trying: between Millennium, the NCI and individual investigators it has probably been thrown at virtually every known cancer, alone or in combination with standard chemotherapy agents. Positive signals are few and far between and have the nasty habit of disappearing once the trials get large. In many cases, the right dosing or combination may not have been found yet, so oncologists continue to explore Velcade even in indications where it has not succeeded previously.
Now Millennium and its pharma partner J&J do market Velcade, but the effort is quite modest (I don't have numbers, but the U.S. sales force I think is a few hundred). Millennium continues to plow cash into Velcade trials in the hopes of hitting a significant jackpot; MM & MCL are important but won't drive sales to the stratosphere. Velcade was the first agent in over forty years to demonstrate a survival advantage in second line myeloma. Yet Millennium's stock price is stagnant and the company is trimming expenses annually. So why isn't Millennium in clover?
The answer quite simply is competition. Celgene had thalidomide, with its dark history, and thal is quite useful in myeloma. Thal is oral, whereas Velcade is injectable, and convenience wins all other things being equal -- and at the moment there is no hard evidence to say the two drugs aren't comparably effective. But what's really knocking Millennium around is the thalidomide follow-on Revlimid, which is claimed to be significantly less teratogenic. Now Rev is a follow-on and chemically related to thalidomide -- is this a me-too? Revlimid is also oral and is already looking good in front-line trials of myeloma, where Millennium hoped to expand Velcade -- that will probably be successful, but it will be the same dogfight all over again.
Note that this competition has some very real effects. It is chic in some circles to sneer at worrying about stock prices, but that stock is a very real mechanism for raising money to plow into further R&D. Millennium is still an independent company because it was lucky enough to issue stock near the peak of the biotech bubble; money is still marching out the door faster than it marches in.
Now let's look at MLN02, another drug with an interesting story. MLN02 is an antibody which targets certain integrins, heterodimeric cell-surface proteins important for the recruitment of immune cells to sites of inflammation. MLN02 targets an integrin believed to be specific to the gut, and so might offer a very specific approach to downregulating excessive immune activity in ulcerative colitis and Crohn's disease.
MLN02 has had a rocky history at Millennium. Leukosite had partnered with Genentech on the drug, but Genentech later bailed out. A paper was published in the New England Journal with the results of a large study, but even these results are not as clear as one might like. In any case, the development of MLN02 has at times been a top priority on Landsdowne Street, but at other times the drug was essentially tabled. Why?
A lot has to do with the competitive landscape. Crohn's and UC are not huge markets, so dividing the market up isn't very attractive -- especially if the competitor gets there first. The first entrant advantage is quite large in pharmaceuticals. So the tea leaves are read daily -- and the newswires scanned obsessively -- to see what the competition was up to.
One development which iced down MLN02 enthusiasm greatly was the accelerated development of another biotech company's integrin-targeting drug -- and that company was large and successful. Their drug would go first for another indication, but might hit Crohn's and/or UC before MLN02 could be expected to get there. With lots of safety data from the other indication & some of the same docs prescribing in both areas, the deck would be stacked against a new entrant -- even if MLN02 had the theoretical advantage of being gut-specific. Launch of the potential competitor in the other indication ahead of schedule did not help matters any.
What heated up MLN02 interest again was what happened to that competitor, as it was Biogen's Tysabri. Tysabri works in MS, but in a very small number of patients a lethal viral infection was enabled by the drug. Suddenly, the competitive landscape was altered -- though with a new regulatory challenge of convincing the regulators that MLN02 really doesn't alter lymphocyte trafficking in the brain.
To some degree, the numbers folks were daily running estimates of what the expected gain from MLN02 would be, given the competitive landscape (I've left the other big player, Remicade, out of the story -- and it is probably going to waltz all over these markets). Even when Tysabri was in trouble, the models suggested that MLN02 might end up being a money pit after all -- depending on its efficacy and the price payers were willing to pay for it. Biotech has proven many times that it is possible to fail by succeeding; your drug works, but not well enough to make it to market -- and there are no money-back guarantees on clinical trials.
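At its simplest, that kind of model is just a probability-weighted value across competitive scenarios, netted against what it still costs to get there. A stripped-down sketch with entirely invented numbers (real commercial models layer in timing, discounting, pricing and much else):

# Stripped-down risk-adjusted value model for a drug program. All
# probabilities, costs and revenues are invented for illustration.
remaining_trial_cost = 150.0  # $M still to be spent on development

# Hypothetical scenarios: (probability, net revenue to the company in $M)
scenarios = [
    (0.25, 900.0),  # drug works and the competitor stumbles
    (0.25, 250.0),  # drug works but the competitor dominates the market
    (0.50, 0.0),    # trials fail
]

expected_revenue = sum(p * revenue for p, revenue in scenarios)
expected_value = expected_revenue - remaining_trial_cost
print(f"Expected revenue: ${expected_revenue:.0f}M")
print(f"Expected value net of remaining trial costs: ${expected_value:.0f}M")
# Shift the competitive probabilities or shave the achievable price and a
# program that 'works' can still come out underwater.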
Like it or not, money is the lifeblood of pharmaceutical development. Trials are expensive. No matter how much you hacked away at marketing or executive salaries at Millennium, the brutal reality of costly trials and ever-changing competitive markets would prevail. We might want to pay less for new medications, and perhaps through price caps or other government fiat society will accomplish this desire. But to claim that new drugs will continue to flow as before is to ignore the real world -- drugs go forward which are predicted to pay for their development costs and cover the money sunk into expensive failures. Cut the reimbursement rates and you inevitably worsen the risk-reward calculus for every project in development. Some will survive, but many, particularly the MLN02s of the world, will not.
In the end the claim is that drug prices could be much, much lower but real pharmaceutical innovation would be preserved or even enhanced. Angell in particular seems to be fond of the White Queen's habit of believing impossible things; a run of Op-Eds from her this year in the Globe alternately excoriated the pharmaceutical industry for spending R&D dollars 'me-too' drug development followed by celebrating that the availability of multiple closely related compounds enabled large payers to bargain with pharma companies over dispensary prices. Angell also fails to explain why companies which can charge 'whatever they want to' don't charge more -- are they idiots? Or is the world really not so neat and tidy.
I'd like to use the stories of two drugs from my former shop to illustrate how complex reality is. The two drugs are Velcade and MLN-02, both entering Millennium's portfolio through the acquisition of Leukosite. Two or so other drugs from other companies will also make appearances, though that's getting ahead of ourselves.
Velcade is Millennium's biggest drug. Nobody can claim it is 'me too': it is the first proteasome inhibitor ever to enter the clinic, and is still the only approved one. Velcade has been approved to treat two cancers of B-cells, multiple myleoma and mantle cell lymphoma. That short list is not for lack of trying: between Millennium, the NCI and individual investigators it has probably been thrown at virtually every known cancer, alone or in combination with standard chemotherapy agents. Positive signals are few and far between and have the nasty habit of disappearing once the trials get large. In many cases, the right dosing or combination may not have been found yet, so oncologists continue to explore Velcade even in indications where it has not succeeded previously.
Now Millennium and its pharma partner J&J do market Velcade, but the effort is quite modest (I don't have numbers, but the U.S. sales force I think is a few hundred). Millennium continues to plow cash into Velcade trials in the hopes of hitting a significant jackpot; MM & MCL are important but won't drive sales to the stratosphere. Velcade was the first agent in over forty years to demonstrate a survival advantage in second line myeloma. Yet Millennium's stock price is stagnant and the company is trimming expenses annually. So why isn't Millennium in clover?
The answer quite simply is competition. Celgene had thalidomide, with its dark history, and thal is quite useful in myleoma. Thal is oral, whereas Velcade is injectable, and convenience wins all other things being equal -- and at the moment there is not hard evidence to say the two drugs aren't comparably effective. But what's really knocking Millennium around is the thalidomide follow-on Revlimid, which is claimed to be significantly less teratogenic. Now Rev is a follow-on and chemically related to thalidomide -- is this a me-too? Revlimid is also oral and is already looking good in front line trials of myeloma, where Millennium hoped to expand Velcade into -- that will probably be successful, but it will be another the same dogfight all over again.
Note that this competition has some very real effects. It is chic in some circles to sneer at worrying about stock prices, but that stock is a very real mechanism for raising money to plow into further R&D. Millennium is still an independent company because it was lucky enough to issue stock near the peak of the biotech bubble; money is still marching out the door faster than it marches in.
Now let's look at MLN02, another drug with an interesting story. MLN02 is an antibody which targets certain integrins, heterodimeric extracellular protein molecules important for the recruitment of immune cells to sites of inflammation. MLN02 targets an integrin believed to be specific to the gut, and so might offer a very specific approach to downregulating excessive immune activity in ulcerative colitis and Crohn's disease.
MLN02 has had a rocky history at Milllennium. Leukosite had partnered with Genentech on the drug, but Genentech later bailed out. A paper was published in the New England Journal with the results of a large study, but even these results are not as clear as one might like. In any case, the development of MLN02 has at times been a top priority on Landsdowne Street, but at other times the drug was essentially tabled. Why?
A lot has to do with the competitive landscape. Crohn's and UC are not huge markets, so dividing the market up isn't very attractive -- especially if the competitor gets there first. The first entrant advantage is quite large in pharmaceuticals. So the tea leaves are read daily -- and the newswires scanned obsessively -- to see what the competition was up to.
One development which iced down MLN02 enthusiasm greatly was the accelerated development of another biotech company's integrin targeting drug -- and that company was large and successful. Their drug would go first for another indication, but might hit Crohn's and/or UC prior to MLN02 could be expected to get there. With lots of safety data from the other indication & some of the same docs prescribing in both areas, the deck would be stacked against a new entrant -- even if MLN02 had the theoretical advantage of being gut-specific. Launch of the potential competitor in the other indication ahead of schedule did not help matters any.
What heated up MLN02 interest again was what happened to that competitor, as it was Biogen's Avonex. Avonex works in MS, but in a very small number of patients a lethal viral infection was enabled by the drug. Suddenly, the competitive landscape was altered -- though with a new regulatory challenge of convincing the regulators that MLN02 really doesn't alter lymphocyte trafficking in the brain.
To some degree, the numbers folks were daily running estimates of what the expected gain from MLN02 would be, given the competitive landscape (I've left the other big player, Remicade, out of the story -- and it is probably going to waltz all over these markets). Even when Avonex was in trouble the models suggested that MLN02 might end up being a money pit after all -- depending on its efficacy and the price payers were willing to pay for it. Biotech has proven many times it is possible to fail by succeeding; your drug works, but not well enough to make it to market -- and there are no money-back guarantees on clinical trials.
Like it or not, money is the lifeblood of pharmaceutical development. Trials are expensive. No matter how much you hacked away at marketing or executive salaries at Millennium, the brutal reality of costly trials and ever-changing competitive markets would prevail. We might want to pay less for new medications, and perhaps through price caps or other government fiats society may accomplish this desire. But to claim that new drugs will continue to flow as before is to ignore the real world -- drugs go forward which are predicted to pay for their development costs and cover the money sunk into expensive failures. Cut the reimbursement rates and you inevitably worsen the risk-reward calculus for every project in development. Some will survive, but many, particularly the MLN02s of the world, will not.
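To make that risk-reward arithmetic concrete, here is a minimal sketch of the kind of model those numbers folks run: a risk-adjusted net present value for a single program. Every figure in it is invented purely for illustration -- nothing here reflects Millennium's (or anyone else's) actual numbers or models.

def risk_adjusted_npv(annual_trial_cost, years_to_launch, prob_success,
                      peak_sales, margin, patent_years, discount_rate):
    """Expected NPV of one program, in the same units as the cash figures ($M here)."""
    # Development costs are spent whether or not the drug ultimately succeeds.
    dev_cost = sum(annual_trial_cost / (1 + discount_rate) ** t
                   for t in range(years_to_launch))
    # Revenues arrive only if the drug makes it, so weight them by the
    # probability of success and discount them back to today.
    revenue = sum(prob_success * peak_sales * margin / (1 + discount_rate) ** t
                  for t in range(years_to_launch, years_to_launch + patent_years))
    return revenue - dev_cost

# Hypothetical program: $60M/year of trials for 4 years, a 30% shot at approval,
# then 8 years of exclusivity at $400M peak sales and a 70% margin.
base = risk_adjusted_npv(60, 4, 0.30, 400, 0.70, 8, 0.10)
# The same program after reimbursement pressure shaves 40% off peak sales.
capped = risk_adjusted_npv(60, 4, 0.30, 240, 0.70, 8, 0.10)
print(f"expected NPV at full price:   {base:+.0f}")    # comfortably positive
print(f"expected NPV with price cuts: {capped:+.0f}")  # now underwater

The point isn't the particular numbers -- it's that a modest haircut to expected revenue can flip a program from worth running to not worth running, which is exactly the calculation that quietly kills the MLN02s of the world.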
Friday, May 04, 2007
Biotech Buildings
I had the opportunity today to attend an event at the Genzyme Center and boy is that building a stunner. A soaring atrium contains mobiles which cast rainbows all over the space. There are watercourses and plantings at ground level, glass elevators -- more of a hotel lobby than an office building.
Novartis' Cambridge facility also has a nice atrium, though a bit more staid. Cell Signaling Technology's lobby on the North Shore resembles a small jungle.
Biotech buildings in Cambridge are a mix. Some renovated older buildings are quite attractive, and some really are pretty plain. New buildings are a mix too. Space is precious, so those atria really shout 'we can afford it!'. Millennium's first custom building (75 Sidney) had a small atrium with a spiral stair (alas, not a double spiral!), but the later buildings used decorative ornamentation (granite in the bathrooms!) and non-rectangular walls in place of unusable air space.
Much as old banks built solid buildings with serious marble & columns to emphasize their solidity & seriousness, so too does a flashy building speak of a company's confidence in its future. Of course, such confidence is all too often misplaced. As a graduate student I watched Hybridon's headquarters emerge from a rehabbed tire warehouse, but then at Millennium I got to see the gorgeous inside -- because Hybridon was subletting the space to us. One company going up, another going down. Later, Millennium started shedding space and discovered that two-story atria with staircases look nice, but don't make subletting the building a floor at a time practical without some changes.
It is also useful to be skeptical of some of the touted benefits of architecture. I am a fan of good architecture, but what looks good doesn't always work well. I love seeing Frank Lloyd Wright houses, but living in one is reputed to require some getting used to. All-glass conference room walls may emphasize openness & light, but sometimes you don't really want to be a goldfish. MIT's Stata Center is very funky, but just try finding an office in there (and worse, the interior is dead for at least one major cellphone carrier, meaning you can't be guided in).
In the end, some is just a matter of taste. I actually had the privilege of living in a famous bit of architecture for a semester, a Gropius-designed dorm at Harvard. I loved it; most students hated the small rooms. Plus, noise propagated dreadfully (our late night card games were often shut down) and it really didn't work well as a co-ed dorm - only one bathroom per floor, and those were definitely not suitable for unisex use. Worse, you had to go through the stairwell to get to another floor -- and the stairwell was keyed. Don't forget your key at night, or you get locked in a fishbowl in your PJs!
Do architectural gems translate to a happier, more productive workforce? Or are you stuck with a museum piece which resists change? I don't have a crystal ball -- though perhaps you can find a conference room which looks like one.
Where Biotech?
Biotech's big industry trade group, BIO, is convening in Boston this weekend & that means lots of dough to various media and advertising groups. The radio ads claim 20K biotech leaders will be here. One very visible consequence is the crop of billboards around town urging biotech companies to relocate to Las Vegas.
Even without the convention, there are regular TV and radio spots with Jeff Daniels urging life science companies to relocate to Michigan. Rhode Island has made specific attempts to schmooze companies to head south (alas for them, their biggest success, Alpha-Beta, moved just prior to a clinical trial failure and the company going bust).
The Nevada ads tout 'No Taxes', ignoring the fact that biotechs generally don't pay taxes -- because you have to make money to pay taxes, an extreme rarity in biotech. Actually, I'd bet Nevada's cost structure is lower across the board -- cheaper electricity, lower heating costs (but higher cooling -- perhaps a wash?), certainly cheaper housing. Yet biotech is clearly strongly clustered. Are there any biotechs in Nevada?
I'm not trying to knock Nevada, or Michigan, or anywhere else. It's just that the secret to getting biotechs to grow is elusive. A lot of the genomics companies started near big genome centers -- but why didn't Oklahoma reap companies from the center there? Big research universities are important -- but many big research universities do not have a garden of biotech in their neighborhood. Why is our local biotech largely urban, whereas in Pennsylvania it seems to be entirely suburban? Even the region where I grew up (named for the 24th letter of the alphabet) has a cluster of biotechs. Why is non-Cambridge biotech around here largely west of town, whereas similar regions to the north and south have little to none? Worcester MA has a number of companies, but similarly distant Providence RI and southern NH have very few. Why in the midwest is it easy to name companies headquartered near U Wisconsin, but not U Illinois?
I don't have the answer, and I suspect each region has a different one. The U Mass Medical Center in Worcester may have pulled companies out that way, whereas the Philadelphia area doesn't have a major academic research environment just outside the city.
One thought, and one which won't make the politicians trying to seed their own biotech clusters happy: what you need to get lots of new biotech is some old biotech. When companies grow and grow, they slowly shed talented people who often stay in the same region but start new ventures. And when companies crash-and-burn, a lot of people are looking for new opportunities. At the old shop we had large cohorts of people previously at Biogen or Genetics Institute or Genome Therapeutics, and each time those companies went through convulsions a few more came on. Now, of course, every company in Cambridge is riddled with ex-Millennium hands, and many who learned the ropes of business there have gone on to start small companies. In addition to a bunch of folks I knew in my old life, my new shop has other clusters of former employers.
In the forest, when the elements topple an old tree the opportunity for new trees is created. The old roots may sprout new shoots, more sunlight comes in, and most of all the rotten log returns its resources to the surrounding soil. The analogy, like all analogies, is imperfect, but it's the same in biotech. The marketplace's creative destruction is a powerful force, but in order for it to create it needs something to destroy. States and localities wishing they had more biotech companies should continue their efforts, but temper their expectations, as those that gots gets and those that ain't gots gets slowly.
Thursday, April 26, 2007
Gobble Gobble Slurp
AstraZeneca's record-setting $15B+ buy of Medimmune gave the old workplace's stock a mild goose, but things have settled. It is a reminder of what the ultimate fate of virtually any semi-successful biotech company will be.
In the end, there are three possible fates for a biotech: survival, liquidation or acquisition. Liquidation is rare & will probably only ever befall early-stage companies, but it does happen. One genomics company (Progenitor) reputedly let their employees show up for work to locked doors. Most companies will be acquired down the road; only a few frontrunners will stay independent. There are, of course, variations on these themes. J&J has a track record of acquiring companies but then leaving them largely recognizable. Some companies (e.g. Cadus) disappear in an operational sense but never quite disappear legally -- business zombies. Mergers of equals are theoretically possible and often claimed (Biogen-Idec), but how lopsided the division of spoils is can't really be assessed by an outsider.
Millennium executed a number of acquisitions during my tenure, with many being quite successful -- influential people remain who joined through the Chemgenics or Leukosite acquisitions. Leukosite brought in Velcade (then PS-341) from a company (Proscript) which Leukosite hadn't finished digesting (er, assimilating) when the MLNM-LKST merger was announced. Much of Millennium's inflammation pipeline has strong roots back to Leukosite.
But then there was the big demonstration of 2+2<<4: COR. Millennium bought COR for Integrilin & a sales force, with some interesting early-stage oncology and cardiovascular programs thrown in as icing. The corporate cultures seemed compatible and the excitement was there. But somehow things quickly ran downhill & when it became apparent that Millennium was overstretched, the COR (now MLNM San Francisco) site was targeted for liquidation. Eventually, after sinking many dineros into further clinical studies, Millennium essentially walked away from Integrilin. So for $2B-plus, Millennium purchased a sales force, a revenue stream from Integrilin and some other leftovers -- plus some important contributions from the ex-COR folks in wrapping up the Bayer collaboration. Was it worth it? My impression is that everybody on the COR side wished they could get a do-over.
MedImmune was hardly an isolated purchase -- big pharmas and even big biotechs (Amgen, Genentech) have been plucking out various biotechs, generally either for hot therapeutic platforms (siRNAs, advanced antibody technologies or exotic antibody alternatives) or late-stage compounds. Looking around the Cambridge neighborhoods finds plenty of companies in the former (Dyax, Alnylam, Archemix) or latter (MLNM, Vertex, Alkermes) categories. The majors still have their gaping pipeline gaps, Wall Street is starting to hound Genentech towards more acquisitions, and Amgen is starting to experience the pain of commercial reversals. Odds are there will be more buyouts -- and more flameouts & companies (e.g. Imclone) which flop on the auction block.
So grab a ringside seat & get comfortable -- but please don't play the ponies. If anyone tells you they know who's going to buy whom for what price, odds are they're lying. Even if they aren't, do you really want to follow in the footsteps of the famed biotech investor who was whisked from her Connecticut home to a federally-paid stay in West Virginia?
Wednesday, April 11, 2007
The ups-and-downs of out-licensing
Monday's Boston Globe had an interesting article (probably $$$) on a story which hadn't seen much attention previously but illustrated a number of biotech themes: rapid reversals & odd partnerships.
A group at Beth Israel Deaconess Medical Center (BIDMC) had developed a potential new protein therapeutic which they thought might hit it big. I must confess a special fondness for BIDMC, as a team there oversaw the delivery of my most important project ever, but they seem to have gone out on a strange limb in this case by picking an odd partner for developing their blockbuster.
The potential blockbuster is apparently a single-chain protein encoding a dimeric erythropoietin (EPO). EPO is, of course, the most financially successful biotech drug ever and what made Amgen bigger in market cap than some old-line pharmaceutical companies. EPO has been in the crosshairs of a number of other companies, but Amgen has thus far won all the battles on patents -- first with Genetics Institute (now Wyeth) at the outset and later knocking out Transkaryotic's (now Shire) attempt to end-run their patents. Amgen followed up with a slightly modified form (Aranesp), which has also cleaned up. Affymax has a clever mimic in trials -- though this illustrates the need for patience in this business, as their splashy paper on it came out when I was interviewing at Millennium nearly 11 years ago!
So a tandem EPO would seem like a reasonable bet, with the claim that this form is longer-lasting (à la Aranesp) and more potent. EPO is used to treat anemia in kidney failure (EPO is normally made in the kidney) and in cancer patients (along with illicit off-label uses in the field of athletics) -- more is better, right?
Lately, the bloom is off that rose -- several studies are suggesting that for cancer patients there may be drawbacks to high EPO doses. EPO has now earned a black box warning and Amgen is scrambling (the CFO just bailed -- perhaps with shoeprints on his backside).
Now that's just bad luck -- pharmaceuticals are like that. One day COX2 inhibitors are miracle drugs; the next day they're persona non grata.
It's the other half of the story that I found very curious. If you were trying to out-license your institution's exciting new protein therapeutic, what would your first choice of company be? How about a failing genomics company with a slim bank account? No? That doesn't sound appealing? But that's exactly what BIDMC did.
Now a lot of genomics companies exited the genomics boom in a strange place: lots of money raised during the bubble, but no path forward to make money in genomics. Companies such as HGS and MLNM are still living off that cash, but they had interesting programs going. Others had stranger outcomes. Variagenics and Hyseq proved that 1 genomics company + 1 genomics company = 0 genomics companies, as they merged, ditched all their genomics operations, and renamed themselves Nuvelo to develop an in-licensed protein therapeutic.
BIDMC chose DNA Print Genomics for their wonder drug. I have nothing against DNA Print, but there's nothing in memory (or on their website) to suggest that they have any of the key skills. Nor do they have much cash. While they do actually have products, those products don't inspire much awe. DNA Print will type your DNA to estimate your ancestry. That might be fun, but how big is the market really? They also claim to have tools which can predict the physical characteristics of a person (skin color, earlobe attachment) from forensic samples -- a sort of genetic sketch artist. I'm sure there are missing-persons-type cases where this provides one more set of clues, but it's hardly something that would see routine use.
Wall Street hardly loves DNA Print -- if Yahoo's statistics are to be believed, it is trading at a market cap of $4.58M, well below its cash position of about $8.5M -- but it is also (if I'm reading this right; I really don't stare at these often) blowing through $3-4M per quarter -- meaning that cash will run out in the near future unless they find financing or take a scythe to their operations. This was a point raised in the Globe article -- BIDMC has hitched their wagon to a lame horse which may expire very soon.
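For what it's worth, that runway arithmetic is simple enough to sanity-check; here is a two-line sketch using the same rough figures quoted above ($8.5M in cash, $3-4M burned per quarter):

cash = 8.5                       # $M on hand (rough figure quoted above)
burn_low, burn_high = 3.0, 4.0   # $M spent per quarter (rough range quoted above)
# Quarters of life left at each end of the burn range.
print(f"runway: {cash / burn_high:.1f} to {cash / burn_low:.1f} quarters")
# => roughly two to three quarters before new financing, or a scythe, is required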
An interesting question, to which one can never get a straight answer, is: why pick DNAPrint? Was there really nobody else interested? It is curious that the consultant whom BIDMC hired to find a licensee for the compound (after Eli Lilly had bailed out) ended up as chief executive at DNAPrint. While that is hardly unheard of, it does raise a real issue of conflicts of interest. The deal structure is strongly loaded towards milestones & royalty payments -- i.e. BIDMC sees very little without a lot of progress being made. DNA Print apparently has reported preclinical results.
A weak & failing partner for a troubled market niche -- hardly a good place to be. C'est la biotechnologie!
Thursday, March 29, 2007
454? How Roche!
Today's GenomeWeb bears the news that Roche Diagnostics is buying out 454 Life Sciences. Since Roche was previously the sole distributor of 454's sequencers and Curagen had announced their desire to sell the subsidiary, this is hardly a shocking development. But it is the third next generation sequencing company to be bought by an established player -- ABI slurped up Agencourt Personal Genomics and Illumina recently bought Solexa. So far, Affymetrix and Agilent have stayed out -- as has Nimblegen. There are plenty of other startup next generation sequencing shops out there, and certainly other candidates for acquirers. Roche, of course, got the clear current front runner, though it may be that the next wave of sequencer launches will close the gap quickly.
Whether these acquisitions are good for next-generation sequencer development is an open question. On the one hand, these larger organizations bring deep pockets and substantial marketing expertise. But there are plenty of pitfalls. For both ABI and Illumina, the new machines compete with their old machines -- smart companies see this as inevitable, but many companies completely botch the job due to internal conflicts (as amply documented by Clayton Christensen in his books). It isn't encouraging that the Agencourt Personal Genomics technology is impossible to find on the ABI website.
It will also be interesting to see how long the 454 moniker lasts -- one hates to see pioneers go, but on the other hand I find naming a subsidiary after the accounting code très gauche.
An interesting note in the GW item is that Roche was previously prohibited from marketing regulated diagnostics built on the 454 platform. Roche has previously tried to launch some molecular diagnostics -- the D word is after all in their name -- so this is a clear fit. On the other hand, a run on the 454 is reputed to be serious money, so they'll need to either find a very high value application (in a field notorious for antiquated, miserly reimbursement rules) or figure out a way to run lots of tests simultaneously. Given the rather long read lengths of the 454, one approach to the latter would be to use sequence tags near the beginning of the read to identify the original samples.
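As a rough sketch of that tagging idea: each sample's library would carry a short barcode at the start of every read, and reads would be binned back to their samples by matching those leading bases. The barcodes and reads below are invented for illustration, and a real implementation would also need to tolerate sequencing errors (454's homopolymer troubles in particular).

# Invented 4-base barcodes mapping to hypothetical samples.
BARCODES = {"ACGT": "sample_1", "TGCA": "sample_2", "GATC": "sample_3"}
BARCODE_LEN = 4

def demultiplex(reads):
    """Bin reads by their leading barcode and trim the tag off the insert."""
    binned = {sample: [] for sample in BARCODES.values()}
    binned["unassigned"] = []
    for read in reads:
        sample = BARCODES.get(read[:BARCODE_LEN])
        if sample is None:
            binned["unassigned"].append(read)          # no recognizable tag
        else:
            binned[sample].append(read[BARCODE_LEN:])  # keep only the insert
    return binned

reads = ["ACGTTTAGGCAT", "TGCAGGCATTAC", "AAAACCGGTTAA"]
for sample, sample_reads in demultiplex(reads).items():
    print(sample, len(sample_reads))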
Another GW item describes some roundtable discussion at a recent meeting on next generation sequencing. The price for a genome in 2010 is still a big question, but a lot of bets are apparently in the $10K-$25K range. Some of the leaders in the field are taking a realistic view of the utility of such sequencers at such a price tag -- if you can scan the most informative SNPs for $1K, then why sequence? I'm guessing that other than a few pioneers (J.Craig is apparently resequencing his genome), there won't be a lot at those prices. On the other hand, cancer genomics is a natural fit, as each genome is different (indeed, each sample probably has many distinguishable genomes) and understanding all the fine molecular details will be valuable. SNP chips can estimate copy numbers, but not tell you how those pieces are stitched together nor find all the interesting mutations.
Even with the price at $1K, sequencing will certainly not be 'too cheap to meter'. Notions of sequencing a big chunk of the human population have appeal, but do we really want to blow another few billion dollars on human sequencing? On the other hand, as I've suggested before, other mammalian genomes may provide a lot of interesting biology for the buck (or bark). What are the most interesting unbagged genomes out there -- that sounds like the topic for another day's post...