Spatial profiling has been to AGBT before -- NanoString has been running their Spatial Profiling Summit for several years now and I first saw the technology underlying 10X's Visium in one of the new technology tracks back in either 2013 or 2014. ReadCoor did a splashy sponsorship in 2018 without really providing any details; last year they had a highly promoted launch that could have used a much bigger room.
Given all the activity in this space, it may be useful to set the table by sketching out some of the major axes along which the various offerings can be differentiated.
None of these are going to come cheap. Well, technically Visium can be as simple as a not-so-cheap consumable, but even that is now getting accessories. I would expect the early markets for these technologies to be relatively well-funded academics and bigger biopharma shops. That's the general point of view shaping all the commentary in this piece: the initial market is a modest number of well-funded labs eager to be on the bleeding edge, using this technology for basic research and proof-of-concept work in translational research. I'm expecting success in the long term, and with such success things like cost will come under pressure. If you're going to equip every community hospital with your technology, it probably can't be half a million dollars a box.
A similar thought on consumables: in this early going there won't be extreme pressure on consumables cost. Single cell doesn't come cheaply, yet some labs have gone big with it. That can't hold if one is worrying about reimbursement for clinical tests, but that pricing pressure is years down the road. I'd also say that the sorts of labs buying these won't be picky about the size of the box; only when the technology matures will anyone worry whether startups can even fit one in their space.
On the other hand, the degree to which a system can be run walk-away is likely to be a big selling point to biopharma. Employees are expensive. It won't be surprising to have someone specializing in operating the new system, but they will likely have other tasks. More hands-on time also means more training and makes it difficult to maintain a pool of trained individuals to cover absences. So there's a huge difference between "move the flowcell between boxes" and "carefully tend these reagent additions over several hours".
Getting more technical: an important point, though one I don't believe will be a sticking point, is whether these are end-to-end solutions or whether they instead require a sequencer to generate the actual data. ReadCoor was proposing the former; Visium is the latter.
In general, I expect what these platforms deliver to the end user to be the main points of contention, particularly early on. There are multiple functional aspects on which to compare different approaches.
Resolution will be one. The original Visium couldn't localize to single cells; Visium HD promises to do that. NanoString says their Spatial Mapper will have subcellular resolution; ReadCoor was claiming similar last year at AGBT -- after all, their technology relied on generating sequencing data from single molecules! Some may even offer a degree of resolution in the Z dimension.
At the other end of imaging capability is how much of a sample can be imaged. One potshot 10X has taken at NanoString's GeoMx platform is that the user must choose regions of interest to collect data from, whereas Visium collects from the entire sample field. Conversely, NanoString would point out that if one is capturing multiple cells, it is better for the user to be able to optimize that field rather than be stuck with a rigid geometry. If you are worried about sequencing costs, then field-of-view selection reduces the amount of sequence data required, but sequencing it all means avoiding later remorse at not picking the right fields.
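To make that sequencing-cost tradeoff concrete, here's a back-of-envelope sketch; the spot count, per-spot depth and ROI coverage are purely illustrative assumptions, not any vendor's specifications.

```python
# Back-of-envelope sketch of why ROI selection cuts sequencing load;
# all numbers are assumptions for illustration, not platform specs.
SPOTS_IN_FULL_FIELD = 5_000   # assumed capture spots across a whole section
READS_PER_SPOT = 50_000       # assumed target depth per spot
ROI_FRACTION = 0.10           # suppose the regions of interest cover 10%

whole_field_reads = SPOTS_IN_FULL_FIELD * READS_PER_SPOT
roi_reads = whole_field_reads * ROI_FRACTION

print(f"Whole field: {whole_field_reads:,} reads")  # 250,000,000
print(f"ROIs only:   {roi_reads:,.0f} reads")       # 25,000,000
```

A tenfold cut in reads is real money at today's sequencing prices -- but only if you picked the right tenth of the tissue.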
Panel sizes matter too. There's definitely an appeal to skipping the choice entirely and going whole transcriptome, but if one must choose, then bigger panels require fewer painful decisions. Conversely, one advantage of focused panels may be higher sensitivity -- does your whole-transcriptome spatial scheme really capture the ups-and-downs of low-copy-number transcripts? And is it only looking at 3' ends, or can you look at any splice or fusion transcript of interest? And can you run proteins too? If so, what are the limitations? And if I want to design a panel, what is involved in terms of workflow, time, effort and expense?
And of course there is the issue of sample type. I can comment more freely on this topic than some because the Strain Factory mostly deals in microbes, and those are tough cookies for single cell or spatial profiling (though there was a cool preprint just the other day doing spatial profiling on Pseudomonas). I would expect that 99.9% of experiments on these platforms will be run on human or mouse material; maybe that's an overstatement, but most of the rest will probably be rat, dog, zebrafish and maybe a few other model systems. Beyond species, will a given system work only with fresh/frozen inputs, or can it handle fixed samples such as FFPE? And if working with fixed material, how much of a hit in sensitivity or resolution is there?
Data is great, but software is better. Many prospective buyers may be bowled over by flashy GUIs; tip-of-the-spear users will want to know how they can export data for use within new tools that might not yet exist. There's also the whole question of control, annotation and metadata -- how do you interface the fancy new toy with an existing LIMS?
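As a hedged illustration of what that export path can look like today: a minimal sketch assuming the scanpy/AnnData stack, which can already ingest Visium's Space Ranger output; the folder and output filenames here are hypothetical placeholders.

```python
# A minimal sketch of pulling spatial output into open formats, assuming
# the scanpy/AnnData stack; "sample_outs/" and the output filenames are
# hypothetical placeholders.
import scanpy as sc

# read_visium() loads a Space Ranger output folder: counts, spot
# coordinates and tissue images all land in a single AnnData object
adata = sc.read_visium("sample_outs/")

# Spot-level metadata is an ordinary pandas DataFrame, so it can be
# handed to a LIMS or anything else that speaks CSV
adata.obs.to_csv("spot_metadata.csv")

# The full object (matrix + coordinates + images) goes into a portable
# HDF5-based container for tools that don't exist yet
adata.write_h5ad("sample_spatial.h5ad")
```

The point is less this particular stack than the principle: spot-level tables and matrices should land in open containers that downstream tools and a LIMS can actually consume.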
Getting back to softer axes, there is the question of the company offering a product, and its reach and reliability. Can you get your reagents quickly and without fail? Will the more complex instruments put a service engineer on a first-name basis with your front desk receptionist? New instrument launches are often oversubscribed -- will your instrument order even be filled on your desired timeline?
That's particularly tricky for companies trying to work across international borders -- those borders are very real. The first two rounds of Oxford Nanopore reagents under their MAP access program were hung up in customs and often arrived with the dry ice nearly gone. So any company launching in this space must balance the enormous advantages of reaching more of the world against the enormous headaches that come with it. Not only logistics, but also IP -- a whole area I'm generally going to try to avoid diving into, but the spatial profiling space must be a complex one.
It should be a fun, messy campaign. I truly hope we don't see clear winners and losers quickly -- a long, drawn-out affair will both be more fun to watch and lead to better products to serve the science. I've even had folks at Illumina say in the past that they would prefer more competition; being a monopolist risks going soft.