What is going on with 454 GS FLX+ sequencing?

Supposedly, if you need long reads with high quality and throughput, you should be using the GS FLX+ from Roche/454, yielding 750-800 bp peak read lengths, right? Illumina reads are getting longer, but 250 bp is for now the limit (notwithstanding the 300 bp run done by the Broad). IonTorrent promises 400 bases, but we will have to see what the quality is going to be. And PacBio, well, we all know their reads are long, but with relatively low throughput and quality peaking at 85-86% accuracy (useful nonetheless).

Commercially, the GS FLX+ has been around for more than half a year. So far, the community is reporting some success, see this thread at SeqAnswers. But there are problems all around when you talk to people. Our own centre (the Norwegian Sequencing Centre) got the upgrade in August. I’ll spare you the details of the numerous control/test fragment runs, short read runs etc., but the bottom line is that we still haven’t been able to successfully sequence an FLX+ library on our GS FLX+. Right now, we are having a service visit, and I intend to chain the guy to the instrument until we see some really good data from one of our own libraries…

There is another thing that surprises me: I am attending PAGXX, the Plant and Animal Genome conference in San Diego, where Roche is one of the main sponsors. However, when you look at the little text they have as an exhibitor, well, let me just quote part of it for you (source):

“Roche Applied Science introduces the latest innovations in reagents and instruments for genomics research with the launch of our new Titanium reagents generating 400 to 500 bp sequencing read lengths using 454’s ultra-fast pyrosequencing technology for whole genome sequencing projects.”

What? ‘Latest innovations’, ‘our new Titanium reagents’, ‘400 to 500 bp’? Nothing about the GS FLX+, 700-800 bp, let alone 1000 bp reads? Surely this is a marketing glitch, right, just somebody who forgot to update the standard text? Well, today I passed the booth Roche has here at PAGXX. Sure, there was the obligatory GS FLX instrument. But, to my amazement, it was a GS FLX, NOT a GS FLX+! You can tell because the loading bay for reagents is bigger on the GS FLX+, and it says ‘GS FLX+’ on it as well. The exhibited instrument had neither.

Should we read anything into this? Is there no pride at Roche in their uniquely long reads? Surely there is: they have talks about the longer reads, for example here and here, and a workshop with GS FLX+ in the title: “De Novo Assembly of Complex Genomes: The GS FLX+ System Makes a Difference.”

The buzz at PAGXX is not very positive on the GS FLX+ either. For example, I talked to a core facility head who complained at length about how their machine didn’t want to run GS FLX+ chemistry after the upgrade either. They repeatedly had to have it fixed, or even exchanged. And apparently, they are not the only ones.

I have been, and still am, a big believer in the 454 technology, and have used it successfully for my research. Our core facility has great experience with the instruments and our users are generally very happy with the data. But the current developments worry me. The niche for 454 has always been the longest possible high-throughput (next-gen) reads. But we, and others, have seen fewer shotgun projects last year, in favor of long amplicon sequencing, due to the much higher price of 454 sequencing relative to Illumina (and SOLiD). Also, IonTorrent is catching up on read length, with 400 bp coming this year (seeing is believing, though). The PGM, and especially the upcoming Proton instrument, has a throughput that is staggering when you’re used to 454. And the newly announced upgrade to the MiSeq later this year will get that platform up to 2×250 bp paired-end sequencing. This means that a 400-450 bp PCR product, currently only sequenceable on the 454, can soon easily be done on the MiSeq, at lower cost and higher throughput.
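To put the MiSeq point into numbers, here is a quick back-of-the-envelope sketch (my own illustration, using the read and amplicon lengths mentioned above):

```python
# Back-of-the-envelope check: can 2x250 bp MiSeq paired-end reads span a
# 400-450 bp amplicon? Numbers are for illustration only.
read_length = 250        # bp per read in a 2x250 PE run
amplicon_length = 450    # bp, upper end of the amplicon range above

overlap = 2 * read_length - amplicon_length
print(f"Read-pair overlap: {overlap} bp")  # 50 bp
# With ~50 bp of overlap the two reads can be merged into a single
# full-length amplicon sequence, so such products no longer need 454.
```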

This means that, in principle, the upgrade to the GS FLX+ could reinvigorate interest in this platform, for example for de novo transcriptome and genome sequencing. But the outlook is bleak, at best. Unless the GS FLX+ becomes stable all over the world, and reports on the results start becoming very positive, 454 is very quickly losing its niche to the competitors. To me, it doesn’t look like Roche is taking this situation very seriously. This makes me wonder whether they, too, have perhaps lost faith in 454?

EDIT:

Check out the comments below, and this discussion at SeqAnswers.


9 thoughts on “What is going on with 454 GS FLX+ sequencing?”

  1. Any idea what the problem is? I mean, it seems that if the chemistry is sufficiently tightened up, you’d just run the machine for more cycles – but clearly it’s not that simple.

  2. Interesting post!
    Maybe 454 should focus more on increasing the number of reads produced instead?
    I wonder if this is more important than read length?

    • Yes and no: read length is what makes 454 unique (at least when the reads are over 400 bp), but there is definitely a need for much higher throughput per run.

  3. Roche, as you know it, just sat on it while its own boys moved to Ion to create the OneTouch. How could they sit on such a bad library protocol all these years? They just milked the cow. Now they are stretching a machine with nothing but longer fluidics, and the specs are non-existent! Sounds like a total meltdown to me. Then again, if it wasn’t for IonTorrent, Life Tech would be exactly the same with their last SOLiD or 5500, probably the worst product intro on the market since the AB 1100 or genospec or whatever it was that never came. Some product managers need to go live in reality for a year or two and drop their MBA bullshit.

  4. Hi guys,
    since our FLX+ upgrades were installed in September 2011, we have continuously had problems with the new FLX+ chemistry.

    Current Roche support says: FLX+ works for the shotgun (SG) library protocol only, and for a fragment size of exactly 1600 bp, period. This does not match any experience so far with this technology. But it is not only the shortened read lengths with respect to the promised values (or the announced modal read length that cannot be reproduced) that count; more serious are the artefacts you will find in the sequences when you compare back-stored libraries of excellent quality sequenced with both Titanium and FLX+ chemistry.

    We have intensively tested several different kit lots, but the problems do not change, and the chemistry is not getting stable at all. Initially Roche told us it was a system issue, but this was not the reason.

    According to Roche, some kits were released that had minor quality issues, the release being forced by the high number of orders; meanwhile, we no longer believe this. We think the FLX+ had to be launched because of the severe pressure in the market caused by Illumina’s constant updates. The product simply has not been finalised. And of course you need to know that Roche is currently changing production sites and restructuring because of costs; guess why the kits do not work as initially promoted…
    fredooo

  5. So, two things.
    1. History repeats itself: when they first introduced the Titanium (Ti1) chemistry after the FLX, the raw read error rate jumped ~2-4 times (dephasing), while the quality trimming was not very good in the 2.0X.XX releases, leaving a lot of junk in the “clear range” of the SFF output.
    b. I needed to add cumulative error filtering to my sff2phd importer to be able to assemble the data with phrap et al. (see the sketch below this list). Fortunately things had improved a bit with Titanium2 + betaine (only to fall again with the initial FLX+, though it is now gradually going up).
    c. The stability of their emPCR kits (quality is very variable between some versions of them).
    d. Consumables were/are too expensive for 454, which has dramatically reduced their market share. Some really dodgy marketing in action, too. Now we have the results…
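    For readers who haven’t met the idea in point b: below is a minimal sketch of what cumulative-error trimming of a read could look like, based on per-base Phred qualities. The function name and the threshold are illustrative assumptions, not the commenter’s actual sff2phd code.

```python
def trim_by_cumulative_error(seq, quals, max_expected_errors=1.0):
    """Trim a read where the cumulative expected error exceeds a threshold.

    seq   : read sequence (string)
    quals : per-base Phred quality scores (list of ints)
    The expected error of a base with Phred score Q is 10**(-Q/10);
    the read is cut at the point where the running sum passes the limit.
    """
    cumulative = 0.0
    for i, q in enumerate(quals):
        cumulative += 10 ** (-q / 10.0)
        if cumulative > max_expected_errors:
            return seq[:i], quals[:i]
    return seq, quals

# Example: a read whose tail qualities collapse gets clipped there.
seq = "ACGTACGTACGT"
quals = [35, 34, 33, 30, 28, 25, 20, 12, 8, 5, 3, 2]
trimmed_seq, trimmed_quals = trim_by_cumulative_error(seq, quals)
print(trimmed_seq, len(trimmed_seq))  # the last low-quality bases are removed
```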

    2. ### Improving the signal of the FLX+ ###
    Make sure your library is on the shorter side (between 800-1000 bp), but be sure to eliminate all fragments under 500 bp; also keep the molar ratio with the adapters in check during ligation and emPCR setup (to avoid chimeras). A small worked molar-ratio example follows after this comment.
    Strong-signal, good-quality reads that run into the opposite sequencing adaptor are WAY better than low-signal, high-noise reads that run for 150-200 bp and then terminate due to low quality.
    If the improvement is not sufficient, cut back to 600-800 bp. Use gel-cut size selection to get a sharper peak. Don’t overload the gel lane (use thick/long combs) to improve the separation of the short fragments.
    There is no point in struggling with >1 kb fragments in the emPCR if you get a 150-250 bp average read length in your FLX+ run… In that case the good old FLX library prep will be better 🙂

    PPS: has anybody tried making Roche’s emPCR using the OneTouch system?
    (use Roche’s library/primers/beads/betaine with IonTorrent’s oil).
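    To make the molar-ratio advice in point 2 concrete, here is a small sketch (my own illustration with made-up numbers, not part of the commenter’s protocol) that converts nanograms of double-stranded DNA into picomoles so that an adapter:fragment ratio can be set explicitly, using ~660 g/mol as the average mass of a base pair.

```python
def dsdna_pmol(nanograms, length_bp, bp_mass=660.0):
    """Convert a mass of double-stranded DNA to picomoles.

    nanograms : input mass in ng
    length_bp : average fragment length in bp
    bp_mass   : average molar mass of one base pair (~660 g/mol)
    pmol = ng * 1000 / (length_bp * bp_mass)
    """
    return nanograms * 1e3 / (length_bp * bp_mass)

# Example with made-up numbers: 500 ng of a 900 bp library, and the amount
# of adapter needed for an (assumed) 10:1 adapter:fragment molar ratio.
fragment_pmol = dsdna_pmol(500, 900)   # ~0.84 pmol of fragments
adapter_pmol = 10 * fragment_pmol      # assumed 10:1 molar excess of adapters
print(f"fragments: {fragment_pmol:.2f} pmol, adapters: {adapter_pmol:.2f} pmol")
```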
