Next-Generation Sequencing

Is this the Holy Grail of clinical diagnostics?

Next-generation sequencing (NGS) has been touted as the game-changing technology for the field of diagnostics and personalized medicine. But will this come to fruition, and if so, when?

In this article, we review the current status of NGS technology and instruments and consider if the likely transition from NGS to third-generation techniques will be the catalyst for wider adoption of sequencing technology in clinical diagnostics. We will also discuss the main barriers that NGS faces, such as standardization and regulatory pathways.

At the time of writing, more than 2,500 NGS systems have been placed into laboratories, typically academic and industrial research centers. Many of these laboratories have selected the Illumina HiSeq, Roche 454 GS FLX, or ABI SOLiD platforms, all of which deliver very high sequencing output per run. However, as the HiSeq and SOLiD platforms take approximately 10 days to complete a run, many researchers have recently opted for smaller benchtop instruments such as Life Technologies’ Ion Torrent platform or Illumina’s MiSeq system. These devices are relatively low cost, better suited to a mid-sized academic research facility, and offer a rapid turnaround time, measured in hours rather than days. While much of this sequencing activity is still based in research settings, the reduced turnaround time makes these smaller instruments more appropriate for clinical diagnostics.

Third-generation systems that analyze single molecules offer a potential advantage over NGS, or second-generation systems, in that they do not require the time-consuming DNA amplification step. The vastly reduced turnaround times and lower costs of these technologies hold huge appeal for clinical diagnostics. For example, Oxford Nanopore’s larger GridION system will supposedly be able to sequence an entire human genome in as little as 15 minutes, given appropriate computing power. Furthermore, these systems should be better suited to generating longer reads and to detecting epigenetic modifications directly, meaning they may be well suited to applications such as oncology.

Currently, it is hard to imagine third-generation techniques leading the way in clinical settings, as companies in this space continue to face challenges. For example, Oxford Nanopore missed its 2012 deadline to launch the disposable MinION system, and no further announcement has been made since; Life Technologies has put its third-generation system on hold; and Pacific Biosciences’ Single Molecule, Real-Time technology was initially met with skepticism amid reports of error-prone results.

Looking forward, sequencing may well become a routine part of clinical medicine and somewhat commoditized. At that point, disruptive innovation is more likely to focus on sample preparation, throughput, and bioinformatics than on the core detection technology. Although we have yet to see a commercial third-generation sequencing platform, we believe this to be an exciting space, with many firms developing alternative sequencing techniques and associated hardware for direct detection of bases. Arguably, though, it is not continued development of the core sequencing technology itself that is required to ensure widespread clinical adoption.


Clinical Adoption and Interpretation
There is little debate that the use of sequencing will play a crucial role in personalized medicine in the near future. While cost and time have been significant barriers to clinical sequencing to date, industry competition and technology development have led to these barriers being steadily lowered. NGS technology can already be used to sequence a whole genome, an exome, or a focused clinical gene panel, and we are now seeing third-generation technologies starting to do the same without prior amplification. However, there are still some significant obstacles to clinical adoption that need to be overcome.

Few physicians are trained geneticists, so access to genomic sequencing data alone will not necessarily result in optimized healthcare. To make the best use of genomic data, a multidisciplinary team including pathologists, biologists, physicians, and software specialists would be required. On the other hand, plenty of physicians would not question the reactions and mechanisms involved in an electrochemical test providing a coagulation parameter such as PT/INR. They simply look at the result and consider where it lies within a clinical range. This analogy could surely be extended to genomics, and a challenge remains to distill clinically relevant information from sequencing data into a format that all physicians will welcome. As with other clinical diagnostic tests, genetic tests need to return a probabilistic measure that a particular condition either is or will be present, perhaps including a potential treatment course in the case of companion diagnostic tests.
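The probabilistic reporting described above is standard diagnostic arithmetic: a pre-test probability and a test's likelihood ratio combine, via odds, into a post-test probability. As a minimal sketch (the 10% prevalence and likelihood ratio of 9 are illustrative values, not figures from any real assay):

```python
def post_test_probability(pre_test_prob, likelihood_ratio):
    """Combine a pre-test probability with a test's likelihood ratio
    using the odds form of Bayes' rule."""
    pre_odds = pre_test_prob / (1 - pre_test_prob)   # probability -> odds
    post_odds = pre_odds * likelihood_ratio          # update with test result
    return post_odds / (1 + post_odds)               # odds -> probability

# Illustrative: 10% pre-test probability, positive result with LR+ = 9
print(post_test_probability(0.10, 9))  # → 0.5
```

A report built this way gives the physician a single interpretable number, much like the PT/INR example, rather than raw sequence data.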


Standardization/Quality

For physicians to be comfortable using genetic testing, the workflow and the way in which results are presented should ideally be standardized. Some sequencing platforms ship with their own proprietary software, while individual researchers may choose open-source bioinformatics tools or even develop their own code for data analysis. This means a wide range of output methods is available to display the sequence, any abnormalities identified, and so on. The output must be displayed in an appropriate format to provide a clinical benefit. Similarly, a standard data storage format would be useful if genetic data is to form part of a patient’s health record (it would also be a great help if electronic health records were standardized worldwide).

NGS systems typically provide output data in FASTQ format, a text-based format listing reads of a few hundred bases alongside a per-base quality score. Bioinformatics software is then used to assemble these reads into a complete sequence, and it is essential that appropriate QA metrics are included if sequencing is to become commonplace in clinical diagnostics. The FDA has started to support the process by establishing the MicroArray Quality Control (MAQC) project. Having initially addressed QC tools, metrics, and thresholds for investigating microarray tools (MAQC-1) – as well as considering data analysis methods and validating predictive models (MAQC-2) – the project is now generating reference samples so that different NGS technologies can be evaluated (MAQC-3).
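To make the format concrete, here is a minimal sketch of reading four-line FASTQ records and computing a mean quality score. It assumes the common Phred+33 ("Sanger"-style) quality encoding; real pipelines must also handle other encodings and malformed records:

```python
def read_fastq(lines):
    """Yield (read_id, sequence, quality) tuples from FASTQ text,
    assuming the standard four-line record layout."""
    it = iter(lines)
    for header in it:
        seq = next(it).strip()
        next(it)                    # the '+' separator line
        qual = next(it).strip()
        yield header.strip().lstrip("@"), seq, qual

def mean_phred(qual, offset=33):
    """Mean Phred quality score, assuming Phred+33 encoding."""
    return sum(ord(c) - offset for c in qual) / len(qual)

# Toy single-record example (not real sequencer output)
record = "@read1\nACGTACGT\n+\nIIIIIIII\n".splitlines()
for rid, seq, qual in read_fastq(record):
    print(rid, len(seq), mean_phred(qual))  # → read1 8 40.0
```

Simple per-read metrics like this underpin the QA thresholds (e.g., minimum mean quality) that a clinical pipeline would need to report in a standardized way.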


Computing Infrastructure

Assuming no loss in test quality, the faster a test can deliver a result, the more useful it will be in a clinical setting. While a benchtop system might typically deliver <50MB from sequencing a human exome, high-end NGS systems might output as much as 600GB from a single run when sequencing whole genomes. This requires considerably more computational power. Because individual reads are largely independent of one another, much of the sequence assembly work can be done in parallel, so a single multi-core processor or a cluster of computer processors can be used to reduce the time taken. While world-leading genomics research centers will have impressive server rooms, this is unrealistic for a smaller research setting or hospital. A rule of thumb is that for every $1 spent on sequencing instrumentation, the same amount should also be spent on computer processing power. Beyond the expense, this sort of super-computing capacity takes physical space.
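The parallelism described above follows directly from read independence: a per-read computation can be mapped across worker processes with no coordination between reads. A minimal sketch using Python's standard multiprocessing pool, with a toy GC-content metric standing in for the far heavier alignment and assembly steps:

```python
from multiprocessing import Pool

def gc_fraction(read):
    """GC content of one read. Reads are independent, so this maps
    cleanly onto a pool of worker processes."""
    return sum(base in "GC" for base in read) / len(read)

if __name__ == "__main__":
    # Toy stand-in for the millions of reads in a real run
    reads = ["ACGT", "GGCC", "ATAT", "GCGC"]
    with Pool(processes=4) as pool:
        fractions = pool.map(gc_fraction, reads)
    print(fractions)  # → [0.5, 1.0, 0.0, 1.0]
```

The same map-style decomposition is why cluster and cloud resources scale well for sequence analysis: throughput grows roughly with the number of processors until I/O becomes the bottleneck.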

An alternative approach is commercial cloud computing, whereby access to a cluster of computers can be purchased on a pay-as-you-go basis. However, this requires the transfer of patient data across the Internet. Given that genomic data can on occasion be measured in terabytes, the most practical solution might be to courier a physical hard drive of the data to the cloud provider. This does not address concerns over patient confidentiality, however, and the use of cloud computing may well fall foul of HIPAA guidelines.


Regulatory Environment

The use of DNA-based tests is not new to the world of in vitro diagnostic (IVD) testing. The FDA has approved more than 200 tests, which are primarily single-gene tests with clearly associated clinical diagnoses. In contrast, the majority of diseases are a function of highly complex interactions between multiple genes, and the associated tests are more prone to errors and incidental findings. As a result, achieving regulatory approval along the path of conventional IVD assays is particularly challenging because the assessment of reliability, accuracy, and safety is considerably more complicated. The FDA is already examining molecular diagnostic instruments with combined functions, i.e., devices used both in formal IVD tests and for research, issuing draft guidance in April 2013. There are also moves to increase the regulation of laboratory-developed tests (LDTs) – diagnostic tests developed and performed within a single laboratory, a category that already includes genetic tests. The FDA is working with industry representatives, but has yet to fully address LDTs that incorporate NGS technologies.

The slow and unclear regulatory pathway is one aspect that is stifling the adoption of NGS in diagnostic testing. Against a backdrop of rapidly changing technology, which is still in its infancy, significant barriers are preventing companies from receiving regulatory approval. Tests are specific to devices, and the highly multiplexed nature of genetic sequencing creates additional complexities for the regulatory bodies with regard to test and device validation. Guidelines will need to be established that accommodate the different platforms and types of tests.

The approach sought by most companies is to target specific assays and achieve approval before expanding the menu around the same test. This fits the known and established IVD regulatory process, but it is clear that a new, more flexible approach is required – one that allows the technology to evolve in both bioinformatics analysis and assay development while acknowledging the range of platforms and applications.


Looking to the Future
The opportunities for genetic testing using current and future technologies are huge. The understanding the technology is providing to scientists is opening up new areas of treatment that will be based on genetic sequencing diagnostic tests. From a commercial perspective, sequencing technologies are well suited to widespread screening tests and are hugely attractive in this high-volume, low-margin industry. NGS technology still has some way to go before it is used routinely in a clinical setting, but it appears that these second-generation systems will provide the basis for increasing uptake of diagnostic testing, as the near-term promise of third-generation systems has been overstated.

Although there are still barriers to overcome, extensive collaboration among device manufacturers, patients, clinicians, regulatory bodies, reimbursement organizations, and national healthcare providers will enable sequencing to be used extensively as a clinical diagnostic tool, and will allow the industry to make the most of the great opportunities in this space.


Sagentia Inc.
Boston, Mass.
www.sagentia.com


About the author: Paul Wilkins, MEng, is vice president, diagnostics, at Sagentia. His primary focus is the design and development of IVD consumables and point-of-care instruments. He can be reached at Paul.Wilkins@sagentia.com.

October 2013