Challenges of Personalized Medicine
A number of challenges can impede both the wider adoption of the technologies that enable personalized medicine workflows and the implementation of such systems.
The perception exists that genome analysis can only be done on hundreds of nodes or an expensive supercomputer. However, optimized systems that combine the right hardware and software, architected by experts from leading vendors, can bring genomics analysis to a broad base of researchers and users. IT departments often look for an immediate ROI, or focus narrowly on the utilization of the compute/storage cluster. However, it is possible to start small and grow as needs grow. Careful planning for this expansion allows servers and storage to be added incrementally: as projects become more complex or the number of users increases, servers can be added to increase the overall capacity of the compute and storage cluster.
ROI for Small Use Cases
Small organizations that require significant IT resources may avoid updating their infrastructure to serve the needs of their users. Aside from uncertainty about whether a small initial system can scale as needs grow, these organizations may not have the staff to investigate alternatives or to pursue a piece-by-piece purchase path. They may be resigned to using their older technology rather than upgrading, out of fear of the IT unknown. However, if a turnkey solution were available that required minimal IT expertise, departments and smaller companies could take advantage of current and future technologies.
FDA and CLIA Compliance
FDA approval (compliance) is required for devices used to diagnose and treat patient diseases, and clinical uses must undergo a safety period. The FDA has in place a number of regulations and safety assurances that must be followed when working with patient health, such as certifications for the lab instruments, appliances, and technology used to facilitate patient care. These safety checks can be daunting, hence the need to work with experienced vendor services teams.
Patient data is obviously very valuable and must be kept secure. Genomics is no exception; indeed, it is even more important to devote security attention and resources to patient health record data. Special tools, processes, and products must be used for patient data, and they must comply with all federal requirements.
Clinicians using electronic medical records and imaging archives must follow procedures and protocols in order to comply with the Health Insurance Portability and Accountability Act (HIPAA) and with best practices. These practices include consultations with specialists and experts who may or may not be part of the existing IT organization. Because these records can assist in treatment or diagnosis, they must be accurate and quickly available to those involved in the treatment of patients.
Large volumes of data must be managed in a genomics solution. A single genome is approximately 200GB to 300GB. Even though the data is built from just four letters (the nucleotide bases A, C, G, and T), a single person's genome contains approximately 3 billion of these bases. The output of the sequencer is a very large data file that must be accessed, stored and acted upon. Analyzing the genome magnifies the need for nearby storage, scratch storage, archival storage and network bandwidth.
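To make these numbers concrete, the back-of-the-envelope sizing below sketches how the per-genome footprint translates into total cluster storage as a project grows. The 250GB figure is the midpoint of the 200GB to 300GB range cited above; the scratch and archive overhead factors are illustrative assumptions, not vendor guidance.

```python
# Rough storage sizing for a genomics cluster.
# Assumptions (labeled, not from the guide): scratch space during
# analysis roughly doubles the footprint, and one archival copy is kept.

GB_PER_GENOME = 250.0   # midpoint of the 200-300 GB per-genome range
SCRATCH_FACTOR = 2.0    # assumed scratch space relative to raw data
ARCHIVE_FACTOR = 1.0    # assumed single archived copy per genome

def storage_needed_tb(num_genomes: int) -> float:
    """Estimated total storage (TB, decimal) for a number of genomes."""
    per_genome_gb = GB_PER_GENOME * (1 + SCRATCH_FACTOR + ARCHIVE_FACTOR)
    return num_genomes * per_genome_gb / 1000.0  # GB -> TB

for n in (10, 100, 1000):
    print(f"{n:>5} genomes: ~{storage_needed_tb(n):,.0f} TB")
```

Under these assumptions each genome consumes about 1TB end to end, which is why a plan that starts small must still budget for incremental storage expansion alongside compute.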
Next week we will look at a few success stories in personalized medicine. If you prefer you can download the complete insideHPC Guide to Genomics in PDF form by clicking here, courtesy of Dell and Intel.