Data-Intensive Problems to Shift Course of Supercomputing


Science Daily is reporting that a ‘Data Deluge’ is resulting in a paradigm shift in supercomputer architectures. In a presentation during the 3rd Annual La Jolla Research & Innovation Summit this week, SDSC Director Michael Norman said that the amount of digital data generated just by instruments such as DNA sequencers, cameras, telescopes, and MRIs is now doubling every 18 months.

“Digital data is advancing at least as fast, and probably faster, than Moore’s Law,” said Norman, referring to the observation that the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every 18 months. “But I/O (input/output) transfer rates are not keeping pace — that is what SDSC’s supercomputers are designed to solve.”
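
To get a feel for what an 18-month doubling time implies, the short sketch below (a hypothetical Python illustration, not part of the original report) projects data volume over a few years; the starting volume and time horizon are assumed values chosen only for illustration.

```python
# Rough illustration of 18-month doubling: volume(t) = volume_0 * 2 ** (t / 1.5)
# The starting volume and time span are assumed values, not figures from SDSC.

def projected_volume(initial_tb: float, years: float, doubling_years: float = 1.5) -> float:
    """Return the projected data volume after `years`, given a doubling period."""
    return initial_tb * 2 ** (years / doubling_years)

if __name__ == "__main__":
    start_tb = 100.0  # assumed starting volume in terabytes
    for year in range(0, 7, 2):
        print(f"year {year}: ~{projected_volume(start_tb, year):,.0f} TB")
```

Under these assumptions the volume grows roughly 16-fold over six years, which is why Norman argues that I/O rates, not raw compute, are the emerging bottleneck.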

Later this year, the San Diego Supercomputer Center will deploy a new data-intensive supercomputer system named Gordon, which will be the first high-performance supercomputer to use large amounts of flash-based SSD memory. Thanks to a five-year, $20 million grant from the NSF, Gordon will have 250 trillion bytes of flash memory and 64 I/O nodes, and will be capable of handling massive databases while delivering speeds up to 100 times faster than hard disk drive systems for some queries. Read the Full Story.
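
A back-of-the-envelope calculation shows where a speedup of that order can come from for random-access-heavy queries: flash cuts per-access latency by roughly two orders of magnitude compared with a spinning disk. The latency figures below are assumed ballpark values for illustration, not Gordon's published specifications.

```python
# Illustrative comparison of a random-I/O-bound query on spinning disk vs. flash SSD.
# Latency figures are assumed ballpark values, not measurements from Gordon.

HDD_SEEK_S = 10e-3    # ~10 ms per random access on a spinning disk (assumption)
SSD_READ_S = 0.1e-3   # ~0.1 ms per random read on flash (assumption)

def query_io_time(random_accesses: int, per_access_s: float) -> float:
    """Total time spent on random I/O for a query issuing `random_accesses` reads."""
    return random_accesses * per_access_s

accesses = 1_000_000  # hypothetical query dominated by random reads
hdd = query_io_time(accesses, HDD_SEEK_S)
ssd = query_io_time(accesses, SSD_READ_S)
print(f"HDD: {hdd:.0f} s, SSD: {ssd:.0f} s, speedup ~{hdd / ssd:.0f}x")
```

With these assumed latencies the flash-based system finishes the I/O roughly 100 times faster, consistent with the kind of query speedups the article describes for Gordon.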