“DDN’s unique ability to handle tough application I/O profiles at speed and scale gives weather and climate organizations the infrastructure they need for rapid, high-fidelity modeling,” said Laura Shepard, senior director of product marketing, DDN. “These capabilities are essential to DDN’s growing base of weather and climate organizations, which are at the forefront of scientific research and advancements – from whole-climate atmospheric and oceanic modeling, to hurricane and severe weather emergency preparedness, to the use of revolutionary new high-resolution satellite imagery in weather forecasting.”
“Deploying DDN’s end-to-end storage solution has allowed us to elevate the standard of protection, increase compliance and push the boundaries of science on a single, highly scalable storage platform,” said Ramjan. “We’ve also saved hundreds of thousands of dollars by centralizing the storage of our data-intensive research and a dozen data-hungry scientific instruments on DDN. With all these advantages, it is easy to see why DDN is core to our operation and a major asset to our scientists.”
“STFC Hartree Centre needed a powerful, flexible server system that could drive research in energy efficiency as well as economic impact for its clients. By extending its System x platform with NeXtScale System, Hartree Centre can now move to exascale computing, support sustainable energy use and help its clients gain a competitive advantage.” Sophisticated data processes are now integral to all areas of research and business. Whether you are just discovering the potential of supercomputing, data analytics and cognitive techniques, or are already using them, Hartree’s easy-to-use portfolio of advanced computing facilities, software tools and know-how can help you achieve better research outcomes faster and at lower cost than traditional research methods.
Today DDN announced that Yahoo! JAPAN has deployed an active archive system jointly developed by DDN and IBM Japan. The new system allows Yahoo! JAPAN to cache dozens of petabytes of data from its OpenStack Swift storage solution in a Japan-based data center and to transfer data to a U.S.-based data center at a rate of 50 TB per day – cutting energy costs by 74 percent thanks to lower energy rates in the United States than in Japan, while ensuring fast data access regardless of location.
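For scale, 50 TB per day works out to a sustained transfer rate of roughly 580 MB/s, or about 4.6 Gbit/s. A minimal back-of-the-envelope check in Python (the 50 TB/day figure comes from the announcement; decimal terabytes are assumed for illustration):

# Sustained bandwidth implied by 50 TB/day (decimal terabytes assumed).
bytes_per_day = 50 * 10**12
seconds_per_day = 24 * 60 * 60

bytes_per_second = bytes_per_day / seconds_per_day
gigabits_per_second = bytes_per_second * 8 / 10**9

print(f"{bytes_per_second / 10**6:.0f} MB/s (~{gigabits_per_second:.1f} Gbit/s)")
# -> 579 MB/s (~4.6 Gbit/s)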
Designed specifically with researchers in mind, the Birmingham Environment for Academic Research (BEAR) Cloud will augment an already rich set of IT services at the University of Birmingham and will be used by academics across all disciplines, from Medicine to Archaeology and from Physics to Theology. “We are very proud of the new system, but building a research cloud isn’t easy,” said Simon Thompson, Research Computing Infrastructure Architect in IT Services at the University of Birmingham. “We challenged a range of carefully selected partners to provide the underlying technology.”
“The University’s researchers are making landmark discoveries in fields spanning human heritable disease, cancer, agriculture and biofuels manufacture – and they depend on our IT team to provide them with the fastest, most efficient data storage and compute systems to support their data-heavy work,” said Professor David Abramson, University of Queensland Research Computing Center director. “Our IBM, SGI (DMF) and DDN-based data fabric allows us to deliver ultra-fast multi-site data access without requiring any extra intervention from researchers and helps us to ensure our scientists can focus their time on potentially life-saving discoveries.”
In this special guest feature from Scientific Computing World, Shailesh M Shenoy from the Albert Einstein College of Medicine in New York discusses the challenges faced by large medical research organizations in the face of ever-growing volumes of data. “In short, our challenge was that we needed the ability to collaborate within the institution and with colleagues at other institutes – we needed to maintain that fluid conversation that involves data, not just the hypotheses and methods.”
OCF in the U.K. recently deployed a new Fujitsu HPC cluster at the University of East Anglia. As the University’s second new HPC system in four years, the cluster can be easily scaled and expanded in the coming months through a framework agreement to match rapidly increasing demand for compute power.
A partnership of seven leading bioinformatics research and academic institutions called eMedLab is using a new private cloud, HPC environment and big data system to support the efforts of hundreds of researchers studying cancers, cardiovascular diseases and rare diseases. Their research focuses on understanding the causes of these diseases and how a person’s genetics may influence their predisposition to them and their potential responses to treatment.
“Ngenea’s blazingly fast on-premises storage stores frequently accessed active data on the industry’s leading high-performance file system, IBM Spectrum Scale (GPFS). Less frequently accessed data, including backups, archival data and data targeted to be shared globally, is directed to cloud storage based on predefined policies such as age, time of last access, frequency of access, project, subject, study or data source. Ngenea can direct data to specific cloud storage regions around the world to facilitate low-latency remote data access and empower global collaboration.”
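The placement logic described above – hot data kept on the parallel file system, cold or globally shared data pushed out to a cloud region – can be illustrated with a short sketch. The following Python is a hypothetical policy written for illustration only; the function name, field names, thresholds and tier labels are assumptions, not Ngenea’s actual configuration or API.

from datetime import datetime, timedelta, timezone

def choose_tier(last_access: datetime, accesses_last_30d: int,
                shared_globally: bool) -> str:
    """Hypothetical tiering rule in the spirit of the policies described above."""
    if shared_globally:
        return "cloud"                      # data targeted for global sharing
    age = datetime.now(timezone.utc) - last_access
    if age > timedelta(days=90) or accesses_last_30d == 0:
        return "cloud"                      # cold data moves to the archive tier
    return "spectrum_scale"                 # active data stays on GPFS

# Example: a dataset untouched for 200 days would be directed to cloud storage.
print(choose_tier(datetime.now(timezone.utc) - timedelta(days=200), 0, False))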