As exponential data growth reshapes industry, engineering, and scientific discovery, success has come to depend on the ability to analyze and extract insight from extremely large data sets. Exascale computing will allow us to process data, run systems, and solve problems at an entirely new scale, which will become vitally important as problems grow ever larger and more difficult. Our unmatched ability to bring new technology to the mainstream will provide systems that are markedly more affordable, usable, and efficient at handling growing workloads. To learn more, download this white paper.
Using commodity hardware and the “plug-and-play” NumaConnect interconnect, Numascale delivers true shared memory programming and simpler administration at standard HPC cluster price points. Download this white paper to learn more.
When the National Center for Supercomputing Applications (NCSA) was created at the University of Illinois 27 years ago, it had a unique proposition—its computing, data, and networking resources were designed for industry as well as academia. Over the years, NCSA’s efforts to serve industry have grown and matured. Download this white paper to learn more.
The Central Processing Unit (CPU) has been at the heart of High Performance Computing (HPC) for decades. In recent years, however, advances in parallel processing technology have changed the landscape dramatically. To learn more, download this white paper.
Big data applications represent a fast-growing category of high-value applications that are increasingly employed by business and technical computing users. However, they have exposed an inconvenient dichotomy in the way resources are utilized in data centers. To learn more, download this white paper.
Supercomputers may date back to the 1960s, but it is only recently that their vast processing power has begun to be harnessed by industry and commerce, to design safer cars, build quieter aeroplanes, speed up drug discovery, and subdue the volatility of the financial markets. The need for powerful computers is growing, says Catherine Rivière […]
This paper introduces development work being undertaken at Cambridge to create a new state-of-the-art omics analysis hardware and software platform utilizing the open-source software framework OpenCB.
This paper will look at how remote 3D virtual desktop technology can be applied in two usage modes within the scientific and technical computing arena. First, as an HPC remote visualisation platform deployed to provide real-time graphical access to large-scale data sets stored within the HPC data center. Second, to provide a full 3D-capable virtual workstation solution as a replacement for traditional fixed under-desk workstations.
2013 has been an exciting year for the field of Statistics and Big Data, with the release of the new R version 3.0.0. We discuss a few topics in this area, providing toy examples and supporting code for configuring and using Amazon’s EC2 Computing Cloud. There are other ways to get the job done, of course. But we found it helpful to build the infrastructure on Amazon from scratch, and hope others might find it useful, too.
When it comes to generating ever-larger data sets and stretching the limits of high performance computing (HPC), the field of genomics and next generation sequencing (NGS) is at the forefront.
The major impetus for this data explosion began in 1990 when the U.S. kicked off the Human Genome Project, an ambitious effort to sequence the three billion base pairs that constitute the complete set of human DNA. Eleven years and $3 billion later, the deed was done. This breakthrough was followed by a massive upsurge in genomics research and development that included rapid advances in sequencing using the power of HPC. Today an individual’s genome can be sequenced overnight for less than $1,000.