LLNL Collaboration to Improve Cancer Screening

Computer scientists at LLNL are collaborating with Norwegian researchers to apply high performance computing to the analysis of medical data to improve screening for cervical cancer. According to LLNL's Abdulla, the team is developing a flexible, extendable model that incorporates new data such as other biomolecular markers, genetics, and lifestyle factors to individualize risk assessment. "We want to identify the optimal interval for screening each patient."
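
The article does not describe the model's internals. As a purely illustrative sketch, assuming a simple logistic-style risk score over hypothetical binary features (not the LLNL/Norwegian team's actual method), an individualized screening interval might be derived roughly like this:

```python
import math

# Hypothetical feature weights for a logistic-style risk score.
# Feature names and values are illustrative only, not the LLNL model.
WEIGHTS = {
    "hpv_positive": 1.6,       # biomolecular marker
    "abnormal_cytology": 1.2,  # prior screening result
    "smoker": 0.4,             # lifestyle factor
    "family_history": 0.5,     # genetic/family factor
}
BIAS = -3.0

def risk_score(patient: dict) -> float:
    """Map binary patient features to a 0-1 risk estimate via a logistic function."""
    z = BIAS + sum(w for k, w in WEIGHTS.items() if patient.get(k))
    return 1.0 / (1.0 + math.exp(-z))

def screening_interval_years(risk: float) -> int:
    """Translate a risk estimate into a suggested screening interval (illustrative thresholds)."""
    if risk > 0.5:
        return 1   # higher risk: screen more frequently
    if risk > 0.2:
        return 3
    return 5       # lower risk: longer interval

patient = {"hpv_positive": True, "smoker": True}
r = risk_score(patient)
print(f"risk={r:.2f}, suggested interval={screening_interval_years(r)} years")
```

In practice, a model like the one described would be fit to large volumes of screening records on HPC systems rather than relying on hand-picked weights as above.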

General Atomics Releases Next-Gen Nirvana Data Management Software

Today General Atomics announced the next generation of Nirvana, its metadata, data placement, and data management software for the most demanding workflows in Life Sciences, Scientific Research, Media & Entertainment, and Energy Exploration. According to the company, "Nirvana 5.0 reduces storage costs up to 75% by turning geographically dispersed, multiple vendor storage silos into a single global namespace that automatically moves infrequently-accessed data to lower-cost storage or to the cloud."
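
The announcement does not detail Nirvana's policy engine. As a minimal sketch of the general idea, assuming hypothetical mount points and a simple age-based rule (not Nirvana's actual API), automated tiering of infrequently accessed data could look like this:

```python
import os
import shutil
import time

# Illustrative tiering sketch only; paths and the age-based rule are assumptions.
HOT_TIER = "/mnt/fast"       # hypothetical high-performance tier
COLD_TIER = "/mnt/archive"   # hypothetical low-cost tier (e.g., a cloud gateway mount)
AGE_THRESHOLD_DAYS = 90      # move files untouched for this long

def tier_cold_files(hot_root: str, cold_root: str, age_days: int) -> None:
    """Move files whose last access time exceeds the threshold to the cold tier,
    preserving relative paths so a single namespace view can be reconstructed."""
    cutoff = time.time() - age_days * 86400
    for dirpath, _dirnames, filenames in os.walk(hot_root):
        for name in filenames:
            src = os.path.join(dirpath, name)
            if os.path.getatime(src) < cutoff:
                rel = os.path.relpath(src, hot_root)
                dst = os.path.join(cold_root, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)

if __name__ == "__main__":
    tier_cold_files(HOT_TIER, COLD_TIER, AGE_THRESHOLD_DAYS)
```

A production data manager such as Nirvana would track access metadata in a catalog and present a unified global namespace across sites and vendors, rather than physically walking a file tree as this sketch does.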

BSC Lays out European RETHINK Big Roadmap

Today the Barcelona Supercomputing Center presented a big data roadmap that it has coordinated on behalf of the European Commission. Developed under the RETHINK Big Project, the roadmap was presented at the Big Data Congress, where BSC used it to highlight the need for Europe to carry out research on new hardware and software solutions for big data.

Supermicro Rolls Out New Servers with Tesla P100 GPUs

“Our high-performance computing solutions enable deep learning, engineering, and scientific fields to scale out their compute clusters to accelerate their most demanding workloads and achieve the fastest time-to-results with maximum performance per watt, per square foot, and per dollar,” said Charles Liang, President and CEO of Supermicro. “With our latest innovations incorporating the new NVIDIA P100 processors in performance- and density-optimized 1U and 4U architectures with NVLink, our customers can accelerate their applications and innovations to address the most complex real-world problems.”

MathWorks Release 2016b Makes it Easier to Work with Big Data

“Companies are awash in data, but struggle to take advantage of it to build better predictive models and gain deeper insights,” says David Rich, MATLAB marketing director, MathWorks. “With R2016b, we’ve lowered the bar to allow domain experts to work with more data, more easily. This leads to improved system design, performance, and reliability.”

Interview: Numascale to Partner with OEMs on Big Memory Server Technology

Hailing from Norway, big-memory appliance maker Numascale has been a fixture at the ISC conference since the company’s formation in 2008. At ISC 2016, Numascale was noticeably absent from the show and the word on the street was that the company was retooling their NumaConnect™ technology around NVMe. To learn more, we caught up with Einar Rustad, Numascale’s CTO.

Big Workflow: More than Just Intelligent Workload Management for Big Data

Big data applications represent a fast-growing category of high-value applications that are increasingly employed by business and technical computing users. However, they have exposed an inconvenient dichotomy in the way resources are utilized in data centers. To learn more, download this white paper.

Research for New Technology Using Supercomputers

This paper presents our approach to research and development for four applications in which simulation on super-large-scale computing systems is expected to prove useful.

OpenCB: Next Generation Big Data Analytics

This paper introduces development work being undertaken at Cambridge to create a new state-of-the-art omics analysis hardware and software platform built on the open source OpenCB software framework.

Remote 3D Visualisation VDI Technologies

This paper looks at how remote 3D virtual desktop technology can be applied in two usage modes within the scientific and technical computing arena. First, as an HPC remote visualisation platform deployed to provide real-time graphical access to large-scale data sets stored within the HPC data center. Second, as a full 3D-capable virtual workstation solution that replaces traditional fixed under-desk workstations.