
Univ. of Chicago’s Polsky Center Launches Venture Conference for Midwest Deep Tech Startups from Universities and National Labs

CHICAGO – The University of Chicago’s Polsky Center for Entrepreneurship and Innovation is pleased to announce the launch of DeepTechU, a venture conference showcasing deep tech innovation and 48 investor-ready companies from universities and national labs. The virtual conference will take place April 20-22, 2021, and feature quick pitches as well as discussions with industry experts and entrepreneurs. The conference […]

What May Come from Exascale? Improved Medicines, Longer-range Batteries, Better Control of 3D Parts, for Starters

As Exascale Day (Oct. 18) approaches, we thought it appropriate to post a recent article from Scott Gibson of the Exascale Computing Project (ECP), which offers an overview of the anticipated advances in scientific discovery enabled by exascale-class supercomputers. Much of this research will focus on atomic physics and its impact on such areas as catalysts used in industrial conversion, molecular dynamics simulations, and quantum mechanics used to develop new materials for improved medicines, batteries, sensors, and computing devices.

Department of Energy Awards Fermilab $3.5 Million for Quantum Science

The U.S. Department of Energy has awarded researchers at its Fermi National Accelerator Laboratory more than $3.5 million to boost research in the fast-emerging field of Quantum Information Science. “Few pursuits have the revolutionary potential that quantum science presents,” said Fermilab Chief Research Officer Joe Lykken. “Fermilab’s expertise in quantum physics and cryogenic engineering is world-class, and combined with our experience in conventional computing and networks, we can advance quantum science in directions that not many other places can.”

Building HPC Clusters as Code in the (Almost) Infinite Cloud

“Researchers can run one cluster for 10,000 hours or 10,000 clusters for one hour anytime, from anywhere, and both cost the same in the cloud. And with the availability of Public Data Sets in Amazon S3, petabyte-scale data is instantly accessible in the cloud. Attend and learn how to build HPC clusters on the fly, leverage Amazon’s Spot market pricing to minimize the cost of HPC jobs, and scale HPC jobs on a small budget, using all the same tools you use today, and a few new ones too.”
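The cost symmetry quoted above — one cluster for 10,000 hours versus 10,000 one-hour clusters — follows from cloud pricing being linear in node-hours. A minimal sketch, where the hourly rate is a hypothetical placeholder, not an actual AWS price:

```python
# Cloud compute cost is linear in node-hours: cost = nodes * hours * rate.
def cluster_cost(nodes: int, hours: float, rate: float) -> float:
    """Total cost in USD of running `nodes` instances for `hours` at `rate` USD per node-hour."""
    return nodes * hours * rate

RATE = 0.25  # hypothetical Spot price in USD per node-hour

# One single-node cluster running for 10,000 hours...
one_long_run = cluster_cost(nodes=1, hours=10_000, rate=RATE)

# ...versus 10,000 single-node clusters each running for one hour.
many_short_runs = 10_000 * cluster_cost(nodes=1, hours=1, rate=RATE)

print(one_long_run)     # 2500.0
print(many_short_runs)  # 2500.0
```

The same node-hour total yields the same bill either way, which is why scaling out (many clusters briefly) rather than up (one cluster for a long time) can shrink wall-clock time without raising cost.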

Video: High Performance Computing for the LHC

In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and of the enormous computing resources that make the LHC possible. “The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter.”