IDC has published the agenda for its next HPC User Forum. The event will take place April 11-13 in Tucson, AZ. “Don’t miss the chance to hear top experts on these high-innovation, high-growth areas of the HPC market. At this meeting, you’ll also hear about government initiatives to get ready for future-generation supercomputers, machine learning, and High Performance Data Analytics.”
Today Atos announced that the French CEA and its industrial partners at the Centre for Computing Research and Technology, CCRT, have invested in a new 1.4 petaflop Bull supercomputer. “Three times more powerful than the current computer at CCRT, the new system will be installed in the CEA’s Very Large Computing Centre in Bruyères-le-Châtel, France, mid-2016 to cover expanding industrial needs. Named COBALT, the new Intel Xeon-based supercomputer will be powered by over 32,000 compute cores and storage capacity of 2.5 Petabytes with a throughput of 60 GB/s.”
“It was indicated in my keynote this morning there are two really fundamental challenges we’re facing in the next two years in all sorts of computing – from supercomputers to cell phones. The first is that of energy efficiency. With the end of Dennard scaling, we’re no longer getting a big improvement in performance per watt from each technology generation. The performance improvement has dropped from a factor of 2.8x back when we used to scale supply voltage with each new generation, now to about 1.3x in the post-Dennard era. With this comes a real challenge for us to come up with architecture techniques and circuit techniques for better performance per watt.”
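As a back-of-the-envelope illustration of why this shift matters (a sketch based only on the per-generation factors quoted above, not on anything else from the talk), compounding 2.8x versus 1.3x over a handful of process generations opens an enormous gap in cumulative performance per watt:

```python
# Illustrative arithmetic only: compound the quoted per-generation
# perf/watt factors over several technology generations.

DENNARD_GAIN = 2.8       # quoted per-generation gain with supply-voltage scaling
POST_DENNARD_GAIN = 1.3  # quoted per-generation gain after Dennard scaling ended

def cumulative_gain(per_gen_factor: float, generations: int) -> float:
    """Compound performance-per-watt improvement over N generations."""
    return per_gen_factor ** generations

gens = 5
dennard = cumulative_gain(DENNARD_GAIN, gens)
post = cumulative_gain(POST_DENNARD_GAIN, gens)
print(f"After {gens} generations: {dennard:.0f}x vs {post:.1f}x")
# prints "After 5 generations: 172x vs 3.7x"
```

Five generations compound to roughly 172x under the old scaling regime but only about 3.7x today, which is the gap that new architecture and circuit techniques are being asked to close.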
Researchers at UCLA have created the first detailed computer simulation model of an injured human leg, complete with spurting blood. The simulation is designed to make training for combat medics more realistic. “To create the simulator model, researchers combined detailed knowledge of anatomy with real-life CAT scans and MRIs to map out layers of a human leg: the bone, the soft tissue containing muscle and blood vessels, and the skin surrounding everything. Then the design team applied physics and mathematical equations, fluid dynamics, and predetermined rates of blood flow from specific veins and arteries to simulate blood loss for wounds of varying sizes and severity.”
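To make the idea of predetermined per-vessel flow rates concrete, here is a highly simplified, hypothetical sketch of that kind of calculation: scale a nominal vessel flow rate by wound severity and integrate over time. The vessel names and rates are illustrative assumptions, not values from the UCLA model, which uses full fluid dynamics.

```python
# Hypothetical simplification: estimate cumulative blood loss from a
# damaged vessel using an assumed constant flow rate. The real UCLA
# simulator applies fluid dynamics; this only illustrates the concept.

FLOW_RATES_ML_PER_S = {   # assumed nominal flow rates (illustrative only)
    "femoral_artery": 16.0,
    "small_vein": 0.5,
}

def blood_loss_ml(vessel: str, seconds: float, wound_severity: float) -> float:
    """Estimate blood lost from one damaged vessel.

    wound_severity scales the nominal rate
    (0.0 = intact vessel, 1.0 = fully severed).
    """
    return FLOW_RATES_ML_PER_S[vessel] * wound_severity * seconds

# e.g. a partially severed femoral artery bleeding for 30 seconds
loss = blood_loss_ml("femoral_artery", seconds=30.0, wound_severity=0.5)
print(f"{loss:.0f} mL")  # prints "240 mL"
```

A trainer built on this idea would rerun the estimate each frame as medics apply tourniquets or pressure, reducing the severity factor over time.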
In this video, Prof. Dr. Satoshi Matsuoka from the Tokyo Institute of Technology describes his role as Program Chair of ISC High Performance 2016. He talks about the transformation of the conference in recent years and remarks: “This is one of the most enjoyable conferences I have ever been to.” ISC High Performance is the landmark supercomputing, networking and storage event that attracts HPC enthusiasts from across the globe. With 3,000 attendees, it is the largest HPC conference and exhibition in Europe. The top five countries by number of ISC conference attendees are Germany, the United States, the United Kingdom, France and China. Growth in participation is projected for 2016, particularly from Asia.
Today a European consortium announced a step towards Exascale computing with the ExaNeSt project. Funded by the Horizon 2020 initiative, ExaNeSt plans to build its first straw man prototype in 2016. The consortium consists of twelve partners, each with expertise in a core technology needed to reach Exascale. ExaNeSt takes an integrated approach, co-designing the hardware and software so that the prototype can run real-life evaluations and mature into a scalable system over this decade and beyond.
The speaker agenda has been published for the HPC-Based CFD for Offshore Renewable Energy Workshop. The two-day event takes place April 7-8 at Lancaster University in the UK.
“The human microbiome plays a role in processes as diverse as metabolism, immune function, and mental health. Yet despite the importance of this system, scientists are just beginning to uncover which microorganisms reside in and on our bodies and determine what functions they perform. The development of innovative technology and analytical methods has enabled researchers like Dr. Pollard to decode the complex interactions between our human cells and microbial brethren, and infer meaning from the staggering amounts of data 10 trillion organisms create.”
In this video, researchers describe the Jetstream project at Indiana University. Jetstream is a user-friendly cloud environment designed to give researchers access to interactive computing and data analysis resources on demand, whenever and wherever they want to analyze their data. It will provide a library of virtual machines designed for discipline-specific scientific analysis. Software creators and researchers will also be able to create their own customized virtual machines or their own private computing system within Jetstream.
In this podcast, the Radio Free HPC team looks at the Top Technology Stories for High Performance Computing in 2015. “From 3D XPoint memory to Co-Design Architecture and NVM Express, these new approaches are poised to have a significant impact on supercomputing in the near future.” We also take a look at the most-shared stories from 2015.