The Alan Turing Institute is the UK’s national institute for data science. It has marked its first few days of operations with the announcement of its new director, the confirmation of £10 million of research funding from Lloyd’s Register Foundation, a research partnership with GCHQ, collaboration with the EPSRC and Cray, and the commencement of its first research activities.
In this podcast, the Radio Free HPC team previews three of the excellent Tutorial sessions coming up at SC15. “The SC tutorials program is one of the highlights of the SC Conference series, and it is one of the largest tutorial programs at any computing-related conference in the world. It offers attendees the chance to learn from and to interact with leading experts in the most popular areas of high performance computing (HPC), networking, and storage.”
In this video from the SF Big Analytics Meetup, Bryan Catanzaro from Baidu presents: Why is HPC so important to AI? “We built Deep Speech because we saw the opportunity to re-conceive speech recognition in light of the new capabilities afforded by Deep Learning, to take advantage of even larger datasets to solve even harder problems.”
In this video from the Barcelona Supercomputing Center, Big Data is presented as a key challenge for researchers studying global climate change. “Changes in the composition of the atmosphere can affect the habitability of the planet by modifying the air quality and altering long-term climate. Research in this area is devoted to the development, implementation and refinement of global and regional state-of-the-art models for short-term air quality forecasting and long-term climate predictions.”
Over at NERSC, Linda Vu writes that the SciDB open source database system is a powerful tool for helping scientists wrangle Big Data. “SciDB is an open source database system designed to store and analyze extremely large array-structured data—like pictures from light sources and telescopes, time-series data collected from sensors, spectral data produced by spectrometers and spectrographs, and graph-like structures that illustrate relationships between entities.”
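As a loose illustration of the array-structured workloads SciDB targets, the sketch below uses plain NumPy (not SciDB’s own query language) with synthetic sensor data: rows are sensors, columns are time steps, and slicing plus per-axis aggregation stand in for the kinds of array-native operations an array database performs at scale.

```python
import numpy as np

# Synthetic sensor time-series: 4 sensors, 1000 samples each.
# (Illustrative only -- SciDB would store and query arrays like this
# server-side via AQL/AFL rather than in-process NumPy.)
rng = np.random.default_rng(0)
readings = rng.normal(loc=20.0, scale=2.0, size=(4, 1000))

# Array-native access pattern: slice a time window across all sensors,
# then aggregate along the time axis to get one mean per sensor.
window = readings[:, 100:200]          # samples 100..199 for every sensor
per_sensor_mean = window.mean(axis=1)  # shape (4,): one value per sensor
print(per_sensor_mean.shape)
```

The point of the array model is that operations like windowing and per-dimension aggregation map directly onto the data layout, with no relational joins or reshaping required.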
Today Intel Corporation and BlueData announced a broad strategic technology and business collaboration, as well as an additional equity investment in BlueData from Intel Capital. BlueData is a Silicon Valley startup that makes it easier for companies to install Big Data infrastructure, such as Apache Hadoop and Spark, in their own data centers or in the cloud.
“Despite the growing abundance of powerful tools, building and deploying machine-learning frameworks into production continues to be a major challenge, in both science and industry. I’ll present some particular pain points and cautions for practitioners as well as recent work addressing some of the nagging issues. I advocate for a systems view, which, when expanded beyond the algorithms and codes to the organizational ecosystem, places some interesting constraints on the teams tasked with development and stewardship of ML products.”
The big memory “Blacklight” system at the Pittsburgh Supercomputing Center will be retired on Aug 15 to make way for the new “Bridges” supercomputer. “Built by HP, Bridges will feature multiple nodes with as much as 12 terabytes each of shared memory, equivalent to unifying the RAM in 1,536 high-end notebook computers. This will enable it to handle the largest memory-intensive problems in important research areas such as genome sequence assembly, machine learning and cybersecurity.”
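The 1,536-notebook comparison is easy to verify (assuming, as the quoted figure implies, 8 GB of RAM per high-end notebook):

```python
# Back-of-the-envelope check of the Bridges shared-memory claim:
# 12 TB per node spread across 1,536 notebooks' worth of RAM.
node_memory_gb = 12 * 1024   # 12 TB expressed in GB
notebooks = 1536
per_notebook_gb = node_memory_gb / notebooks
print(per_notebook_gb)       # 8.0 GB per notebook
```

So the quoted equivalence corresponds to 8 GB per notebook, a typical high-end laptop configuration at the time.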
In this Intel Chip Chat podcast, Bill Mannel from HP stops by to discuss the growing demand for high performance computing solutions and the innovative use of HPC to manage big data. He highlights an alliance between Intel and HP that will accelerate HPC and big data solutions tailored to meet the latest needs and workloads of HPC customers, leading with customized vertical solutions.