The STFC Hartree Centre in the UK is now an Intel Parallel Computing Center (IPCC).
In this video from the DDN User Group at ISC’14, Satoshi Matsuoka from the Tokyo Institute of Technology presents: A Look at Big Data in HPC. “HPC has been dealing with big data for all of its existence. But it turns out that the recent commercial emphasis on big data has coincided with a fundamental change in the sciences as well. As scientific instruments and facilities produce large amounts of data at an unprecedented rate, the HPC community is reacting by revisiting architecture, tools, and services to address this growth in data.”
On June 22, the US Department of Energy (DOE) and Japan’s Ministry of Education, Culture, Sports, Science and Technology (MEXT) signed an agreement to collaborate on exascale supercomputing technologies for the scientific community. In a nutshell, the plan is to build a common OS kernel that can be used by all post-petascale systems, regardless of their hardware idiosyncrasies.
“The Intel Enterprise Edition for Lustre software unleashes the performance and scalability of the Lustre parallel file system for HPC workloads, including technical ‘big data’ applications common within today’s enterprises. It allows end-users that need the benefits of large-scale, high-bandwidth storage to tap the power and scalability of Lustre, with the simplified installation, configuration, and management features provided by Intel Manager for Lustre software, a management solution purpose-built by the Lustre experts at Intel for the Lustre file system.”
“Regardless of the application, it is important to have adequate bandwidth to support your network. With so many carriers out there, it can be difficult to choose the right one. Fortunately, there are bandwidth brokers with industry expertise who can help any company make the right decision based on size, location, bandwidth requirements, and budget.”
“The Weizmann Institute replaced Moab with PBS Professional to manage a 3,096-core HP cluster shared among hundreds of users. The transition was easy, thanks to Altair’s excellent customer service and support. Now the Institute enjoys higher cluster utilization and greater productivity from its award-winning scientists.”