Over at TACC, Faith Singer-Villalobos writes that researchers are using the Rustler supercomputer to tackle Big Data from self-driving connected vehicles (CVs). “The volume and complexity of CV data are tremendous and present a big data challenge for the transportation research community,” said Natalia Ruiz-Juri, a research associate with The University of Texas at Austin’s Center for Transportation Research. “While there is uncertainty in the characteristics of the data that will eventually be available, the ability to efficiently explore existing datasets is paramount.”
The PEARC17 Conference has issued its Call for Participation. The successor to the Extreme Science and Engineering Discovery Environment (XSEDE) annual conference, PEARC17 will take place July 9-13 in New Orleans. “The Technical Program for PEARC17 includes four Paper tracks, Tutorials, Posters, a Visualization Showcase and Birds of a Feather (BoF) sessions. All submissions should emphasize experiences and lessons derived from operation and use of advanced research computing on campuses or provided for the academic and open science communities. Submissions aligned with the conference theme—Sustainability, Success, and Impact—are particularly encouraged.”
Today the San Diego Supercomputer Center (SDSC) announced that the Comet supercomputer has easily surpassed its target of serving at least 10,000 researchers across a diverse range of science disciplines, from astrophysics to redrawing the tree of life. “In fact, about 15,000 users have used Comet to run science gateways jobs alone since the system went into production less than two years ago.”
Graduate students and postdoctoral scholars from institutions in Canada, Europe, Japan and the United States are invited to apply for the eighth International Summer School on HPC Challenges in Computational Sciences, to be held June 25-30, 2017, in Boulder, Colorado.
“Bridges’ new nodes add large-memory and GPU resources that enable researchers who have never used high-performance computing to easily scale their applications to tackle much larger analyses,” says Nick Nystrom, principal investigator in the Bridges project and Senior Director of Research at PSC. “Our goal with Bridges is to transform researchers’ thinking from ‘What can I do within my local computing environment?’ to ‘What problems do I really want to solve?’”
In this video, Dr. Kelly Gaither from TACC describes how 20 students identified by XSEDE’s community engagement team participated in a four-day cohort experience at SC16 themed around social change. “The objectives of the program are to engage students in a social change challenge using visualization and data analytics to increase awareness, interest, and ultimately inspire students to continue their path in advanced computing careers; and to increase the participation of students historically underserved in STEM at SC.”
Using a unique computational approach to rapidly sample proteins in their natural state of gyrating, bobbing, and weaving, a research team from UC San Diego and Monash University in Australia has identified promising drug leads that may selectively combat heart disease, from arrhythmias to cardiac failure.
The Extreme Science and Engineering Discovery Environment (XSEDE) annual conference is transforming into an independent entity designed to unite the high-performance computing and advanced digital research community. The new Practice & Experience in Advanced Research Computing conference (PEARC) will welcome all who care about using advanced digital services for research. The PEARC17 Conference will take place in New Orleans, Louisiana, July 9-13, 2017.
Last week, XSEDE announced it has awarded more than $16M worth of compute resources to 155 research projects. This is the first cohort of allocation awardees since the announcement of a 5-year renewal of XSEDE by the National Science Foundation to expand access to the nation’s cyberinfrastructure ecosystem.
“We’re trying to make high-resolution simulations of supercell storms, or tornadoes,” McGovern said. “What we get with the simulations are the fundamental variables at whatever our resolution is — we’ve been doing 100 meter x 100 meter cubes — there’s no way you can get that kind of data without doing simulations. We’re getting the fundamental variables like pressure, temperature and wind, and we’re doing that for a lot of storms, some of which will generate tornadoes and some that won’t. The idea is to do data mining and visualization to figure out what the difference is between the two.”