In his recent paper, Lester Ingber explores the relationship between large-scale, or “top-down,” activities in the brain and short-term memory and consciousness.
“We have developed an active programme of industry engagement, typically consulting in niche areas and helping those partners exploit advanced computing within their organisations. This may involve improving fraud-detection algorithms, managing large-scale data, delivering tailored training courses, or supporting oil-and-gas-related work. In fact it is very rarely about computer time! We also work with public sector bodies to support computing-centric projects by the state and provide so-called platform technology support for research communities that need more than just raw cycles.”
“One of the hottest topics we see is remote visualization for post-processing simulation results. One big issue in traditional workflows in technical and scientific computing is the transfer of large amounts of data from where they have been created to where they are analyzed. Streamlining this workflow by processing the data where they were created in the first place directly shortens the wall-clock time it takes end users to get final results. At the same time, hardware utilization is greatly enhanced by using innovative technology for remote 3D visualization. To this end, we entered into a strategic partnership with NICE long ago.”
ISC’14 will host a Student Cluster Competition, organized in collaboration with the HPC Advisory Council. Held during the ISC exhibition in Leipzig towards the end of June, the competition is not just an opportunity to showcase student expertise but is designed to encourage the next generation of students to take up the challenges of high performance computing.
“I will describe an approach developed in our lab that uses custom-designed video games to achieve meaningful and sustainable cognitive enhancement, as well as the next stage of our research program, which uses video games integrated with technological innovations in software (e.g., brain-computer interface algorithms, GPU computing) and hardware (e.g., virtual reality headsets, mobile EEG, transcranial electrical brain stimulation) to create a novel personalized closed-loop system.”
Intel Parallel Computing Centers are focusing on modernizing applications to increase parallelism and scalability. By advancing parallelism in key codes, the Intel Parallel Computing Centers will accelerate discovery in the fields of energy, finance, manufacturing, life sciences, weather, and beyond.
In this video, University of Colorado Boulder doctoral student Eric Wolf describes how the Janus supercomputer runs climate simulations that help him study why the early Earth was warm despite the Sun being much less luminous than it is today. This quandary is known as the Faint Young Sun Paradox. As one of the first container-based supercomputers in academia, the Janus supercomputer comprises 1,368 compute nodes, each containing 12 cores, for a total of 16,416 available cores.
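As a quick sanity check of the figures above, the quoted core count follows directly from the node and per-node core counts (a minimal sketch; the variable names are illustrative, not from the article):

```python
# Janus core count: 1,368 compute nodes with 12 cores each.
nodes = 1368
cores_per_node = 12

total_cores = nodes * cores_per_node
print(total_cores)  # 16416, matching the figure cited above
```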