Over at InfoStor, Henry Newman from Instrumental writes that a new slide deck from Micron provides an intriguing look at the future of memory technology. “So what does this all mean for our future in the data storage industry? I think Micron and likely other companies are going to make some major changes from 2015 to the end of the decade in the area of non-volatile memory, as the market demands changes for mobile devices that need both low power usage and non-volatile memory.”
Over at TechRadar, Julian Fielden from OCF writes that users faced with almost insurmountable energy and cooling challenges will likely avoid owning and housing their own exascale computing facilities, looking instead to the “cloud” and on-demand services provided by much larger international suppliers.
In this special guest feature from Scientific Computing World, Dr. James Osborne from HPC Wales writes that distance learning techniques may help train the next generation of computational scientists. Simulation and modelling are now widely seen as the third pillar of science, alongside theory and experimentation. The ability to harness today’s high performance computers is […]
“The availability of HPC-on-Demand is opening up the world of supercomputers to expanding organizations that want to quickly take on more work and burst capacity when required. SMEs, which have never previously had access to this kind of power, can now use it on a project basis over a defined period of time and pay only for what they use. They don’t waste investment by having infrastructure running idle.”
“In aerospace, users prefer to have the option to visualize the entire system. But that doesn’t necessarily mean it’s something routinely done. By definition, system-level visuals involve too much data. If an engineer is trying to troubleshoot something in an analysis program, he or she will most likely analyze only the subcomponents that contribute to the problem. For example, electrical wiring information can safely be omitted when doing an airflow study of the outer surface.”
Over at The Register, Dan Olds writes that, with the acquisition of the x86 server business from IBM, Lenovo is buying a spot on the Top500 list and a sizeable place in the HPC market. Using the November 2013 list, Lenovo would hold second place in terms of systems – 25 per cent of the total (127 boxes). This puts it ahead of everyone except HP. IBM without x86 boxes would place fourth on the system count list, behind HP, Lenovo and Cray.
“The speed, accuracy and cost at which enterprises can process big data analytics are the new competitive battleground, and we expect the need for results to greatly impact computing in 2014 and beyond,” said Rob Clyde, CEO of Adaptive Computing. “In our estimation, big data requires a streamlined approach to a complex data analysis and simulation process that can manage all resources across multiple computing platforms.”
“As for chemistry and life sciences, we see it as a vibrant and dynamic field that’s constantly evolving — from using nanotechnology and new materials to deliver drugs to unraveling the mysteries of how cells work. We definitely have our eye on those new technologies that are revolutionizing the field such as next generation sequencing (NGS). NGS allows for the analysis of genetic material with unprecedented speed and efficiency and is well suited for HPC.”