ISC has announced an Early Bird discount for its second annual Big Data conference. The event will offer attendees insights into the latest developments in data-intensive computing from both industry players and researchers.
In this video, Intersect360 Research CEO Addison Snell describes the dynamics unfolding in the technical computing market and its convergence with big data analytics. “In their customer surveys on big data, Intersect360 finds that end users need to scale their environments from desktops to servers, to clusters of servers, to large supercomputers.” Mr. Snell notes that IBM is a leader in HPC and Big Data that is making the investments to extend that leadership.
In this special guest feature, Ferhat Hatay from Fujitsu writes that supercomputing technologies developed for data-intensive scientific computing can be a powerful tool for taking on the challenges of Big Data. We all feel it: data use and growth are explosive. Individuals and businesses are consuming — and generating — more data every day. The […]
“Moore’s Law got deflected in 2004, when it became no longer practical to ramp up the clock speed of CPUs to improve performance. So the chip industry improved CPU performance by adding more processors to a chip in concert with miniaturization. This added raw power, but you could not leverage it easily without building parallel software. Virtual machines could use multicore chips for server consolidation of light workloads, but large workloads needed parallel architectures to exploit the power. So the software industry and the hardware industry moved toward exploiting parallelism in ways they had not previously done. This is the motive force behind Big Data.”
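The quote's point — that extra cores are useless to a serial program and must be claimed by explicitly parallel code — can be sketched in a few lines. The snippet below is a minimal illustration, not from the article; it uses Python's standard `multiprocessing` module, and the function names (`partial_sum`, `serial_total`, `parallel_total`) are hypothetical.

```python
# Minimal sketch: the same workload written serially and in parallel.
# Only the parallel version can occupy the extra cores a multicore chip adds.
from multiprocessing import Pool


def partial_sum(bounds):
    """Sum the integers in [lo, hi) -- a stand-in for a heavy task."""
    lo, hi = bounds
    return sum(range(lo, hi))


def serial_total(n):
    """Serial version: runs on one core no matter how many are available."""
    return sum(range(n))


def parallel_total(n, workers=4):
    """Parallel version: split [0, n) into chunks and sum them concurrently."""
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], n)  # last chunk absorbs any remainder
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))


if __name__ == "__main__":
    n = 1_000_000
    assert serial_total(n) == parallel_total(n)
```

Both functions compute the same answer; the difference is that the parallel one had to be restructured — partitioned, distributed, and recombined — which is exactly the software investment the quote says the industry was forced into after 2004.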