Steve Conway on the Collision Course of Commercial and HPC Big Data


Over at Scientific Computing, IDC’s Steve Conway writes that traffic in both directions is blurring the boundaries between commercial computing and data-intensive HPC.

With commercial and HPC big data converging in many areas, where do you draw the line between the two data-intensive domains? The answer, in IDC’s view, is that a problem moves into the HPC realm when it requires HPC resources, especially software needing to run on HPC hardware to meet performance goals. It seems apparent that, over time, a growing number of commercial problems will scale to this level. This will cause more commercial vendors to expand into the HPC market, and more HPC vendors to expand in the opposite direction.

Read the Full Story.

Comments

  1. Vladimir Vujic says

    The bottom line for an HPC system is defined by several attributes:
    1. Scaling of the HPC system (compared to itself and to other HPC and non-HPC systems)
    2. Capabilities of the HPC system (hardware + software) compared to other HPC and non-HPC systems
    3. Productivity of the new HPC system implementation (compared to other HPC and non-HPC systems)
    4. Cost of running the new HPC system (compared to other HPC and non-HPC systems)
    With a common measurement criterion in place (money), it is not so complicated to answer all of these questions.
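A minimal sketch of the commenter's point, using hypothetical systems and made-up figures: once the comparison is reduced to a single money-based criterion (here, sustained performance per dollar of total cost of ownership), weighing HPC against non-HPC alternatives becomes straightforward arithmetic. All names and numbers below are illustrative assumptions, not data from the article.

```python
# Illustrative sketch with hypothetical numbers: rank candidate systems by a
# single money-based criterion, sustained performance per total cost of
# ownership (TCO) over the system's expected service life.

from dataclasses import dataclass


@dataclass
class System:
    name: str
    sustained_tflops: float   # sustained application performance
    purchase_cost: float      # one-time acquisition cost (USD)
    annual_opex: float        # power, cooling, admin, software (USD/year)
    lifetime_years: float     # expected service life


def tco(system: System) -> float:
    """Total cost of ownership over the system's service life."""
    return system.purchase_cost + system.annual_opex * system.lifetime_years


def tflops_per_million_usd(system: System) -> float:
    """Cost-normalized performance: sustained TFLOPS per $1M of TCO."""
    return system.sustained_tflops / (tco(system) / 1e6)


# Hypothetical candidates: an on-premises HPC cluster vs. a non-HPC alternative.
candidates = [
    System("HPC cluster", sustained_tflops=500.0,
           purchase_cost=4.0e6, annual_opex=0.6e6, lifetime_years=5),
    System("Commodity cloud", sustained_tflops=300.0,
           purchase_cost=0.0, annual_opex=1.5e6, lifetime_years=5),
]

# Rank the candidates by the money-normalized metric, best first.
for s in sorted(candidates, key=tflops_per_million_usd, reverse=True):
    print(f"{s.name}: {tflops_per_million_usd(s):.1f} sustained TFLOPS per $1M TCO")
```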