• How You Can Use Artificial Intelligence in the Financial Services Industry

    In financial services, any competitive advantage counts. Your competitors have access to most of the same data you do, since historical data is available to everyone in the industry. Your advantage comes from the ability to exploit that data better, faster, and more accurately than they can. In a rapidly fluctuating market, the ability to process data faster lets you respond more quickly than ever before. This is where AI-first intelligence can give you a leg up.

Featured Stories

  • Former Hyperion HPC Analyst Steve Conway Joins Intersect360 Research

    In the potboiler world of HPC industry analyst firms, this is a surprising development: Steve Conway, longtime senior member of Hyperion Research, a firm he left last summer to start Conway Communications, has joined Hyperion’s closest rival, Intersect360 Research, led by CEO Addison Snell. Conway, who has been in the HPC industry for 30 years, takes on the title of senior analyst. “I am absolutely thrilled to be working with [READ MORE…]

  • A Year of Quantum Information Science on NERSC’s Perlmutter Supercomputer

    Last November, the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory launched its QIS@Perlmutter program to provide researchers working on quantum information science (QIS) problems with supercomputing resources, while also exploring how QIS could benefit high-performance computing (HPC). NERSC is a U.S. Department of Energy (DOE) computing facility that provides supercomputing and other scientific computing resources to thousands of researchers each year. Among these researchers, QIS [READ MORE…]

  • Catalyzing the Advancements in Genomics to Lower Barriers to Sustainable Innovation

    In the 21st century, effective use of Big Data in the health sector alone could save 300 billion dollars per year, according to a McKinsey Global Institute survey. Though genomic science is experiencing big data overload, the benefit to humanity of deciphering such large biological data sets using NGS technology makes it the ultimate use case for the coming era.

  • SC22: CXL3.0, the Future of HPC Interconnects and Frontier vs. Fugaku

    HPC luminary Jack Dongarra’s fascinating comments at SC22 on the low efficiency of leadership-class supercomputers highlighted by the latest High Performance Conjugate Gradients (HPCG) benchmark results will, I believe, influence the next generation of supercomputer architectures to optimize for sparse matrix computations. The upcoming technology that will help address this problem is CXL. Next generation architectures will use CXL3.0 switches to connect processing nodes, pooled memory and I/O resources into [READ MORE…]
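
    The efficiency gap Dongarra points to comes down to kernels like the sparse matrix-vector product at the heart of HPCG. The C++ sketch below is a minimal, illustrative CSR (compressed sparse row) version, not the benchmark code; the struct name, the tiny tridiagonal test matrix, and the byte counts in the comments are assumptions for illustration. It shows why each nonzero costs only two flops but far more memory traffic, so sparse workloads are limited by memory bandwidth rather than peak flops.

        // Minimal sparse matrix-vector product y = A*x in CSR format.
        // Illustrative sketch only; not the HPCG benchmark code.
        #include <cstddef>
        #include <iostream>
        #include <vector>

        struct CsrMatrix {
            std::size_t rows;
            std::vector<std::size_t> row_ptr;  // size rows + 1
            std::vector<std::size_t> col_idx;  // one entry per nonzero
            std::vector<double> values;        // one entry per nonzero
        };

        // Two flops per nonzero versus roughly 20+ bytes of memory traffic:
        // this low arithmetic intensity is why HPCG results land far below
        // dense-math LINPACK numbers on the same machine.
        std::vector<double> spmv(const CsrMatrix& a, const std::vector<double>& x) {
            std::vector<double> y(a.rows, 0.0);
            for (std::size_t i = 0; i < a.rows; ++i) {
                double sum = 0.0;
                for (std::size_t k = a.row_ptr[i]; k < a.row_ptr[i + 1]; ++k) {
                    sum += a.values[k] * x[a.col_idx[k]];
                }
                y[i] = sum;
            }
            return y;
        }

        int main() {
            // 3x3 tridiagonal test matrix (a common discretization pattern).
            CsrMatrix a{3,
                        {0, 2, 5, 7},
                        {0, 1, 0, 1, 2, 1, 2},
                        {2.0, -1.0, -1.0, 2.0, -1.0, -1.0, 2.0}};
            std::vector<double> x{1.0, 2.0, 3.0};
            for (double v : spmv(a, x)) std::cout << v << '\n';  // prints 0 0 4
        }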

Featured Resource

Hybrid Cloud with Bright Cluster Manager

This whitepaper from our friends over at Bright Computing discusses how hybrid cloud infrastructures allow organizations to strategically manage their compute requirements from the core data center to the public cloud and edge, even though such infrastructures are complex to build and manage. Automation is essential, and verifying your staff’s ability to work with your tool of choice is mandatory.

HPC Newsline

Industry Perspectives

  • …today’s situation is clear: HPC is struggling with reliability at scale. Well over 10 years ago, Google proved that commodity hardware was both cheaper and more effective for hyperscale processing when controlled by software-defined systems, yet the HPC market persists with its old-school, hardware-based paradigm. Perhaps this is due to prevailing industry momentum, or to working within the collective comfort zone of established practices. Either way, hardware-centric approaches to storage resiliency need to go.

  • New, Open DPC++ Extensions Complement SYCL and C++

    In this guest article, our friends at Intel discuss how accelerated computing has diversified over the past several years given advances in CPU, GPU, FPGA, and AI technologies. This innovation drives the need for an open and cross-platform language that allows developers to realize the potential of new hardware, minimizes development cost and complexity, and maximizes reuse of their software investments.
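
    For readers new to the programming model, the sketch below is a minimal, standard SYCL 2020 vector addition of the sort a DPC++ compiler can target at CPUs, GPUs, or FPGAs from a single source. It is an illustrative example only and does not use the new extensions discussed in the article; the buffer names and problem size are arbitrary.

        // Minimal SYCL 2020 vector addition; builds with a DPC++/SYCL compiler.
        // Illustrative only; the new DPC++ extensions are not used here.
        #include <sycl/sycl.hpp>
        #include <cstddef>
        #include <iostream>
        #include <vector>

        int main() {
            constexpr std::size_t n = 1024;
            std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);

            sycl::queue q;  // default device: could be a CPU, GPU, or other accelerator
            {
                sycl::buffer<float> buf_a(a.data(), sycl::range<1>(n));
                sycl::buffer<float> buf_b(b.data(), sycl::range<1>(n));
                sycl::buffer<float> buf_c(c.data(), sycl::range<1>(n));

                q.submit([&](sycl::handler& h) {
                    sycl::accessor va(buf_a, h, sycl::read_only);
                    sycl::accessor vb(buf_b, h, sycl::read_only);
                    sycl::accessor vc(buf_c, h, sycl::write_only);
                    // The same kernel source runs on whichever device the queue selected.
                    h.parallel_for(sycl::range<1>(n), [=](sycl::id<1> i) {
                        vc[i] = va[i] + vb[i];
                    });
                });
            }  // buffer destructors wait for the kernel and copy results back to c

            std::cout << "c[0] = " << c[0] << '\n';  // prints 3
        }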

RSS Featured from insideBIGDATA

  • Hypothesis-led data exploration is failing you …
    In this special guest feature, Aakash Indurkhya, Co-Head of AI at Virtualitics, suggests that you should set your assumptions aside and start looking at your data through the lens of AI. Cut through the noise, surface significant insight, and take aim at the real issues. Forget data as oil: data is gold, and Intelligent Exploration is […]

Editor’s Choice

  • DOE Document Reveals Next-Gen Supercomputing Strategy: A Move to More Modular, Faster Upgrade Cycles

    Less than a month after its Frontier system broke the exascale performance barrier and won the no. 1 supercomputing world ranking, the U.S. Department of Energy today issued an RFI revealing its strategic thinking for the next generation of leadership-class supercomputers extending out to 2030. The document calls for “the development of an approach that moves away from monolithic acquisitions toward a model for enabling more rapid upgrade cycles of deployed systems, to enable faster innovation on hardware and software.” DOE said it expects its next-gen systems “to operate within a power envelope of 20-60 MW.” Other than that the [READ MORE…]

  • Sentient AI? Google Suspends Engineer over Claims the LaMDA Chatbot Is a Person with Rights

    It’s often said AI is overhyped, but even so, some claims can get you in trouble. That’s the irony of a situation Google finds itself in. The company has suspended one of its software engineers who claimed its natural language processing chatbot, LaMDA, is “sentient.” There are several surprising elements here. One is the commentary from the Google engineer that LaMDA is a person with rights. Another is the astonishing dialogue he reports having had with LaMDA. Take, for example, the insights LaMDA rattled off on “Les Miserables”: Lemoine: Okay, what about “Les Miserables”? Have you read that one? LaMDA: [READ MORE…]

  • Frontier Named No. 1 Supercomputer on TOP500 List and ‘First True Exascale Machine’

    Hamburg — This morning, AMD’s long comeback from trampled HPC also-ran – a comeback that began in 2017 when company executives told skeptical press and industry analysts to expect price/performance chip superiority over Intel – reached a high point (not to say an end point) with the news that the U.S. Department of Energy’s Frontier supercomputer, an HPE-Cray EX system powered by AMD CPUs and GPUs, has not only been named the world’s most powerful supercomputer, it is also the first system to exceed the exascale (10^18 calculations/second) milestone. This may not come as a surprise to many in the [READ MORE…]

  • Chip Geopolitics: If China Invades, Make Taiwan ‘Unwantable’ by Destroying TSMC, Military Paper Suggests

    US military planners are taking notice of a suggestion by two military scholars calling for the destruction of semiconductor foundry company Taiwan Semiconductor Manufacturing Co. (TSMC), whose fabs produce advanced microprocessors used in HPC and AI, in the event China invades the island nation. A news story in today’s edition of Data Center Times cites the Nikkei Asia news service and a paper in the U.S. Army War College’s scholarly journal, Parameters, discussing the possibility of Taiwan adopting “a scorched earth policy” and wiping out its own semiconductor foundries in the wake of any Chinese invasion as a deterrent, U.S. [READ MORE…]

  • How Machine Learning Is Revolutionizing HPC Simulations

    Physics-based simulations, that staple of traditional HPC, may be evolving toward an emerging, AI-based technique that could radically accelerate simulation runs while cutting costs. Called “surrogate machine learning models,” the topic was a focal point in a keynote on Tuesday at the International Conference on Parallel Processing by Argonne National Lab’s Rick Stevens. Stevens, ANL’s associate laboratory director for computing, environment and life sciences, said early work in “surrogates,” as the technique is called, shows speed-ups of tens of thousands of times (and more) and could “potentially replace simulations.” Surrogates can be looked at as an end-around to two big problems [READ MORE…]
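
    As a rough sketch of the idea only (not Argonne’s actual surrogate models, which in practice are typically neural networks trained on large ensembles of simulation outputs), the toy C++ example below pays the simulation cost once on a coarse grid and then answers later queries with a cheap interpolating model; every function name and number in it is hypothetical.

        // Toy sketch of the surrogate-model idea: run the expensive simulation on a
        // coarse grid once, then answer later queries from a cheap interpolating model.
        // All names and numbers here are hypothetical illustrations.
        #include <cmath>
        #include <cstddef>
        #include <iostream>
        #include <vector>

        // Stand-in for an expensive physics simulation with one scalar input.
        double expensive_simulation(double x) {
            return std::sin(x) * std::exp(-0.1 * x);
        }

        struct Surrogate {
            double lo, step;
            std::vector<double> samples;  // simulation outputs on a uniform grid
        };

        // "Training": pay the simulation cost once, up front.
        Surrogate train(double lo, double hi, std::size_t n) {
            Surrogate s{lo, (hi - lo) / static_cast<double>(n - 1), {}};
            s.samples.reserve(n);
            for (std::size_t i = 0; i < n; ++i)
                s.samples.push_back(expensive_simulation(lo + s.step * static_cast<double>(i)));
            return s;
        }

        // Inference: linear interpolation between stored samples, far cheaper
        // than rerunning the simulation for every query.
        double predict(const Surrogate& s, double x) {
            double t = (x - s.lo) / s.step;
            if (t <= 0.0) return s.samples.front();
            std::size_t i = static_cast<std::size_t>(t);
            if (i + 1 >= s.samples.size()) return s.samples.back();
            double frac = t - static_cast<double>(i);
            return (1.0 - frac) * s.samples[i] + frac * s.samples[i + 1];
        }

        int main() {
            Surrogate s = train(0.0, 10.0, 101);  // 101 expensive runs, done once
            double x = 3.37;
            std::cout << "surrogate: " << predict(s, x)
                      << "  simulation: " << expensive_simulation(x) << '\n';
        }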
