The Future-Proofed Datacenter: DDC Delivers 85kW Air-Cooled Density for AI and HPC Workloads

[SPONSORED GUEST ARTICLE] At DDC, the global leader in scalable datacenter-to-edge solutions, we are taking an innovative approach to building new and retrofitting legacy datacenters. Today, our patented cabinet technology can be deployed in nearly any environment or facility and supports one of the highest air-cooled thermal densities on the market: 85kW per cabinet. In a recent deployment with TierPoint, a US-based colocation provider, we are supporting a 26,000 sq ft facility expansion outside of Allentown, PA.
Life is Fleeting, But Data is Forever – Meet your Digital Twin

[SPONSORED GUEST ARTICLE BY HPE AND NVIDIA] With the transformation of medicine from analog to digital, plus the rise of new data-generating devices for health tracking and genomic information, we can look forward to a new world in which virtually every aspect of a patient’s….
Optical I/O Takes Center Stage at SC23

[SPONSORED GUEST ARTICLE] Integrating optical I/O with an FPGA is just the tip of the iceberg of a broader vision: enabling new HPC/AI architectural advances through ubiquitous optical interconnects for every piece of compute silicon. If you are attending SC23, November 12-17, be sure to visit Ayar Labs in booth #228 for an exclusive look at the future….
Taming the Exascale Beast: Challenges and Solutions in Architecting JUPITER

In this sponsored article, Crispin Keable, PhD, senior solution architect developing HPC, AI and quantum systems for Eviden (an Atos business), discusses how his philosophy has always been to help customers achieve their scientific, technical, and engineering goals by using the best technology now and tomorrow….
Kickstart Your Business to the Next Level with AI Inferencing

[SPONSORED GUEST ARTICLE] Check out this article from HPE (with NVIDIA). The need to accelerate AI initiatives is real and widespread across all industries. The ability to integrate and deploy AI inferencing with pre-trained models can reduce development time with scalable, secure solutions that would revolutionize how easily you can….
Lenovo Maximizes HPC Resources via Partnership with SchedMD and Slurm Workload Manager

[SPONSORED GUEST ARTICLE] In HPC, leveraging compute resources to the maximum is a constant goal and a constant source of pressure. The higher the usage rate, the more jobs get done, the fewer resources sit idle, and the greater the return on the HPC investment. At Lenovo, with its HPC-class ThinkSystem servers, workload maximization is….
Federated GPU Infrastructure for AI Workflows

[SPONSORED GUEST ARTICLE] With the explosion of use cases such as generative AI and MLOps driving tremendous demand for the most advanced GPUs and accelerated computing platforms, there’s never been a better time to explore the “as-a-service” model to help get started quickly. What could take months of shipping delays and massive CapEx investments can be yours on demand….
Revolutionizing Bioscience Research: Creating an Atlas of the Human Body

Making healthcare and life science (HCLS) discoveries is time-consuming and requires considerable amounts of data. Biomedical research needs HPC enterprise infrastructure with AI and edge-to-cloud capabilities to make the creation of an atlas of the human body possible. The HPE, NVIDIA and Flywheel collaboration, using the latest technologies designed for HCLS, promises to transform….
HPC, AI, ML and Edge Solutions Drive Duos Railcar Inspection System Powered by Dell Technologies and Kalray

Industries such as railways are moving from traditional inspection methods to using AI and ML to perform automated inspection of railcars. Data streaming from the edge at high rates requires the compute power of an HPC cluster, storage, and advanced analytics to return results in real time….
HPC User Forum: ‘Lawyers Who Use AI Will Replace Lawyers Who Don’t’

“Lawyers who use AI will replace lawyers who don’t.” That was the coda of a presentation given at the recent HPC User Forum in Tucson by Arizona State University law professor Gary Marchant, a graduate of Harvard Law School and a professor at ASU since 1999. Marchant’s presentation at the Forum, hosted by Hyperion Research, was one of the conference’s most talked-about sessions. In this interview, Marchant shares his observations on how generative AI and large language models….