Baidu Collaborates on Intel Nervana Neural Network Processor for Training

Intel is collaborating with Baidu on development of the new Intel Nervana Neural Network Processor for training (NNP-T). The two companies will team up on hardware and software design for the new custom accelerator for “training deep learning models at lightning speed.”

“The next few years will see an explosion in the complexity of AI models and the need for massive deep learning compute at scale. Intel and Baidu are focusing their decade-long collaboration on building radical new hardware, co-designed with enabling software, that will evolve with this new reality – something we call ‘AI 2.0,’” said Naveen Rao, Intel corporate vice president and general manager of the AI Products Group.

According to Intel, AI isn’t a single workload; it’s a pervasive capability that will enhance every application, whether it runs on a phone or in a massive data center. Phones, data centers, and everything in between have different performance and power requirements, so one-size-fits-all AI hardware doesn’t work. Intel offers a broad choice of AI hardware with enabling software, so customers can run complex AI applications where their data lives. The NNP-T is a new class of efficient deep learning system hardware designed to accelerate distributed training at scale. Close collaboration with Baidu helps ensure that Intel’s development stays in lockstep with the latest customer demands on training hardware.

Since 2016, Intel has been optimizing Baidu’s PaddlePaddle* deep learning framework for Intel Xeon Scalable processors. Now, the companies give data scientists more hardware choice by optimizing the NNP-T for PaddlePaddle.

The impact of these AI solutions is enhanced by additional Intel technologies. For example, Intel Optane DC Persistent Memory provides improved memory performance that allows Baidu to deliver personalized mobile content to millions of users through its Feed Stream* service and AI recommendation engines, for a more efficient customer experience.

Additionally, with data security critically important to users, Intel and Baidu are working together on MesaTEE*, a memory-safe function-as-a-service (FaaS) computing framework based on Intel Software Guard Extensions (SGX) technology.
