ASUS Unveils ESC AI POD Featuring NVIDIA GB200 NVL72 at Computex 2024

ASUS recently revealed details of its Computex 2024 showcase, which features a wide array of AI servers for uses ranging from generative AI to creative storage solutions, headlined by the ASUS ESC AI POD equipped with the NVIDIA® GB200 NVL72 system. The latest ASUS NVIDIA MGX-powered systems are also on display, including the NVIDIA GB200 NVL2-equipped ESC NM1-E1, the NVIDIA GH200 Grace Hopper Superchip-equipped ESR1-511N-M1, and the newest ASUS solutions built around NVIDIA HGX H200 and Blackwell GPUs.

Presenting the ASUS ESC AI POD (NVIDIA GB200 NVL72 with ESC NM2N721-E1)

ASUS’s extensive experience in building efficient, high-performance AI servers has been strengthened by its partnership with NVIDIA. One of the showcase’s centerpieces is the ESC AI POD with NVIDIA GB200 NVL72, a large-form-factor, scale-up system powered by NVIDIA Blackwell. Orchestrating its GPUs, CPUs, and switches as a single rack-scale solution, it delivers real-time inference and trillion-parameter LLM training at exceptional speed. It pairs the latest NVIDIA GB200 Grace Blackwell Superchip with fifth-generation NVIDIA NVLink technology for peak AI processing performance, and it offers both liquid-to-liquid and liquid-to-air cooling options.

Paul Ju, Co-Head of Open Platform BG and Corporate Vice President for ASUS, emphasized, “Our partnership with NVIDIA, a global leader in AI computing, underpins our expertise. Together, we’ve harnessed a potent synergy, enabling us to craft AI servers with unparalleled performance and efficiency.”

“ASUS has strong expertise in server infrastructure and data center architecture, and through our work together, we’re able to deliver cutting-edge accelerated computing systems to customers,” said Kaustubh Sanghani, vice president of GPU product management at NVIDIA. “At Computex, ASUS will have on display a wide range of servers powered by NVIDIA’s AI platform that are able to handle the unique demands of enterprises.”

Elevating AI success with ASUS MGX scalable solutions

To fuel the rise of generative AI applications, ASUS has presented a full lineup of servers based on the NVIDIA MGX architecture, including the 2U ESC NM1-E1 and ESC NM2-E1 with NVIDIA GB200 NVL2, and the 1U ESR1-511N-M1, which harnesses the power of the NVIDIA GH200 Grace Hopper Superchip. The ASUS NVIDIA MGX-powered lineup is designed to cater to large-scale AI and HPC applications by facilitating seamless, rapid data transfers, deep-learning (DL) training and inference, data analytics, and high-performance computing.

Tailored ASUS HGX solution to meet HPC demands

Designed for HPC and AI, the latest HGX servers from ASUS include the ESC N8, powered by 5th Gen Intel® Xeon® Scalable processors and NVIDIA Blackwell Tensor Core GPUs, and the ESC N8A, powered by AMD EPYC™ 9004 processors and NVIDIA Blackwell GPUs. Both benefit from an enhanced thermal solution that ensures optimal performance and a lower PUE. Engineered for advances in AI and data science, these powerful NVIDIA HGX servers offer a one-GPU-to-one-NIC configuration for maximum throughput in compute-heavy tasks.

Software-defined data center solutions

ASUS specializes in crafting tailored data center solutions, going beyond top-notch hardware to deliver comprehensive software tailored to enterprise needs. Its services cover everything from system verification to remote deployment, ensuring the smooth operations essential for accelerating AI development, and include a unified package comprising a web portal, scheduler, resource allocation, and service operations.

Moreover, ASUS AI server solutions with integrated NVIDIA BlueField-3 SuperNICs and DPUs support the NVIDIA Spectrum-X Ethernet networking platform, delivering best-of-breed networking capabilities for generative AI infrastructure. NVIDIA NIM inference microservices, part of the NVIDIA AI Enterprise software platform for generative AI, optimize the enterprise generative AI inference process, speed up generative AI runtimes, and simplify the development of generative AI applications. ASUS also offers customized generative AI solutions to cloud and enterprise customers through collaborations with independent software vendors (ISVs) such as APMIC, which specializes in large language models and enterprise applications.

At ASUS, the company’s expertise lies in striking the perfect balance between hardware and software, empowering customers to expedite their research and innovation endeavors. As ASUS continues to advance in the server domain, its data center solutions, coupled with ASUS AI Foundry Services, have found deployment in critical sites globally.

ESC AI POD Availability & Pricing

ASUS servers are available worldwide. Visit https://servers.asus.com for more ASUS data-center solutions, or contact your local ASUS representative for further information on ESC AI POD availability and pricing.
