
Revolutionizing AI: AWS and NVIDIA Forge a Path to the Future

Cutting-Edge Supercomputing and Drug Discovery Technologies

AWS and NVIDIA have significantly expanded their strategic collaboration, as announced at AWS re:Invent. This partnership aims to provide advanced infrastructure, software, and services for generative AI innovations. Key aspects of this collaboration include:

NVIDIA GH200 Grace Hopper Superchips: an AI-generated concept image illustrating the advanced technology and power of these superchips in a cloud computing environment.
  1. NVIDIA GH200 Grace Hopper Superchips on AWS: AWS is the first cloud provider to offer these superchips, enabling scalability to thousands of GH200 Superchips for supercomputer-class performance.
  2. NVIDIA DGX Cloud on AWS: This AI-training-as-a-service will feature GH200 NVL32 for accelerated training of generative AI and large language models.
  3. Project Ceiba: This project aims to design the world’s fastest GPU-powered AI supercomputer with 16,384 NVIDIA GH200 Superchips, achieving a processing capability of 65 exaflops.
  4. New Amazon EC2 Instances: AWS introduces three new EC2 instances, including P5e instances powered by NVIDIA H200 Tensor Core GPUs for large-scale generative AI and HPC workloads.
  5. Software Innovations: NVIDIA introduces software on AWS, including the NeMo Retriever microservice for chatbots and summarization tools, and BioNeMo to accelerate drug discovery for pharmaceutical companies.
Project Ceiba Supercomputer: an AI-generated concept image depicting the futuristic supercomputer at the center of Project Ceiba, showcasing its immense scale and cutting-edge technology.
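As a quick sanity check on the Project Ceiba figures in item 3 above, dividing the headline 65 exaflops by 16,384 superchips gives roughly 4 petaflops per GH200 Superchip, which is consistent with published low-precision (FP8) AI throughput for Hopper-class hardware. A minimal sketch of that arithmetic (the variable names here are illustrative, not from any AWS or NVIDIA source):

```python
# Back-of-the-envelope check of the Project Ceiba figures quoted above.
TOTAL_EXAFLOPS = 65        # headline aggregate figure for Project Ceiba
NUM_SUPERCHIPS = 16_384    # GH200 Superchips in the cluster

FLOPS_PER_EXAFLOP = 1e18
FLOPS_PER_PETAFLOP = 1e15

# Implied per-superchip throughput, in petaflops.
per_chip_pflops = TOTAL_EXAFLOPS * FLOPS_PER_EXAFLOP / NUM_SUPERCHIPS / FLOPS_PER_PETAFLOP

print(f"Per-superchip throughput: {per_chip_pflops:.2f} PFLOPS")  # ~3.97 PFLOPS
```

Note that headline "AI exaflops" figures typically assume low-precision arithmetic (e.g. FP8), not the FP64 performance used to rank traditional HPC supercomputers.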

This collaboration signifies a shared commitment to advancing generative AI. The integration of NVIDIA and AWS technologies will enhance the development, training, and inference of large language models and generative AI applications across industries.
