Discover Amazon EC2 G6 Instances' Deep Learning Power

 

Amazon EC2 G6 Instances

Amazon EC2 G6 instances, powered by NVIDIA L4 Tensor Core GPUs, are now generally available. G6 instances suit a wide range of machine learning and graphics-intensive use cases, and they deliver up to 2x better performance than Amazon EC2 G4dn instances for applications such as deep learning inference and graphics. Visit the Amazon EC2 G6 instance page to learn more.

NVIDIA L4 Tensor Core GPU

Amazon EC2 G6 instances, powered by NVIDIA L4 Tensor Core GPUs, can serve many graphics-intensive and machine learning use cases. Compared to EC2 G4dn instances, G6 instances deliver up to 2x better performance for applications such as deep learning inference and graphics.

Customers can use Amazon EC2 G6 instances to deploy ML models for natural language processing, language translation, video and image analysis, speech recognition, and personalization, as well as for graphics workloads such as creating and rendering real-time, cinematic-quality graphics and game streaming.

Amazon EC2 G6 instances feature third-generation AMD EPYC processors and up to eight NVIDIA L4 Tensor Core GPUs with 24 GB of memory per GPU. They offer up to 192 vCPUs, up to 100 Gbps of network bandwidth, and up to 7.52 TB of local NVMe SSD storage.
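
For reference, here is a minimal boto3 sketch for launching a single-GPU g6.xlarge instance. The AMI ID, key pair name, and Region are placeholders, not values from this announcement; substitute your own.

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Launch one g6.xlarge (1x L4 GPU, 4 vCPUs, 16 GiB memory).
    # ImageId and KeyName are placeholders; substitute your own values.
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",
        InstanceType="g6.xlarge",
        MinCount=1,
        MaxCount=1,
        KeyName="my-key-pair",
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "Name", "Value": "g6-demo"}],
        }],
    )

    print(response["Instances"][0]["InstanceId"])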

Advantages

Superior performance and cost efficiency for deep learning inference

G6 instances deliver up to 2x better performance than G4dn instances for deep learning inference. Powered by L4 GPUs with fourth-generation Tensor Cores, they are a highly efficient, cost-effective option for customers who want to run machine learning applications using NVIDIA libraries such as TensorRT, CUDA, and cuDNN.
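
As a rough illustration (not part of the announcement), the sketch below runs a torchvision ResNet-50 in FP16 on the instance's L4 GPU, exercising the fourth-generation Tensor Cores through CUDA and cuDNN; TensorRT can be layered on top for further optimization.

    import torch
    import torchvision.models as models

    device = torch.device("cuda")  # the L4 GPU on a G6 instance
    model = models.resnet50(weights=None).half().to(device).eval()

    # A random batch stands in for real inference inputs.
    batch = torch.randn(8, 3, 224, 224, dtype=torch.float16, device=device)

    with torch.no_grad():
        logits = model(batch)  # FP16 inference via CUDA/cuDNN

    print(logits.shape)  # torch.Size([8, 1000])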

High performance for graphics-intensive workloads

Amazon EC2 G6 instances deliver up to 2x better graphics performance than G4dn instances. Powered by L4 GPUs with third-generation RT cores and support for NVIDIA RTX technology, they are well suited to running powerful virtual workstations, enabling graphics-intensive applications at higher fidelity and resolution, and rendering realistic scenes faster. Gr6 instances offer a 1:8 vCPU-to-RAM ratio, making them ideal for graphics applications that need more memory.

Maximum resource efficiency

G6 instances are built on the AWS Nitro System, which combines dedicated hardware with a lightweight hypervisor to deliver practically all of the host hardware's compute and memory resources to your instances for better overall performance and security. On G6 instances, the Nitro System provides the GPUs in passthrough mode, giving them performance close to bare metal.
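
As a quick sanity check (assuming the NVIDIA driver is already installed on the instance), the snippet below lists the GPUs that the Nitro System exposes in passthrough mode:

    import subprocess

    # "nvidia-smi -L" lists every GPU visible to the driver, e.g.
    # "GPU 0: NVIDIA L4 (UUID: GPU-...)" on a single-GPU G6 size.
    result = subprocess.run(["nvidia-smi", "-L"], capture_output=True, text=True)
    print(result.stdout)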

Product details

Instance Size | GPUs | GPU Memory (GiB) | vCPUs | Memory (GiB) | Storage (GB) | Network Bandwidth (Gbps) | EBS Bandwidth (Gbps) | On-Demand Price/hr* | 1-yr ISP Effective Hourly (Linux) | 3-yr ISP Effective Hourly (Linux)

Single-GPU VMs
g6.xlarge | 1 | 24 | 4 | 16 | 1×250 | Up to 10 | Up to 5 | $0.805 | $0.499 | $0.342
g6.2xlarge | 1 | 24 | 8 | 32 | 1×450 | Up to 10 | Up to 5 | $0.978 | $0.606 | $0.416
g6.4xlarge | 1 | 24 | 16 | 64 | 1×600 | Up to 25 | 8 | $1.323 | $0.820 | $0.562
g6.8xlarge | 1 | 24 | 32 | 128 | 2×450 | 25 | 16 | $2.014 | $1.249 | $0.856
g6.16xlarge | 1 | 24 | 64 | 256 | 2×940 | 25 | 20 | $3.397 | $2.106 | $1.443
gr6.4xlarge | 1 | 24 | 16 | 128 | 1×600 | Up to 25 | 8 | $1.539 | $0.954 | $0.654
gr6.8xlarge | 1 | 24 | 32 | 256 | 2×450 | 25 | 16 | $2.446 | $1.517 | $1.040

Multi-GPU VMs
g6.12xlarge | 4 | 96 | 48 | 192 | 4×940 | 40 | 20 | $4.602 | $2.853 | $1.955
g6.24xlarge | 4 | 96 | 96 | 384 | 4×940 | 50 | 30 | $6.675 | $4.139 | $2.837
g6.48xlarge | 8 | 192 | 192 | 768 | 8×940 | 100 | 60 | $13.35 | $8.277 | $5.674

Prices shown in the preceding table are for the US East (Northern Virginia) AWS Region.
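
As a worked example using the g6.xlarge figures from the table above, the effective hourly savings of the 1-year and 3-year ISP rates versus On-Demand work out as follows:

    on_demand, isp_1yr, isp_3yr = 0.805, 0.499, 0.342  # g6.xlarge, US East (N. Virginia)

    for label, rate in [("1-yr ISP", isp_1yr), ("3-yr ISP", isp_3yr)]:
        savings = (1 - rate / on_demand) * 100
        print(f"{label}: ${rate:.3f}/hr, roughly {savings:.0f}% below On-Demand")

    # 1-yr ISP: $0.499/hr, roughly 38% below On-Demand
    # 3-yr ISP: $0.342/hr, roughly 58% below On-Demand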

Features

NVIDIA L4 Tensor Core GPUs

G6 instances feature high-performance NVIDIA L4 Tensor Core GPUs, making them ideal for machine learning and graphics-intensive applications. Each instance has up to eight L4 Tensor Core GPUs with 24 GB of memory per GPU, fourth-generation NVIDIA Tensor Cores, third-generation NVIDIA RT cores, and DLSS 3.0 technology. NVIDIA L4 GPUs are equipped with two video encoders and four video decoders and support hardware AV1 encoding.

NVIDIA libraries and drivers

Customers using G6 instances get NVIDIA RTX Enterprise drivers at no additional cost. These drivers can be used to build high-quality virtual workstations for a variety of graphics-intensive applications. G6 instances also support the CUDA, cuDNN, NVENC, TensorRT, cuBLAS, and OpenCL libraries, as well as DirectX 11/12, Vulkan 1.3, and OpenGL 4.6.
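
A small sketch like the one below (using PyTorch, which is not required on G6 but commonly installed) confirms which GPU, CUDA runtime, and cuDNN build the instance sees before pinning versions of TensorRT or other NVIDIA libraries:

    import torch

    print("GPU:         ", torch.cuda.get_device_name(0))   # e.g. "NVIDIA L4"
    print("CUDA runtime:", torch.version.cuda)
    print("cuDNN:       ", torch.backends.cudnn.version())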

High-end storage and networking

With 100 Gbps of network bandwidth, G6 instances can meet the low-latency requirements of machine learning inference and graphics-intensive applications. Support for up to 7.52 TB of local NVMe SSD storage and 24 GB of memory per GPU means large models and datasets can be stored locally for high-speed machine learning training and inference. G6 instances can also store large video files locally, which improves graphics performance and enables rendering of larger, more complex video files.

Built on the AWS Nitro System: G6 instances are built on the AWS Nitro System, a rich collection of building blocks that offloads many traditional virtualization functions to dedicated hardware and software, minimizing virtualization overhead while delivering high performance, high availability, and high security.

Getting started with machine learning on G6 instances

ML practitioners and researchers can accelerate deep learning in the cloud at any scale by using AWS Deep Learning AMIs (DLAMI) or AWS Deep Learning Containers. Deep Learning Containers are Docker images preinstalled with DL frameworks; they simplify the deployment of custom ML environments by saving you the work of building and optimizing environments from scratch.
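
One way to locate a recent Deep Learning AMI programmatically is sketched below; the name filter is an assumption about current DLAMI naming and should be adjusted to the framework and OS variant you need.

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Assumed name pattern for GPU Deep Learning AMIs published by Amazon.
    images = ec2.describe_images(
        Owners=["amazon"],
        Filters=[{"Name": "name", "Values": ["Deep Learning*AMI*GPU*"]}],
    )["Images"]

    latest = max(images, key=lambda img: img["CreationDate"])
    print(latest["Name"], latest["ImageId"])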

Using Amazon ECS or EKS

If you prefer to manage your own containerized workloads with container orchestration services, you can deploy G6 instances using Amazon ECS or Amazon EKS.

News source: Amazon EC2 G6 Instances
