inf2.xlarge
The inf2.xlarge instance is in the Machine Learning ASIC instances family, with 4 vCPUs, 16.0 GiB of memory, and up to 15 Gbps of network bandwidth, starting at $0.7582 per hour.
Pricing
On Demand pricing starts at $0.7582 per hour; Spot, 1 Yr Reserved, and 3 Yr Reserved rates are also offered.
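To translate the hourly On Demand rate into a rough monthly figure, a minimal sketch (assuming the common 730-hours-per-month approximation, which is not stated on this page):

```python
HOURLY_ON_DEMAND = 0.7582  # USD per hour, from the pricing above
HOURS_PER_MONTH = 730      # assumed approximation: 365 * 24 / 12

monthly = HOURLY_ON_DEMAND * HOURS_PER_MONTH
print(f"${monthly:.2f} per month")  # → $553.49 per month
```

Actual billing depends on the number of hours in the month and any applicable discounts.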
Family Sizes
Size | vCPUs | Memory (GiB) |
---|---|---|
inf2.xlarge | 4 | 16 |
inf2.8xlarge | 32 | 128 |
inf2.24xlarge | 96 | 384 |
inf2.48xlarge | 192 | 768 |
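The family scales memory linearly with vCPUs at a constant 4 GiB per vCPU, which the sizes above make easy to verify:

```python
# Sizes taken from the Family Sizes table above: size -> (vCPUs, memory in GiB).
family = {
    "inf2.xlarge": (4, 16),
    "inf2.8xlarge": (32, 128),
    "inf2.24xlarge": (96, 384),
    "inf2.48xlarge": (192, 768),
}

for size, (vcpus, mem_gib) in family.items():
    ratio = mem_gib / vcpus
    print(f"{size}: {ratio:.1f} GiB per vCPU")
    assert ratio == 4.0  # memory-to-vCPU ratio is constant across the family
```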
Instance Details
Compute | Value |
---|---|
vCPUs | 4 |
Memory (GiB) | 16.0 |
Memory per vCPU (GiB) | 4.0 |
Physical Processor | AMD EPYC 7R13 Processor |
Clock Speed (GHz) | 2.95 |
CPU Architecture | x86_64 |
GPU | 1 |
GPU Architecture | AWS Inferentia2 |
Video Memory (GiB) | 32 |
GPU Compute Capability | 0 |
FPGA | 0 |
Networking | Value |
---|---|
Network Performance (Gbps) | Up to 15 |
Enhanced Networking | True |
IPV6 | True |
Placement Group | True |
Storage | Value |
---|---|
EBS Optimized | True |
Max Bandwidth (Mbps) on EBS | 10000 |
Max Throughput (MB/s) on EBS | 1250.0 |
Max I/O Operations/second (IOPS) | 40000 |
Baseline Bandwidth (Mbps) on EBS | 1250 |
Baseline Throughput (MB/s) on EBS | 156.25 |
Baseline I/O Operations/second (IOPS) | 6000 |
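The bandwidth (Mbps) and throughput (MB/s) columns describe the same limits in different units; dividing megabits per second by 8 recovers the MB/s figures shown above:

```python
def mbps_to_mb_per_s(mbps: float) -> float:
    """Convert megabits per second to megabytes per second (8 bits per byte)."""
    return mbps / 8

# Figures from the EBS table above.
print(mbps_to_mb_per_s(10000))  # max: 1250.0 MB/s
print(mbps_to_mb_per_s(1250))   # baseline: 156.25 MB/s
```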
Amazon | Value |
---|---|
Generation | current |
Instance Type | inf2.xlarge |
Family | Machine Learning ASIC Instances |
Name | INF2 Extra Large |
Elastic Map Reduce (EMR) | False |