The inf1.6xlarge instance is part of the Inf1 series of Machine Learning ASIC Instances, featuring 24 vCPUs, 48 GiB of memory, and 25 Gigabit network performance. It is available on demand at $1.1800/hour.
Amazon EC2 Inf1 instances are specifically engineered to support machine learning inference applications:

- Equipped with up to 16 AWS Inferentia chips
- Compatible with the AWS Neuron SDK
- High-frequency 2nd Generation Intel Xeon Scalable processors (Cascade Lake P-8259L)
- Up to 100 Gbps of networking bandwidth
Credits: AWS Resources, Updated At: 2024-08-23
Typical use cases include recommendation systems, forecasting, image and video analysis, advanced text analytics, document analysis, voice processing, conversational agents, translation, transcription, and fraud detection.
Instances | vCPUs | Memory (GiB)
---|---|---
inf1.xlarge | 4 | 8
inf1.2xlarge | 8 | 16
inf1.6xlarge | 24 | 48
inf1.24xlarge | 96 | 192
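The sizes scale linearly: every inf1 size provides 2 GiB of memory per vCPU. A minimal sketch of that relationship, using only the figures from the table above:

```python
# vCPU counts per inf1 size, taken from the size table above.
INF1_VCPUS = {
    "inf1.xlarge": 4,
    "inf1.2xlarge": 8,
    "inf1.6xlarge": 24,
    "inf1.24xlarge": 96,
}

GIB_PER_VCPU = 2  # ratio observed across all four sizes


def memory_gib(instance_type: str) -> int:
    """Memory (GiB) for an inf1 size, assuming the 2 GiB/vCPU ratio holds."""
    return INF1_VCPUS[instance_type] * GIB_PER_VCPU


for name, vcpus in INF1_VCPUS.items():
    print(f"{name}: {vcpus} vCPUs, {memory_gib(name)} GiB")
```

`memory_gib` is a hypothetical helper for illustration; it reproduces the Memory column exactly (e.g. 24 vCPUs × 2 = 48 GiB for inf1.6xlarge).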
General | 
---|---
Type | inf1.6xlarge
Region | N. Virginia
Family Group | Inf1
Family Category | Machine Learning ASIC Instances
Instance Generation | Current
Term Type | On-demand
Pricing (USD/hr) | $1.1800
Compute | 
---|---
vCPU | 24
Memory | 48 GiB
Storage | EBS only
Processor Architecture | 64-bit
Operating System | Linux
EBS Optimized | NA
Tenancy | Shared
Clock Speed | NA
GPU | NA
ECU | 0.0000
Virtualization | HVM
Networking | 
---|---
Network Performance | 25 Gigabit
ENA Support | True
Region | Price / Unit (USD/hr) | Monthly Pricing (USD)
---|---|---
Ohio | $1.1800 | $861.40 |
Oregon | $1.1800 | $861.40 |
Mumbai | $1.2410 | $905.93 |
Stockholm | $1.2540 | $915.42 |
Canada (Central) | $1.3150 | $959.95
Ireland | $1.3150 | $959.95 |
Paris | $1.3790 | $1006.67 |
London | $1.3820 | $1008.86 |
Milan | $1.3820 | $1008.86 |
N. California | $1.4180 | $1035.14 |
Bahrain | $1.4470 | $1056.31 |
Seoul | $1.4530 | $1060.69 |
Sydney | $1.4750 | $1076.75 |
Frankfurt | $1.4750 | $1076.75 |
Singapore | $1.5940 | $1163.62 |
Tokyo | $1.5940 | $1163.62 |
Sao Paulo | $1.9500 | $1423.50 |
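The monthly figures above follow the common 730-hours-per-month convention: hourly rate × 730, rounded to the cent. A quick check of that convention against a few rows of the table:

```python
# Monthly price = hourly rate x 730 hours (the usual cloud-pricing
# monthly convention), rounded to the nearest cent.
HOURS_PER_MONTH = 730


def monthly_price(hourly_usd: float) -> float:
    """Estimated monthly cost for a given on-demand hourly rate."""
    return round(hourly_usd * HOURS_PER_MONTH, 2)


# Spot-check against rows from the regional pricing table.
print(monthly_price(1.1800))  # Ohio / Oregon
print(monthly_price(1.2410))  # Mumbai
print(monthly_price(1.9500))  # Sao Paulo
```

Every row in the table reproduces under this formula, e.g. $1.1800 × 730 = $861.40 for Ohio and Oregon.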