The inf1.xlarge instance is part of the Inf1 family of Machine Learning ASIC Instances, featuring 4 vCPUs, 8 GiB of memory, and up to 25 Gigabit of network bandwidth. It is available on demand at $0.2280/hour.
Amazon EC2 Inf1 instances are specifically engineered for machine learning inference workloads. Key features:

- Up to 16 AWS Inferentia chips
- Compatible with the AWS Neuron SDK (see the sketch after this list)
- High-frequency 2nd Generation Intel Xeon Scalable processors (Cascade Lake P-8259L)
- Up to 100 Gbps of networking bandwidth
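As a brief illustration of Neuron SDK compatibility, here is a minimal sketch of compiling a PyTorch model for Inferentia with torch-neuron. The model (ResNet-50), input shape, and output filename are illustrative assumptions, and the exact API can vary by Neuron SDK version.

```python
# Minimal sketch: compiling a PyTorch model for AWS Inferentia (Inf1) with the
# Neuron SDK. Assumes the torch-neuron package is installed on an Inf1 instance;
# the model and input shape below are illustrative placeholders.
import torch
import torch_neuron  # registers the torch.neuron namespace (AWS Neuron SDK)
from torchvision import models

model = models.resnet50(pretrained=True)
model.eval()

# Example input matching the model's expected shape (one 224x224 RGB image).
example_input = torch.zeros([1, 3, 224, 224], dtype=torch.float32)

# Compile for the Inferentia NeuronCore; unsupported operators fall back to CPU.
model_neuron = torch.neuron.trace(model, example_inputs=[example_input])

# Save the compiled artifact; it can later be loaded with torch.jit.load().
model_neuron.save("resnet50_neuron.pt")
```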
Credits: AWS resources. Updated: 2024-08-23
Typical use cases include recommendation systems, forecasting, image and video analysis, advanced text analytics, document analysis, voice processing, conversational agents, translation, transcription, and fraud detection.
Instance | vCPUs | Memory (GiB)
---|---|---
inf1.xlarge | 4 | 8
inf1.2xlarge | 8 | 16
inf1.6xlarge | 24 | 48
inf1.24xlarge | 96 | 192
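The sizes in the table above can also be confirmed programmatically. A minimal sketch using boto3's describe_instance_types follows; the region and credential setup are assumptions.

```python
# Minimal sketch: listing vCPU and memory for the Inf1 sizes with boto3.
# Assumes AWS credentials are configured; the region is an illustrative choice.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

resp = ec2.describe_instance_types(
    InstanceTypes=["inf1.xlarge", "inf1.2xlarge", "inf1.6xlarge", "inf1.24xlarge"]
)

for it in resp["InstanceTypes"]:
    name = it["InstanceType"]
    vcpus = it["VCpuInfo"]["DefaultVCpus"]
    mem_gib = it["MemoryInfo"]["SizeInMiB"] / 1024
    print(f"{name}: {vcpus} vCPUs, {mem_gib:.0f} GiB")
```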
General | |
---|---|
Type | inf1.xlarge |
Region | N. Virginia |
Family Group | Inf1 |
Family Category | Machine Learning ASIC Instances
Instance Generation | Current |
Term Type | On-demand |
Pricing (USD/hr) | $0.2280
Compute | |
---|---|
vCPU | 4 |
Memory | 8 GiB |
Storage | EBS only |
Processor Architecture | 64-bit |
Operating System | Linux |
EBS Optimized | NA |
Tenancy | Shared |
Clock Speed | NA |
GPU | NA |
ECU | 0.0000 |
Virtualization | HVM |
Networking | |
---|---|
Network Performance | Up to 25 Gigabit |
ENA Support | True |
Region | Price per Hour (USD) | Monthly Price (USD)
---|---|---
Oregon | $0.2280 | $166.44 |
Ohio | $0.2280 | $166.44 |
Mumbai | $0.2400 | $175.20 |
Stockholm | $0.2420 | $176.66 |
Canada (Central) | $0.2540 | $185.42
Ireland | $0.2540 | $185.42 |
Milan | $0.2670 | $194.91 |
Paris | $0.2670 | $194.91 |
London | $0.2670 | $194.91 |
N. California | $0.2740 | $200.02 |
Bahrain | $0.2800 | $204.40 |
Seoul | $0.2810 | $205.13 |
Sydney | $0.2850 | $208.05 |
Frankfurt | $0.2850 | $208.05 |
Singapore | $0.3080 | $224.84 |
Tokyo | $0.3080 | $224.84 |
Sao Paulo | $0.3770 | $275.21 |
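The monthly figures above follow the common 730-hour-month convention (hourly rate × 730). A small sketch of that arithmetic, using a few rates copied from the table above:

```python
# Monthly price = hourly on-demand rate x 730 hours (the convention used in the
# regional pricing table above). Rates below are a subset copied from that table.
HOURS_PER_MONTH = 730

hourly_rates = {
    "Oregon": 0.2280,
    "Mumbai": 0.2400,
    "Sao Paulo": 0.3770,
}

for region, rate in hourly_rates.items():
    print(f"{region}: ${rate * HOURS_PER_MONTH:.2f}/month")
# Oregon: $166.44/month, Mumbai: $175.20/month, Sao Paulo: $275.21/month
```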