RunPod is a cloud computing platform tailored for AI, machine learning, and general compute workloads. It provides scalable, high-performance GPU and CPU resources, enabling users to develop, train, and deploy AI models efficiently.
With offerings like dedicated GPU Pods and Serverless endpoints, RunPod caters to a wide range of computational needs. The key rates are summarized below, followed by a short cost-estimation sketch.
Pod Pricing (per hour):

- Community Cloud:
  - RTX A4000 (16GB VRAM): $0.17/hr
  - RTX 3090 (24GB VRAM): $0.22/hr
  - A100 PCIe (80GB VRAM): $1.19/hr
- Secure Cloud:
  - A40 (48GB VRAM): $0.40/hr
  - L40 (48GB VRAM): $0.69/hr
  - H100 PCIe (80GB VRAM): $1.99/hr
  - MI300X (192GB VRAM): $2.49/hr

Serverless Pricing (per second):

- A4000 (16GB VRAM): $0.00016/sec
- A100 (80GB VRAM): $0.00076/sec
- H100 PRO (80GB VRAM): $0.00116/sec
- H200 PRO (141GB VRAM): $0.00155/sec

Storage Pricing:

- Pod Volume & Container Disk: $0.10/GB/month (running), $0.20/GB/month (idle)
- Network Volume: $0.07/GB/month (<1TB), $0.05/GB/month (>1TB)
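
To make the rates concrete, the sketch below estimates two hypothetical scenarios using the tables above: a week of training on a dedicated A100 Pod with attached storage, versus serving inference requests on a Serverless A100. The rate dictionaries and helper functions are illustrative assumptions only, not part of any RunPod SDK.

```python
# Illustrative cost estimates based on the published rates listed above.
# These helpers are assumptions for this example, not a RunPod API.

POD_RATES_PER_HR = {          # Pod pricing, $/hr
    "RTX A4000": 0.17,
    "RTX 3090": 0.22,
    "A100 PCIe": 1.19,
    "A40": 0.40,
    "L40": 0.69,
    "H100 PCIe": 1.99,
    "MI300X": 2.49,
}

SERVERLESS_RATES_PER_SEC = {  # Serverless pricing, $/sec
    "A4000": 0.00016,
    "A100": 0.00076,
    "H100 PRO": 0.00116,
    "H200 PRO": 0.00155,
}

def pod_cost(gpu: str, hours: float) -> float:
    """Cost of keeping a Pod running for a given number of hours."""
    return POD_RATES_PER_HR[gpu] * hours

def serverless_cost(gpu: str, requests: int, secs_per_request: float) -> float:
    """Cost of serving `requests` requests, each running `secs_per_request` seconds."""
    return SERVERLESS_RATES_PER_SEC[gpu] * requests * secs_per_request

def storage_cost(gb: float, rate_per_gb_month: float = 0.10) -> float:
    """Monthly storage cost; default is the running Pod volume rate."""
    return gb * rate_per_gb_month

# Scenario 1: one week of training on an A100 PCIe Pod with a 200 GB volume.
train = pod_cost("A100 PCIe", hours=7 * 24) + storage_cost(200)

# Scenario 2: serving 100k inference requests, ~2 s each, on Serverless A100.
serve = serverless_cost("A100", requests=100_000, secs_per_request=2.0)

print(f"Training (1 week, A100 Pod + 200 GB volume): ${train:,.2f}")
print(f"Serving  (100k requests, Serverless A100):   ${serve:,.2f}")
```

Because Serverless bills only for the seconds a request actually runs, it tends to favor bursty or unpredictable traffic, while a dedicated Pod is usually the cheaper choice for sustained, round-the-clock utilization.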

