Our Testing Methodology
Every provider is tested under identical conditions:
- Server: Entry-level plan from each provider, US datacenter
- OS: Ubuntu 24.04 LTS, fresh install
- CPU: sysbench cpu run --threads=1 --time=60
- Disk I/O: fio 4K random read and random write (--rw=randread / --rw=randwrite), iodepth=32
- Network: iperf3 throughput to the same set of test servers for every provider
- Timing: Tests run 3 times, median result reported
All data is from tests we ran ourselves. No vendor-provided numbers.
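For reproducibility, here are the invocations behind those bullet points. The sysbench line is taken verbatim from the list above; the fio job parameters beyond block size and queue depth (file size, runtime, direct I/O, ioengine) and the iperf3 endpoint are illustrative assumptions, not a transcript of our harness.

```bash
# CPU: single-threaded sysbench, 60 seconds (verbatim from the list above)
sysbench cpu run --threads=1 --time=60

# Disk I/O: 4K random read, then 4K random write, queue depth 32.
# size/runtime/direct/ioengine values here are representative, not prescriptive.
fio --name=randread --rw=randread --bs=4k --iodepth=32 \
    --size=1G --runtime=60 --time_based --direct=1 --ioengine=libaio
fio --name=randwrite --rw=randwrite --bs=4k --iodepth=32 \
    --size=1G --runtime=60 --time_based --direct=1 --ioengine=libaio

# Network: TCP throughput against a fixed iperf3 endpoint
# (hostname is a placeholder)
iperf3 -c iperf.example.net -t 30
```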
Why We Test Entry-Level Plans
We benchmark the cheapest plan from each provider because that's what most users start with. Entry-level plans also reveal the most about a provider's infrastructure quality: premium plans often get dedicated resources, but the $5-7/mo tier shows how aggressive the oversubscription ratio really is. A provider that delivers strong performance on its cheapest plan will only do better on higher tiers.
Handling "Noisy Neighbor" Variance
Shared VPS performance varies depending on what other tenants on the same host are doing. To account for this, we run each test 3 times across different time periods (morning, afternoon, evening) and report the median. We also flag providers where the variance between runs exceeds 15% — a sign of aggressive overselling. In our testing, Vultr, DigitalOcean, and Hetzner showed the most consistent results (under 5% variance), while Contabo and RackNerd showed higher variance (10-20%).
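To make the flagging rule concrete, here is a minimal sketch of a wrapper in the spirit of our process, not our actual harness. It runs a benchmark command three times, reports the median, and flags the result when the spread between runs (max minus min, as a share of the median, standing in for "variance" above) exceeds 15%. The script name and the convention that the wrapped command prints a single numeric score are assumptions for illustration.

```bash
#!/usr/bin/env bash
# flag_variance.sh -- illustrative sketch, not our production harness.
# Usage: ./flag_variance.sh <command that prints one numeric score>
# Runs the command 3 times, prints the median, flags spreads over 15%.
# (In practice we space the 3 runs across morning/afternoon/evening;
# this sketch runs them back-to-back.)
set -euo pipefail

scores=()
for _ in 1 2 3; do
  scores+=("$("$@")")
done

# Sort the three scores numerically; the middle value is the median.
mapfile -t sorted < <(printf '%s\n' "${scores[@]}" | sort -n)
min=${sorted[0]}; median=${sorted[1]}; max=${sorted[2]}

# Spread as a percentage of the median; >15% is the overselling flag.
awk -v min="$min" -v max="$max" -v med="$median" 'BEGIN {
  spread = (max - min) / med * 100
  printf "median=%s  spread=%.1f%%", med, spread
  if (spread > 15) printf "  <-- flagged: high run-to-run variance"
  print ""
}'
```

For example, wrapping the CPU test from the methodology list (extracting sysbench's "events per second" figure as the score):

```bash
./flag_variance.sh sh -c \
  "sysbench cpu run --threads=1 --time=60 | awk '/events per second/ {print \$4}'"
```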
Re-Testing Cadence
We re-run all benchmarks quarterly and after any major infrastructure announcements (new CPU generations, datacenter migrations, pricing changes). The data on this page was last updated in March 2026. If a provider's performance changes significantly between cycles, we note it in the individual provider benchmark pages.