
Log Storage Cost Calculator


Enter your daily log volume, retention window, and compression ratio to get an instant monthly storage cost estimate. Works with S3, GCS, Azure Blob, Elasticsearch, and any GB-priced store.

Last updated: April 2026

Estimates are based on typical engineering scenarios and publicly available provider pricing documentation.

The log storage cost calculator helps you model the real price of retaining application logs before your bill arrives. Log data compounds quickly — a modest 10 GB/day pipeline becomes 300 GB of raw storage after 30 days, and most teams run multiple services at far higher volumes. Who uses this? Platform engineers sizing object storage for a centralised log aggregation pipeline, SREs deciding whether a 90-day retention policy is affordable, and FinOps practitioners auditing why the observability budget keeps ballooning. Plugging in your actual numbers takes 30 seconds and often surfaces surprising savings opportunities.

Compression is the single biggest lever in this formula. Structured JSON logs typically compress 5:1 to 10:1 with gzip or zstd, meaning a 300 GB raw footprint can shrink to 30–60 GB. The calculator lets you model both the raw and compressed size side by side so the impact is immediately visible.

Storage pricing varies by provider and tier: AWS S3 Standard charges ~$0.023/GB/month, S3 Infrequent Access ~$0.0125/GB/month, and GCS Nearline ~$0.010/GB/month. Use the price field to compare tiers or providers without leaving the page.

How to Calculate Log Storage Cost

Log Storage Cost — how it works diagram

1. Measure your daily ingestion volume in GB. Check your log shipper metrics, S3 PUT bytes, or CloudWatch Logs ingestion stats.
2. Set the retention period — how many days of logs you need to keep online for search or compliance.
3. Multiply: Raw Storage (GB) = Daily Volume × Retention Days.
4. Divide raw storage by your compression ratio to get the on-disk footprint. Structured JSON typically compresses 5:1 to 10:1 with gzip.
5. Multiply compressed storage by your provider's per-GB/month price to get the monthly cost.

Formula

Monthly Cost = (Daily Volume × Retention Days ÷ Compression Ratio) × $/GB/month

Daily Volume     — uncompressed log bytes ingested per day (GB)
Retention Days   — how long logs are kept (e.g. 30, 90, 365)
Compression Ratio — gzip/zstd ratio; use 5 for typical JSON logs, 10 for verbose text
$/GB/month       — storage unit price (S3 Standard ≈ $0.023, GCS Nearline ≈ $0.010)
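The formula can be expressed as a short helper. This is just a sketch of the arithmetic above; the function and parameter names are illustrative, not part of any library:

```python
def monthly_storage_cost(daily_gb: float, retention_days: int,
                         compression_ratio: float,
                         price_per_gb_month: float) -> float:
    """Steady-state monthly cost of retaining logs.

    daily_gb           -- uncompressed GB ingested per day
    retention_days     -- how long logs are kept online
    compression_ratio  -- e.g. 5 for 5:1 gzip on JSON logs
    price_per_gb_month -- provider storage price, $/GB/month
    """
    compressed_gb = daily_gb * retention_days / compression_ratio
    return compressed_gb * price_per_gb_month

# 5 GB/day, 30-day retention, 5:1 compression, S3 Standard:
monthly_storage_cost(5, 30, 5, 0.023)  # ≈ $0.69/month
```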

Example Log Storage Cost Calculations

Example 1 — Small SaaS app on S3 Standard (30-day retention)

Daily volume:   5 GB/day  ×  30 days  =  150 GB raw
Compression:    150 GB  ÷  5:1  =  30 GB on disk
Storage price:  30 GB  ×  $0.023/GB/mo  =  $0.69/month
────────────────────────────────────────────────────
Annual cost:  ~$8.28   (negligible — retention is the right concern here)

Example 2 — High-traffic microservices on S3 Standard (90-day retention)

Daily volume:   50 GB/day  ×  90 days  =  4,500 GB raw
Compression:    4,500 GB  ÷  8:1  =  562.5 GB on disk
Storage price:  562.5 GB  ×  $0.023/GB/mo  =  $12.94/month
────────────────────────────────────────────────────
Switch to S3 IA ($0.0125): $7.03/month  →  saves ~$70/year

Example 3 — Enterprise compliance logging on GCS Nearline (365-day retention)

Daily volume:   200 GB/day  ×  365 days  =  73,000 GB raw
Compression:    73,000 GB  ÷  6:1  =  12,167 GB on disk
Storage price:  12,167 GB  ×  $0.010/GB/mo  =  $121.67/month
────────────────────────────────────────────────────
Annual cost:  ~$1,460   (segment cold vs. hot logs to cut this further)
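To compare tiers for a given workload without redoing the arithmetic by hand, a quick sweep over the prices quoted in this article, using Example 2's inputs:

```python
TIER_PRICES = {  # $/GB/month, as quoted above
    "S3 Standard": 0.023,
    "S3 Infrequent Access": 0.0125,
    "GCS Nearline": 0.010,
}

def monthly_cost(daily_gb, retention_days, ratio, price):
    return daily_gb * retention_days / ratio * price

# Example 2 workload: 50 GB/day, 90-day retention, 8:1 compression.
for tier, price in TIER_PRICES.items():
    print(f"{tier}: ${monthly_cost(50, 90, 8, price):.2f}/month")
```

This reproduces the $12.94 (Standard) and $7.03 (Infrequent Access) figures from Example 2; the Nearline row is included for comparison, though moving data across providers also incurs egress charges the storage price alone does not capture.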


Frequently Asked Questions

How do I measure my current daily log volume?
Check your log shipper dashboard (Fluentd, Logstash, Vector) for bytes-out per day. In AWS, open CloudWatch Logs → Log Groups and sum the "Stored bytes" delta over 24 hours. In S3, enable Storage Lens and filter on PUT request bytes. Most observability platforms (Datadog, Grafana Cloud) also show ingestion volume on the billing page.
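For the CloudWatch route, one approach is to record per-log-group stored bytes twice, 24 hours apart, and sum the deltas. A minimal sketch (the snapshot values below are hypothetical stand-ins for the storedBytes field that describe-log-groups returns):

```python
def daily_ingestion_gb(snapshot_day0: dict, snapshot_day1: dict) -> float:
    """Sum the 24-hour 'Stored bytes' delta across log groups, in GB."""
    delta_bytes = sum(
        snapshot_day1[group] - snapshot_day0.get(group, 0)
        for group in snapshot_day1
    )
    return delta_bytes / 1024**3

# Hypothetical snapshots taken 24 hours apart:
day0 = {"/app/api": 40 * 1024**3, "/app/worker": 10 * 1024**3}
day1 = {"/app/api": 44 * 1024**3, "/app/worker": 11 * 1024**3}
daily_ingestion_gb(day0, day1)  # → 5.0 GB/day
```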
What compression ratio should I use for JSON application logs?
Structured JSON logs typically achieve 5:1 to 10:1 compression with gzip (level 6) or zstd. Verbose logs with long stack traces compress closer to 10:1. Short, dense logs with many unique IDs compress closer to 3:1. For the most accurate ratio, compress a representative 100 MB sample with the same tool and compression level your pipeline uses and measure the output size.
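The sample measurement can also be done with Python's standard-library gzip module. A small sketch (the sample below is synthetic and compresses unrealistically well, so substitute a slice of your real logs):

```python
import gzip

def compression_ratio(sample: bytes, level: int = 6) -> float:
    """Ratio of raw size to gzip-compressed size for a log sample."""
    return len(sample) / len(gzip.compress(sample, compresslevel=level))

# Synthetic, highly repetitive sample -- real logs will compress less well.
sample = b'{"level":"info","msg":"request handled","status":200}\n' * 2000
ratio = compression_ratio(sample)
```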
How much does CloudWatch Logs storage cost vs S3?
CloudWatch Logs charges $0.03/GB/month for storage — about 30% more than S3 Standard ($0.023/GB/month). For high-volume or long-retention workloads, exporting logs to S3 via a subscription filter can significantly reduce costs. Use the Storage Cost Calculator to compare tier pricing before committing to a pipeline.
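A back-of-the-envelope check of that gap, using the two storage prices quoted above (this ignores CloudWatch ingestion fees and S3 request costs, which apply on top):

```python
CLOUDWATCH_LOGS = 0.030  # $/GB/month storage, as quoted above
S3_STANDARD = 0.023      # $/GB/month

def monthly_savings_from_export(stored_gb: float) -> float:
    """Storage-only saving from keeping logs in S3 instead of CloudWatch Logs."""
    return stored_gb * (CLOUDWATCH_LOGS - S3_STANDARD)

monthly_savings_from_export(1000)  # ≈ $7/month per TB stored
```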
Is it worth using a columnar format like Parquet for log archival?
Yes, if you query archived logs with Athena, BigQuery, or Spark. Parquet with Snappy compression typically achieves 15:1 to 30:1 reduction on structured logs, and columnar scans read far less data per query. The trade-off is a conversion pipeline. For logs you only need to grep occasionally, compressed JSON on S3 IA is simpler and nearly as cheap.
How do retention policies affect compliance requirements?
Compliance frameworks vary: PCI-DSS requires at least 12 months of audit log retention, with the most recent 3 months immediately available; SOC 2 auditors typically expect about 1 year; GDPR may require deletion after a defined period. Separate compliance-relevant logs (auth, payment, admin actions) from noisy application logs and apply different retention tiers. See the Data Transfer Cost Calculator to model restore costs when retrieving archived logs.
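The tiered-retention idea can be modeled with the same formula as the rest of this page. All volumes, ratios, and the 2/48 GB split below are hypothetical:

```python
def monthly_cost(daily_gb, retention_days, ratio, price):
    return daily_gb * retention_days / ratio * price

# Hypothetical 50 GB/day pipeline: 2 GB/day is compliance-relevant
# (auth, payment, admin) and kept 365 days on GCS Nearline; the other
# 48 GB/day of application noise is kept 30 days on S3 Standard.
compliance = monthly_cost(2, 365, 6, 0.010)
app_noise  = monthly_cost(48, 30, 6, 0.023)
split_total = compliance + app_noise

# Baseline: keeping everything 365 days on S3 Standard.
blanket = monthly_cost(50, 365, 6, 0.023)
```

Under these assumed numbers the split comes to a few dollars a month versus roughly $70/month for blanket 365-day retention, which is why separating hot and compliance logs is usually the first tiering win.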