
Credits & Pricing

Weighted credits, tier costs, caching, and overage billing.

Weighted Credits

PeerLM uses a weighted credit system where different model tiers cost different amounts per response:

Tier      Multiplier  Typical Models
Standard  1x          Smaller/faster models (GPT-4o-mini, DeepSeek V3)
Advanced  1x          Mid-range models (GPT-4o, Claude Sonnet 4.6)
Premium   2x          Large frontier models (GPT-5.2, Claude Opus 4.6)
Frontier  3x          Latest cutting-edge models (o1-pro, o3-pro)

Cost Formula

The estimated credit cost for a run is:

credits = SUM(all model multipliers)
         x system_prompts x test_prompts x samples

Both generators and evaluators consume credits at their tier multiplier.
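The formula above can be sketched in Python. This is an illustrative estimate only; the function name and tier keys are assumptions, with multipliers taken from the tier table.

```python
# Tier multipliers from the table above (assumed keys for illustration).
TIER_MULTIPLIER = {"standard": 1, "advanced": 1, "premium": 2, "frontier": 3}

def estimate_credits(generator_tiers, evaluator_tiers,
                     system_prompts, test_prompts, samples):
    """Estimated run cost: sum of all model multipliers (generators and
    evaluators alike) times the run's prompt/sample grid size."""
    multiplier_sum = sum(TIER_MULTIPLIER[t]
                         for t in generator_tiers + evaluator_tiers)
    return multiplier_sum * system_prompts * test_prompts * samples

# One premium generator plus one standard evaluator,
# 2 system prompts x 5 test prompts x 3 samples:
estimate_credits(["premium"], ["standard"], 2, 5, 3)  # (2+1)*2*5*3 = 90
```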

What's Free

  • Cache hits — reused responses cost 0 credits
  • Failed responses — credits are refunded for API failures
  • Recompute — re-aggregating scores is always free
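The refund rules above mean the billed cost only counts responses that were actually generated. A minimal sketch, assuming a per-response record of tier, cache, and failure status (the data shape is hypothetical):

```python
TIER_MULTIPLIER = {"standard": 1, "advanced": 1, "premium": 2, "frontier": 3}

def billed_credits(responses):
    """responses: list of (tier, was_cache_hit, failed) tuples.
    Cache hits cost 0 credits; failed responses are refunded."""
    return sum(TIER_MULTIPLIER[tier]
               for tier, was_cache_hit, failed in responses
               if not was_cache_hit and not failed)

# One billed premium response, one premium cache hit, one failed standard call:
runs = [("premium", False, False),
        ("premium", True, False),
        ("standard", False, True)]
billed_credits(runs)  # only the first response is charged: 2 credits
```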

Pay-As-You-Go (Free Plan)

Free plan users receive 200 signup credits and can purchase additional credits at $0.20 per credit. Enable PAYG billing in Settings > Billing.

Overage Billing (Pro Plan)

Pro plan users can enable overage billing at $0.10 per credit. When enabled, runs can exceed your monthly credit allowance. Overage charges are reported to Stripe in real time. Toggle this in Settings > Billing.
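The overage charge is simple arithmetic: credits beyond the monthly allowance are billed at the overage rate. A sketch (the allowance figure below is hypothetical; only the $0.10 rate comes from the text above):

```python
def overage_charge(credits_used, monthly_allowance, rate_per_credit=0.10):
    """Dollars billed for credits consumed beyond the monthly allowance,
    at the Pro overage rate of $0.10 per credit."""
    overage = max(0, credits_used - monthly_allowance)
    return overage * rate_per_credit

# 1200 credits used against a (hypothetical) 1000-credit allowance:
overage_charge(1200, 1000)  # 200 overage credits -> $20.00
```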

Why does my final cost differ from the estimate? The estimate assumes zero cache hits and no failures. Cache hits and failed responses are refunded, so the final cost is often lower.