Hey LLM folks! Whether you're shipping with Ollama on a Mac, vLLM on 8× H100, or anything in between, model planning shouldn't be napkin math. I built a free calculator: WeightRoom → https://smelukov.github.io/WeightRoom/ No backend, no signup, no telemetry. Pick a model, pick a quant, pick a context window, and get RAM, disk, and TPS estimates in real time. State serializes to the URL, so configs are …
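The excerpt doesn't show WeightRoom's exact formulas, but the kind of estimate it replaces looks roughly like this: weights take parameter count × bits per weight, and the KV cache grows with context length. A minimal back-of-envelope sketch (the model shape below is an illustrative Llama-8B-like assumption, not WeightRoom's actual math):

```python
def estimate_ram_gb(n_params_b, quant_bits, n_layers, n_kv_heads,
                    head_dim, ctx_len, kv_bytes=2):
    """Rough RAM estimate in GiB: quantized weights + fp16 KV cache."""
    # Weights: parameter count x bits per weight (ignores quant overhead)
    weights = n_params_b * 1e9 * quant_bits / 8
    # KV cache: 2 (K and V) x layers x KV heads x head dim x context x bytes
    kv_cache = 2 * n_layers * n_kv_heads * head_dim * ctx_len * kv_bytes
    return (weights + kv_cache) / 1024**3

# Hypothetical 8B model, 4-bit quant, 8k context, GQA with 8 KV heads
print(round(estimate_ram_gb(8, 4, 32, 8, 128, 8192), 2))
```

Real runtimes add per-framework overhead (activation buffers, graph memory), which is exactly the kind of detail a purpose-built calculator accounts for better than a one-liner.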

WeightRoom: an LLM resource calculator
Sergey Melyukov · Dev.to · 1 min read