Hey LLM folks 👋 Whether you're shipping with Ollama on a Mac, vLLM on 8× H100s, or anything in between, model planning shouldn't be napkin math. I built a free calculator: WeightRoom.

→ https://smelukov.github.io/WeightRoom/

No backend, no signup, no telemetry. Pick a model, pick a quant, pick a context window, and get RAM, disk, and TPS estimates in real time. State serializes to the URL, so configs are shareable.
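
For the curious, here's roughly the kind of napkin math the tool automates: weight memory from parameter count and bits per weight, plus a KV-cache term that grows with context length. This is my own back-of-envelope sketch with made-up example numbers, not WeightRoom's actual formulas.

```python
def estimate_ram_gb(params_b, bits_per_weight, ctx_len,
                    n_layers, n_kv_heads, head_dim, kv_bits=16):
    """Rough RAM estimate in GB: quantized weights + KV cache."""
    # Weights: parameter count (billions) * bits per weight, in bytes
    weights_bytes = params_b * 1e9 * bits_per_weight / 8
    # KV cache: 2 tensors (K and V) per layer, one vector per token
    kv_bytes = 2 * n_layers * n_kv_heads * head_dim * ctx_len * kv_bits / 8
    return (weights_bytes + kv_bytes) / 1e9

# A Llama-3-8B-shaped config (32 layers, 8 KV heads, head dim 128)
# at ~4.5 bits/weight with an 8k context:
print(round(estimate_ram_gb(8, 4.5, 8192, 32, 8, 128), 1))  # → 5.6
```

That's before runtime overhead, activations, or batching, which is exactly why a real-time calculator beats doing this by hand for every model/quant/context combo.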