Calculate GPU VRAM and system requirements for running LLMs with Ollama
Forked from @aleibovici
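
The core estimate boils down to: model weights take roughly `parameter count × bytes per weight`, plus extra headroom for the KV cache, activations, and runtime buffers. Below is a minimal sketch of that arithmetic in Python; the quantization sizes and the ~20% overhead factor are common rules of thumb, not values taken from Ollama's internals.

```python
# Rough VRAM estimate for serving an LLM.
# Assumptions (not Ollama's exact internals): effective bytes per weight
# for each quantization, and a ~20% overhead for KV cache and activations.

BYTES_PER_PARAM = {
    "fp16": 2.0,   # 16-bit weights
    "q8_0": 1.0,   # ~8 bits per weight
    "q4_0": 0.5,   # ~4 bits per weight
}

def estimate_vram_gb(num_params_b: float, quant: str = "q4_0",
                     overhead: float = 1.2) -> float:
    """Estimate VRAM in GB for a model with `num_params_b` billion parameters.

    `overhead` (assumed ~1.2x) covers KV cache, activations, and buffers.
    """
    # 1B params at 1 byte/param is ~1 GB, so billions * bytes ≈ GB directly.
    weights_gb = num_params_b * BYTES_PER_PARAM[quant]
    return weights_gb * overhead

if __name__ == "__main__":
    # Example: a 7B-parameter model at different quantization levels.
    for quant in ("fp16", "q8_0", "q4_0"):
        print(f"7B @ {quant}: ~{estimate_vram_gb(7, quant):.1f} GB")
```

For a 7B model this yields roughly 16.8 GB at fp16 versus about 4.2 GB at 4-bit quantization, which matches the usual guidance that 4-bit quantized 7B models fit comfortably on consumer GPUs.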