The Local LLM Ecosystem Doesn't Need Ollama (And That Made Me Uncomfortable)
I've been team Ollama since day one. Last week I tried replacing it with raw llama.cpp and a minimal wrapper. What I found forced me to rethink something I thought I'd already figured out: Ollama solves UX, not infrastructure.