I use Linux for local LLMs and everything is easier than Windows
For a long time, running large language models locally felt like something reserved for people with desktop GPUs the size of toaster ovens. If you were on a modest Linux laptop, the unspoken message was pretty clear: nice ambition, wrong hardware. That reality has shifted. Quietly, and a little faster than many people noticed.
With the right tools and a bit of restraint, you can now run a genuinely useful ChatGPT-style setup locally on Linux Mint without turning your laptop into a space heater. I know because I just did exactly that on a Ryzen 5 machine with 8 GB of RAM and integrated graphics. Not a powerhouse, not a lab rig. Just a very normal daily driver running Ollama and Open WebUI.
If your setup lives in the same neighborhood, here is what actually works, what tends to wobble, and how to get a smooth local AI experience without your swap file filing a formal complaint.
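As a rough sketch of what a setup like this involves, the commands below follow the documented install paths for Ollama and Open WebUI. The specific model tag (`llama3.2:3b`) is an illustrative choice, not one named in the article; on an 8 GB machine, a small quantized model in the 3B-parameter range is the kind of restraint the author is describing.

```shell
# Install Ollama via its official install script (Linux)
curl -fsSL https://ollama.com/install.sh | sh

# Pull and chat with a small quantized model; a 3B model is an
# illustrative pick that leaves headroom on an 8 GB RAM machine
ollama pull llama3.2:3b
ollama run llama3.2:3b

# Open WebUI can be installed from PyPI (requires a recent Python 3.11)
pip install open-webui
open-webui serve   # then browse to http://localhost:8080
```

Ollama runs as a background service after install, so Open WebUI will detect it automatically on the default local port.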