Is it possible to use aider with a local model running in LM Studio (or Ollama)?
From a quick glance I did not see an obvious way to do that...
Hopefully I am totally wrong!
Yes, absolutely, you can work with local models. Here are the docs for LM Studio and Ollama:
https://aider.chat/docs/llms/lm-studio.html
https://aider.chat/docs/llms/ollama.html
In the left sidebar of the docs there's a "Connecting to LLMs" section.
Check out Ollama as an example:
aider --model ollama_chat/deepseek-r1:32b
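If it helps, here's roughly the full sequence from aider's Ollama docs page, assuming the defaults (the model name is just an example, and 11434 is Ollama's standard port, so adjust if yours differ):

# Pull a model and start the Ollama server (any local model works)
ollama pull deepseek-r1:32b
ollama serve

# In another terminal: tell aider where the local Ollama API lives
export OLLAMA_API_BASE=http://127.0.0.1:11434

# The ollama_chat/ prefix is what the aider docs recommend over ollama/
aider --model ollama_chat/deepseek-r1:32b

And if I'm reading the LM Studio docs page right, the setup there is similar: point aider at LM Studio's local server (port 1234 is its default) and set a placeholder API key, since a value is required even for a local endpoint:

export LM_STUDIO_API_BASE=http://localhost:1234/v1
export LM_STUDIO_API_KEY=dummy-api-key

# Replace <your-model-name> with the model identifier shown in LM Studio
aider --model lm_studio/<your-model-name>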