What are your thoughts on #privacy and #itsecurity regarding the #LocalLLMs you use? They seem to be an alternative to ChatGPT, MS Copilot etc., which are basically creepy privacy black boxes. How can you be sure that local LLMs A) don't “phone home”, B) don't build a profile on you, and C) keep their analysis restricted to the scope of your own machine? As far as I can see, #ollama and #lmstudio do not provide privacy statements.
Just to be clear: Llama is the Facebook (Meta) model; Ollama is the software that lets you run Llama, along with many other models.
Ollama has internet access (otherwise, how could it download models?). The only true privacy solution is to run it in a container with no internet access after downloading the models, or to air-gap your computer.
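That said, inference itself only needs the local API. Here's a minimal sketch in Python (assuming Ollama's default loopback endpoint on port 11434 and a model you've already pulled; the model name "llama3" is just a placeholder), showing that generation can be driven entirely against localhost:

```python
import requests

# Query a locally running Ollama server over its loopback HTTP API.
# Assumes the default port 11434 and that "llama3" (a placeholder name)
# has already been pulled with `ollama pull`.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3", "prompt": "Say hello.", "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```

With outbound traffic blocked for the process (or the container's network removed after the models are downloaded), that same call keeps working, which is a quick way to convince yourself inference stays on your machine.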
Thank you for the correction!
Could you not just monitor/block outgoing traffic?
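As a quick spot check, something like this sketch using the psutil library would list any non-loopback connections the inference server has open (the match on an "ollama" process name is an assumption about how the server shows up on your system):

```python
import psutil

# List established non-loopback connections owned by any process whose
# name contains "ollama". May need elevated privileges on some platforms.
for proc in psutil.process_iter(["pid", "name"]):
    if "ollama" not in (proc.info["name"] or "").lower():
        continue
    try:
        conns = proc.connections(kind="inet")
    except (psutil.AccessDenied, psutil.NoSuchProcess):
        continue
    for c in conns:
        if c.raddr and c.raddr.ip not in ("127.0.0.1", "::1"):
            print(f"pid {proc.info['pid']}: {c.laddr.ip}:{c.laddr.port} "
                  f"-> {c.raddr.ip}:{c.raddr.port} [{c.status}]")
```

For actually blocking rather than just observing traffic, an outbound firewall rule or a no-network container is more robust, since a check like this only catches connections that are open at the moment you run it.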