A place to discuss privacy and freedom in the digital world.
Privacy has become a very important issue in modern society. With companies and governments constantly abusing their power, more and more people are waking up to the importance of digital privacy.
In this community everyone is welcome to post links and discuss topics related to privacy.
Some Rules
- Posting a link to a website that contains tracking isn’t great; if the contents of the website are behind a paywall, consider copying them into the post
- Don’t promote proprietary software
- Try to keep things on topic
- If you have a question, please try searching for previous discussions, maybe it has already been answered
- Reposts are fine, but should have at least a couple of weeks in between so that the post can reach a new audience
- Be nice :)
much thanks to @gary_host_laptop for the logo design :)
Proton Lumo is a decent privacy option, but it still runs through Proton’s servers, and they’d happily sell you out to Trump. I’d advise using PocketPal on Android with Qwen or SmolLM models. You can even give it a personality, and it runs entirely offline with no privileges; just remove its networking permission after downloading your LLM model.
Not judging? I assume you mean ‘free’? It’s more difficult now. Groq.com once claimed that they didn’t collect data from the free service, but I just checked their privacy policy, and they have now learned the proper capitalist virtue of surveilling their users and selling their data. Perhaps one of the other fast inference endpoints, Cerebras or SambaNova, that are not directly focused on selling data?
DuckDuckGo AI doesn’t require a login, but it’s very limited for chat.
Alternatively, pick a free Chinese AI service? I guess the data they can accumulate is pretty useless to them unless you buy a lot over there, but I’m not sure, to be honest.
Maybe a VPN and gpt4free? Someone can still see and train on your messages, but they don’t know who you are. It gives you different models each time, though, so maybe not ideal.
Something like AI Horde may be okay. Your messages are openly processed on someone else’s machine, but you are not likely to be tracked. There are a lot of personal chat models available. AFAIK it still requires a ‘small’ NVIDIA card to join in as a worker. A VPN could help with anonymity.
How good’s your computer? Running locally is always the best option, but an 8-13GB model is never going to be as good as the stuff you’d find hosted by major companies. But hey, no limits and it literally never leaves your PC.
You can find models on Hugging Face, and if you don’t know what you’re looking for, there’s a subreddit with a weekly discussion on enthusiasts’ favorite models. I don’t remember the sub’s name, but you should be able to find it easily enough with a Google search like “reddit weekly AI model thread”. Go to the poster’s profile and you’ll find all of the old threads you can read through for recommendations.
Chatting to someone else’s computer? That’s raw dogging the internet. No protection, no privacy.
Anyone there can see all your shit. Don’t think a VPN or incognito mode is magically saving you. You’re exposed.
The only solution? You must take your computing back under your control. You need your own computer and libre software.
Ollama, LibreChat, and Maid are a good start.
Try GPT4All; it runs on your machine and it’s easy to set up.
Run your own, then you can be sure it’s private. Download one from Hugging Face or ModelScope or somewhere; they’re all free.
Keep in mind that the whole model needs to fit into your RAM/VRAM.
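As a rough sketch of that sizing rule (illustrative numbers only; real runtimes also need extra memory for the KV cache, activations, and framework overhead, and exact quantized sizes vary):

```python
# Back-of-the-envelope sizing for running a model locally.
# Illustrative only: does not include KV cache or runtime overhead.

def model_memory_gb(params_billions: float, bytes_per_weight: float) -> float:
    """Approximate RAM/VRAM needed just to hold the weights."""
    # params_billions * 1e9 params * bytes_per_weight bytes, divided by 1e9 bytes/GB
    return params_billions * bytes_per_weight

for name, params, bpw in [
    ("7B @ fp16", 7, 2.0),    # full half precision
    ("7B @ 4-bit", 7, 0.5),   # e.g. a Q4-style quantization
    ("70B @ 4-bit", 70, 0.5),
]:
    print(f"{name}: ~{model_memory_gb(params, bpw):.1f} GB for the weights alone")
```

So a 7B model quantized to 4 bits fits comfortably on a typical PC, while a 70B model needs a serious GPU (or a lot of system RAM) even when quantized.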
Downloading an AI isn’t such a good idea, because to be reliable it needs a huge amount of data and processing power, which a normal PC doesn’t have. Models that run locally are by definition quite basic and limited, and when self-hosted versions use online sources, they need a huge amount of bandwidth. A reliable AI needs a huge datacenter and web access; because of this, the real question for the user is the privacy of the AI service. Apertus fulfils all the aspects of reliability and privacy: the history of questions and results is stored locally and is easy to delete, and it answers from its own knowledge base or, if you permit web access, with answers that include links to sources. 100% free to use and EU made.
Swiss PublicAI by Apertus: FOSS, made by the Swiss National Supercomputing Centre (CSCS), which is also used by CERN. Self-hosting is possible (~90 GB of base weights), up to 70 billion parameters, trained on 15 trillion tokens across more than 1,000 languages. Strictly privacy-centred.
Fuck US AI
https://www.swiss-ai.org/apertus
https://publicai.co/ (free account needed to customize it: plug-ins, nick, mail)
https://github.com/swiss-ai/
https://huggingface.co/swiss-ai
This POS doesn’t even include sources for its BS…
Run your own model locally; gpt-oss-20b is good for most tasks. If it’s hosted on someone else’s server, it most likely won’t respect your privacy, since these things are costly to run.
Just ask a toddler. They would be happy to help and might have fewer hallucinations.
Not a solution. AI is used for sex roleplay.
Read between the lines. They’re gooning.
Proton Lumo
Try a magic 8 ball or fortune cookies.
There are several like Duck.ai, Proton Lumo, and Brave Leo that proclaim to be more private, but there’s no way to know for sure.
I checked Proton Lumo: it’s private, but not very good; its answers are mostly BS. AI is not one of Proton’s strengths. Duck AI is a nice and usable one; Brave not so much.
They’re all mostly BS. You shouldn’t use any of them.
It depends. AI is good as a tool to help with your research and tasks, but bad as a substitute for them. The trustworthiness of AI is always more or less limited, and it is always a big mistake to use the results as-is, without cross-checking them, which is done far too often.
What good is it if you have to fact-check everything it says regardless?
Duck.ai is alright; I use it and like it, but it has rate limits. I’m hearing some stuff about Proton that makes me want to stay away from it.
Yeah I mean you really should either avoid them all or run them locally.