Don’t trust the cloud. You can set up AI locally.

But I run the cloud myself

Local AI will be harvested - if not today, then as soon as tomorrow. I recommend not trusting any system like this with any sensitive information… Or, honestly, with most non-sensitive information.

When people say Local AI, they mean things like the Free / Open Source Ollama (https://github.com/ollama/ollama/). You can read its source code to check that it doesn’t phone home, and you completely control when and if you upgrade it. If you don’t like something in the code base, you can also fork it and start your own version. The actual models used with Ollama (e.g. Mistral, a popular one) are commonly distributed in GGUF (the successor to the GGML format), which doesn’t even carry executable code - only massive multi-dimensional arrays of numbers (tensors) that represent the parameters of the LLM.
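For the curious, here is a minimal sketch of what that looks like in practice, assuming the default Ollama API on localhost:11434 and a model you have already pulled (the model name is just an example):

```python
# Minimal sketch: query a locally running Ollama server over its HTTP API.
# Assumes the default endpoint on localhost:11434 and that you've already
# run `ollama pull mistral`. Nothing here ever leaves the machine.
import json
import urllib.request

def ask_local_llm(prompt: str, model: str = "mistral") -> str:
    req = urllib.request.Request(
        "http://127.0.0.1:11434/api/generate",
        data=json.dumps({"model": model, "prompt": prompt, "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    # The whole request/response cycle stays on 127.0.0.1.
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(ask_local_llm("In one sentence, what is stored in a GGUF model file?"))
```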

Not trusting that the output is correct is reasonable. But in terms of trusting the software not to spy on you when it is FOSS, it is no different from trusting any other FOSS software not to spy on you (e.g. the Linux kernel, etc.). There is some risk of an xz-style supply-chain attack on the code base, but I don’t think the risks are materially different for ‘AI’ compared to any other software.

KillingTimeItself

if you use windows, sure.

Don’t use windows.

This guy is clueless

If you connect it to the Internet then sure, it can be easily harvested by large companies. But you can host an offline AI on a device whose hardware you’ve made sure isn’t phoning home, and it’ll probably be fairly safe, provided you actually know what you’re doing (unlike an idiot like me).

If you install it locally, it will be as secure as any other thing you do on your computer.

Carlos Francisco 🦣

@AdrianTheFrog @privacy @AceFuzzLord Actually, it depends on the code. If it’s not open source you can’t really know what it is doing with your data. Therefore not all things you install on your local computer are equally insecure (or secure).

Possibly linux

How? It is running locally in a VM. I could even air-gap the VM if I wanted to.

classic

Is there a magazine or site that breaks this down for the less tech savvy? And is the quality of the AI on par?

Check my notes at https://fabien.benetou.fr/Content/SelfHostingArtificialIntelligence, but as others suggested, a good way to start is probably https://github.com/ollama/ollama/ and, if you need a GUI, https://gpt4all.io
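If you’d rather script it than use a GUI, here is a quick-start sketch with the official `ollama` Python client; this is my own addition, assuming `pip install ollama`, a running Ollama server, and a model pulled with `ollama pull mistral`:

```python
# Quick-start sketch with the `ollama` Python client (pip install ollama).
# Assumes the Ollama server is running locally and the "mistral" model
# has already been pulled.
import ollama

response = ollama.chat(
    model="mistral",
    messages=[{"role": "user", "content": "Suggest three self-hosted weekend projects."}],
)
print(response["message"]["content"])
```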

You should have at least 16 GB of RAM available to run the 13B models.

Is this GPU RAM or CPU RAM?

Either works, but system RAM is at least an order of magnitude slower; more play-by-mail than chat…

Pretty sure it can run on either, but CPUs are slow compared to GPUs, often to the point of being impractical.

KillingTimeItself

Likely GPU RAM. There is some tech that can offload to system RAM, but generally it’s all hosted in VRAM. This requirement will likely fade as NPUs start becoming a thing, though.
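Rough back-of-envelope numbers behind that 16 GB guidance, as a sketch (weights only; the KV cache, context window and runtime add overhead on top):

```python
# Back-of-envelope sketch of why a 13B model wants roughly 6-16 GB of (V)RAM.
# Weights only; the KV cache, context window and runtime add more on top.
def weights_gb(params_billion: float, bits_per_weight: int) -> float:
    return params_billion * 1e9 * (bits_per_weight / 8) / 1024**3

for bits in (16, 8, 4):
    print(f"13B at {bits}-bit: ~{weights_gb(13, bits):.1f} GB just for the weights")

# 16-bit ≈ 24.2 GB, 8-bit ≈ 12.1 GB, 4-bit ≈ 6.1 GB, which is why
# "at least 16 GB" is a sensible floor for 13B models.
```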

I’m not the person who asked, but still thanks for the information. I might give this a try soon.

classic

Ditto, thanks to everyone for their suggestions.

On par? No. Good enough? Definitely. Ollama baby

Ollama with LLaVA and Mistral

Your best bet is YouTubing ollama.

ollama is your friend

*Mistral

Ollama supports Mistral, but there are much better LMs these days. I personally prefer Aya.

Citizen

Mate, please be kind and help a fellow brother with that…

For example, I need a tool that will “automagically” sort all my documents, photos and videos on-premises.

Thank you!

qaz

An LLM runner like Ollama won’t help with that. Something like PhotoPrism could; it uses ML to automatically tag media and recognize people.

Even OpenAI can’t do that lol

I use Ollama plus Open WebUI. It won’t sort your documents, but you can upload a doc for AI tasks.
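As a sketch of the kind of “AI task” that works locally (not the commenter’s actual setup; the folder names and file path below are made up, and it assumes `pip install ollama` plus a pulled model), you could ask a local model where a document belongs:

```python
# Sketch of an "AI task" on a document: ask a local model which folder a
# file belongs in. Folder names and the file path are made-up examples;
# assumes `pip install ollama` and a locally pulled "mistral" model.
from pathlib import Path
import ollama

folders = ["taxes", "medical", "recipes", "misc"]        # hypothetical categories
text = Path("scans/unsorted/doc_0042.txt").read_text()   # hypothetical file

reply = ollama.chat(
    model="mistral",
    messages=[{
        "role": "user",
        "content": (
            f"Which folder fits this document best: {', '.join(folders)}? "
            f"Answer with one folder name only.\n\n{text[:4000]}"
        ),
    }],
)
print(reply["message"]["content"].strip())
```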

Sort them… by what?

Just fackin’ sort it ooouuttt!

Edit: but seriously, sorting in most scenarios is just by date created or name, which most file explorers can do.

That isn’t generally an AI task.

If OP wants the AI to read the file and sort by colour, for example, then it’s maybe an AI task, but it sounds more like a software task.

Forbo

Scenario time: A loved one has recently passed away, and I want to find all the photos I have of them. I would love to be able to have a local AI perform facial recognition to help me find these photos. The classification and tagging info doesn’t get fed into surveillance capitalist garbage, and I’m still able to benefit.

Mate, something like Immich or digiKam (if you want local) will do a good job at this. Not perfect, but perfection is utopia. I fed 40k images to Immich and it did a reasonable job in not too many hours.
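For a DIY version of that scenario, here’s a minimal sketch using the `face_recognition` Python library; that library choice and the paths are my own assumptions, and Immich or digiKam will do the same thing with a proper UI:

```python
# Minimal sketch of local face matching with the `face_recognition` library
# (pip install face_recognition). Paths are hypothetical; everything runs
# on your own machine and nothing is uploaded anywhere.
from pathlib import Path
import face_recognition

# One clear reference photo of the person you're looking for.
reference = face_recognition.load_image_file("reference/grandma.jpg")
reference_encoding = face_recognition.face_encodings(reference)[0]

for photo in Path("photos").rglob("*.jpg"):
    image = face_recognition.load_image_file(str(photo))
    for encoding in face_recognition.face_encodings(image):
        if face_recognition.compare_faces([reference_encoding], encoding)[0]:
            print(f"Possible match: {photo}")
            break
```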

KillingTimeItself

Scenario time: you haven’t taken 40 thousand pictures over the last three years, because you aren’t cripplingly addicted to technology, so you can sort through them manually in about 10 hours or so.

You’re proposing to waste 10 hours sorting photos when the right tool could probably do it in less than 2 minutes? What?

And how does taking pictures translate to being addicted to tech?? We’ve had photography for over 100 years

KillingTimeItself

They’re photos of someone’s dead family relative. Are you really suggesting that spending 10 hours on that would be a “waste of time”? That seems rather disingenuous, or at the very least, incredibly rude.

Also, the AI could just be wrong. You’re likely to sort them much better yourself, at least according to what you want the sorting to be.

And how does taking pictures translate to being addicted to tech?

Because some people take so many pictures that it actually makes me wonder whether they would be able to exist in the world if they couldn’t. I think some people just need to focus on enjoying the experience more.

There’s no reason to judge someone for taking many photos. If you’re not willing to help, you don’t have to. There’s no need to write sarcastic comments.

KillingTimeItself

I was primarily just covering the scenario where you don’t have so many photos that it’s impossible to sort through them. I’d be a little concerned if you took so many photos that you couldn’t sort through them.

Also, it’s not sarcastic.
