AI systems exist to reinforce and strengthen existing structures of power and violence. They are the wet dream of capitalists and fascists: enormous physical infrastructure designed to convert capital into power, and back into capital. Those who control the infrastructure control the people subject to it.

While it strays from the initial thesis of how the use of LLMs could be detrimental to our very being and expression of identity (at least that’s how I interpret what they’re saying), it ends with a striking claim that AI is a tool of the ruling class. Worth a read!

Riskable

This doesn’t make sense when you look at it from the perspective of open source models. They exist and they’re fantastic. They also get better just as quickly as the big AI company services.

IMHO, the open source models will ultimately be what pops the big AI bubble.

@Zerush@lemmy.ml

Agreed. Apertus (FOSS), for example, is a great choice, and even some indie ones like Andisearch are pretty private. AI can be great for improving our research, work, and creativity, but it’s bad if we use it to substitute for our research, work, and creativity. Either way, avoid the AIs from big (US) corporations, which use them to spy on users and log their data.

Right, Betamax much? It doesn’t really matter if one technology is objectively “better” in every respect than another if the other’s strategy for becoming popular outpaces it.

To be clear, I wish you were right (even though I don’t find open source models free of problems), but I think that conclusion is a wish, not a logical inference.

How is that wishful thinking? Open models are advancing just as fast as proprietary ones, and they’re now getting much wider usage as well. There are also economic drivers that favor open models even within commercial enterprises. For example, here’s Airbnb’s CEO saying they prefer Qwen to OpenAI because it’s more customizable and cheaper:

“We’re relying a lot on Alibaba’s Qwen model. It’s very good. It’s also fast and cheap,” he said. “We use OpenAI’s latest models, but we typically don’t use them that much in production because there are faster and cheaper models.”

I expect we’ll see the exact same thing happen as with Linux-based infrastructure muscling out proprietary stuff like Windows servers and Unix. Open models will become foundational building blocks that people build on top of.

Riskable

Working on (some) AI stuff professionally, I’ve found that open source models are the only models that let you change the system prompt. Basically, that means only open source models are acceptable for a whole lot of business logic.
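For context, here’s a minimal sketch of what controlling the system prompt looks like against a locally hosted model. The model name, endpoint, and prompts are illustrative; this assumes an Ollama-style chat API running on localhost:

```python
import json

def build_chat_request(model, system_prompt, user_msg):
    """Assemble a chat payload with a caller-controlled system prompt --
    the part most hosted AI services lock down or silently override."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_msg},
        ],
        "stream": False,
    }

payload = build_chat_request(
    "qwen3:8b",  # any locally pulled open model
    "You are a strict JSON-only classifier for support tickets.",
    "My invoice total is wrong.",
)
# POST this as JSON to e.g. http://localhost:11434/api/chat (Ollama)
print(json.dumps(payload, indent=2))
```

With a hosted API, that `system` message is often prepended to (or replaced by) the vendor’s own instructions; with a local open model, what you send is exactly what the model sees.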

Another thing to consider: there are models designed specifically for processing. It’s hard to explain, but stuff like Qwen3 Embedding is built for in/out usage in automation pipelines:

https://huggingface.co/Qwen/Qwen3-Embedding-8B

You can’t do that effectively with the big AI models (as much as Anthropic would argue otherwise); it’s too expensive and risky to send all your data to a cloud provider in most automation situations.
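The usual in/out pattern with an embedding model like that: embed your documents once, embed each incoming query, and compare vectors. A toy sketch of the comparison step using cosine similarity; the vectors here are made up, and in practice they would come from the embedding model:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical
    direction, 0.0 means unrelated. Standard scoring for embeddings."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# In practice these vectors come from an embedding model such as
# Qwen/Qwen3-Embedding-8B; these tiny toy vectors just show the flow.
doc_vec = [0.1, 0.9, 0.0]
query_vec = [0.2, 0.8, 0.1]
print(cosine_similarity(doc_vec, query_vec))
```

Because the whole pipeline is just "text in, vector out, arithmetic after," it runs fine against a local model server, which is exactly why it suits automation better than a pay-per-call cloud API.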

I’m actually building LoRAs for a project right now, and I’ve found qwen3-8b-base to be the most flexible model for that. The instruct version is already biased toward prompt-following and agreeing; the base model is where it’s at.
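For anyone unfamiliar: a LoRA leaves the base weights frozen and learns a small low-rank correction on top. A pure-Python toy of the update rule (real training uses a library such as peft; this just shows the usual delta_W = (alpha / r) * B @ A convention):

```python
def lora_delta(A, B, alpha, r):
    """Compute the low-rank weight update delta_W = (alpha / r) * B @ A.
    A has shape (r, d_in), B has shape (d_out, r); only A and B are
    trained, so the trainable parameter count stays tiny relative to
    the frozen base weight matrix they correct."""
    scale = alpha / r
    return [
        [scale * sum(B[i][k] * A[k][j] for k in range(r))
         for j in range(len(A[0]))]
        for i in range(len(B))
    ]

# Rank-1 example: 2 outputs, 2 inputs.
A = [[1.0, 2.0]]    # (r=1, d_in=2)
B = [[3.0], [4.0]]  # (d_out=2, r=1)
print(lora_delta(A, B, alpha=2, r=1))  # -> [[6.0, 12.0], [8.0, 16.0]]
```

Since only the tiny A and B matrices are trained, a LoRA over an 8B base model is cheap enough to build on consumer hardware, which is part of why base checkpoints matter so much.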

Snot Flickerman

It always horrifies me a little how much open source has been exploited by large corporations for profit while they don’t invest in the open source tech they rely on. By and large, it often feels like open source has unintentionally been the largest transfer of labor-created wealth to the corporate class in human history, because labor had lofty ideals and capitalists are happy to exploit them.

Linux has held the majority share of corporate servers for a long time, yet it’s barely cracking 3% of the desktop (consumer, laborer) market. Corporations profit wildly from open source while the general public has not.

I very much agree there, but think of how much worse it would be if we were stuck dealing with proprietary corporate tech instead.

Snot Flickerman

Oh, of course. I’m not saying we should dump open source; closed source is so much worse. It just sucks how much the great ethics of open source are exploited by those with no ethics.

Yup, and this is precisely why moving away from GPL-style copyleft to permissive licenses was such a monumental mistake. All that achieved was letting corporations freeload.

deleted by creator

Snot Flickerman

I mean… it seems painfully obvious and doesn’t need much of a thesis behind it.

The wealthy want their slaves back, but they want slaves that don’t push back, never ask for more, never need a day off, don’t need sleep, don’t need breaks, and are needlessly sycophantic to stroke the egos of the wealthy. It’s no more complex than that: the promise of LLMs was that they could have deeply exploitable knowledge workers without any of the fuss or mess of humans who want a life outside of their fucking jobs.

Like, what else has this ever been? It’s been transparent since day one that this is why every business pushes AI adoption so hard. For them it has to work; they’re willing to bet the future on it because they think sheer belief and enough money thrown at it will eventually “make it work.”

On the plus side, anyone who understands LLMs understands their limitations, the problems baked into how they work, and why those issues can’t be “fixed.” So this dipshit-ass all-in plan the wealthy have is doomed to crumble, because it’s never going to work the way they want it to. So we’ve got that going for us.

Anyway, I hate tools being described as “tools of the ruling class,” because it often misses how such tools can be useful to the proletariat as well. Class solidarity is a tool of the ruling class, but class solidarity would be golden in the hands of the proletariat, who vastly outnumber the wealthy class and ruling class. All tools are useful; what makes a tool dangerous is who wields it and what they choose to use it for. A hammer can be used to build, and it can also be used to smash in someone’s skull. Tools aren’t the problem: specific dangerous humans are.

I don’t actually have huge problems with LLMs provided they’re open source and rolled out small-scale on home PCs. My issue is with their industrial applications at scale and the attempt to use them to consolidate power and control. They don’t have to be used that way.

A hammer can be used for good or bad, but a shrapnel bomb can only kill and maim; not all tools are multivalent like that, and some are realistically only used for evil. Additionally, divorcing discussion of a tool from its social context like this blinds us to the real consequences. We don’t have small-scale LLMs running on personal machines (even if we could, that’s not how it is now); we have them at industrial scale, controlled by a few billionaires. It’s purely fantastical thinking to say that because you can imagine a world where they’d be used for good, that actually means anything.

Snot Flickerman

We don’t have small scale LLMs running on personal machines (even if we could, that’s not how it is now)

Uhhh, it’s pretty trivial to set them up. I have a local Ollama instance on my PC with several different open source models available right now. Just because not everyone is doing it doesn’t mean it’s not possible. I don’t even have an especially fancy computer: a Ryzen 7 3700X, 32 GB of RAM, and a Radeon RX 6600 XT with 8 GB of VRAM, not exactly top of the line. I struggle with programming logic sometimes, so I use it to help me figure out whether I’m doing something right when I can’t find an answer online, an activity I wouldn’t exactly classify as “evil.”

I also wouldn’t consider a shrapnel bomb a tool, it’s strictly a weapon just like a gun is strictly a weapon.

I get all that, I just mean you can’t ignore the social reality. Like you said elsewhere in this thread, Linux is barely cracking 3% of the desktop market. Can you run an LLM locally? Of course. Do people do it? No, except for a handful of hardcore enthusiasts. That is an issue which can’t be cleanly separated from the technology itself and has to be taken into account when discussing it.

Also, weapons are tools.

They also want skilled labor: they’re happy to have it scraped from authors and artists, largely the same groups the ruling class despises.

Snot Flickerman

I mean really they despise anyone with skills because the reality is they have hardly any themselves as they’ve spent their lives paying for everyone else to do everything for them. They can’t make a meal, they can’t drive a car, they can’t do basic appliance repair, they don’t know how to actually use a computer other than social media, they can’t wash their own clothes, they can’t do anything for themselves. They despise every skilled person because it betrays their egotistical view that they are simply born better than everyone else and deserve to never have to know how to do anything at all. It reveals they know nothing and are useless to society at large, just a drain on the rest of us.

Secondly, I did say “knowledge workers,” and I personally think authors and artists do a type of knowledge work: it requires knowledge coupled with skill, just as managing servers and databases does. Potayto, potahto.

That’s also why they’re all pushing hard for humanoid robots: they want to automate the human body after they’ve automated the human mind.

@utopiah@lemmy.ml

It’s always about AGENCY and power.

confuser

Instead, I say we let AI play with the unreal government pyramid scheme that is money while we use money the same way we already do; essentially, let the AIs do the boring stuff.

And then, much later on, when there’s a risk of no niche remaining that needs individuals, we work on automating labor so that we need not worry about responsibility as individuals, while the AIs work present-day jobs the same way we presently do.

At that point, income and responsibility are solved issues, all without remotely changing how society works beyond what we as people do day to day, so that we may finally consider living life as we please instead of being beholden to fixing the problems of the world.

Reads like a communist hiding their power level or a liberal searching for a take on the enclosures actively happening this very moment that isn’t the fascist/libertarian one (“it’s different because it’s happening to me!”).
