☆ Yσɠƚԋσʂ ☆
  • 82 Posts
  • 314 Comments
Joined 6Y ago
Cake day: Jan 18, 2020


I’m actually building LoRAs for a project right now, and found that qwen3-8b-base is the most flexible model for that. The instruct is already biased for prompting and agreeing, but the base model is where it’s at.
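To give a rough idea of the setup (a minimal sketch using the Hugging Face transformers and peft libraries; the model id and hyperparameters are illustrative placeholders, not a recipe):

```python
# Sketch of attaching a LoRA adapter to a base model with the Hugging Face
# `peft` + `transformers` libraries. Hyperparameters are placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base_id = "Qwen/Qwen3-8B-Base"  # assumed hub id for the base (non-instruct) checkpoint

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)

lora_cfg = LoraConfig(
    r=16,                                 # adapter rank
    lora_alpha=32,                        # scaling factor
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()  # only the adapter weights are trainable
```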


Yup, and this is precisely why it was such a monumental mistake to move away from GPL style copyleft to permissive licenses. All that achieved was to allow corporations to freeload.


I very much agree there, but think of how much worse it would be if we were stuck dealing with proprietary corporate tech instead.


How is that wishful thinking? Open models are advancing just as fast as proprietary ones, and they’re now getting much wider usage as well. There are also economic drivers that favor open models even within commercial enterprises. For example, here’s the Airbnb CEO saying they prefer using Qwen to OpenAI because it’s more customizable and cheaper:

“We’re relying a lot on Alibaba’s Qwen model. It’s very good. It’s also fast and cheap,” he said. “We use OpenAI’s latest models, but we typically don’t use them that much in production because there are faster and cheaper models.”

I expect we’ll see the exact same thing happen here that we saw with Linux-based infrastructure muscling out proprietary stuff like Windows servers and Unix. Open models will become foundational building blocks that people build things on top of.


Honestly, I suspect it makes very little difference in practice which one you’re using if you’re going to communicate with people outside Proton. If I use Gmail, and you send me an email from your Proton account, guess what happens.




Right, which really suggests that email is not the right medium if you want genuine privacy.


Right, understanding your threat model is important. Then you can make a conscious choice about the trade-offs of using a particular service, and you understand what your risks are.


Metadata tracking should be very concerning to anyone who cares about privacy because it inherently builds a social graph. The server operators, or anyone who gets that data, can see a map of who is talking to whom. The content is secure, but the connections are not.

Being able to map out a network of relations is incredibly valuable. An intelligence agency can take the map of connections and overlay it with all the other data they vacuum up from other sources, such as location data, purchase histories, social media activity. If you become a “person of interest” for any reason, they instantly have your entire social circle mapped out.

Worse, the act of seeking out encrypted communication is itself a red flag. It’s a perfect filter: “Show me everyone paranoid enough to use crypto.” You’re basically raising your hand. So, in a twisted way, tools for private conversations that share their metadata with third parties are perfect machines for mapping associations and identifying targets such as political dissidents.
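To make this concrete, here’s a toy sketch in plain Python of how a relationship map falls out of nothing but metadata. All the records are made up for illustration; no message content is involved at any point.

```python
# Toy illustration: reconstructing who-talks-to-whom purely from
# metadata records (sender, recipient, timestamp). All data is invented.
from collections import defaultdict

metadata = [
    ("alice", "bob",   "2024-01-03T09:14"),
    ("alice", "carol", "2024-01-03T09:20"),
    ("bob",   "dave",  "2024-01-04T18:02"),
    ("carol", "dave",  "2024-01-05T11:45"),
]

graph = defaultdict(set)
for sender, recipient, _timestamp in metadata:
    graph[sender].add(recipient)
    graph[recipient].add(sender)

# If "alice" becomes a person of interest, her immediate circle falls out for free:
print(sorted(graph["alice"]))  # ['bob', 'carol']

# One more hop maps the extended network:
second_hop = {c for person in graph["alice"] for c in graph[person]} - {"alice"}
print(second_hop)  # {'dave'}
```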




You mean the Gestapo, since the GDR was integrated into the West German model.






It’s so lovely to see how the mask has finally fallen off and we get to see the EU as the totalitarian regime that it really is.







This is the core of the issue, and it’s wild how many people don’t get it.

Your phone number is metadata. People who think metadata is “just” data, or that cross-referencing is some kind of sci-fi nonsense, are fundamentally misunderstanding how modern surveillance works.

By requiring phone numbers, Signal, despite its good encryption, inherently builds a social graph. The server operators, or anyone who gets that data, can see a map of who is talking to whom. The content is secure, but the connections are not.

Being able to map out who talks to whom is incredibly valuable. A three-letter agency can take the map of connections and overlay it with all the other data they vacuum up from other sources, such as location data, purchase histories, social media activity. If you become a “person of interest” for any reason, they instantly have your entire social circle mapped out.

Worse, the act of seeking out encrypted communication is itself a red flag. It’s a perfect filter: “Show me everyone paranoid enough to use crypto.” You’re basically raising your hand.

So, in a twisted way, Signal’s role as a tool for private conversations makes it a perfect machine for mapping associations and identifying targets. The fact that it operates using a centralized server located in the US should worry people far more than it seems to.

The kicker is that thanks to gag orders, companies are legally forbidden from telling you if the feds come knocking for this data. So even if Signal’s intentions are pure, we’d never know how the data it collects is being used. The potential for abuse is baked right into the phone-number requirement.
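As a toy illustration of why a phone number is such a convenient identifier to correlate on (everything below is invented data, not any real dataset):

```python
# Toy illustration: a phone number acts as a stable join key across
# otherwise unrelated datasets. All records here are invented.
contact_graph = {  # who messages whom, keyed by phone number
    "+15550001": ["+15550002", "+15550003"],
}

subscriber_records = {  # data from an entirely separate source
    "+15550001": {"name": "A. Example", "city": "Springfield"},
    "+15550002": {"name": "B. Example", "city": "Shelbyville"},
    "+15550003": {"name": "C. Example", "city": "Springfield"},
}

# One dictionary lookup per number is all the "cross-referencing" takes:
for number in contact_graph["+15550001"]:
    print(number, "->", subscriber_records[number])
```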





EU quietly funded a “Thought Surveillance” project that scores citizens for ‘radicalization’ using LLMs
The EU built a system called [CounterR](https://cordis.europa.eu/project/id/101021607) that essentially performs pre-crime thought surveillance. The TLDR is that an [AI company](https://insiktintelligence.com/), with direct input from half a dozen European [police forces](https://cordis.europa.eu/project/id/101021607), built a tool that scrapes social media, forums, and other sources to assign citizens a score based on what they think as opposed to what they've actually done. [The EC also has not released details of the project.](https://cordis.europa.eu/programme/id/H2020)

The report itself acknowledges that this sort of automated system "can trigger new fundamental rights risks that affect rights different than the protection of personal data and privacy."

> [The European Commission's White Paper on AI observes that AI-related processing of personal data can trigger new fundamental rights risks that affect rights different than the protection of personal data and privacy, such as the right to freedom of expression, and political freedoms - in particular when AI is used by online intermediaries to prioritise information and for content moderation.](https://www.europarl.europa.eu/RegData/etudes/STUD/2020/656295/IPOL_STU(2020)656295_EN.pdf)

The police were active co-developers, sitting in meetings to define the criteria and feeding real, anonymized data from their investigations to train the LLM. So now you have a feedback loop where police define the threat, the LLM learns it, and the police validate the results, with zero external oversight.

And of course, it's all shrouded in secrecy. The whole thing is confidential, the source code is proprietary so even partners can't audit it, and the ethics board is made up of the same people building the thing. There's no clear requirement to track false positives, so you could be flagged as a potential radical and never know why.

> [Regarding transparency of funded research, it must be noted that generally research proposals foresee Confidentiality of some results is often necessary, especially in the realm of security.](https://www.europarl.europa.eu/RegData/etudes/STUD/2020/656295/IPOL_STU(2020)656295_EN.pdf)

The cherry on top? The core technology, developed with public funds, was recently [acquired by a private company](https://www.eu-startups.com/2024/08/logically-acquires-barcelona-based-insikt-ai-to-combat-harmful-online-content/), Logically, who can now sell this dystopian scoring system to whoever they want. We, the citizens of the EU, literally paid to build our own panopticon.

The whole project is about normalizing the idea that the state gets to algorithmically monitor and judge your political beliefs before you ever commit a crime.





The only people who know what the server stores are the people running it.


I’m simply explaining why it’s difficult for people to move from existing networks.


Oh yeah, the whole thing is a mess. It kind of blows my mind that we still don’t have a single common protocol that at least the open source world agrees on. There’s a more or less fixed set of things chat apps need to do; we should be able to agree on something akin to ActivityPub as a base.


The explanation is obvious. The phone numbers form a personally identifiable network of connections that’s visible to the people operating the Signal servers. If this information is shared with the US government, they can easily correlate it with all the other data they have. For example, if somebody is identified as a person of interest, then anybody they want to have secure communications with would also be of interest.


One of the big problems nowadays is proprietary protocols. Back in the day, you could have a single client that could talk to different networks. Today you have to run a bunch of separate apps, and what’s worse is that a lot of them are built with stuff like Electron that’s resource hungry.


It’s network effects. People have other friends on the network who have their own friends on the network, and so on. Leaving the network means convincing a critical mass of your network to leave along with you.


Oh for sure, I find most tech news articles are just painful to read nowadays. I also distinctly remember this was not the case before.


I don’t think they actually understand that Mastodon is a network in a traditional sense that works the way the internet was meant to operate before the corporate takeover. People have been so conditioned that the internet is just 5 corps in a trench coat, that they don’t have the cognitive tools to engage with something like the fediverse.


Exactly, even if Signal is secure across the wire, if you have spyware like Pegasus on your phone then it can just capture keyboard input and screen output, entirely bypassing Signal itself.


I mean this has already been happening for years with stuff like Google Home and Alexa.




I thought we were talking about China as opposed to autocratic western regimes.


No, it’s because lemmy.ml doesn’t tolerate racism the way your fash instance does.


You can literally access the fediverse from China. It’s frankly incredible that somebody can be this ignorant. 🤣


My comment likewise touches on the fact that people living under western regimes are disappeared in the regular course of business. In fact, there’s far more evidence of people disappearing without any reasonable grounds in the west than there is in China.


lmfao people literally get arrested for social media posts in the UK and Germany


The distinction with China is in actual human rights like people being able to afford housing, food, and education. People not having to worry about losing their job and ending up on the street or being able to retire in dignity.


Having your supposed freedumbs is literally what y’all have been braying makes the West different from Russia and China.


Old enough to remember how Europeans smugly thought that they lived in some sort of an enlightened and free society.





The point is that I, and many other people, have answered these questions many times. If you’re personally ignorant on the subject, then spend the time to educate yourself. You can start with the materials I’ve provided you. It’s not my job to educate you. I prefer having interactive discussions with people who understand the subject they’re discussing and want to engage in good faith. It’s very transparent that you are not.

I’ll let you have the last word here which you so desperately need.

Bye.


I just love how you keep acting like these questions haven’t been answered time and again. As if you came up with some novel line of questioning nobody has ever thought before. Go read a book for once in your life. Here’s one you can start with. https://welshundergroundnetwork.cymru/wp-content/uploads/2020/04/blackshirts-and-reds-by-michael-parenti.pdf

And here’s how people who actually live in China characterize their modern government in one or two words. If you spent as much time educating yourself on the subjects you wish to debate instead of making a clown of yourself in public, you wouldn’t have to ask questions like this and embarrass yourself.

You’re like a living embodiment of the Dunning-Kruger effect.