• 0 Posts
  • 27 Comments
Joined 2Y ago
Cake day: Dec 18, 2023


Making a joke about a ridiculous habit is hardly making someone a hero.



If the proton guy had only kept his mouth shut… he would have had tons more customers over the last 2 months.


Interesting. What do you get on wifi 6? On what hardware?

For wifi 6 I have a unifi ap 6 lite, a unifi Express gateway in AP mode, and my ISP's router, some of Telekom's own hardware IIRC.

For wifi 5 I tested mainly the unifi ac lite (the first gen). I also got around 300mbps from a Raspberry Pi with hostapd, but that hardly counts as an access point…


Does it have to be new? I can consistently find used ac lites from unifi on Kleinanzeigen.de for 50€. I got two of those, and later a 6 lite for another 50€; that's half my network. They are old and out of warranty, but use very little power and have really decent speeds (~500mbps).


My wifi 5 APs from unifi get 450-500mbps. The wifi 6 APs get between 300 and 600. Only the expensive 7 pro max gets noticeably more, 2.3Gbps, but wifi 5 is more than enough for 95% of people out there.


Are the scissors broken in your house, son?

At least one person got the reference!




Flatpak? You are using Linux and you need “iso writers”? Is your dd broken, son?


If 11 people are sitting at a table with a known Nazi, chatting, enjoying themselves and having a great time, you have a table with 12 Nazis.


What facts? Republicans absolutely do not stand for the little guy…


You could spin up cloud server instances just for the duration of your connection, but that would be slower to set up and probably more expensive (depending on how much time you spend on the VPN).
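A back-of-the-envelope sketch of that cost trade-off. The prices below are made-up placeholders, not quotes from any real provider; plug in your own numbers:

```python
# Rough comparison: on-demand cloud VPS vs. a flat-rate commercial VPN.
# Both rates are hypothetical placeholders for illustration only.

VPS_HOURLY_RATE = 0.007   # $/hour for a small cloud instance (assumed)
VPN_MONTHLY_FEE = 5.00    # $/month flat-rate VPN subscription (assumed)

def monthly_vps_cost(hours_per_day: float, days: int = 30) -> float:
    """Cost of a VPS that only runs while you are actually connected."""
    return hours_per_day * days * VPS_HOURLY_RATE

# Light use (~2 h/day) keeps the on-demand VPS well under the flat fee;
# running it 24/7 ends up costing more than the subscription.
print(monthly_vps_cost(2))   # light use, ≈ $0.42/month
print(monthly_vps_cost(24))  # always on, ≈ $5.04/month
```

The break-even point is simply where hours × rate crosses the monthly fee, which is why "depends how much time you spend on the VPN" is the whole answer.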


I don’t know if I would describe proton as a big tech company. Even if they were, their whole pitch is “you can trust us”.


There’s nothing in the GPL that would forbid it. Only distributing the software without making the source code available is forbidden.


How would that help? If you release something as GPL code, you cannot prevent it from being used to train a model, no matter where it’s hosted.


You are right, that is impossible, but it isn’t what they are doing.

They are turning it back on either manually or by some other method (cell tower for instance). This automation seems to be to just turn it off.


You could have been helpful and chose to be an asshole instead. Go back and think about what you did.




My dude, IKEA has an in-house AI model. Every insurance company has one. Subway (the sandwich shop) has one.

Saying that the NSA “supposedly” has an AI model that can search through data is like saying they “maybe” have a coffee machine.





Yeah, modern ARM CPUs can run at 3GHz and play PS4-level games, but I don’t want my phone to become a handwarmer every time I want to type a quick email…

And of course, I’m not talking about correcting “fuck” to “duck”, I’m talking about ChatGPT-level prediction. Or llama2, or gemini nano, or whatever…


It can and it will. That is one of the uses of “NPUs” I’m most excited about.

Basically you can run a (potentially open-source) small LLM on the phone using whatever context the keyboard has access to (at a minimum, what you’ve typed so far) and have the keyboard generate the next token(s).

Since this is computationally intensive, the model has to be small and you need dedicated hardware to run it efficiently; otherwise you would need a 500W GPU like the big players use. You can do it for ~0.5W locally. Of course, adjust your expectations accordingly.

I don’t know of any project doing it right now, but I imagine Microsoft will integrate it into SwiftKey soon, with open source projects to follow.
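Since no such project exists yet (as far as I know), here is only an illustrative sketch of the control flow. A real keyboard would run a small quantized LLM on the NPU; a toy bigram table stands in for the model here so the loop is visible. All names are made up:

```python
# Toy sketch of keyboard next-word prediction. The bigram table is a
# stand-in for a small on-device LLM; everything here is illustrative.
from collections import Counter, defaultdict

class TinyPredictor:
    def __init__(self, corpus: str):
        # "Train" by counting which word follows which (the stand-in model).
        self.next_words = defaultdict(Counter)
        words = corpus.lower().split()
        for prev, cur in zip(words, words[1:]):
            self.next_words[prev][cur] += 1

    def suggest(self, typed_so_far: str, k: int = 3) -> list[str]:
        # The keyboard would pass whatever context it has access to;
        # here that is just the last word typed so far.
        last = typed_so_far.lower().split()[-1]
        return [w for w, _ in self.next_words[last].most_common(k)]

predictor = TinyPredictor("see you soon . see you later . see me now")
print(predictor.suggest("nice to see"))  # ranked continuations of "see"
```

Swapping the bigram lookup for a real model call (and the string context for the keyboard's full context) is the part that needs the NPU.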