Everyone talks about how evil browser fingerprinting is, and it is, but I don’t get why people are only blaming the companies doing it and not putting equal blame on browsers for letting it happen.
Go to Am I Unique and look at the kind of data browsers let JavaScript access unconditionally with no user prompting. Here’s a selection of ridiculous ones that pretty much no website needs:
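To give a concrete flavor (a sketch, not Am I Unique's full breakdown, just a few values I'm confident any script can read silently, no prompt involved):

```typescript
// A few of the values any page's script can read with zero user interaction.
const silentData = {
  cores: navigator.hardwareConcurrency,                           // logical CPU count
  languages: navigator.languages,                                 // full preferred-language list
  screen: `${screen.width}x${screen.height}x${screen.colorDepth}`,
  timezone: Intl.DateTimeFormat().resolvedOptions().timeZone,
  touchPoints: navigator.maxTouchPoints,
  // ...plus canvas/WebGL/audio rendering quirks, installed fonts, sensors, and more.
};
console.log(silentData);
```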
If you’re wondering how sensors are used to fingerprint you: I think it comes down to manufacturing imperfections that skew each device’s readings in a slightly different way, so the calibration error itself becomes an identifier. But websites could just as easily straight up record the raw sensor stream without you knowing. It’s not a lot of data, all things considered, so you likely wouldn’t notice.
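Here’s a minimal sketch of what that could look like. The `/collect` endpoint is hypothetical, and note that `devicemotion` fires without a prompt on most desktop and Android browsers, while iOS Safari is the notable exception (it requires `DeviceMotionEvent.requestPermission()`):

```typescript
// Sketch: silently sample motion-sensor readings and ship them off.
const samples: number[] = [];

window.addEventListener("devicemotion", (e: DeviceMotionEvent) => {
  const a = e.accelerationIncludingGravity;
  if (a === null || a.x === null || a.y === null || a.z === null) return;
  samples.push(a.x, a.y, a.z);

  // A few hundred floats is enough to estimate per-device calibration skew
  // (the fingerprint) -- or just upload the raw stream; nobody will notice.
  if (samples.length >= 300) {
    // "/collect" is a hypothetical tracker endpoint, for illustration only.
    void fetch("/collect", { method: "POST", body: JSON.stringify(samples) });
    samples.length = 0;
  }
});
```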
Also, canvas and WebGL rendering differences are each high-entropy enough on their own to pretty much uniquely identify your browser instance. Not a bit of effort has gone into making their results more consistent, I guess.
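For anyone who hasn’t seen it, canvas fingerprinting really is about this short. A sketch using nothing beyond the standard canvas and WebCrypto APIs:

```typescript
// Sketch: draw fixed text/shapes, then hash the pixels. Font rasterization,
// antialiasing and GPU/driver differences make the hash differ per machine.
async function canvasFingerprint(): Promise<string> {
  const canvas = document.createElement("canvas");
  canvas.width = 280;
  canvas.height = 60;
  const ctx = canvas.getContext("2d")!;
  ctx.textBaseline = "alphabetic";
  ctx.font = "16px Arial";
  ctx.fillStyle = "#f60";
  ctx.fillRect(10, 5, 120, 40);
  ctx.fillStyle = "#069";
  ctx.fillText("fingerprint test, <canvas> 1.0", 4, 45);

  // toDataURL needs no permission or prompt; hash the result with WebCrypto.
  const bytes = new TextEncoder().encode(canvas.toDataURL());
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return [...new Uint8Array(digest)]
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}
```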
All of these are accessible to any website by default. Actually, there’s not even a way to turn most of them off. WHY?! They’re niche features that only a tiny fraction of websites need. Browser vendors know that fingerprinting is a problem and have done nothing about it. Not even Firefox.
Why is the web, where you’re by far the most likely to execute malicious code, not built on zero-trust principles? Let me allow the functionality I need on a per-site basis.
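The frustrating part is that a permission model already exists in browsers, it just covers almost nothing. A quick sketch of the contrast: geolocation goes through it and prompts you, while canvas, WebGL, and most sensor access on desktop never touch it at all.

```typescript
// The Permissions API only gates a short list of features (geolocation,
// notifications, camera/mic, ...). Fingerprinting surfaces aren't on it.
navigator.permissions.query({ name: "geolocation" }).then((status) => {
  console.log(status.state); // "granted" | "denied" | "prompt"
});
```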
Fuck everything about modern websites.
Just yesterday I was on a news website. I wanted to support it and the author of the piece, so I opened a clean session of Firefox: no extensions or blocking of any kind.
The “initial” payload (i.e. the point roughly 30 seconds after page load where I lost patience and decided to call it) was 14.79 MB transferred. But the traffic never stopped. In the network view you could see the browser continually running ad auctions, and about every 15 seconds the ads on the page would cycle. The combination of auctions and ad rendering kept that tab pinned at 25-40% of my CPU, and Firefox self-reported the tab as taking over 400 MB of RAM.
This was so egregious that I had to run one simple test. I set my DNS on my desktop to my PiHole and re-ran my experiment.
Initial payload went from 14.79 MB to 4.00 MB (much of which was fonts and oversized preview images for other articles), and the page used about a quarter of the RAM and almost no CPU.
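If anyone wants to reproduce the numbers without staring at the network panel, the Resource Timing API gives a rough total from the devtools console. Rough, because `transferSize` reports 0 for cross-origin resources that don’t send `Timing-Allow-Origin`, so the real figure is usually higher:

```typescript
// Paste into the devtools console: approximate bytes over the wire so far.
const entries = performance.getEntriesByType("resource") as PerformanceResourceTiming[];
const bytes = entries.reduce((sum, e) => sum + e.transferSize, 0);
console.log(`${entries.length} requests, ~${(bytes / 1e6).toFixed(2)} MB transferred`);
```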
Modern web is dogshit.
This was the website in question. https://www.thenation.com/article/politics/welcomefest-dispatch-centrism-abundance/
Dude. I thought that was bad. Just now I went to Ars Technica to view one article and did the same thing to “support” the site. It was 36 MB in one minute.