And since you won’t be able to modify web pages, it will also mean the end of customization, whether for looks (e.g. DarkReader, Stylus), convenience (e.g. Tampermonkey), or accessibility.

The community feedback is… interesting to say the least.

Actually it would make some sense, not that I like it though.

What can you do to prevent scraping? A lot of people are screaming about their IP being used to train AI, but have they actually done anything to tell the world that their texts can’t be used to train AI? Does copyright alone protect against use for AI training? To the best of my knowledge there’s no case law either way. But if you have to circumvent DRM to train AI, then you’ll have a hard time with the “I did not know that I couldn’t do that” defense.

So some news outlets get to protect their precious little articles from the big bad AI, which will probably destroy news as we know it anyway even more than it already has, while the rest of us get force-fed advertisements.

Allow me to sarcastically quote Timbuk3:

Things are going great, and they’re only getting better

So some news outlets get to protect their precious little articles from the big bad AI, which will probably destroy news as we know it anyway

I was thinking about this. What happens when all the big outlets have AI write their news? You can’t get answers about today’s news without feeding the model today’s news. Therefore, somebody has to create the data source.

I see a few scenarios:

  • Google scrapes, aggregates, and summarizes to the point that nobody reads the article/sees the ads and the news site goes under. Then Google has nothing to scrape but press releases and government sources. Or…
  • News sites block access to scrapers and charge for it, but may be wary of crossing their customers (news aggregators) in their coverage.
  • The above creates a tiered system where premium news outlets (AI assisted writing but with human insight) are too expensive for ad supported Google to scrape, so Google gets second tier news from less reliable, more automated sources, or simply makes it themselves. Why not cut out the middle man?
  • Rogue summarizers will still scrape the real news outlets and summarize stories to sell to Google. This will again make paid news a luxury, since someone with a subscription will summarize and distribute the main point (okay) or their spin (bad).

I’m failing to see where this will go well. Is there another scenario?


