
Thanks! I didn’t see that. Relevant bit for convenience:

we call model providers on your behalf so your personal information (for example, IP address) is not exposed to them. In addition, we have agreements in place with all model providers that further limit how they can use data from these anonymous requests that includes not using Prompts and Outputs to develop or improve their models as well as deleting all information received within 30 days.

Pretty standard stuff for such services in my experience.


I’m not entirely clear on which (anti-)features are only in the browser and which are on the website as well. It sounds like they are steering people toward their commercial partners like Binance across the board.

Personally I find the cryptocurrency stuff off-putting in general. Not trying to push my opinion on you though. If you don’t object to any of that stuff, then as far as I know Brave is fine for you.


Short answer: inserting affiliate links into results, and weird cryptocurrency stuff. https://www.theverge.com/2020/6/8/21283769/brave-browser-affiliate-links-crypto-privacy-ceo-apology

I don’t know if that’s “worse than Microsoft” because that’s a real high bar. But it’s different anyway.


If you click the Chat button on a DDG search page, it says:

DuckDuckGo AI Chat is a private AI-powered chat service that currently supports OpenAI’s GPT-3.5 and Anthropic’s Claude chat models.

So at minimum they are sharing data with one additional third party, either OpenAI or Anthropic depending on which model you choose.

OpenAI and Anthropic have similar terms and conditions for enterprise customers. They are not completely transparent, and any given enterprise could have its own custom license terms, but my understanding is that they generally will not store queries or use them for training. For specifics you’d have to ask DDG directly; I was not able to find anything about this in DDG’s privacy policy.

Obviously, this is not legal advice, and I do not speak for any of these companies. This is just my understanding based on the last time I looked over the OpenAI and Anthropic privacy policies, which was a few months ago.


Yeah, I wouldn’t be too confident in Facebook’s implementation, and I certainly don’t believe that their interests are aligned with their users’.

That said, it seems like we’re reaching a turning point for big tech, where having access to private user data becomes more of a liability than an asset. Having access to the data means that they will be required by law to provide that data to governments in various circumstances. They might have other legal obligations in how they handle, store, and process that data. All of this comes with costs in terms of person-hours and infrastructure. Google specifically cited this as a reason they are moving Android location history on-device; they don’t want to deal with law enforcement constantly asking them to spy on people. It’s not because they give a shit about user privacy; it’s because they’re tired of providing law enforcement with free labor.

I suspect it also helps them comply with some of the recent privacy protection laws in the EU, though I’m not 100% sure on that. Again, this is a liability issue for them, not a user-privacy issue.

Also, how much valuable information were they getting from private messages in the first place? Considering how much people willingly put out in the open, and how much can be inferred simply from the metadata they still have access to (e.g. the social graph), it seems likely that the actual message data was largely redundant or superfluous. Facebook is certainly in a position to measure this objectively.

The social graph is powerful, and if you really care about privacy, you need to worry about it. If you’re a journalist, whistleblower, or political dissident, you absolutely do not want Facebook (and by extension governments) to know who you talk to or when. It doesn’t matter if they don’t know what you’re saying; the association alone is enough to blow your cover.
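To make that concrete, here’s a toy sketch in Python (made-up names and a made-up log format, not any real platform’s data model) of how much falls out of delivery metadata alone, with no message content anywhere:

```python
# Toy illustration (hypothetical names and log format, not any real platform's
# data): message *metadata* alone reveals who is associated with whom, even when
# every message body is end-to-end encrypted and unreadable.
from collections import defaultdict

# Each record is (sender, recipient, unix_timestamp); note there is no content field.
metadata_log = [
    ("journalist", "source", 1700000000),
    ("source", "journalist", 1700000060),
    ("journalist", "editor", 1700000120),
    ("editor", "journalist", 1700003600),
]

# Build an undirected "who talks to whom" graph from nothing but the metadata.
social_graph = defaultdict(set)
for sender, recipient, _timestamp in metadata_log:
    social_graph[sender].add(recipient)
    social_graph[recipient].add(sender)

# Anyone holding this log can see the association, no decryption required.
print(sorted(social_graph["source"]))  # ['journalist']
```

Add timestamps and message frequency and you can go a lot further (who talks to whom right before a story breaks, for example), still without decrypting a single message.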

The metadata problem is common to a lot of platforms. Even Signal cannot use E2EE for metadata; they need to know who you’re communicating with in order to deliver your messages to them. Signal doesn’t retain that metadata, but ultimately you need to take their word on that.


Any Safari extensions installed that might be interfering with this behavior? That’s the best I can figure.


Interesting. Are there any other accounts on your phone that provide contacts? Maybe social media or other chat platforms? On Android you can see accounts in Settings > Passwords & Accounts (or somewhere similar; it varies a little between brands). You can also check inside your Contacts app by expanding the sidebar (again, varies by brand).

Just a thought. I don’t have any other contact providers on my phone so I can’t test it myself.

Please keep us posted if you get any official response or learn anything new!


Has anyone else been able to reproduce this? I just tried and was not able to.

OP, is it possible these people were in group chats you were part of?


Key verification has been a real problem for decades, and AFAIK nobody’s made a solution that is simple and effective.


I’ve been using the free version for a couple of years now. If the app weren’t so janky I would have upgraded by now. Camera sync sort of works, but only if I manually open the app. It doesn’t function in the background like FolderSync or most cloud storage apps do, even when I disable battery optimization. I also can’t easily upload large files manually; the upload usually fails halfway through.

This is on Android and has been fairly consistent since Android 11.

I’m still on the hunt for encrypted cloud storage that can sync arbitrary folders, like my camera and Signal backup folders.