I’ve seen posts by the GrapheneOS team recommending against using both F-Droid and Aurora. They raised a decent-sized list of issues with F-Droid. One of the key criticisms against both is that each adds an extra party to trust. You always need to trust the app developer’s code; there’s no way to avoid that. With F-Droid you additionally need to trust that their build system/infrastructure is serving you the app exactly as built from the developer’s code. With Aurora you need to trust that the Aurora devs are giving you the app unmodified from Google.
There were other criticisms of F-Droid, such as signing almost all apps with their own key rather than the developer’s. They do offer to serve apps signed with the developers’ keys, but it’s difficult to set up and not many apps implement it. Google Play does the same thing though, so I don’t feel this risk is that big. Generally they seem to recommend getting apps directly from developers rather than via a third party. Accrescent, which they offer in the GrapheneOS app store, is designed for this; AFAIK it just pulls the builds straight from GitHub.
All that said, I prefer to get all my apps from F-Droid (NeoStore technically) and use Aurora for anything without an F-Droid repo.
The UnifiedPush server is intended to be a single source your phone keeps one persistent connection open to, rather than needing a connection per service/app (Google’s Firebase notifications work the same way: one shared connection for everything).
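To make that concrete, here’s a tiny Python sketch of the idea. It’s not the real UnifiedPush protocol or any actual distributor code; the tokens, apps, and message format are invented purely for illustration. The point is one connection for the whole phone, with each incoming event routed to whichever app registered it.

```python
# Conceptual sketch only: one persistent connection shared by many apps.
# The tokens, apps, and message format here are invented for illustration.

registered_apps = {
    "token-tusky": lambda msg: print("deliver to Tusky:", msg),
    "token-molly": lambda msg: print("deliver to Molly:", msg),
}

def fake_server_events():
    """Stand-in for the single long-lived connection to the push server."""
    yield {"token": "token-molly", "body": "wake up"}
    yield {"token": "token-tusky", "body": "new post"}

def run_distributor():
    # One connection for the whole phone; each event is routed by token to the
    # app that registered it, instead of every app holding its own socket.
    for event in fake_server_events():
        handler = registered_apps.get(event["token"])
        if handler:
            handler(event["body"])

if __name__ == "__main__":
    run_distributor()
```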
As Signal doesn’t support UnifiedPush, MollySocket keeps a permanent connection open to Signal’s servers to listen for new activity and forwards a wake-up to your UnifiedPush server. This saves your phone from keeping its own permanent connection open to Signal’s servers and draining more battery.
For Signal/Molly, as I understand it, it’s less that the notification is encrypted and more that the notification content is just “Hey! Stuff happened.” The app then reaches out directly to Signal’s servers to see what’s new, so the message content is never sent via the push notification service (UnifiedPush or Google’s).
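A rough sketch of that flow in Python, as I understand it. None of this is real MollySocket or Molly code and every function name is made up; it’s just the shape of it: the push carries no message content, and the actual messages are fetched by the app directly.

```python
# Conceptual sketch of the flow described above; not real MollySocket or Molly
# code, just the shape of it. All function names here are made up.

def send_unifiedpush_wakeup(payload: str) -> None:
    # Stand-in for an HTTP POST to the phone's UnifiedPush endpoint.
    print("push sent (no message content):", payload)

def fetch_pending_from_signal() -> list:
    # Stand-in for the app's own authenticated connection to Signal's servers,
    # where the actual (encrypted) messages are downloaded and decrypted.
    return ["hello from Signal"]

def mollysocket_side() -> None:
    # Runs on a server: watches the Signal account for activity and forwards
    # a content-free wake-up through UnifiedPush.
    send_unifiedpush_wakeup("new activity")

def molly_side() -> None:
    # Runs on the phone when the wake-up arrives: ignore the push body and
    # fetch what's new directly from Signal instead.
    for message in fetch_pending_from_signal():
        print("notify user:", message)

mollysocket_side()
molly_side()
```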
That would require a lot of data privacy concerns to be addressed, even if it were an explicit opt-in. The current method uses sample text which can’t include PII; using user-supplied text would almost guarantee names and other PII end up in their data set.
I also imagine it’s harder to train the model when you don’t know exactly what the user was trying to type. I.e. Was the swipe detection wrong, or did the user delete the word because they changed their mind on what to write?
The issue isn’t a big deal for the average user. The vulnerability required an attacker to first get your username and password, physically steal your Yubikey, and spend half a day with $10-15k worth of electronics equipment repeatedly authenticating over and over; only then could they potentially make a clone of the key.
When I migrated emails last time, I set up my old email to automatically forward to the new one. Then on the new email, I set up an automatic label for anything addressed to the old address. Every week or two I’d review what had been sent to it and either update the email address used or unsubscribe. Eventually it got to the point where I wasn’t getting much at the old address anymore, and I finally deleted it.
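If your provider supports IMAP, that weekly review can be as simple as listing everything still addressed to the old account. A rough sketch below; the server, credentials, folder, and addresses are placeholders, not anything from my actual setup.

```python
# Rough sketch: list senders still writing to the old address so you know
# what to update or unsubscribe from. Server, credentials, and addresses
# below are placeholders.
import imaplib
import email

OLD_ADDRESS = "old@example.com"

with imaplib.IMAP4_SSL("imap.example.com") as imap:
    imap.login("me@example.com", "app-password")
    imap.select("INBOX")
    # Standard IMAP SEARCH for messages addressed To: the old account.
    _, data = imap.search(None, "TO", OLD_ADDRESS)
    for num in data[0].split():
        _, msg_data = imap.fetch(num, "(RFC822.HEADER)")
        msg = email.message_from_bytes(msg_data[0][1])
        print(msg.get("From"), "-", msg.get("Subject"))
```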
I’ve swapped to using it since I switched to GrapheneOS. The only apps I’ve got using it so far are Tusky (Mastodon), Molly (a Signal fork with UnifiedPush support), and some of my self-hosted stuff that allows for webhooks.
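The nice part for the self-hosted side is that a UnifiedPush endpoint behaves like a plain webhook: anything that can make an HTTP POST can send a notification. A minimal sketch, where the endpoint URL is a placeholder for whatever your distributor/server hands out:

```python
# Minimal sketch: a self-hosted script sending a push notification by
# POSTing to a UnifiedPush endpoint. The URL below is a placeholder.
import urllib.request

ENDPOINT = "https://push.example.org/UP?token=def456"  # placeholder

def notify(text: str) -> None:
    req = urllib.request.Request(ENDPOINT, data=text.encode("utf-8"), method="POST")
    with urllib.request.urlopen(req) as resp:
        resp.read()  # ignore the body; a 2xx status means the push was accepted

notify("Backup job finished")
```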
I really hope it catches on in more apps. Especially as their library has automatic fallback to Google’s service.
It’s not about it being locked; it’s about being able to re-lock it after unlocking. You can unlock it, flash something like GrapheneOS onto it, and then re-lock it. If it’s left unlocked, anyone with a few minutes of access to your phone could flash anything over the top, allowing them to bypass the standard protections and install any app at the system level.
That doesn’t hold up against the publicly available source code for their applications, white papers on their security and encryption, and multiple independent security reviews. And again, they are legally required to ignore US court orders. Only a Swiss court order can compel them to provide user information.
Got a source for that? Proton isn’t able to access any user emails. I believe Swiss law also makes it illegal for them to provide user information without a (Swiss) court order.
The only similar case I’ve heard of was when a Swiss court ordered them to provide all the info they had on a user. That turned out to be the last IP address the user had logged in from and a recovery email they had entered. The recovery email is an optional thing the user had set up on their account, and they had used that same address to sign up for a Twitter account; the authorities were then able to get enough data from Twitter to identify the person.
CoMaps is a recent fork of Organic Maps, so the two are pretty similar in terms of functionality at the moment. Osmand, I’d say, has a lot more features and customisation options, but Organic Maps/CoMaps are faster and more responsive.