Hello Everyone,
This is something I’ve been thinking about in the wake of many users joining Signal due to WhatsApp’s new privacy policy changes.
When it comes to the mobile client (in the case of Android), we can verify its integrity by auditing the source code and confirming that the APK we install actually matches it, using reproducible builds (https://signal.org/blog/reproducible-android/).
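For anyone unfamiliar with what that verification actually involves: Signal’s guide has you build the APK yourself and diff it against the one distributed through the Play Store. The sketch below is just a toy illustration of that comparison, not Signal’s actual apkdiff script, and the assumption that only the META-INF/ signing files legitimately differ is mine:

```python
# Toy comparison of a locally built APK vs. the distributed one.
# NOT Signal's official apkdiff tool, just an illustration: with a
# reproducible build, every entry except the signing metadata
# (assumed here to live under META-INF/) should hash identically.
import hashlib
import sys
import zipfile

def entry_hashes(apk_path):
    """Map each file inside the APK (a ZIP archive) to its SHA-256, skipping META-INF/."""
    hashes = {}
    with zipfile.ZipFile(apk_path) as apk:
        for name in apk.namelist():
            if name.startswith("META-INF/"):
                continue  # signing info legitimately differs between builds
            hashes[name] = hashlib.sha256(apk.read(name)).hexdigest()
    return hashes

if __name__ == "__main__":
    built, official = sys.argv[1], sys.argv[2]
    a, b = entry_hashes(built), entry_hashes(official)
    mismatches = {n for n in a.keys() | b.keys() if a.get(n) != b.get(n)}
    print("APKs match" if not mismatches else f"Differences in: {sorted(mismatches)}")
```

If the build really is reproducible, running this against your own build and the Play Store APK should report no differences outside the signing metadata.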
The server, on the other hand, could be compromised in many ways.
My question is: when it comes to privacy & security, does server integrity matter if we are reasonably sure the client isn’t compromised in any way and doesn’t transmit anything that the server could access in a meaningful way?
And this could apply to any service that has both a FOSS client & server, or just a FOSS client.
Thanks for the explanation.
So, hypothetically speaking, can we say that it’s alright for a messaging service to keep its server closed source as long as it has features similar to the following?
No, because if the server is closed source, nobody can spin up another network and distribute a modified version of the app that connects to that new network. That means the owner can publish updates that reduce the privacy of the app without actually facing the risk of a fork (this is exactly what WhatsApp is doing).
The Signal server code is open source: https://github.com/signalapp/Signal-Server
But in the end it mainly depends on what your definition of “alright” is. My #1 concern is surveillance capitalism, so my definition of “acceptable” relies a lot on the service not being run by a corporation that is in the ad business. The fact that it is FLOSS, especially under copyleft licences, pretty much guarantees that the app won’t be bought by an ad company (like WhatsApp was).
Can you please elaborate on this?
For example, removing the encryption so that the contents of the messages can be used for ad targeting. If the server is closed source, no one can fork the app and build an alternative network that keeps the encryption; they would have to rebuild the whole server side from scratch. One could keep using the old version of the app, but they would likely end up being booted off by the servers.
If Signal did that, it is very likely that third-party clients and servers would quickly pop up and keep the encryption.
These are all just hypothetical scenarios I thought about.
From what I’ve read, the client sends as little as possible, and what it does send is encrypted, so the server can’t interpret it meaningfully. Let’s say I’ve installed the client from an APK which I know has not been compromised.
So either the client becomes unusable because the server tries to mess with the encryption, or the server simply stops accepting requests from that client.
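To make the encryption point concrete, here’s a toy sketch using PyNaCl’s Box. It is nothing like the real Signal Protocol (no X3DH, no Double Ratchet), but it shows why a relay server that never holds the private keys only ever sees ciphertext:

```python
# Toy end-to-end encryption demo with PyNaCl (pip install pynacl).
# NOT the Signal Protocol; it just illustrates that a relay server
# which never holds the private keys only ever handles ciphertext.
from nacl.public import PrivateKey, Box

# Each client generates its own keypair; private keys never leave the device.
alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
ciphertext = Box(alice_sk, bob_sk.public_key).encrypt(b"meet at noon")

# The server only relays `ciphertext`; without bob_sk it cannot decrypt it.
# Bob decrypts on his device with his private key and Alice's public key.
plaintext = Box(bob_sk, alice_sk.public_key).decrypt(ciphertext)
print(plaintext)  # b'meet at noon'
```

The real protocol goes much further (keys are ratcheted per message for forward secrecy), but the relay’s role is the same: it passes along ciphertext it cannot read.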
Almost anyone can be socially engineered into accepting a compromised app update, but that is a moot point anyway.
Why would you use a service that connects your device to a US-based and likely compromised server if there are alternatives that can be hosted locally? It doesn’t really matter that the service only sends minimal, encrypted data, because in the age of big data that is plenty for ML-based correlation. No data shared is always better :)
I totally get that. But it’s an uphill battle to get people you know to switch to a centralized alternative, let alone a decentralized/P2P/self-hosted one.