Hello Everyone,

This is something I’ve been thinking about in the wake of many users joining Signal due to WhatsApp’s new privacy policy changes.

When it comes to the mobile client (in the case of Android), we can verify its integrity by auditing the source code and confirming that the APK matches it using reproducible builds (https://signal.org/blog/reproducible-android/).
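
Roughly, that verification boils down to building the APK yourself from the published source and diffing it, entry by entry, against the APK you actually installed, ignoring the signing metadata that legitimately differs. A rough sketch of the idea (this is not Signal’s official apkdiff script, and the file names are just placeholders):

    # Sketch of a reproducible-build check: compare two APKs entry by entry,
    # skipping META-INF/ (signatures legitimately differ between a self-built
    # APK and the one Signal distributes). Illustration only.
    import hashlib
    import zipfile

    def entry_hashes(apk_path):
        hashes = {}
        with zipfile.ZipFile(apk_path) as apk:
            for name in apk.namelist():
                if name.startswith("META-INF/"):
                    continue  # signing data is expected to differ
                hashes[name] = hashlib.sha256(apk.read(name)).hexdigest()
        return hashes

    built = entry_hashes("Signal-built-from-source.apk")       # placeholder name
    installed = entry_hashes("Signal-pulled-from-device.apk")   # placeholder name

    differing = sorted(n for n in built.keys() & installed.keys() if built[n] != installed[n])
    only_in_one = sorted(built.keys() ^ installed.keys())
    print("entries that differ:", differing or "none")
    print("entries present in only one APK:", only_in_one or "none")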

The server, on the other hand, could be compromised in many ways.

My question is: when it comes to privacy and security, does server integrity matter if we are reasonably sure the client isn’t compromised in any way and doesn’t transmit anything the server could access in a meaningful way?

And this could apply to any service that has both a FOSS client and server, or just a FOSS client.

  • Dreeg Ocedam@lemmy.ml · 4 years ago

    If Signal’s servers become compromised:

    • The messages of already-open channels stay encrypted [1].
    • The attacker can see how frequently you receive new messages, but not from whom, thanks to Sealed Sender (provided the sender uses a VPN [2]).
    • The attacker cannot know how frequently, or to whom, you send messages, again thanks to Sealed Sender (and again only if you use a VPN [2]).
    • The attacker can compromise any new communication you open, though if Sealed Sender is enabled for unknown senders, they can’t know who you are opening the channel to before compromising it.
    • You can detect any compromised channel by checking the safety code of each of your conversations (a toy sketch of why that check works is below, after the footnotes).
    • They can prevent you from receiving or sending any messages, but thanks to Sealed Sender they can’t pick which contact they cut you off from.
    • If you have given Signal access to your address book, they might be able to get all the phone numbers in it [3].
    • In theory they can’t know the members of each group, but with clever timing analysis of who receives messages when, they might be able to get an idea of who is in the same group (a rough sketch of that is at the end of this comment).
    1. By channel I mean any conversation between two people. My understanding is that groups work by you sending the same message to everyone in the group, so a group is just a collection of one-to-one conversations between every pair of its members.
    2. If you don’t use a VPN or something similar, they could likely just track the sender’s IP address to get an idea of who is sending the message, which makes Sealed Sender much less effective.
    3. This is a bit tricky because of private contact discovery, but my understanding is that if Intel is compromised too, the attacker could nullify the benefits of this tech. If someone with deeper knowledge of the SGX secure enclave could confirm this, that’d be great.
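
    To illustrate why checking the safety code catches a compromised channel: both clients derive the same digits from the two identity keys, so if the server swaps in its own key to intercept, the displayed numbers stop matching. A toy sketch of the idea (this is not Signal’s actual safety-number algorithm, just the general shape of it):

        # Toy illustration, NOT Signal's real algorithm: both parties derive the
        # same digits from the two identity public keys, so a key substituted by
        # a compromised server produces digits that no longer match.
        import hashlib

        def toy_safety_number(key_a: bytes, key_b: bytes) -> str:
            combined = b"".join(sorted((key_a, key_b)))  # order-independent
            digest = hashlib.sha512(combined).digest()
            digits = str(int.from_bytes(digest[:25], "big")).zfill(60)[:60]
            return " ".join(digits[i:i + 5] for i in range(0, 60, 5))

        alice, bob, mallory = b"alice-pubkey", b"bob-pubkey", b"mallory-pubkey"
        print(toy_safety_number(alice, bob))      # what Alice and Bob should both see
        print(toy_safety_number(alice, mallory))  # what Alice sees if Bob's key was swapped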

    Disclaimer: I am not an expert on Signal or secure messaging in general; this is just my understanding of the tech behind Signal.
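
    To make the timing-analysis point above a bit more concrete, here is a rough sketch with entirely made-up data: an observer who only sees when each account receives a message can score pairs of accounts by how often their deliveries coincide, which hints at shared group membership.

        # Rough sketch of the timing-analysis idea: score account pairs by how
        # often they receive messages within a small time window. Made-up data.
        from collections import Counter
        from itertools import combinations

        def co_delivery_scores(deliveries, window=2.0):
            """deliveries: account -> sorted delivery timestamps (seconds)."""
            scores = Counter()
            for a, b in combinations(deliveries, 2):
                i = j = 0
                ta, tb = deliveries[a], deliveries[b]
                while i < len(ta) and j < len(tb):
                    if abs(ta[i] - tb[j]) <= window:
                        scores[(a, b)] += 1
                        i, j = i + 1, j + 1
                    elif ta[i] < tb[j]:
                        i += 1
                    else:
                        j += 1
            return scores

        observed = {
            "alice":   [10.0, 55.2, 300.1],
            "bob":     [10.4, 55.9, 300.5],
            "charlie": [120.0, 480.7],
        }
        for pair, hits in co_delivery_scores(observed).most_common():
            print(pair, hits)  # alice/bob score highest -> likely share a group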

    • Rugged Raccoon@lemmy.ml (OP) · 4 years ago

      Thanks for the explanation.

      So, hypothetically speaking, can we say that it’s alright for any messaging service to keep its server closed source as long as it has features similar to the following?

      • Both sender & receiver use a VPN
      • Sealed Sender
      • Private contact discovery
      • Safety code of conversations
      • Any other strong features Signal has
      • Dreeg Ocedam@lemmy.ml · 4 years ago

        No, because if the server is closed source, nobody can spin up another network and distribute a modified version of the app that connects to it. That means the owner can publish updates that reduce the privacy of the app without actually facing the risk of a fork (this is exactly what WhatsApp is doing).

        The Signal server code is open source: https://github.com/signalapp/Signal-Server

        But in the end it mainly depends on what your definition of “alright” is. My number one concern is surveillance capitalism, so my definition of “acceptable” relies a lot on the service not being run by a corporation that is in the ad business. The fact that it is FLOSS, especially under a copyleft licence, pretty much guarantees that the app won’t be bought by an ad company (like WhatsApp was).

        • Rugged Raccoon@lemmy.ml (OP) · 4 years ago

          in that case the owner can publish updates that reduce the privacy of the app

          Can you please elaborate on this?

          • Dreeg Ocedam@lemmy.ml · 4 years ago

            For example, removing the encryption so that the contents of messages can be used for ad targeting. If the server is closed source, no one can fork the app and build an alternative network that keeps the encryption; they would have to rebuild the whole server side from scratch. One could keep using the old version of the app, but it is likely they would end up being locked out by the servers.

            If Signal does that, it is very likely that third-party clients and servers would pop up quickly and keep the encryption.

            • Rugged Raccoon@lemmy.ml (OP) · 4 years ago

              These are some hypothetical scenarios I’ve been thinking about.

              As far as I’ve read, the client sends as little as possible, and what it does send is encrypted, so the server can’t interpret it meaningfully. Let’s say I’ve installed the client from an APK that I know has not been compromised.

              Then either the client becomes unusable because the server tries to mess with the encryption, or the server simply stops accepting requests from that client.
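
              Just to illustrate what I mean by the server not being able to interpret it meaningfully: everything the server relays is essentially an opaque blob plus a recipient identifier. A toy sketch (Fernet from the Python cryptography package is only a stand-in here; Signal really uses the Double Ratchet, and the recipient id is made up):

                  # Toy illustration: the relaying server only ever sees a recipient id
                  # and opaque ciphertext. Fernet stands in for Signal's real end-to-end
                  # encryption (the Double Ratchet); the key lives only on the clients.
                  from cryptography.fernet import Fernet

                  shared_key = Fernet.generate_key()   # agreed end to end; never sent to the server
                  envelope = {
                      "recipient": "user-4721",        # hypothetical identifier
                      "ciphertext": Fernet(shared_key).encrypt(b"meet at 6?"),
                  }

                  # This is all the server learns; without shared_key the ciphertext
                  # tells it nothing about the message.
                  print(envelope["recipient"], envelope["ciphertext"][:16])

                  # The receiving client, holding the same key, recovers the plaintext.
                  print(Fernet(shared_key).decrypt(envelope["ciphertext"]))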

              • poVoq@lemmy.ml · 4 years ago

                Almost anyone can be socially engineered into accepting a compromised app update, but that is a moot point anyway.

                Why would you use a service that connects your device to a US-based and likely compromised server if there are alternatives that can be hosted locally? It doesn’t really matter that the service only sends minimal, encrypted data, because in the age of big data that is plenty for ML-based correlation. No data shared is always better :)

                • Rugged Raccoon@lemmy.ml (OP) · 4 years ago

                  I totally get that. But it’s an uphill battle to get people you know well to switch to a centralized alternative, let alone a decentralized/P2P/self-hosted one.