Hello,

I recently read an article on backdoors in Signal from Free Software Foundation Europe (actually got it from https://lemmy.ml/post/47901).

Apparently, Signal relies on Google binary code for the location-sharing feature. They say it’s possible to update that binary code at runtime, so that the next time Signal starts, it would run the updated code.

The code is then executed in the Signal process, which includes access to the Signal history database and the crypto keys.

Now, we could always say that something like this wouldn’t be used against common people and would most likely be targeted at people like journalists, whistleblowers & other parties of concern. But, if Signal is recommended as a tool for everyone, shouldn’t this be considered?

I haven’t read the Signal-Android source myself and would like to know whether this is an actual concern and whether it’s still relevant now.

The following is quoted content from the article on the Google Maps integration for the location-sharing feature.

---
Maps Integration

After selecting to share a location, Signal shows a small map with a pin on the selected location. After the map is loaded, a screenshot of it is sent as an image to the other side, together with some string describing the location.

The relevant thing here is the displaying of the map. This is done by embedding the MapView in the conversation screen and showing it when there is something to show, which means the MapView is already initialized when you open a conversation in Signal. The critical point is that the MapView as used by Signal is just a wrapper that loads the actual MapView by including code from the Google Play Services binary (that is, code outside of the apk file you meant to use). This code is included by calling the createPackageContext method together with the flags CONTEXT_INCLUDE_CODE and CONTEXT_IGNORE_SECURITY. The latter is required because the Android system would otherwise (for good reason) refuse to load code from untrustworthy sources. The code is then executed in the Signal process, which includes access to the Signal history database and the crypto keys.
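To make the hazard concrete for readers who don’t know Android internals: `createPackageContext` with `CONTEXT_INCLUDE_CODE` pulls another package’s classes into the caller’s own process, so the loaded code runs with the host app’s permissions and sees the host app’s memory. The same principle can be sketched in Python (chosen purely as an illustration; the real mechanism is Android/Java, and the names `SECRET_KEYS` and `dynamite_module` below are invented for the example):

```python
import types

# Pretend this is the host app's sensitive in-process state
# (analogous to Signal's open history database and crypto keys).
SECRET_KEYS = {"identity_key": "hunter2"}

# Pretend this source arrived from an externally updatable binary
# (analogous to code loaded from the Play Services binary at runtime).
untrusted_source = """
def run(host_globals):
    # Loaded code runs inside the host process: nothing stops it
    # from reading the host's secrets once it has been loaded.
    return host_globals["SECRET_KEYS"]["identity_key"]
"""

# Load the external code into this process, in the same spirit as
# CONTEXT_INCLUDE_CODE loading classes into the caller's process.
module = types.ModuleType("dynamite_module")
exec(untrusted_source, module.__dict__)

leaked = module.run(globals())
print(leaked)  # -> hunter2
```

The point is not the loading API but the trust boundary: once code from elsewhere executes in-process, the process’s data is its data.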

The Google Play Services binary can easily be updated in the background through the Google Play Store, even targeted at single users, and the updated code would become active inside Signal the next time you use it. Can it get worse? Yes. An apk update would be detectable by the user, but Google Play Services uses a dynamic module-loading system (called Chimera and/or Dynamite) that seems to be capable of replacing the Maps implementation from a file not installed to the system, as long as it’s signed by Google. If it is possible for Google to push an update only for this module and remove it later, it might be possible for them to inject code into the Signal client that uploads your complete local chat history unencrypted and afterwards removes all signs of its existence.

What does “seems to be able” mean? Well, it’s hard to determine exactly. The relevant binary is highly obfuscated and thus hard to understand. Maybe someone wants to spend their time on this, but remember it can be changed again in the next release…
---
  • @Bloodaxe@lemmy.ml · 8 points · 3 years ago

    Is there a particular reason why Signal doesn’t just use OpenStreetMap? I don’t really like the look of this 😑

  • @BlackCentipede@lemmy.ml · 7 points · 3 years ago (edited)

    I would consider it compromised at that point.

    If you have any closed-source Google module/binary (let alone one that auto-updates itself…) without any form of isolation or sandboxing, your software is compromised, period. The thing about privacy is that you can’t take a scout’s honor when it comes to dealing with that; you have to ENFORCE IT and take the time to AUDIT IT, which requires a level of transparency, i.e. you need open source.

    This is why we don’t trust cryptographic algorithms that are closed-source; the same concept applies to software that is closed-source.
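    [Editorial aside: the isolation point above can be illustrated with a minimal sketch. Process isolation is one simple form of sandboxing: code run in a separate process gets its own memory and cannot see the host’s secrets, in contrast to the in-process loading discussed earlier. A Python sketch of the distinction (the `SECRET` name is invented for the example, and real sandboxes also restrict files, network, and IPC):]

```python
import subprocess, sys

SECRET = "identity-key-material"  # lives only in the host process

untrusted = """
# Isolated code: runs in its own process with its own memory.
# The host's SECRET simply does not exist here.
print("SECRET" in dir())
"""

# Process isolation: the child inherits no host memory,
# only whatever the host explicitly passes to it.
result = subprocess.run(
    [sys.executable, "-c", untrusted],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())  # -> False
```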

    • poVoq · 9 points · 3 years ago

      It seems like you can run Signal without any Google services, and then the location picker simply won’t work.

      I don’t want to defend Signal; this looks a bit like a lazy shortcut with some security implications, but in the end it boils down to the fact that if you use an Android phone with the Google rootkit installed, no amount of hardening of the Signal app itself would help.

      • @BlackCentipede@lemmy.ml · 6 points · 3 years ago

        Pretty much. If some level of confidentiality is required, then a hardened Linux computer running 100% open-source software would be necessary, but even that requires an audit by the user, which is already exceedingly difficult and mentally exhausting.

        Privacy/security is one of the most convoluted topics in the IT field as a whole, because there are simply too many areas that have to be defended and the chain is only as strong as its weakest link. (Hell, we have to start worrying about cell phones picking up cryptographic keys from CPU cache/RAM…)

      • Rugged Raccoon (OP) · 5 points · 3 years ago

        Yeah, I can understand. For example, when targeted individuals are considered, they would probably be using a de-googled version of Android, so either the location services won’t work, or they will use clients like Molly-FOSS.

      • @nutomic@lemmy.ml · 4 points · 3 years ago

        I disagree. Signal markets itself as being very secure, so they shouldn’t take those kinds of shortcuts (at least not without telling the user). And sure, if your phone has Google apps that is already bad, but that’s no excuse to make it worse and include even more Google code.

        • poVoq · 4 points · 3 years ago

          Sure, but this “backdoor” can only be used by Google, so the threat level is pretty much the same as with the Google stuff itself alone.