@iortega · 9 points · 1 year ago

Is this really a “proof”? I can’t say I’m a Signal hater, but not a lover either; however, I’m not sure that Signal itself explaining why Signal is privacy friendly is enough to consider their service and products privacy friendly. It might just be my opinion, though. I’m too used to companies providing equivalent arguments.

@blkpws@lemmy.ml · 7 points · 1 year ago

It’s like Telegram saying it is privacy friendly and open source friendly. Or Proton saying it is privacy friendly and then requesting the personal identification of a blog’s owner.

@gmate8@lemmy.ml (creator) · 5 points · 1 year ago (edited)

But you know they can’t say no to the authorities. If they are told to start logging user activity on an account, they have to; otherwise they would become criminals themselves. And this really was the extreme end of these cases. They decline the requests they can. But there are situations where even Swiss law can’t protect you. If you are really concerned, you should try self-hosting.

@RushKitty@lemmy.ml · 3 points · 1 year ago

Can’t agree with you more. When they ask for your phone number, it means no privacy.

Wrong: a phone number relates to anonymity, not privacy. Privacy, security and anonymity are three different things.

@RushKitty@lemmy.ml · 3 points · 1 year ago

What you said is very enlightening; I didn’t realize these differences before.

@pancake@lemmy.ml · 2 points · 1 year ago

I’m a little out of the loop. What’s going on with Telegram?

Jedrax · 3 points · 1 year ago

I think the point is that they were ordered to return records by law, and Signal made a legally binding response that they don’t have any records. They are demonstrating that they actually have no information on users. Would that be considered proof? Yes. It absolutely would be.

I have to ask: did you read the court order and the response? Not just the summary they wrote about it?

@blkpws@lemmy.ml · 6 points · 1 year ago (edited)

Signal started keeping some code closed, so you can’t check what they’re doing now. If the FBI asks them for user data, they are forced by law to share it, and the same law forbids them from warning users or making it public. ¯\_(ツ)_/¯

weex · 3 points · 1 year ago

My understanding is that code was limited to anti-spam. There’s going to be some level of trust involved with using a centralized service so I don’t see that it’s such a huge issue, even as someone who prefers to use decentralized and FLOSS for as much as possible.

@KLISHDFSDF@lemmy.ml · 2 points · 1 year ago (edited)

They’re hiding the function (rules) that will trigger a captcha response in the client if they get enough reports that it’s a spammer, after which the client will be unable to continue to send messages until the captcha is solved. That’s it. The reason you can’t check how they’re doing it is because the spammers would just read it as instructions on how to avoid getting caught.

Communication and messaging, everything, is still E2EE. Nobody is getting anything out of this. If the FBI asks them to get user data, they will be unable to share anything with them. They don’t need to warn users because they don’t keep any data anyway, as can be seen from the multiple subpoenas they’ve fought to make public, which continue to provide no useful info.
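To make the mechanism described above concrete, here is a minimal sketch of a client-side captcha gate. All names (`Conversation`, `on_server_flag`, and so on) are hypothetical; this is not Signal’s code, and the hidden part in the real client is precisely the rule set that decides when an account gets flagged.

```python
# Hypothetical sketch of a client-side "captcha gate": the server flags a
# sender that has been reported enough times, and the client refuses to send
# further messages until the challenge is solved. Names are made up.

from dataclasses import dataclass, field


@dataclass
class Conversation:
    peer: str
    challenge_pending: bool = False
    outbox: list[str] = field(default_factory=list)


def on_server_flag(conv: Conversation) -> None:
    """Server tells the client this account was reported as a likely spammer."""
    conv.challenge_pending = True


def send_message(conv: Conversation, text: str) -> bool:
    """Queue the message only if no unsolved challenge is blocking sends."""
    if conv.challenge_pending:
        return False  # client refuses to send until the captcha is solved
    conv.outbox.append(text)
    return True


def on_captcha_solved(conv: Conversation) -> None:
    """Solving the challenge unblocks sending again."""
    conv.challenge_pending = False
```

Message content never needs to leave the device for this to work; only the flag and the solved challenge are exchanged.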

@blkpws@lemmy.ml · 1 point · 1 year ago

What kind of magic is needed to block a contact if it’s reported as spam more than x times in x time? I don’t see the need to start making things closed source…

@KLISHDFSDF@lemmy.ml · 1 point · 1 year ago

A simple system like that is easy to implement. I don’t think anyone is questioning that they could build the worst attempt at an anti-spam system, like the one you’re suggesting. The types of spam you see on modern systems need a bit more thought than “block if reported more than x times in x time”, because you could easily target people and disable them remotely by coordinating attacks.

So yeah, it’s not magic if you want a dumb system that may introduce other problems, but you really have to think about things sometimes if you want it to work well in the long run.
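For illustration, here is a toy version of the “block if reported more than X times in a window” rule discussed above, with made-up names and thresholds. It also shows the failure mode mentioned in the reply: a handful of coordinated reports is enough to silence an innocent account.

```python
# Toy report-threshold blocker: block an account once it receives more than
# REPORT_LIMIT reports within WINDOW_SECONDS. Purely illustrative; the point
# is that any group coordinating enough reports can disable an innocent user.

import time
from collections import defaultdict, deque
from typing import Optional

REPORT_LIMIT = 5        # X reports ...
WINDOW_SECONDS = 3600   # ... within Y seconds

_reports: dict[str, deque] = defaultdict(deque)


def report(account: str, now: Optional[float] = None) -> bool:
    """Record a report and return True if the account is now blocked."""
    now = time.time() if now is None else now
    window = _reports[account]
    window.append(now)
    # drop reports that fell out of the time window
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    return len(window) > REPORT_LIMIT


if __name__ == "__main__":
    # A coordinated attack: six accounts report the same (innocent) user at once.
    blocked = any(report("innocent_user") for _ in range(6))
    print("blocked:", blocked)  # True -- the naive rule is trivially weaponized
```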

@blkpws@lemmy.ml · 2 points · 1 year ago

I never get any spam in my chats; if someone tries to sell me something I just block them. I still don’t see why they want a super secure smart system that blocks with a captcha… On Telegram, for example, you can add your own bot to kick bot users. If you get a direct message you can just block and report. Signal requires a phone number, so it should be easy to blacklist bot/scam/spam accounts.

@KLISHDFSDF@lemmy.ml · 1 point · 1 year ago

I never get any spam in my chats

I’ve never crashed my car, should everyone get rid of their car’s seat belts?

Your experience does not represent the world. I’ve only experienced two cases of spam on Signal, both within the last year; before that I had zero spam in the many years I’ve been using it. So, while my anecdote is just as invalid as your single point of data, there’s definitely a trend of increased spam as a service gains popularity, and it makes sense that they’re looking at enhanced methods to block spammers.

I still don’t see why they want a super secure smart system that blocks with a captcha

You don’t understand why Signal, one of the most secure messaging platforms available, wants a super secure smart system to block spammers? I think you answered your own question.

On Telegram, for example, you can add your own bot to kick bot users. If you get a direct message you can just block and report

Telegram stores all your data and can view everything you do - unless you opt into their inferior E2EE chat solution known as “Secret Chats” - so it’s easier for them to moderate their services. When you report someone, Telegram moderators see your messages for review [0] and can limit an account’s capabilities. Signal can’t view your messages because everything is E2EE; nobody but the intended recipient can read them, so there is nothing for moderators to review.

As you can see, without even digging into it too much, I’ve already found one case where Signal faces challenges not present in Telegram. Things aren’t always as simple as they seem, especially not for Signal, as they’ve worked their asses off to ensure they have as little data on their users as possible.

[0] https://www.telegram.org/faq_spam#q-what-happened-to-my-account

@blkpws@lemmy.ml · 2 points · 1 year ago

I know Telegram isn’t privacy respectful xD

I’ve never crashed my car, should everyone get rid of their car’s seat belts?

Just block and done.

one of the most secure messaging platforms available

Briar is probably more secure, and it’s not the only secure chat app in this world; Signal isn’t the MOST SECURE one xD.

Telegram stores all your data and can view everything you do

I never said they don’t; the point of my example is that you can block and report with a single button.

Your examples are things I already know, and I still think it’s not worth making it closed source. I don’t see the point of making it closed source to make it “the most secure app”, as you say; the FBI could control that closed-source app to track users perfectly, because if the FBI asks for a backdoor you are forced to make it BY LAW and you can’t even tell this to anyone BY LAW. Open source is always a way to respect people’s rights; closed source is not the way.

@KLISHDFSDF@lemmy.ml · 1 point · 1 year ago

Briar is probably more secure, and it’s not the only secure chat app in this world; Signal isn’t the MOST SECURE one xD.

A communication platform is only as good as its feature set, ease of use, and accessibility. I’m not going to ask my grandma to install Briar; hell, half my friends and family with iPhones can’t even install it, there’s no app for it. I would consider my PGP signed/encrypted text files delivered via carrier pigeon even more secure than Briar, but who would I even talk to? Maybe Briar will be a great alternative in the future, but it has a lot of ground to cover. Also, Signal is fully E2EE; that’s what I want, that’s what I care about right now. I’m keeping an eye on Briar, but I’m not asking anyone to install it yet.

Just block and done.

You’re simplifying a problem in a domain you seem to have zero experience with. I will just leave it at that, as the examples in my previous reply didn’t seem to click.

if the FBI asks for a backdoor you are forced to make it BY LAW and you can’t even tell this to anyone BY LAW

This is a lie.

Forced labor in the US is illegal. The FBI cannot force you or an organization to work without compensation. As such, the FBI cannot compel software developers to work (modify their code to make it less secure) without breaking the law.

The All Writs Act forces companies to assist in investigations by providing data they already have (which Signal gladly does [1]), but it does not grant the ability to force someone to work (which is what software development is, and what would be required to backdoor their own systems).

[0] https://www.beencrypted.com/news/apple-vs-fbi-events-summary/

[1] Reminder that Signal only collects: 1) the date you signed up 2) the last day your client pinged their servers.

@blkpws@lemmy.ml · 1 point · 1 year ago (edited)

A communication platform is only as good as its feature set, ease of use, and accessibility.

I thought we were talking about secure apps, not “great apps”.

This is a lie.

https://en.wikipedia.org/wiki/PRISM_(surveillance_program) A lie is what the USA says most of the time, like starting a war on Afghanistan claiming they had massive nuclear weapons, when in reality they didn’t find any of those weapons; they just killed people and have now left Afghanistan with a bunch of army supplies for the Taliban. Much logic. That’s the USA, a criminal war state; they keep it secret, and if you try to tell the truth with proof you will go to jail like Julian Assange. https://yewtu.be/watch?v=D5xJuyrUIIA

Dessalines · 1 point · 1 year ago

unable to share anything with them

Except phone numbers, dates / times, contacts… pretty much everything except message content.

@KLISHDFSDF@lemmy.ml · 0 points · 1 year ago

This is incorrect.

They store:

  • Your phone number
  • The date you first registered.
  • The last day (not the time) a client pinged their servers.

Signal’s access to your contacts lets the client (not them):

determine whether the contacts in their address book are Signal users without revealing the contacts in their address book to the Signal service [0].

They’ve been developing/improving contact discovery since at least 2014 [1], I’d wager they know a thing or two about how to do it in a secure and scalable way. If you disagree or have evidence that proves otherwise, I’d love to be enlightened. The code is open [2], anyone is free to test it and publish their findings.

[0] https://signal.org/blog/private-contact-discovery/

[1] https://signal.org/blog/contact-discovery/

[2] https://github.com/signalapp/ContactDiscoveryService/
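As a rough illustration of the general shape of such an exchange (not Signal’s actual implementation, which runs the comparison inside an SGX enclave precisely because plain hashes of phone numbers can be brute-forced), a toy hash-based contact discovery might look like this; all names here are hypothetical.

```python
# Toy hash-based contact discovery: the client uploads truncated hashes of its
# address book instead of raw phone numbers, and the server answers which of
# them correspond to registered users. NOT Signal's real protocol; phone
# numbers have a small keyspace, which is why Signal uses secure enclaves.

import hashlib


def _digest(number: str) -> bytes:
    return hashlib.sha256(number.encode()).digest()


def client_prepare(address_book: list) -> list:
    """Client sends truncated hashes rather than raw phone numbers."""
    return [_digest(n)[:10] for n in address_book]


def server_match(truncated: list, registered: set) -> list:
    """Server only sees truncated hashes; it returns the ones that match."""
    known = {_digest(n)[:10] for n in registered}
    return [t for t in truncated if t in known]


if __name__ == "__main__":
    registered_users = {"+15551230001", "+15551230002"}
    my_contacts = ["+15551230001", "+15559999999"]
    hits = server_match(client_prepare(my_contacts), registered_users)
    print(f"{len(hits)} of my contacts are registered")  # prints 1
```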

@Yujiri@lemmy.ml · 2 points · 1 year ago

In security, you can’t assume that the server isn’t storing a piece of data just because the operator says it isn’t. In fact, every time you have a server in a security-sensitive protocol, you’re supposed to assume that it’s compromised! The FBI could force Moxie to hand it over, and may have already done so without us knowing, so the fact that the server could store the knowledge of who you talk to and when means that, for purposes of security, we should assume it does.

@KLISHDFSDF@lemmy.ml · 1 point · 1 year ago

In security, you can’t assume that the server isn’t storing a piece of data just because the operator says it isn’t

100% agree with you about being unable to confirm what the server is doing, but the fact of the matter is anyone you interact with - centralized server-client or decentralized peer-to-peer - can store some metadata.

The FBI could force Moxie to hand it over, and may have already done so without us knowing

Private contact discovery is engineered in a way that you would be unable to retrieve what is being processed even if you had access to Signal’s infrastructure or admin/root rights. If you don’t believe this is true, please point out where the weakness in their code is; it’s open for review, and anyone can point out its flaws.

Lastly, the FBI cannot compel anyone - individuals or companies - to work on anything without compensation. That is considered forced labor, which is highly illegal in the United States, where Signal resides. The FBI attempted to force Apple to develop software to compromise the security of iOS, but they dropped the case, likely because they knew they would fail, although they claim they found the software they needed elsewhere [0].

So the FBI can ask Signal for assistance, but that’s it. Signal must comply with the law so they always provide the info they do have - which is the data I previously pointed out - but they do not have to build any such system that would compromise the security of their service as it would fall under forced labor; i.e. developing software against their will.

[0] https://www.beencrypted.com/news/apple-vs-fbi-events-summary/

@Yujiri@lemmy.ml · 1 point · 1 year ago

100% agree with you about being unable to confirm what the server is doing, but the fact of the matter is anyone you interact with - centralized server-client or decentralized peer-to-peer - can store some metadata.

That is true, but there are ways to mitigate this a lot better than having everyone go to one server and be identified by a phone number. As you’ve pointed out, being identified by a phone number isn’t too bad if Signal only stores that, the date you first registered, and the last date you connected, but it would be really bad if Signal were ever compromised and forced to collect more data.

On the other hand, if Signal didn’t require a phone number, the case of a compromised server would be much less threatening.

So the FBI can ask Signal for assistance, but that’s it. Signal must comply with the law so they always provide the info they do have - which is the data I previously pointed out - but they do not have to build any such system that would compromise the security of their service as it would fall under forced labor; i.e. developing software against their will.

I find this very hard to place any confidence in. It may be formally illegal, but that’s not going to assure me that the FBI wouldn’t just do it anyway and get away with it if they really wanted to. Government actors do that (breaking the law and getting away with it) all the time.

@gmate8@lemmy.ml (creator) · 0 points · 1 year ago

Fair point tho.

@iortega · 3 points · 1 year ago

I think I did. I might not have understood it, though. However, is that response enough? Shouldn’t Signal have some kind of audit by the authorities to confirm that what they responded is true?

Jedrax · 1 point · 1 year ago (edited)

deleted by creator

Kinetix · 1 point · 1 year ago

Not all of it any longer.

Jedrax · 1 point · 1 year ago

Sorry, yeah, I deleted it right away; not sure how you were able to see it.

Kinetix · 1 point · 1 year ago

Magic!

Heh… Come see https://lemmy.ca/post/15834; it looks like there are still some interesting federation bugs, as your comment still resides here undeleted.

Jedrax · 1 point · 1 year ago

Ahh good to know! Thanks


Kromonos · 2 points · 4 months ago (edited)

deleted by creator

Kromonos · 7 points · 4 months ago (edited)

deleted by creator

@KLISHDFSDF@lemmy.ml · 3 points · 1 year ago

It’s a form of evidence. They were compelled by law to provide everything they have on a user, and the only thing they could provide, because they don’t log anything, was the date the user signed up and the last time a client pinged their servers. That’s it!

If you can’t trust the ACLU, the courts, Signal, cryptography experts, etc, who can you trust?

Is the ACLU denying the evidence posted by Signal? Is the Judge denying the records posted by Signal?

I get that Signal has posted this on their website and it could be faked, but do you realize how crazy it sounds that everyone involved would be in on one of the biggest conspiracy theories regarding secure messengers EVER?

I understand scrutinizing Signal to ensure they’re above board, but this is kinda ridiculous.

Kromonos · 2 points · 4 months ago (edited)

deleted by creator

@KLISHDFSDF@lemmy.ml · 1 point · 1 year ago

Sorry, didn’t mean to upset you. I think my response was pretty solid, sorry if you’re unable to understand what I’m saying.

A place to discuss privacy and freedom in the digital world.

Privacy has become a very important issue in modern society. With companies and governments constantly abusing their power, more and more people are waking up to the importance of digital privacy.

In this community everyone is welcome to post links and discuss topics related to privacy.

Some Rules

  • Posting a link to a website containing tracking isn’t great; if the contents of the website are behind a paywall, maybe copy them into the post
  • Don’t promote proprietary software
  • Try to keep things on topic
  • If you have a question, please try searching for previous discussions, maybe it has already been answered
  • Reposts are fine, but should have at least a couple of weeks in between so that the post can reach a new audience
  • Be nice :)
