cross-posted from: https://kbin.projectsegfau.lt/m/[email protected]/t/26889
Google just announced that all RCS conversations in Messages are now fully end-to-end encrypted, even in group chats. RCS stands for Rich Communication Services and is replacing traditional text and picture messaging, providing you with more dynamic and secure features. With RCS enabled, you can share high-res photos and videos, see typing indicators for your…
But that’s kinda my point: you inherently rely on someone else doing what open source allows you to do. So in the end you can be tricked just the same.
I mean, of course, Signal is a pretty clear-cut case, but even with that one you - and I’m guessing here, but tell me it ain’t true 😅 - probably do not actively verify things. You did not check the source code. You did not build your own APK to install it. I don’t think you can build the desktop version yourself, but I ain’t entirely sure, granted. You probably did not probe the network traffic to see whether the installed APK actually does what the source code promises, or whether it’s been swapped out for one that lets the server they’re running log every message sent.
And so on.
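For what it’s worth, the core of that verification step - checking that the binary you were shipped matches the one you built from the audited source - boils down to comparing checksums. Here’s a toy sketch in Python; the file names are made up, and it assumes the project has reproducible builds so that two honest builds are byte-identical (not every project supports that):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Toy stand-ins for "the APK I built myself" and "the APK the store served".
with open("own_build.apk", "wb") as f:
    f.write(b"identical build output")
with open("store_build.apk", "wb") as f:
    f.write(b"identical build output")

# With reproducible builds, matching hashes mean the shipped binary came
# from the source you inspected; any mismatch means it did not, and you
# have no idea what you're actually running.
if sha256_of("own_build.apk") == sha256_of("store_build.apk"):
    print("hashes match")
else:
    print("mismatch: do not trust the shipped binary")
```

And that’s the *easy* part - actually getting your own build to be byte-identical, and probing what the app does on the network, is where the real time sink is.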
My point was entirely that even in the easiest of cases, where we could do all of that, we don’t actually do it. Hence the ability to do it is, in practice, largely moot.
And I say this as someone who, at work, reviews the external libraries we use - an insanely time-consuming job, which entirely explains why no one in their right mind does this unpaid, in their spare time, for private use.
If you can’t trust peer review from experts in a field, many aspects of society break down.
Nobody can be an expert in every field. It’s completely unfeasible for most people to verify source code themselves, but that doesn’t mean open source doesn’t matter. Society operates on a degree of trust in our fellow humans who ARE experts in their fields. The more experts in agreement, the better, since nobody is infallible.
I’m not sure what you’re suggesting people do? Go live in a hole by themselves because the world is full of liars and deceivers? Or become superhuman and hand-verify every possible thing that could negatively affect them?
No, of course not. I’m sorry if I’m expressing this badly; my point was merely that open source tends to give people a false sense of security. The ability to verify is, in practice, almost never used, and the experts who review the code could just as well have been given access without it being open source (see WhatsApp’s audit a while back).
That is not to say that open source is not a good thing, don’t get me wrong. But I feel we tend to massively overstate what it adds for us personally. We place too much value on that side of it, as if it automatically meant every user had personally verified everything.
It’s fair to say that open source on its own doesn’t add any security. I will say that any developer who’s intentionally adding vulnerabilities to their code is less likely to publish the source, simply because someone COULD see it. With the number of automated vulnerability scanners on GitHub, it would take a lot of extra work to go undetected when simply going closed source is an option. Once again, the more open the better, since there are fewer places to hide things.
I’m not qualified to personally assess the situation of Signal, or any other app. But I don’t need to be. There are several experts who are, and the fact that multiple of them have analyzed and evaluated an app like Signal should give us a lot of confidence in their conclusions.
We need to trust experts - and I don’t mean individual experts, but the expert community as a whole, especially when they verify each other’s work. That’s what it’s about. You can’t do everything yourself; you’ve got to trust some form of collective.