I like the idea of Signal a lot. It works pretty well. Unfortunately, not that many of my friends are on it. However, I have quite enjoyed using Matrix, specifically in the form of Beeper, which has nice bridges for Signal, WhatsApp, FB, IG, and so on. Ultimately, I can talk to my friends where they are and keep it all in one place.
Beeper has been making great improvements but unfortunately it's still not as good as Signal in terms of polish.
She gets about half as much as normal engineers at the company ($200k vs $400k). I wonder what would be more productive - 6 engineers getting paid $200k or 3 getting paid $400k?
You could still attract a very large talent pool for $200k/yr, especially given the mission, and doubly especially if this is remote work and not based in SF/SV.
All of the sibling comments are good answers to this question!
What hasn't been mentioned is that Signal is an extremely high-profile target for rogue actors and nation-states. Employing staff with personal financial troubles in sensitive positions can create the temptation to use that access for personal gain; this is one way intelligence services groom double agents. Despite what appear to be very high salaries, these engineers won't get rich in the Bay Area, but the pay seems to be above the threshold where one of them would feel desperate.
Also: I imagine having a WhatsApp founder as a board member/funder helps here, as they famously built and scaled globally before their acquisition with only 20-30 engineers…
It’s nuts to think a salary of $400k a year won’t get you rich, even in the Bay Area. If they’re not rich, maybe they should check in with a financial advisor because they’re doing something wrong.
“Annual income twenty pounds, annual expenditure nineteen nineteen and six, result happiness.
Annual income twenty pounds, annual expenditure twenty pounds ought and six, result misery”
I agree, but only because I routinely speak with people who make double that and still struggle to pay monthly bills. Short answer: financial outcomes aren't about the number (above a basic threshold), and they’re almost entirely a function of behaviors & (some) luck.
Paying more is still a valid form of defense against petty corruption, though.
What often baffles me is the "buy a whole bunch of things on credit", and then get laid off and have not a penny left.
I've seen stories of actors in Hollywood banking $15M from a movie, buying all sorts of property with mortgages, boats, cars, all on credit, then losing it all due to default without that next movie.
Anyone who's worked with a dozen or so people will know they're never equal. There's generally at least one 'guru' in the group who knows the ins, outs, and whys of everything. At big companies they'll get split up so there's only one per team, with maybe a roaming super-team or two for anything truly major.
But I've certainly met engineers who are worth at least two lesser engineers.
People love to wish this weren’t true, but it is. All engineers have great potential, but some realize it faster than others, and some never fully realize it at all. The difference is significant.
Moreover, not all spectacular engineers are worth working with; there’s a lot to be said for someone’s ability to be worked with and to be a leader, in addition to just pure knowledge or capability. Being able to work on teams matters, and being able to lead them (while being respected, respectful, and educating) matters even more.
The latter pieces of being a “strong engineer” are the pieces that are the hardest to learn for most engineers, because we spend so much time pretending it’s only the code that counts. Great engineers who realize and understand how to push their colleagues and themselves to grow while pushing the company forward? They are worth their weight in gold.
"Signal is a nonprofit because a for-profit structure leads to a scenario where one of my board members goes to Davos, talks to some guy, comes back excitedly telling me we need an AI strategy for profit."
She gave a talk at TechCrunch last year and kindly spoke to me briefly afterwards about how end-to-end encryption can help businesses. The problem is that without a huge benefactor, there isn’t enough consumer demand to make a sustainable business out of privacy concerns.
I had a family member working in Tibet with Christian churches, which is a very sensitive topic in China. Nearly all communication technology was outright blocked (except the overtly monitored WeChat).
However, Signal was not blocked by the great firewall and we were able to communicate freely on it.
But now I'm wondering why isn't Signal blocked? Does the CCP have a backdoor into it somehow and therefore doesn't feel threatened by it?
I very much doubt that the CCP has a backdoor into Signal the app; but it is worth keeping in mind that if an actor can observe traffic, e.g. cellular traffic records or network flow logs, they may be able to correlate senders and receivers through ordinary time-stamps. Person A sends 200 bytes at time T, Person B receives 200 bytes at time T + 300ms. It doesn't take much analysis to build up a picture of a weighted network of communicators.
There are "Military Grade" message and voice encryption technologies that avoid these issues by always sending a fixed amount of traffic to all parties, but that tends to be impractical for mobile apps.
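To make the correlation idea concrete, here is a minimal, hypothetical sketch. The event records, the 500 ms window, and the size slack are all invented for illustration; real traffic analysis is statistical and much noisier.

```python
# Toy illustration of timing/size correlation on metadata an on-path
# observer could collect (e.g. flow logs). Everything below is made up.
from dataclasses import dataclass

@dataclass
class Event:
    endpoint: str    # which observed subscriber/IP
    direction: str   # "send" or "recv"
    t_ms: int        # observation time in milliseconds
    size: int        # ciphertext size in bytes

events = [
    Event("A", "send", 1_000, 200),
    Event("B", "recv", 1_300, 200),   # close in time and size to A's send
    Event("C", "recv", 9_000, 480),   # unrelated
]

def correlate(events, window_ms=500, size_slack=32):
    """Pair each send with receives of similar size seen shortly after it."""
    sends = [e for e in events if e.direction == "send"]
    recvs = [e for e in events if e.direction == "recv"]
    return [
        (s.endpoint, r.endpoint)
        for s in sends
        for r in recvs
        if 0 <= r.t_ms - s.t_ms <= window_ms and abs(r.size - s.size) <= size_slack
    ]

print(correlate(events))   # [('A', 'B')] -> an edge in the "who talks to whom" graph
```

The fixed-traffic approach mentioned above defeats exactly this: if every party sends the same amount on the same schedule, the size and timing columns carry no information, at the cost of bandwidth and battery.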
It is blocked. Its activation SMS has been blocked for quite a while, and the same happened to connections to the Signal servers. Early this year, Apple also removed Signal from the China App Store.
"Not being able to operate" is not equal to "it won't work if you already have an account".
As long as the average Chinese citizen cannot effectively create an account (because the activation SMS cannot reach their phone), the Chinese government may allow existing users of Signal (who activated their accounts abroad) to keep communicating over it.
Signal may also be a "signal" that the state should look more closely at a person. With so few users, simply knowing whom to watch is helpful if, in the state's view, a user is up to no good.
> If China was allowing Signal to operate within China, the most logical and simple explanation would have been that China had backdoor access.
It still isn't the simplest explanation. The simplest one is probably that there were simply too few Signal users.
The complete source code for the Signal applications and the Signal server is available on GitHub. This way, anyone can review the code for security and correctness. An existing backdoor would mean that every expert review of the source code in this field has been faulty. And the fact that I can set up my own Signal server and compile my own client supports Signal's claim of being a secure messenger.
The possibility of manipulation through Apple and the App Store process remains. However, this is by no means the simplest explanation.
I've had an issue with my phone for the past few months that could be solved by erasing it and restoring from backup, but I use Signal a bunch and I'm not currently feeling risky or motivated enough to migrate my whole Signal state to a spare phone and do it.
Lack of backups is not a retention/deletion policy. Signal chats can, in fact, have a deletion policy set. Instead it directly ties retention to "how long can I last without losing or erasing my phone", which is not a useful proxy.
Anyone sufficiently motivated to keep messages forever can (a) set up the desktop client and back up its data store or (b) set up signal-cli and save everything that comes out of it.
Having no backups doesn't defeat this; it just makes life harder for everyone who relies on scrollback. Imagine if email worked this way.
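For anyone who does want option (b) above, here is a rough sketch of the idea. It assumes signal-cli is installed and already linked to the account, and the exact subcommands and flags (receive, --output=json, -a) vary between signal-cli versions, so treat it as an outline rather than a recipe.

```python
# Sketch: archive whatever signal-cli emits into an append-only JSON-lines file.
# Assumes signal-cli is installed and linked; flag names may differ by version.
import json, pathlib, subprocess
from datetime import datetime, timezone

ACCOUNT = "+15551234567"                        # hypothetical account number
ARCHIVE = pathlib.Path("signal-archive.jsonl")  # one JSON object per line

def poll_once() -> None:
    out = subprocess.run(
        ["signal-cli", "-a", ACCOUNT, "--output=json", "receive"],
        capture_output=True, text=True, check=True,
    ).stdout
    stamp = datetime.now(timezone.utc).isoformat()
    with ARCHIVE.open("a") as f:
        for line in out.splitlines():
            if not line.strip():
                continue
            try:
                payload = json.loads(line)      # JSON mode emits one object per line
            except json.JSONDecodeError:
                payload = {"unparsed": line}    # keep raw text rather than drop it
            f.write(json.dumps({"archived_at": stamp, "message": payload}) + "\n")

if __name__ == "__main__":
    poll_once()   # run from cron or a systemd timer to keep a rolling archive
```

Attachments and group metadata need extra handling, but even this much gives you scrollback that survives a lost or wiped phone.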
Indeed they do, as I just learned today. That really doesn't make any sense then – I vaguely remember Moxie stating that the lack of backups was intentional, but now having them on Android but not iOS makes no sense at all.
Signal does have backups, it just refuses to reveal anything about them. Fortunately thanks to reverse engineering efforts you can now do pretty much whatever you want with them: https://github.com/bepaald/signalbackup-tools
That's essentially an Android system backup, i.e. a disk dump of the app data folder, right?
Almost nobody uses these in practice, so I think my point largely still stands.
I'm pretty sure that, given their stance on this issue on iOS, they'd start locally encrypting the message database using a key stored in the Android Keystore system (which won't be backed up or extracted).
No as in, this is not the Android backup system, but rather something custom-built by Signal? If so, how do I enable this on my phone, and what's that passphrase the tool alludes to? I don't see an option to set one in the Android app.
Yes it's custom-built by Signal. I thought it was enabled by default — I don't remember ever turning it on — but it might not be. Here is how to do it:
> Your backup folder is listed under Signal Settings (profile avatar) > Chats > Chat backups > Backup folder. Use the files app or plug your phone into a computer to go to the folder.
> For older versions of Signal, the backup file signal-year-month-date-time.backup can be found at /Internal Storage/Signal/Backups or /sdcard/Signal/Backups
Maybe they should add good backup encryption then, lest people decide they don't want to use Signal because it lacks this feature and end up with weak or no backup encryption elsewhere.
I'm pretty sure they would, if they could think of a good way. Usable, yet secure backup encryption is so far an unsolved problem, as far as I can tell.
The two options, roughly speaking, are: Force users to store some high-entropy passphrase (which most users will then store somewhere not very secure), or let them pick their own passphrase (which won't be very good). This is what WhatsApp does.
A third one would be to allow a short passphrase and guard that by a server-side HSM or maybe SGX, which Signal seems to be somewhat fond of; I'm glad they're not doing that.
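For a sense of what the first two options look like in practice, here is a small, generic sketch using only Python's standard library. The digit count, KDF, and parameters are illustrative assumptions, not anything Signal or WhatsApp actually ships.

```python
# Option 1: app-generated high-entropy passphrase the user must write down.
# Option 2: user-chosen passphrase, stretched with a memory-hard KDF so that
# offline guessing is at least expensive. Parameters are illustrative only.
import hashlib, secrets

def generated_passphrase(n_digits: int = 30) -> str:
    """Roughly 100 bits of entropy for 30 random digits, but it has to be stored somewhere."""
    return "".join(str(secrets.randbelow(10)) for _ in range(n_digits))

def backup_key(passphrase: str, salt: bytes) -> bytes:
    """Derive a 256-bit backup key; scrypt makes low-entropy inputs costlier to brute-force."""
    return hashlib.scrypt(passphrase.encode(), salt=salt, n=2**14, r=8, p=1, dklen=32)

salt = secrets.token_bytes(16)
strong_key = backup_key(generated_passphrase(), salt)  # option 1: app-generated
weak_key   = backup_key("hunter2", salt)               # option 2: user-chosen, guessable offline
```

The third option trades KDF hardness for hardware that rate-limits guesses, which is why it needs an HSM or an SGX enclave on the server side.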
"Because having 70 percent of the global market for cloud in the hands of three companies globally is simply not safe. It's Microsoft and CrowdStrike taking down half of the critical infrastructure in the world, because CrowdStrike cut corners on QA for a fucking kernel update. Are you kidding me? That's totally insane, if you think about it, in terms of actually stewarding these infrastructures."
Cruft is like dust bunnies, it grows if not cleaned up!
I literally just removed some code from 2021 that was echoing huge JSON files into build logs that nobody looked at.
It reduced the pipeline run duration from 45 to 30 minutes.
Now, a crypto coin will probably be harder to remove, but there's a weird inertia around long-lived repositories where people are afraid to make any changes. Although I hope the crypto portion is feature-flagged and can be somewhat easily disabled.
I've encountered this argument ... repeatedly. Let's explore the DIY route:
If you can build your own Signal server, you can serve your own circle of friends. The bar is not that high (Java and a VPS).
Signal clients are even easier, though building them remains a unique challenge due to weak documentation and the need to master multiple platforms.
Having said all that jazz, step back and ask yourself this: what am I losing by building my own Signal-protocol network?
Anonymity
Now you would stick out like a sore thumb to all the Internet overwatchers, even within VPN tunnels.
That's a risk for me.
What am I actually gaining?
Not much: a more distinctive hash signature for your client app (which has downsides); the ability to make small tweaks to the hash/key/encryption algorithms while keeping the Signal protocol (a dangerous rabbit hole); avoidance of XDR/NDR/IPS/IDS firewalls; and, the biggest one, zero spreading of your hashed contact info (more on this below).
-----
Alternatively, let's take the original route: your own client against "the" Signal server:
Now the Signal protocol would be exposed to malformed protocol usage (think "fuzzing"). That might be a good thing, but certainly not at this early stage; do we have the manpower to stand guard over the protocol the way the ISC BIND 9 team does over DNS?
The one area that is not firmed up 100% (more like 99.999%) yet is privacy-information protection, and that centers on the exchange of the hashed "Contact" address book.
This area is largely understudied and under-whitepapered: how to exchange contact info in a privacy-safe way just to build your network. I keep that option off in my Signal client for now and add contacts manually. That's why I think the Signal team is moving away from telephone numbers.
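On that last point, here is a tiny, hypothetical sketch of why naively hashing phone numbers for contact discovery protects so little. The numbers and the fixed prefix are made up to keep the loop small; real phone-number spaces are bigger but still trivially enumerable.

```python
# Hashing a phone number does not anonymize it: the number space is small
# enough that anyone holding the hashes can enumerate candidates and reverse
# them. All numbers below are fabricated for the example.
import hashlib

def h(number: str) -> str:
    return hashlib.sha256(number.encode()).hexdigest()

uploaded_hashes = {h("+14155550123")}   # what a naive client sends instead of the raw number

# The observer simply hashes every candidate in a known prefix and matches.
table = {h(f"+1415555{i:04d}"): f"+1415555{i:04d}" for i in range(10_000)}
recovered = [table[x] for x in uploaded_hashes if x in table]
print(recovered)   # ['+14155550123'] -> the "hashed" contact list leaks the contact
```

That gap is part of why private contact discovery, and the move away from phone numbers toward usernames, gets so much attention.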
Starting with "Which insider used MobileCoin to steal a billion dollars from FTX with the help of the publicity created by Signal?"
Then perhaps, "Now that SGX has been completely destroyed by a class break, when will MobileCoin support be removed, along with Signal's other security dependencies on SGX?"
In terms of growth strategy, all Signal has to do is wait; bad press for the competitors arrives all the time. So it seems right that she's focused on funding more than user growth.
> The Open Technology Fund (OTF) is an American nonprofit corporation[5] that aims to support global Internet freedom technologies. Its mission is to "support open technologies and communities that increase free expression, circumvent censorship, and obstruct repressive surveillance as a way to promote human rights and open societies." As of November 2019, the Open Technology Fund became an independent nonprofit corporation and a grantee of the U.S. Agency for Global Media.[5] Until its formation as an independent entity, it had operated as a program of Radio Free Asia.[5]
> ... The OTF funds third-party audits for all the code-related projects it supports.[13] It has also offered to fund audits of "non-OTF supported projects that are in use by individuals and organizations under threat of censorship/surveillance".[13] Notable projects whose audits the OTF has sponsored include Cryptocat,[14] Commotion Wireless,[15] TextSecure,[15] GlobaLeaks,[15] MediaWiki,[16] OpenPGP.js,[17] Nitrokey,[18] Ricochet[19] and Signal.[20] The OTF also matched donations to the auditing of TrueCrypt.[21] In 2014, the OTF reported that it had funded more than 30 technology code audits over the past three years, identifying 185 privacy and security vulnerabilities in both OTF and non-OTF-funded projects.
There's a difference between an investment and a grant.
What ownership share, promise of future payment, board seats, or other rights to control Signal was assigned to the Open Technology Fund by receiving that grant?
> Signal should not have been taking money from the CIA.
...
> Why not?
In fairness, at least one answer is obvious: because it created the appearance, for some, that the CIA was getting something out of the deal, undermining trust in the platform.
Just as long as you understand that the messages remain verifiably and solidly encrypted on Signal's servers, both in transit (network) and at rest (storage).
As for the end devices that run the Signal client, that is only as strong as you and your OS allow.
That's very misguided of you to say and pure speculation. It's as useful for a discussion as me saying "Durov is Russian-born, and has resisted proper e2e encryption defaults, so the Kremlin incentivized him to do it, so they can spy on people". You see how this works?
There is actually a clear official statement from Signal about why they don't permit third-party clients to use their servers.
What in their statement suggests that they are "hiding something"?
If you can find something wrong with their protocol or source, or suspect that somehow the communication is compromised when it reaches their servers, please go ahead and disclose it, but a thinly veiled "do they have something to hide?" is just speculation.
> That blog post is 8 years old and predates things like mastodon
What in this blog post that's 8 years old has become obsolete in the meantime?
> Nothing has changed in these intervening 8 years?
Signal got better and gained a bigger user base in a couple of waves, most notably after the WhatsApp terms-of-service fiasco. Is that what you meant?
Btw, people are still free to clone/fork the client, the protocol, and run their own servers. But compute ain't free, and you can read about that here: https://signal.org/blog/signal-is-expensive/
I think that for folks living in the West, Putin and his gang of cyber criminals are a much bigger day-to-day threat than the NSA. So the fact that Durov is still alive (a couple of days in a French prison is the least of his worries) makes Signal look like a much better bet than Telegram.
The mostly unelected EU regime really loves censorship. Nobody ever claimed that Putin is in any way better. That doesn't mean that the EU isn't complete shit. And if you want to focus on the EU market like Whittaker claims in the interview it only works if you are completely compromised. Also looking at her history I don't think you could find a more glow in the dark person than her.
Doesn't matter. As long as the code is open source and e2ee, Signal staff could be official NSA employees and it wouldn't matter (in the short term; in the long term you would see these things change, of course).
I'd change my mind on Signal if you can demonstrate an attack that assumes an evil signal operator, or evil signal servers.
Signal knows they just need to keep themselves open to the possibility of this kind of demonstration. Then any mistrust, combined with the fact that there is no exploit at the next CCC or DEF CON, becomes evidence that it's secure. More mistrust -> more attempts to prove it's insecure + no demonstration of insecurity -> better argument that it's secure. It's a negative feedback loop. It's also honest: you could actually break it. Did I miss how you can break it? Link to the demo.
Signal the program doesn't trust Signal the organization, as it should be. That's the core idea. It's what lets them not get fucked by the government. They cooperate fully and ensure they have nothing to tell (privacy by design, data minimization, self-blinding). And by having a lot of users they make themselves impossible to ban and thereby protect the whole concept.
Whittaker is very smart politically. The software isn't perfect, sure. It's polished and reliable and secure. Make a better one... it is fine.
Also, are you reading what she's saying? This is not what compromise looks like. Here is what compromise looks like: when you see them start talking about protecting people by establishing police control to fight the bogeyman. When they start talking about threats here, threats there, enemies here, enemies there... When they say that, because of big tech, we need things like the DSA (enforcement regimes, access for police) [1]. Whittaker says that, because of big tech, we need a lot of open source projects backed by nonprofit organizations that don't advertise, don't surveil, and have no incentive to start doing so... and that build stuff that has no backdoors and makes no affordances for the state or anyone else in power to compromise it.
[1] and then follow-ons like e-Evidence, and finally rules like those in England that prohibit privacy by design... which would prohibit Signal... but which the English are not enforcing because of protests by Signal.
Hagiographies are exhausting, and not exactly tech news in my opinion.
I understand that people love signal, but I don't understand why: outside of what appears to be propaganda such as this.
Btw, downvotes only strengthen my belief that, unironically, there are actual shills on HN. The way people bleed to defend it as "the one true secure messenger" is about as convincing as Epstein killing himself.
Big props for that (and for the interviewer for looking it up). A lot of nonprofits are glorified jobs programs for politically connected individuals.
Nothing is pure-as-the-driven-snow perfect, but I use Signal.