FSF responds to Microsoft's privacy and encryption announcement (fsf.org)
54 points by eltondegeneres on Dec 5, 2013 | 64 comments



It seems like the response glosses over what Microsoft is actually doing and instead just attacks them for something unrelated (Windows is closed source). In the FSF's eyes, Microsoft can do nothing to improve security until Windows is open sourced.


No, I think in the FSF's eyes -- rightfully -- it can't be proven that security has improved.

I also can't agree that it isn't related. If I tell you I'm wearing a green shirt, how can you know for sure unless you or someone you trust has verified it? You can't. It's the same with MSFT. But in the case of MSFT, it has been proven that they wear a lot of Hypercolor[1] stuff.

Is it good that MSFT is doing stuff to make things more secure? Sure. Do we still have to take it on faith that they are doing everything they can to protect their users? Yup.

[1] http://en.wikipedia.org/wiki/Hypercolor


So, Microsoft and its Windows product adhere to no industry standards, have no external audit process, and have never been verified by a private or government contract agency through an audit or other verification process?

I work in a small programming company, and we do internal and external audits while maintaining compliance with federal and state regulators as well as groups like ISO.

Sure, our work is closed source, but that doesn't automatically mean it hasn't been externally verified for a number of different things by a number of different organizations...


No, I did not say those things. That aside, if you wanted to, after an audit/review concluded, could you put a backdoor in your software? Since it's closed source, would anyone know about it?


If there's a deterministic build process, in theory, the auditor would know something was up if the binary differed.


With a deterministic build? That would be very hard. I doubt that any audit signs off on anything other than specific versions.


Does Microsoft use deterministic builds?


There are these things called signed binaries...


But signing binaries does not prove from which source the binaries were built, only who built them.
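
That gap is exactly what reproducible builds are meant to close: if anyone can rebuild the audited source and get a bit-for-bit identical binary, you no longer have to take the signer's word for the provenance. A minimal sketch of the comparison step in Python (the file paths are hypothetical):

    import hashlib

    def sha256_of(path):
        # Hash the file in chunks so large binaries don't exhaust memory.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Hypothetical paths: the vendor-shipped binary vs. one rebuilt from
    # the exact source tree the auditor reviewed.
    shipped = sha256_of("vendor/app.exe")
    rebuilt = sha256_of("local-build/app.exe")

    if shipped == rebuilt:
        print("Match: the shipped binary was built from the audited source.")
    else:
        print("Mismatch: the audit says nothing about what actually shipped.")

The comparison is the trivial part; the hard part is making the build deterministic in the first place (timestamps, file ordering, and toolchain versions all have to be pinned down).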


> Sure, our work is closed source, but that doesn't automatically mean it hasn't been externally verified for a number of different things by a number of different organizations...

Yes, but we have to take your word for it.


As mentioned below, software being secure and the ability to convince yourself that it is secure are two different things.


Auditing and verifying the security of software the size of Microsoft Windows without source code is like auditing and verifying the security of the International Space Station with only the help of a handheld telescope.

Sure, it looks like it's not leaking air. It hasn't dropped down to Earth yet, and all the videos posted on their website seem to show it being fine. However, if I ever went there and depended on its security, I would demand more.


If a piece of proprietary/close source software gets audited by an external agency, any of the private actors involved can be involved in deception (the authors of the software, the auditing company and the government or whatever). They can all be in collusion. This would have sounded like a conspiracy theory before recent discoveries, but now we've seen this can actually happen.

Whereas if the source is open, and you are a subject matter expert (yes, that's a big if), you can review the source yourself. You can decide for yourself whether the software has an NSA backdoor, an innocent flaw or whatever.

Yes, a lot of people, myself included, don't have the technical background to do this. But with open source we could, if we had the knowledge (which can be learned), without relying on potentially compromised authorities.

With closed source we simply can't. We have to trust the auditors and the government, which have been shown to be unreliable.


Why would you place any amount of trust in a closed source privacy solution from a company with a history such as Microsoft's?


Because they employ intelligent, skilled people who care and know more about privacy than I do.

How many open source projects have most people audited for their own sense of satisfaction about their security promises?


Unfortunately, those skilled people at MS have let the NSA in on so many 0-day exploits. God knows how many have not been reported to the public yet. At least with open source, I know there is a community behind it for me or others to verify. Sure, it is not 100% foolproof, but it makes it far harder to sneak bad things through.


> Sure, it is not 100% foolproof, but it makes it far harder to sneak bad things through.

Debian SSL bug lasted 2 years. Open source means little for security.


Cherry-picked examples mean little for arguments either. The WMF exploit was in Windows for more than 15 years.

http://en.wikipedia.org/wiki/Windows_Metafile_vulnerability


SSL is a security-sensitive bit of code. That's the kind of thing that needs to be kept safe, and it's the kind of thing that people claim is kept safe by open source's many eyes.

The argument I'm making is not that Windows is secure (because it isn't), but that Open Source isn't necessarily secure just because it's open source.


No one has argued that open source is secure just because it's open source.

Open source, however, can be independently verified as secure. Closed source cannot be verified as secure and must be taken solely on the word of the company that made it.


There are a lot of security holes regularly surfacing in all kinds of software. We don't even have the post-mortem of the kernel.org compromise, as one example. Even some Debian servers got hacked. Open source helps, but let's not pretend it's a panacea.


It's not all or nothing. Open source is better than closed source for security auditing purposes, but of course open source alone is not enough, nor is it impervious to security flaws. It's just better than the alternative.


> Debian SSL bug lasted 2 years. Open source means little for security.

How many examples can you come up with? Was this specific bug being actively exploited when it was discovered?


> How many examples can you come up with?

A bug caused by prettying up code that was secure as it came from upstream, in an important, widely used, supposedly secure bit of code, isn't a good enough example?
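
For context: the offending patch gutted OpenSSL's entropy seeding, leaving the process ID as essentially the only input to key generation. A toy sketch in Python (the key derivation here is hypothetical, nothing like OpenSSL's real code) of why that is catastrophic:

    import hashlib

    # Toy sketch: if a key generator's only remaining entropy is the
    # process ID (at most 32768 values on Linux at the time), the whole
    # key space can be enumerated ahead of time.
    def toy_keygen(pid):
        return hashlib.sha256(pid.to_bytes(2, "big")).hexdigest()

    all_keys = {toy_keygen(pid) for pid in range(1, 32768)}
    print(len(all_keys), "possible keys -- brute force is trivial")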

> Was this specific bug being actively exploited when it was discovered?

Many Linuxes used to ship with lots of services running. That led to many rooted boxes being used to deliver spam. Open source fixed the problem, but only after many millions of emails had been delivered.

Someone somewhere probably has a nice chart of all the rooted Red Hat boxes in South Korea in the late 1990s and early 2000s.

Again, this isn't to suggest that MS or Apple are more secure. For years anyone putting an MS server onto the Internet ran the risk of very quick exploitation.


> A bug caused by prettying up code that was secure as it came from upstream, in an important, widely used, supposedly secure bit of code, isn't a good enough example?

It's a good example. Can you come up with more? Because, you know, it's just one instance of a problem. It says nothing about how pervasive the problem is.

> For years anyone putting an MS server onto the Internet ran the risk of very quick exploitation.

IIRC, there was a time when the average time between install and first compromise was in the 40-second range.


I think the article glosses over Microsoft's security measures by design. It is consistent with the FSF's position that closed/proprietary source is inherently less secure than open source, because you cannot freely review (even if you have the technical knowledge to do so) whatever measures the authors say they implemented.


It is very much related: closed source software isn't audited in the open. A core principle of building secure software is that one must assume the adversary has the source code and is actively building newer and newer attacks based on it (corporate leaks happen all the time). It's not impossible to follow through with this principle when building closed source software, but it is a lot easier to handwave it.

I am not a fan of the FSF's tone here; they could be more diplomatic -- though saying "we appreciate your effort, but you fail" would have been even more insulting. I think there are many places for closed source software, but core privacy software is not one of them.

The encryption core (the critical pieces that either input or output plain text -- the places where an attack is more likely to succeed, as opposed to the core of the algorithms), as well as the general platform, should have its source code available (even if at a fee). That's not quite the FSF vision, but perhaps the powerful vision is needed (one can think of the FSF's goals as a captivating utopian story that drives more incremental improvements).


Huh? I know this isn't going to be the popular opinion, but I have to get this off my chest. As much as I respect the FSF, this mentality is one of the things I dislike about them: statements that imply either you are with us or you are evil, trying to crash the opening events of MS/Apple to "save" people from closed source, etc.

Not everything has to be open source, and not everyone has to choose open source. Microsoft/Apple/Google may never make their core products open, but it doesn't mean they can't try to make them secure or privacy conscious. The beauty of freedom is that YOU get to choose and trust what you want, good or bad, open or closed.

As a developer, I find the GPL to be against the "spirit of open source", but it would be extremely wrong for me to demand you license your code differently or to even suggest to your clients that they choose a liberal license. It's your and their choice, not mine.


>As a developer, I find the GPL to be against the "spirit of open source".

It's not about open source. It's about free software.

https://www.gnu.org/philosophy/open-source-misses-the-point....


Which is, urm, rather the point.

Some of us care more about open source than free software.


Not the FSF. Have they ever given the impression that they care at all about open source except as a characteristic of Free software? I'm sure they didn't intend to.


>I find the GPL to be against the "spirit of open source"

More like Open Source is against the GPL, which is really the only reason the OSI came into existence.

>either you are with us or you are evil

Free software is about morality, not about getting or giving away free stuff. When you take a moral stand, you're necessarily making a moral judgement about people who don't.

Open source sells itself as a better way of doing business. Free software supports people's full ownership of their own devices even if it is worse for business.


I think you are right. If Microsoft came out with some sort of license that ensured they will protect our privacy (or allowed us to sue the hell out of them if they fail to do so), that would be a way to achieve this.


I just love how the FSF ignores, and has ignored for a considerable amount of time, that governments and organizations can get and have gotten the Windows source code, and that today's announcement restates that Microsoft does provide the Windows source code for governments to audit. Given that such programs have been available for a considerable amount of time, one has to wonder when the FSF is going to change their tactic away from the current one of outright lying about the situation.


"Freedom and security necessitate not just being allowed a peek at the code."

"Transparency in the Windows world normally means self-reports commissioned by Microsoft, or access granted to outsiders covering very limited portions of source code under strict agreements that limit sharing that information."

Yup, John Sullivan really ignored that.

You can disagree with the FSF's mission, but they are certainly not spreading lies on purpose.


The source agreements are far more than just a "peek" at the code. I would still argue that Mr Sullivan is at the very least distorting the truth, if not outright lying. The FSF has its agenda, and has proven it will try to distort the motives of any entity that doesn't completely agree with it.


>I would still argue that Mr Sullivan is at the very least distorting the truth, if not outright lying.

And I would listen to that argument, but you haven't made it.


I just love how you like to equate governments and organizations (not individuals) getting to peek at source code (that they're not allowed to modify, share, or comment publicly upon) under an NDA with "open source". Who's outright lying?


I never said it was open source. However, the source is available to, and researched by, third parties. The announcement today also states that they are making the source available to more eyes than before. Not everything needs to be completely open source.


So the NSA can see Windows source code, but Windows users can't? And users are supposed to feel secure about this? Why?


More than just the NSA. This shrill hyperbole that permeates the free software world is counterproductive to getting things fixed.


Okay, who? And why should I trust them? How do I know they didn't sign an NDA promising to keep any security vulnerabilities secret?


Drcube: How do I know that open source developers aren't contractors out to put backdoors in? At some point you have to trust someone, and I doubt, with all the eyes on Windows, both within and without, that any extant backdoor would have remained hidden until now. There are quite a few people who reverse engineer Windows without a license and would be shouting it from the rooftops if they found a backdoor.


> At some point you have to trust someone, and I doubt, with all the eyes on Windows, both within and without, that any extant backdoor would have remained hidden until now.

Right, but A) I like to choose who I trust. Get a second opinion. Verify a few things myself, if I feel I'm up to the task. I get none of these options with Windows.

B) If you doubt, with all the eyes on Windows, that a backdoor would have persisted, why do you act as if more eyes are not better? Or, perhaps more importantly, a greater variety of eyes?

Microsoft gets to pick who studies their code, and may or may not prevent them from revealing vulnerabilities and bugs anyway. Popular open source projects may get actively hostile researchers studying their code specifically to shout any bugs they find from the rooftops. Or they may get the world's foremost expert on some aspect of their code to study it for fun. And nobody can stop them from revealing anything they find.

If MS giving a few hand-picked organizations its source code under strict terms, where they may or may not be allowed to report what they find honestly, is good, why wouldn't fully open software be better?

Nobody is saying that giving governments and a few organizations a look at the Windows source isn't better than having it completely locked down. But it just doesn't go far enough.


>How do I know that open source developers aren't contractors out to put backdoors in?

You don't. That's why the source is open.


Can those organizations also build the code from source, and then exclusively use the binaries they produced? And possibly even compare them to the binaries that Microsoft ships?

Just being able to see the code is no advantage; you still have to trust Microsoft to actually ship the same code they've shown you.


As long as Microsoft tells the NSA about the bugs in Windows before they start fixing them [1], that constitutes the same thing as a "backdoor", since with many of those bugs the NSA can take full control of a machine.

So Microsoft doesn't need to "give the NSA a backdoor". They just need to tell them about certain bugs before fixing them - and that's just as bad as giving them backdoors, since the NSA can and will use them as such.

[1] - http://www.bloomberg.com/news/2013-06-14/u-s-agencies-said-t...


You're describing a security disclosure program. Since anyone else could have (and often has) already found the attack, the point of this is to put the defenses up as early as possible.

Surely the really valuable attacks would be the ones there's no planned security update for.


>You're describing a security disclosure program.

No, a security non-disclosure program. Disclosure is when you tell people. Telling a spy agency is no more "disclosure" than telling a Russian trojan dev.


Open/closed source software and secure/insecure software are orthogonal concepts. Yes, it may be easier to assess open source software with regard to security and privacy issues, but openness is absolutely not necessary for security. And even with open source software, the overwhelming majority of users still have to trust some third party, because it is absolutely unrealistic that every user or organization audits their complete software (and hardware) stack. The only thing you really gain is that you are free to choose which third party or parties you have to trust.


> Open/closed source software and secure/insecure software are orthogonal concepts

No, they are not. It's fundamentally impossible to secure proprietary software, because you have to trust its provider that the software does what it says it does, whereas with open source you can always check for yourself. Any backdoor in open-source software is there to be exposed and corrected.

With proprietary software only one party can disclose vulnerabilities and in open-source anyone with the knowledge can do it. You can choose to trust a single party or choose to trust a myriad of different parties any one of which can blow the whistle if they find something fishy.

I find it highly unlikely a backdoor to a popular open-source application could remain there for long. I don't think it's unlikely at all with proprietary software where there is no incentive to fix a problem until someone outside the company learns about it.


It is not impossible. There is no reason why closed source software cannot be secure. Yes, you cannot convince yourself in the same way you can with open source software, but again, software being secure and the ability to convince yourself that it is secure are different things.


> There is no reason why closed source software cannot be secure

True, but there is no way to prove it's secure. It's not about convincing myself or anyone else - it's about proof.


You get no proof for open source software either unless you perform a formal verification. And even then your proof may be wrong.

But maybe we can agree on the following. Closed source software can be secure, but there is a broad spectrum of needs for convincing someone that software is secure, and that need may be better served by open source software in some circumstances. For some, it is sufficient to trust a vendor. Some want to audit the source code (and this does not exclude closed source software). Some even need formal verification, maybe even of the underlying hardware.


So you mean the Linux kernel was proven secure? When?

http://www.theregister.co.uk/2009/08/14/critical_linux_bug/

http://www.networkworld.com/community/blog/linux-finally-fix...

http://it.slashdot.org/story/11/06/20/2257229/13-year-old-pa...

You're trying to apply to closed source software an impossible standard that even software developed from the start to be open source, and developed in the open, cannot meet.


At least with FOSS, I get to choose the third party I trust to verify my software's security. And I don't have to take it on faith, I can get a second opinion, or audit the parts I care most about myself.


Added exactly this while you wrote your comment. I don't deny that open source software has advantages in this regard.


> A lock on your own house to which you do not have the master key is not a security system, it is a jail.

This is completely bogus. The owner of the master key may have access (understandably undesirable), but that does not keep you from getting out. If anything, it's like having no lock at all.


True, but their point that "these promises are meaningless" remains valid.


Replace "Microsoft" with manufacturers of food, cars, medicines, personal hygiene products, etc. They are regulated, inspected, but all under the same / similar conditions:

> or access granted to outsiders covering very limited portions of source code under strict agreements that limit sharing that information

You are trusting the manufacturer's promises. Are they essentially meaningless just because the general public doesn't have insight into manufacturing details?


I think the FSF has an open-source axe to grind here, and this happens to be an issue they can use to bash MS. I do think MS should be praised for improving security. However, don't forget Snowden's warning -- encryption does not help you if you don't have secure endpoints. Moreover, encryption does not help you if you have rogue agencies with secret courts, secret rulings, no due process, and no legal rights in a police or dragnet-surveillance state.

I hope the big tech companies are serious about protecting their users, even foreign users, since their business model depends on it.


I don't think people should be so dismissive of the FSF's argument here. It's central to the issues with security these days. P2P Affero GPL licensed software is about the only way to be secure anymore. Even then, we have issues with unknown code and hardware at the lowest levels of the stacks.


You've got to read between the lines with a corporate statement. There are two major issues with this one:

1) They're giving no indication that they can't decrypt their customers' data. This won't protect customers from the thousands of information requests that they're not allowed to publicly acknowledge, and will only hamper vectors such as MITM fibre splitting. This is concerning given that US intelligence agencies share their data with private companies, and that Microsoft didn't even attempt to resist previous requests. They have no incentive to inform customers and fight expensive legal battles, so as soon as the whole privacy thing blows over it will be back to old habits.

2) Allowing companies to review their source code is only useful for their desktop products. Most data is going into the cloud now, plus it's possible to use cross-library exploits and obfuscated code. I don't actually think they'll do this now, however they've done it in the past with their famous NSAKEY in the NT 4.0 kernel.

Office 2008 with a firewall will keep your data safe. Office 365 is a company risk. I wouldn't put anything more confidential than a CV or a short story on it.


Microsoft's effort to protect customer data from government snooping? Sounds legit. I'll just leave this here: https://www.schneier.com/blog/archives/2007/12/dual_ec_drbg_...



