Sure. But would they have done this without the recent publicity about third-party app tracking? I suppose that they must have been working on this for some time. And maybe they deserve some slack.

But there should at least be an apology for not acknowledging tracking by third-party apps. For years. While they were touting their privacy stance.

And it's not just about behavioral ads. Some repressive governments have used iOS apps against dissidents.




> And maybe they deserve some slack.

They deserve a lot of slack. Apple's privacy approach may not be perfect but they are head and shoulders above just about every other major tech company.

I've written many privacy policies - for privacy-focused companies and for companies seeking to get away with the absolute bare minimum. One of the best "quick and dirty" ways to assess a company's privacy policy without actually reading the whole thing is to count how many footnotes the document has. All of the important information that a company has to disclose but would rather not is hidden in the footnotes.

Apple's privacy policy[1] has zero footnotes. Google's privacy policy[2] has 51.

[1] https://www.apple.com/legal/privacy/en-ww/

[2] https://policies.google.com/privacy#intro


Can you clarify your comment? I don't see anything I would classify as a footnote in either linked policy.


Certainly! I contemplated explaining a bit further about Google's footnotes but thought I might sound crazy.

Google actually takes the extra step of obscuring the footnotes to look like hyperlinks. For example, if you click "ads you'll find most useful" under "We want you to understand the types of information we collect as you use our services" you'll see that it pops out as a footnote rather than linking you to a new site.

Because I often found myself looking for footnotes to find the begrudging disclosures, and because I found more and more footnotes hiding as hyperlinks, I made myself a little browser extension to highlight HTML elements that are probably footnotes; most are still pretty obvious from their properties. For example, from Google's privacy policy:

  <a class="g1mG8c" href="privacy#footnote-useful-ads" data-name="useful-ads" jsaction="click:IPbaae(preventDefault=true)">ads you’ll find most useful</a>
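
For anyone curious, the core of such an extension is tiny. Here's a minimal sketch of the idea in TypeScript (the selector and styling here are illustrative, not my extension's actual code); it flags anchors whose href points at a "footnote" fragment, like the one above:

  // Content script sketch: flag links that behave like in-page footnotes.
  // Heuristic (an assumption, not documented behavior): an <a> whose href
  // fragment contains "footnote" is a disclosure pop-out, not a normal link.
  const candidates = document.querySelectorAll<HTMLAnchorElement>('a[href*="footnote"]');

  candidates.forEach((link) => {
    link.style.outline = '2px solid red';                           // make it stand out
    link.title = `Probable footnote: ${link.getAttribute('href')}`; // show the target on hover
  });

  console.log(`Found ${candidates.length} probable footnote links on this page`);

Counting the matches also gives you the quick footnote tally I mentioned above.
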
I'm undecided whether this is a style decision or another attempt to hide the ball. Google used to publish a pdf version which made it much easier to see the footnotes but I haven't been able to find a current pdf version in years.


Ah! Thanks for expanding on that.

It looks like the PDF version is available at the top of your linked page under the link "Download PDF" (https://www.gstatic.com/policies/privacy/pdf/20190122/f3294e...). The footnotes are all tacked on to the end of the document there, but they're hard to make use of out of context.

It's frustrating that the footnotes contain important disclosures like "we assign you a unique identifier to track your activity if you're not signed in to a Google account" but then use the exact same format to say "a device is a computer that can be used to access Google services." That makes it harder to identify the important bits.


You know, it’s only just occurred to me that all this time I had been googling to find a pdf of their policy but I never actually looked on their page. Thanks! What a major facepalm on my part.

And it is frustrating for sure. Not all footnotes are bad but all the bad stuff is usually in the footnotes!


How do you “guarantee” third party apps don’t do tracking and allow them - with your permission - to know your location?


I'm not arguing that it's easy. Or even possible.

What I am arguing is that Apple has been BSing people.

> What happens on your iPhone, stays on your iPhone.

Because that's just not true. Unless, I suppose, you don't install any third-party apps on your phone. So then, that privacy claim should look like:

> What happens on your iPhone, stays on your iPhone.#

> # Unless you install third-party apps.

Otherwise, dissidents will feel safe using iPhones. Until they're dead or in prison.


Apple is talking about what their own products do. Their cameras, their apps, their devices.

They can't 100% guarantee that third-party apps won't spy on you, and they don't even promise to do it.

You might want to bash Apple for something they haven't promised and that probably isn't even possible. But that's on you; it's just baseless bashing.


These apps are in their store. That implies that they have approved them.


Websites track iPhone users too. Is that also Apple’s fault?


What does it mean (officially) for an app to be approved? You seem to have your own interpretation.


You can read their policies as well as I can. Or better, perhaps.

But my point is that Apple does have lots of rules about what apps can do, and can't do. And that it's been rather aggressive in applying those rules. If you search HN re some mix of ["Apple", "iOS", "app", "store", etc] you'll find complaints from developers about Apple removing their apps from its store.

So, in that context, why were they silent for years about privacy risks of third-party apps? That wouldn't be a remarkable omission by Google, given that its business model is largely about monetizing users' information. But for Apple, which has been promoting itself as privacy-friendly, it strikes me as a glaring omission.

I'm getting criticism for not acknowledging Apple for its stance on privacy, and for how much better it is than Google. And for blaming it for not being perfect. And yes, it is privacy-friendly, and does a far better job at privacy than Google does.

What I'm criticizing is the failure to clearly acknowledge limitations. And I'm coming at this from the perspective of users who are concerned about threats to their privacy. Users who aren't very technical, and who may misunderstand just what Apple protects them from.

Also, this isn't just me hating on Apple. I've said pretty much the same things about the Tor Project. Back in the day, when many users actually saw Tor start at the command line, they saw "[notice] Tor v0...(...). This is experimental software. Do not rely on it for strong anonymity." But the new https://www.torproject.org/ starts with "Browse Privately. Explore Freely. Defend yourself against tracking and surveillance. Circumvent censorship." Finding anything at all about limitations is not so easy. About risks from global adversaries. About Tor-bypass risks in Tor browser. About risks from malware that phones home through clearnet, bypassing Tor. Conversely, when you start Tor browser in Whonix, you see "Whonix is experimental software. Do not rely on it for strong anonymity."


> You can read their policies as well as I can. Or better, perhaps.

> So, in that context, why were they silent for years about privacy risks of third-party apps?

My point was, you seem to have an interpretation of "Apple approved this app, so therefore this means that X, Y, Z is true". I'm asking if that is actually what Apple is claiming, officially, and also what X, Y, Z mean to you.

If you're claiming that "Approved third party app" == "no data ever leaves your phone" then this has never been the claim of Apple AFAIK.

> What I'm criticizing is the failure to clearly acknowledge limitations.

I see. But why would a company acknowledge its limitations in a commercial, competitive marketplace? People who appreciate companies being honest about their limitations in such a public manner, and who still end up buying their product, are not in the majority, I think.

People try to avoid mentioning anything negative about their past in a job interview - which is kinda the position companies are in, when they go look for customers.


> My point was, you seem to have an interpretation of "Apple approved this app so therefore this means that X,Y, Z is true".

There's arguably an analogy to the potential limitation of DMCA safe harbor protection for sites that moderate user posts. So if Apple didn't vet apps in its store, and only removed apps after complaints about malicious behavior, it would have no burden for disclosure.

But Apple clearly does vet apps. Aggressively so, given what I've read. So allowing apps that violate users' privacy does create a burden for disclosure. Unless you argue that Apple didn't know that they were doing that, which seems unlikely.

> I see. But why would a company acknowledge their limitations in a commercial competitive marketplace? People who appreciate companies being honest about their limitations in such a public manner, and still end up buying their product are not in the majority, I think.

Yes, for better or worse, that's how things are.

But if you play the "you can trust us" card, and are not in fact being totally honest, it's arguably worse than not promising anything.

Google did pretty much the same, with its "don't be evil" mantra. But nobody believes that anymore. I was hoping that Apple was really trustworthy, but now I'm dubious.


When an app asks for access to your location, you are given three choices - never, when using, or always. How technical do you have to be to know that when you allow an app access to your location - it has access to your location?


Are all apps always that honest about it?


Apps don’t have to be “honest”. iOS enforces it.

But to be clear, rather than saying "the app asks you", it would be more accurate to say that the app asks the operating system for the location, the operating system asks you, and the OS enforces your choice.


I assume you're enough of an engineer to know that building an SSO service like this takes a lot of time and energy. Third-party apps are exploiting loopholes in the system, and Apple is closing those loopholes one by one. I don't think this is unnatural. No one has foresight.


So are you arguing that Apple hasn't known about privacy violations by third-party apps in their store?

Given that Apple is announcing patches for these loopholes, it must have been working on them for at least months, if not years. But searching "apple privacy third party app" shows nothing before late May, 2019. Maybe I missed something, and if I did, please share.



