There are two major philosophical differences between Google and Apple which led to this outcome.
The first is the difference in business model, i.e. advertising versus whole-product.
The second is that Google starts with a permissive ecosystem mindset and locks stuff down as they go along; Apple starts with a conservative mindset and opens stuff up as they go along. Neither approach is inherently better than the other—the competing platforms are converging towards equilibrium—but Apple's approach does prove to have the upper hand when it comes to consumer privacy.
The amusing thing is how much flak Apple received early on for their closed product approach.
It's arguable that Google traded heavily on the relative freedom of the Android platform, and sucked in a lot of early adopter / tinkerer types on the promise of openness. Kind of ironic that for most people, most of the time, the open source nature of Android is now barely a historical footnote.
I was a bit of an outlier in that I generally appreciated Apple's walled garden approach because it severely limited the amount of crapware/malware in the app store.
Plus, certainly 5-6 years ago, because of that whole-product attitude, an iPhone simply felt a lot more polished in use than contemporary Android devices. I haven't used Android enough recently to comment on whether that's still the case.
Certainly not for those tinkerer types. Often it's the tinkerer types who are concerned about privacy, and it's those types who install Copperhead OS or XPrivacy, which allow you to deny exactly this kind of thing. Not only that, but they'll let you block all those smaller ways of spying, like unique device IDs being phoned home to 6 different ad, analytics and crash-handling services used by the silly game you just installed.
It's hard for me to imagine using a phone on which I see ads (especially on YouTube), can't background apps like SSH clients or Syncthing (even direct IMAP and SIP connections used to be a struggle for people on iOS, and still may be), or can't run apps that Google/Apple have decided are evil piracy tools, like a manga reader or a torrent client manager and search tool. I have friends who even run emulators and use memory editors to cheat at mobile games regularly on their phones. Very, very different models. Android is just a lot more flexible for a tinkerer to this day, and all of this is possible without exploits on most devices, allowed and accepted by many manufacturers.
There's this weird attitude on HN I see frequently where it seems like everything has to be "for the masses" for it to be of any value - tinkering by definition is not for the masses. Android devices probably shouldn't be for the masses, but for tinkerers, they really do pack a respectable punch in my opinion.
But even then I think you're still massively overstating it. 6-10 years ago nearly everyone in my circle of geeky friends and colleagues had a rooted Android (CyanogenMod or similar) or a jailbroken iPhone. Today that number is exactly zero.
I know many older people for whom an iPad is the first "computer" they've ever owned, and for them it's a lifeline to grandchildren and community. These are people who were never going to learn MacOS or Windows.
True, it's a bit of a fight to go full open source on Android, but in recent months this has gotten easier.
Many apps that replace the whole Google apps suite have been updated into great alternatives, sometimes better than the originals.
Also, F-Droid has a nice look and good functionality now; I don't miss the Play Store a bit.
In case someone does, there's Yalp Store ("Play" backwards). It downloads APKs from the Play Store using a fake account.
Android phones are the PCs of smartphones -- you can configure them to do anything you want, but get one piece of hostile software on there and you're fucked.
For stock releases, yes, a rigorously vetted walled garden with no sideloading is going to provide greater default protection than an open platform, but consumers also lose the ability to shape the extent to which they manage their own devices.
As far back as 2012 I used an Android CFW that sandboxed apps and returned empty values for whichever permissions you specified. And prior to that, it offered the kind of at-will permission toggles stock Android only got around 7.x. Since it was rooted, I could set the hosts file to block known malware domains and social media tracking, and load modules or patch issues myself instead of hoping a fix would land in a future update, if ever.
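For anyone unfamiliar with the hosts-file trick: it just null-routes DNS lookups for unwanted domains. A minimal sketch of what such entries look like (the domains here are placeholders, not a real blocklist):

```
# /etc/hosts entries that null-route lookups for unwanted domains
0.0.0.0 ads.example.com
0.0.0.0 tracker.example.net
0.0.0.0 malware.example.org
```

Since every app on the device resolves names through the same hosts file, this blocks trackers system-wide without touching any individual app, which is why it requires root.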
Until buying a Pixel (on Android 7.1.2-8.1), I'd never bothered with an OEM spin or stock builds, but it's been... okay. It's a shame it took half a decade for privacy options to catch up to the level of custom projects, and it's still primarily for device protection, but the situation isn't the sieve it once was with locked bootloaders and the over-broad support emphasis of earlier days.
> but consumers also lose the ability to shape the extent to which they manage their own devices.
Which is only useful for the consumers who have the skill, the depth of knowledge and who maintain that knowledge to keep it up-to-date. Maybe one percent of users. Probably fewer.
This is why most people outsource this stuff to third parties, just like we outsource the pasteurisation of milk and the maintenance of sewerage systems.
The typical analogy is the age-old argument about whether consumers should be expected to know how their car works. Maybe they should but that's beside the point—they don't. Insisting that they should is irrelevant. Most people don't know what brake fluid does and they won't care if you explain it to them.
Not to sound too Monty Python, but shouldn't they have at least the right to know these things?
And your example proves the point: because others cared to become that third party, we have projects like LineageOS for smartphones and tablets. However, they could not have done so if Google had locked its OS down or restricted it to certain hardware.
Just because most people only want to know the time doesn't mean no one but clock makers should have the right to understand how clocks work.
I just don’t agree with your premise (because you don’t need to know how a Seiko watch works to learn how watches work) or your conclusion (because in the absence of AOSP, projects like Firefox OS might well have been successful).
What has Apple opened up along the way? To me it seems like they just started with permissions to slightly fewer things than Android and restricted access as they went along. Both gave developers access to MAC addresses early on, for example. Apple also stopped allowing developers to use canOpenURL to identify other installed apps, though AFAIK it's still technically a public API.
iPhone OS 3 — Apps can now connect to external accessories through serial I/O or Bluetooth. Apple also began permitting turn-by-turn navigation and push notifications. Apps can now access the music library. Lots of new APIs.
iOS 4 — Apps can now perform limited tasks in the background with multitasking. Lots of new APIs.
iOS 7 — Multitasking is liberated further. Apps can even send audio streams to other apps. Apps can also run JavaScript directly, allowing app scripting for the first time. Lots of new APIs.
iOS 8 — Developers can now create and sell third-party keyboards. Apps can now use Touch ID for authentication; have deeper access to camera exposure settings; register extensions for sharing and notification centre. Lots of new APIs.
iOS 10 — New APIs let VoIP apps do stuff that used to be exclusive to the native dialler. Some apps can now hook into Siri. Lots of new APIs.
More recently, Apple has relaxed its stance on apps executing (interpreted) code that is downloaded from the internet.
I think it's quite obvious that Apple's approach is better. What exactly was the net gain for Android users by not being able to block permissions until 5.0?
Not to mention the Android OS itself just being a data mining platform for Google's ecosystem of products.
It's quite obviously better in terms of data privacy.
I don't use Android myself, but I'm not going to say their approach doesn't have any upsides. For example, I still can't pick a third party iOS app to be my default email client.
The app would trivially have been considered malware if the same thing had happened on Windows, where apps were traditionally allowed to access anything within their basic privileges, which is to say just about everything. One can (rightly) blame Windows (and also Android) for allowing that, but more importantly, the Facebook app would have to be treated the same way after all.
I believe this is something that regulation should address. Scooping up all this data should be a massive liability for companies. Depending on the severity of a hack or misuse of the data it should effectively bankrupt or significantly destroy profits of any company. But the OS should not completely prevent it.
There are a number of programs for power users on macOS that have stopped selling through the App Store or warn against limited functionality for apps bought through the store. There are also a lot of extremely useful apps for Android that work most effectively when your phone is rooted. Google's Safety Net has effectively made rooting a liability for all power users. When the OS provider is limiting what a user can do the device quickly devolves into an entertainment consumption device.
Bad actors should be treated the same way viruses and malware were treated before.
Equifax is exactly the kind of company that should have never existed, and should have never been allowed to exist.
The same with the Facebook app (+ Android permissions, because even Facebook can't do full evil without Google apparently). And it's going to be the same result. Nothing is going to change, neither company will suffer much, nobody is going to jail. So his counterpoint is valid.
This is very important to highlight. Mobile has tremendously moved the goalposts in terms of what are acceptable things to do on a system and what violates the privacy of the user, and no one noticed.
Imagine a piece of software in the 2000s that sent all your emails, contacts, etc to a central server when running on your desktop computer, made by a large American company. There would have been utter outrage and legal ramifications.
Which is why they keep trying to push the app on you. I've gotten emails from Facebook saying a friend wants me to download messenger. Like, huh? Why would they care if I'm using an app or a browser?
This updated piece by Android Police [1] provides some more pieces of information (see the bottom of the article for the updates):
- The company [Facebook] reiterated that it wasn't saving the actual content of calls or SMS/MMS messages—something neither Ars Technica nor we claimed, but presumably other outlets did.
- It's actually a part of Facebook Lite and Messenger (and users can opt out [...] respectively). Facebook considers the data collection opt-in since the apps in question directly ask if you'd like to upload that information during setup.
LineageOS prevents this type of access. If an application needs access to your phone or your contacts, you need to explicitly permit it to do so at runtime. If you want to grant the permission permanently, you set it up in the application's security options in the settings menu.
EU GDPR compelled this sort of inclusion. If it came out that any data was not included in such a request, Facebook would be hosed to the extent of whole percentages of revenue.
I was a Facebook user before I had my first Android phone. It was on 4.0.3 ICS, then I was on 4.3, 4.4, 5.0, 6.0, 7.1, etc. Before I deleted my Facebook account, I saw no call logs in the backup data archive downloaded from Facebook. Guess what: at some point, some users simply said yes to letting Facebook 'manage' their SMS or calls. If an Android user's call log got scraped, it's partially their own fault.
As an iPhone user I’d like to think iOS does not allow this. However, I understand there’s a private API that allows larger companies access to functions that are not available to regular developers. It allows them access to the microphone without the red bar showing even when there’s not a call. Like others have, I’ve received ads for products spoken about near my phone but never typed into a device I own or on my router.
This belief is based on trust in Apple, not on technical limitations. I've always wondered about this concept of a private API: available to anyone but theoretically forbidden to use. I'm sure there are a million ways to circumvent Apple's static code analysis and slip through their review. Why didn't Apple limit private API usage with technical restrictions?
It's the same when developing for PS4/X1 - headers have methods which are not documented and you're not allowed to use them - if you do, the game will fail certification. I guess they just check if the executable is linked against them and won't certify the game.
It is technically impossible to do what you are suggesting; otherwise many apps would already be doing it. If there are any documented and known issues with the app validation process, please share links.
Why is it technically impossible? Move the functionality into the kernel or into another blessed process, and restrict the interface via sysctls or some kind of RPC. It's like using SQL from JavaScript while forbidding users to open the developer console.
> Why didn't Apple limit their API usage with technical restrictions.
Because this was impossible without forcing developers to stop using Objective-C. The language allows dynamic method dispatch, so it's always possible for an app to reach internal API calls.
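To illustrate why dynamic dispatch defeats a static linkage check, here's a minimal sketch in Java, whose reflection plays the same role as Objective-C's NSSelectorFromString/performSelector; the class and method names are made up for the example:

```java
// Minimal sketch: invoking a method whose name only exists as data.
// A static scan of the compiled code finds no direct call site for
// secretMethod() — analogous to Objective-C selectors built from
// strings at runtime, which a linked-symbol check cannot see.
import java.lang.reflect.Method;

public class DynamicDispatchDemo {
    public static String secretMethod() {
        return "invoked dynamically";
    }

    public static void main(String[] args) throws Exception {
        // Assemble the method name at runtime so the dispatch target
        // is decided only when this code actually executes.
        String name = new StringBuilder("secret").append("Method").toString();
        Method m = DynamicDispatchDemo.class.getMethod(name);
        System.out.println(m.invoke(null));
    }
}
```

This is essentially why a review process that only inspects which symbols a binary links against can miss private-API use: the forbidden name never has to appear as a call site, only as a string that could itself be obfuscated or downloaded.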
In the comment thread you linked to the author has a “faint recollection” of what I’m referring to, confirms Facebook has special access because of iOS integration.
So I’m not sure why you posted that. Seems to undermine your argument.
> Like others have, I’ve received ads for products spoken about near my phone but never typed into a device I own or on my router.
I've had this experience, and I still suspect it's likely to be confirmation bias. Chances are if you're speaking about something, there's a decent chance Facebook can figure out your likely interest in it in a variety of surprising and convoluted ways - your recent credit card purchases, news articles that may have mentioned it in passing, etc.
We also don't really notice all the times Facebook totally whiffs a recommendation.
With iOS we have to trust/hope that Apple is doing the right thing. With Android we can audit the code. Yes, the early permissions model left a lot to be desired, and was subsequently revised. But we could always look at the OS source to see where things might be leaking. Unless, of course, these things are happening in the Google Framework level, at which point, we have to trust/hope that Google is doing the right thing.
Unfortunately more and more functionality has been moved into the closed-source Google Framework over the past few years. Some of this is a result of Android's terrible update situation: the only way for Google to get new software to many Android users is through software they can update, unlike the OS which OEMs control. But the fact that Google hasn't open sourced the Google Framework tells a lot about how "open source" Android really is. I wish we had a decent open source phone OS out there, since I don't trust Apple to produce good software (and increasingly, hardware) and I don't trust Google... well, I just don't trust Google. Our current duopoly is a pretty awful situation for consumers.
Well, you can’t audit the proprietary hardware drivers most Android manufacturers use, or the many proprietary apps from Google etc. that sit on top of Android, or the changes that the carriers make.
I remember when I was an Android developer dealing with several issues relating to the fact that one carrier put a proxy in the networking stack.