Apple and Google both have strict and well-developed requirements for app developers in terms of privacy protection. And yet, incidents involving personal data leaks are becoming more common.
In fact, it's safe to say that effective control over privacy is impossible to achieve nowadays when we leave it up to corporations. Not only do they continue to ignore such incidents, but they also prevent third-party developers from giving users control over their own data.
How does Apple fare for data privacy?
Apple’s policies are well defined, and there’s no real difference between the stated and actual approach. Apple imposes stringent restrictions on what user information an app can obtain.
All large analytics systems are forced to obey this rule. Some categories of apps are also subject to additional restrictions on what can and can't be collected about the user.
Is it possible to break these rules? Yes, it is. There are plenty of high-profile examples, too. Recently it was discovered that Sensor Tower, a popular analytics platform for technology developers and investors, secretly gathered data about millions of people who installed popular VPN and ad-blocking apps, which were available for iPhone as well as Android.
Plus, as BlackBerry stated in a recent report (via Forbes), hundreds of apps circumvented Apple and Google security measures. And although Apple itself can search for these violators, other researchers experience difficulties in doing so.
In 2018, the Uber iOS app was found to have received additional rights to record users' screens, an unprecedented step. Private APIs cannot be used in applications distributed through the Apple App Store, and the Uber entitlement, which technically could allow the app to record the device's display, was eventually blocked.
Apple's privacy guidelines tighten year by year, but the interpretation of the rules tends to change over time.
There is evidence of this, too. An update to AdGuard Pro, an ad blocker, was banned from the App Store because it was deemed to use a VPN profile to block content, something that suddenly became impermissible after a change in how certain paragraphs of the App Store rules were interpreted.
Users of iOS apps don't have control over their data; they simply have to trust the app developer. And Apple has no desire to let third-party developers provide privacy protection tools for end-users, nor offer such tools for developers to use in their apps. In fact, it severely limits the functionality of such applications.
In an ideal world, app developers wouldn’t have to live with these restrictions, and would be allowed to provide tools not offered by Apple – or anyone else who has an app store. Instead, they have to accept it, and users have to live with less functionality.
However, unlike Google, Apple is at least ready to make contact.
Another problem is that the people who check whether apps conform to the guidelines may not see what an app actually does with personal data. The number of App Store apps (as well as their developers) is growing rapidly, and in recent years the corporation has had to employ more reviewers.
Unfortunately, new employees don't have the proper experience. They may not fully understand the guidelines, so they interpret the rules in their own way, and this can be different each time.
As a result, it can be difficult to agree with them on what's permitted and what's not. On the other hand, they at least explain in detail what the problem is, unlike Google, where you often have to speculate what's meant by a particular requirement.
TL;DR: Apple has very good privacy guidelines and it tries to apply them fairly, which means iOS is the most secure platform for users’ personal data.
However, Apple can also be a bit selective about how the rules are applied, and it tries to keep all privacy issues under its own control.
Wild West Google Play
If the situation with Apple is less than ideal, then it’s far worse on Google’s side. You’ll find apps on the Google Play store which are utterly disrespectful to users’ personal data. And the protection of personal data in Android apps remains surprisingly poor, despite the large number of high-profile incidents.
In 2018 AdGuard conducted some research and confirmed that some of the top-ranking Android apps can, without notifying the user, extract email addresses, contacts and text messages, and transfer them to third parties. Plus, there’s almost no protection against this.
It was especially unpleasant to see that some of the most popular apps, with 10 million or more downloads, and even apps carrying "Editors' Choice" badges, were doing this.
The investigation found that at least three apps developed by the Chinese company GOMO violated users' privacy and tried to siphon off as much information as possible. The GO SMS Pro app has had over 100 million installations according to Google Play. Immediately after installation, it sends your email address to the goconfigsync.3g.cn domain directly in the request URL over plain HTTP. As a result, your email isn't sent only to their server; it is also visible to every intermediate third party along the way.
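To make the risk concrete, here is a minimal Python sketch of that leak pattern. The domain comes from the investigation above; the request path and the "email" parameter name are assumptions for illustration only.

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical reconstruction of the pattern described above: the app
# puts the user's email address straight into the query string of a
# plain-HTTP request. The domain is the one named in the report; the
# "/sync" path and "email" parameter name are illustrative assumptions.
email = "user@example.com"
url = "http://goconfigsync.3g.cn/sync?" + urlencode({"email": email})

# Because the scheme is http (no TLS), every hop between the phone and
# the server -- Wi-Fi access points, ISPs, proxies -- sees the full URL,
# email address included, in cleartext.
print(url)

# Any on-path observer can trivially recover the address from the URL:
observed = parse_qs(urlparse(url).query)["email"][0]
assert observed == email
```

With HTTPS, only the hostname would be visible to intermediaries; sending the request over plain HTTP exposes the entire URL, which is why embedding personal data in it is so damaging.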
Two more apps, Z Camera - Photo Editor, Beauty Selfie, Collage and S Photo Editor - Collage Maker, Photo Collage, each with more than 100 million installations, also send your email address along with various other information to the domain zcamera.lzt.goforandroid.com. Ironically, GOMO likes to emphasise privacy when describing its apps.
And nothing has really changed over the past two years. It's also impossible to be completely sure that security apps are not involved in unauthorised tracking.
The question is, why do such cases go unnoticed by Google?
The privacy situation on the Play Store is best described as the Wild West. Google sets the right guidelines for app developers on paper, but enforcing them remains the task of lone sheriffs.
Does Google care about user privacy?
Google's privacy protection guidelines aren’t as strict as Apple's. Google doesn’t have restrictions on user identification, for instance. According to the guidelines, personal data can only be requested from a user if the app actually uses it. Here's what's forbidden: “Apps that steal user authentication information (such as usernames or passwords) or imitate other apps or websites to trick users into revealing personal information or authentication information.”
Even these restrictions can simply be ignored by developers, and as we've seen, there are dozens of examples of this happening. Yet when new high-profile cases become public, Google remains silent.
Users' data protection needs run contrary to the corporation's advertising business model. And it isn’t profitable for corporations to remove ads from their closed platforms, so they staunchly defend them.
If Google didn't restrict the functionality of third-party apps, the situation might not be so deplorable. Android app users’ sensitive information remains unprotected from unrestricted access by third parties. Google isn’t trying to solve this problem, doesn’t allow third-party developers to solve it, and it won’t take responsibility for the inevitable incidents.
Gradually, Google is forcing developers to ask users to grant access to certain data, and this helps a little with privacy protection. At the same time, it is trying to solve all the problems in one fell swoop without using a manual approach as Apple does with the App Store.
Perhaps a manual approach is simply too inefficient at Google's scale, but whatever the reason, Google is trying to automate all interaction with developers. As a result, communication between Google and developers remains opaque, inconvenient, and in constant need of further clarification.
Ultimately, Apple and Google's policies have little in common with real security and privacy. Developers of privacy protection solutions aren’t comfortable working with either Google or Apple. Apple gives some opportunities, but at the same time it clamps users in the jaws of restrictions and seems not to treat all developers the same. Google allows developers to do anything, but not on Google Play…
Many more personal data leaks lie ahead, and as more people come to care about their privacy and begin to push back, this problem will have to be addressed far more actively.
The author, Andrey Meshkov, is co-founder and CTO of AdGuard.