Mobile App Developers: Stop Capturing Unnecessary Data Before Regulators Stop You

The findings presented in an article published by the German magazine Computerwoche on Feb 11, 2014, are a forceful reminder that warnings about excessive data capture via mobile apps have so far gone unheeded. As reported, tests by TÜV Trust IT established that “almost one in two mobile apps suck up data unnecessarily”.

What’s “unnecessary” of course depends on your viewpoint: it may seem unnecessary to me if my mobile email app captures my location; the provider of the app, on the other hand, could be capturing the information to provide me with a better service and/or to make money from selling such data to a third party. The trouble is that I don’t know, and I don’t have a choice if I want to use the app. From a consumer perspective, this is not a satisfactory situation; I’d even go as far as calling it unacceptable. Not that it matters what I feel; but privacy advocates and regulators are increasingly taking notice. Unless app providers take voluntary measures, they may see their data capture habits curtailed by regulation to a greater degree than would otherwise be the case.

Let’s step back a moment and consider why so many mobile apps capture more data than is strictly speaking necessary for the functioning of the app:

  • Conscious decision, for commercial reasons: in particular when companies are making apps available free of charge, they may (not unreasonably) feel that they have a right to recoup their investment in some way. One way to do that is monetizing the data - the more you capture, the more you can sell. That’s no doubt why many providers of paid-for apps can’t resist the temptation, either.
  • Conscious decision, simply because it’s possible: with the best of intentions, developers may decide to capture information just because they can, in case it’s of use at some point later - even if nobody has thought about what kind of future use case or functionality might leverage this data. This applies even in countries with stringent data privacy laws.  
  • Ignorance and carelessness: the quickest and easiest way to get a mobile app up and running is often to take existing apps or components and reuse them. Developers may either not realize or simply not care that the code they’re reusing is capturing all kinds of data that’s not actually needed for their app.
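That third failure mode — not knowing what reused code actually requests — is also the easiest to catch with a simple audit. As a minimal sketch (assuming an Android-style AndroidManifest.xml; the permission list here is illustrative, not exhaustive):

```python
import xml.etree.ElementTree as ET

# Illustrative subset of Android's sensitive permissions; not exhaustive.
SENSITIVE = {
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.READ_CONTACTS",
    "android.permission.READ_PHONE_STATE",
    "android.permission.RECORD_AUDIO",
}

# The android: attribute prefix maps to this XML namespace.
ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

def sensitive_permissions(manifest_xml: str) -> list[str]:
    """Return the sensitive permissions declared in a manifest string."""
    root = ET.fromstring(manifest_xml)
    declared = [
        elem.get(ANDROID_NS + "name")
        for elem in root.iter("uses-permission")
    ]
    return sorted(p for p in declared if p in SENSITIVE)

manifest = """<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-permission android:name="android.permission.INTERNET"/>
  <uses-permission android:name="android.permission.READ_CONTACTS"/>
</manifest>"""

print(sensitive_permissions(manifest))
# ['android.permission.READ_CONTACTS']
```

Running a check like this against the merged manifest of an app plus its reused libraries makes it hard to claim ignorance about what the finished product can access.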

And we’re not just talking about device data here (OS and version, language, serial numbers, location, time stamps, accelerometer data, and so on); some apps also transmit personal information, including address book content, without the user being aware. So far, we’ve seen comparatively little user pushback. This is likely because many users are simply not aware of what’s going on, while others accept whatever is presented to them for agreement, because they want the app, and they want it now. But this may be changing: in a survey carried out by YouGov in December 2013 on behalf of the UK’s Information Commissioner to support an awareness-raising campaign, 61% of respondents were either concerned or very concerned about “the way mobile apps can use your personal information (e.g. locational data, web browsing history)”; 39% stated that they’d “decided not to download an app due to privacy concerns”.

Covert data capture can also lead to regulatory intervention, which of course attracts headlines that in turn further increase consumer awareness.  One example is the US Federal Trade Commission’s action against Goldenshores Technologies, the developer of the “Brightest Flashlight Free” app, which failed to disclose that it was capturing and transmitting users’ location data and unique device identifiers to third parties (http://www.ftc.gov/news-events/press-releases/2013/12/android-flashlight-app-developer-settles-ftc-charges-it-deceived).

As the flashlight example also demonstrates, the argument is as much about transparency as it is about data capture: the ruling against Goldenshores was about its failure to disclose properly how users’ information would be used, not the fact that it was capturing particular types of data.

As my colleague Fatemeh Khatibloo argues eloquently in her blog post “Rumors Of Privacy’s Death Have Been Greatly Exaggerated”, the time has come to take a different approach to consumer privacy – one based on context, consensus and transparency.

In the context of mobile app development, good guidance is already available on what a “privacy by design” approach should look like. Two resources to highlight are the UK Information Commissioner’s “Privacy in mobile apps. Guidance for app developers” and the GSMA’s proposed set of “Privacy design guidelines for mobile applications”. It’s worth noting that the latter was published back in February 2012; in other words, there’s no excuse for blaming mobile data capture misdemeanors on lack of guidance.

I strongly encourage all app developers to voluntarily follow mobile app privacy guidelines such as those referenced above. True, some of the principles may appear to conflict with the organization’s immediate commercial goals. But I’d argue that it’s better to figure out a consensual, transparent way of addressing this type of data capture than to risk a regulatory clampdown.
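What a consensual, transparent approach can look like in practice is straightforward data minimization: send only the fields the user has explicitly opted into. A minimal sketch (all field names here are hypothetical):

```python
def minimize_payload(event: dict, consented_fields: set[str]) -> dict:
    """Drop any field the user hasn't explicitly opted into
    before the event is transmitted to the server."""
    return {k: v for k, v in event.items() if k in consented_fields}

event = {
    "screen": "inbox",           # needed for core app functionality
    "location": (52.52, 13.40),  # only sent if the user opted in
    "contacts": ["..."],         # likewise
}

# This user consented to functional data only:
print(minimize_payload(event, {"screen"}))
# {'screen': 'inbox'}
```

The point of the design is that consent is enforced at the point of transmission, not buried in a terms-of-service page the user clicked through.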

Comments

Time for the behemoths to step up?

Martha, thanks for this interesting post. My company develops mobile and web applications and I personally developed a mobile app for a start-up I founded some years ago, so I've faced the issue of dealing with privacy and user data collection.

Guidelines are great, and reputable services and app developers are definitely taking the subject seriously as users' concerns grow. Missteps like those by Path or Snapchat are always going to happen, but I bet they'll be of less concern than the mass of small developers making apps of the "Brightest Flashlight" sort and free games that become hugely popular. It's a mess, and it will become messier with the coming explosion of connected devices in the "internet of things".

In my opinion it's time for the big players, like Google, Facebook, Apple, Amazon, Microsoft and others, to put their ability to detect patterns through machine-learning algorithms to use and provide services that protect privacy, similar to what is used today to detect spam. As an example, I noticed once that Flipboard was still accessing my Facebook timeline even after I had uninstalled the app from my iPad. I had to go into Facebook's settings and revoke the app's access myself, but Facebook could have easily detected the inconsistency and alerted me.

Any mobile OS could also have a much deeper privacy layer that leveraged the big data generated by all those apps. Is an app collecting too much information about a user and sending it out to its servers? iOS or Android, or any mobile OS, should tell us about that and encourage us to take some action. So, in a sense, even if an app passes the app stores' review process, the OS would be monitoring its behavior to detect, through patterns, if something is weird.
