What happens on your phone should stay on your phone! Really!

The recent Supreme Court opinion about abortion rights has started a public discussion about the risks of smartphone apps getting out of control, shedding large amounts of personal data in completely unpredictable ways. This discussion is long overdue, and its importance goes far beyond the personal data of women’s health apps.


Why do so many apps send personal data to the Internet even though this is entirely unnecessary for their core function? Why don’t they just process and store the data on the phone itself, securely and encrypted? There are clear alternative architectures that would vastly improve the privacy and security of our apps!


Let’s take a closer look at the problem in the context of my blood pressure cuff. All I want is to keep a history of my blood pressure readings and look for trends. However, the app that moves the data from the cuff to the phone requires that I register with the app provider and hand over all kinds of personal data. The registration enrolls me in a third-party data interpretation service, whether or not I want to use that service. Naturally, after transmitting the data to the phone, the app uploads it all to the provider’s website, adding time, location, and other details. The app provider shares some of the data (they don’t say which) with unnamed third-party vendors, service providers, contractors, partners, or affiliates. They promise to secure the data but point out that, as we all know, data on the Internet is never really secure. They propose to provide health reports but disclaim any validity, pointing out that only a physician can make a valid interpretation. It is my responsibility to share the data with my physician, with no help from the app. Naturally, the term HIPAA never appears in any of the privacy statements or terms of service.


To summarize: All I want is to store and view data securely on my phone. However, the app to do this, for which there is no alternative, insists on:

  • providing services that I have not asked for and don’t want,

  • uploading large amounts of personal health data to provide those services,

  • uploading data that are irrelevant to the purpose of the app, such as time and location,

  • sharing those data with unnamed third parties, each of which may use the data for its own purposes and create additional security exposures, and

  • providing information and health suggestions for which the provider takes no responsibility, and which may be misleading.

If I want to make proper use of the data, it’s my responsibility to share them with my physician, with no help from the app.


This is just one example. These “features” show up across a broad swath of apps and devices, including pretty much all wearables.


The key driver of this approach is the Internet business model of sucking up as much data as possible and exploiting it as broadly as possible, whether or not it’s needed for a particular purpose. When IoT devices are involved, it’s not enough that we pay for the devices. The manufacturers want us to keep paying forever with our data. Most app providers build just the top application layer and rely on a plethora of third-party services for the underlying infrastructure, from identity services to analytics, marketing, storage, and more. This practice creates a deep service supply chain. The privacy practices, security, and integrity of these component services are largely invisible to the app provider and likely not examined carefully. End users have little chance of penetrating this thicket.


How would a sane alternative app architecture look? Both the technical and the business architecture would need to be rebuilt.


Under a privacy-oriented technical architecture, all personal data would remain on the phone, stored securely and encrypted where possible. The logic of the application, including the user interface, analytics algorithms, and general background information, would be downloaded from the provider’s website, using minimal information about the end user. All processing and display of the data would happen on the phone. Uploading local data, from user inputs to sensor data, location data, and information about other apps on the phone, would not be necessary and would be prevented. Naturally, this would require secure application and data management in the phone’s operating system to prevent other applications from accessing the data. Backups could be encrypted on the device before being sent to the cloud.
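As a rough illustration, here is a minimal sketch of what on-device storage could look like, assuming an Android app using the Jetpack Security library (androidx.security.crypto); the file name, key names, and reading format are illustrative only, not any particular vendor’s implementation. The encryption key stays in the phone’s keystore, and nothing in this flow touches the network.

```kotlin
import android.content.Context
import androidx.security.crypto.EncryptedSharedPreferences
import androidx.security.crypto.MasterKey

// Persist a blood pressure reading on the device only, encrypted with a key
// held in the Android Keystore. Nothing in this flow touches the network.
fun saveReading(context: Context, timestamp: Long, systolic: Int, diastolic: Int) {
    val masterKey = MasterKey.Builder(context)
        .setKeyScheme(MasterKey.KeyScheme.AES256_GCM)
        .build()

    val prefs = EncryptedSharedPreferences.create(
        context,
        "bp_readings",   // hypothetical local, encrypted preferences file
        masterKey,
        EncryptedSharedPreferences.PrefKeyEncryptionScheme.AES256_SIV,
        EncryptedSharedPreferences.PrefValueEncryptionScheme.AES256_GCM
    )

    // Keys and values are encrypted transparently before they reach local storage.
    prefs.edit()
        .putString("reading_$timestamp", "$systolic/$diastolic")
        .apply()
}
```

Analytics and trend displays would then read from this local store, and a backup routine could encrypt the same data on the device before handing it to any cloud service.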


The business architecture for this model would be purely monetary, without using data as part of the users’ payments. The entitlement to the app, its algorithms, background data, and potential updates could be provided through security tokens that are obtained with a purchase transaction. A one-time purchase and download transaction would clearly be the most transparent option. However, even where providers want to use subscription models, an entitlement token approach, as already used by some apps, can work.
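To make the entitlement idea concrete, here is a minimal sketch of how an app could check a purchased entitlement entirely offline, assuming a hypothetical token format of a Base64 payload plus a vendor signature; the token layout, key type, and function names are assumptions for illustration, not any store’s actual API.

```kotlin
import java.security.KeyFactory
import java.security.Signature
import java.security.spec.X509EncodedKeySpec
import java.util.Base64

// Hypothetical token format: "<payload>.<signature>", both URL-safe Base64.
// The payload could name the product, the entitled features, and an expiry;
// the vendor signs it once at purchase time with its private key.
fun isEntitlementValid(token: String, vendorPublicKeyBase64: String): Boolean {
    val parts = token.split(".")
    if (parts.size != 2) return false
    val payload = Base64.getUrlDecoder().decode(parts[0])
    val signatureBytes = Base64.getUrlDecoder().decode(parts[1])

    // The vendor's public key ships with the app, so verification needs
    // no network call after the initial purchase and download.
    val keySpec = X509EncodedKeySpec(Base64.getDecoder().decode(vendorPublicKeyBase64))
    val publicKey = KeyFactory.getInstance("EC").generatePublic(keySpec)

    val verifier = Signature.getInstance("SHA256withECDSA")
    verifier.initVerify(publicKey)
    verifier.update(payload)
    return verifier.verify(signatureBytes)
}
```

A subscription variant could simply include an expiry date in the signed payload and require a new token at renewal time, still without any personal data leaving the phone.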


This design would radically reduce the number of transactions between the app and the app provider, substantially reducing the app provider’s operating costs. Since no sensitive data ever moves onto the Internet, this approach would vastly improve not only privacy but also security. The frequent data breaches at Internet service providers could no longer expose any sensitive data.


As preventing abuse of personal data is becoming more urgent, apps using an architecture like this could clearly help to solve the privacy problem. Its privacy, security, and cost features could represent a competitive advantage. What will it take for this idea to take hold?
