Researchers at the University of Washington and Harvard Law School recently published a groundbreaking study analyzing the technical capabilities of 16 electronic monitoring (EM) smartphone apps used as “alternatives” to criminal and civil detention. The study, billed as the “first systematic analysis of the electronic monitoring apps ecosystem,” confirmed many advocates’ fears that EM apps allow access to wide swaths of information, often contain third party trackers, and are frequently unreliable. The study also raises further questions about the lack of transparency involved in the EM app ecosystem, despite local, state, and federal government agencies’ increasing reliance on these apps.
As of 2020, over 2.3 million people in the United States were incarcerated, and an additional 4.5 million were under some form of “community supervision,” including those on probation, parole, pretrial release, or in the juvenile or immigration detention systems. While EM in the form of ankle monitors has long been used by agencies as an “alternative” to detention, local, state, and federal government agencies have increasingly been turning to smartphone apps to fill this function. The way it works is simple: in lieu of incarceration or an ankle monitor, a person agrees to download an EM app on their own phone that allows the agency to track the person’s location and may require the person to submit to additional conditions such as check-ins involving face or voice recognition. The low costs associated with requiring a person to use their own device for EM likely explain the explosion of EM apps in recent years. Although there is no accurate count of the total number of people who use an EM app as an alternative to detention, in the immigration context alone, nearly 100,000 people today are on EM through the BI Smartlink app, up from just over 12,000 in 2018. Such widespread use makes public understanding of these apps, and of the information they collect, retain, and share, all the more urgent.
The study’s technical analysis, the first of its kind for these types of apps, identified several categories of problems with the 16 apps surveyed. These include privacy issues related to the permissions these apps request (and often require), concerns around the types of third party libraries and trackers they use, who they send data to and how they do it, as well as some fundamental issues around usability and app malfunctions.
When an app wants to collect data from your phone, e.g., by taking a picture with your camera or capturing your GPS location, it must first request permission to interact with that part of your device. Because of this, knowing which permissions an app requests gives a good idea of what data it can collect. And while denying unnecessary permission requests is a great way to protect your personal data, people under EM orders often don’t have that luxury, and some EM apps simply won’t function until all permissions are granted.
Perhaps unsurprisingly, almost all of the apps in the study request permissions like GPS location, camera, and microphone access, which are likely used for various check-ins with the person’s EM supervisor. But some apps request more unusual permissions. Two of the studied apps request access to the phone’s contacts list, which the authors note can be combined with the “read phone state” permission to monitor who someone talks to and how often they talk. And three more request “activity recognition” permissions, which report if the user is in a vehicle, on a bicycle, running, or standing still.
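One way researchers enumerate what an app can collect is to parse the `<uses-permission>` entries in its AndroidManifest.xml. The sketch below illustrates the idea; the manifest snippet is a hypothetical example modeled on the kinds of permissions the study describes, not taken from any real EM app, and the “dangerous” list is a small subset of the permissions Android gates behind a runtime prompt.

```python
# Sketch: list the permissions an Android app requests by parsing its
# AndroidManifest.xml, and flag those Android classifies as "dangerous"
# (i.e., runtime-granted). Manifest and lists are illustrative only.
import xml.etree.ElementTree as ET

ANDROID_NS = "http://schemas.android.com/apk/res/android"

# A small subset of Android's "dangerous" permissions, chosen to match
# the categories discussed above.
DANGEROUS = {
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.CAMERA",
    "android.permission.RECORD_AUDIO",
    "android.permission.READ_CONTACTS",
    "android.permission.READ_PHONE_STATE",
    "android.permission.ACTIVITY_RECOGNITION",
}

# Hypothetical manifest excerpt, not from any real EM app.
SAMPLE_MANIFEST = """<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
  <uses-permission android:name="android.permission.CAMERA"/>
  <uses-permission android:name="android.permission.READ_CONTACTS"/>
  <uses-permission android:name="android.permission.INTERNET"/>
</manifest>"""

def requested_permissions(manifest_xml):
    """Return the android:name of every <uses-permission> element."""
    root = ET.fromstring(manifest_xml)
    return [
        elem.get(f"{{{ANDROID_NS}}}name")
        for elem in root.iter("uses-permission")
    ]

for perm in requested_permissions(SAMPLE_MANIFEST):
    label = "DANGEROUS" if perm in DANGEROUS else "normal"
    print(f"{label:9s} {perm}")
```

In practice the manifest would first be extracted and decoded from the APK (for instance with standard Android SDK tooling), but the triage step is the same: a contacts or phone-state permission in an app whose stated purpose is location check-ins is exactly the kind of mismatch the study highlights.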
Third Party Libraries & Trackers
App developers almost never write every line of code that goes into their software, instead depending on so-called “libraries” of software written by third party developers. That an app includes these third party libraries is hardly a red flag by itself. However, because some libraries are written to collect and upload tracking data about a user, it’s possible to correlate their existence in an app with intent to track, and even monetize, user data.
The study found that nearly every app used a Google analytics library of some sort. As EFF has previously argued, Google Analytics might not be particularly invasive if it were used in only a single app, but combined with its nearly ubiquitous use across the web, it gives Google a panoptic view of individuals’ online behavior. Worse yet, the app Sprokit “appeared to contain the code necessary for Google AdMob and Facebook Ads SDK to serve ads.” If that is indeed the case, Sprokit’s developers are engaging in an appalling practice: monetizing their captive audience.
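Tracker detection of this kind typically works by matching the Java package prefixes found in an app’s bytecode against a list of known tracker signatures, in the spirit of εxodus-style analysis. The sketch below shows the core matching step; the signature table and the class list are illustrative stand-ins, not actual scan output from any of the studied apps.

```python
# Sketch: flag likely third-party trackers by matching class-name package
# prefixes against a signature list. Signatures and classes are examples.
TRACKER_SIGNATURES = {
    "com.google.android.gms.analytics": "Google Analytics",
    "com.google.firebase.analytics": "Google Firebase Analytics",
    "com.google.android.gms.ads": "Google AdMob",
    "com.facebook.ads": "Facebook Ads",
}

def detect_trackers(class_names, signatures=TRACKER_SIGNATURES):
    """Return the set of tracker names whose package prefix appears
    among the given fully qualified class names."""
    found = set()
    for cls in class_names:
        for prefix, name in signatures.items():
            if cls.startswith(prefix):
                found.add(name)
    return found

# Hypothetical class list, as might be extracted from an app's DEX files.
classes = [
    "com.example.emapp.CheckInActivity",
    "com.google.firebase.analytics.FirebaseAnalytics",
    "com.facebook.ads.AdView",
]
print(sorted(detect_trackers(classes)))
```

Presence of a tracker library is not proof it is active, which is why the study hedges with “appeared to contain the code necessary” for ad serving; confirming actual use requires observing the app’s network traffic, as the next section describes.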
The study aimed to capture the kinds of network traffic these apps sent during normal operation, but was limited by not having active accounts for any of the apps (either because the researchers could not create their own accounts, or chose not to in order to avoid agreeing to terms of service). Even so, by installing software that allowed them to intercept app communications, the researchers were able to draw some worrying conclusions about a few of the studied apps.
Nearly half of the apps made requests to web domains that could be uniquely associated with the app. This is important because even though those web requests are encrypted, the domain they are addressed to is not: it appears in cleartext in DNS queries and in the TLS Server Name Indication (SNI) field. That means whoever controls the network a user is on (e.g., coffee shops, airports, schools, employers, Airbnb hosts) could theoretically learn that someone is under EM. One app we’ve already mentioned, Sprokit, was particularly egregious in how often it sent data: every five minutes it phoned home to Facebook’s ad network endpoint with numerous data points harvested from phone sensors and other sensitive sources.
It’s worth reiterating that, due to the limitations of the study, this is far from an exhaustive picture of each EM app’s behavior. There are still a number of important open questions about what data they send and how they send it.
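The “app-unique domain” finding can be made concrete with a small analysis sketch: given a log of which apps contact which domains, flag the domains contacted by exactly one app, since seeing such a domain on the wire identifies the app. The traffic log and domain names below are fabricated for illustration only.

```python
# Sketch: find "app-unique" domains in captured traffic -- domains that
# only one app contacts. A network observer who sees such a domain (in a
# cleartext DNS query or TLS SNI field) can infer which app is installed,
# and hence possibly that a user is under electronic monitoring.
from collections import defaultdict

# Fabricated (app, contacted-domain) observations.
observed = [
    ("EMAppA", "api.emvendor-a.example.com"),
    ("EMAppA", "graph.facebook.com"),
    ("EMAppB", "checkin.emvendor-b.example.com"),
    ("EMAppB", "graph.facebook.com"),
]

def unique_domains(flows):
    """Map each domain contacted by exactly one app to that app."""
    apps_per_domain = defaultdict(set)
    for app, domain in flows:
        apps_per_domain[domain].add(app)
    return {
        domain: next(iter(apps))
        for domain, apps in apps_per_domain.items()
        if len(apps) == 1
    }

for domain, app in sorted(unique_domains(observed).items()):
    print(f"{domain} uniquely identifies {app}")
```

Note how a widely shared endpoint like an ad network’s domain reveals little on its own, while a vendor-specific domain acts as a fingerprint, which is exactly why app-unique domains are the privacy concern here.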
App Bugs and Technical Issues
As with any software, EM apps are prone to bugs. But unlike other apps, if someone under EM has issues with their app, they risk violating the terms of their court order, which could result in disciplinary action or even incarceration, a problem that people subjected to ankle monitors have similarly faced.
To study how bugs and other issues with EM apps affected the people forced to use them, the researchers performed a qualitative analysis of the apps’ Google Play store reviews. These reviews were overwhelmingly negative. Many users report being unable to successfully check in with the app, sometimes due to buggy GPS or facial recognition, and other times due to not receiving notifications for a check-in. One user describes such an issue in their review: “I’ve been having trouble with the check-ins not alerting my phone which causes my probation officer to call and threaten to file a warrant for my arrest because I missed the check-ins, which is incredibly frustrating and distressing.”
The study also addressed the legal context in which issues around EM arise. Ultimately, legal challenges to EM apps are likely to be difficult: although the touchstone of the Fourth Amendment’s prohibition against unlawful search and seizure is “reasonableness,” courts have long held that probationers and parolees have diminished expectations of privacy, which are weighed against the government’s interests in preventing recidivism and reintegrating probationers and parolees into the community.
Moreover, the government likely would be able to get around Fourth Amendment challenges by claiming that the person consented to the EM app. But as we’ve argued in other contexts, so-called “consent searches” are a legal fiction. They often occur in high-coercion settings, such as traffic stops or home searches, and leave little room for the average person to feel comfortable saying no. Similarly, here, the choice to submit to an EM app is hardly a choice at all, especially when faced with incarceration as a potential alternative.
This study is the first comprehensive analysis of the ecosystem of EM apps, and it lays crucial groundwork for the public’s understanding of these apps and their harms. It also raises additional questions that EM app developers, and the government agencies that contract for these apps, must answer, including:
- Why EM apps request dangerous permissions that seem to be unrelated to typical electronic monitoring needs, such as access to a phone’s contacts or precise phone state information
- What developers of EM apps that lack privacy policies do with the data they collect
- What protections people under EM have against warrantless search of their personal data by law enforcement, or from advertising data brokers buying their data
- What additional information will be uncovered by being able to establish an active account with these EM apps
- What information is actually provided about the technical capabilities of EM apps to both government agencies contracting with EM app vendors and people who are on EM apps
The people who are forced to deal with EM apps deserve answers to these questions, and so does the general public as the adoption of electronic monitoring grows in our criminal and civil systems.