App Permissions, Shadow Profiles and Other Potential Risks to Client Confidentiality
It’s a familiar routine: You download an app, a small window pops up on your phone screen and asks you if you will grant it permission to access your contacts, and you agree because – well, why not? We do this all the time as mobile communications devices, such as tablets and smartphones, have become nearly ubiquitous in both our personal lives and our workplaces. Moreover, mobile applications, such as Waze or the Gmail app, are increasingly common fixtures on both personal and work phones. Increased prevalence and reliance on these apps normalize their day-to-day use. Who would think twice about putting an application on their phone?
Lawyers, of course. Under Rule 1.6 of New York's Rules of Professional Conduct, attorneys may "not knowingly reveal" their clients' "confidential information," which includes privileged information, information that is likely to be embarrassing or detrimental to the client, and information that the client has requested remain confidential. It is no secret that mobile devices collect extraordinary amounts of data on individuals, much of which we voluntarily give in exchange for free services. But what happens when companies use data collected through one person to create a detailed mosaic of another person's life? In that case, is it possible that an attorney may be inadvertently disclosing confidential client information to private corporations?
What Data Do Companies Collect and How Do They Do It?
Imagine you request all of the data that Facebook or Google has on you. In less than a week, you may receive anything from a few megabytes to tens of gigabytes of data on yourself. This data may include, among other things, the locations of places you've frequented, pictures of your friends and yourself, and even estimates of your weight. People are often startled to learn just how detailed a consumer profile these companies maintain on them.
Unseen, however, is the data collected from you about third persons in your network. For example, if you post a picture of a friend and yourself on Facebook, Facebook associates that picture with both users. This happens in a variety of ways, one of which is contact chaining: granting Facebook permission to access your phone's contact database allows it to collect data on both you and all of your contacts, including their names, phone numbers, addresses, emails and whatever else you have stored about them. Thus, data collected on us may affect other people in our social and occupational networks.
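To make the mechanics concrete, the following is a minimal, purely hypothetical sketch of contact chaining. It is not Facebook's actual code or data model; the function name, the use of phone numbers as identifiers, and the data shapes are all illustrative assumptions. The point is only that a single permission grant by one user creates records linking many people who never consented.

```python
# Hypothetical sketch of contact chaining (illustrative only).
# One user's uploaded address book links that user to every person in it,
# extending the company's social graph by people who never opted in.

def chain_contacts(graph, user, address_book):
    """Record an edge from `user` to each contact in the uploaded book."""
    edges = graph.setdefault(user, set())
    for contact in address_book:
        # A phone number is assumed here as a stable cross-user identifier.
        edges.add(contact["phone"])
    return graph

graph = {}
chain_contacts(graph, "alice", [
    {"name": "Bob",   "phone": "+1-518-555-0101"},
    {"name": "Carol", "phone": "+1-518-555-0102"},
])
# One permission grant by Alice exposed two other people.
print(sorted(graph["alice"]))  # ['+1-518-555-0101', '+1-518-555-0102']
```

Because the identifier (here, a phone number) is the same in every uploaded address book, uploads from different users can later be joined on it, which is what makes the shadow profiling described below possible.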
To continue with Facebook as an example, this data collection practice extends even to people who do not use Facebook's services. Creating consumer profiles for non-users in this way is called shadow profiling. When you voluntarily give Facebook access to your contacts, it can begin to identify, and collect data on, people who have never used the platform.
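A toy sketch of how such a shadow profile could be assembled follows. Again, this is an assumption-laden illustration, not any company's actual system: it simply merges contact uploads from several users, keyed on a shared phone number, into a dossier on a person who has no account.

```python
# Hypothetical sketch of shadow profiling (illustrative only).
# Merging address-book uploads from many users yields a profile on
# "Dana," who has never created an account.
from collections import defaultdict

def build_shadow_profiles(uploads, registered_numbers):
    """Aggregate every field uploaders stored about each non-user contact."""
    profiles = defaultdict(dict)
    for uploader, contacts in uploads.items():
        for contact in contacts:
            number = contact["phone"]
            if number not in registered_numbers:  # no account: shadow profile
                profiles[number].update(
                    {k: v for k, v in contact.items() if k != "phone"})
    return dict(profiles)

uploads = {
    "alice": [{"phone": "+1-518-555-0199", "name": "Dana R."}],
    "bob":   [{"phone": "+1-518-555-0199", "email": "dana@example.com"}],
}
shadow = build_shadow_profiles(uploads, registered_numbers=set())
print(shadow["+1-518-555-0199"])
# {'name': 'Dana R.', 'email': 'dana@example.com'}
```

Note that Dana took no action at all; her name and email were volunteered by two other people who each knew only part of the picture.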
Through methods like contact chaining, downloading an app on a mobile device may lead to inadvertent client disclosures. Even the Supreme Court has acknowledged the dangers posed by invasive data collection through mobile phones. The "mosaic theory" of Fourth Amendment protection in the Court's jurisprudence illustrates the problem: "[A] cell phone collects in one place many distinct types of information – an address, a note, a prescription, a bank statement, a video – that reveal much more in combination than any isolated record." The same is true here: aggregating a person's data can paint a comprehensive and nuanced portrait of that person's life.
Consider TikTok: by merely downloading its mobile application, you may allow the company to collect sensitive information, including your location, what kind of phone you use, whether you use AT&T or Verizon, and even your typing patterns. More disturbing still is TikTok's collection of file names. What if TikTok collects a file name such as "Client.name_breach.of.contract_complaint"? A wealth of information can be gleaned from the name alone: that the client is communicating with an attorney, that the attorney has drafted a complaint, and even the nature of the claim. If the action has not yet been filed, none of this is public information. Such information could plausibly be considered "embarrassing" and, therefore, confidential under Rule 1.6.
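How much does a file name alone give away? The short sketch below tokenizes the hypothetical name from the example above. The naming convention (underscores between fields, periods within them) is an assumption made for illustration; real collectors would apply far more sophisticated inference.

```python
# Sketch of how much a single file name can reveal (illustrative only).
# Tokenizing "Client.name_breach.of.contract_complaint" recovers the
# client, the claim, and the document type without reading the file.

def infer_from_filename(filename):
    """Split an underscore-delimited name into its revealing fields."""
    client, claim, doc_type = filename.split("_")
    return {
        "client": client.replace(".", " "),
        "claim": claim.replace(".", " "),
        "document": doc_type,
    }

print(infer_from_filename("Client.name_breach.of.contract_complaint"))
# {'client': 'Client name', 'claim': 'breach of contract', 'document': 'complaint'}
```

The file's contents were never touched: the metadata alone identifies an attorney-client relationship, a drafted complaint and the theory of the case.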
This is not an alarmist call to dispense with mobile devices, mobile applications and other forms of digital storage and communication and to return to pen and paper. The legal field obviously cannot stop using Microsoft products and services based on mere speculation that other humans may be viewing some of our data. But raising these issues is the best way to begin creating transparency in a notoriously opaque system. And while companies currently use contact chaining and shadow profiles for relatively benign purposes, attorneys should be aware of the hidden risks that data collection, storage, review and sharing pose to client confidentiality. This is especially important because the landscape of threats to client confidentiality changes drastically every year as new techniques, uses and data collection methods emerge. Data collected now may be exploited in unimaginable ways a few years from now, and the data we unintentionally give companies on our clients today could create problems for them in the future.
Tyler Rexhouse is a 3L at Albany Law School, where he took NYSBA’s Technology and the Law course. He earned a B.S. in nanoscale engineering and worked in microelectronics manufacturing at GlobalFoundries. His legal interests include emerging privacy and cybersecurity issues in a rapidly changing technological landscape.
1. Simon Batt, What Are Facebook Shadow Profiles?, MakeUseOf (Apr. 21, 2020), https://www.makeuseof.com/tag/facebook-shadow-profiles.
2. Riley v. California, 134 S. Ct. 2473 (2014).
3. Microsoft Privacy Statement, https://privacy.microsoft.com/en-us/privacystatement.
4. Committee on Professional Ethics, Opinion 820 (Feb. 28, 2008).