App Permissions, Shadow Profiles and Other Potential Risks to Client Confidentiality

By Tyler S. Rexhouse

February 4, 2021


It’s a familiar routine: You download an app, a small window pops up on your phone screen asking whether you will grant it permission to access your contacts, and you agree because – well, why not? We do this all the time, as mobile communications devices such as tablets and smartphones have become nearly ubiquitous in both our personal lives and our workplaces. Mobile applications, such as Waze or the Gmail app, are increasingly common fixtures on both personal and work phones, and their growing prevalence normalizes their day-to-day use. Who would think twice about putting an application on their phone?

Lawyers, of course. Attorneys have an ethical obligation under Rule 1.6 of the New York Rules of Professional Conduct to “not knowingly reveal” their clients’ “confidential information,” which includes privileged information, information that is likely to be embarrassing or detrimental to the client, and information that the client has requested remain confidential. It is no secret that mobile devices collect extraordinary amounts of data on individuals, much of which we volunteer in exchange for free services. But what happens when companies use data collected from one person to create a detailed mosaic of another person’s life? Is it possible that an attorney may be inadvertently disclosing confidential client information to private corporations?

What Data Do Companies Collect and How Do They Do It?

Imagine you request all of the data that Facebook or Google has on you. In less than a week, you may receive anything from a few megabytes to tens of gigabytes of data about yourself, including the locations of places you’ve frequented, pictures of your friends and yourself, and even estimates of your weight. That these companies maintain detailed consumer profiles on their own users is, by now, common knowledge.

Unseen, however, is the data collected from you about third parties in your network. If you post a picture of yourself and a friend on Facebook, for example, Facebook associates that picture with both of you. This happens in a variety of complex ways, one of which is contact chaining: granting Facebook permission to access your phone’s contact database allows it to collect data not just on you but on all of your contacts – their names, phone numbers, addresses, emails and whatever else you have stored about them. Thus, data collected on us may affect other people in our social and occupational networks.
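
To make the mechanics concrete, here is a minimal Python sketch of contact chaining. The data structures, names and numbers are hypothetical, and real pipelines are far more elaborate, but the core move is the same: each uploaded contact record is filed under the person it describes, not just under the uploader.

```python
# A minimal sketch of contact chaining -- hypothetical structures, not any
# company's real pipeline.

from collections import defaultdict

# Profiles keyed by a stable identifier (here, a phone number).
profiles = defaultdict(lambda: {"names": set(), "uploaded_by": set()})

def ingest_contact_upload(uploader: str, contacts: list) -> None:
    """File every uploaded contact under the *contact's* own profile."""
    for c in contacts:
        profile = profiles[c["phone"]]
        profile["names"].add(c["name"])
        profile["uploaded_by"].add(uploader)

# One user's permission grant enriches profiles for everyone in their
# address book.
ingest_contact_upload("518-555-0100", [
    {"name": "A. Client", "phone": "518-555-0199"},
    {"name": "Opposing Counsel", "phone": "212-555-0142"},
])

print(profiles["518-555-0199"])  # data about someone who granted nothing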

To continue with Facebook as an example, this data collection even extends to people who do not use Facebook’s services. The practice of creating consumer profiles for non-users is called shadow profiling.[1] When you voluntarily give Facebook access to your contacts, the company can begin to identify and collect data on people who have never signed up at all.
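
Shadow profiling follows naturally from the same ingestion step. The sketch below, again using hypothetical names and logic rather than any company’s actual code, shows how a contact who matches no registered account can still end up with a stored profile.

```python
# A minimal sketch of shadow profiling, continuing the example above with
# hypothetical data: contacts matching no registered account still get a
# stored profile, which can be merged with a real one if they ever join.

registered_accounts = {"518-555-0100"}  # phone numbers tied to accounts
shadow_profiles = {}

def file_contact(contact: dict) -> None:
    phone = contact["phone"]
    if phone in registered_accounts:
        return  # would enrich the existing user's profile (omitted here)
    # No account exists: create or extend a shadow profile for a non-user.
    shadow = shadow_profiles.setdefault(phone, {"names": set()})
    shadow["names"].add(contact["name"])

file_contact({"name": "A. Client", "phone": "518-555-0199"})
print(shadow_profiles)  # a profile for someone who never agreed to anything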

Through methods like contact chaining, downloading an app on a mobile device may lead to inadvertent client disclosures. Even the Supreme Court has acknowledged the dangers posed by invasive data collection through mobile phones. The “mosaic theory” of Fourth Amendment protection in the Court’s jurisprudence captures the problem that aggregated data poses: “[A] cell phone collects in one place many distinct types of information – an address, a note, a prescription, a bank statement, a video – that reveal much more in combination than any isolated record.”[2] Taken together, a person’s data can paint a comprehensive and nuanced portrait of that person’s life.

Privacy policies of major technology companies provide little comfort on this front and demonstrate that they typically collect and share user data liberally. TikTok’s privacy policy, for example, includes, under “Information we collect automatically”: “We collect information about the device you use to access the Platform, including your IP address, unique device identifiers, model of your device, your mobile carrier, time zone setting, screen resolution, operating system, app and file names and types, keystroke patterns or rhythms, and platform.”[3]

Thus, merely by downloading TikTok’s mobile application, you may allow the company to collect sensitive information: your location, what kind of phone you use, whether you use AT&T or Verizon, even your typing patterns. More disturbing still is TikTok’s collection of file names. What if TikTok collects a file name such as “Client.name_breach.of.contract_complaint”? The name alone reveals that the client is communicating with an attorney, that the attorney has drafted a complaint, and even the nature of the claim. If the action has not yet been filed, none of this is public information, and it could plausibly be considered “embarrassing” and, therefore, confidential under Rule 1.6.
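
To illustrate how much a file name alone exposes, and one way a firm might screen for the problem, here is a short Python sketch. The keyword list and the matter-number naming scheme are illustrative assumptions, not any vendor’s or firm’s actual practice.

```python
# A toy illustration: the file name, not the file's contents, is the leak.
# The keyword pattern and matter-numbering scheme below are hypothetical.

import re

SENSITIVE = re.compile(r"breach|complaint|settlement|merger|subpoena", re.I)

def leaks_client_info(filename: str) -> bool:
    """Flag file names that appear to embed client names or claim types."""
    return bool(SENSITIVE.search(filename))

risky = "Client.name_breach.of.contract_complaint.docx"
safe = "M2021-0042_draft3.docx"  # opaque matter number instead of a name

print(leaks_client_info(risky))  # True: claim type and posture exposed
print(leaks_client_info(safe))   # False: nothing recoverable from the name
```

Nothing about the screen is sophisticated; the point is that an opaque naming convention keeps the metadata silent even when the name itself is collected.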

While most attorneys are unlikely to have the TikTok app on their work phones, it is a good bet that they have a Microsoft application downloaded. In its privacy policy, Microsoft discloses that it processes personal data with methods that “include[] both automated and manual (human) methods of processing” and that “[t]his manual review may be conducted by Microsoft employees or vendors who are working on Microsoft’s behalf.”[4] Because “anonymized” data is often easy to de-anonymize, this raises the concern that other people are reviewing data that can be used to identify clients and to draw inferences about the representation. And because consenting to an app’s privacy policy is a condition of using it, it can be argued that attorneys are knowingly revealing their clients’ confidential information to private companies. Attorneys may be voluntarily and irreversibly surrendering data with no expectation that it will remain private, and with few, if any, mechanisms for retrieving or controlling it once it reaches the app developer.
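
The de-anonymization concern can be made concrete with a toy linkage example. The sketch below uses entirely hypothetical data to show how “anonymized” telemetry of the kind described in the policies above (device model, time zone, carrier) can be re-identified by matching those quasi-identifiers against facts already known about a specific person.

```python
# A toy linkage attack on entirely hypothetical data. Name-stripped
# telemetry is not anonymous when its quasi-identifiers uniquely match
# a known person.

telemetry = [  # "anonymized": no names, but a rich device fingerprint
    {"device": "Pixel 4", "tz": "America/New_York", "carrier": "AT&T",
     "files_seen": ["Client.name_breach.of.contract_complaint"]},
]

directory = [  # publicly knowable facts about a specific person
    {"name": "A. Attorney", "device": "Pixel 4",
     "tz": "America/New_York", "carrier": "AT&T"},
]

QUASI_IDS = ("device", "tz", "carrier")

for record in telemetry:
    for person in directory:
        if all(record[k] == person[k] for k in QUASI_IDS):
            print(f"{person['name']} likely generated: {record['files_seen']}")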

Unfortunately, there is little ethical guidance on this topic. The risks here are distinct from those of using remote storage (such as iCloud or Dropbox) to hold confidential information, but ethics opinions addressing remote storage may still be informative. For example, attorneys may use electronic services whose contents are scanned only by automated systems rather than by people.[5] It is generally acceptable to use unencrypted email to transmit client information even though the messages are reviewed by automated algorithms, but the same practice would be impermissible if the emails were “reviewed by human beings or if the service provider reserved the right to disclose the e-mails or the substance of the communications to third parties without sender permission.”[6] By that rationale, using Microsoft applications under Microsoft’s current privacy policy may violate Rule 1.6 if client information can be reviewed by humans.

This is not an alarmist call to dispense with mobile devices, mobile applications and other forms of digital storage and communication and to return to pen and paper. The legal field obviously cannot stop using Microsoft products and services on the mere speculation that other humans may be viewing some of our data. But raising these issues is the best way to begin creating transparency in a notoriously opaque system. And while companies currently use contact chaining and shadow profiles for relatively benign purposes, attorneys should be aware of the potential hidden risks that data collection, storage, review and sharing pose to client confidentiality. This is especially important because the landscape of threats to client confidentiality changes drastically every year as new techniques, uses and collection methods emerge. Data collected now may be exploited in unimaginable ways a few years from now, and the data we unintentionally give companies about our clients today could create problems for them in the future.

Tyler Rexhouse is a 3L at Albany Law School, where he took NYSBA’s Technology and the Law course. He earned a B.S. in nanoscale engineering and worked in microelectronics manufacturing at GlobalFoundries. His legal interests include emerging privacy and cybersecurity issues in a rapidly changing technological landscape.


[1]. Simon Batt, What Are Facebook Shadow Profiles?, MakeUseOf (Apr. 21, 2020), https://www.makeuseof.com/tag/facebook-shadow-profiles.

[2]. Riley v. California, 134 S. Ct. 2473 (2014).

[3]. TikTok’s U.S. Privacy Policy, https://www.tiktok.com/legal/privacy-policy.

[4]. Microsoft Privacy Statement, https://privacy.microsoft.com/en-us/privacystatement.

[5]. N.Y. State Bar Ass’n Comm. on Prof’l Ethics, Op. 820 (Feb. 28, 2008).

[6]. Id.
