Please Improve the American Data Privacy & Protection Act



Photo: Kevin Dietsch (Getty Images)

After years of fizzled talks and stalled negotiations on a federal data privacy bill, House and Senate committee leaders finally set aside enough of their differences to release a draft of a new bipartisan tech privacy bill this past Friday.

The legislation, called the “American Data Privacy and Protection Act,” is being spearheaded by House Energy and Commerce Chair Frank Pallone (D-N.J.), that committee’s ranking member Cathy McMorris Rodgers (R-Wash.), and Sen. Roger Wicker (R-Miss.), ranking member of the Senate Commerce Committee.

And at least from a brief read of the 10-page summary outlining the bill’s basics, it looks pretty good! Upon a deeper reading, though, the thing is… well, it’s not pretty good, or even remotely good. It carves out exemptions for bad bosses and law enforcement officials, while letting data brokers continue buying and selling vast amounts of our personal data with impunity.

First, the good. There are stipulations barring most tech companies from “unnecessarily collect[ing]” data they don’t need, and requirements that these companies give every user the basic right to “access, correct, delete” data pertaining to them. There are also additional guardrails against siphoning data from the under-17 set, including an explicit ban on companies using minors’ data to target them with ads. Such a rule would be a step up from current child privacy laws, which only ban that practice for users under 13. The FTC would also receive a new slate of responsibilities under this bill, including creating a public registry of data brokers. The bill would mandate a universal “opt-out” feature that—ostensibly with a single click—would grant a given user the “right to opt out of covered data transfers” (more on “covered data” below). The bill would also bar these sorts of companies from using pesky dark patterns to trick users into giving up more data than necessary.

This bill includes clauses noting that it would supersede most of the current nationwide patchwork of privacy laws, and would, eventually, permit individuals to sue companies for manhandling their personal data (as long as a federal prosecutor doesn’t take up the case first).

Advocates in the privacy space were cautiously optimistic. EPIC Deputy Director Caitriona Fitzgerald noted that the bipartisan agreement marked “encouraging progress towards tackling the privacy crisis” that users of any technology at all find themselves in today, while Fight for the Future’s Evan Greer tweeted that the bill seems like a “good faith effort,” noting that “the details will be key.”

As it turns out, there are a lot of details—64 pages of them, to be precise, courtesy of the full text of the draft bill that dropped Friday afternoon. It’s a lot for anyone to get through, but seeing as it’s my job to delve into the details of shitty privacy policies and shitty privacy legislation alike, here’s my initial take on what lawmakers are offering. Though there’s much to chew on, here are my three biggest gripes with this draft as it’s written now.

Gripe #1: How the bill defines “data”

When you think of any tech company violating our collective idea of “privacy,” you’re thinking of data—personally identifiable or anonymized information about you—being mined and shared without your consent. That basic definition lies at the heart of every icky privacy story—from something as historic as the Cambridge Analytica scandal to a therapy startup using its app to extract your most intimate details.

So you’d better hope that any privacy bill worth its salt defines “data” in a way that encompasses the privacy violations people are facing, big and small. Here’s how this bill puts it:

The term “covered data” means information that identifies or is linked or reasonably linkable to an individual or a device that identifies or is linked or reasonably linkable to 1 or more individuals, including derived data and unique identifiers.

Great! Awesome. We can stop there. But, for some reason, we’re not going to. The bill adds exceptions:

The term “covered data” does not include

(i) de-identified data;
(ii) employee data; or
(iii) publicly available information.

Let’s start with the bill’s notion of “de-identified” data—information stripped of the personal identifiers that show whom it was taken from. That phrase, in and of itself, is one that companies in the business of invading your privacy—adtech moguls, data brokers and the like—love slinging around, despite the fact that it’s effectively meaningless.

Even if one company collects personally identifiable data that runs afoul of this bill, that same company could easily “anonymize” that data and pass it off free and clear to the same buyers it always has, who would then have little trouble de-anonymizing it. Because the modern-day data-industrial complex works in networks of countless daisy chains, there isn’t much the bill can do once that first hand-off happens. By the time a dating app, or coupon app, or insert-category-here app has Hoovered up data about your device and sold it, the information’s already been de-and-re-identified.

If you’re looking at the average data broker selling access to the average person’s data, the information on offer has already met the bill’s definition of “de-identified data.” But these brokers, more often than not, work with partners drawing from their own data sources, who hash or otherwise obfuscate that data in any number of ways before passing it off to further partners, where it gets re-identified and easily attributed to you.

The bill will not stop the data collection that already happens, and by the time an app sells your de-identified data to a broker that can combine it with more data sources, it’ll be too late to stop the free flow of your information to anyone who wants it.
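The re-identification problem described above is well understood in practice: stripping names from a dataset does little when the remaining “quasi-identifiers” (ZIP code, birth year, and so on) can be joined against a second, fully identified dataset. Here’s a minimal sketch of that join, using entirely made-up records—the field names and data are hypothetical, purely for illustration:

```python
# Hypothetical sketch: "de-identified" ad records that keep quasi-identifiers
# can be re-identified by joining them against a second broker's data source.

anonymized_ad_data = [  # direct identifiers stripped, quasi-identifiers kept
    {"hashed_id": "a1f3", "zip": "07030", "birth_year": 1990, "app": "dating"},
    {"hashed_id": "b7c2", "zip": "94110", "birth_year": 1985, "app": "coupons"},
]

broker_records = [  # a partner's separately sourced, fully identified data
    {"name": "Jane Doe", "zip": "07030", "birth_year": 1990},
    {"name": "John Roe", "zip": "94110", "birth_year": 1985},
]

def reidentify(anon_rows, known_rows):
    """Attach names back to 'anonymous' rows by matching quasi-identifiers."""
    lookup = {(r["zip"], r["birth_year"]): r["name"] for r in known_rows}
    return [
        {**row, "name": lookup[(row["zip"], row["birth_year"])]}
        for row in anon_rows
        if (row["zip"], row["birth_year"]) in lookup
    ]

for row in reidentify(anonymized_ad_data, broker_records):
    print(f'{row["name"]} used a {row["app"]} app')
```

The point of the sketch is that no sophisticated cryptography is being broken here—a plain dictionary join on two shared attributes is enough to undo the “de-identification” the bill treats as exempting data from coverage.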

Gripe #2: Carveouts for cops

Law enforcement officials are already fond of exploiting every possible legal loophole to collect data on private citizens, with or without consent. This bill, meanwhile, takes some of those loopholes and puts them on paper, laying out exactly what kind of data cops can collect on any of us. Here’s a short rundown:

  • “biometric information,” like your fingerprints, face, or even your particular gait
  • any “known nonconsensual intimate images,” which is a fancy way of saying “pictures involved in revenge porn”
  • any genetic information

While it’s icky enough to see those exemptions codified in a bill that’s supposed to be about protecting individual privacy, it gets worse. Under the bill, data brokers, major tech platforms, or any other entity under the bill’s purview would be required to give users full access to the data that’s been collected on them, along with the right to delete or export that data. This sounds good on paper, until you get to the many, many exemptions:

A covered entity may decline to comply with a request to exercise a right described in subsection (a), in whole or in part, that would

[…]

interfere with law enforcement, judicial proceedings, investigations, or reasonable efforts to guard against, detect, or investigate malicious or unlawful activity, or enforce valid contracts

If the cops tell a tech company that your request to access data about you would mess with their investigation, the tech company can deny you, the bill says. So if you’re worried about a fed working with a third party to get your personal data—the way many of them are wont to do—and you want to ask that third party to delete that data using the rights granted to you by this new bill, you’d be out of luck. That company could plausibly claim any given data is used to “guard against, detect, or investigate malicious or unlawful activity,” since “malicious” can encompass anything law enforcement officials want it to, including, say, peaceful protests. And those protesters would be hard-pressed to do anything about it under this bill.

Gripe #3: Employer surveillance

You might have also noticed from the definition above that the bill explicitly doesn’t apply to “employee data.” The coronavirus pandemic has come with an explosion of people working from home and an accompanying uptick in companies watching their employees at all hours, so this raised my eyebrows a bit. Here’s (some of) how that data is defined:

information relating to an employee (or a relative or beneficiary of such employee) that is necessary for the employer to collect, process, or transfer solely for the purpose of administering benefits to which such employee (or relative or beneficiary of such employee) is entitled on the basis of the employee’s position with that employer.

…Okay? The exception mostly applies to employee benefits like health insurance, it would seem. A wily enough boss, however, could make the case that it’s “necessary” to mine workers’ every keystroke—the way some bosses already do—for the sake of “administering benefits.” That same boss could make an argument for snooping on your emails, recording your mouse movements, or tracking your location, too—and be totally unhampered by this bill.

The bill also notes that the term “employee” covers any “employee, director, officer, staff member, trainee, volunteer, or intern of an employer,” whether paid or not, permanent or temporary. So it’s not just employees whose data goes unprotected here, but volunteers and interns, too.
