The Post-Roe data privacy nightmare has arrived


In addition to the life-and-death dangers now facing people who become pregnant, the end of “Roe v. Wade” and the rise of criminalized abortion stand to usher in a data privacy nightmare that civil liberties advocates have warned about for decades.

 

BY:

Eric De Grasse
Chief Technology Officer
PROJECT COUNSEL MEDIA

 

My personal view: this past weekend I read the entire “Dobbs” decision that overturned “Roe v. Wade”, including the Roberts concurrence and the dissent. I also spent (way too much) time reading comments on social media. According to Justice Alito, who delivered the majority opinion, “Roe v. Wade” was wrongly decided because the US Constitution does not provide a right to abortion; he treated the anti-abortion restrictions of the mid-to-late 19th century as the historical foundation. Instead, the question of abortion’s legality should be returned to the people’s elected representatives. Supporters of the decision say the ruling is not about whether abortion is legal or not – it is strictly a ruling about rights under the Constitution, and hence it is up to the individual states or the federal government to legislate on the issue.

While true, that is a somewhat naive way to think about it. The question was whether abortion was a constitutional right. “Roe v. Wade” established that the right to abortion was a constitutional one, so laws passed to outlaw abortion were unconstitutional. Now that abortion no longer has that protection, states can do what they want. So in the end IT DOES come back to whether it is legal or not: overturning “Roe v. Wade” has made it possible for abortion to be illegal.

And in the gun case decided the previous day, the Court (in an opinion Alito joined) held that carrying modern firearms in public is totally compatible with an 18th-century document talking about a well regulated militia – and that mid-to-late 19th century (and later) gun control laws were too new, too removed from the 1790s, to establish history or precedent. The exact opposite of the historical reasoning in the abortion case. How odd.

 

27 June 2022 (Cannes, France) – We are, all of us, maps of our anonymized data. The many snippets of information you put out onto the internet – each of which, you’re promised, is not traceable to you – combine into an outline of a person who is unmistakably you. When women use health apps (particularly period-tracking apps), alter their buying patterns on shopping apps, log changes in the foods they consume in nutrition apps, use GPS navigation to travel to health clinics or out of state, and search for health care, they put themselves at risk of being tracked by law enforcement or turned in by bounty hunters, who only have to buy a bit of data to cash in on reporting women they suspect of seeking abortions.
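
To make that concrete, here is a toy sketch – all records and attributes invented – of why “anonymized” snippets don’t stay anonymous: each attribute alone matches a crowd, but a handful combined can single out one person.

```python
# Toy illustration of re-identification: each data point alone looks
# anonymous, but combined quasi-identifiers single out one record.
# Every record here is invented.

records = [
    {"zip": "78701", "age_band": "20s", "app": "period-tracker", "clinic_visit": False},
    {"zip": "78701", "age_band": "30s", "app": "period-tracker", "clinic_visit": True},
    {"zip": "78702", "age_band": "20s", "app": "nutrition",      "clinic_visit": True},
    {"zip": "78701", "age_band": "30s", "app": "nutrition",      "clinic_visit": False},
]

def matching(records, **attrs):
    """Return the records consistent with every known attribute."""
    return [r for r in records if all(r[k] == v for k, v in attrs.items())]

# One attribute: still a crowd.
print(len(matching(records, zip="78701")))  # 3 candidates
# Three combined: a single person.
print(len(matching(records, zip="78701", age_band="30s", app="period-tracker")))  # 1
```

The same narrowing works at scale: a zip code, a birth date and an app install base are each harmless alone, and jointly close to a fingerprint.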

As we reported last month after a draft of the Supreme Court’s decision was leaked to Politico, the criminalization of abortion in states across the US requires that people adopt a comprehensive digital privacy strategy to protect themselves from the surveillance state. We suggested this could include steps like switching to an end-to-end encrypted app like Signal, limiting your data footprint by using search engines like DuckDuckGo instead of Google, locking down your privacy settings on your phone, and using a browser extension to block web trackers. For more details on securing your digital privacy, we recommended guides from the Digital Defense Fund and the Electronic Frontier Foundation.

If you plan to protest against the Supreme Court’s decision, check out the Wired magazine guide on how to protest safely. And if you’re seeking information about receiving an abortion in post-Roe America, Wired has also published a list of resources.

I have been coding since I was a teen, and have a degree in computer programming and a degree in computer engineering. I have worked in the software industry my whole adult life, and have been reading and writing about privacy and technology with my boss (and now partner), Greg Bufithis, for the last 15 years. But I spend a lot more time than he does amongst the tubes and wires of the internet and web.

I have seen suggestions in the last few days that those seeking abortions should use burner phones – which you use and discard – which sounds cool but is difficult in practice. I’ve spent enough time in digital security and encryption to know that trying to maintain a burner phone requires using almost everything I know about communications systems and security – and I still cannot be completely sure I have evaded surveillance and identification. Generally, a burner phone number can be traced: all mobile phones and burner apps go through a cellular carrier or virtual number operator, so your identity can be tracked via call logs, data usage, location, and text messages. This can be avoided to a certain extent, but for a novice who doesn’t know how networks operate, it will be difficult.

NOTE: there is, of course, a way. Using a phone with a serial number not identified to you. Difficult but not impossible and beyond the scope of this post.
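
For the curious, here is a toy sketch – invented data, hypothetical carrier tower logs – of the co-location analysis that makes burner phones linkable: a burner that travels with your everyday phone shares its movement pattern in the carrier’s records.

```python
# Toy sketch of why burner phones are linkable: carriers log which cell
# tower each phone uses over time, and a burner carried alongside your
# everyday phone shares its movement pattern. All data here is invented.

# (phone_id, hour, tower_id) observations from hypothetical carrier logs
sightings = [
    ("personal",  9, "T1"), ("personal",  12, "T4"), ("personal",  18, "T7"),
    ("burner",    9, "T1"), ("burner",    12, "T4"), ("burner",    18, "T7"),
    ("colleague", 9, "T1"), ("colleague", 12, "T2"), ("colleague", 18, "T9"),
]

def colocation_score(a: str, b: str) -> int:
    """Count the (hour, tower) sightings two phones have in common."""
    def where(p):
        return {(h, t) for pid, h, t in sightings if pid == p}
    return len(where(a) & where(b))

print(colocation_score("burner", "personal"))   # 3 - moves exactly like you
print(colocation_score("burner", "colleague"))  # 1 - occasional overlap only
```

Real analyses run over weeks of records and millions of phones, but the principle is the same: the phone that is always where you are, is you.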

How about leaving your phone behind? Let me just say, good luck. Even if you don’t carry a digital device and only use cash, commercially available biometric databases can carry out facial recognition at scale. Clearview AI says it has more than 10 billion images of people taken from social media and news articles that it sells to law enforcement and private entities. Given the ubiquity of cameras, it will soon be difficult to walk anywhere without being algorithmically recognized. Even a mask is no barrier. Algorithms can recognize people from other attributes as well. In China, the police have employed “gait recognition” – using artificial intelligence to identify people by the way they walk and by body features other than their face. And not just the Chinese. U.S. and UK intelligence services use the same gait technology software.

And a note on Facebook. It doesn’t just collect data on its two billion users, and it doesn’t collect it only from what those people do while using its products. Billions of web pages (including those of The New York Times) and mobile apps contain Facebook code – tracking pixels – that collect detailed data and communicate it back to Facebook. Facebook tries to match this to existing users, but keeps it even for nonusers who connect with Facebook users or are mentioned by them, creating what are called “shadow profiles”. Google’s tracking, too, is all over the web and in many apps through its ubiquitous ad products. “Just don’t use it” doesn’t get people very far. As we noted in our monograph on data privacy last year, Facebook users probably interact with, or have their data sent (unknowingly) to, an average of 1,700 data brokers.
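
For readers wondering what a tracking pixel actually leaks, here is a simplified sketch – not Facebook’s actual code, just the mechanism, with all names and values invented. The page embeds a tiny third-party image; the browser’s automatic request for that image hands the tracker your reading history and a persistent identifier.

```python
# Minimal sketch of the tracking-pixel mechanism: a page embeds a 1x1
# image hosted by the tracker, and the browser's request for that image
# leaks the page URL (Referer header) and any tracker cookie.
# Header names are real HTTP headers; the values are invented.

def log_pixel_request(headers: dict) -> dict:
    """What a tracker can record from a single image request."""
    return {
        "page_visited": headers.get("Referer"),  # which article you read
        "visitor_id": headers.get("Cookie"),     # links visits into one profile
        "device": headers.get("User-Agent"),     # browser/OS fingerprint hint
    }

# Headers the browser sends automatically when the page loads:
request_headers = {
    "Referer": "https://news.example.com/article-about-abortion-access",
    "Cookie": "tracker_id=abc123",
    "User-Agent": "Mozilla/5.0 ...",
}
profile_event = log_pixel_request(request_headers)
print(profile_event["page_visited"])
```

Note that no click or login is required: merely loading the page fires the request, which is why blocking web trackers at the browser level matters.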

Now let’s get to the truly scary stuff.

Increasingly, artificial intelligence can use surveillance data to infer things that aren’t even whispered. We all know the story from over a decade ago (it has been that long), when The New York Times reported on a father whose teenage daughter suddenly started getting promotional materials for baby items from Target. The angry dad went to a Target store and got an apology from the manager, only to learn after confronting his daughter that … she was pregnant. Maybe it was something overt, like the girl purchasing a pregnancy test. But increasingly, such predictions are made by analyzing big data sets with algorithms (yep, “machine learning” again) that can arrive at conclusions about things that aren’t explicitly in the data.
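
A toy version of that kind of scoring – signals and weights entirely invented, far simpler than anything a retailer runs – shows how weak, innocuous-looking signals combine into a confident prediction:

```python
import math

# Toy "pregnancy score": no single purchase is revealing, but a weighted
# combination of weak signals is. The signals, weights and bias below are
# invented for illustration only.

SIGNAL_WEIGHTS = {
    "unscented_lotion": 0.9,
    "prenatal_vitamins": 1.8,
    "large_tote_bag": 0.4,
    "cotton_balls": 0.5,
}

def pregnancy_score(purchases: set, bias: float = -2.0) -> float:
    """Logistic score in (0, 1) from a basket of weak behavioral signals."""
    z = bias + sum(SIGNAL_WEIGHTS.get(item, 0.0) for item in purchases)
    return 1.0 / (1.0 + math.exp(-z))

print(round(pregnancy_score({"large_tote_bag"}), 2))   # low: one weak signal
print(round(pregnancy_score({"unscented_lotion",
                             "prenatal_vitamins",
                             "cotton_balls"}), 2))     # high: signals compound
```

Real systems learn the weights from millions of purchase histories rather than hand-coding them, but the logic is the same: each item shifts the odds a little, and the shifts add up.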

For example, algorithmic interpretations of Instagram posts can effectively predict a person’s future depressive episodes — performing better than humans assessing the same posts. Similar results have been found for predicting future manic episodes and detecting suicidal ideation, among many other examples. Such predictive systems are already in widespread use, including for hiring, sales, political targeting, education, medicine and more.

Given the many changes pregnancy engenders even before a woman knows about it – in everything from sleep patterns to diet to fatigue to mood – it’s not surprising that an algorithm might detect which women are likely to be pregnant. And listen up: such lists are already collected and traded. That data can be purchased by law enforcement agencies or by activists intent on tracking possible abortions – which is what makes the Texas abortion law so scary, since it has “deputized” ordinary citizens to serve as bounty hunters, encouraging what amounts to “vigilante litigation”. Many such algorithmic inferences are statistical, not necessarily individual, but they can narrow down the list of, well, suspects.

Simply put, our digital infrastructure has become the infrastructure of authoritarianism. Yeah, such surveillance is usually undertaken for commercial purposes, and we used to rely on limits to what governments would want to do with it. But I always thought: “we have built it, and by God they will come for it”. Criminalization of abortion may well be the first wide-scale test of this, but even if that doesn’t come to pass, we’re just biding our time.

And our existing legal protections (in the U.S. especially) are effectively outdated. For example, U.S. law enforcement can obtain emails, pictures or any data you stored in the cloud without a warrant, and without notifying you, so long as it is older than six months. This is because when the initial law on email privacy was drafted in 1986, online storage – what we now call the cloud – was very expensive, and people downloaded or deleted their email regularly, so anything older than six months was considered abandoned. More than three decades later, it simply means years of personal digital history – which didn’t exist when the law was drafted – are up for grabs. This doesn’t mean we should snuff out digital technology or advances in algorithms. Even if it were possible, it wouldn’t be desirable. But “Houston, we have a problem”.

The hardest part is always going to be the private selling, trading and merging of personal data. The players are numerous, and most are unknown or very difficult to identify. And even the “good” folks face this problem. If you have been following Greg’s long-running series on COVID, you know that researchers have tried to invent privacy-preserving methods for analyzing data sets when merging them is in the public interest but the underlying data is sensitive – as when health officials were (and still are) tracking COVID outbreaks and want to merge data from multiple hospitals or multiple jurisdictions. These techniques allow computation but make it hard – though not impossible – to identify individual records, as we are now learning from COVID data “leaks”.
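
One widely used ingredient of such privacy-preserving analysis is the Laplace mechanism from differential privacy: publish noisy aggregate counts instead of raw records. A minimal sketch follows – the epsilon value and the regional counts are illustrative, not a production design.

```python
import random

# Laplace mechanism sketch: release aggregate tallies with calibrated
# noise so no single underlying record is exposed. The privacy budget
# (epsilon) and the counts below are invented for illustration.

def noisy_count(true_count: int, epsilon: float = 0.5) -> float:
    """Laplace noise for a counting query (sensitivity 1).
    The difference of two exponentials is Laplace-distributed."""
    scale = 1.0 / epsilon
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_count + noise

random.seed(0)  # reproducible demo only; never seed a real release
raw = {"region_a": 412, "region_b": 37}
released = {k: round(noisy_count(v), 1) for k, v in raw.items()}
print(released)  # close to the truth, but deniable at the individual level
```

Smaller epsilon means more noise and stronger privacy; the catch, as the COVID leaks show, is that repeated or poorly budgeted releases can still leak information in aggregate.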

And Apple? That’s just privacy theatre. When Apple changed a default option from “track me” to “do not track me” on its phones, few people chose to be tracked – not realizing Apple was still collecting their data and using it.

And most people who do accept tracking probably don’t realize how much privacy they’re giving up, and what this kind of data can reveal. Most location collectors get their data from the most ordinary apps – weather, games, clocks and the like – which often bury, in vague terms deep in their fine print, the fact that they will share the data with others. It is a treasure trove for data brokers. Under these conditions, requiring people to click “I accept” on lengthy legalese to access functions that have become integral to modern life is a masquerade, not informed consent.

And one amusing point here in Cannes this week: in an off-the-record chat with a member of the U.S. military intel community (yes, everybody comes to this event now) who I have known for a very long time, he noted the phones of Senators and Congressmen can be easily tracked, as well as those of the Supreme Court justices. They do have access to “special phones” but rarely use them. And speaking of “data brokers”. He told me Uber drivers often boast on private chat lines (easily accessible so not so “private”) about knowing of one-night stands in Washington D.C. Hmmm … bit of a threat, to be honest.

As noted on numerous social media sites over the weekend, ramped-up surveillance of women’s clinics is now being developed in states adjoining ones with abortion bans. It’s not implausible that the legal landscape will allow the punishment of acts that lead to an abortion even if that abortion is obtained elsewhere – and necessarily, some of those acts will happen in the state where it is criminalized.

Ah, the technology. Most of it is simply not visible to the eye: you don’t see all the servers and data brokers your phone is constantly pinging, nor can you possibly keep up with all of the latest AI, machine learning and surveillance technology – even if you attend all the industry conferences, as we do. I think the overturning of Roe will be a brutal technology wake-up call for many.

One final note from Jackie Milbane who is our resident-reporter in D.C. and covers the U.S. Congress and the Supreme Court:

Fears that state prosecutors would engage in sweeping surveillance of pregnant people’s metadata – like tracking their period apps – received substantial media coverage in the wake of the Roe news. But I have scanned the handful of cases where state prosecutors have charged people with crimes related to abortion based on digital evidence, and prosecutors have relied on more concrete evidence like search histories, text messages and emails. Yes, there’s a massive amount of digital evidence that could be used to infer circumstantially that an abortion happened, and I think that’s what a lot of people are fearing and paying attention to. And digital surveillance will certainly develop. I look forward to an updated edition of a study run about five years ago that detailed abortion-related prosecutions and digital surveillance.

But pregnant people seeking an abortion should be more concerned about explicitly admitting in a text that they wanted to terminate a pregnancy – especially in states where laws could equate having an abortion with homicide or feticide – and handing their phones to police who might view such an admission as intent to commit a crime. What can prosecutors deduce from a hospital medical record showing you had a miscarriage? They can’t get anywhere with that. But when they have access to what you were thinking the day of, that’s gold for them. And people who may become pregnant should already consider taking steps to secure their phones and online communications.

Overnight there is going to be a radical ground shift – from “what I did yesterday was legal” to “what I’m doing tomorrow is not” – so operationally, what do I change today? It’s a really difficult swing.
