Autonomy, Activism, and AI

Hi Folks,

School’s out, the heat outside is rising, and smoke periodically chokes out the skies – but privacy never rests. It’s June, so happy Pride Month to those who celebrate! This month honors personal expression, identity, and self-determination. Those values require autonomy, and autonomy and privacy are inextricably linked. At hacker conventions you might come across stickers that read “there can be no digital privacy without bodily autonomy.” I recently heard a technologist, Daly Barnett, point out that the maxim goes both ways: without bodily autonomy, there can be no digital privacy either.

Event: EFA SCOTUS 2023 in Review

June isn’t just Pride Month; it’s also the end of the current session of the Supreme Court of the United States (SCOTUS). This has been a banner year for debate about privacy, civil rights, and law in the face of emerging technology. I am pleased to invite you all to a presentation by David Greene, Civil Liberties Director of the Electronic Frontier Foundation, who will review and discuss some of SCOTUS’s most important decisions this year.

The event will run from 4pm to 5:30pm over Zoom. Join at! PLA participation in this event is made possible by our membership in the Electronic Frontier Alliance, so check out some of our allies 🤝

Resource: Safety Tips for Trans/Gender Non-Conforming Individuals

This Pride Month, stay safe out there, both online and in the real world. If you’re not sure where to start, check out this two-page handout from the good people at t4tech:

Privacy News

To the Poor NSA Agent Watching my Internet History

It’s a classic joke at this point, right? The US government knows all, surveils all, sponsors endless spies and other shadowy figures that make for great movie characters. To be sure, there is some truth to that. But why bother with spying when you can get all that data more cheaply, more easily, and more conveniently, free of any pesky Fourth Amendment rights to invoke in court? Enter data brokers: because Meta and Google aren’t just selling to each other. According to a recent report from the Office of the Director of National Intelligence, US intelligence agencies buy a massive amount of commercially available private data, and such purchases have come to supplant a sizeable portion of their own surveillance efforts. What’s more worrying, these purchases completely bypass the normal limitations on government search and seizure afforded by the Fourth Amendment, and there is virtually no oversight over how the data is used once it has been collected.

Is Software Entitled to a Bounty if it Reports an Abortion to Texas?

Roe v. Wade upheld abortion rights on the basis of a constitutional right to privacy in medical care. Now that it has been overturned, states have begun to outlaw abortion, and digital privacy has become an even more essential element of reproductive health. Some states have begun to pass “sanctuary” laws allowing travelers to receive abortions and other reproductive care within their borders; some doctors perform procedures in secret, what one clinician calls “underground medicine.” But the discretion surrounding these procedures is undermined by data sharing common among medical professionals. Simply put, the disadvantage of every doctor being able to access your medical record if you seek treatment… is that any doctor can access your medical record if you seek treatment. With hospitals and private practices alike set up to use modern, connected medical record management systems, continued provision of abortion care will require technological adjustments as well as political ones. To say nothing of the vulnerabilities in personal health apps and online surveillance!

Thankfully, there is also some good news on the health privacy front! The Office for Civil Rights has issued guidance to hospitals limiting the use of tracking technology on their websites. Every step is a victory.

Resource: Decide the Future

After reading all that privacy news, do you wish there was something you could do? There is! On a personal level, you can use safe practices, tools like VPNs and ad blockers, and a modicum of caution. On a societal level, you can try to hold elected officials accountable for their stances on privacy and surveillance. Decide the Future helps do just that: it grades US senators and congresspeople on the basis of their past decisions, and it gives you an easy way to sort, review the basis for a grade, and even tweet a topical exhortation at the politicians of your choice.

Non-Privacy News

There is no shortage of privacy news (is there ever?), but a discussion of modern digital data protection would not be complete without a nod to AI. Besides, it’s the summer, and student organizations are formally off-duty 😄.

Because New Tech is Shiny, and Shiny Makes Profits

Quoth Voltaire, “This body which was called, and which still calls itself, the Holy Roman Empire was in no way holy, nor Roman, nor an empire.” Now we have an empire that calls itself Artificial Intelligence. To be clear, I like computers, and I like math, and I both like and have developed AIs. But today’s AI is not really that distinct from the “AI” that has been controlling non-player video game characters since Pong. Intelligence, it ain’t.

Is this news? Not really. But in the wake of privacy scandals, intellectual property rights violations, regulatory clashes, scams and leaks and rounds of investor funding… maybe ask whether all that regulatory deference and market capital is the result of the technology, or just some really good PR. Maybe ask some of the questions people should have asked about Silicon Valley Bank before cryptocurrency made it a household name. Check out this article by Cory Doctorow, because he writes better than I do 😛

Algorithmic Catfish

So what if corporations harvest data and cram it into an AI? It’s not like someone’s shopping habits are going to be what pushes SkyNet over the edge. But, as AI chatbots become more believable and sophisticated, a world of scams and exploitation opens up that was previously practical only for grifters of the human variety. And it goes both ways: unethically mined personal data can be used to augment a chatbot, and a chatbot can be used to unethically mine and weaponize not just personal data, but personal connection.

You might have seen ads for Replika on the subway or online, or ads for any one of “her” competitors. Replika is an AI chatbot intended to mimic a “friend” (read: if ChatGPT started an OnlyFans). The business model was a classic: mine user data and extract steep subscription fees, because people are willing to sign away a great deal with very limited forethought when they think it will please an attractive member of their preferred sex. And humans will form emotional bonds with anything – strap googly eyes on a Post-It and see how long it takes someone to give the thing a name. But, as an online service ever under pressure to sanitize content (think of the children), Replika decided to roll back its offerings to a strictly PG approach, incidentally deleting all of the “personalities” users had met and come to care for. Now, are there allegations to be made of good old fraud and bad business practices? Sure. But, whether or not you personally would want to make friends with a chatbot… there’s a whole new world of exploitation out there, and it is already claiming victims.

As always, feel free to reach out to the Privacy Law Association E-Board. June can be fabulous, but it can also be hard; we’re happy to answer questions about privacy, talk about AI, hear about new resources, or just hear about your summer.

Kyle Hunt
President, Privacy Law Association
