Concerns are growing around privacy and government surveillance in today's hyper-connected world. Technology is smarter and faster than ever — and so are government strategies for listening in. As a lawyer for the ACLU, Jennifer Granick (TED Talk: How the US government spies on people who protest — including you) works to demystify the murky legal landscape of privacy and civil rights, protecting our right to privacy against government and private interests. We spoke with her about the battle against government surveillance, how you can keep your data safe and why legal transparency — and legal action — is vital.
In your talk at TEDxStanford, you detail some of the history and methods of government surveillance in the United States. Can you elaborate on how these methods have evolved as technology has advanced?
As Supreme Court Chief Justice John Roberts put it, it's the difference between "a ride on horseback [and] a flight to the moon." The amount of information that's available about us is exponentially greater; the ease of accessing and analyzing it, because of big data tools, storage and machine searching, is categorically different. At the same time, the laws that are intended to protect our privacy have been downgraded repeatedly, most recently in the name of the War on Terror. Everything is bigger; there's just so much more out there.
In your talk, you mentioned that Section 702 of the FISA amendments (which allows US government agencies to surveil "foreign terrorist threats") was set to expire in 2017. What kind of impact will that have on the landscape of surveillance?
There was a long political battle about 702 and trying to amend it. What ended up happening is that Congress just reauthorized it, passing it as part of a larger bill with no real reform. The movement to try to do something about it utterly failed. What it means is that right now, with more confidence than ever before, the intelligence community and [its] agencies can gather information in the name of targeting foreigners and store all of that information. So, they can search through conversations we're having with people overseas. The news that's happened since then shows that there are still mistakes and problems with the way these intelligence agencies are handling the information, and that they're regularly breaking the rules. There was a recent story about the FBI violating the 702 rules. There's little accountability for complying with the law; weak as it is, compliance is basically not a concern.
What role do tech companies like Amazon and Facebook play in perpetuating these surveillance efforts?
Companies don't want to comply with a whole bunch of legal processes, but when they do, they want it to be clear what they're supposed to do, and they don't want any liability for it. The companies have made some comments about wanting to restrain government surveillance to legitimate purposes in order to reassure their non-American users, and they've pushed for some sort of clarity and regularity in how surveillance is going to happen. They came out in favor of a more controlled exercise of 702, but no real reform. They also supported the CLOUD Act, a recent law that basically enables foreign governments to access information stored here in the US without meeting the higher standard of US legal process. They're not consistently civil libertarians or privacy advocates.
If you care about any political issue — whether it’s tax reform or Black Lives Matter — we need to ensure these people can operate freely in the political world.
Facial recognition technology like Amazon’s “Rekognition” is being used by law enforcement across the country. What are the concerns and possible consequences around the use of this technology?
Face identification connected to surveillance cameras is particularly dystopian, but the ACLU of Northern California's test of Rekognition shows that even the more pedestrian uses of the technology are dangerous. In tests, the software incorrectly identified 28 members of Congress as people who have been arrested for a crime and disproportionately flagged members of the Congressional Black Caucus. The problem is both that the tool is inaccurate and discriminatory, and also that it gives unprecedented power to police.
In an always-connected world with smart tech in our homes, cars and pockets, how can we prepare for and avoid intrusive surveillance?
Number one: use encryption. Encrypting your data is getting easier and easier, and there are communications services out there that protect your communications. iMessage is one for iPhone users. There's WhatsApp, too. I use Signal, which is a text messaging program. For many of us, one of the biggest challenges isn't necessarily the government; it's hackers, too, so always turn on multi-factor authentication. That way, somebody can't just bust into your account with a password; they also need some other factor, like a code from your phone or a hardware token. That's a good thing to do, and it's actually very little additional work.
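The codes behind many of those second factors come from a published standard, the time-based one-time password algorithm (TOTP, RFC 6238): the service and your device share a secret, and each independently derives a short code from that secret and the current time, so a stolen password alone isn't enough. A minimal sketch in Python, using only the standard library (the secret and timestamps below are the RFC's published test values, not real credentials):

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, at_time: float, step: int = 30, digits: int = 6) -> str:
    """Compute a time-based one-time password (RFC 6238, HMAC-SHA-1)."""
    # The moving factor is the number of 30-second steps since the epoch.
    counter = int(at_time // step)
    msg = struct.pack(">Q", counter)  # counter as 8-byte big-endian
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    # Dynamic truncation: low nibble of the last byte picks a 4-byte window.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: secret "12345678901234567890" at time 59
print(totp(b"12345678901234567890", 59, digits=8))  # -> 94287082
```

Because the code is a function of the current time window, it expires within seconds, which is why a password captured by a phisher or leaked in a breach isn't enough on its own.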
This idea that you can be manipulated into seeing, believing, buying and thinking things that aren't what you normally would do — and nobody knows about it because nobody knows that what I see is different from what you see — is scary.
Don't use internet-connected technology that doesn't need to be connected. If you don't need that internet-connected baby thermometer, don't buy it. It's going to send your data to some company, and that company is going to sell it to marketers, and it'll be a source of access for law enforcement. In particular, I don't like those home assistants like the Alexa or Google Home because I think that eventually, those machines can be used to eavesdrop on people. Why would we invite a ready-made surveillance device into our home?
Everybody likes new, fun stuff — I know lots of people who have those in-home assistants. I have a cell phone, I love the internet and I use Facebook. I think one of the things people really should do is push for better laws. That’s what the law is there for. It’s supposed to protect us and allow us to participate in the modern economy.
At the end of your talk, you close by saying we need to demand transparency. What does transparency mean to you, and how can we reach it?
There’s so much we don’t know about surveillance right now. In the criminal context, we don’t know how many particular surveillance orders are issued. We don’t know what kind of information they’re getting with them. We don’t know what they’re forcing companies to do. We don’t know if they’re potentially subverting security measures in order to facilitate spying on us. It’s much worse in the intelligence context where we have this FISA court that operates and issues opinions behind closed doors. They’re supposed to be publishing these opinions, but we very rarely see them. Any new and novel interpretations of law are meant to be published, but ever since that edict went into law, we haven’t had any FISA court opinions declassified. We find out way after the fact about things, like the FBI’s most recent violation of Section 702 rules, which meant agents had access to data and information they weren’t supposed to see. We find out about these problems years later. There’s just so much that we don’t know.
Transparency is the first step, but it's not an end unto itself. There's a Privacy and Civil Liberties Oversight Board; it has only recently had members confirmed, and now there's a quorum again. For a long time, that oversight board, which is supposed to provide some narrow review of intelligence programs, wasn't even in operation. We're behind. Only a few senators and representatives care, because the population isn't coming forward and saying, "This is really important to us." But they should be.
There’s no more obvious reason why you should care about surveillance than the Trump administration. In the past, people who have been blasé about surveillance had an assumption that if you weren’t doing anything wrong then you didn’t have anything to worry about — police would follow the rule of law, and everybody was operating with good faith. But today, you have the extremity of the immigration situation; today, you have the way that the Trump administration is punishing people who are coming to this country by kidnapping their children. There’s rampant sexism and anti-Semitism and racism, and this idea that people are “Black identity extremists” who should be surveilled — which just means the government is surveilling civil rights activists and communities of color. And so there’s this situation where this immense amount of technical power is in the hands of people who are operating in bad faith, based on the most base of motives.
What does it mean that all this information has been gathered and can be accessed, manipulated and sold? And how do you speak to those who aren’t concerned and believe they have nothing to hide?
There are two things. One is that everybody has committed crimes. The amount of behavior that's covered by criminal laws is huge; whether it's smoking pot or lying on your taxes, there are just so many ways that you can transgress the law. Nobody is 100 percent clean. If somebody wanted to go after you and they knew everything about you, there would be ample information to do that. It's not just criminal stuff; it's foolish things you've said in the past or people you were friends with who turned out to be crooked. There are all kinds of things that can be used to tarnish your reputation with your employer or your friends or your spouse.
The second thing I tell people is that it’s not about you. You may be of no interest, but there are people out there who are challenging the status quo, and these people stick out in order to try to make change. And the powers that be don’t necessarily want change. They like the way things are because they’re the ones in control. So if you care about any political issue — whether it’s tax reform or Black Lives Matter — we need to ensure these people can operate freely in the political world. The ability to do that is greatly reduced if someone has to be afraid that the police are going to come after their undocumented relatives. People need to be concerned about information gathering on the private side because that’s one of the main avenues that information gets to law enforcement. There’s so much incentive on the private side to collect it. That incentive is based on the advertising model: the more that companies know about us, the more targeted the advertising can be and the more money they make.
The real thing to start worrying about is what we’re seeing in China, where they’re using face-surveillance to identify people, follow them out on the street and assign them a social score.
Once you have that much information, people can be manipulated against their best interest. [Social media] sites are designed to be addictive, and in order to keep people clicking, they keep showing you more and more outrageous stuff. This totally skews your sense of the world and skews your facts so you don’t know what’s actually going on in the world. It makes you associate only with like-minded people and puts you into this filter bubble. This idea that you can be manipulated into seeing, believing, buying and thinking things that aren’t what you normally would do — and nobody knows about it because nobody knows that what I see is different from what you see — is scary.
Once you have that data, there are sociological and systemic problems, because certain decisions get made based on that data: who's going to qualify for welfare benefits, what housing ads are shown to me based on my race, what job listings are shown to me based on my gender. These are other ways in which data can instantiate prejudice or discrimination. It's not like there wasn't prejudice or discrimination before big data; the fear is that it's less obvious that it's happening, and that makes it much more powerful.
What does the future of surveillance and privacy look like? Is something like Google’s Smart City neighborhood in Toronto going to be the norm?
I think that’s one possible outcome — that not just our communications data but data about our bodies, homes, relationships, shopping and more will be collected and will interact with each other far more than they are now. I think that’s definitely a trend. The real thing to start worrying about is what we’re seeing in China, where they’re using face-surveillance to identify people, follow them out on the street and assign them a social score, which is made up of factors like their law-abidingness, their job and their financials. This score that apparently dictates whether or not they’re good citizens follows them everywhere, enabling government and private entities to discriminate and make decisions about these people based on their rankings. That’s a really terrifying situation to have people be labeled and treated accordingly. That’s very Brave New World.