Pushing Back on Police Surveillance: 2021 Year in Review

A year after the police murder of George Floyd, Black-led protests against police violence continue, as does resistance to police departments across the country growing their surveillance toolbelts and unnecessarily amassing troves of personal data. EFF stands with protesters against police abuse, and stands up for the core rights to privacy, speech, and protest threatened by police surveillance. This year we have gone to court to hold police accountable, endorsed regulatory and defunding proposals, and published records shedding light on police surveillance.

Surveillance in San Francisco

The San Francisco Board of Supervisors kicked off the year by voting unanimously to require special business districts—such as the Union Square Business Improvement District (USBID)—to disclose any new surveillance plans to the Board. The Board acted in the wake of an EFF investigation and lawsuit that exposed the San Francisco Police Department’s (SFPD) spying on last year’s Black-led protests against police violence. The SFPD monitored the demonstrations by using the USBID’s camera network.

EFF welcomes the Board’s small step toward transparency, but the city continues to defend the SFPD’s unlawful surveillance. In October 2020, EFF sued the SFPD on behalf of three activists who helped organize last year’s protests in the city. This fall, EFF asked the court to rule that the SFPD violated the city’s landmark surveillance technology ordinance and to prohibit the SFPD from using the USBID cameras without prior Board approval. While the SFPD initially claimed it did not monitor the camera feed, an SFPD officer admitted during a deposition that she repeatedly looked at the camera feed during the eight days that the department had access.

Privacy on the Road

EFF is also in court to protect your privacy from Automated License Plate Readers (ALPRs), which police use to amass large databases of location and other sensitive information on millions of drivers. In October, we filed a lawsuit on behalf of immigrant rights activists to stop the Marin County Sheriff in California from sharing its ALPR data with over 400 out-of-state agencies and 18 federal agencies, including Immigration and Customs Enforcement (ICE) and Customs and Border Protection (CBP), data sharing that violates two state laws.

Earlier in the year, EFF released Data Driven 2: California Dragnet, a new public records collection and data set that shines a light on police ALPR use across California. In 2019 alone, just 82 agencies collected more than 1 billion license plate scans using ALPRs. Yet 99.9% of this surveillance data was not actively related to an investigation when it was collected. In Tiburon and Sausalito in Northern California, and Beverly Hills and Laguna Beach in Southern California, an average vehicle will be scanned by ALPRs every few miles it drives. EFF supports state legislation that imposes shorter retention periods on ALPR data, requires annual audits of searches of the data, and strengthens other regulations.
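
To put that figure in perspective, here is a minimal sketch of the underlying hit-rate arithmetic; the counts below are illustrative assumptions rather than values pulled from the Data Driven 2 records:

```python
# Hypothetical hit-rate arithmetic for ALPR scan data.
# Both counts are illustrative assumptions, not figures from Data Driven 2.
total_scans = 1_000_000_000   # roughly the 2019 statewide scan volume cited above
hot_list_hits = 500_000       # assumed number of scans that matched a hot list

hit_rate = hot_list_hits / total_scans
print(f"Scans tied to an active investigation: {hit_rate:.3%}")
print(f"Scans with no investigative connection: {1 - hit_rate:.3%}")
```

With half a million hits against a billion scans, only 0.05% of the data would relate to an active investigation; the other 99.95% is location information about drivers suspected of nothing.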

The Police Look Down on You

In a major victory this summer, the Fourth Circuit Court of Appeals blocked Baltimore from using data from its aerial surveillance of people’s movements throughout the city. During a six-month pilot, the surveillance planes captured an estimated 12 hours of coverage of 90 percent of the city every day. While Baltimore’s spending board ended the surveillance contract early, the city retained some of the location data and asserted a right to search it.

Joined by several other organizations, EFF filed an amicus brief with the court arguing that Baltimore’s detailed tracking of the population of an entire city violated the Fourth Amendment and disparately impacted communities of color. The court agreed and Chief Judge Gregory, in a powerful concurring opinion, emphasized that because Black communities “are over-surveilled, they tend to be over-policed, resulting in inflated arrest rates and increased exposure to incidents of police violence.”

EFF also joined other fights against aerial surveillance. Early this year, St. Louis rejected an aerial tracking program modeled on Baltimore's after an education campaign and pressure from EFF and several local community organizations. We also endorsed Rep. Ayanna Pressley's legislation to greatly curtail the amount of dangerous military equipment, including surveillance drones, that the Department of Defense can transfer to local and state law enforcement agencies. And in November, the ACLU of Northern California published records and footage of the California Highway Patrol's extensive aerial video surveillance of last year's Black-led protests against police violence.

The Surveillance Grab-bag

EFF pushed back this year on other police surveillance tools too. At the beginning of the year, Oakland's City Council voted unanimously to strengthen its Surveillance and Community Safety Ordinance by prohibiting government use of "predictive policing" algorithms and a range of biometric surveillance technologies, such as voice recognition. EFF endorsed a Maine bill that would defund the state's "fusion center," which coordinates surveillance and information sharing between federal law enforcement, intelligence agencies, and local and state police, and often threatens people's free speech and right to protest. We also continued calling on Google to stand up for its users against geofence warrants and to be more transparent about the warrants it receives and how it handles them. During racial justice protests in Kenosha, Wisconsin, federal police used at least 12 geofence warrants to force Google to hand over data on people who were in the vicinity of—but potentially as far as a football field away from—property damage incidents.

We made a lot of progress this year to protect your privacy and free expression from police surveillance, but the fight continues. As the new year approaches, the coming weeks are an opportune time to contact your local representatives. Ask them to stand with you and your neighbors in the fight against government surveillance.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2021.

Related Cases: Williams v. San Francisco
Mukund Rathi

2021 Year in Review

2021 ended up being a year in which we dug into our new realities of distributed work and ever-changing COVID news. At the same time, news continued to come fast and furious, with the events of one week often obliterating memories of the week before. So it's helpful for all of us to look back at the last year and remember just what we accomplished. Looking at what we did—at what you, our supporters, helped us do—we can be confident that whatever changes continue to roll in, we will continue our vital work.

We're thankful for our roughly 38,000 members, who not only support us financially but spring into action whenever they're needed. That support allowed us to build on what we did in 2020 and to meet the new challenges brought by this new era.

Our biggest action this year was a powerful pushback against Apple when it announced that it was reneging on its promise to provide us with secure devices. In the summer, Apple announced it would be scanning some images on our devices in a poorly conceived strategy aimed at child safety. With 25,000 of your signatures, we delivered a single, simple message to Apple: don't scan our phones. We sponsored a protest at Apple stores and an alternative event to make sure that Apple heard from those, especially children, who have first-hand experience with the real dangers of device insecurity. We even flew a plane over Apple's headquarters during its major product launch to make sure its employees and executives got our message. Our message was received. Apple first delayed its plans and then agreed not to scan iMessage and send notifications to parents. This was a first victory, but a big one, and it was only made possible by your contributions. Of course, we'll keep pushing until all your devices are secure and answer only to you.

We also stood up with parents and students against the increased surveillance of students. This year, Dartmouth accused medical students of cheating based on a flawed understanding of how technology works. Our experts dug into the data and showed that what looked like cheating was just applications working as they should. Dartmouth first doubled down, even instituting a policy preventing students from speaking out on social media, but news coverage fueled by EFF's technical and activism work finally convinced the school to admit its error and drop its allegations. We also brought litigation to protect a student who faced copyright claims after demonstrating the extent of surveillance conducted by student surveillance company Proctorio.

We also continued our work to hold police responsible for illegally spying on protesters. In the fall, EFF and the ACLU, representing three activists of color, asked the court to declare without a trial that the San Francisco Police Department had violated the law. Documents and testimony gathered by EFF in 2021 proved that, as our 2020 investigation had theorized, the police had accessed a local business district's security cameras during the 2020 Black Lives Matter protests. San Francisco law prohibits the use of any surveillance tech by city departments like the police without approval from the Board of Supervisors. By accessing these cameras without that permission, the police violated the law. EFF also partnered with the ACLU to challenge Marin County for sharing Automated License Plate Reader information with ICE and CBP.

2021 wasn’t entirely a year about surveillance. In March, EFF urged the Supreme Court to rule that when students post on social media or speak out online while off campus, they are protected from punishment by school officials under the First Amendment—an important free speech principle amid the increasing surveillance of students’ online activities outside the classroom. In September, we were victorious: the Court held that public high school officials violated a student’s First Amendment rights when they suspended her from cheerleading for posting a vulgar Snapchat selfie over the weekend and off school grounds (yes, this is the infamous “fuck cheer” case). We also stood up against efforts in Texas and Florida to require platforms to host speech they do not want to host. 

We also continued our focus on breaking the internet out of the grip of the five tech giants. After we analyzed and gave feedback to Congress on a package of antitrust reform bills, those bills moved forward after a marathon hearing in the House Judiciary Committee. And we didn't just do work in the U.S. We also worked tirelessly to reform the EU's Digital Markets Act so it would create actual competition in the online marketplace. Also in the spirit of standing up to the tech giants, after much international consultation and feedback, we updated the Santa Clara Principles on Transparency and Accountability in Content Moderation to better match the global landscape and current issues with regard to the platforms that host so much of our speech. Finally, on a state and federal level, we pushed for, and successfully obtained, much more governmental support for universal, affordable, high-speed internet access.

Even with all of those things listed, we're leaving much out. Season 2 of our podcast premiered. We published papers on interoperability and on the future of high-speed internet in the United States. And that's in addition to countless briefs filed, testimony given to legislators, and activism campaigns launched. Please consider joining EFF. None of this is possible without our supporters.

EFF has an annual tradition of writing several blog posts on what we’ve accomplished this year, what we’ve learned, and where we have more to do. We will update this page with new stories about digital rights in 2021 every day between now and New Year’s Day.

Donate to EFF

Support Digital Freedom

Cindy Cohn

Electronic Frontier Alliance Defending Local Communities: 2021 in Review

In another year of masking up, local communities have found enough footing to push back on surveillance tech and fight for our digital rights. Members of the Electronic Frontier Alliance have continued to innovate by organizing workshops and trainings for neighbors, overwhelmingly online, and made important headway on issues like more equitable broadband access, surveillance oversight, and even banning government use of face recognition.

The Electronic Frontier Alliance (EFA) is an information-sharing network of local groups that span a range of organizational models. Some are fully volunteer-run, some are affiliated with a broader institution (such as student groups), and others are independent non-profit organizations. What these groups all share in common is an investment in local organizing, a not-for-profit model, and a passion for five guiding principles:

  • Free Expression: People should be able to speak their minds to whoever will listen.
  • Security: Technology should be trustworthy and answer to its users.
  • Privacy: Technology should allow private and anonymous speech, and allow users to set their own parameters about what to share with whom.
  • Creativity: Technology should promote progress by allowing people to build on the ideas, creations, and inventions of others.
  • Access to Knowledge: Curiosity should be rewarded, not stifled.

Since first forming in 2016, the alliance has grown to 73 member groups across 26 states. It's not possible to review everything these grassroots groups have accomplished over the last year, but this post highlights a number of exemplary victories. We hope they will inspire others to take action in the new year.

Advocacy Pushing Back on Police Surveillance

EFA members have been vital in the fight against government use of face recognition technology. This type of biometric surveillance comes in many forms, and is a special menace to civil liberties. Since 2019, when San Francisco became the first city to ban government use of this technology, more than a dozen municipalities nationwide have followed suit, including Portland and Boston last year. In 2021, these victories continued with the passage of bans in Minneapolis and King County, Washington, won through close collaboration among EFA members, local ACLU chapters, and other local community groups, with the support of EFF.

Alliance member Restore the Fourth Minnesota (RT4MN), and the rest of the Twin Cities-based Safety Not Surveillance (SNS) coalition, successfully advocated to pass their ban on government use of face recognition technology in Minneapolis. During the year-long fight for the ban, the coalition built widespread community support, took the argument to the local press, and won with a unanimous vote from the city council. The SNS coalition didn’t rest on its laurels after this victory, but instead went on to mobilize against increased state funding to the local fusion center, and to continue to advocate for a Community Control Over Police Surveillance (CCOPS) ordinance. These campaigns and other impressive work coming out of Minnesota are covered in more detail in EFF’s recent interview with a RT4MN organizer.

In California, Oakland Privacy won one of the first victories of the year, when Oakland's City Council voted in January to strengthen the city's anti-surveillance ordinance. The Citizens Privacy Coalition of Santa Clara County has been organizing for CCOPS policies across the San Francisco Bay Area, fighting for democratic control over the acquisition and use of surveillance tech by local government agencies.

In Missouri, Privacy Watch St. Louis has taken a leadership role in pushing for a CCOPS bill that was introduced in the city council earlier this year. The group also worked with the ACLU of Missouri to educate lawmakers and their constituents about the dangers and unconstitutionality of another bill, Board Bill 200, which would have implemented aerial surveillance (or "spy planes") similar to a Baltimore program. Early this year, the city’s Rules Committee unanimously voted against the bill.

EFA members also targeted another dangerous form of police surveillance: acoustic gunshot detection, the most popular brand of which is ShotSpotter. One of the most prominent voices is Chicago-based Lucy Parsons Labs, which has brought the harms to light through its research and use of Freedom of Information Act (FOIA) requests. Lucy Parsons Labs discusses this and more of its incredible work in its own year in review post. The group went on to coordinate with Oakland Privacy and other EFA members to organize protests against another ShotSpotter program.

In New York City, alliance member Surveillance Technology Oversight Project (STOP) uncovered a secret NYPD slush fund used to purchase invasive surveillance technology with no public oversight. In collaboration with Legal Aid NYC, STOP blew the whistle on $159 million of unchecked surveillance spending, ranging from face recognition to x-ray vans. STOP, the Brennan Center, EFF, and other leading civil society advocates also held the NYPD accountable for its inadequate compliance with the POST Act, a 2020 law that requires greater NYPD transparency about its use of surveillance technologies.

Defending User Rights

In addition to protecting privacy from state surveillance, EFA members also turned out to ensure users’ rights were protected from unfair and shady business practices.

In July, the Biden Administration instructed the Federal Trade Commission (FTC) to advance Right to Repair policies, leading to a rare public hearing and vote. Called on by fellow repair advocates such as iFixit, USPIRG, and other members of the Repair Association, EFA members rapidly mobilized to submit public comments. Following the outpouring of support, the FTC unanimously voted to enforce Right to Repair law, defending consumers' rights to repair their own devices without the threat of being sued by the manufacturer or patent holder. The fight for Right to Repair is far from over for local advocates, with state legislation still being considered nationwide.

Back in Oakland, organizers successfully ensured the passage of a service provider choice ordinance by unanimous vote. The new law makes sure that Oakland renters are not constrained to the internet service provider (ISP) of their landlord, but can instead freely choose their own provider. This blocks the kickback schemes many landlords enjoy, where they share revenue with Big ISPs or receive other benefits in exchange for denying competitors physical access to rented apartments. Under those schemes, residents are stuck with whatever quality and cost the incumbent ISP cares to offer. This win in Oakland replicates the earlier success in San Francisco and gives tenants a choice, and smaller local ISPs an opportunity to compete. In the fight for internet access, EFA members like the Pacific Northwest Rural Broadband Alliance have also been working to set up smaller local options to extend broadband access in Montana without relying on Big ISPs that often ignore rural areas.

Electronic Frontier Alliance members were also active in advocacy campaigns to press corporations to change policies that restrict consumer access and privacy. Several groups signed onto a letter calling on PayPal to offer transparency and due process when deciding which accounts to restrict or close.

And earlier this year, when Apple revealed plans to build an encryption backdoor into its data storage and messaging systems, many EFA groups leapt into action. They helped collect over 25,000 signatures in opposition. In Portland, Boston, Chicago, San Francisco, and New York, alliance members also joined EFF and Fight for the Future in a nationwide series of rallies demanding Apple cancel its plans for these policies that could be disastrous for user privacy. This wave of pressure led Apple to retract some of its planned phone-scanning and pause its planned scanning of user iCloud Photos libraries.

Building community

While we celebrate each time Alliance members make headlines, we also recognize the extensive work they pour into strengthening their coalitions and building strong community defense. This is, of course, particularly difficult when we cannot safely come together in person, and organizers deal with extra hurdles to rebuild their work in an accessible online format.

Fortunately, in 2021 many allies hit their stride, and found opportunity in adversity. With so many local events going virtual, local groups leaned on their relationships in the EFA despite being in different parts of the country. These are just a few of the unique event collaborations we saw this year:

  • Aspiration Tech again hosted its annual co-created convening, a unique collaborative gathering of grassroots activists and software developers.
  • Canal Alliance hosted a panel of partners, including EFF, Digital Marin, the Institute of Local Self Reliance, and Media Alliance, to discuss how communities can take action on the digital divide issues exacerbated by the pandemic.
  • CyPurr Collective maintained their monthly Brooklyn Public Library events, connecting the community to digital security experts such as EFF’s Eva Galperin, Albert Fox Cahn from EFA member S.T.O.P., and 2021 Pioneer Award winner Matt Mitchell.
  • EFF-Austin held many online workshops, including one featuring Vahid Razavi from Ethics in Tech discussing ethical issues with companies in Silicon Valley.
  • Ethics in Tech hosted several all-day events featuring other EFA members, including a recent event with Kevin Welch from EFF-Austin.
  • Portland’s Techno-Activism Third Mondays hosted a number of great workshops, including a three-part panel on online privacy, why people need it, and how to fight for it.
  • RT4MN hosted a number of workshops throughout the year, including a recent panel on drone and aerial surveillance.
  • S.T.O.P. held great online panels in collaboration with NYC partners, tackling topics that included face recognition and predictive policing; how AI training causes law enforcement biases; how artists can organize against police surveillance; and punitive workplace surveillance faced by warehouse workers.

In addition to events hosted by EFA members, the EFF organizing team held space for EFA groups to collaborate remotely, including our first EFA Virtual Convening in August. In lieu of regular in-person meet-ups, which are essential to creating opportunities for mutual support, EFF hosted a virtual "World Café" style break-out session where EFA members and EFF staff could learn from each other's work and brainstorm new future collaborative projects.

New members

This past year we also had the opportunity to expand the alliance and establish a new presence in Montana, North Carolina, and Tennessee, by welcoming six impressive new members:

  • Calyx Institute, New York, NY: A technology non-profit with the mission of developing, testing and distributing free privacy software, as well as working to bridge the digital divide.
  • Canal Alliance, San Rafael, CA: Advocates for digital equity for immigrant communities.
  • DEFCON Group 864, Greenville, NC: The newest DEFCON group in the alliance, with a mission to provide learning opportunities and resources for everyone interested in information security.
  • Devanooga, Chattanooga, TN: A non-profit community group for current or aspiring developers and designers.
  • Pacific Northwest Rural Broadband Alliance, Missoula, MT: A non-profit foundation dedicated to building fast, affordable, community-powered broadband networks.
  • PrivaZy Collective, Wellesley, MA: A community-centered student group addressing online privacy issues faced by Gen Zers.

Looking forward

The fight for our digital rights continues, and maintaining a robust and vigilant network of organizers is essential to that struggle. EFF will continue to work with groups dedicated to promoting digital rights in their communities, and offer support whenever possible. To learn more about how the EFA works, check out our FAQ page, and consider joining the fight by having your group apply to join us.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2021.

Rory Mir

Support a Better Web for Everyone & Unlock Grants for EFF

During the holiday season you’ll see lots of appeals from worthy causes. Wherever your heart is, there's little doubt that technology amplifies voices and helps build community around the issues that matter most to you. That’s why the Electronic Frontier Foundation fights for your right to express yourself, connect to friends, and explore ideas online. And it's also why EFF needs your help during our Year-End Challenge.

Digital privacy, security, and free speech lift up all the efforts to make the dark corners of the world a little brighter. Will you help EFF work toward a better digital future for everyone?

Give Today

Donate By December 31 to Unlock Bonus Grants!

Count yourself in during the Year-End Challenge and you'll help EFF receive up to $46,500 in grants gathered by EFF's board of directors. As the number of online rights supporters grows (see the counter!), EFF can unlock a series of seven challenge grants—from $500 to $20,000—that grow larger after we reach each milestone. No matter the size of the donation, every supporter counts.

Your Support Means a Better Web for Everybody

EFF owes every success to members around the world backing tech users’ rights. This year EFF fended off Apple’s plan for dangerous device-scanning; celebrated one of the largest state investments in public fiber broadband; helped convince the U.S. Supreme Court to reject an overbroad interpretation of the Computer Fraud and Abuse Act; took legal action to stop police surveillance of protesters; and kept sanity in conversations about online content moderation and censorship. And that’s just a few highlights!

It doesn’t matter if you’re an engineer, a caretaker, an artist, or a political activist—you can power the work necessary for a brighter future together (and unlock additional grants before the year ends!). Please consider making a contribution today.

Rise to the Challenge!

Support Online Privacy & Free Speech This Year

_____________

EFF is a member-supported U.S. 501(c)(3) organization with a top rating from the nonprofit watchdog Charity Navigator. Donations are tax-deductible as allowed by law. Make membership even easier with an automatic monthly or annual contribution!

Aaron Jue

Podcast Episode: The Life of the (Crypto) Party

Episode 106 of EFF’s How to Fix the Internet

Surveillance is always problematic, but it isn’t neutral—it is more often deployed in communities of color than elsewhere. And surveillance technology isn’t objective, either—it often magnifies the biases of its users and creators, affecting already-marginalized individuals far more heavily than others. Matt Mitchell, founder of CryptoHarlem, has an exciting solution for helping undo the damage that pervasive surveillance has done to those who are most profoundly impacted by it. 

Join EFF’s Cindy Cohn and Danny O’Brien as they talk with Matt, who has worked as a data journalist, a software engineer, a security researcher, a trainer, and a hacker—and learn more about how education, transparency, and building trust can increase privacy and safety for everyone. And best of all, you get to go to a party while you’re doing it.

Click below to listen to the episode now, or choose your podcast player:

[Embedded audio player: this episode is served from simplecast.com]

You can’t fight back against surveillance unless you recognize it. CryptoHarlem, which Matt Mitchell founded, provides workshops on digital surveillance and a space for Black people in Harlem, who are over policed and heavily surveilled, to learn about digital security, encryption, privacy, cryptology tools, and more. Matt talks with Cindy and Danny about how living under pervasive surveillance dehumanizes us, why you have to meet people where they are to mobilize them, and how education is the first step to protecting your privacy—and the privacy of a community. But overall, he shows us how fun and exciting it can be to help empower and organize your community.   

You can also find the MP3 of this episode on the Internet Archive.

In this episode you’ll learn about: 

  • Cryptoparties being organized by volunteers to educate people about what surveillance technology looks like, how it works, and who installed it
  • How working within your own community can be an extremely effective (and fun) way to push back against surveillance
  • How historically surveilled communities have borne the brunt of new, digital forms of surveillance
  • The ineffectiveness and bias of much new surveillance technology, and why it's so hard to "surveil yourself to safety"
  • Why and how heavily surveilled communities are taking back their privacy, sometimes using new technology 
  • The ways that Community Control Over Police Surveillance (CCOPS) legislation can benefit communities by offering avenues to learn about and discuss surveillance technology before it's installed
  • How security and digital privacy has improved, with new options, settings, and applications that offer more control over our online lives


Matt Mitchell is the founder of CryptoHarlem and a tech fellow for the BUILD program at the Ford Foundation. As a technology fellow at the Ford Foundation, Mitchell develops digital security training, technical assistance offerings, and safety and security measures for the foundation's grantee partners. Mitchell has also worked as an independent digital security/countersurveillance trainer for media and humanitarian-focused private security firms. His personal work focuses on marginalized, aggressively monitored, over-policed populations in the United States. Previously, Mitchell worked as a data journalist at The New York Times and a developer at CNN, Time Inc, NewsOne/InteractiveOne/TVOne/RadioOne, AOL/Huffington Post, and Essence Magazine. Last year he was named to the WIRED 25, a list of scientists, technologists, and artists working to make things better. In 2017 he was selected as a Vice Motherboard Human of the Year for his work protecting marginalized groups.

Resources: 

Surveillance Technologies:

CCOPS and Community Action

Transcript:

Matt Mitchell: Privacy is not secrecy. Privacy is saying, "I got a door, I got a door so I could open it. I got a room that no one knows about so I could invite the people who I wanna share this with, and I could say, welcome, we're friends now. I wanna show you something I don't share with everybody." That's a beautiful thing, but you only can do it when you're given the agency to do it.

Cindy Cohn: That's Matt Mitchell. He's working in his community to get people to understand more about their digital privacy and security, and more importantly, how they can take steps to protect it. On today's episode of How to Fix the Internet, Matt will tell us how he marshaled his neighborhood of Harlem in New York City. And he'll help us think about how we can all reach out to those in our own communities who might need a little help understanding what their digital footprint looks like and how to make our online lives more secure. I'm Cindy Cohn.

Danny O’Brien: And I'm Danny O'Brien. Welcome to How to Fix the Internet, a podcast of the Electronic Frontier Foundation. Today: how to build a movement, one person, one checkbox, and one security setting at a time.

Danny O’Brien: Matt Mitchell is someone who does many things. He's worked as a data journalist, a software engineer, a security researcher, a trainer, a hacker... 

Cindy Cohn: And we were thrilled to give him an EFF Pioneer Award for his work in his community. Matt founded Crypto Harlem, which hosts parties that teach people about protecting their digital privacy and educates them about modern digital surveillance. Matt, welcome to How to Fix the Internet.

Matt Mitchell: Hey, thanks for having me. It's great to be here. 

Cindy Cohn: When you accepted your award from EFF, you dedicated it to Jelani Henry, can you tell us his story?

Matt Mitchell: Jelani grew up in Harlem and one day the police came to his door. They said, "We have reason to believe, by looking at social media, by looking at people's contacts and their phones, that you were involved in a crime." A crime that Jelani did not commit. And they took him from his home and he did not see home again for 14 months. Spent that time in one of the worst prisons in the world, Rikers Island in New York. And all of this is because of social media surveillance, something that happens every day in the inner city.

Cindy Cohn: I think people who don’t live in areas like Harlem sometimes have a hard time visualizing the pervasive surveillance that happens there. Can you, can you give us some of what you see in your community?

Matt Mitchell: You know, uh, living in the inner city, anywhere in the United States, in a place like Harlem, you'll be surveilled from the minute you wake up, you know? A lot of folks live in some kind of government subsidized housing. There's a lot of CCTV recordings of these properties 'cause they're technically government properties, and there's different rules that apply to you, and what you can do and what you can't do because this is technically, like, state or partly state owned. Then you'll walk out into your courtyard and you'll see these large floodlights that are either solar powered or gasoline generator powered. And it looks kind of like a guard tower in a prison movie, right? Where light is constantly shed and shined upon you and yourself. It goes through your window. People have, like, many layers of blackout shades. You see people just staple gun comforters to their window at this point 'cause it's so bright. And that light doesn't turn off until, um, the sun rises, right? So you also have this, like, constant gaze, this constant watching.

And then, you know, if you were to go just for a walk outside the courtyard around the corner, if you look up, you'll see there's cameras, surveillance cameras in the lamps. Also, on that same corner there'll be a box that says Property of NYPD, and on it will be many different things from, uh, a camera that can be uh, controlled and turned and tweaked, a flat box which maybe looks like almost like an access point for a WiFi or something, but it's actually a microphone, it's part of the ShotSpotter system.

But for this apparatus to work, the microphones have to be on the entire time, so they're always listening. And then when they hear the wave form or they match that pattern to what they believe is a database of firearms being fired, then it triggers or it's supposed to work that way. And so that's another element of the surveillance that’s around you. 

Danny O’Brien: Right.  

Matt Mitchell: Furthermore, you'll have surveillance from, uh, the city level and on the corporate level as well. A lot of the folks who own a bodega, or what we call, like, our little, like, smoke shop and candy shop and grocery stores, they'll get caught up in this pressure from law enforcement, where it's like, "Look, we want you to share, not just the footage 'cause someone was in here yesterday who matched the description of a criminal or there was a crime that was committed. But we want you to kinda like, join our network of cameras so we can pull video when we need to," which includes a lot of folks just walking around and just doing their thing.

Cindy Cohn: It really does take eyes, you know, kind of careful eyes to see all of this surveillance because as you know, the public isn't notified when this stuff is rolled out.

Matt Mitchell: I always say like we're not, we're not really against all of this surveillance tech. Bring it. Bring all these layers of surveillance tech to every neighborhood, to the suburbs, you know, to downtown, to the tourist area. Bring it because on that day, everyone will be up in arms. On that day, everyone will be like, "What is going on? This is not the society I wanna live in." But unfortunately, the inner city is like a Petri dish, it's like a beta test for a lot of this stuff. And you know, there's a lot of commercial interests there. A company might say, "Well, we'll basically give you the tech so we can say year over year we've proven that this thing does something or it's being used by..." Like, if you say it's being used by Chicago, New York or LAPD, everyone wants that thing, regardless of whether it works. And what we've seen is that the ShotSpotters go off and police are dispatched to an area expecting a firearm and there's brutality because of that.

Danny O’Brien: When it happened in sort of upper-class neighborhoods, there would be a big discussion and eventually it would be decided not to roll it out. And then you watch the whole thing roll out elsewhere.

Matt Mitchell: Where communities have been subjugated to such huge amounts of surveillance generation after generation, people come to the CryptoHarlem event and they're like, "Yo, we took some pictures. What is this thing? What is that thing?" That's often a thing that happens in our meetings with folks. And we go through it, we're like, "Well, this is this piece of technology. This is who put it there, this is how it works, this is how it fails." We have this thing called CompStat in New York, which is police data where you can look back at, like, every crime that's been properly filed as of yesterday, you know what I'm saying? So it doesn't matter that, like, crime in this country is going down, the surveillance of the country is always going up. It's not like there'll be a day where you'll see people on that ladder, taking down an automated license plate reader or a ShotSpotter. So it's all about like, "Can we gain ground? And then we'll just upgrade the tech." It just looks ridiculous after a while but no one ever takes it away.

Danny O’Brien: So you founded Crypto Harlem, what? Around 2012? What was your first meeting like?

Matt Mitchell: I would love to say it was like, "Nobody showed up, but we did it anyway." But it was packed, first meeting was packed. I mean, I was shocked. I could barely get in the room. It was people on the sidewalk looking into the building, into the community center, you know what I'm saying? 'Cause, you know, in some communities, you have to give the whole nothing-to-hide argument. But in the, you know, in these neighborhoods it's like, "Oh, I know. I already know that I'm being criminalized. My identity's criminalized, my- my existence is criminalized. And my grandma's existence was criminalized, and her great-great grandma's existence was criminalized." So it's a history of surveillance, right? And to say I have a remedy for this thing that's been plaguing you, people will show up like you're giving out medicine, right? So, you know, that's how that works when you do a Crypto party or anything like this in these neighborhoods, a marginalized community, it's actually pretty easy.

Danny O’Brien: Do you have a remedy though? Is there something that people can practically do in a situation like that?

Matt Mitchell: Community organizers in the hood, they'll sit down with folks, they'll be like, "Okay, listen, you're a laborer, you're a worker, you're undocumented, whatever your situation is. Let's make a list of what's plaguing you, things that you would change. Blue sky, dream a better world, right? For you and your kids." And you take that list, all these things that are bothering them, and then you find what you know secretly is actually an easy win, it's a quick fix, you know, like, "And that pothole down the street," right? You go to the nice neighborhoods, there's no potholes, you know?

So, you just teach folks like, "Look, this is how you show up. This is how you complain. This is how you go to this meeting. This is how the city ordinance is set up," and then you patch that pothole. And with that win, they will work for years on the hardest thing on that list. You know, you have folks who have been subjugated to a huge amount of digital surveillance, we already know the panopticon.  We already know what that feels like. When they have a win and they're armed with that, they will fight forever. 

Danny O’Brien:  Right.

Cindy Cohn: Privacy really isn't about secrecy. It's about control, right? And whether you have control or someone else has control, I think that's exactly right. And when I'm hearing about your community, which is, you don't have to convince people that they're being surveilled or that this stuff can be used against them. In some ways it's a difficult community to work in because there's so much surveillance and it can feel overwhelming. But what I hear from you is in some ways it's an easier community to work with because you don't have to convince people that there's a problem.

Matt Mitchell: You don't have to convince them. And, uh, because there's so many layers of surveillance, there's an additive effect that it's so ridiculously [laughs]... It's- it's Terminator II world, right? And to- to just have any solution offered, that's a community that's ready, so ready for this. There's a history of surveillance so they don't need to be convinced about things, but also, it's gotten so out of hand that it's really hard to justify, right? It's really hard for anyone who's got their eyes open to justify. If you only put the cameras out, you're like, "Well, it's not that many cameras? They're only here." And if you only put the microphones out, you're like, "It's not that many microphones? They're over here." But when you see everything, then you're like, "Okay, we went too far."

Cindy Cohn: You actually go out on the street and talk to people and try to encourage them to come to these parties. So can you tell me some stories about how you've convinced people to come in? 

Matt Mitchell: Yeah, definitely.  I mean, the first thing is, you know, you can have the hottest thing in your mind, but no one's going to show up, so you gotta be about it, like, "Hey, my life depends on it." So if  you don't get 10 people in this room like this is your last day, you're gonna run around, you're gonna grab everybody, you gonna go through the bus, you're gonna run through the subway. Like, that's the passion you need to have when you're starting a movement, and you also have to meet people where they are. So you have to contextualize solutions for folks, so... And I'm like, "Hey, you know, nice phone. What's your favorite app?" And they'll be like, "Oh, I like this dating app," let's say. So you're like, "Oh, okay, cool. You know, did you know that people can just find you, any, you know, anywhere you are, like, down to the second, like, down to, like, a split step away for you with that app?" And you're like, "What?" And then you'll be like, "Yeah, Kaspersky did this study. Let me show you this article real quick." And then they're like, "Damn," right? And like, "How do I stop that?" I'm like, "Oh, just turn the setting off." And they're like, "Whoa." And then you're like, "Hey, I, we got more of these come through next week, this is the spot." And people will show up because that's how easy it can be if you meet people where they are, where their points of pain are, right? 

Cindy Cohn: We talk a lot about your work in helping people with their technologies, but you've also been helping a bit with policies and laws as well.

Matt Mitchell: I was involved with this thing called CCOPS back in the day, which was Community Control Over Police Surveillance. With that project, it was like, "Look, we should at least talk about this stuff." Oftentimes, law enforcement will say, "Well, we don't wanna share the information 'cause it gives the upper hand to criminals." Well, you know, when the French were like, "Why are y'all measuring heads? You should be using this thing called fingerprints, it's a new technology," back in the day. Well, we all know that there's a thing called a fingerprint and you leave it, it doesn't stop you from getting caught with your fingerprints, right? It's just this is the science, we understand it. With a lot of this new technology, it's not well researched. The efficacy is questionable, and it's secret, right? It's secret. So we're like, "Look, just tell us about it. Let's look at the civil liberty, privacy and other problems that might come from it and let's figure out what we can do to mitigate that. You're gonna use it anyway, you might as well just at least give us that 'cause it's our money. It's our taxpayer money or it's civil asset forfeiture where you're literally stopping people and taking their money to buy stuff to watch them with, which is totally messed up."

Now, in New York, we ended up having something that's called the POST Act, it's a Public Oversight of  Surveillance Technology. But the NYPD has been so reticent to lay out exactly what they're using. But even what they have given us is mind blowing [laughs]. Like, I was like, "Okay, we have x-ray vans, that drive around, and can see through buildings to watch people." Like, "What? How many of these things? When do we get this like, and they're, like, "We won't, we're only gonna use it in case of terrorism," right? They said, "We have drones and the drones will only be used by the state police, right? You know, we use helicopters to look at traffic accidents, We're only gonna use drones? It's cheaper and safer." Then it's like, "Okay, and maybe to help during if  someone has got Dementia, they're older, they get lost." And then it's like, "Oh, maybe it's for the children. Maybe it's And then next thing you know, NYPD is just flying drones around.”

Danny O’Brien: “How to Fix the Internet” is supported by The Alfred P. Sloan Foundation's Program in Public Understanding of Science. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians. 

Danny O’Brien: We've talked a lot about how technology can really harm people's privacy and security. Is there any way in this sort of situation that you can use technology to help people?

Matt Mitchell: Yeah. I mean, I love technology. That's why I got into this game, right? So as a hacker, I would say like if you look at the apps that I tell people to use, they're all tech that was made by people, right? I think what we really need though is technology that's developed and created by people who are directly impacted by it. So people always talk about Signal, which is dope, right? It's like the gold standard of encryption but, you know, homie came from a tech company and he not about this life, this stuff didn't really happen to him, and look what he made. So imagine what the next person who actually is like, "Hey, I wanna solve this for what's going on with my community, what's going on what I see, from the viewpoint of my mosque or my street corner or my teenage runaway center," or whatever, you know what I'm saying? We need to be able to support those technologies. A lot of times, they're open source technologies. A lot of times, there's people who, you know, who give grants to support that stuff. And I think like that- that's the win right there. Anybody, you can literally take something that's designed for you to consume technology all day, like a mobile phone, which black and brown folks over-index on, we got mad glittered-out phones, you know what I'm saying? If there's no keyboard on it, you can't write code on that, but you can actually now program on these things. So I've taught folks like, "Yo, let's take a look at this Python, let's look at this algorithmic bias on your phone," and we did it. It's not the most fun thing in the world, but you could do it. And, I mean, when you're time rich and the surveillance is on you, I think that's the future, is like building tech solutions and hacking our way out of these tech problems, right? Wonderful.

Cindy Cohn: Oh, and I also really love the idea here that, like, we don't need one killer product. We need a million killer products because every community is different. There are some things everybody needs. I certainly would put encryption on that list, but how that gets deployed could come up in a million ways, depending on what the communities need. And that we shouldn't be thinking of a one size fits all killer app, we should be thinking about a whole universe of tools that people are empowered to build themselves. 

Matt Mitchell: I feel, like, directly impacted communities, they're going to solve their own problems and they don't really need anyone to come save them. They just need the tools, they need the resources. Maybe you need to know how to- how to exactly write that code or do that thing, you know what I'm saying? Um, so yeah, like, the Glover Center is this spot in Oakland, little homies just coming in and coding VR. Like, it's amazing, right? So like, you know, we need more of that.

Danny O’Brien: There's another element of this, which is, you know, building capacity within the community, of educating people about how the technology works so that people can understand better how they're being impacted and how to mitigate it too. Right? Is- is that, is that part of your mission?

Matt Mitchell: This stuff is seriously hidden, right? So, um, and there are allies in strange places. Like, we got people who came in, like, "Hey, I work at the precinct down the street. This is so messed up. I gotta tell you, like, something has to be done. Blah, blah, blah," right? Or they'll be like, "Yo, here's a tweet. Like, y'all see this? Why is there a dog walking into this housing project behind these cops? Like, what's this thing too, blah, blah, blah." It's a robot. So we'll research it. We'll look into it. We'll talk to people like, this is what this is. This is how it works. But without action, why? You're just, you know, you're just scaring people, right? The first time you go to a dentist, they don't show you videos of people with, like, advanced gum disease. You know, when you go to the doctor talking about your health, they don't tell you about illnesses you could do nothing about that keep the doctor up at night. No, they don't. So we don't do that either. We talk, we keep it hopeful, we keep it actionable, and people want that message, right?

But we also are realistic, and we map it to there's many ways to stop these things. We should be fighting on every single front, right? If you're someone who'd throw a party in your living room and say, "Look, oh, we should know about this," do that. If you're someone who would, you know, throw a ballot in a box and vote against something, do that. So, like, we're just gonna lay out every real thing and we're not gonna tell you how to protect yourself, your family, your neighborhood or anything like that, right? 

So I think that's a message that most people don't normally get. They get told this kind of one way message, right? And it doesn't always make sense for them because of what they're dealing with, that is the path. And then real change actually happens, and that's the surprising thing.

I had someone ask me, like, "Aren't y'all just sticking your finger in a dam? And then you're sticking your finger in another dam? And then you're sticking your tongue, in your nose, you're trying to just stop this flood from happening?" And I'm like, "Yeah, welcome to being Black in America. Like, that's how we fix all our stuff," you know what I'm saying? Like, it has to be this way, in that space, you can raise a child. In that space, you can go to college. In that space of just a little freedom and safety, you can better yourself. And that's what we're about. We're about winning through those margins.

Cindy Cohn: That's great. Hey, do you have a favorite story of somebody who you kind of worked with through Crypto Harlem and how you kind of saw the light go off and, and saw them kind of take control over things?

Matt Mitchell: Oh, yeah, yeah, that happens all the time actually, but, yeah, I got this one story, right? You know, there was this young homie who came in and I was like, "Are you even lost?" Like, you know, maybe this person was in the wrong place. And so he shows up and he- he does this whole thing. He goes and checks it out, he comes back again. He talks to me afterwards. We... There's a place next to the spot, Harlem Business Alliance, where we have this, the community center, which is on the corner of Malcolm X and- and Martin Luther King, which is pretty poetic and cool, but it wasn't- I didn't even decide that. And next door, this place called Harlem Shake, we usually just break bread and I'll just buy some fries and just talk to people 'cause you really... After the event, people wanna process and talk. And he had so much to say and I was like, "Well, check out this video. Check out the..." We have a lot of checklists and one pagers. And he went through everything and, um, he was like, "Hey, man. I wanna do more." We had this thing called the Glass Room that was in San Francisco at one point, but it was in New York. And he ended up working on- in the Glass Room as one of the Ingeniuses there. We had this whole setup, it looked like an electronics store but it's really talking about surveillance. And he became one of the most, like, passionate anti-surveillance speakers on this issue.

And then he was like, "Hey, um, I think I might try to apply for this job." And he applied for this job at a tech spot and he got this job and it's like, you know, this is a brother from Harlem, you know what I'm saying? Every time I see a Crypto Harlem video or picture, he's in that background. And he writes me, he's just like, "You changed my life. You changed my family's life. You changed my friend's life." And he's just like, you know, that's what's up. We need the younger, cooler, next version of me, you know what I'm saying? I want the, like, you know... wha, whoever they are out there, you know what I'm saying, or- or she or he is out there, you know, we need that, like, cyberpunk Afro-futuristic baby Matt to come up, you know? So I wanna see that. And that's just one story. I mean, I could tell you more.

Another thing that we wanna do, which we think is quite important, is bring it back to the people. So if we get a donation, if we get an opportunity, we bring it back to the people. So I'm like, "Listen, if... How's it gonna look if I'm telling you this stuff?" But reality is, you know, you, maybe you need money, maybe you need a job opportunity, maybe you need some kind of thing. So we always make sure we do that, you know what I'm saying? Like, another thing we do is we teach folks cyber security stuff on a path to get certified because there's more jobs than people in that space. It's relevant because there reaches a point where you need to, like, know how to read code, to find the bias in the thing, you know what I'm saying? And like, "I could do that." And I'm like, "You could do that too, let me show you how to do that." We need to understand high level cyber security stuff, get your CompTIA certifi... you know, Security Plus certification, so you can understand how to push back and have that authority. People will look at you, they will underestimate you based on your identity, so you need to come up with this stuff.

Cindy Cohn: That's such a key thing about building a movement, right? Is that you're, you know, you're not just treating these people as the spectators, right? Like they're just watching a show that's about digital security. You're bringing them in, you're supporting them. You're building the kind of, um, network and community. And you know, we've talked a lot about Harlem and your home communities, but I also know that you, you think a lot about how to, how to do this kind of organizing in communities that are not at all like the community that you came from, places that are rural or that are more distributed. And can you talk a little bit about that?

Matt Mitchell:  I worked globally, you know, I worked in private security, I worked for non-profits. I worked for NGOs, you know what I'm saying? And, you know, whoever is on the margins might look differently, but the treatment's the same and the playbook's the same.  So you might be like an ethnic Russian in Estonia, right? You might be in a place where I can't even tell, I'm like, "Yo, who... what's the difference between this Albanian brother here and this person there 'cause, like, you know... Or that German and this German that you're telling me it's, it's a class different." But then you'll quickly see that, "Oh, that neighborhood has more surveillance because I don't trust you. I don't trust you. I won't ask. I will find out by watching, by surveilling," right?

What you look like and what the community look like might be different but how this rolls out is painfully, obviously very the same. And therefore, the conditions we can use to organize and push back against it are equally the same. So, you know what? It works in our favor too 'cause they could use the same playbook to hurt, we could use the same playbook to succeed and win and help. 

If you live in the United States, in a rural neighborhood, in a rural area, there's a lot of farmlands, a lot- No- not a lot of opportunity. The cost of just getting on the internet is- is unfair, there's not even equity there, right? Your options and- and choices are not fair, right? And when you speak to people, I think it's about that common ground, and that's how you build a nationwide movement, right? But I love working in the inner city because I figure like, "Look, if we could stop it there, we could keep all these pains from being experienced by much larger numbers of humans."

Danny O’Brien: Yeah. No community is, is, is monolithic. Right? And, and I know that in situations and places where people suffer from a lot of crime, which is where the first argument about this kind of pervasive surveillance takes place, there's a lot of people who, who live there, who, who want some help and, and see surveillance as a solution, right? Rather than a problem. That they're more scared of the crime than they are of the potential of the technology. What's your way of, of interacting and bringing them into this discussion?

Matt Mitchell: Well, I think it's about... fear is used as a tactic. You know, obviously, your rational mind and your emotional mind are in a constant battle, right? And if you're in a moment where you're panicked and you're stressed and you're just reacting fight or flight like that- that animal trying to just survive, that's the moment where you're not making the best choices, you know? So if I just tell you to do some simple math, one plus one is two, two plus two is four, you start using your rational mind, you start getting outta that place. So, that's what we try to do. We try to show up and just be like, and respect the fear, right? For example, look at the Asian communities, right? With Asian hate. People won't even call a hate crime a hate crime. It's so obvious, right? I have to respect, this is a reality, right? And you work with the people who are trusted in that community. If I wanna talk to the Korean community in Queens, a lot of folks are religious, so I need to go to the Korean church, right? A lot of folks are gamers, I need to go to the Korean internet cafe. I gotta find folks who, you know, um, represent the community, not just as in literally, like, identify like, "Yo, I'm from here, this is my neighborhood," but also the way they stand up in the community, right? They're the most popular, uh, cosplayer or whatever, you know what I'm saying? So you bring them in and, um, that is your entry point to fighting against fear because that level of trust, that level of friendship, that level of, like, "I share your pain, I share your- the history of surveillance that happened to you. I share trying to make you safer." Who doesn't want a safer day? Who doesn't want a safer street for their kid? Everybody does, right?

Matt Mitchell: So and then explaining that these things they're actually dangerous, they're not safe. That's the kicker. And that's what does it for folks because all of these technologies come with a pro and a con like every tech, right? Every tech has a little positive or negative, but surveillance tech, the negative is killer. The negative is so bad that once you break it down, nobody wants to go near it.

Cindy Cohn: If we've, if we've gotten a handle on this stuff and we've, you know, shrunk surveillance to the limited place that it ought to have in our society, how, how does that look from where you sit?

Matt Mitchell: I'm a dreamer, I'm a blue sky person, so I'm a 100% abolitionist. Like, zero surveillance everywhere, right? Surveillance has always been used as a tool to hurt people, so... But I think it looks like little things, you know? I also believe in celebrating every victory, that comes from the community organizer in me, you know? So, you know, like, when I open up my phone and there's a settings area and it says Privacy, like, that wasn't there before. I'm an old school nerd, there were no security settings before. That's a win. And the more settings in there, each line they add is a win. And, you know, like, when people are like, "Oh, see this camera up here? Woo, boo," whatever, right? That's a win, right?

Matt Mitchell: Every animal wants to be free, right? Every human being wants to be free. Every child wants to be loved. All these things, surveillance is the opposite of, so you know what I'm saying? I think, like, what's that future look like? No cameras, no sensors, none of this stuff on the corner, a little bit more trust, a little more acceptance that technology can't protect us from the things that we created, the evil things that we've created.

Cindy Cohn: The stories just keep coming. You're right. You know, every day it gets clearer and clearer. Um, and, and now we need, you know, whether it's people at the ballot box or people on the policy side, or people who build the technologies to decide, "Hey, I'm not going to be part of this."

Matt Mitchell: In some parts of the world, we see protests where they're just like, "Hey, that ham- that hammer and that camera need to meet, you know, they need to get lunch together." So [laughs], you know, like, whatever it takes. And I wanna see it done the right way and I wanna see it done through policy and through law because that's the best thing. Like, civil rights taught us that, right? When you have the laws, they might not be respected today, but those are what you stand on to change the future tomorrow.

Danny O’Brien: Matt Mitchell. Thanks very much.

Matt Mitchell: Thank you. 

Cindy Cohn: That was terrific. And just so inspiring. I really appreciate that. Matt really just took us on a walk through his neighborhood to show us all the surveillance. I mean, it was chilling, and of course it just drives home how marginalized communities are disproportionately targeted by surveillance. And I kind of appreciate the silver lining of that, which is that he doesn't have to convince his community that surveillance is a bad thing. They already know it, and they know it from generations.

Danny O’Brien: Yeah, surveillance can often be so sort of out of sight, out of mind. It's, it's, it's this strange contradiction where it becomes invisible just as it's making you visible to whoever is out there spying on you. And just as a practical community organizing method, those walks and tours, I know Oakland Privacy does this, of just pointing out where the cameras are, are incredibly effective. The thing that you take away from it, the lesson that I think we all learned, is that fear just blocks learning, right? It just paralyzes you. And what you need to do is, is you need to respect the fear, or you need to understand that people often come to you out of a fear and concern, but you want to get rid of that fear and then add a historical lens that makes them understand why this is happening and gives them the possibility that things could change.

Cindy Cohn: I think that the thing that really shines through in this is that, you know, Matt's not just teaching security- he’s building a movement. He's empowering people. He doesn't do security training as if people are just passive listeners or, or students. He's really working on giving people the tools they need to protect themselves, but also to turn themselves into leaders. Including, especially because this is about cybersecurity, people having real careers and the ability to feed their families from this work. I mean, that's how you build a robust movement.

Danny O’Brien: And I think that builds on the idea that technology isn't just an enemy here. It can also be part of the solution. It can also be one of the tools that you use to fix things. And I love the idea that the ultimate solutions are going to be built by the communities who are impacted by them. 

Cindy Cohn: Yes. I really love his version of the future when we get it right. Not only that every little community is going to build the technology that they need, because they're the ones impacted by it. They're the ones who ought to build the technology that best serves them. Right. But also, again on the movement side, you know, recognizing that small steps matter, that we have victories every single day, and that we celebrate those victories. And then of course the bigger long-term vision, you know, finally living in a world where we realize that we cannot surveil ourselves to safety. That's a world I want to live in.

Danny O’Brien: Me too. Well, that's it for this week on How to Fix the Internet. Check out the notes on this episode for some of the resources Matt and Crypto Harlem built and recommend, so you can learn more about your digital security or pass it on to someone that you know.

Cindy Cohn: Thanks to our guest, Matt Mitchell for sharing his optimism and vision for a future with less surveillance and more humanity.

Danny O’Brien: If you like what you hear, follow us on your favorite podcast player. We’ve got lots more episodes in store with smart people who will tell you how to fix the internet.

Nat Keefe and Reed Mathis of Beat Mower made the music for this podcast, with additional music and sounds used under a Creative Commons license from ccMixter. You can find the credits for each of the musicians and links to the music in our episode notes. Thanks again for joining us, and if you have any feedback on this episode, please email podcast@eff.org; we read every email.

“How to Fix the Internet” is supported by The Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. I’m Danny O’Brien.

Cindy Cohn: And I’m Cindy Cohn. 

Jason Kelley

EFF Continues Legal Fight to Release Records Showing How Law Enforcement Uses Cell-site Simulators

1 month ago

Four years ago, EFF set out on a mission to chase down the paper trail left behind when cops in California use cell-site simulators. This trail has led us to a California appellate court, where next spring we will face off with San Bernardino County law enforcement over whether they can keep search warrants authorizing electronic surveillance secret from the public indefinitely.

Cell-site simulators (CSSs) mimic cell-phone towers to trick any nearby phones into connecting with them. Police use this technology to gather information on people’s phones or to track people in real time.

Whenever police use surveillance tools like CSSs, they inevitably sweep up innocent people’s private data. Because these tools are so invasive, there are legitimate questions about whether law enforcement should have deployed them in any particular investigation.

The public should be able to answer those questions by reviewing the public records that reflect how law enforcement justified using CSSs, what types of crimes merited their use, and what training and expertise officers had when deploying them. But in San Bernardino County, Calif., the public has been shut out of accessing these details despite EFF’s effort to make court records public.

The long fight to make search warrant records public

Since 2018, EFF has been trying to pry loose search warrant filings, including affidavits filed by law enforcement officers seeking court approval to use cell-site simulators, after we suspected officials were improperly sealing these records at local courthouses. Our concerns turned out to be correct.

Law enforcement agencies nationwide have a history of shielding the use of this technology from public scrutiny, with prosecutors going so far as to even dismiss cases rather than reveal information about how the technology works.

However, in 2015, two new California laws changed the game in the Golden State. First, SB 741 required California law enforcement agencies to post their cell-site simulator usage policies online. Second, and more importantly, the California Electronic Communications Privacy Act (CalECPA), ensured that the existence of search warrants involving cell-site simulators would be disclosed via the California Attorney General's OpenJustice website and that the warrants would be available to the public via the courts.

From the San Bernardino County Sheriff’s SB 741 policy, we learned that the agency keeps an annual tally of CSS use: in 2017, deputies deployed the technology 231 times, including 20 times in “emergency circumstances.” Via the Attorney General’s CalECPA dataset, we were able to identify the existence of several search warrants we believed were related to CSSs, because they mentioned the term “cell-site stimulators,” an unintentional misspelling.

However, when we tried to obtain the six actual search warrants themselves, we hit a wall: San Bernardino County law enforcement refused to turn over the records. The agency claimed that our request was “vague, overly broad,” didn’t describe an identifiable record, and would be exempt from disclosure as investigative records.

And so we sued in October 2018. San Bernardino continued to refuse to provide the records, and claimed they could not make them public because the files remained indefinitely under seal.

Rather than give up, we expanded our request to cover 22 search warrants that we believe could shine light on the use of cell-site simulators or other forms of electronic surveillance used by the San Bernardino Sheriff's Office.

After asking the court to unseal those records, we filed another lawsuit demanding that they be made public by arguing that the indefinite sealing violated the public’s right to access judicial records under the First Amendment.

An incremental victory for public access, then a new roadblock

The second lawsuit demonstrated that law enforcement had been oversealing these records in ways that were not warranted under the law. Authorities subsequently made public large portions of the search warrant applications and related documents. And in one case, law enforcement released one search warrant in its entirety after learning that a judge had rejected its sealing request and never made the documents secret.

But authorities claimed that eight law enforcement affidavits filed with search warrants must remain entirely under seal indefinitely and moved to dismiss EFF’s lawsuit. These records are important and should be public because they likely contain the justification officials provided for using an invasive tool like a cell-site simulator, as well as the type of crime being investigated. Disclosure of that information will enable the public to see whether police are reserving use of cell-site simulators for when they are truly needed or if they are routinely deploying them in non-violent or other low-level investigations.

Despite the high public interest in the affidavits, the trial court hearing the case agreed with law enforcement and ruled in January 2021 that the affidavits must be kept secret for the foreseeable future.

EFF appeals to defend the public's right to court records

With the trial court’s ruling in favor of secrecy, the stakes were raised. Now, it is no longer just about cell-site simulators, but the question of whether the government has the power to keep search warrant records secret indefinitely.

Our appeal argues that the trial court’s ruling makes it very difficult for the public to understand basic facts about how police are using cell-site simulators and other invasive technologies that sweep up innocent people’s personal data. Moreover, EFF argued that the First Amendment and California laws and rules do not allow authorities to keep every word in the affidavits under seal, in their entirety, forever.

Instead, the trial court should have made the documents public and redacted any specific information that could justifiably remain secret.

We completed briefing in the case earlier this fall and were happy to receive support, via friend-of-the-court briefs, from the First Amendment Coalition and two local journalists who report on law enforcement’s impact on the local community. We anticipate the appeals court will hear arguments in the case in spring 2022.

EFF’s case shows police fail to follow CalECPA

The biggest problem with EFF’s four-year battle to pry the search warrant affidavits loose is that they should never have been kept secret for so long in the first place. Under CalECPA, law enforcement is required to make public their surveillance orders as soon as the underlying surveillance period ends.

So if the cops sought a pen register and trap and trace order for two months, under CalECPA the records reflecting that surveillance generally must be made public after two months. And another California law has always required authorities to make public search warrant filings 10 days after they have been executed.

San Bernardino County authorities’ excessive secrecy violates CalECPA and is in clear conflict with governing law in the state. The fact that EFF continues to have to push in litigation for compliance with these laws highlights how far law enforcement will go to keep their surveillance activity secret. But EFF remains undeterred.

In the meantime, the case shows that CalECPA is only as effective as its enforcement. This is why we need the California Department of Justice to bring San Bernardino authorities and other agencies across the state into compliance with CalECPA’s transparency mandates. 

Related Cases: EFF v. San Bernardino County Superior Court (stingray transparency)
Aaron Mackey

EFF to Court: Deny Foreign Sovereign Immunity to DarkMatter for Hacking Journalist

1 month ago

When governments or private companies target someone with malware and facilitate the abuse of their human rights, the victim must be able to hold the bad actors accountable. That’s why, in October, EFF requested that a federal court consider its amicus brief in support of journalist Ghada Oueiss in her lawsuit against DarkMatter, a notorious cyber-mercenary company based in the United Arab Emirates. Oueiss is suing the company and high-level Saudi government officials for allegedly hacking her phone and leaking her private information as part of a smear campaign.

EFF’s brief argues that private companies should not be protected by foreign sovereign immunity, which limits when foreign governments can be sued in U.S. courts. Hundreds of technology companies sell surveillance and hacking as a product and service to governments around the world. Some companies sell surveillance tools to governments—in 45 of the 70 countries that are home to 88% of the world's internet users—and others, like DarkMatter, do the surveillance and hacking themselves.

DarkMatter’s hacking has serious consequences. In her lawsuit, Oueiss recounts being targeted by thousands of tweets attacking her, with accounts posting stolen personal photos and videos, some of which were doctored to further humiliate her. And earlier this month, EFF filed a lawsuit against DarkMatter because the company hacked Saudi human rights activist Loujain AlHathloul, leading to her kidnapping by the UAE and extradition to Saudi Arabia, where she was imprisoned and tortured.

U.S. companies are on both ends of DarkMatter’s misconduct—some are targets, like Apple and iPhone users, and other companies are vendors. Two U.S. companies sold zero-click iMessage exploits to DarkMatter, which it used to create a hacking system that could infiltrate iPhones around the world without the targets knowing a thing.

Human rights principles must be enforced, and voluntary mechanisms have failed these victims. U.S. courts should be open to journalists and activists to vindicate their rights, especially when there is a connection to this country—the smear campaign against Oueiss occurred here in part. EFF welcomed the Ninth Circuit Court of Appeals’ recent ruling that spyware vendor NSO Group, as a private company, did not have foreign sovereign immunity from WhatsApp’s lawsuit alleging hacking of the app’s users. Courts should similarly deny immunity to DarkMatter and other surveillance and hacking companies who directly harm Internet users around the world.

Mukund Rathi

EU's Digital Identity Framework Endangers Browser Security

1 month ago

If a proposal currently before the European Parliament and Council passes, the security of HTTPS in your browser may get a lot worse. A proposed amendment to Article 45 in the EU’s Digital Identity Framework (eIDAS) would have major, adverse security effects on millions of users browsing the web.

The amendment would require browsers to trust third parties designated by the government, without necessary security assurances. But trusting a third party that turns out to be insecure or careless could mean compromising user privacy, leaking personal or financial information, being targeted by malware, or having one’s web traffic snooped on.

What is a CA?

Certificate Authorities (CAs) are trusted notaries which underpin the main transport security model of the Web and other internet services. When you visit an HTTPS site, your browser needs to know that you are communicating with the site you requested, and that trust is ultimately anchored by the CA. CAs issue digital certificates that certify the ownership and authenticity of a public encryption key. The CA verifies that this key does belong to that website. For a certificate to be valid in a browser, it must be signed by a CA. The fundamental duty of the CA is to verify certificate requests submitted to it, and sign only those that it can verify as legitimate.
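
To make that trust relationship concrete, here is a minimal Node.js sketch (assuming network access, and using example.com purely as an illustration) that opens a TLS connection and reports which CA anchors the site’s certificate chain. Node’s bundled root store plays the same role here that a browser’s root store does:

```js
// Minimal sketch: connect over TLS and see which CA vouches for the site.
// "authorized" is true only if the chain verifies up to a trusted root CA.
const tls = require("tls");

const socket = tls.connect(443, "example.com", { servername: "example.com" }, () => {
  console.log("Chain trusted by the root store:", socket.authorized);

  const cert = socket.getPeerCertificate(true); // include the issuer chain
  console.log("Site certificate subject:", cert.subject.CN);
  console.log("Signed by CA:", cert.issuer.CN);
  socket.end();
});
```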

What is a Root Store?

Operating systems and browsers choose which CAs meet their standards and provide benefits to their users. They store those CAs’ root certificates in their root store. A CA that does not meet these rigid requirements is not allowed in these root stores.
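
Node.js ships a root store of its own and exposes it directly, which makes for a quick way to see what such a curated list looks like; a small sketch, assuming a reasonably recent Node version:

```js
// Minimal sketch: inspect the root store bundled with Node.js.
// Browsers and operating systems maintain equivalent curated lists.
const tls = require("tls");

console.log("Bundled trusted root CAs:", tls.rootCertificates.length);
console.log(tls.rootCertificates[0].split("\n")[0]); // "-----BEGIN CERTIFICATE-----"
```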

The Dangers of Requiring Government Mandated CAs

The proposed amendment would require all major root stores to include CAs that are nationally approved by EU member countries. The amendment provides no assurance that these CAs must meet the root stores’ security requirements, no mechanism to challenge their inclusion, and no required transparency.

This can lead to issues beyond poorly managed practices from a faulty or careless CA. If browsers can’t revoke a CA that has been flagged by their standards, their response to a security incident will be delayed.

This setup could also tempt governments to try “Machine-in-the-Middle” (MITM) attacks on people. In August 2019, the government of Kazakhstan tried to require installation of a certificate to scan citizen traffic for “security threats.” Google Chrome, Mozilla Firefox, and Apple Safari blocked this certificate. They were able to take this stand because they run independent root stores with proper security controls. Under this new regulation, this would not be as easy to do. The EU has much more reach and impact than one country. Even though eIDAS wasn't intended to be anti-democratic, it could open the path to more authoritarian surveillance.

If adopted, the amendment would roll back security gains that so many worked hard to achieve in the past decade. The amendment should be dropped. Instead, these CAs should be pushed to meet requirements for transparency, security, and incident response.

Alexis Hancock

Apple’s Android App to Scan for AirTags is a Necessary Step Forward, But More Anti-Stalking Mitigations Are Needed

1 month ago

This post has been updated to say that Tracker Detect is available internationally.

We’re pleased to see Apple has come out with an Android app called Tracker Detect that addresses some of the serious threats to privacy and safety we identified with Apple AirTags when they debuted. Quarter-sized Bluetooth-enabled homing beacons marketed as a way to track lost luggage or keys, AirTags can easily be exploited by stalkers to track and locate their victims.

As we explained in May, AirTags can be slipped into a target’s bag or car, allowing abusers to follow their every move. While there are other physical trackers such as Tile and Chipolo on the market, AirTags are an order of magnitude more dangerous because Apple has made every iPhone that doesn’t specifically opt out into a part of the Bluetooth tracking network that AirTags use to communicate, meaning AirTags’ reach is much greater than other trackers. Nearly all of us cross paths with Bluetooth-enabled iPhones multiple times a day, even if we don’t know it.

We called on Apple to create an Android app that alerts users to nearby trackers. Tracker Detect, released on the Google Play store this week, allows people using Android devices to find out if someone is tracking them with an AirTag or other devices equipped with sensors compatible with the Apple Find My network. If the app detects an unexpected AirTag nearby, it will show up in the app as “Unknown AirTag.” The app has an “alarm” of sorts—it will play a sound within 10 minutes of identifying the tracker, a big improvement over the time it takes for an AirTag to start beeping when it’s out of range of the iPhone it’s tethered to: up to 24 hours, plenty of time for a stalker to track a victim without their knowledge.

While not perfect, Tracker Detect is a win for privacy. It gives victims of domestic and intimate partner abuse who exist outside of the Apple ecosystem a fighting chance to learn if they are being tracked or followed. EFF supports a harm reduction approach to privacy, and this app fills an important gap. It will be important to do outreach to domestic violence shelters and other service providers to familiarize them with AirTags and how to use Tracker Detect to run a scan. The app is available in the U.S. and internationally. But it's only available for Android 9 and higher, which rules out many of the devices at the cheap end of the Android ecosystem which are often used by vulnerable populations.

In September, researchers at Technical University of Darmstadt’s Secure Mobile Networking Lab released an app called AirGuard, which is available in the Google Play Store. The app claims to be able to detect AirTags on an Android device while running in the background, but EFF has not yet tested AirGuard’s functionality. This may be an additional option for Android users who are concerned about physical trackers.

Having an app to download is a step forward, but it is not enough. We’re calling on Google to take this one step further and incorporate background AirTag tracking and detection of other physical trackers into the Android OS. Unlike the functionality Apple has incorporated into the iPhone, which operates constantly in the background, Tracker Detect requires the user to run a scan. Having your device automatically detect trackers will put it on par with the stalking mitigations that iPhone users already have. That mitigation can only be accomplished if Apple and Google cooperate in order to protect users against physical trackers.

We hope that both Apple and Google take the threat of ubiquitous, cheap, and powerful physical trackers seriously enough to work together to help their users know when they’re being stalked.

 

Karen Gullo

YouTube’s New Copyright Transparency Report Leaves a Lot Out

1 month ago

YouTube recently released a transparency report on the status of copyright claims for the first half of 2021. It says it will release these numbers biannually from now on. We applaud this move towards transparency, since it gives researchers a better look at what’s happening on the world’s largest streaming video platform. What is less welcome is the spin.

The major thrust of this report is to calm the major studios and music labels. Those huge conglomerates have consistently pushed for more and more restrictions on the use of copyrighted material, at the expense of fair use and, as a result, free expression. YouTube has plenty of incentives to try to avoid the wrath of these deep-pocketed companies by showing how it polices alleged copyright infringement and generates money for creators.

The secondary goal of the report is to claim that YouTube is adequately protecting its creators. That rings hollow, since every user knows what it’s like to actually live in this ecosystem. And they have neither the time nor money to lobby YouTube for improvements. Worse, as a practical matter, YouTube is the only game in town, so they can’t make their anger heard by leaving.

Here are the big numbers YouTube just released for the first half of 2021:

  • 772 million copyright claims were made through Content ID
  • 99% of all copyright claims were Content ID claims, meaning only 1% were DMCA or other forms of complaint
  • 6 million removal requests were made with YouTube’s Copyright Match Tool
  • Fewer than 1% of Content ID claims were disputed
  • When they were, 60% of the time the dispute was resolved in favor of those contesting the claims

YouTube argues that by transferring large sums to music labels and movie studios from internet creators, its ecosystem is, to borrow a phrase, fair and balanced. YouTube basically claims that because rightsholders use Content ID to make a lot of claims and online creators continue to upload new videos, then it must be working. That conclusion ignores a few key realities.  

Monopoly: “Where Am I Supposed to Go?”

The creators who post videos to YouTube don’t do so because they like YouTube. They do it because they believe they have no choice. We have heard “I am on YouTube for lack of any better option,” “Where am I supposed to go?” and “For what I do, there is nowhere else.” One creator, asked if he could leave YouTube, bluntly answered, “No, obviously not.”

It’s not that internet creators like what Content ID does for them, it’s that they have to agree to it in order to survive. They have to use YouTube because of its size. Since most who create videos for a living rely on sponsorships and/or memberships via platforms like Patreon, they need to reach as many people as possible to sell these services. YouTube gives them that power, far more than any other existing platform.

The Number of Disputes Is Hiding a Lot

YouTube’s dispute claims don’t add up. First, the idea that the low number of disputes means Content ID is working to catch infringement is laughable. On page 10 of the report, YouTube admits that there are errors, but claims they are few and far between, based on the low dispute rate. They state that “When disputes take place, the process provided by YouTube provides real recourse,” which runs counter to much of what creators actually say they experience. They feel pressured, by YouTube, not to dispute Content ID matches. They fear disputing Content ID and losing their channel as a result.

YouTube’s suggestion that the relatively high percentage of disputes resolving in favor of the video creator means that there is a functioning appeals process is also dubious.

Disputing Content ID is a confusing mess that often scares creators into accepting whatever punishment the system has levied against them. The alternative—as YouTube tells them over and over—is losing their account due to accumulating copyright strikes. Absent alternative platforms, no one who makes videos for a living can afford to lose their YouTube channel.

One creator, Chris Person, runs a channel of video game clips called “Highlight Reel.” It was an incredibly popular show when Person edited it for the website Kotaku. When Person was let go, he was allowed to continue the show independently. But he had to rebuild the entire channel, which was a frustrating process. Having done that, he told us he would do anything to avoid having to do it again. As would most creators.

Creators have reported that they tell fellow creators to dispute matches on material they have the right to use, only to be met by fear. Too many are too afraid of losing their channel, their only access to an audience and therefore their income, to challenge a match. One music reviewer simply accepts them all, losing most or all direct income from the videos, rather than spend months fighting.

Furthermore, creators report that YouTube ignores its own rules, taking far longer than the 30 days it claims must pass before it acts to either release a claim or repost a video. When delays happen, there are no helplines staffed by actual human beings that might do something about it.

There is a terrible, circular logic that traps creators on YouTube. They cannot afford to dispute Content ID matches because that could lead to DMCA notices. They cannot afford DMCA notices because those lead to copyright strikes. They cannot afford copyright strikes because that could lead to a loss of their account. They cannot afford to lose their account because they cannot afford to lose access to YouTube’s giant audience. And they cannot afford to lose access to that audience because they cannot count on making money from YouTube’s ads alone, partially because Content ID often diverts advertising money to rightsholders when there is Content ID match. Which they cannot afford to dispute.

Katharine Trendacosta

Have an Open Records Horror Story? Shine a Light by Nominating an Agency for The Foilies 2022

1 month 1 week ago

This post is crossposted at MuckRock and was co-written by Michael Morisy.

We are now accepting submissions for The Foilies 2022, the annual project to give tongue-in-cheek awards to the officials and institutions that behave badly—or ridiculously—when served with a request for public records.

Compiled by the Electronic Frontier Foundation (EFF) and MuckRock, The Foilies run as a cover feature in alternative newsweeklies across the U.S. during Sunshine Week (March 13-19, 2022), through a partnership with the Association of Alternative Newsmedia.

In 2021, we saw agencies fight to keep secrets large and small and we saw officials withhold and obfuscate critical information the public needs and is entitled to by law. But even as we’ve kept a running tally of Freedom of Information Act (FOIA) fumbles, we still miss many of the transparency horror stories out there, especially those that go unreported.

If you’ve seen a story about an agency closing off important access or simply redacting ad absurdum, this is your chance to highlight it and let the world know—and hopefully help push all agencies to be a little more open.

EFF and MuckRock will solicit, vet, and judge submissions, but folks from across the transparency community—journalists, researchers, local and international gadflies, and more—are encouraged to submit both their own run-ins with opaque intransigence and items that have been reported on elsewhere. We’ll be accepting nominations until January 3, so please submit early and often!

Submit a Nomination

Note: MuckRock's privacy policy applies to submissions.

We’re looking for examples at all levels of government, including state, local, and national, and while we’re primarily focused on U.S. incidents, we welcome submissions about global phenomena.

You can also review The Foilies archives, dating back to 2015, for ideas of what we’re looking for this year.

Who Can Win?

The Foilies are not awarded to people who filed FOIA requests. These are not a type of recognition anyone should actually covet. There’s no physical trophy or other tangible award, just a virtual distinction of demerit issued to government agencies and public officials (plus the odd rock star) who thumbed their noses at transparency. If you filed a FOIA request with the Ministry of Silly Walks for a list of grant recipients, and a civil servant in a bowler hat told you to take a ludicrous hike, then the ministry itself would be eligible for The Foilies.

What Are the Categories?

For the most part, we do not determine the categories in advance. Rather, we look at the nominations we receive, winnow them down to the most outrageous, then come up with fitting tributes, such as “Most Expensive FOIA Fee Estimate” and “Sue the Messenger Award.” That said, there are a few things we’re looking for in particular, such as extremely long processing times and surreal redactions.

Who Can Nominate?

Anyone, regardless of whether you were involved in the issue or just happened to read about it on Twitter. Send as many nominations as you like!

Eligibility

All nominations must have had some event happen during calendar year 2021. For example, you can nominate something related to a FOIA request filed in 1994 if you finally received a rejection in 2021.

Deadline

All nominations must be received by January 3, 2022.

How to Submit a Nomination

Click here to submit your nominations. You can nominate multiple entries by just returning to that page as many times as needed. Each entry should include the following information:

Category: One-line suggested award title. We reserve the right to ignore or alter your suggestion.

Description: Succinct explanation of the public records issue and why it deserves recognition.

Links/References: Include any links to stories, records, photos, or other information that will help us better understand the issue.

Email address: Include a way for us to reach you with further questions. This information will remain confidential. If we short-list your nomination, we may be in touch to request more information.

Dave Maass

Victory! Federal Court Blocks Texas’ Unconstitutional Social Media Law

1 month 1 week ago

On December 1, hours before Texas’ social media law, HB 20, was slated to go into effect, a federal court in Texas blocked it for violating the First Amendment. Like a similar law in Florida, which was blocked and is now pending before the Eleventh Circuit Court of Appeals, the Texas law will go to the Fifth Circuit. These laws are retaliatory, obviously unconstitutional, and EFF will continue advocating that courts stop them.

In October, EFF filed an amicus brief against HB 20 in Netchoice v. Paxton, a challenge to the law brought by two associations of tech companies. HB 20 prohibits large social media platforms from removing or moderating content based on the viewpoint of the user. We argued, and the federal court agreed, that the government cannot regulate the editorial decisions made by online platforms about what content they host. As the judge wrote, platforms’ right under the First Amendment to moderate content “has repeatedly been recognized by courts.” Social media platforms are not “common carriers” that transmit speech without curation.

Moreover, Texas explicitly passed HB 20 to stop social media companies’ purported discrimination against conservative users. The court explained that this “announced purpose of balancing the discussion” is precisely the kind of government manipulation of public discourse that the First Amendment forbids. As EFF’s brief explained, the government can’t retaliate against disfavored speakers and promote favored ones. Moreover, HB 20 would destroy or prevent the emergence of even large conservative platforms, as they would have to accept user speech from across the political spectrum.

HB 20 also imposed transparency requirements and user complaint procedures on large platforms. While these kinds of government mandates might be appropriate when carefully crafted—and separated from editorial restrictions or government retaliation—they are not here. The court noted that companies like YouTube and Facebook remove millions of pieces of user content a month. It further noted Facebook’s declaration in the case that it would be “impossible” to establish a system by December 1 compliant with the bill’s requirements for that many removals. Platforms would simply stop removing content to avoid violating HB 20, an impermissible chill of First Amendment rights.

Mukund Rathi

Google’s Manifest V3 Still Hurts Privacy, Security, and Innovation

1 month 1 week ago

It's been over two years since our initial response to Google's Manifest V3 proposal. Manifest V3 is the latest set of changes to the Chrome browser’s rules for browser extensions. Each extension manifest version update introduces backwards-incompatible changes to ostensibly move the platform forward. In 2018, Manifest V3 was framed as a proposal, with Google repeatedly claiming to be listening to feedback. Let's check in to see where we stand as 2021 wraps up.

Since announcing Manifest V3 in 2018, Google has launched Manifest V3 in Chrome, started accepting Manifest V3 extensions in the Chrome Web Store, co-announced joining the W3C WebExtensions Community Group (formed in collaboration with Apple, Microsoft and Mozilla), and, most recently, laid out a timeline for Manifest V2 deprecation. New Manifest V2 extensions will no longer be accepted as of January 2022, and Manifest V2 will no longer function as of January 2023.

According to Google, Manifest V3 will improve privacy, security, and performance. We fundamentally disagree. The changes in Manifest V3 won’t stop malicious extensions, but will hurt innovation, reduce extension capabilities, and harm real world performance. Google is right to ban remotely hosted code (with some exceptions for things like user scripts), but this is a policy change that didn’t need to be bundled with the rest of Manifest V3.

Community response to Manifest V3, whether in the Chromium extensions Google group or the W3C WebExtensions Community Group, has been largely negative. Developers are concerned about Manifest V3 breaking their extensions, confused by the poor documentation, and frustrated by the uncertainty around missing functionality coupled with the Manifest V2 end-of-life deadline.

Google has been selectively responsive, filling in some egregious gaps in functionality and increasing their arbitrary limits on declarative blocking rules. However, there are no signs of Google altering course on the most painful parts of Manifest V3. Something similar happened when Chrome announced adding a “puzzle piece” icon to the Chrome toolbar. All extension icons were to be hidden inside the puzzle piece menu (“unpinned”) by default. Despite universally negative feedback, Google went ahead with hiding extensions by default. The Chrome puzzle piece experience continues to confuse users to this day.

The World Wide Web Consortium’s (W3C) WebExtensions Community Group is a welcome development, but it won't address the power imbalance created by Chrome’s overwhelming market share: over two-thirds of all users globally use Chrome as their browser. This supermajority of web users is not likely to migrate away because of a technical squabble about extension APIs. No matter what Google decides to do, extension developers will have to work around it—or lose most of their users. And since developers are unlikely to want to maintain separate codebases for different browsers, other browsers will be heavily incentivized to adopt whatever set of extension APIs that Google ends up implementing.

Instead of working in true collaboration on the next iteration of browser extensions, Google expects Manifest V3 to be treated as a foregone conclusion. Participation in the WebExtensions group gives Google the veneer of collaboration even as it continues to do what it was going to do anyway. In short, Google enters the room as an 800-pound gorilla unwilling to listen or meaningfully work with the community.

Forcing all extensions to be rewritten for Google’s requirements without corresponding benefits to users is a fundamentally user-hostile move by Google. Manifest V3 violates the "user-centered", "compatibility", "performance" and "maintainability" design principles of the WebExtensions group charter.

While Google's response to community feedback has been tweaks and fixes around the margins, we have been paying attention to what developers are saying. The shortcomings of Manifest V3 have come into focus.

Requiring service workers for extensions is harmful

Most browser extensions are built around a background page, a place where all sorts of work happens out of sight as the user browses. With today’s Manifest V2, extensions in Chrome have the choice to opt into using an ephemeral “event”-based background page, or to use a persistent background page. Ephemeral pages get shut down and restarted repeatedly, whenever Chrome decides to do so. Persistent pages continue running as long as the browser is open. In addition to extension APIs, both kinds of extension background pages have access to the standard set of website APIs.

Manifest V3 removes the choice, instead requiring all extensions to be based on “service workers.” Service workers are ephemeral, event-based, and do not have access to the standard set of website APIs. Along with removing the “blocking webRequest” mechanism, which we talk about below, rebasing all extensions on service workers is one of the most damaging changes in Manifest V3.
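
At the manifest level the change looks small. The sketch below shows the relevant keys as annotated JavaScript objects rather than raw JSON (real manifests are JSON and contain more fields), following Chrome's documented key names:

```js
// Manifest V2: an extension can declare a persistent background page.
const v2Background = {
  background: { scripts: ["background.js"], persistent: true }, // lives as long as the browser
};

// Manifest V3: the only option is an ephemeral, event-driven service worker.
const v3Background = {
  background: { service_worker: "background.js" },
};
```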

Service workers are JavaScript scripts that run in the background, independent of the website that launched them. Service workers are meant to enable websites to perform previously hard or impossible tasks that optimize website performance or provide offline functionality. For example, the first time you visit twitter.com, the website installs a service worker in your browser. The service worker will stay installed, and may continue to perform tasks, even if you lose network connectivity or navigate away from twitter.com.
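
For reference, this is roughly how a website opts into that model; a minimal sketch that assumes a page served over HTTPS with a worker script at /sw.js:

```js
// Minimal sketch: a website registering its own service worker.
// The worker runs independently of the page and wakes on events (fetch, push, etc.).
if ("serviceWorker" in navigator) {
  navigator.serviceWorker
    .register("/sw.js")
    .then((registration) => console.log("service worker scope:", registration.scope))
    .catch((err) => console.error("registration failed:", err));
}
```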

Service workers give websites superpowers, giving web apps functionality that is otherwise difficult or impossible. But service workers don’t have the same freedom to execute code that websites do, and there are limits to how long service workers live. Each service worker listens for messages from its website, performs its tasks, and shuts down shortly after. This makes sense, as the website is the main actor that calls upon its service worker for help. But this model doesn’t translate well to browser extensions.

Service workers were designed to work with websites, and they are a standardized part of the Web Platform. But there is no equivalent service worker standard for WebExtensions. Since extensions enhance the browser, applying the same execution limits from website service workers makes no sense, and yet this is exactly what Google has done.

Websites and their service workers are developed by the same teams, and are meant to work in tandem. But browsers and browser extensions are built by different teams with different goals. Extensions are supposed to add new functionality that browser developers didn’t think of or intentionally left out. Sometimes, extensions do things that explicitly act against the intentions of the browser developers, such as when tracker blockers restrict the information flowing out of Chrome. Chrome continues to be the only major browser without meaningful built-in tracking protection. Web extensions need more freedom to operate on their own, which means first-class access to browser APIs and persistent memory.

Take a look at the long list of known use cases harmed by requiring service workers. Seamlessly playing audio, parsing HTML, requesting geolocation, communicating via WebRTC data channels, and the ability to start a separate service worker are all broken under the new paradigm.

Under Manifest V2, extensions are treated like first-class applications with their own persistent execution environment. But under V3, they are treated like accessories, given limited privileges and only allowed to execute reactively.
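
One practical consequence is that an extension can no longer simply keep state in memory across events. The sketch below contrasts the two models using documented chrome.* APIs; the counter itself is purely illustrative:

```js
// Manifest V2 background page: in-memory state survives for the whole session.
let eventCount = 0;
chrome.webRequest.onBeforeRequest.addListener(
  () => { eventCount++; },               // the page persists, so the counter does too
  { urls: ["<all_urls>"] }
);

// Manifest V3 service worker: it may be torn down at any time, so every event
// handler has to rehydrate its state from chrome.storage and write it back.
chrome.runtime.onMessage.addListener((message, sender, sendResponse) => {
  chrome.storage.local.get({ eventCount: 0 }, ({ eventCount }) => {
    chrome.storage.local.set({ eventCount: eventCount + 1 });
    sendResponse({ eventCount: eventCount + 1 });
  });
  return true; // keep the channel open for the asynchronous response
});
```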

As per feedback from Mozilla engineers, one legitimate benefit of service workers may be getting extensions to gracefully handle early termination on Android. But there are ways of achieving this goal that don’t involve this degree of harm. And if one of Google's aims for Manifest V3 is to help bring extensions to Chrome on Android, Google failed to communicate this information. How can browsers and extensions developers collaborate on moving extensions forward when it appears that Google is unwilling to share all of the reasons behind Manifest V3?

declarativeNetRequest alone is inadequate

Besides proposing to move extensions to an ill-fitting service worker foundation, Google’s Manifest V3 is changing the way that content-blocking extensions can function.

Extensions based on Manifest V2 use webRequest, a flexible API that lets extensions intercept and block or otherwise modify HTTP requests and responses. Manifest V3 drops the blocking and modification capabilities of webRequest in favor of the new declarativeNetRequest API. The interception-only or “observational” webRequest API—which allows extensions to monitor, though not modify, requests—will supposedly remain in Manifest V3, although the API is broken in Manifest V3 at this time, with the relevant bug report open for over two years.

As the name suggests, the new declarativeNetRequest API is declarative. Today, extensions can intercept every request that a web page makes, and decide what to do with each one on the fly. But a declarative API requires developers to define what their extension will do with specific requests ahead of time, choosing from a limited set of rules implemented by the browser. Gone is the ability to run sophisticated functions that decide what to do with each individual request. If your extension needs to process requests in a way that isn’t covered by the existing rules, you just can’t do it.
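
A minimal sketch of that difference follows; looksLikeTracker is a hypothetical placeholder for whatever per-request logic an extension might want to run, and the blocking rule is just an example:

```js
// Manifest V2, blocking webRequest: arbitrary code decides each request's fate.
chrome.webRequest.onBeforeRequest.addListener(
  (details) => ({ cancel: looksLikeTracker(details.url) }), // hypothetical helper
  { urls: ["<all_urls>"] },
  ["blocking"]
);

// Manifest V3, declarativeNetRequest: the extension can only hand the browser
// rules chosen in advance from a fixed vocabulary of conditions and actions.
chrome.declarativeNetRequest.updateDynamicRules({
  addRules: [{
    id: 1,
    priority: 1,
    action: { type: "block" },
    condition: { urlFilter: "||tracker.example^", resourceTypes: ["script"] },
  }],
  removeRuleIds: [1], // replace any previous version of this rule
});
```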

From this follows the main problem with requiring a declarative API for blocking. Advertising technology evolves rapidly, and privacy extension developers need to be able to change their approaches to it over time. To make matters worse, extension developers can't depend on Google browser engineers to react in any timely manner or at all. Google abandoned extension API development for years before Manifest V3. For example, while extensions have had the ability to “uncloak” CNAME domains in Firefox for over three years now, Chrome still lacks support for CNAME uncloaking. And while this support may come at some point in the future as part of declarativeNetRequest, many years behind Firefox, what about uncloaking CNAMEs elsewhere, such as in observational webRequest?

As we wrote in 2019, “For developers of ad- and tracker-blocking extensions, flexible APIs aren’t just nice to have, they are a requirement. When particular privacy protections gain popularity, ads and trackers evolve to evade them. As a result, the blocking extensions need to evolve too, or risk becoming irrelevant. [...] If Google decides that privacy extensions can only work in one specific way, it will be permanently tipping the scales in favor of ads and trackers.”

We have many questions about how the declarative API will interact with other Google projects. Will Google’s Privacy Sandbox technologies be exposed to declarativeNetRequest? If declarativeNetRequest works exclusively on the basis of URL pattern matching, how will extensions block subresources that lack meaningful URLs, facilitated by another Google effort called WebBundles? As more tracking moves to the server, will Manifest V3 extensions be able to keep up? Is Manifest V3 a step down a path where the Google parts of the Web become unblockable by extensions?

We reject declarativeNetRequest as a replacement for blocking webRequest. Instead, Google should let developers choose to use either API. Making both APIs available can still fulfill Google’s stated goals of making extensions safer and more performant. Google could use Chrome Web Store to guide extensions that don’t actually need blocking webRequest towards the declarative API. Google could also provide extension developer tools that would automatically analyze your extension for potential improvements, like the audit tools provided to promote best practices to website developers. In addition, extensions that use webRequest should get flagged for additional review; this should be clearly communicated to extension developers.

Google’s performance claims

Google has claimed that part of the reason for its Manifest V3 restrictions is to improve performance. If extensions are allowed to have persistent background pages, the argument goes, then those pages will sit idle and waste memory. In addition, Google claims webRequest is an inefficient API because of how it traverses browser internals and extension code, and because it makes it possible for poorly implemented extensions to slow down Chrome. Google has provided no evidence to back these claims.

In fact, many of the most popular extensions drastically speed up regular browsing by blocking resource-hogging ads and trackers. On the other hand, the restraints imposed by Manifest V3 will cause broken functionality and degraded performance for common extension tasks.

While a persistent extension background page will continue to use memory as long as your browser is open, try opening Chrome’s Task Manager sometime. Then compare the memory consumed by each and every website you have open to the memory consumed by your (presumably far fewer) extensions. Then, if you are a user of privacy or ad blocking extensions, try disabling them and reloading your websites. This exercise should quickly put the lie to Google’s claims. The memory consumed by your various open websites—especially without the help of privacy and security extensions to block memory-intensive trackers and advertisers—should dwarf the memory consumed by the extensions themselves.

Furthermore, repeatedly starting up and tearing down service worker-based extensions will lead to greater CPU load. For example, an extension using tabs, webNavigation, or observational webRequest APIs will get constantly invoked during browsing until either the user stops browsing or the five-minute time limit is reached. When the user resumes browsing, the service worker will have to get restarted immediately. Imagine how many times such an extension will get restarted during a typical day, and to what end?

Any extension that depends on relatively expensive one-time processing on startup (for example, machine learning models or WebAssembly) is an especially poor fit for service workers’ ephemeral nature.

Beyond harming performance, arbitrarily shutting down extension service workers will break functionality. The user may be in the middle of interacting with extension-provided functionality on some web page when the extension's service worker gets shut down. After a service worker restart, the extension may have stale or missing configuration data and won't work properly without the user knowing to reload the page. The additional delay caused by service worker startup will break use cases that depend on speedy messaging between the web page and the extension. For example, an extension that dynamically modifies the right-click menu based on the type of clicked element is no longer able to communicate within itself in time to modify the menu before it opens.

Regressions and bugs

On top of everything else, Google’s rollout of Manifest V3 has been rushed and buggy.

While you will no longer be able to upload new Manifest V2 extensions to the Chrome Web Store as of January 2022 (next month!), entire classes of existing extensions are completely broken in Manifest V3. As previously mentioned, observational webRequest is still broken, and so is native messaging. Manipulating web pages in the background, WebSockets, user script extensions, WebAssembly: all broken.

Injecting scripts into page contexts before anything else happens (document_start “main world” injection) is also broken. This is critical functionality for privacy and security extensions. Extension developers have had to resort to ugly hacks to perform this injection with configuration parameters, but those hacks are all broken in Manifest V3, and the promised Manifest V3 replacement is still not available.
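For reference, one widely used Manifest V2-era workaround looked roughly like the sketch below: a content script declared with "run_at": "document_start" injects an inline script element so its code runs in the page's main world before the page's own scripts. The injected function body is a placeholder; as noted above, hacks of this kind no longer hold up under Manifest V3.

```typescript
// Content script declared in the manifest with "run_at": "document_start".
// Sketch of the Manifest V2-era workaround; pageWorldCode is a placeholder
// for whatever the extension needs to run in the page's world.
function pageWorldCode(): void {
  // e.g. wrap or neutralize APIs before the page's own scripts load
}

const script = document.createElement("script");
script.textContent = `(${pageWorldCode.toString()})();`;
// At document_start, document.head may not exist yet.
(document.head || document.documentElement).appendChild(script);
script.remove();
```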

Meanwhile, early adopters of Manifest V3 are running into bugs that cause their extensions to stop working when new extension versions are released. Even something as basic as internationalization is broken inside service workers.

Mozilla’s disappointing response

Mozilla, apparently forced to follow in Google's wake for compatibility reasons, announced it will also require extensions to switch to service workers. Mozilla will continue to support the blocking capabilities of webRequest, in addition to implementing declarativeNetRequest, but it framed that support as a temporary reprieve “until there’s a better solution which covers all use cases we consider important.”

Recently, in a belated sign of community feedback finally having some effect, a Mozilla engineer proposed a compromise in the form of “limited event pages”. Limited event pages would lessen the pain of Manifest V3 by restoring the standard set of website APIs to extension background pages. An Apple representative expressed support on the part of Safari. Google said no.

Instead of following Google into Manifest V3, Mozilla should be fighting tooth and nail against Google’s proposal. It should be absolutely clear that Google acts alone despite overwhelmingly negative community feedback. A proposal cannot become a standard when everyone else stands in opposition. Mozilla’s behavior is obscuring Google’s betrayal of the extensions ecosystem. Moreover, it gives a false sense of competition and consensus when in reality this is one of the prime examples of Google’s market dominance and anti-competitive behavior.

Conclusion

What is the future of extensions? As we explained in our 2019 response, removing blocking webRequest won’t stop abusive extensions, but will harm privacy and security extensions. If Manifest V3 is merely a step on the way towards a more "safe" (i.e., limited) extensions experience, what will Manifest V4 look like? If the answer is fewer, less-powerful APIs in service of “safety”, users will ultimately suffer. The universe of possible extensions will be limited to what Google explicitly chooses to allow, and creative developers will find they lack the tools to innovate. Meanwhile, extensions that defend user privacy and safety against various threats on the Web will be stuck in the past, unable to adapt as the threats evolve.

The WebExtensions standard is what we all make it to be. If we are to take the WebExtensions Community Group at face value, we should be making extensions more capable together. We should indeed be making it easier to write secure, performant, privacy-respecting extensions, but not at the cost of losing powerful privacy-preserving functionality. We should make it easier to detect abuse, but not at the cost of losing the ability to innovate. We shouldn't rely on browser developers to think of all the needs of the diverse Web, and we don't have to: that's the beauty of extensions.

The next extensions manifest version update should be opening doors to empower all of us, unconstrained by whether you can convince a few browser engineers of the validity of your needs. Google needs to cancel moving to service workers, restore blocking webRequest, and halt Manifest V2 deprecation until all regressions in functionality are addressed. Anything short of that is at best an insincere acknowledgment of developers' shared concerns, and at worst outright hostility to the extensions community at large.

Alexei Miagkov

Podcast Episode: A Better Future Starts with Secret Codes

1 month 1 week ago
Podcast Episode 105

Law enforcement wants to force companies to build a backdoor to the software that runs on your phones, tablets, and other devices. This would allow easier access to the information on your device and the information that flows through it, including your private communications with others, the websites you visit, and all the information from your applications. Join EFF’s Cindy Cohn and Danny O’Brien as they talk to Riana Pfefferkorn, a lawyer and research scholar at the Stanford Internet Observatory, about the dangers of law enforcement trying to get these backdoors built and how users' lives are better without them.

Click below to listen to the episode now, or choose your podcast player:

[Embedded audio player: player.simplecast.com]

More than ever before, users—from everyday people to CEOs to even high-ranking government officials—have troves of personal and work-related information on their devices. With so much data stored by such a wide variety of users, including government officials, why would law enforcement want to create a vulnerability in the devices’ software?

Riana Pfefferkorn guides us toward a vision of an internet that prioritizes users over the state, and explains how that shift would let individuals express themselves openly and have safe, private conversations.

In this episode you’ll learn about:

  • Different types of data law enforcement try to gather information from, including “at rest” and “in transit” data.
  • The divide between law enforcement and the national security and intelligence communities regarding their stances on strong encryption and backdoors on devices.
  • How the First Amendment plays a role in cryptography and the ability for law enforcement to try to force companies to build certain code into their software.
  • How strong encryption and device security empowers users to voice their thoughts freely.

Riana Pfefferkorn is a Research Scholar at the Stanford Internet Observatory. She focuses on investigating and analyzing the U.S. and other governments’ policies and practices for forcing decryption and/or influencing crypto-related design of online platforms and services via technical means and through courts and legislatures. Riana also researches the benefits and detriments of strong encryption on free expression, political engagement, and more. You can find Riana Pfefferkorn on Twitter @Riana_Crypto.

If you have any feedback on this episode, please email podcast@eff.org. You can find a copy of this episode on the Internet Archive. 

Below, you’ll find legal resources—including links to important cases, books, and briefs discussed in the podcast—as well as a full transcript of the audio.

Resources 

Encryption and Exceptional Access:

Apple and the FBI:

Code as First Amendment Speech:

Transcript:

Riana Pfefferkorn:
The term backdoor is one that the government doesn't like to use. Sometimes they just want to call it the front door, to just walk right on in to your encrypted communications or your devices. But, nevertheless, they tend to prefer phrases like an exceptional access mechanism.
The problem being that when you are building an exceptional access mechanism, it's a hole.
And so we have likened it to drilling a hole in a windshield, where the windshield is supposed to protect you, but now you have a hole that's been drilled in the middle of it. Not only could bugs get in through that hole, but it also might spider cracks out throughout the rest of the windshield.

Danny O'Brien:
That's Riana Pfefferkorn. She's a research scholar at the Stanford Internet Observatory and she's also a lawyer. We're going to talk to her today about why backdoors into our devices are a bad idea.

Cindy Cohn:
And we're also going to talk about a future in which we have privacy while also giving the police the tools they do need and not the ones that they don't. Welcome to EFF's How to Fix the Internet.

Cindy Cohn:
Welcome to the show. I'm Cindy Cohn, EFF's executive director.

Danny O'Brien:
And I'm Danny O'Brien, and I'm a special advisor to EFF.

Cindy Cohn:
Today we're going to dig into device encryption and backdoors.

Danny O'Brien:
Riana's been researching forced decryption and the influence the US government and law enforcement have had on technology and platform design. She'll take us through what is at stake, how we can approach the problem, and what is standing in the way of the solutions. Riana, thanks for joining us.

Riana Pfefferkorn:
Thank you for having me today.

Cindy Cohn:
We're so excited to have you here. Riana. Of course, as you know, talking about encryption is near and dear to all of our hearts here at EFF. We think most people first recognize the idea that the FBI was seeking a backdoor into their devices and information in 2015, when it demanded that Apple build one into the iPhone, after a horrible incident in San Bernardino. Now Apple pushed back with help from a lot of us, both you and the EFF, and the government ended up getting the information another way and the case was dropped. Bring us up to speed, what's happened since then?

Riana Pfefferkorn:
Following the Apple versus FBI dispute in San Bernardino, we saw the almost introduction of a bill by our very own here in California, Senator Dianne Feinstein, together with Senator Richard Burr, that would have imposed penalties on smartphone manufacturers that did not find a way to comply with court orders by unlocking phones for law enforcement.
That was in 2016. That bill was so roundly ridiculed that it never actually even got formally introduced in any committees or anything, much less went anywhere further beyond that. Then in the next few years, as law enforcement started being able to with, fair regularity, get into devices the way they had done in the San Bernardino dispute, we saw the debate shift, at least in the United States, from a focus on device encryption to a focus on end-to-end encryption for our communications, for our messages, particularly in end-to-end encrypted chat apps.

Cindy Cohn:
I also remember that there was another incident in Pensacola, Florida a few years ago, where the FBI once again tried to push Apple into it. Once again, the FBI was able to get the information without Apple having to hurt the security of the phones. So it seems to me that the FBI can get into our devices otherwise. So why do they keep pushing?

Riana Pfefferkorn:
It used to be that the rationale was encryption is wholly precluding us from getting access to evidence. But as it's become more and more obvious that they can open phones, as in the Pensacola shooting, as in the San Bernardino shooting, the way they speak about it has changed slightly to, "Well, we can't get into these phones quickly enough, as quickly as we would like."
Therefore, it seems that now the idea is that it is an impediment to the expeditiousness of an investigation rather than to being able to do the investigation at all. And so, if there were guaranteed access by just being able to make sure that, by design, our devices were made to provide a ready-made backdoor for governments, then they wouldn't have to go through all of the pesky work of either having to use their own in-house tools in order to crack into phones, as the FBI has done with its own in-house capabilities, or purchase them, or seek out a vendor that has the capability of building exploits to allow them to get into those phones, which is what happened in the San Bernardino situation.

Danny O'Brien:
So I think this leaves people generally confused as to what data is protected and from whom on their phones. Now I think you've talked about two things here. One is the protecting data that's on the phone, people's contacts, stuff like that. Then there's the content of communications where you have this end-to-end encryption. But what is the government able to access? Who is the encryption supposed to protect people against? What's the current state of play?

Riana Pfefferkorn:
We can think about whether data is at rest or if it's in transit. So when the government seeks to get access to messages as they are live passing over the wire, over the airwaves between two people, that requires them to get a wiretap order that lets them see basically in real time the contents of communications that are going back and forth.
Once those have been received, once they are in the string of chat messages that I have in my chat app on my phone, or other messages or information that you might have, locally on your device, or remotely also in the cloud, we could talk about that, then that's a situation where there's a different mechanism that law enforcement would have to get.
They would need a warrant in order to get into, be able to search it and seize data off of your phone. So we're looking at two different points in time, potentially, for what might be the same conversation.
In terms of accessibility, I think if your device is encrypted, then that impedes law enforcement from rapidly being able to get into your phone. But once they do, using the third-party tools or homegrown tools that they have for getting into your phone, then they can see any text messages, conversations that you've got, unless you have disappearing messages turned on in the apps that you use, in which case they will have vanished from your particular device, your end point.
Whereas if law enforcement wants to get access to end-to-end encrypted communications as they're in transit on the wire, they're not going to be able to get anything other than gobbledygook, where they have to compel the provider to wiretap those conversations for them. And so, we've also seen some scattered efforts by US law enforcement to try and force the providers of end-to-end encrypted apps to remove or weaken that in order to enable wiretapping.

Danny O'Brien:
So the data that's stored on the phone, so this is the data that's encrypted and at rest, the idea behind devices and companies like Apple encrypting that is just a general protection. So if someone steals my phone, they don't get what I have, right?

Riana Pfefferkorn:
Yeah. It's funny, we used to see from the same heads of police agencies who subsequently got angry at Apple for having stronger encryption, they used to be mad about the rate at which phones were getting stolen. It wasn't so much that criminals wanted to steal a several hundred-dollar hunk of metal and glass. It was what they could get into by being able to easily get into your phone before that prevalence of strong default passcodes and stronger encryption to get into phones.
There was a treasure trove of information that you could get. Once you were in somebody's phone, you could get access to their email, you can get access to anything else that they were logged into, or have ways of resetting their logins and get into those services, all because you'd been able to steal their phone or their iPad or what have you.
And so, the change to making it harder to unlock phones wasn't undertaken by Apple, or subsequently by Google for Android phones, in order to stick it to law enforcement. It was to cut down on that particular angle of attack for security and privacy invasions that criminal rings or hackers or even abusive spouses or family members might be able to undermine your own interests in the thing that has been called basically an appendage of the human body by none other than our own Supreme Court.

Cindy Cohn:
In some ways it's the cops versus the cops on this, because the cops that are interested in helping protect us from crime in the first place want us to have our doors locked, want us to have set this lock down so that we're protected if somebody comes and steals from us. By the way, that's how most people feel as well.
Whereas the part of the cops that want to solve crimes, want to make it as easy as possible for them to get access to information. And so, in some ways, it's cop versus cop about this. If you're asking me, I want to side with the cop who wants to make sure I don't get robbed in the first place. So I think it's a funny conversation to be in.

Riana Pfefferkorn:
But it's exactly as you say, Cindy, that there are several different components of the government whose interests are frequently at odds when it comes to issues of security and privacy, in as much as not only is there a divide between law enforcement and the national security and intelligence communities when it comes to encryption, where the folks who come out of working at the NSA then turn around and say, "We try and push for stronger encryption because we know that one part of our job in the intelligence community is to try and ensure the protection of vital information and state secrets and economic protection and so forth," as opposed to law enforcement who have been the more vocal component of government in trying to push for undermining or breaking encryption.
Not only is there this divide between national security and the intelligence community and law enforcement, there's also a divide between law enforcement and consumer protection agencies, because I think that we find a lot of providers that have sensitive information and incentives to protect it by using strong encryption are in a bind, where on the one hand, they have law enforcement saying, "You need to make it easier for us to investigate people and to conduct surveillance," and on the other hand, they have state attorneys general, they have the Federal Trade Commission, and other authorities breathing down their necks saying, "You need to be using stronger encryption. You need to be taking other security measures in order to protect your customers' data and their customers' data."

Danny O'Brien:
So the argument seems to be from law enforcement, "Well, okay, stop here. No further. We don't want this to get any better protected." What are the arguments on the other side? What are the arguments for not only keeping the protections that we have already, but not stopping and continuing to make this stuff safer and more secure?

Riana Pfefferkorn:
There are a variety of different pain points. We can look at national security interests. There's the concept of commercial off-the-shelf software and hardware products where people in the military or people in government are using the same apps and the same phones that you or I use, sometimes with additional modifications, to try and make them more secure. But to the extent that everybody is using the same devices, and that includes CEOs and the heads of financial institutions and a high-ranking government officials, then we want those devices to be as secure just off the line as they could be given that variety of use cases.
That's not to say that average people like you or me, that our interests aren't important as well, as we continue to face growing ransomware pandemic and other cybersecurity and data breach and hacking incidents that seem to dominate the headlines.
Encryption isn't necessarily a cure all for all of those ills, but nevertheless, to the greater degree that we can encrypt more data in more places, that makes it more difficult for attackers to get anything useful in the event that they are able to access information, whether that's on a device or whether that's on a server somewhere.
All of this, of course, has been exacerbated by the COVID-19 pandemic. Now that all of us are at home, we're doing things over electronic mediums that we previously did face-to-face. We deserve just as much privacy and we deserve just as much security as we ever had when we were having those communications and meetings and doctor appointments and therapist appointments face-to-face.
And so, it's important, I think, to continue expanding security protections, including encryption, in order to maintain those expectations that we had now that so much more of what we do for the past 18 months has had to be online in order to protect our own physical health and safety.

Cindy Cohn:
Your answer points out that a focus on "it's just you, and you have nothing to hide" misses that, on one hand, we're not all the same. We have very different threats. Some of us are journalists. Some of us are dissidents. Some of us are people who are facing partner abuse.

One way or another, we all have a need to be secure and to be able to have a private conversation these days.

Riana Pfefferkorn:
One of the areas where I think we frequently undermine their interest is children and the privacy and speech interests of children, and the idea that children somehow deserve less privacy. We have restrictions.
Parents have a lot of control over their children's lives. But children are tomorrow's adults. And so, I think there's also been a lot of concern about not normalizing surveillance of children, whether that's doing school from home over the laptop, contexts again that we've had over the last 18 months of surveillance of children who are trying to do schoolwork or attend classes online.
There has been some concern expressed, for example, by Susan Landau, who's a computer science professor at Tufts, saying we need to not give children the impression that when they become tomorrow's adults, that we normalize surveillance and intrusion upon their own ability to grow and have private thoughts and become who they're going to become, and grow up into a world where they just think that extremely intrusive level of surveillance
is normal or desirable in society.

Danny O'Brien:
“How to Fix the Internet” is supported by The Alfred P. Sloan Foundation's Program in Public Understanding of Science, enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.

Cindy Cohn:
As long as we're talking encryption, I want to talk about the First Amendment, because, of course, that is the tool that we used in the 1990s to free up encryption from government regulation, with the observation that cryptography is just applied math and you can't regulate math without meeting the First Amendment tests. Is the First Amendment playing a role in the conversation today, or how do you think it'll play as this conversation goes forward?

Riana Pfefferkorn:
It's a great question because we have multiple court cases now that recognize that code is speech and is protected speech. To the degree that this is going to come into play in government efforts to either pass a law restricting the ability of providers, of devices, or communication services to offer strong encryption to their users, or to the degree it comes up in court orders to try and undermine existing strong protections, I think there are First Amendment arguments to be made.
When we were doing the friend-of-the-court briefing in the Apple versus FBI case, that came up multiple times, including in Apple's own briefing, to say, look, we have a First Amendment right not to be forced to hold out a suborned piece of software that the government's trying to compel us to write that would roll back security protections for this particular phone.
We have a First Amendment right not to be forced to stand by something that we don't actually stand by. We don't want to stand by that piece of software. We don't want to have to pretend like we do.
That's a little bit of a nuanced point, I think, to make. I think often when we talk about the First Amendment in the context of the internet, and this goes into current debates around content moderation as well, it becomes really easy to forget that we're not necessarily only talking about the First Amendment rights of end users as people who want to speak and receive information, receiving information also being a First Amendment protected right, it's also the rights of the companies and organizations that build these tools to be able to write and disseminate code as they wish and not to have to be forced into cryptographically signing a piece of code that they don't stand by, such as custom iOS software.

Cindy Cohn:
What Apple was saying here is, "Look, we're selling a secure tool. We are telling people this is a secure tool," and you're basically making us into liars.
It's one thing to think about speech rights in the abstract about corporations, but I think the government forcing a company to lie about the security of its product does, even if you're not a fan of corporations, feel like maybe that's something that the government shouldn't be able to do. It shouldn't be able to force me to lie, and it shouldn't be able to force Apple to lie.

Danny O'Brien:
So one of the things that fascinates me about the idea of compelling backdoors into the software produced by companies like Facebook, with WhatsApp and so forth, is what happens next? We've seen this a little bit in the rest of the world, because the UK and Australia have effectively introduced laws that are like this. But then they've had to hold off on actually introducing those back doors because the pushback from the companies and from the public has been so huge.
So I guess what I'm asking here is in the nightmare scenario where somebody does introduce this stuff, what happens? Do people suddenly ... Everybody around the world, everybody writes software that has backdoors in it? Does it really solve the problem that they're trying to solve here?

Riana Pfefferkorn:
It really does feel like a time machine in some ways, that we would end up maybe going right back to the 1990s when there were two different versions of the Netscape browser. One was domestic-grade crypto and one was export-grade crypto. Do we end up with a world where services have different versions with weaker or stronger encryption, depending on what market they're being offered in?

It seems like we're in a place right now where if you regulate and mandate weaker encryption at one level, then the encryption moves to a different level. So, for example, WhatsApp just announced that they are allowing opt-in to end-to-end encrypt your messaging backups. If you don't trust your cloud service provider not to be somehow scanning for verboten or disfavored content, then you could choose to turn on end-to-end encryption for your WhatsApp backups.

Or will people be scared of having them at all? I think one of the nightmare scenarios is that we go from this place where people finally have means of communicating openly and honestly and sincerely with each other, secure in the knowledge that their communications have protection, thanks to end-to-end encryption, or that they are able to encrypt their devices in such a way that they can't easily be accessed by others. Instead they get chilled into not saying what they want to say or fully realizing and self-actualizing themselves in the way that they would, which is the grim 1984 version.
But we've seen something of that in terms of the research that's already come out saying that people self-censor their search queries when they think that their search queries are going to somehow be monitored or logged by the government. You could envision a world where if people think they no longer have privacy and security over their communications with each other, or their files that they have on their phone or remotely, that they just stop thinking or saying or doing controversial thoughts or statements. That would be a huge loss.

Cindy Cohn:
So let's flip it around a little bit because we're fixing the internet here, not celebrating its brokenness. So, what are the values that we're going to get if we get this right?

Riana Pfefferkorn:
When we're talking about data security, I think we often think of it as: does this protect my private thoughts or less popular opinions? But it would protect everything. That's all the good and bad stuff that people do online.
But I think there will be a side effect of improving the security of everything, from your e-commerce or online banking to the communications that we have. If you are an attorney, I think there's a lot to be said for having stronger encryption for your communications with your clients and in other privileged contexts, whether that is your online therapist or e-health.
Instead of a move fast and break things, it's almost a move fast and fix things, where encryption has become more and more ubiquitous just by simply turning it on by default as choices that have been made by the same large providers that, while they are rightly subject to critique for their privacy practices or antitrust practices, or what have you, nevertheless have, because they have such massive user bases, done a lot for security simply by stepping their game up when it comes to their users.

Danny O'Brien:
Yeah. We have this phrase at the EFF, which is the tyranny of the defaults, where you get stuck in a particular world, not because you don't have the freedom to change it, but because everyone gets the default settings which exclude it. It would be great to flip that around in this utopia so that the defaults actually are on the side of the user rather than the people who want to peer into this.

Danny O'Brien:
What discussions would we be having if that was all behind us? What are the subtler debates that you want to get on to and that we would have in our beautiful utopia?

Riana Pfefferkorn:
I mean one thing would be just what does a world look like where we are not privileging and centering the interests of the state above all others? What does it look like to have an internet and devices and the digital environment that centers users and individual dignity? What does that mean when individual dignity means protection from harm or protection from abuse? What does it mean when individual dignity means the ability to express yourself openly, or to have privacy in your communications with other people?
Right now, I think we're in a place where law enforcement interests always get centered in these discussions. I think also, at least in the United States, there's been a dawning recognition that the state is not necessarily the one that has a monopoly on public safety, on protection, on justice, and in fact has often been an exponent of injustice and less safety for individuals, particularly people from communities of color and other marginalized groups.
And so, if we're looking at a world that needs to be the right balance of safety and free and liberty and having dignity, there are a lot of different directions that you could go, I think, in exploring what that means that do not play into old assumptions about, well, it means total access by police or other surveilling authorities to everything that we do.

Cindy Cohn:
Oh, what a good world. We're still safe. We have safety. As I've said for a long time, you can't surveil yourself to safety. In that world, we've recognized that and we've shifted towards how we give people the privacy and dignity they need to have their conversations. I think I'd be remiss if I didn't point out that police solved crimes before the internet. They solve crimes now without access to breaking encryption. And I think we said this at the beginning: it's not like this is blocking police. It might be making things just slightly slower, but at the cost of, again, our security and our dignity.
So I think in our future, we still solve crimes. I would love to have a future without crimes, but I think we're going to have them. But we still solve crimes. But our focus is on how do we empower users?


Riana Pfefferkorn:
Right. I think it's also easy in these discussions to fall into a technological solutionism mindset, where it's not going to be about only having more police powers for punitive and investigative purposes, or more data being collected or more surveillance being conducted by the companies and other entities that provide technology to us, that provide these media and devices to us, but also about the much harder societal questions of how do we fix misogyny and child abuse?

And having economic safety and environmental justice and all of these other things? Those are much harder questions, and we can't just expect a handful of people in Silicon Valley or a handful of people in DC to solve all of our way out of them.
I think it almost makes the encryption debate look like the simpler avenue by comparison, or by looking solely towards technological and surveillance-based answers, because it allows the illusion of addressing those harder questions about how to build a better society.

Cindy Cohn:
I think that's so right. We circle back to why encryption has been interesting to those of us who care about making the world better for a long time, because if you can't have a private conversation, you can't start this first step towards making the change you need to make in the world.

Well, thank you so much for coming to our podcast, Riana. It's just wonderful unpacking all of this with you. We're huge fans over here at EFF.

Riana Pfefferkorn:
Oh, the feeling is mutual. It's been such a pleasure. This has been a great conversation. Thanks for having me.

Danny O'Brien:
Well, as ever, I really enjoyed that conversation with one of the key experts in this area, Riana Pfefferkorn. One of the things I liked is we touched on some of the basics of this discussion, about government access to communications and devices, which is really this idea of the backdoor.

Cindy Cohn:
Yeah, but it's just inevitable, right? I love the image of a crack in the windshield. I mean once you have the crack in there, you really can't control what's going to come through. You just can't build a door that only good guys can get in and bad guys can't get in. I think that came really clear.
The other thing that became really clear in listening to Riana about this is how law enforcement's a bit alone in this. As she pointed out, the national security folks want strong security; they want it for themselves and for the devices that they rely on when they buy them off the shelf. And consumer protection agencies want strong security in our devices and our systems, because we've got this cybersecurity nightmare going on right now with data breaches and other kinds of things.
And so, all of us, all of us who want strong security are standing on the one side with law enforcement, really the lone voice on the other side, wanting us to have weaker security. It really does make you wonder why we keep having this conversation, given that it seems like it's outsized on the one side.

Danny O'Brien:
What did you think of the First Amendment issues here? Because I mean you pioneered this analysis and this protection for encryption, that code is speech and that trying to compel people to weaken encryption is like compelling them to lie, or at least compelling them to say what the government wants. How does that fit in now, do you think, based on what Riana was saying?

Cindy Cohn:
Well, I think that it's not central to the policy debates. A lot of this is policy debates. It becomes very central when you start writing down these things into law, because then you're starting to tell people you can code like this, but you can't code like that, or you need a government permission to be able to code in a particular way.
Then that's where we started in the Bernstein case. I think, once again, the First Amendment will end up being a backstop to some of the things that law enforcement is pushing for here that end up really trying to control how people speak to each other in this rarefied language of computer code.

Danny O'Brien:
We always like talking about the better future that we can get to on the show. I liked Riana's couching of that in terms of, first of all, the better future happens when we finally realize this conversation is going around in circles and there are more important things to discuss, like actually solving those problems, problems that are really deep and embedded in society that law enforcement is really chasing after.
I like the way that she conveyed that the role of technology is to really help us communicate and work together to fix those problems. It can't be a solution in its own right.
It's not often that people really manage to successfully convey that, because to people outside, I think, it all looks like tech solutions, and there's just some it works for and some it doesn't.

Cindy Cohn:
Yeah. I really appreciated the vision she gave us of what it looks like if we get this all right. That's the world I want to live in. Thank you so much to Riana Pfefferkorn for coming on the show and giving us her vision of the utopia we could all have.

Danny O'Brien:
And thanks to Nat Keefe and Reed Mathis of Beat Mower for making the music for this podcast. Additional music is used under a Creative Commons license from CCMixter. You can find the credits for each of the musicians and links to the music in our episode notes.

Please visit eff.org/podcasts where you'll find more episodes, learn about these issues, donate to become a member, as well as lots more. Members are the only reason we can do this work, plus you can get cool stuff like an EFF hat or an EFF hoodie or even an EFF camera cover for your laptop.

“How to Fix the Internet” is supported by The Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology.

I’m Danny O'Brien.

Cindy Cohn:
And I’m Cindy Cohn.

 

Christian Romero

Digital Services Act: EU Parliament's Key Committee Rejects a Filternet But Concerns Remain

1 month 1 week ago
Fix What is Broken vs Break What Works: Oscillating Between Policy Choices

The European Union's Digital Services Act (DSA) is a big deal. It's the most significant reform of Europe’s internet platform legislation in twenty years, and the EU Commission has proposed multiple new rules to address the challenges brought by the increased use of services online. While the draft proposal got many things right -- for example by setting standards on transparency, proposing limits on content removal, and allowing users to challenge censorship decisions -- signals from the EU Parliament showed that we were right to worry about where the DSA might be headed. Leading politicians suggested a dystopian set of rules that promote the widespread use of error-prone upload filters. Proponents of rules that would make any “active platform” (which one isn't?) potentially liable for the communications of its users showed that not everyone in Parliament had learned from the EU's controversial Copyright Directive, which turns online platforms into the internet police, with special license to scan and filter users' content. Together with our allies, we called on the EU Parliament to reject the idea of a filternet, made in Europe, and to refrain from undermining pillars of the e-Commerce Directive that are crucial to a free and democratic society. As any step in the wrong direction could reverberate around the world, we also added international voices to the debate, such as the Digital Services Act Human Rights Alliance, which stands for transparency, accountability, and human rights-centered lawmaking.

Committee Vote: Parliament Listened to Civil Society Voices

In this week's vote, EU members of parliament (MEPs) showed that they listened to civil society voices: Even though the key committee on internal market affairs (IMCO) did not follow in the footsteps of the ambitious DSA reports from last year, MEPs took a stance for the protection of fundamental rights and agreed to:

- Preserve the liability exemptions for internet companies: Online intermediaries will continue to benefit from the "safe harbor" rules, which ensure that they cannot be held liable for content provided by users unless they know it is illegal and don't act against it (Art 5);

- Uphold and strengthen the ban on mandated monitoring: Under current EU internet rules, general monitoring of information transmitted or stored by intermediary service providers is banned, guaranteeing users' freedom of expression and their rights to personal data as memorialized in the Fundamental Rights Charter, which enshrines the fundamental rights people enjoy in the EU. MEPs preserved this key principle and clarified that monitoring should be imposed neither by law nor de facto, through automated or non-automated means (Art 7(1));

- Abstain from introducing short deadlines for content removals: The Committee recognized that strict and short time frames for content removals, as imposed under dangerous internet legislation like Germany's NetzDG or the EU's controversial copyright directive, will lead to removals of legitimate speech and opinion, thus impinging on the right to freedom of expression;

- Not interfere with private communication: The Committee acknowledged the importance of privacy online and rejected measures that would force companies to analyze and indiscriminately monitor users' communication on private messaging services like WhatsApp or Signal. Even though Parliamentarians were not ambitious enough to call for a general right to anonymity, they agreed that Member States should not prevent providers of intermediary services from offering end-to-end encrypted services, nor impose monitoring measures that could limit the anonymous use of internet services (Art 7(1b)(1c)).

We welcome these agreements and appreciate the committee's commitment to approving important procedural justice rights for users as recommended by EFF, such as the reinstatement of content and accounts that have been removed by mistake (Art 17(3)), and the important territorial limitation of take-down orders by courts or authorities (Art 8(2b)), which makes clear that one country's government shouldn't dictate what residents of other countries can say, see, or share online.

We also applaud the Committee's strong focus on transparency—platforms must explain how content moderation works and disclose the number of content moderators allocated for each official language—and its strengthening of risk assessment mechanisms—platforms must take into account language- and region-specific risks when assessing systemic risk resulting from their service. Lastly, we commend the Committee's inclusion of an ambitious dark patterns prohibition: Platforms are banned from using misleading design or functionality choices that impair users' ability to control and protect their internet experience.

Concerns Remain: Enforcement Overreach, Trust Issues, and Walled Gardens

However, we continue to be worried that the DSA could lead to enforcement overreach and assign trust to entities that shouldn't necessarily be trusted. If the DSA becomes law, online platforms would be required to hand over sensitive user information to non-judicial authorities at their request. While we acknowledge the introduction of procedural safeguards—platforms would be granted the right to lodge an effective remedy—essential human rights guarantees are still missing. Other sections of the bill, even though they come with a number of positive changes compared to the EC's original proposal, still favor the powerful. The text still comes with the option of awarding the status of a "trusted flagger" to law enforcement agencies or profit-seeking industry organizations, whose notices must be given priority over notifications submitted by users. Even though conditions for becoming trusted flaggers were tightened and accompanied by comprehensive reporting obligations, further improvements are necessary.

Parliamentarians also failed to follow the lead of their colleagues, who recently took a first step towards a fair and interoperable market in their vote on the Digital Markets Act (DMA). Whereas DMA amendments called for functional interaction between messaging services and social networks, MEPs did not back key provisions that would ensure interoperability of services and instead went with a lofty non-binding political declaration of intent in the DSA.

Only incremental improvements were made to the limits on surveillance advertising and to the requirement that platforms appoint in-country legal representatives, which is unaffordable for many small non-EU providers. We also agree with the criticism that centralizing enforcement power in the hands of the EU Commission comes with democratic deficits and could lead to corporate capture. There are further aspects of the committee's position that require re-working, such as the last-minute approval of mandatory cell phone registration for pornographic content creators, which poses a threat to digital privacy and could risk exposing sensitive information of vulnerable content creators to data leaks.

We will analyze the details of the committee position in the next weeks and will work to ensure that EU lawmakers agree on a text that preserves what works and fixes what is broken.

Christoph Schmon

Dream Job Alert: Senior Fellow for Decentralization at EFF

1 month 1 week ago

We have an amazing opportunity to join the EFF team.

We are hiring a Senior Fellow of Decentralization, a public advocacy position that will help establish EFF as a leader on the civil liberties implications of decentralizing the Internet. You’ll help chart a course for EFF to have real impact in the public conversations about decentralization of the Internet as it fits into our mission of ensuring that technology supports freedom, justice, and innovation for all people of the world. Through fierce and informed public advocacy, blogging, and social media, this Fellow will help us create a future internet that is more decentralized, resilient, and protective of both civil liberties and innovation. Apply today. Note that this is a two-year fellowship, with the possibility of extension for one to two additional years depending on the outcomes of those two years and EFF’s needs.

The landscape of decentralization is broad. Technologies that can re-decentralize the internet, increase competition, and provide resources to those who are not being served properly by the existing world point to the future we would like to help bring about. There are three major areas of activity where we expect you to work:

  1. New protocols that provide internet services facilitating competition, freedom of expression, and privacy, and that anyone can set up and use effectively. For example: Mastodon, Blue Sky, Diaspora, Tor onion services, and Manyverse.
  2. “Web3” or “DeWeb” technologies that provide for a decentralized infrastructure using blockchain technology, especially as they support a more privacy-protective experience than today’s surveillance-facilitating technologies.
  3. “DeFi” technologies including cryptocurrencies and DAOs that can potentially reorganize finance and payment systems for a more privacy protective and equitable use of money.

We will prioritize the work in that order. 

We are open to many different types of candidates for this role. We’re interested in candidates who span advocacy and implementation. It’s most important for candidates to be able to explain, think about, and advocate for resilient decentralized systems that protect and preserve civil liberties and innovation. Secondarily, the technical skills to use these systems and to help us use them ourselves are helpful, but not required. And we value diversity in background and life experiences.

EFF is committed to supporting our employees. That’s why we’ve got competitive salaries, incredible benefits (including rental assistance, student loan reimbursement, and fantastic healthcare), and ample policies for paid time off and holidays. 

Please check out our job description and apply today!  And if you are someone who has a question about this role, please email jon@eff.org. 

Even if this job isn’t the right fit for you, please take a moment to spread the word on social media. We’re looking for an unusual person who is excited to help bring about a decentralized, locally empowered digital world.





Jon Callas

EFF to Federal Appeals Courts: Hold Police Accountable for Violating Civilians’ Right to Record

1 month 1 week ago

You have the right under the First Amendment to livestream and record on-duty police officers and officers who interfere with that right should be held accountable. That’s what EFF told the Fourth and Tenth Circuit Courts of Appeals in amicus briefs filed in November. EFF is supporting the plaintiffs in those cases, Sharpe v. Winterville and Irizarry v. Yehia, who are suing police officers for violating their right to record.

After police officers beat Dijon Sharpe during a traffic stop, he decided that next time he was in a car that was pulled over, he would livestream and record the police. So in October 2018, Sharpe, sitting in the passenger seat of a stopped car, took out his phone and started livestreaming on Facebook. When an officer saw that he was livestreaming, he grabbed Sharpe and tried to take the phone. Sharpe filed a civil rights lawsuit for the interference, a federal district court dismissed his claims in two opinions, and he appealed.

Abade Irizarry was recording a traffic stop as a bystander when another police officer interfered. The officer shined lights into Irizarry’s phone camera, stood between the camera and the traffic stop, menaced Irizarry with his car, and blasted him with an air horn. Irizarry appealed after a federal district court dismissed his lawsuit.

EFF thinks these officers should be held accountable. The First Amendment protects people who gather and share information, especially when it is about official misconduct and advances government accountability. Police body cameras point towards the public, effectively surveilling those already being policed. The civilian’s camera, by contrast, is appropriately pointed towards the officer. Ordinary people’s livestreams and recordings of the police have always been necessary to inform the public—before the police murder of George Floyd went viral in June 2020, there was the beating of Rodney King in March 1991.

But in the digital age, with the proliferation of smartphones with cameras and access to social media platforms, the right to record has become even more important, powerful, and accessible. Earlier this year, the Pulitzer Prize board awarded a special citation to Darnella Frazier, who recorded the shocking police murder of George Floyd on her phone. The board commended her for “courageously recording the murder of George Floyd, a video that spurred protests against police brutality around the world, highlighting the crucial role of citizens in journalists’ quest for truth and justice.”

Recording the police, while constitutionally protected, does have risks. Unfortunately, police officers often retaliate against people who exercise their right to record. In those situations, it is particularly important for people to be able to livestream them, like Sharpe did. If a person livestreams an encounter with a police officer, they can publish at least part of the encounter even if the officer retaliates and forces them to stop. The officers in Sharpe’s case claim, without evidence, that because livestreaming gives viewers real-time information about where officers are, it poses a greater risk to officer safety than recording. However, a bifurcated right to record but not livestream would confuse people, leave officers in the impractical position of having to verify what the person is doing with their phone, and stifle police accountability.

Finally, EFF argued that qualified immunity should not protect officers who violate someone’s clearly established right to livestream and record. Not only did the Supreme Court long ago decide that the First Amendment protects gathering and publishing information, but several federal circuits have specifically applied this to recording the police. Police officers know that when they use their extraordinary powers, the public has the right to watchdog and record them. When the police violate that right, the public must be able to hold them accountable.

Mukund Rathi

This Is Not the Privacy Bill You’re Looking For

1 month 1 week ago

Lawmakers looking for a starting place on privacy legislation should pass on The Uniform Law Commission’s Uniform Personal Data Protection Act (UPDPA). The Uniform Law Commission (ULC) seeks to write model legislation that can be adopted in state legislatures across the country to set national standards. Sadly, the ULC has fumbled its consumer privacy bill and created, in the UPDPA, a model bill that is weak, confusing, and toothless.

A strong privacy bill must place consumers first. EFF has laid out its top priorities for privacy laws, which include a full private right of action that allows people to act as their own privacy enforcers, and measures that prevent companies from discriminating—by charging more or offering less—against those who wish to protect their privacy by exercising their rights. EFF also advocates for an opt-in consent model that requires companies to obtain a person’s permission before they collect, share, or sell their data, rather than an opt-out model.

The UPDPA falls short on many of these fronts. And why? Because, despite years of evidence that companies will not protect consumer privacy on their own, the UPDPA defers to company complaints that respecting people’s privacy is a burden. In fact, UPDPA Committee Chairman Harvey Perlman openly admitted that one of the drafting committee’s main goals was to lower business compliance costs.

By seeking a middle path on some of the biggest disagreements between consumer advocates and companies looking to do as little as possible to change their practices, the UPDPA has come up with “compromises” that work for no one. Company advocates find its suggestions confusing, as it sets up yet another framework for compliance. Consumer advocates find the “protections” in the bill hollow. It’s no surprise that one Oklahoma legislator told the International Association of Privacy Professionals the bill was “empty.” “There appears to be nothing else substantive in this bill besides an obligation for the data company to provide a voluntary consent standard,” he said.  “Essentially those in control of the data get to decide what their policies and procedures are going to be. So this law is empty because it’s saying [businesses] have to come up with something to address privacy, but we’re not telling you exactly what it is.”

Consumer Rights, But Defined by Companies

By lowering its standards to coax companies into compliance, the UPDPA leaves consumers twisting in the wind. At its core, the bill hinges on whether a company uses your information for purposes that are either “compatible” or “incompatible” with the reasons the company originally collected the information. So, for example, you might allow a company to collect your location information if it’s going to do something for you related to where you are, such as identifying certain restaurants near you. This kind of guardrail might sound good at first blush; in fact, it’s in line with an important privacy principle: companies should only use a consumer’s information for the kinds of purposes that a consumer gave permission for in the first place. However, the UPDPA undermines the meaning of “compatible purpose”—providing no real protections for normal people.

First, individuals have no say over whether the purposes companies ultimately use their data for are “compatible” with the original purpose of collection, leaving that definition entirely up to companies. This gives a company wide latitude to process people’s information for whatever reason it may deem in keeping with the reason it collected it. That could include processing that a person wouldn’t want at all.  For example, if the company collecting your location information to tell you about nearby restaurants decided it also wanted to use that data to track your regular travel patterns, it could unilaterally classify that new use as supposedly “compatible” with the original use, without asking you to approve it. 

The UPDPA also defines targeted advertising as a “compatible purpose” that requires no extra consent—despite targeted ads being one of the most commonly derided uses of personal information. In fact, when consumers are given the choice, they overwhelmingly choose not to participate in advertising that tracks their behavior. This twists a principle meant to protect privacy into cover for an invasion most people do not want, letting it slip in under the lowest possible bar.

Furthermore, when a company uses a consumer’s data for an incompatible purpose, the bill only requires the company to give the consumer notice and an opportunity to opt-out. In other words, if a weather app had your permission to collect your location information for the purpose of locally-accurate forecasts, but then decided to share it with a bunch of advertisers, it wouldn’t have to ask for your permission first. It would simply have to give you a heads-up that “we share with advertisers” and the option to opt-out—likely in a terms and conditions update that no one ever reads.

Other rights in this bill, including those EFF supports such as the right to access one’s data and the right to correct it, are severely limited. For example, the bill gives companies permission to ignore correction requests that they deem “inaccurate, unreasonable, or excessive,” and lets them decide which requests meet these criteria without providing any justification. That gives companies far too much leeway to ignore what their customers want. And while the bill gives consumers the right to access their data, it does not give them the right to a machine-readable electronic copy—what is often called the right to data portability.

The UPDPA also comes up short on one of EFF’s most important privacy principles: making sure that consumers aren’t punished for exercising their privacy rights. Even in cases where the bill requires a company to get a person’s permission before using their data for an “incompatible data practice,” the company can offer a “reward or discount” in exchange for that permission. In other words, you can have your human right to privacy only if you’re willing and able to pay for it.

As we have said before, this type of practice frames our privacy as a commodity to be traded away, rather than a fundamental right to be protected. This is wrong. Someone who values their privacy but is struggling to make ends meet will feel pressured to surrender their rights for a very small gain—maybe $29 off a monthly phone bill. Privacy legislation should rebalance power in favor of consumers, not double-down on a bad system of corporate overreach.

The UPDPA Has Big Blind Spots…

The UPDPA also fails to address how data flows between private companies and government. It’s not alone in this regard: while the European General Data Protection Regulation (GDPR) covers both government and private entities, many state privacy laws in the United States focus on just one or the other.

However, there is a growing need to address the ways that data flows from private entities to government, and the UPDPA largely turns a blind eye to this threat. For example, the bill considers data “publicly available”—and therefore exempt from its protections—if it is “observable from a publicly accessible location.” That would seem to exempt, for example, footage from Ring cameras that people place on their doors, which documents what happens on adjacent public sidewalks. Information from Ring and other private cameras needs to be protected, particularly against indiscriminate sharing with law enforcement agencies. This is yet another example of how the model legislation ignores pressing privacy concerns.

The definition of publicly available information would also seemingly exempt entirely information posted on limited-access social media sites such as Facebook—including from requirements to adhere to privacy policies or security practices. Specifically, the UPDPA exempts "a website or other forum with restricted access if the information is available to a broad audience.” That is far too broad, and willfully ignores the ways private companies feed information from social media and other services into the hands of government agencies.

…And No Teeth

Finally, the UPDPA has gaping holes in its enforcement provisions. Privacy laws are only as good as their teeth. That means strong public enforcement and a strong private right of action. This bill has neither.

Worst of all, it expressly creates no private right of action, cutting people off from the most obvious avenue for defending themselves against a company that abuses their privacy: a lawsuit. Many privacy statutes contain a private right of action, including federal laws on wiretaps, stored electronic communications, video rentals, driver’s licenses, credit reporting, and cable subscriptions. So do many other kinds of laws that protect the public, including federal laws on clean water, employment discrimination, and access to public records. There’s no reason consumer privacy should be any different.

By denying people this obvious and powerful tool to enforce the few protections they gain in this law, the UPDPA fails the most crucial test.

State attorneys general do have the power to enforce the bill, but they have broad discretion to choose not to enforce it. That’s too big a gamble to play with privacy. Attorneys general might be understaffed or suffer regulatory capture—in those cases, consumers have no recourse whatsoever to be made whole for violations of the few privacy protections this bill provides.

Don’t Duplicate This Bill

While the UPDPA wrestles with many of the most controversial discussions in privacy legislation today, it falls short of providing a meaningful resolution to any of them. It grossly fails to address the privacy problems ordinary people face—invasive data collection, poor control over how their information is used, no clear means to fight for themselves—that have data privacy on the agenda in the first place. Lawmakers, federal or state, should not duplicate this hollow bill and lower the bar on privacy.

Hayley Tsukayama

On Behalf of Saudi Human Rights Activist, EFF Sues Spyware Maker DarkMatter for Violating U.S. Anti-Hacking Laws and International Human Rights Law

1 month 1 week ago
Loujain AlHathloul, the prominent women’s rights defender, filed a complaint against DarkMatter and former U.S. intelligence operatives who designed malware to hack her phone.

English version

PORTLAND, Oregon—The Electronic Frontier Foundation (EFF) today filed a lawsuit on behalf of prominent Saudi human rights activist Loujain AlHathloul against spyware company DarkMatter and three of its former executives for illegally and covertly hacking her iPhone to track her communications and whereabouts.

AlHathloul is among the victims of an illegal spying program created and run by former U.S. intelligence operatives, including the three defendants named in the lawsuit, who worked for a U.S. company hired by the United Arab Emirates in the wake of the Arab Spring protests to identify and monitor activists, journalists, rival foreign leaders, and perceived political enemies.

The Reuters news agency reported on the hacking program, known as Project Raven, in 2019: when the UAE transferred the surveillance work to the Emirati firm DarkMatter, American operatives who had learned their intelligence tradecraft at the National Security Agency and other U.S. intelligence agencies came to run DarkMatter’s hacking program, which targeted human rights activists like AlHathloul, political dissidents, and even Americans residing in the United States.

DarkMatter executives Marc Baier, Ryan Adams, and Daniel Gericke, working for their UAE client (which was in turn acting on behalf of Saudi Arabia), oversaw the hacking project, which exploited a vulnerability in the iMessage app to locate and monitor targets. Baier, Adams, and Gericke, all former members of U.S. intelligence or military agencies, designed and operated the UAE’s cyber surveillance program, also known as Project DREAD (Development Research Exploitation and Analysis Department), using malicious code purchased from a U.S. company.

Baier, who resides in the UAE, Adams, who resides in Oregon, and Gericke, who lives in Singapore, admitted in September to violating the Computer Fraud and Abuse Act (CFAA) and prohibitions on selling sensitive military technology, under a non-prosecution agreement with the U.S. Department of Justice.

“Companies that provide surveillance software and services to repressive governments, resulting in human rights abuses, must be held accountable,” said EFF Civil Liberties Director David Greene. “The harm done to Loujain AlHathloul cannot be erased, but this lawsuit is a step toward accountability.”

AlHathloul, whose statement on the case appears below, is a pioneer of the movement to advance women’s rights in Saudi Arabia, where women were banned from driving until 2018, are required by law to obtain a guardian’s permission to work or travel, and face discrimination and violence. She rose to prominence through her advocacy for women’s right to drive, and put herself at great risk in 2014 by announcing her intention to drive across the border from the UAE into Saudi Arabia and filming herself behind the wheel. She was stopped at the Saudi border and jailed for 73 days. Undeterred, AlHathloul continued to advocate for women’s rights, and continued to be a target of the kingdom’s efforts to suppress dissent.

DarkMatter deliberately directed code at Apple’s servers in the United States in order to deliver malware to, and place it on, AlHathloul’s iPhone, a violation of the Computer Fraud and Abuse Act, EFF says in the complaint, filed in federal court in Oregon. The phone was initially hacked in 2017, enabling access to her text messages, emails, and physical location data. Later, AlHathloul was driving on a highway in Abu Dhabi when UAE security services arrested her and forcibly flew her to Saudi Arabia, where she was imprisoned twice, including in a secret prison where she was subjected to electric shocks, flogging, and threats of rape and death.

"تجاوز مشروع ريفين السلوك الذي رأيناه من مجموعة NSO، التي تم الكشف مرارًا وتكرارًا بيعها برامج لحكومات استبدادية تستخدم أدواتها للتجسس على الصحفيين والنشطاء والمعارضين"، قالت إيفا جالبيرين، مديرة الأمن السيبراني في مؤسسة الجبهة الأمامية. وأضافت: " لم توفر شركة دارك ماتر الأدوات فحسب، بل أشرفوا على برنامج المراقبة بأنفسهم".

While EFF has long pressed the need to reform the Computer Fraud and Abuse Act, this case represents a straightforward application of the law to exactly the kind of egregious violation of users’ security that everyone agrees it was meant to address.

“This is a clear-cut case of device hacking, in which DarkMatter operatives broke into AlHathloul’s iPhone without her knowledge to insert malware, with horrific consequences,” said EFF attorney and Stanton Fellow Mukund Rathi. “This kind of crime is what the Computer Fraud and Abuse Act was meant to punish.”

In addition to the Computer Fraud and Abuse Act violations, the complaint alleges that Baier, Adams, and Gericke aided and abetted crimes against humanity, because the hacking of AlHathloul’s phone was part of the UAE’s widespread and systematic attack on human rights defenders, activists, and other critics of the Emirati and Saudi governments.

The law firms Foley Hoag LLP and Boise Matthews LLP are serving as co-counsel with EFF in the case.

 

Loujain AlHathloul’s statement on the case

 

"لم أتخيل مطلقًا أن يتم الاحتفاء بي لدفاعي عن ما اعتقدت أنه صحيح. دفعني إدراكي المبكر لامتياز التحدث بصوت عالٍ وصراحة نيابة عن النساء وعني إلى الانخراط في مجال المدافعين عن حقوق الإنسان.

 

"في مقال نُشر عام 2018 بعنوان الحريات المخطوفة، عبرت فيه عن فهمي للحرية في أن تكون أمانًا وسلامًا:

"الأمان في التعبير، والشعور بالحماية، والعيش والحب.

[و] السلام للكشف عن إنسانية أنقى وأخلص مغروسة في أعماق أرواحنا وعقولنا دون التعرض لعواقب لا تغتفر.

حُرمت من الأمان والسلام، فقدت حريتي. إلى الأبد؟'

"في السابق ، كان لدي علم محدود لجوانب الضرر الذي يمكن أن يواجهه المدافع عن حقوق الإنسان، أو أي فرد يدافع عن حقه، لا سيما في عالم الإنترنت. اليوم، أقوم ادافع عن الأمان عبر الإنترنت بالإضافة إلى الحماية من إساءة استخدام القوة من قبل الشركات الإلكترونية في فهمي للسلامة. يجب اعتبار الأخير حقًا أساسيًا وطبيعيًا في واقعنا الرقمي.

"لا ينبغي لأي حكومة أو فرد أن يتسامح مع إساءة استخدام برامج التجسس الخبيثة لردع حقوق الإنسان أو تعريض صوت الضمير البشري للخطر. لهذا السبب اخترت الدفاع عن حقنا الجماعي في البقاء آمنين على الإنترنت والحد من الانتهاكات الإلكترونية للسلطة المدعومة من الحكومة. ما زلت أدرك امتيازي للعمل ربما بناءً على معتقداتي.

"آمل أن تلهم هذه القضية الآخرين لمواجهة جميع أنواع الجرائم الإلكترونية لخلق مساحة أكثر أمانًا لنا جميعًا للنمو والمشاركة والتعلم من

For the complaint:

https://www.eff.org/document/alhathloul-v-darkmatter

For more information on state-sponsored malware:

https://www.eff.org/issues/state-sponsored-malware

Contact:

Karen Gullo

Carlos Wertheman

Virtual Worlds, Real People: Human Rights in the Metaverse

1 month 1 week ago

Download the report

Today, December 10, is International Human Rights Day. On this day in 1948, the U.N. General Assembly adopted the Universal Declaration of Human Rights, the document that lays out the principles and building blocks of current and future human rights instruments. In honor of this anniversary, Access Now and the Electronic Frontier Foundation (EFF) are calling upon governments and companies to address human rights in the context of virtual and augmented reality (VR and AR) and ensure that these rights are respected and enforced. 

Extended Reality (XR) technologies, including virtual and augmented reality, are the foundations of emerging digital environments, including the so-called metaverse. They are still at an early stage of development and adoption, but Big Tech is investing heavily in these technologies, and there is a scramble to assert dominance and cement monopolies in what tech investors and executives claim will be the next generation of computing and social media.

Like any other technology, XR can have many positive impacts on our daily lives. It can be a useful tool in areas like medicine, science, and education. Artists are using XR creatively to make virtual worlds their canvas and create new forms of expression. Protests and social movements have also used these technologies to raise awareness of collective issues, or to make their voices heard when doing so in person is physically impossible or dangerous.

Yet XR also poses substantial risks to human rights. VR headsets and AR glasses, coupled with other wearables, could continue the march towards ever-more-invasive data collection and ubiquitous surveillance. This data harvesting, sometimes done by companies with a history of putting profit before protections, sets the stage for unprecedented invasions into our lives, our homes, and even our thoughts, as data collected by XR devices is used for targeted advertising and to enable new forms of “biometric psychography” to make inferences about our deepest desires and inclinations. Once this data is collected, there is little users can do to mitigate the harm done when it is leaked or monetized by third parties. These devices will also collect huge amounts of data about our homes and private spaces, and could allow governments, companies, and law enforcement illegitimate access to our lives, exacerbating already severe intrusions on our privacy.

These new technologies also create new avenues for online harassment and abuse. AR glasses risk drastically undermining expectations of privacy in both private and public spaces. A person wearing the glasses can easily record their surroundings in secret, which only becomes more dangerous if surveillance technologies such as face recognition are incorporated. 

We have learned many lessons from everything that’s gone wrong, and right, with the current generation of smart devices and social media, and we need to apply these lessons now to ensure that everyone can take advantage of XR technologies and the metaverse without sacrificing fundamental human rights we hold dear.

Here’s what we know:

  • We know that self-regulation on data protection and ethical guidelines are not sufficient to rein in the harms caused by technology. 
  • We know that we need human rights standards to be placed at the center of developments in XR to ensure that our rights are not only respected, but indeed extended, in the metaverse.
  • We need appropriate regulation and enforcement to protect people’s privacy and other human rights in the metaverse.
  • We also need to nurture the grassroots, rights-respecting tech being developed today. Lawmakers need to be vigilant that Big Tech companies don’t swallow up all their competitors before they have a chance to develop rights-respecting alternatives to dominant, surveillance-driven platforms.

To this end, we ask governments to ensure that protections against state and corporate overreach and intrusion apply to XR, as follows: 

  • Governments must enact or update data protection legislation that limits data collection and processing to include data generated and collected by XR systems, including medical or psychographic inferences. Governments should clearly define this data as sensitive, strongly protected personal data under the law, even when it does not meet the high threshold to be classified as biometric, personal data, or personally identifiable information (PII) under current law. Legislation should recognize XR systems can be used to make problematic, invasive inferences about our thoughts, emotions, inclinations, and private mental life.
  • Responsible independent authorities must act to enforce data protection laws and protect people’s rights. Research has shown that people's privacy “choices” to let businesses process their data are typically involuntary, prone to cognitive biases, and/or circumventable due to human limitations, dark patterns, legal loopholes, and the complexities of modern data processing. Authorities should require transparency about and control over not only the collected data but also the use or disclosure of the inferences the platform will make about users (their behavior, emotions, personality, etc.), including the processing of personal data running in the background. Thus, the legal paradigm of notice-and-choice as it is practiced today needs to be challenged.  
  • The metaverse should not belong to any one company. Competition regulators must take steps to safeguard the diversity of metaverse platforms and prevent monopolies over infrastructure and hardware, so users don’t feel locked into a given platform to enjoy full participation in civic, personal, educational, social, or commercial life, or feel that they have to tolerate these failures to remain connected to vital realms of human existence. These pro-competitive interventions should include merger scrutiny, structural separation of dominant firms from adjacent elements of their supply chains, prohibitions on anticompetitive conduct such as predatory pricing, mandatory interoperability of key protocols and data structures, and legal safeguards for inter-operators who use reverse engineering and other "adversarial interoperability" to improve a service's security, accessibility, and privacy. This is not an exhaustive list, and, should metaverse technologies take hold, they will almost certainly give rise to new, technology-specific human rights and competition concerns and remedies.
  • Governments should ensure that the 13 International Principles on the Application of Human Rights to Communications Surveillance are applied, existing privileges against government intrusion are reaffirmed, and legal protections are extended to other types of data, such as psychographic and behavioral data and inferences drawn from them.
  • Governments should increase transparency around their use of XR. As governments start using XR for training and simulations, deliberation and decision-making, and public meetings, new kinds of information will be produced that will constitute public records, and these records should be made available to the public under freedom of information laws.
  • As XR technologies become ubiquitous, companies should respect and governments should protect people’s right to repair, alter, or investigate the functionality of their own devices.
  • As in real life, governments must refrain from censoring free expression and inhibiting journalistic freedoms, and instead encourage participatory exchanges in the marketplace of ideas. With the rise of regulatory initiatives around the world that threaten to chill free expression, it is crucial to adhere to proportional measures, consistent with the Santa Clara Principles, balancing legitimate objectives with the freedom to receive and impart information. 

Companies have a responsibility to uphold human rights as guided by the U.N. Guiding Principles on Business and Human Rights, a global standard of “expected conduct for all business enterprises wherever they operate,” applicable in all situations. 

The corporate responsibility to respect human rights also means addressing adverse impacts that may occur, as follows: 

  • Companies should publicly pledge to require that governments obtain the necessary legal process before accessing XR data, to notify users when allowed by law, to regularly publish transparency reports, to utilize encryption (without backdoors), and to fight to limit the data that can be accessed to what is necessary, adequate, and proportionate.
  • Companies, including manufacturers and providers, should not only protect their users’ right to privacy against government surveillance but also their users’ right to data protection. They must resist the urge, all too common in Silicon Valley, to “collect it all,” in case it may be useful later. Instead, companies should apply strict data minimization and privacy-by-design principles, collecting only what is required for base functionality or to provide specific services users have requested and agreed to, and retaining it only as long as necessary. The less data companies collect and store now, the fewer unexpected problems will arise later if the data is stolen, breached, repurposed, or seized by governments. Any processing of data should also be fair and proportionate.
  • Companies should be clear with users about who has access to their data, including data shared as part of one’s terms of employment or school enrollment, and adopt strong transparency policies, explicitly stating the purposes for and means of data processing, and allowing users to securely access and port their data. 
  • The development and deployment of XR technology must be scrutinized to identify and address potential human rights risks and ensure they are deployed with transparency, proportionality, fairness, and equity.  

To investors:

  • Investors should evaluate their portfolios to determine where they may be investing in XR technologies and use their leverage to ensure that portfolio companies adhere to human rights standards in the development and deployment of XR technologies.

Digital rights activists and the XR community at large have a significant role to play in protecting human rights, as follows:

  • XR enthusiasts and reviewers should prioritize open and privacy-conscious devices, even if they are only entertainment accessories. Activists and researchers should focus on creating a future where XR technologies work in the best interests of users and society overall.
  • Digital rights advocates and activists should start investigating XR technologies now and make their demands heard by companies and regulators, so their expertise can inform developments and government protections at this early stage.
  • XR communities should educate themselves about the social and human rights implications of the technologies they are developing, and commit to responsible practices.

Our XR data should be used in our own interests, not to harm or manipulate us. Let’s not let the promise of the next generation of computing fail in the same ways the prior generation has. The future is tomorrow, so let’s make it a future we would want to live in.

Katitza Rodriguez