EFF’s Submission to Ofcom’s Consultation on Illegal Harms


More than four years after it was first introduced, the Online Safety Act (OSA) was passed by the U.K. Parliament in September 2023. The Act seeks to make the U.K. “the safest place” in the world to be online and provides Ofcom, the country’s communications regulator, with the power to enforce this.

EFF has opposed the Online Safety Act since it was first introduced. It will lead to a more censored, locked-down internet for British users. The Act empowers the U.K. government to undermine not just the privacy and security of U.K. residents, but internet users worldwide. We joined civil society organizations, security experts, and tech companies to unequivocally ask for the removal of clauses that require online platforms to use government-approved software to scan for illegal content. 

Under the Online Safety Act, websites and apps that host content deemed “harmful” to minors will face heavy penalties; the problem, of course, is that views on what type of content is “harmful” vary, in the U.K. as in all other societies. Soon, U.K. government censors will make that decision. 

The Act also requires mandatory age verification, which undermines the free expression of both adults and minors. 

Ofcom recently published the first of four major consultations seeking information on how internet and search services should approach their new duties on illegal content. While we remain opposed to the Act itself, we are continuing to engage with Ofcom to limit the damage to our most fundamental rights online. 

EFF recently submitted information to the consultation, reaffirming our call on policymakers in the U.K. to protect speech and privacy online. 


Encryption

For years, we opposed a clause in the then-Online Safety Bill allowing Ofcom to serve notices requiring tech companies to scan all of their users, without exception, for child abuse content. We are pleased to see that Ofcom’s recent statements note that the Online Safety Act will not apply to end-to-end encrypted messages. Encryption backdoors of any kind are incompatible with privacy and human rights. 

However, there are places in Ofcom’s documentation where this commitment can and should be clearer. In our submission, we affirmed the importance of protecting people’s right to use and benefit from encryption, regardless of the size and type of the online service. The commitment not to scan encrypted data must be firm, no matter the size of the service or which encrypted services it provides. For instance, Ofcom has suggested that “file-storage and file-sharing” services may fall under a different risk profile that could justify mandated scanning. But encrypted “communications” are not significantly different from encrypted “file-storage and file-sharing.”

In this context, Ofcom should also take note of the milestone judgment in Podchasov v. Russia (Application no. 33696/19), in which the European Court of Human Rights (ECtHR) ruled that weakening encryption can lead to general and indiscriminate surveillance of all users’ communications and violates the human right to privacy. 

Content Moderation

An earlier version of the Online Safety Bill enabled the U.K. government to directly silence user speech and imprison those who publish messages it doesn’t like. It also empowered Ofcom to levy heavy fines on, or even block access to, sites that offend people. We were happy to see this clause removed from the bill in 2022, but many problems with the OSA remain. Our submission on illegal harms affirmed the importance of ensuring that users have greater control over what content they see and interact with, are equipped with knowledge about how various controls operate and how to use them to their advantage, and retain the right to anonymity and pseudonymity online.

Moderation mechanisms must not interfere with users’ freedom of expression, and moderators should receive ample training and materials to ensure cultural and linguistic competence in content moderation. When moderators face time pressure to make determinations, companies often remove more than necessary to avoid potential liability and are incentivized to use automated technologies such as content-removal systems and upload filters. These are notoriously inaccurate and prone to overblocking legitimate material. Moreover, the moderation of terrorism-related content is especially error-prone, and any new mechanism, such as hash matching or URL detection, must be subject to expert oversight. 
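To illustrate one failure mode of hash matching, here is a minimal Python sketch (the blocklist and content here are hypothetical, not any platform's actual system): an exact-hash filter flags only byte-identical files, so a trivial one-byte change evades it entirely, while the fuzzier perceptual hashes deployed to close that gap introduce false positives on legitimate material.

```python
import hashlib


def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of `data` as a hex string."""
    return hashlib.sha256(data).hexdigest()


# Hypothetical blocklist of hashes of known-bad content.
BLOCKLIST = {sha256_hex(b"known illegal file contents")}


def is_flagged(upload: bytes) -> bool:
    """Flag an upload only if its hash exactly matches an entry in the blocklist."""
    return sha256_hex(upload) in BLOCKLIST


# An exact copy is caught...
print(is_flagged(b"known illegal file contents"))   # True
# ...but changing a single byte produces a completely different hash
# and evades the filter.
print(is_flagged(b"known illegal file contents!"))  # False
```

This brittleness is why deployed systems lean on perceptual hashing instead, which tolerates small changes at the cost of sometimes matching unrelated, legitimate files — exactly the overblocking risk described above.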

Next Steps

Throughout this consultation period, EFF will continue contributing to and monitoring Ofcom’s drafting of the regulation. And we will continue to hold the U.K. government accountable to the international and European human rights protections to which it is a signatory.

Read EFF's full submission to Ofcom

Paige Collings

The Foilies 2024

Recognizing the worst in government transparency.

The Foilies are co-written by EFF and MuckRock and published in alternative newspapers around the country through a partnership with the Association of Alternative Newsmedia.

We're taught in school about checks and balances between the various branches of government, but those lessons tend to leave out the role that civilians play in holding officials accountable. We're not just talking about the ballot box, but the everyday power we all have to demand government agencies make their records and data available to public scrutiny.

At every level of government in the United States (and often in other countries), there are laws that empower the public to file requests for public records. They go by various names—Freedom of Information, Right-to-Know, Open Records, or even Sunshine laws—but all share the general concept that because the government is of the people, its documents belong to the people. You don't need to be a lawyer or journalist to file these; you just have to care.

It's easy to feel powerless in these times, as local newsrooms close and elected officials embrace disinformation as a standard political tool. But here's what you can do, and we promise it'll make you feel better: Pick a local agency—it could be a city council, a sheriff's office or state department of natural resources—and send them an email demanding their public record-request log, or any other record showing what requests they receive, how long it took them to respond, whether they turned over records, and how much they charged the requester for copies. Many agencies even have an online portal that makes it easier, or you can use MuckRock’s records request tool. (You can also explore other people's results that have been published on MuckRock's FOIA Log Explorer.) That will send the message to local leaders that they're on notice. You may even uncover an egregious pattern of ignoring or willfully violating the law.

The Foilies are our attempt to call out these violations each year during Sunshine Week, an annual event (March 10-16 this year) when advocacy groups, news organizations and citizen watchdogs combine efforts to highlight the importance of government transparency laws. The Electronic Frontier Foundation and MuckRock, in partnership with the Association of Alternative Newsmedia, compile the year's worst and most ridiculous responses to public records requests and other attempts to thwart public access to information, including through increasing attempts to gut the laws guaranteeing this access—and we issue these agencies and officials tongue-in-cheek "awards" for their failures.

Sometimes, these awards actually make a difference. Last year, Mendocino County in California repealed its policy of charging illegal public records fees after local journalists and activists used The Foilies’ "The Transparency Tax Award" in their advocacy against the rule.

This year marks our 10th annual accounting of ridiculous redactions, outrageous copying fees, and retaliatory attacks on requesters—and we have some doozies for the ages.

The "Winners" The Not-So-Magic Word Award: Augusta County Sheriff’s Office, Va.

Public records laws exist in no small part because corruption, inefficiency and other malfeasance happen, regardless of the size of the government. The public’s right to hold these entities accountable through transparency can prevent waste and fraud.

Of course, this kind of oversight can be very inconvenient to those who would like a bit of secrecy. Employees in Virginia’s Augusta County thought they’d found a neat trick for foiling Virginia's Freedom of Information Act.

Consider: “NO FOIA”

In an attempt to shield a batch of emails from the public eye, employees in Augusta County began tagging their messages with “NO FOIA,” an apparent incantation that staff believed could ward off transparency. Of course, there are no magic words that allow officials to evade transparency laws; the laws presume all government records are public, so agencies can’t withhold records simply because they don’t want them released.

Fortunately, at least one county employee thought that breaking the law must be a little more complicated than that, and this person went to Breaking Through News to blow the whistle.

Breaking Through News sent a FOIA request for those “NO FOIA” emails. The outlet received just 140 of the 1,212 emails that the county indicated were responsive, and those released records highlighted the county’s highly suspect approach to withholding public records. Among them were materials like the wages of Sheriff’s Office employees (clearly a public record), the overtime rates (clearly a public record) and a letter from the sheriff deriding the competitive wages being offered at other county departments (embarrassing, but still clearly a public record). 

Other clearly public records, according to a local court, included recordings of executive sessions that the commissioners had entered illegally, which Breaking Through News learned about through the released records. The outlet teamed up with the Augusta Free Press to sue for access to the recordings, a suit they won last month. They still haven’t received the awarded records, and it’s possible that Augusta County will appeal. Still, thanks to the efforts of local journalists, the county’s misguided attempt to conjure a culture of “NO FOIA” in Augusta County actually brought it more scrutiny and accountability.

The Poop and Pasta Award: Richlands, Va.

Government officials retaliated against a public records requester by filling her mailbox with noodles.

In 2020, Laura Mollo of Richlands, Va., discovered that the county 911 center could not dispatch Richlands residents’ emergency calls: While the center dispatched all other county 911 calls, calls from Richlands had to be transferred to the Richlands Police Department to be handled. After the Richlands Town Council dismissed Mollo’s concerns, she began requesting records under the Virginia Freedom of Information Act. The records showed that Richlands residents faced lengthy delays in connecting with local emergency services. On one call, a woman pleaded for help for her husband, only to be told that county dispatch couldn’t do anything—and her husband died during the delay. Other records Mollo obtained showed that Richlands appeared to be misusing its resources.

You would hope that public officials would be grateful that Mollo uncovered the town’s inadequate emergency response system and budget mismanagement. Well, not exactly: Mollo endured a campaign of intimidation and harassment for holding the government accountable. Mollo describes how her mailbox was stuffed with cow manure on one occasion, and spaghetti on another (which Mollo understood to be an insult to her husband’s Italian heritage). A town contractor harassed her at her home; police pulled her over; and Richlands officials even had a special prosecutor investigate her.

But this story has a happy ending: In November 2022, Mollo was elected to the Richlands Town Council. The records she uncovered led Richlands to change over to the county 911 center, which now dispatches Richlands residents’ calls. And in 2023, the Virginia Coalition for Open Government recognized Mollo by awarding her the Laurence E. Richardson Citizen Award for Open Government. Mollo’s recognition is well-deserved. Our communities are indebted to people like her who vindicate our right to public records, especially when they face such inexcusable harassment for their efforts.

The Error 404 Transparency Not Found Award: FOIAonline

In 2012, FOIAonline was launched with much fanfare as a way to bring federal transparency into the late 20th century. No longer would requesters have to mail or fax requests. Instead, FOIAonline was a consolidated starting point, managed by the Environmental Protection Agency (EPA), that let you file Freedom of Information Act requests with numerous federal entities from within a single digital interface.

Even better, the results of requests would be available online, meaning that if someone else asked for interesting information, it would be available to everyone, potentially reducing the number of duplicate requests. It was a good idea—but it was marred from the beginning by uneven uptake, agency infighting, and inscrutable design decisions that created endless headaches. In its latter years, FOIAonline would go down for days or weeks at a time without explanation, and agency after agency ditched the platform in favor of either homegrown solutions or third-party vendors.

Last year, the EPA announced that the grand experiment was being shuttered, leaving thousands of requesters uncertain about how and where to follow up on their open requests, and unceremoniously deleting millions of documents from public access without any indication of whether they would be made available again.

In a very on-brand twist of the knife, the decision to sunset FOIAonline was actually made two years prior, after an EPA office reported in a presentation that the service was likely to enter a “financial death spiral” of rising costs and reduced agency usage. Meanwhile, civil-society organizations such as MuckRock, the Project on Government Oversight, and the Internet Archive have worked to resuscitate and make available at least some of the documents the site used to host.

The Literary Judicial Thrashing of the Year Award: Pennridge, Penn., School District

Sometimes when you're caught breaking the law, the judge will throw the book at you. In the case of the Pennridge School District in Bucks County, Penn., Judge Jordan B. Yeager catapulted an entire shelf of banned books at administrators for violating the state's Right-to-Know Law.

The case begins with Darren Laustsen, a local parent who was alarmed by a new policy to restrict access to books that deal with “sexualized content,” seemingly in lockstep with book-censorship laws happening around the country. Searching the school library's catalog, he came across a strange trend: Certain controversial books that appeared on other challenged-book lists had been checked out for a year or more. Since students are only allowed to check out books for a week, he (correctly) suspected that library staff were checking them out themselves to block access.

So he filed a public records request for all books checked out by non-students. Now, it's generally important for library patrons to have their privacy protected when it comes to the books they read—but it's a different story if public employees are checking out books as part of their official duties and effectively enabling censorship. The district withheld the records, provided incomplete information, and even went so far as to return books and re-check them out under a student's account in order to obscure the truth. And so Laustsen sued.

The judge issued a scathing and literarily robust ruling: “In short, the district altered the records that were the subject of the request, thwarted public access to public information, and effectuated a cover-up of faculty, administrators, and other non-students’ removal of books from Pennridge High School’s library shelves." The opinion was peppered with witty quotes from historically banned books, including Nineteen Eighty-Four, Alice in Wonderland, The Art of Racing in the Rain and To Kill a Mockingbird. After enumerating the district's claims that later proved to be inaccurate, he cited Kurt Vonnegut's famous refrain from Slaughterhouse-Five: "So it goes."

The Photographic Recall Award: Los Angeles Police Department

Police agencies seem to love nothing more than trumpeting an arrest with an accompanying mugshot—but when the tables are turned, and it’s the cops’ headshots being disclosed, they seem to lose their minds and all sense of the First Amendment.

This unconstitutional escapade began (and is still going) after a reporter and police watchdog published headshots of Los Angeles Police Department officers, which they lawfully obtained via a public records lawsuit. LAPD cops and their union were furious. The city then sued the reporter, Ben Camacho, and the Stop LAPD Spying Coalition, demanding that they remove the headshots from the internet and return the records to LAPD.

You read that right: After a settlement in a public records lawsuit required the city to disclose the headshots, officials turned around and sued the requester for, uh, disclosing those same records, because the city claimed it accidentally released pictures of undercover cops.

But it gets worse: Last fall, a trial court denied a motion to throw out the city’s case seeking to claw back the images; Camacho and the coalition have appealed that decision and have not taken the images offline. And in February, the LAPD sought to hold Camacho and the coalition liable for damages it may face in a separate lawsuit brought against it by hundreds of police officers whose headshots were disclosed.

We’re short on space, but we’ll try to explain the myriad ways in which all of the above is flagrantly unconstitutional: The First Amendment protects Camacho and the coalition’s ability to publish public records they lawfully obtained, prohibits courts from entering prior restraints that stop protected speech, and limits the LAPD’s ability to make them pay for any mistakes the city made in disclosing the headshots. Los Angeles officials should be ashamed of themselves—but their conduct shows that they apparently have no shame.

The Cops Anonymous Award: Chesterfield County Police Department, Va.

The Chesterfield County Police Department in Virginia refused to disclose the names of hundreds of police officers to a public records requester on this theory: Because the cops might at some point go undercover, the public could never learn their identities. It’s not at all dystopian to claim that a public law enforcement agency needs to have secret police!

Other police agencies throughout the state seem to deploy similar secrecy tactics, too.

The Keep Your Opinions to Yourself Award: Indiana Attorney General Todd Rokita

In March 2023, Indiana Attorney General Todd Rokita sent a letter to medical providers across the state demanding information about the types of gender-affirming care they may provide to young Hoosiers. But this was no unbiased probe: Rokita made his position very clear when he publicly blasted these health services as “the sterilization of vulnerable children” that “could legitimately be considered child abuse.” He made claims to the media that the clinics’ main goals weren’t to support vulnerable youth, but to rake in cash.

Yet as loud as he was about his views in the press, Rokita was suddenly tight-lipped once the nonprofit organization American Oversight filed a public records request asking for all the research, analyses and other documentation that he used to support his claims. Although his agency located 85 documents relevant to the request, Rokita refused to release a single page, citing a legal exception that allows him to withhold deliberative documents that are “expressions of opinion or are of a speculative nature.”

Perhaps if Rokita’s opinions on gender-affirming care weren't based on facts, he should've kept those opinions and speculations to himself in the first place.

The Failed Sunshine State Award: Florida Gov. Ron DeSantis

Florida’s Sunshine Law is known as one of the strongest in the nation, but Gov. Ron DeSantis spent much of 2023 working, pretty successfully, to undermine its superlative status with a slew of bills designed to weaken public transparency and journalism.

In March, DeSantis was happy to sign a bill to withhold all records related to travel done by the governor and a whole cast of characters. The law went into effect just more than a week before the governor announced his presidential bid. In addition, DeSantis has asserted his “executive privilege” to block the release of public records in a move that, according to experts like media law professor Catherine Cameron, is unprecedented in Florida’s history of transparency.

DeSantis suspended his presidential campaign in January. That may affect how many trips he’ll be taking out-of-state in the coming months, but it won’t undo the damage of his Sunshine-slashing policies.

Multiple active lawsuits are challenging DeSantis over his handling of Sunshine Law requests. In one, The Washington Post is challenging the constitutionality of withholding the governor’s travel records. In that case, a Florida Department of Law Enforcement official last month claimed the governor had delayed the release of his travel records. Nonprofit watchdog group American Oversight filed a lawsuit in February, challenging “the unjustified and unlawful delay” in responding to requests, citing a dozen records requests to the governor’s office that have been pending for one to three years.

“It’s stunning, the amount of material that has been taken off the table from a state that many have considered to be the most transparent,” Michael Barfield, director of public access for the Florida Center for Government Accountability (FCGA), told NBC News. The FCGA is now suing the governor’s office for records on flights of migrants to Massachusetts. “We’ve quickly become one of the least transparent in the space of four years.”

The Self-Serving Special Session Award: Arkansas Gov. Sarah Huckabee Sanders

By design, FOIA laws exist to help the people who pay taxes hold the people who spend those taxes accountable. In Arkansas, as in many states, taxpayer money funds most government functions: daily office operations, schools, travel, dinners, security, etc. As Arkansas’ governor, Sarah Huckabee Sanders has flown all over the country, accompanied by members of her family and the Arkansas State Police. For the ASP alone, the people of Arkansas paid $1.4 million in the last half of last year.

Last year, Sanders seemed to tire of the scrutiny being paid to her office and her spending. Sanders cited her family’s safety as she tried to shutter any attempts to see her travel records, taking the unusual step of calling a special session of the state Legislature to protect herself from the menace of transparency.

Notably, the governor had also recently been implicated in an Arkansas Freedom of Information Act case for these kinds of records.

The attempt to gut the law included a laundry list of carve-outs unrelated to safety, such as walking back the ability of public-records plaintiffs to recover attorney's fees when they win their case. Other attempts to scale back Arkansas' FOIA earlier in the year had not passed, and the state attorney general’s office was already working to study what improvements could be made to the law.  

Fortunately, the people of Arkansas came out to support the principle of government transparency, even as their governor decided she shouldn’t need to deal with it anymore. Over a tense few days, dozens of Arkansans lined up to testify in defense of the state FOIA and the value of holding elected officials, like Sanders, accountable to the people.

By the time the session wound down, the state Legislature had gone through multiple revisions. The sponsors walked back most of the extreme asks and added a requirement for the Arkansas State Police to provide quarterly reports on some of the governor’s travel costs. However, other details of that travel, like companions and the size of the security team, ultimately became exempt. Sanders managed to twist the whole fiasco into a win, though it would be a great surprise if the Legislature didn’t reconvene this year with some fresh attempts to take a bite out of FOIA.

While such a blatant attempt to bash public transparency is certainly a loser move, it clearly earns Sanders a win in The Foilies—and the distinction of being one of the least transparent government officials this year.

The Doobie-ous Redaction Award: U.S. Department of Health and Human Services and Drug Enforcement Administration

The feds heavily redacted an email about reclassifying cannabis from a Schedule I to a Schedule III substance.

Bloomberg reporters got a major scoop when they wrote about a Health and Human Services memo detailing how health officials were considering major changes to the federal restrictions on marijuana, recommending reclassifying it from a Schedule I substance to Schedule III.

Currently, the Schedule I classification for marijuana puts it in the same league as heroin and LSD, while Schedule III classification would indicate lower potential for harm and addiction along with valid medical applications.

Since Bloomberg viewed but didn’t publish the memo itself, reporters from the Cannabis Business Times filed a FOIA request to get the document into the public record. Their request was met with limited success: HHS provided a copy of the letter, but redacted virtually the entire document besides the salutation and contact information. When pressed further by CBT reporters, the DEA and HHS would only confirm what the redacted documents had already revealed—virtually nothing.

HHS handed over the full, 250-page review several months later, after a lawsuit was filed by an attorney in Texas. The crucial information the agencies had fought so hard to protect: “Based on my review of the evidence and the FDA’s recommendation, it is my recommendation as the Assistant Secretary for Health that marijuana should be placed in Schedule III of the CSA.”

The “Clearly Releasable,” Clearly Nonsense Award: U.S. Air Force

Increasingly, federal and state government agencies require public records requesters to submit their requests through online portals. It’s not uncommon for these portals to be quite lacking. For example, some portals fail to provide space to include information crucial to requests.

But the Air Force deserves special recognition for the changes it made to its submission portal, which asked requesters if they would agree to limit their requests to information that the Air Force deemed "clearly releasable.” You might think, “surely the Air Force defined this vague ‘clearly releasable’ information.” Alas, you’d be wrong: The form stated only that requesters would “agree to accept any information that will be withheld in compliance with the principles of FOIA exemptions as a full release.” In other words, the Air Force asked requesters to give up the fight over information before it even began, and to accept the Air Force's redactions and rejections as non-negotiable.

Following criticism, the Air Force jettisoned these changes to its portal. Moving forward, it's "clear" that it should aim higher when it comes to transparency.

The Scrubbed Scrubs Award: Ontario Ministry of Health, Canada

Upon taking office in 2018, Ontario Premier Doug Ford was determined to shake up the Canadian province’s healthcare system. His administration has been a bit more tight-lipped, however, about the results of that invasive procedure. Under Ford, Ontario’s Ministry of Health is fighting the release of information on how understaffed the province’s medical system is, citing “economic and other interests.” The government’s own report, partially released to Global News, details high attrition as well as “chronic shortages” of nurses.

The reporters’ attempts to find out exactly how understaffed the system is, however, were met with black-bar redactions. The government claims that releasing the information would negatively impact “negotiating contracts with health-care workers.” However, the refusal to release the information hasn’t helped solve the problem; instead, it’s left the public in the dark about the extent of the issue and what it would actually cost to address it.

Global News has appealed the withholdings. That process has dragged on for over a year, but a decision is expected soon.

The Judicial Blindfold Award: Mississippi Justice Courts

Courts are usually transparent by default. People can walk in to watch hearings and trials, and can get access to court records online or at the court clerk’s office. And there are often court rules or state laws that ensure courts are public.

Apparently, the majority of Mississippi Justice Courts don’t feel like following those rules. An investigation by ProPublica and the Northeast Mississippi Daily Journal found that nearly two-thirds of these county-level courts obstructed public access to basic information about law enforcement’s execution of search warrants. This blockade not only appeared to violate state rules on court access; it frustrated the public’s ability to scrutinize when police officers raid someone’s home without knocking and announcing themselves.

The good news is that the Daily Journal is pushing back. It filed suit in the justice court in Union County, Miss., and asked for an end to the practice of never making search-warrant materials public.

Mississippi courts are unfortunately not alone in their efforts to keep search warrant records secret. The San Bernardino Superior Court of California sought to keep secret search warrants used to engage in invasive digital surveillance, only disclosing most of them after the EFF sued.

It’s My Party and I Can Hide Records If I Want to Award: Wyoming Department of Education

Does the public really have a right to know if their tax dollars pay for a private political event?

Former Superintendent of Public Instruction Brian Schroeder and Chief Communications Officer Linda Finnerty in the Wyoming Department of Education didn’t seem to think so, according to Laramie County Judge Steven Sharpe.

Sharpe, in his order requiring disclosure of the records, wrote that the two were more concerned with “covering the agency’s tracks” and acted in “bad faith” in complying with Wyoming’s state open records law.

The lawsuit proved that Schroeder originally used public money for a "Stop the Sexualization of Our Children" event and provided misleading statements to the plaintiffs about the source of funding for the private, pro-book-banning event.

The former superintendent had also failed to provide texts and emails sent via personal devices that were related to the planning of the event, ignoring the advice of the state’s attorneys. Instead, Schroeder decided to “shop around” for legal advice and listen to a friend, private attorney Drake Hill, who told him to not provide his cell phone for inspection.

Meanwhile, Finnerty and the Wyoming Department of Education “did not attempt to locate financial documents responsive to plaintiffs’ request, even though Finnerty knew or certainly should have known such records existed.”

Transparency won this round with the disclosure of more than 1,500 text messages and emails—and according to Sharpe, the incident established a legal precedent on Wyoming public records access.

The Fee-l the Burn Award: Baltimore Police Department

In 2020, Open Justice Baltimore sued the Baltimore Police Department over the agency's demand that the nonprofit watchdog group pay more than $1 million to obtain copies of use-of-force investigation files. 

The police department had decreased its assessment to $245,000 by the time of the lawsuit, but it rejected the nonprofit’s fee waiver, questioning the public interest in the records and whether they would change the public's understanding of the issue. The agency also claimed that fulfilling the request would be costly and burdensome given its staffing shortage.

In 2023, Maryland’s Supreme Court issued a sizzling decision criticizing the BPD’s $245,000 fee assessment and its refusal to waive that fee in the name of public interest. The Supreme Court found that the public interest in how the department polices itself was clear and that the department should have considered how a denial of the fee waiver would “exacerbate the public controversy” and further “the perception that BPD has something to hide.”

The Supreme Court called BPD’s fee assessment “arbitrary and capricious” and remanded the case back to the police department, which must now reconsider the fee waiver. The unanimous decision from the state’s highest court did not mince its words on the cost of public records, either: “While an official custodian’s discretion in these matters is broad,” the opinion reads, “it is not boundless.”

The Continuing Failure Award: United States Citizenship and Immigration Services

Alien registration files, also commonly known as “A-Files,” contain crucial information about a non-citizen’s interaction with immigration agencies, and are central to determining eligibility for immigration benefits.

However, U.S. immigration agencies have routinely failed to release alien files within the statutory time limit for responding, according to Nightingale et al v. U.S. Citizenship and Immigration Services et al, a class-action lawsuit by a group of immigration attorneys and individual requesters.

The attorneys filed suit in 2019 against the U.S. Citizenship and Immigration Services, the Department of Homeland Security and U.S. Immigration and Customs Enforcement. In 2020, Judge William H. Orrick ruled that the agencies must respond to FOIA requests within 20 business days, and provide the court and class counsel with quarterly compliance reports. The case remains open.

With U.S. immigration courts facing a backlog of more than 2 million cases as of October of last year, according to the U.S. Government Accountability Office, the path to citizenship is bogged down for many applicants. The failure of immigration agencies to comply with statutory deadlines only makes navigating the immigration system more challenging. There is reason for hope, however. In 2022, Attorney General Merrick Garland made it federal policy not to require FOIA requests for copies of immigration proceedings, instead encouraging agencies to make records more readily accessible through other means.

Even the A-File backlog itself is improving. In the most recent status report, the Department of Justice wrote that “of the approximately 119,140 new A-File requests received in the current reporting period, approximately 82,582 were completed, and approximately 81,980 were timely completed.”

The Creative Invoicing Award: Richmond, Va., Police Department

Some agencies claim outrageous fees for redacting documents to deter public access.

OpenOversightVA requested copies of general procedures—the basic outline of how police departments run—from localities across Virginia. While many departments either publicly posted them or provided them at no charge, Richmond Police responded with a $7,873.14 invoice. That’s $52.14 an hour to spend one hour on “review, and, if necessary, redaction” on each of the department’s 151 procedures.

This Foilies “winner” was chosen because of the wide gap between how available the information should be, and the staggering cost to bring it out of the file cabinet.

As MuckRock’s agency tracking shows, this is hardly an aberration for the agency. But this estimated invoice came not long after the department’s tear-gassing of protesters in 2020 cost the city almost $700,000. At a time when other departments are opening their most basic rulebooks (in California, for example, every law enforcement agency is required to post these policy manuals online), Richmond has been caught attempting to use a simple FOIA request as a cash cow.

The Foilies (Creative Commons Attribution License) were compiled by the Electronic Frontier Foundation (Director of Investigations Dave Maass, Senior Staff Attorney Aaron Mackey, Legal Fellow Brendan Gilligan, Investigative Researcher Beryl Lipton) and MuckRock (Co-Founder Michael Morisy, Data Reporter Dillon Bergin, Engagement Journalist Kelly Kauffman, and Contributor Tom Nash), with further review and editing by Shawn Musgrave. Illustrations are by EFF Designer Hannah Diaz. The Foilies are published in partnership with the Association of Alternative Newsmedia. 

Dave Maass

Four Voices You Should Hear this International Women’s Day

1 month 1 week ago

Around the globe, freedom of expression varies wildly in definition, scope, and level of access. The impact of the digital age on perceptions and censorship of speech has been felt across the political spectrum on a worldwide scale. In the debate over what counts as free expression and how it should work in practice, we often lose sight of how different forms of censorship can have a negative impact on different communities, and especially marginalized or vulnerable ones. This International Women’s Day, spend some time with four stories of hope and inspiration that teach us how to reflect on the past to build a better future.

1. Podcast Episode: Safer Sex Work Makes a Safer Internet

An internet that is safe for sex workers is an internet that is safer for everyone. Though the effects of stigmatization and criminalization run deep, the sex worker community exemplifies how technology can help people reduce harm, share support, and offer experienced analysis to protect each other. Public interest technology lawyer Kendra Albert and sex worker, activist, and researcher Danielle Blunt have been fighting for sex workers’ online rights for years and say that holding online platforms legally responsible for user speech can lead to censorship that hurts us all. They join EFF’s Cindy Cohn and Jason Kelley in this podcast to talk about protecting all of our free speech rights.

2. Speaking Freely: Sandra Ordoñez

Sandra (Sandy) Ordoñez is dedicated to protecting women being harassed online. Sandra is an experienced community engagement specialist, a proud NYC Latina resident of Sunset Park in Brooklyn, and a recipient of Fundación Carolina’s Hispanic Leadership Award. She is also a long-time diversity and inclusion advocate, with extensive experience incubating and creating FLOSS and Internet Freedom community tools. In this interview with EFF’s Jillian C. York, Sandra discusses free speech and how communities that are often the most directly affected are the last consulted.

3. Story: Coded Resistance, the Comic!

From the days of chattel slavery until the modern Black Lives Matter movement, Black communities have developed innovative ways to fight back against oppression. EFF's Director of Engineering, Alexis Hancock, documented this important history of codes, ciphers, underground telecommunications and dance in a blog post that became one of our favorite articles of 2021. In collaboration with The Nib and illustrator Chelsea Saunders, "Coded Resistance" was adapted into comic form to further explore these stories, from the coded songs of Harriet Tubman to Darnella Frazier recording the murder of George Floyd.

4. Speaking Freely: Evan Greer

Evan Greer is many things: a musician, an activist for LGBTQ issues, the Deputy Director of Fight for the Future, and a true believer in the free and open internet. In this interview, EFF’s Jillian C. York spoke with Evan about the state of free expression, and what we should be doing to protect the internet for future activism. Among the many topics discussed was how policies that promote censorship—no matter how well-intentioned—have historically benefited the powerful and harmed vulnerable or marginalized communities. Evan talks about what we as free expression activists should do to get at that tension and find solutions that work for everyone in society.

This blog is part of our International Women’s Day series. Read other articles about the fight for gender justice and equitable digital rights for all.

  1. Four Reasons to Protect the Internet this International Women’s Day
  2. Four Infosec Tools for Resistance this International Women’s Day
  3. Four Actions You Can Take To Protect Digital Rights this International Women’s Day
Paige Collings

Four Actions You Can Take To Protect Digital Rights this International Women’s Day

1 month 1 week ago

This International Women’s Day, defend free speech, fight surveillance, and support innovation by calling on our elected politicians and private companies to uphold our most fundamental rights—both online and offline.

1. Pass the “My Body, My Data” Act

Privacy fears should never stand in the way of healthcare. That's why this common-sense federal bill, sponsored by U.S. Rep. Sara Jacobs, will require businesses and non-governmental organizations to act responsibly with personal information concerning reproductive health care. Specifically, it restricts them from collecting, using, retaining, or disclosing reproductive health information that isn't essential to providing the service someone asks them for. The protected information includes data related to pregnancy, menstruation, surgery, termination of pregnancy, contraception, basal body temperature or diagnoses. The bill would protect people who, for example, use fertility or period-tracking apps or are seeking information about reproductive health services. It also lets people take on companies that violate their privacy with a strong private right of action.

2. Ban Government Use of Face Recognition

Study after study shows that facial recognition algorithms are not always reliable, and that error rates spike significantly when involving faces of folks of color, especially Black women, as well as trans and nonbinary people. Because of face recognition errors, a Black woman, Porcha Woodruff, was wrongfully arrested, and another, Lamya Robinson, was wrongfully kicked out of a roller rink.

Yet this technology is widely used by law enforcement for identifying suspects in criminal investigations, including to disparately surveil people of color. At the local, state, and federal level, people across the country are urging politicians to ban the government’s use of face surveillance because it is inherently invasive, discriminatory, and dangerous. Many U.S. cities have done so, including San Francisco and Boston. Now is our chance to end the federal government’s use of this spying technology. 

3. Tell Congress: Don’t Outlaw Encrypted Apps

Advocates of women's equality often face surveillance and repression from powerful interests. That's why they need strong end-to-end encryption. But if the so-called “STOP CSAM Act” passes, it would undermine digital security for all internet users, impacting private messaging and email app providers, social media platforms, cloud storage providers, and many other internet intermediaries and online services. Free speech for women’s rights advocates would also be at risk. STOP CSAM would also create a carveout in Section 230, the law that protects our online speech, exposing platforms to civil lawsuits for merely hosting a platform where part of the illegal conduct occurred. Tell Congress: don't pass this law that would undermine security and free speech online, two critical elements for fighting for equality for all genders.  

4. Tell Facebook: Stop Silencing Palestine

Since Hamas’ attack on Israel on October 7, Meta’s biased moderation tools and practices, as well as its policies on violence and incitement and on dangerous organizations and individuals (DOI), have led to Palestinian content and accounts being removed and banned at an unprecedented scale. As Palestinians and their supporters have taken to social platforms to share images and posts about the situation in the Gaza strip, some have noticed their content suddenly disappear or had their posts flagged for breaches of the platforms’ terms of use. In some cases, their accounts have been suspended, and in others, features such as liking and commenting have been restricted.

This has an exacerbated impact on the most at-risk groups in Gaza, such as those who are pregnant or need reproductive healthcare support: sharing information online is both an avenue for communicating the reality on the ground to the world and a way to reach others who need that information most.

This blog is part of our International Women’s Day series. Read other articles about the fight for gender justice and equitable digital rights for all.

  1. Four Reasons to Protect the Internet this International Women’s Day
  2. Four Infosec Tools for Resistance this International Women’s Day
  3. Four Voices You Should Hear this International Women’s Day
Paige Collings

Four Infosec Tools for Resistance this International Women’s Day 

1 month 1 week ago

While online violence is alarmingly common globally, women are often more likely to be the target of mass online attacks, nonconsensual leaks of sensitive information and content, and other forms of online violence. 

This International Women’s Day, visit EFF’s Surveillance Self-Defense (SSD) to learn how to defend yourself and your friends from surveillance. In addition to tutorials for installing and using security-friendly software, SSD walks you through concepts like making a security plan, the importance of strong passwords, and protecting metadata.

1. Make Your Own Security Plan

This IWD, learn what a security plan looks like and how you can build one. Trying to protect your online data—like pictures, private messages, or documents—from everything all the time is impractical and exhausting. But, have no fear! Security is a process, and through thoughtful planning, you can put together a plan that’s best for you. Security isn’t just about the tools you use or the software you download. It begins with understanding the unique threats you face and how you can counter those threats. 

2. Protect Yourself on Social Networks

Depending on your circumstances, you may need to protect yourself against the social network itself, against other users of the site, or both. Social networks are among the most popular websites on the internet. Facebook, TikTok, and Instagram each have over a billion users. Social networks were generally built on the idea of sharing posts, photographs, and personal information. They have also become forums for organizing and speaking. Any of these activities can rely on privacy and pseudonymity. Visit our SSD guide to learn how to protect yourself.

3. Tips for Attending Protests

Keep yourself, your devices, and your community safe while you make your voice heard. Now, more than ever, people must be able to hold those in power accountable and inspire others through the act of protest. Protecting your electronic devices and digital assets before, during, and after a protest is vital to keeping yourself and your information safe, as well as getting your message out. Theft, damage, confiscation, or forced deletion of media can disrupt your ability to publish your experiences, and those engaging in protest may be subject to search or arrest, or have their movements and associations surveilled. 

4. Communicate Securely with Signal or WhatsApp

Everything you say in a chat app should be private, viewable by only you and the person you're talking with. But that's not how all chats or DMs work. Most of those communication tools aren't end-to-end encrypted, and that means that the company that runs the software could view your chats, or hand over transcripts to law enforcement. That's why it's best to use a chat app like Signal any time you can. Signal uses end-to-end encryption, which means that nobody, not even Signal, can see the contents of your chats. Of course, you can't necessarily force everyone you know to use the communication tool of your choice, but thankfully other popular tools, like Apple's Messages, WhatsApp and, more recently, Facebook's Messenger, all use end-to-end encryption too, as long as you're communicating with others on those same platforms. The more people who use these tools, even for innocuous conversations, the better.

On International Women’s Day and every day, stay safe out there! Surveillance self-defense can help.

This blog is part of our International Women’s Day series. Read other articles about the fight for gender justice and equitable digital rights for all.

  1. Four Reasons to Protect the Internet this International Women’s Day
  2. Four Voices You Should Hear this International Women’s Day
  3. Four Actions You Can Take To Protect Digital Rights this International Women’s Day
Paige Collings

Four Reasons to Protect the Internet this International Women’s Day

1 month 1 week ago

Today is International Women’s Day, a day celebrating the achievements of women globally but also a day marking a call to action for accelerating equality and improving the lives of women the world over. 

The internet is a vital tool for women everywhere—provided they have access and are able to use it freely. Here are four reasons why we’re working to protect the free and open internet for women and everyone.

1. The Fight For Reproductive Privacy and Information Access Is Not Over

Data privacy, free expression, and freedom from surveillance intersect with the broader fight for reproductive justice and safe access to abortion. Like so many other aspects of managing our healthcare, these issues are fundamentally tied to our digital lives. With the Supreme Court's decision in Dobbs v. Jackson overturning the protections that Roe v. Wade offered for people seeking abortion healthcare in the United States, what was benign data before is now potentially criminal evidence. This expanded threat to digital rights is especially dangerous for BIPOC, lower-income, immigrant, LGBTQ+ people and other traditionally marginalized communities, and the healthcare providers serving these communities. The repeal of Roe created a lot of new dangers for people seeking healthcare. EFF is working hard to protect your rights in two main areas: 1) your data privacy and security, and 2) your online right to free speech.

2. Governments Continue to Cut Internet Access to Quell Political Dissidence   

The internet is an essential service that enables people to build and create communities, shed light on injustices, and acquire vital knowledge that might not otherwise be available. Governments are very aware of their power to cut off access to this crucial lifeline, and frequently undertake targeted initiatives to shut down civilian access to the internet. In Iran, people have suffered internet and social media blackouts on and off for nearly two years, following an activist movement rising up after the death of Mahsa Amini, a woman murdered in police custody for refusing to wear a hijab. The movement gained global attention, and in response, the Iranian government rushed to control visibility on the injustice. Social media has been banned in Iran, and intermittent shutdowns of the entire population's internet access have cost the country millions, all in an effort to control the flow of information and quell political dissidence.

3. People Need to Know When They Are Being Stalked Through Tracking Tech 

At EFF, we’ve been sounding the alarm about the way physical trackers like AirTags and Tiles can be slipped into a target’s bag or car, allowing stalkers and abusers unprecedented access to a person’s location without their knowledge. We’ve also been calling attention to stalkerware, commercially available apps that are designed to be covertly installed on another person’s device for the purpose of monitoring their activity without their knowledge or consent. This is a huge threat to survivors of domestic abuse, as stalkers can track their locations and access sensitive information like passwords and documents. For example, Imminent Monitor, once installed on a victim’s computer, could turn on their webcam and microphone, allow perpetrators to view their documents, photographs, and other files, and record all keystrokes entered. Everyone involved in these industries has a responsibility to safeguard people against these abuses.

4. LGBTQ+ Rights Online Are Being Attacked 

An increase in anti-LGBTQ+ intolerance is harming individuals and communities both online and offline across the globe. Several countries are introducing explicitly anti-LGBTQ+ initiatives to restrict freedom of expression and privacy, which is in turn fuelling offline intolerance against LGBTQ+ people. Across the United States, a growing number of states have prohibited transgender youth from obtaining gender-affirming health care, and some have restricted access for transgender adults. That’s why we’ve worked to pass data sanctuary laws in pro-LGBTQ+ states to shield health records from disclosure to anti-LGBTQ+ states.

The problem is global. In Jordan, the new Cybercrime Law of 2023 restricts encryption and anonymity in digital communications. And in Ghana, Parliament just voted to pass the country’s draconian Family Values Bill, which introduces prison sentences for those who partake in LGBTQ+ sexual acts, as well as those who promote the rights of gay, lesbian or other non-conventional sexual or gender identities. EFF is working to expose and resist laws like these, and we hope you’ll join us!

This blog is part of our International Women’s Day series. Read other articles about the fight for gender justice and equitable digital rights for all.

  1. Four Infosec Tools for Resistance this International Women’s Day
  2. Four Voices You Should Hear this International Women’s Day
  3. Four Actions You Can Take To Protect Digital Rights this International Women’s Day
Paige Collings

The Atlas of Surveillance Removes Ring, Adds Third-Party Investigative Platforms

1 month 1 week ago

Running the Atlas of Surveillance, our project to map and inventory police surveillance across the United States, means experiencing emotional extremes.

Whenever we announce that we've added new data points to the Atlas, it comes with a great sense of satisfaction. That's because it almost always means that we're hundreds or even thousands of steps closer to achieving what only a few years ago would've seemed impossible: comprehensively documenting the surveillance state through our partnership with students at the University of Nevada, Reno Reynolds School of Journalism.

At the same time, it's depressing as hell. That's because it also reflects how quickly and dangerously surveillance technology is metastasizing.

We have the exact opposite feeling when we remove items from the Atlas of Surveillance. It's a little sad to see our numbers drop, but at the same time that change in data usually means that a city or county has eliminated a surveillance program.

That brings us to the biggest change in the Atlas since our launch in 2018. This week, we removed 2,530 data points: an entire category of surveillance. With the announcement from Amazon that its home surveillance company Ring will no longer facilitate warrantless requests for consumer video footage, we've decided to sunset that particular dataset.

While law enforcement agencies still maintain accounts on Ring's Neighbors social network, it seems to serve as a communications tool, a function on par with services like Nixle and Citizen, which we currently don't capture in the Atlas. That's not to say law enforcement won't be gathering footage from Ring cameras: they will, through legal process or by directly asking residents to give them access via the Fusus platform. But that type of surveillance doesn't result from merely having a Neighbors account (agencies without accounts can use these methods to obtain footage), which was what our data documented. You can still find out which agencies are maintaining camera registries through the Atlas. 

Ring's decision was a huge victory – and the exact outcome EFF and other civil liberties groups were hoping for. It also has opened up our capacity to track other surveillance technologies growing in use by law enforcement. If we were going to remove a category, we decided we should add one too.

Atlas of Surveillance users will now see a new type of technology: Third-Party Investigative Platforms, or TPIPs. Common TPIP products include Thomson Reuters CLEAR, LexisNexis Accurint Virtual Crime Center, TransUnion TLOxp, and SoundThinking CrimeTracer (formerly Coplink X from Forensic Logic). These are technologies we've been watching for a while but have struggled to categorize and define. Here's the definition we've come up with:

Third-Party Investigative Platforms are cloud-based software systems that law enforcement agencies subscribe to in order to access, share, mine, and analyze various sources of investigative data. Some of the data the agencies upload themselves, but the systems also provide access to data from other law enforcement, as well as from commercial sources and data brokers. Many products offer AI features, such as pattern identification, face recognition, and predictive analytics. Some agencies employ multiple TPIPs.

We are calling this new category a beta feature in the Atlas, since we are still figuring out how best to research and compile this data nationwide. You'll find fairly comprehensive data on the use of CrimeTracer in Tennessee and Massachusetts, because both states provide the software to local law enforcement agencies throughout the state. Similarly, we've got a large dataset for the use of the Accurint Virtual Crime Center in Colorado, due to a statewide contract. (Big thanks to Prof. Ran Duan's Data Journalism students for working with us to compile those lists!) We've also added more than 60 other agencies around the country, and we expect that dataset to grow as we hone our research methods.

If you've got information on the use of TPIPs in your area, don't hesitate to reach out. You can email us at aos@eff.org, submit a tip through our online form, or file a public records request using the template that EFF and our students have developed to reveal the use of these platforms. 

Dave Maass

Join us for EFF's 8th Annual Tech Trivia Night!

1 month 1 week ago

Join us in San Francisco on May 9th for EFF's 8th annual Tech Trivia Night! Explore the obscure minutiae of digital security, online rights, and internet culture.

Enjoy delicious tacos, churros, and complimentary adult beverages and soft drinks as you and your team battle through rounds of questions—and cutthroat live judging!—to see who will take home the coveted 1st, 2nd, and 3rd place trophies and EFF swag!

Register Now

$45 for CURRENT EFF Members • $55 for General Admission

Thursday, May 9th, 2024 at Public Works from 6 PM to 10 PM
This event is 21+. Please remember to bring ID and a mask.

Thanks to EFF's Luminary Organizational Members DuckDuckGo, No Starch Press, and the Hering Foundation for their year-round support of EFF's mission.

Fighting for first place at EFF’s Tech Trivia Night helps us fight for your rights online! Sponsor one of our annual events and join the movement for digital privacy, free speech, and innovation. Please contact tierney@eff.org for more information.

EFF is dedicated to a harassment-free experience for everyone, and all participants are encouraged to view our full Event Expectations.

Melissa Srago

Victory! EFF Helps Resist Unlawful Warrant and Gag Order Issued to Independent News Outlet

1 month 2 weeks ago

Over the past month, the independent news outlet Indybay has quietly fought off an unlawful search warrant and gag order served by the San Francisco Police Department. Today, a court lifted the gag order and confirmed the warrant is void. The police also promised the court to not seek another warrant from Indybay in its investigation.

Nevertheless, Indybay was unconstitutionally gagged from speaking about the warrant for more than a month. And the SFPD once again violated the law despite past assurances that it was putting safeguards in place to prevent such violations.

EFF provided pro bono legal representation to Indybay throughout the process.

Indybay’s experience highlights a worrying police tactic of demanding unpublished source material from journalists, in violation of clearly established shield laws. Warrants like the one issued by the police invade press autonomy, chill news gathering, and discourage sources from contributing. While this is a victory, Indybay was still gagged from speaking about the warrant, and it would have had to pay thousands of dollars in legal fees to fight the warrant without pro bono counsel. Other small news organizations might not be so lucky. 

It started on January 18, 2024, when an unknown member of the public published a story on Indybay’s unique community-sourced newswire, which allows anyone to publish news and source material on the website. The author claimed credit for smashing windows at the San Francisco Police Credit Union.

On January 24, police sought and obtained a search warrant that required Indybay to turn over any text messages, online identifiers like IP address, or other unpublished information that would help reveal the author of the story. The warrant also ordered Indybay not to speak about the warrant for 90 days. With the help of EFF, Indybay responded that the search warrant was illegal under both California and federal law and requested that the SFPD formally withdraw it. After several more requests and shortly before the deadline to comply with the search warrant, the police agreed to not pursue the warrant further “at this time.” The warrant became void when it was not executed after 10 days under California law, but the gag order remained in place.

Indybay went to court to confirm the warrant would not be renewed and to lift the gag order. It argued it was protected by California and federal shield laws that make it all but impossible for law enforcement to use a search warrant to obtain unpublished source material from a news outlet. California law, Penal Code § 1524(g), in particular, mandates that “no warrant shall issue” for that information. The Federal Privacy Protection Act has some exceptions, but they were clearly not applicable in this situation. Nontraditional and independent news outlets like Indybay are covered by these laws (Indybay fought this same fight more than a decade ago when one of its photographers successfully quashed a search warrant). And when attempting to unmask a source, an IP address can sometimes be as revealing as a reporter’s notebook. In a previous case, EFF established that IP addresses are among the types of unpublished journalistic information typically protected from forced disclosure by law.

In addition, Indybay argued that the gag order was an unconstitutional content-based prior restraint on speech—noting that the government did not have a compelling interest in hiding unlawful investigative techniques.

Rather than fight the case, the police conceded the warrant was void, promised not to seek another search warrant for Indybay’s information during the investigation, and agreed to lift the gag order. A San Francisco Superior Court Judge signed an order confirming that.

That this happened at all is especially concerning since the SFPD had agreed to institute safeguards following its illegal execution of a search warrant against freelance journalist Bryan Carmody in 2019. In settling a lawsuit brought by Carmody, the SFPD agreed to ensure all its employees were aware of its policies concerning warrants to journalists. As a result, the department instituted internal guidance and procedures, not all of which appear to have been followed with Indybay.

Moreover, the search warrant and gag order should never have been signed by the court given that it was obviously directed to a news organization. We call on the court and the SFPD to meet with those representing journalists to make sure that we don't have to deal with another unconstitutional gag order and search warrant in another few years.

The San Francisco Police Department's public statement on this case is incomplete. It leaves out the fact that Indybay was gagged for more than a month and that it was only Indybay's continuous resistance that prevented the police from acting on the warrant. It also does not mention whether the police department's internal policies were followed in this case. For one thing, this type of warrant requires approval from the chief of police before it is sought, not after. 

Read more here: 

Stipulated Order

Motion to Quash

Search Warrant

Trujillo Declaration

Burdett Declaration

SFPD Press Release

Mario Trujillo

Should Caddy and Traefik Replace Certbot?

1 month 2 weeks ago

Can free and open source software projects like Caddy and Traefik eventually replace EFF’s Certbot? Although Certbot continues to be developed, we think tools like these offer a promising path forward for a more secure and encrypted web. For some users, tools like these can replace Certbot completely. 

We started development on Certbot in the mid-2010s with the goal of making it as easy as possible for website operators to offer HTTPS. To accomplish this, we made Certbot interact as best we could with existing web servers like Apache and Nginx without requiring any changes on their end. Unfortunately, this approach of using an external tool to provide functionality beyond what the server was originally designed for presents several challenges. With the help of open source libraries and hundreds of contributors from around the world, we designed Certbot to parse Apache and Nginx configuration files and modify them as needed to set up HTTPS. Certbot interacted with these web servers using the same command line tools as a human user, then waited an estimated period of time until the server had (probably) finished doing what we asked it to. 

All of this worked remarkably well. Today, Certbot is used to maintain HTTPS for over 30 million domain names and it continues to be one of the most popular ways for people to interact with Let’s Encrypt, a free certificate authority, which has been hugely successful by many metrics. Despite this, the ease of enabling HTTPS remains hindered by the need for people to run Certbot in addition to their web server. 

That's where software like Caddy and Traefik is different. Both are designed with easy HTTPS automation in mind; Caddy even enables HTTPS by default. They implement the ACME protocol internally, allowing them to integrate with services like Let’s Encrypt and regularly obtain the certificates needed to offer HTTPS. Since this support is built into the server, it completely avoids problems that Certbot sometimes has as an external tool, such as not parsing configuration files in the same way that the software it's trying to configure does. Most importantly, less effort is required for a website operator to turn on HTTPS, further lowering the barrier to entry and making the internet more secure for everyone. 

Both Caddy and Traefik are written in Go, a memory safe programming language. The Apache and Nginx web servers that Certbot interacts with were written in C, which is not memory safe. This may seem like a minor technical detail, but it’s not. A memory safe programming language is one that systematically prevents software written in it from having certain types of memory access errors which can occur in other programming languages. Studies have found that these memory safety errors are responsible for the majority of security vulnerabilities, leading to a growing push for the development of memory safe software. By adopting software like Caddy or Traefik, you’re able to proactively eliminate an entire class of common security vulnerabilities from that part of your system. 

With these benefits and Certbot’s limitations, should tools like Caddy and Traefik replace Certbot? Yes, they probably should eventually. While EFF does not endorse any specific product or service, we think that software like this is part of a larger suite of tools that will eventually make Certbot no longer needed. The ecosystem will be better served by using integrated software, not external tools that try to configure old and hard-to-use ones. 

No single approach to securing traffic to a website will work for everyone. For example, many hosting providers now offer HTTPS, and this will almost certainly be an easier approach than using any other external software. If you run a website and previously used a tool like Certbot, though, consider whether software like Caddy or Traefik is a better fit for you. These tools have been around for years and have extensive user bases. You can use Caddy or Traefik as a TLS-terminating reverse proxy, or even use Caddy directly as your file server.
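As an illustration of how little configuration this built-in support requires, here is a minimal sketch of a Caddyfile for a TLS-terminating reverse proxy. The domain name and backend port are hypothetical placeholders, not a recommendation for any particular setup:

```
example.com {
	# Because a site address (not just a port) is specified, Caddy
	# automatically obtains and renews a certificate for example.com
	# via ACME (e.g. Let's Encrypt) — no external tool is needed.
	reverse_proxy localhost:8080
}
```

Traefik achieves the same result with ACME "certificate resolvers" in its own configuration format.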

If Certbot continues to work best for you for some use cases, that's also okay. We plan to continue developing the project until the happy day comes when running an HTTPS site is so simple that Certbot is no longer needed. Until that day, if you do continue using Certbot, please consider donating to EFF so that we’re able to continue supporting the project.

Brad Warren

Privacy First and Competition

1 month 2 weeks ago

“Privacy First” is a simple, powerful idea: seeing as so many of today’s technological problems are also privacy problems, why don’t we fix privacy first?

Whether you’re worried about kids’ mental health, or tech’s relationship to journalism, or spying by foreign adversaries, or reproductive rights, or AI deepfakes, or nonconsensual pornography, you’re worried about a problem rooted in the primitive, deplorable state of American privacy law.

It’s really impossible to overstate how bad the state of federal privacy law is in America. The last time the USA got a big, muscular, broadly applicable new consumer privacy law, the year was 1988, and the law was targeted at video-store clerks who leaked your VHS rental history.

It’s been a minute. America is long overdue for a strong, comprehensive privacy law.

A new privacy law will help us with all those issues, and more. It would level the playing field between giants with troves of user data and startups who want to build something better. Such a law would keep competition from becoming a race to the bottom on user privacy.

Importantly, a strong privacy law will go a long way to improving the dismal state of competition in America’s ossified and decaying tech sector.

Take the tech sector’s relationship to the news media. The ad-tech duopoly has rigged the advertising market and takes $0.51 out of every advertising dollar. Without their vast troves of nonconsensually harvested personal data, Meta and Google wouldn’t be able to misappropriate billions from the publishers. Banning surveillance advertising wouldn’t just be good for our privacy - it would give publishers leverage to shift those billions back onto their own balance sheets. 

Undoing market concentration will require interoperability so that users can move from dominant services to new, innovative rivals without losing their data and relationships. The biggest challenge to interoperability? Privacy. Every time a user moves from one service to another, the resulting data-flows create risks for those users and their friends, families, customers and other social connections. Congress knows this, which is why every proposed interoperability law incorporates its own little privacy law. Privacy shouldn’t be an afterthought in a tech regulation. A standalone privacy law would give lawmakers the freedom to promote interoperability without having to work out a new privacy system for each effort.

That’s also true of Right to Repair laws: these laws are routinely opposed by tech monopolists who insist that giving Americans the right to choose their own repair shop or parts exposes them to privacy risks. It’s true that our devices harbor vast troves of sensitive information - but that doesn’t mean we should let Big Tech (or Big Car) monopolize repair. Instead, we should require everyone - both original manufacturers and independent repair shops - to honor your privacy.

America’s legal privacy vacuum is largely the result of the commercial surveillance industry’s lobbying power. Increasing competition in the tech sector won’t just help our privacy: it’ll also weaken tech’s lobbying power, which is a function of the vast profits that can be extracted in the absence of “wasteful competition” and the ease with which a concentrated sector can converge on a common lobbying position. 

That’s why EFF has urged the FTC and DOJ to consider privacy impacts when scrutinizing proposed mergers: not just to protect internet users from the harms of surveillance business models, but to protect democracy from the corrupting influence of surveillance cartels.

Privacy isn’t dead. Far from it. For a quarter of a century, would-be tech monopolists have been insisting that we have no privacy and telling us to “get over it.” The vast majority of the public wants privacy and will take it if offered, and grab it if it’s not.  

Whenever someone tells you that privacy is dead, they’re just wishcasting. What they mean is: “If I can convince you privacy is dead, I can make more money at your expense.”

Monopolists want us to believe that their power over our lives is inevitable and unchangeable, just as the surveillance industry banks on convincing you that the fight for privacy was and always will be a lost cause. But we once had a better internet, and we can get a better internet again. The fight for that better internet starts with privacy, a battle that we all want to win.

Cory Doctorow

European Court of Human Rights Confirms: Weakening Encryption Violates Fundamental Rights

1 month 2 weeks ago

In a milestone judgment—Podchasov v. Russia—the European Court of Human Rights (ECtHR) has ruled that weakening of encryption can lead to general and indiscriminate surveillance of the communications of all users and violates the human right to privacy.  

In 2017, the landscape of digital communication in Russia faced a pivotal moment when the government required Telegram Messenger LLP and other “internet communication” providers to store all communication data—and content—for specified durations. These providers were also required to supply law enforcement authorities with users’ data, the content of their communications, as well as any information necessary to decrypt user messages. The FSB (the Russian Federal Security Service) subsequently ordered Telegram to assist in decrypting the communications of specific users suspected of engaging in terrorism-related activities.

Telegram opposed this order on the grounds that it would create a backdoor that would undermine encryption for all of its users. As a result, Russian courts fined Telegram and ordered the blocking of its app within the country. The controversy extended beyond Telegram, drawing in numerous users who contested the disclosure orders in Russian courts. A Russian citizen, Mr Podchasov, escalated the issue to the European Court of Human Rights (ECtHR), arguing that forced decryption of user communication would infringe on the right to private life under Article 8 of the European Convention of Human Rights (ECHR), which reads as follows:  

Everyone has the right to respect for his private and family life, his home and his correspondence (Article 8 ECHR, right to respect for private and family life, home and correspondence) 

EFF has always stood against government intrusion into the private lives of users and advocated for strong privacy guarantees, including the right to confidential communication. Encryption not only safeguards users’ privacy but also protects their right to freedom of expression protected under international human rights law. 

In a great victory for privacy advocates, the ECtHR agreed. The Court found that the requirement of continuous, blanket storage of private user data interferes with the right to privacy under the Convention, emphasizing that the possibility for national authorities to access these data is a crucial factor for determining a human rights violation [at 53]. The Court identified the inherent risks of arbitrary government action in secret surveillance in the present case and found again—following its stance in Roman Zakharov v. Russia—that the relevant legislation failed to live up to the quality of law standards and lacked adequate and effective safeguards against misuse [75].  Turning to a potential justification for such interference, the ECtHR emphasized the need for a careful balancing test that considers the use of modern data storage and processing technologies and weighs the potential benefits against important private-life interests [62-64]. 

In addressing the State mandate for service providers to submit decryption keys to security services, the court's deliberations culminated in the following key findings [76-80]:

  1. Encryption is important for protecting the right to private life and other fundamental rights, such as freedom of expression: The ECtHR emphasized the importance of encryption technologies for safeguarding the privacy of online communications. Encryption safeguards and protects the right to private life generally while also supporting the exercise of other fundamental rights, such as freedom of expression.
  2. Encryption as a shield against abuses: The Court emphasized the role of encryption to provide a robust defense against unlawful access and generally “appears to help citizens and businesses to defend themselves against abuses of information technologies, such as hacking, identity and personal data theft, fraud and the improper disclosure of confidential information.” The Court held that this must be given due consideration when assessing measures which could weaken encryption.
  3. Orders to decrypt communications weaken encryption for all users: The ECtHR established that decrypting Telegram's "secret chats" would require weakening encryption for all users. Taking note again of the dangers of restricting encryption described by many experts in the field, the Court held that backdoors could be exploited by criminal networks and would seriously compromise the security of all users’ electronic communications. 
  4. Alternatives to decryption: The ECtHR took note of a range of alternative solutions to compelled decryption that would not weaken the protective mechanisms, such as forensics on seized devices and better-resourced policing.  

In light of these findings, the Court held that the mandate to decrypt end-to-end encrypted communications risks weakening the encryption mechanism for all users, which was disproportionate to the legitimate aims pursued. 

In summary [80], the Court concluded that the retention and unrestricted state access to internet communication data, coupled with decryption requirements, cannot be regarded as necessary in a democratic society, and are thus unlawful. It emphasized that a direct access of authorities to user data on a generalized basis and without sufficient safeguards impairs the very essence of the right to private life under the Convention. The Court also highlighted briefs filed by the European Information Society Institute (EISI) and Privacy International, which provided insight into the workings of end-to-end encryption and explained why mandated backdoors represent an illegal and disproportionate measure. 

Impact of the ECtHR ruling on current policy developments 

The ruling is a landmark judgment, which will likely draw new normative lines about human rights standards for private and confidential communication. We are currently supporting Telegram in its parallel complaint to the ECtHR, contending that blocking its app infringes upon fundamental rights. As part of a collaborative effort of international human rights and media freedom organisations, we have submitted a third-party intervention to the ECtHR, arguing that blocking an entire app is a serious and disproportionate restriction on freedom of expression. That case is still pending. 

The Podchasov ruling also directly challenges ongoing efforts in Europe to weaken encryption to allow access and scanning of our private messages and pictures.

For example, the UK's controversial Online Safety Act creates the risk that online platforms will use software to search all users’ photos, files, and messages, scanning for illegal content. We recently submitted comments to the relevant UK regulator (Ofcom) to avoid any weakening of encryption when this law becomes operational. 

In the EU, we are concerned that the European Commission’s message-scanning proposal (CSAR) would be a disaster for online privacy. It would allow EU authorities to compel online services to scan users’ private messages, compare users’ photos against law enforcement databases, or use error-prone AI algorithms to detect criminal behavior. Such detection measures will inevitably lead to dangerous and unreliable client-side scanning practices, undermining the essence of end-to-end encryption. As the ECtHR deems general user scanning disproportionate, specifically criticizing measures that weaken existing privacy standards, forcing platforms like WhatsApp or Signal to weaken security by inserting a vulnerability into all users’ devices to enable message scanning must be considered unlawful.

The EU regulation proposal is likely to be followed by other proposals to grant law enforcement access to encrypted data and communications. An EU high level expert group on ‘access to data for effective law enforcement’ is expected to make policy recommendations to the next EU Commission in mid-2024. 

We call on lawmakers to take the Court of Human Rights ruling seriously: blanket and indiscriminate scanning of user communication and the general weakening of encryption for users is unacceptable and unlawful. 

Christoph Schmon

Voting No on Prop E Is Easy and Important for San Francisco

1 month 2 weeks ago

San Francisco’s ballot initiative Proposition E is a dangerous and deceptive measure that threatens our privacy, safety, and democratic ideals. It would give the police more power to surveil, chase, and harm. It would allow the police to secretly acquire and use unproven surveillance technologies for a year or more without oversight, eliminating the hard-won protections backed by a majority of San Franciscans that are currently in place. Prop E is not a solution to the city’s challenges, but rather a threat to our rights and freedoms. 

Don’t be fooled by the misleading arguments of Prop E's supporters. A group of tech billionaires have contributed a small fortune to convince San Francisco voters that they would be safer if surveilled. They want us to believe that Prop E will make us safer and more secure, but the truth is that it will do the opposite. Prop E will allow the police to use any surveillance technology they want for up to a year without considering whether it works as promised—or at all—or whether it presents risks to residents’ privacy or safety. Police only have to present a use policy after a year of free and unaccountable use, and absent a majority vote of the Board of Supervisors rejecting the policy, this unaccountable use could continue indefinitely. Worse still, some technologies, like surveillance cameras and drones, would be exempt from oversight indefinitely, putting the unilateral decision about when, where, and how to deploy such technology in the hands of the SFPD.

We want something different for our city. In 2019, with the support of a wide range of community members and civil society groups including EFF, San Francisco’s Board of Supervisors took a historic step forward by passing a groundbreaking surveillance transparency and accountability ordinance through a 10-1 vote. The law requires that before a city department, including the police, acquires or uses a surveillance technology, the department must present a use policy to the Board of Supervisors, which then considers the proposal in a public process that offers opportunity for public comment. This process respects privacy, dignity, and safety and empowers residents to make their voices heard about the potential impacts and risks. 

Despite what Prop E proponents would have you believe, the city’s surveillance ordinance has not stopped police from acquiring new technologies. In fact, they have gained access to broad networks of live-feed cameras. Current law helps ensure that the police follow reasonable guidelines on using technology and mitigating potentially disparate harms. Prop E would gut police accountability from this law and return decision-making about how we are surveilled to closed spaces where unproven and unvetted vendor promises rule the narrative. 

As San Francisco residents, we must stand up for ourselves and our city and vote No on Prop E. Voting No on Prop E is not only an easy choice, but also a necessary one. It is a choice that reflects our values and vision for San Francisco. It is a choice that shows that we will not let a million-dollar campaign of fear drive us to sacrifice our rights. Voting No on Prop E is a choice that proves we are unwilling to accept anything less than what we deserve: privacy, safety, and accountability.

March 5 is election day. Make your voice heard. Vote No on Prop E.  

Nathan Sheard

Celebrating 15 Years of Surveillance Self-Defense

1 month 2 weeks ago

On March 3rd, 2009, we launched Surveillance Self-Defense (SSD). At the time, we pitched it as, "an online how-to guide for protecting your private data against government spying." In the last decade and a half, hundreds of people have contributed to SSD, over 20 million people have read it, and the content has nearly doubled in length from 40,000 words to almost 80,000. SSD has served as inspiration for many other guides focused on keeping specific populations safe, and those guides have in turn affected how we've approached SSD. A lot has changed in the world over the last 15 years, and SSD has changed with it. 

The Year Is 2009

Let's take a minute to travel back in time to the initial announcement of SSD. Launched with the support of the Open Society Institute, and written entirely by just a few people, we detailed exactly what our intentions were with SSD at the start:

EFF created the Surveillance Self-Defense site to educate Americans about the law and technology of communications surveillance and computer searches and seizures, and to provide the information and tools necessary to keep their private data out of the government's hands… The Surveillance Self-Defense project offers citizens a legal and technical toolkit with tips on how to defend themselves in case the government attempts to search, seize, subpoena or spy on their most private data.

SSD's design when it first launched in 2009.

To put this further into context, it's worth looking at where we were in 2009. Avatar was the top grossing movie of the year. Barack Obama was in his first term as president in the U.S. In a then-novel approach, Iranians turned to Twitter to organize protests. The NSA has a long history of spying on Americans, but we hadn't gotten to Jewel v. NSA or the Snowden revelations yet. And while the iPhone had been around for two years, it hadn't seen its first big privacy controversy yet (that would come in December of that year, but it'd be another year still before we hit the "your apps are watching you" stage).

Most importantly, in 2009 it was more complicated to keep your data secure than it is today. HTTPS wasn't common, using Tor required more technical know-how than it does nowadays, encrypted IMs were the fastest way to communicate securely, and full-disk encryption wasn't a common feature on smartphones. Even for computers, disk encryption required special software and knowledge to implement (not to mention time: solid state drives were still extremely expensive in 2009, so most people still had spinning disk hard drives, which took ages to encrypt and usually slowed down your computer significantly).

And thus, SSD in 2009 focused heavily on law enforcement and government access with its advice. Not long after the launch in 2009, in the midst of the Iranian uprising, we launched the international version, which focused on the concerns of individuals struggling to preserve their right to free expression in authoritarian regimes.

And that's where SSD stood, mostly as-is, for about six years. 

The Redesigns

In 2014, we redesigned and relaunched SSD with support from the Ford Foundation. The relaunch had at least 80 people involved in the writing, reviewing, design, and translation process. With the relaunch, there was also a shift in the mission as the threats expanded from just the government, to corporate and personal risks as well. From the press release:

"Everyone has something to protect, whether it's from the government or stalkers or data-miners," said EFF International Director Danny O'Brien. "Surveillance Self-Defense will help you think through your personal risk factors and concerns—is it an authoritarian government you need to worry about, or an ex-spouse, or your employer?—and guide you to appropriate tools and practices based on your specific situation."

2014 proved to be an effective year for a major update. After the murders of Michael Brown and Eric Garner, protestors hit the streets across the U.S., which made our protest guide particularly useful. There were also major security vulnerabilities that year, like Heartbleed, which caused all sorts of security issues for website operators and their visitors, and Shellshock, which opened up everything from servers to cameras to bug exploits, ushering in what felt like an endless stream of software updates on everything with a computer chip in it. And of course, there was still fallout from the Snowden leaks in 2013.

In 2018 we did another redesign, and added a new logo for SSD that came along with EFF's new design. This is more or less the same design of the site today.

SSD's current design, which further clarifies what sections a guide is in, and expands the security scenarios.

Perhaps the most notable difference between this iteration of SSD and the years before is the lack of detailed reasoning explaining the need for its existence on the front page. No longer was it necessary to explain why we all need to practice surveillance self-defense. Online surveillance had gone mainstream.

Shifting Language Over the Years

As the years passed and the site was redesigned, we also shifted how we talked about security. In 2009 we wrote about security with terms like, "adversaries," "defensive technology," "threat models," and "assets." These were all common cybersecurity terms at the time, but made security sound like a military exercise, which often disenfranchised the very people who needed help. For example, in the later part of the 2010s, we reworked the idea of "threat modeling," when we published Your Security Plan. This was meant to be less intimidating and more inclusive of the various types of risks that people face.

The advice in SSD has changed over the years, too. Take passwords as an example, where in 2009 we said, "Although we recommend memorizing your passwords, we recognize you probably won't." First off, rude! Second off, maybe that could fly with the lower number of accounts we all had back in 2009, but nowadays nobody is going to remember hundreds of passwords. And regardless, that seems pretty dang impossible when paired with the final bit of advice, "You should change passwords every week, every month, or every year — it all depends on the threat, the risk, and the value of the asset, traded against usability and convenience."

Moving on to 2015, we phrased this same sentiment much differently, "Reusing passwords is an exceptionally bad security practice, because if an attacker gets hold of one password, she will often try using that password on various accounts belonging to the same person… Avoiding password reuse is a valuable security precaution, but you won't be able to remember all your passwords if each one is different. Fortunately, there are software tools to help with this—a password manager."

Well, that's much more polite!

Since then, we've toned that down even more, "Reusing passwords is a dangerous security practice. If someone gets ahold of your password —whether that's from a data breach, or wherever else—they can often gain access to any other account you used that same password. The solution is to use unique passwords everywhere and take additional steps to secure your accounts when possible."

Security is an always evolving process, and so is how we talk about it. But the more people we bring on board, the better it is for everyone. How we talk about surveillance self-defense will assuredly continue to adapt in the future.

Shifting Language(s) Over the Years

Initially in 2009, SSD was only available in English, and soon after launch, in Bulgarian. In the 2014 re-launch, we added Arabic and Spanish, then added French, Thai, Vietnamese, and Urdu in 2015. Later that year, we added a handful of Amharic translations, too. This was accomplished through a web of people in dozens of countries who volunteered to translate and review everything. Many of these translations were done for highly specific reasons. For example, we had a Google Policy Fellow, Endalk Chala, who was part of the Zone 9 bloggers in Ethiopia. He translated everything into Amharic as he was fighting for his colleagues and friends who were imprisoned in Ethiopia on terrorism charges.

By 2019, we were translating most of SSD into at least 10 languages: Amharic, Arabic, Spanish, French, Russian, Turkish, Vietnamese, Brazilian Portuguese, Thai, and Urdu (as well as additional, externally-hosted community translations in Indonesian Bahasa, Burmese, Traditional Chinese, Igbo, Khmer, Swahili, Yoruba, and Twi).

Currently, we're focusing on getting the entirety of SSD re-translated into seven languages, then focusing our efforts on translating specific guides into other languages. 

Always Updating

Since 2009, we've done our best to review and update the guides in SSD. This has included minor changes to respond to news events, deprecating guides completely when they're no longer applicable in modern security plans, and massive rewrites when technology has changed.

The original version of SSD was launched mostly as static text (we even offered a printer-friendly version). Though updates and revisions did occur, they were not publicly tracked as clearly as they are today. In its early years, SSD was able to provide useful guidance across a number of important events, like Occupy Wall Street, before the major site redesign in 2014, which helped it become more useful for training activists, including during Ferguson and Standing Rock, among others. The ability to update SSD along with changing trends and needs has ensured it can always be useful as a resource.

That redesign also better facilitated the updates process. The site became easier to navigate and use, and easier to update. For example, in 2017 we took on a round of guide audits in response to concerns following the 2016 election. In 2019 we continued that process with around seven major updates to SSD, and in 2020, we did five. We don't have great stats for 2021 and 2022, but in 2023 we managed 14 major updates or new guides. We're hoping to have the majority of SSD reviewed and revamped by the end of this year, with a handful of expansions along the way.

Which brings us to the future of SSD. We will continue updating, adapting, and adding to SSD in the coming years. It is often impossible to know what will be needed, but rest assured we'll be there to answer that whenever we can. As mentioned above, this includes getting more translations underway, and continuing to ensure that everything is accurate and up-to-date so SSD can remain one of the best repositories of security information available online.

We hope you’ll join EFF in celebrating 15 years of SSD!

Thorin Klosowski

Privacy Isn't Dead. Far From It. | EFFector 36.3

1 month 2 weeks ago

As we continue the journey of fighting for digital freedoms, it can be hard to keep up on the latest happenings. Thankfully, EFF has a guide to keep you in the loop! EFFector 36.3 is out now and covers the latest news, including recent changes to the Kids Online Safety Act (spoiler alert: IT'S STILL BAD), why we flew a plane over San Francisco, and the first episode of Season 5 of our award-winning "How to Fix the Internet" podcast!

You can read the full newsletter here, or subscribe to get the next issue in your inbox automatically! You can also listen to the audio version of the newsletter on the Internet Archive, or by clicking the button below:


EFFector 36.3 | Privacy Isn't Dead. Far From It.

Since 1990 EFF has published EFFector to help keep readers on the bleeding edge of their digital rights. We know that the intersection of technology, civil liberties, human rights, and the law can be complicated, so EFFector is a great way to stay on top of things. The newsletter is chock full of links to updates, announcements, blog posts, and other stories to help keep readers—and listeners—up to date on the movement to protect online privacy and free expression. 

Thank you to the supporters around the world who make our work possible! If you're not a member yet, join EFF today to help us fight for a brighter digital future.

Christian Romero

A Virtual Reality Tour of Surveillance Tech at the Border: A Conversation with Dave Maass of the Electronic Frontier Foundation

1 month 2 weeks ago

This interview is crossposted from The Markup, a nonprofit news organization that investigates technology and its impact on society.

By: Monique O. Madan, Investigative Reporter at The Markup

After reading my daily news stories amid his declining health, my grandfather made it a habit of traveling the world—all from his desk and wheelchair. When I went on trips, he always had strong opinions and recommendations for me, as if he’d already been there. “I've traveled to hundreds of countries," he would tell me. "It's called Google Earth. Today, I’m going to Armenia.” My Abuelo’s passion for teleporting via Google Street View has always been one of my fondest memories and has never left me. 

So naturally, when I found out that Dave Maass of the Electronic Frontier Foundation gave virtual reality tours of surveillance technology along the U.S.–Mexico border, I had to make it happen. I cover technology at the intersection of immigration, criminal justice, social justice and government accountability, and Maass’ tour aligns with my work as I investigate border surveillance. 

My journey began in a small, quiet conference room at the Homestead Cybrarium, a hybrid virtual public library where I checked out virtual reality gear. The moment I slid the headset onto my face and the tour started, I was transported to a beach in San Diego. An hour and a half later, I had traveled across 1,500 miles' worth of towns and deserts and ended up in Brownsville, Texas.

During that time, we looked at surveillance technology in 27 different cities on both sides of the border. Some of the tech I saw were autonomous towers, aerostat blimps, sky towers, automated license plate readers, and border checkpoints. 

After the excursion, I talked with Maass, a former journalist, about the experience. Our conversation has been edited for brevity and clarity.

Monique O. Madan: You began by dropping me in San Diego, California, and it was intense. Tell me why you chose the location to start this experience.

Dave Maass: So I typically start the tour in San Diego for two reasons. One is because it is the westernmost part of the border, so it's a natural place to start. But more importantly, it is such a stark contrast to be able to jump from one side to the other, from the San Diego side to the Tijuana side.

When you're in San Diego, you're in this very militarized park that's totally empty, with patrol vehicles and this very fierce-looking wall and a giant surveillance tower over your head. You can really get a sense of the scale.

And once you're used to that, I jump you to the other side of the wall. You're able to suddenly see how it's party time in Tijuana, how they painted the wall, and how there are restaurants and food stands and people playing on the beach and there are all these Instagram moments.

Credit: Electronic Frontier Foundation

Yet on the other side is the American militarized border, you know, essentially spying on everybody who's just going about their lives on the Mexican side.

It also serves as a way to show the power of VR. If there were no wall, you could walk that in a minute. But because of the border wall, you've got to go all the way to the border crossing, and then come all the way back. And we're talking, potentially, hours for you to be able to go that distance. 

Madan: I felt like I was in two different places, but it was really the same place, just feet away from each other. We saw remote video surveillance systems, relocatable ones. We saw integrated fixed towers, autonomous surveillance towers, sky towers, aerostat radar systems, and then covert automated license plate readers. How do you get the average person to digest what all these things really mean?

7 Stops on Dave Maass’ Virtual Reality Surveillance Tour of the U.S.–Mexico Border

The following links take you to Google Street View.

Maass: Me and some colleagues at EFF, we were looking at how we could use virtual reality to help people understand surveillance. We came up with a very basic game called “Spot the Surveillance,” where you could put on a headset and it puts you in one location with a 360-degree camera view. We took a photo of a corner in San Francisco that already had a lot of surveillance, but we also Photoshopped in other pieces of surveillance. The idea was for people to look around and try to find the surveillance.

When they found one, it would ping, and it would tell you what the technology could do. And we found that that helped people learn to look around their environment for these technologies, to understand it. So it gave people a better idea of how we exist in the environment differently than if they were shown a picture or a PowerPoint presentation that was like, “This is what a license plate reader looks like. This is what a drone looks like.”

That is why when we're on the southern border tour, there are certain places where I don't point the technology out to you. I ask you to look around and see if you can find it yourself.

Sometimes I start with one where it's overhead because people are looking around. They're pointing to a radio tower, pointing to something else. It takes them a while before they actually look up in the sky and see there's this giant spy blimp over their head. But, yeah, one of the other ones is these license plate readers that are hidden in traffic cones. People don't notice them there because they're just these traffic cones that are so ubiquitous along highways and streets that they don't actually think about it.

Madan: People have the impression that surveillance ops are only in militarized settings. Can you talk to me about whether that’s true?

Maass: Certainly there are towers in the middle of the desert. Certainly there are towers that are in remote or rural areas. But there are just so many that are in urban areas, from big cities to small towns.

Rather than just a close-up picture of a tower, once you actually see one and you're able to look at where the cameras are pointed, you start to see things like towers that are able to look into people's back windows, and towers that are able to look into people's backyards, and whole communities that are going to have glimpses over their neighborhood all the time.

But so rarely in the conversation is the impact on the communities that live on both the U.S. and Mexican side of the border, and who are just there all the time trying to get by and have, you know, the normal dream of prospering and raising a family.

Madan: What does this mean from a privacy, human rights, and civil liberties standpoint? 

Maass: There’s not a lot of transparency around issues of technology. That is one of the major flaws, both for human rights and civil liberties, but it's also a flaw for those who believe that technology is going to address whatever amorphous problem they've identified or failed to identify with border security and migration. So it's hard to know when this is being abused and how.

But what we can say is that as [the government] is applying more artificial intelligence to its camera system, it's able to document the pattern of life of people who live along the border.

It may be capturing people and learning where they work and where they're worshiping or who they are associated with. So you can imagine that if you are somebody who lives in that community and if you're living in that community your whole life, the government may have, by the time you're 31 years old, your entire driving history on file that somebody can access at any time, with who knows what safeguards are in place.

But beyond all that, it really normalizes surveillance for a whole community.

There are a lot of psychological studies out there about how surveillance can affect people over time, affect their behavior, and affect their perceptions of a society. That's one of the other things I worry about: What kind of psychological trauma is surveillance causing for these communities over the long term, in ways that may not be immediately perceptible?

Madan: One of the most interesting uses of experiencing this tour via the VR technology was being able to pause and observe every single detail at the border checkpoint.

Maass: Most people are just rolling through, and so you don't get to notice all of the different elements of a checkpoint. But because the Google Street View car went through, we can roll through it at our leisure and point out different things. I have a series of checkpoints that I go through with people, show them this is where the license plate reader is, this is where the scanner truck is, here's the first surveillance camera, here's the second surveillance camera. We can see the body-worn camera on this particular officer. Here's where people are searched. Here's where they're detained. Here's where their car is rolled through an X-ray machine.

Madan: So your team has been mapping border surveillance for a while. Tell us about that and how it fits into this experience.

Maass: We started mapping out the towers in 2022, but we had started researching and building a database of at least the amount of surveillance towers by district in 2019. 

I don't think it was apparent to anyone until we started mapping these out, how concentrated towers are in populated areas. Maybe if you were in one of those populated areas, you knew about it, or maybe you didn't.

In the long haul, it may start to tell a little bit more about border policy in general and whether any of these are having any kind of impact, and maybe we start to learn more about apprehensions and other kinds of data that we can connect to.

Madan: If someone wanted to take a tour like this, if they wanted to hop on in VR and visit a few of these places, how can they do that? 

Maass: So if they have a VR headset, a Meta Quest 2 or newer, the Wander app is what you're going to use. You can just go into the app and position yourself somewhere on the border. Jump around a little bit, maybe it will be like five feet, and you can start seeing a surveillance tower.

If you don’t have a headset and want to do it in your browser, you can go to EFF’s map and click on a tower. You’ll see a Street View link when you scroll down. Or you can use those tower coordinates and then go to your VR headset and try to find it.

Madan: What are your thoughts about the Meta Quest headset—formerly known as the Oculus Rift—being founded by Palmer Luckey, who also founded the company that made one of the towers on the tour?

Maass: There’s certainly some irony about using a technology that was championed by Palmer Luckey to shine light on another technology championed by Palmer Luckey. That's not the only tech irony, of course: Wander [the app used for the tour] also depends on using products from Google and Meta, both of whom continue to contribute to the rise of surveillance in society, to investigate surveillance.

Madan: What's your biggest takeaway as the person giving this tour?

Maass: I am a researcher and educator, and an activist and communicator. To me, this is one of the most impactful ways that I can reach people and give them a meaningful experience about the border. 

I think that when people are consuming information about the border, they're just getting little snippets from a little particular area. You know, it's always a little place that they're getting a little sliver of what's going on. 

But when we're able to do this with VR, I'm able to take them everywhere. I'm able to take them to both sides of the border. We're able to see a whole lot, and they're able to come away by the end of it, feeling like they were there. Like your brain starts filling in the blanks. People get this experience that they wouldn't be able to get any other way.

Being able to linger over these spaces on my own time showed me just how much surveillance is truly embedded in people's daily lives. When I left the library, I found myself inspecting traffic cones for license plate readers. 

As I continue to investigate border surveillance, this experience really showed me just how educational these tools can be for academics, research and journalism. 

Thanks for reading,
Monique O. Madan
Investigative Reporter
The Markup

This article was originally published on The Markup and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.

Dave Maass

Ghana's President Must Refuse to Sign the Anti-LGBTQ+ Bill

1 month 3 weeks ago

After three years of political discussions, MPs in Ghana's Parliament voted to pass the country’s draconian Promotion of Proper Human Sexual Rights and Ghanaian Family Values Bill on February 28th. The bill now heads to Ghana’s President Nana Akufo-Addo to be signed into law. 

President Nana Akufo-Addo must protect the human rights of all people in Ghana and refuse to provide assent to the bill.

This anti-LGBTQ+ legislation introduces prison sentences for those who partake in LGBTQ+ sexual acts, as well as those who promote the rights of gay, lesbian or other non-conventional sexual or gender identities. This would effectively ban all speech and activity on and offline that even remotely supports LGBTQ+ rights.

Ghanaian authorities could probe the social media accounts of anyone applying for a visa for pro-LGBTQ+ speech or create lists of pro-LGBTQ+ supporters to be arrested upon entry. They could also require online platforms to suppress content about LGBTQ+ issues, regardless of where it was created. 

Doing so would criminalize the activity of many major cultural and commercial institutions. If President Akufo-Addo does approve the bill, musicians, corporations, and other entities that openly support LGBTQ+ rights would be banned in Ghana.

Despite this direct threat to online freedom of expression, tech giants have yet to speak out publicly against the LGBTQ+ persecution in Ghana. Twitter opened its first African office in Accra in April 2021, citing Ghana as “a supporter of free speech, online freedom, and the Open Internet.” Adaora Ikenze, Facebook’s head of Public Policy in Anglophone West Africa, has said: “We want the millions of people in Ghana and around the world who use our services to be able to connect, share and express themselves freely and safely, and will continue to protect their ability to do that on our platforms.” Both companies have essentially dodged the question.

For many countries across Africa, and indeed the world, the codification of anti-LGBTQ+ discourses and beliefs can be traced back to colonial rule, and a recent CNN investigation from December 2023 found alleged links between the drafting of homophobic laws in Africa and a US nonprofit. The group denied those links, despite having hosted a political conference in Accra shortly before an early version of this bill was drafted.

Regardless of its origin, the past three years of political and social discussion have contributed to a decimation of LGBTQ+ rights in Ghana, and the decision by MPs in Ghana’s Parliament to pass this bill creates severe impacts not just for LGBTQ+ people in Ghana, but for the very principle of free expression online and off. President Nana Akufo-Addo must reject it.

Paige Collings

We Flew a Plane Over San Francisco to Fight Proposition E. Here's Why.

1 month 3 weeks ago

Proposition E, which San Franciscans will be asked to vote on in the March 5 election, is so dangerous that last weekend we chartered a plane to inform our neighbors about what the ballot measure does and urge them to vote NO on it. If you were in Dolores Park, Golden Gate Park, Chinatown, or anywhere in between on Saturday, there’s a chance you saw it, with a huge banner flying through the sky: “No Surveillance State! No on Prop E.”

Despite the fact that the San Francisco Chronicle has endorsed a NO vote on Prop E, and even quoted some police who don’t find its changes useful to keeping the public safe, proponents of Prop E have raised over $1 million to push this unnecessary, ill-thought-out, and downright dangerous ballot measure.

San Francisco, Say NOPE: Vote NO on Prop E on March 5

What Does Prop E Do?

Prop E is a haphazard mess of proposals that tries to capitalize on residents’ fear of crime in an attempt to gut commonsense democratic oversight of the San Francisco Police Department (SFPD). In addition to removing certain police oversight authority from the civilian-staffed Police Commission and expanding the circumstances under which police may conduct high-speed vehicle chases, Prop E would also amend existing law passed in 2019 to protect San Franciscans from invasive, untested, or biased police surveillance technologies. Currently, if the SFPD wants to acquire a new technology, they must provide a detailed use policy to the democratically-elected Board of Supervisors, in a process that allows for public comment. The Board then votes on whether and how the police can use the technology.

Prop E guts these protective measures designed to bring communities into the conversation about public safety. If Prop E passes on March 5, then the SFPD can unilaterally use any technology they want for a full year without the Board’s approval, without publishing an official policy about how they’d use the technology, and without allowing community members to voice their concerns.

Why is Prop E Dangerous and Unnecessary?

Across the country, police often buy and deploy surveillance equipment without residents of their towns even knowing what police are using or how they’re using it. This means that dangerous technologies—technologies other cities have even banned—are being used without any transparency, accountability, or democratic control.

San Franciscans advocated for and overwhelmingly supported a law that gives them more knowledge of, and a voice in, which technologies the police use. Under current law, if the SFPD wanted to use racist predictive policing algorithms (which U.S. Senators are currently advising the Department of Justice to stop funding), or wanted to buy up geolocation data harvested from people’s cell phones and sold on the advertising data broker market, it would first have to let the public know and put the policy to a vote before the city’s democratically-elected governing body. Prop E would gut any meaningful democratic check on the police’s acquisition and use of surveillance technologies.

What Technology Would Prop E Allow Police to Use?

That's the thing—we don't know, and if Prop E passes, we may never know. Today, if the SFPD decides to use a piece of surveillance technology, there is a process for sharing that information with the public. With Prop E, that process won't happen until the technology has been in use for a full year. And if police abandon use of a technology before a year, we may never find out what technology police tried out and how they used it. 

Even though we don't know what technologies the SFPD is eyeing, we do know what technologies other police departments have been buying in cities around the country: AI-based “predictive policing,” and social media scanning tools are just two examples. And according to the City Attorney, Prop E would even enable the SFPD to outfit surveillance tools such as drones and surveillance cameras with face recognition technology. San Francisco currently has a ban on police using remote-controlled robots to deploy deadly force, but if passed, Prop E would allow police to invest in technologies like taser-armed drones without any oversight or potential for elected officials to block the sale. 

Don’t let police experiment on San Franciscans with dangerous, untested surveillance technologies. Say NOPE to a surveillance state. Vote NO on Prop E on March 5.  
Matthew Guariglia

Sen. Wyden Exposes Data Brokers Selling Location Data to Anti-Abortion Groups That Target Abortion Seekers

1 month 3 weeks ago

This post was written by Jack Beck, an EFF legal intern

In a recent letter to the FTC and SEC, Sen. Ron Wyden (OR) details new information on data broker Near, which sold the location data of people seeking reproductive healthcare to anti-abortion groups. Near enabled these groups to send targeted ads promoting anti-abortion content to people who had visited Planned Parenthood and similar clinics.

In May 2023, the Wall Street Journal reported that Near was selling location data to anti-abortion groups. Specifically, the Journal found that the Veritas Society, a non-profit established by Wisconsin Right to Life, had hired ad agency Recrue Media. That agency purchased location data from Near and used it to target anti-abortion messaging at people who had sought reproductive healthcare.

The Veritas Society detailed the operation on its website (on a page that was taken down but saved by the Internet Archive) and stated that it delivered over 14 million ads to people who visited reproductive healthcare clinics. These ads appeared on Facebook, Instagram, Snapchat, and other social media for people who had sought reproductive healthcare.

When contacted by Sen. Wyden’s investigative team, Recrue staff admitted that the agency used Near’s website to literally “draw a line” around areas their client wanted them to target. They drew these lines around reproductive health care facilities across the country, using location data purchased from Near to target visitors to 600 different Planned Parenthood locations. Sen. Wyden’s team also confirmed with Near that, until the summer of 2022, no safeguards were in place to protect the data privacy of people visiting sensitive places.
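The mechanics behind this kind of targeting are disturbingly simple: once a broker has raw location pings, matching them against a circle drawn around a clinic takes only a few lines of code. The sketch below is purely illustrative (the function names, data layout, and 100-meter radius are our assumptions, not a description of Near's or Recrue's actual systems):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two latitude/longitude points.
    earth_radius_m = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * earth_radius_m * math.asin(math.sqrt(a))

def device_ids_near(facility, pings, radius_m=100):
    # Collect the IDs of every device whose ping lands inside the geofence.
    # `pings` is a hypothetical broker feed: dicts of device_id, lat, lon.
    lat, lon = facility
    return {p["device_id"] for p in pings
            if haversine_m(lat, lon, p["lat"], p["lon"]) <= radius_m}
```

Any device ID captured this way can then be re-targeted with ads on whatever platforms the broker's partners reach, which is why "drawing a line" around a sensitive location is all the sophistication such a campaign requires.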

Moreover, as Sen. Wyden explains in his letter, Near was selling data to the government, though it claimed on its website to be doing no such thing. As of October 18, 2023, Sen. Wyden’s investigation found Near was still selling location data harvested from Americans without their informed consent.

Near’s invasion of our privacy shows why Congress and the states must enact privacy-first legislation that limits how corporations collect and monetize our data. We also need privacy statutes that prevent the government from sidestepping the Fourth Amendment by purchasing location information—as Sen. Wyden has proposed. Even the government admits this is a problem.  Furthermore, as Near’s misconduct illustrates, safeguards must be in place that protect people in sensitive locations from being tracked.

This isn’t the first time we’ve seen data brokers sell information that can reveal visits to abortion clinics. We need laws now to strengthen privacy protections for consumers. We thank Sen. Wyden for conducting this investigation. We also commend the FTC’s recent bar on a data broker selling sensitive location data. We hope this represents the start of a longstanding trend.

Adam Schwartz

EFF to D.C. Circuit: The U.S. Government’s Forced Disclosure of Visa Applicants’ Social Media Identifiers Harms Free Speech and Privacy

1 month 3 weeks ago

Special thanks to legal intern Alissa Johnson, who was the lead author of this post.

EFF recently filed an amicus brief in the U.S. Court of Appeals for the D.C. Circuit urging the court to reverse a lower court decision upholding a State Department rule that forces visa applicants to the United States to disclose their social media identifiers as part of the application process. If upheld, the district court ruling has severe implications for free speech and privacy not just for visa applicants, but also the people in their social media networks—millions, if not billions of people, given that the “Disclosure Requirement” applies to 14.7 million visa applicants annually.

Since 2019, visa applicants to the United States have been required to disclose social media identifiers they have used in the last five years to the U.S. government. Two U.S.-based organizations that regularly collaborate with documentary filmmakers around the world sued, challenging the policy on First Amendment and other grounds. A federal judge dismissed the case in August 2023, and plaintiffs filed an appeal, asserting that the district court erred in applying an overly deferential standard of review to plaintiffs’ First Amendment claims, among other arguments.

Our amicus brief lays out the privacy interests that visa applicants have in their public-facing social media profiles, the Disclosure Requirement’s chilling effect on the speech of both applicants and their social media connections, and the features of social media platforms like Facebook, Instagram, and X that reinforce these privacy interests and chilling effects.

Social media paints an alarmingly detailed picture of users’ personal lives, covering far more information than can be gleaned from a visa application. Although the Disclosure Requirement implicates only “public-facing” social media profiles, registering these profiles still exposes substantial personal information to the U.S. government because of the number of people impacted and the vast amounts of information shared on social media, both intentionally and unintentionally. Moreover, collecting data across social media platforms gives the U.S. government access to a wealth of information that may reveal more in combination than any individual question or post would alone. This risk is heightened further if government agencies use automated tools to conduct their review—which the State Department has not ruled out and which the Department of Homeland Security’s component Customs and Border Protection has already begun doing in its own social media monitoring program. Visa applicants may also unintentionally reveal personal information on their public-facing profiles, either due to difficulties in navigating default privacy settings within or across platforms, or through personal information posted by social media connections rather than the applicants themselves.

The Disclosure Requirement’s infringements on applicants’ privacy are further heightened because visa applicants are subject to social media monitoring not just during the visa vetting process, but even after they arrive in the United States. The policy also allows for public social media information to be stored in government databases for upwards of 100 years and shared with domestic and foreign government entities.  

Because of the Disclosure Requirement’s potential to expose vast amounts of applicants’ personal information, the policy chills First Amendment-protected speech of both the applicant themselves and their social media connections. The Disclosure Requirement allows the government to link pseudonymous accounts to real-world identities, impeding applicants’ ability to exist anonymously in online spaces. In response, a visa applicant might limit their speech, shut down pseudonymous accounts, or disengage from social media altogether. They might disassociate from others for fear that those connections could be offensive to the U.S. government. And their social media connections—including U.S. persons—might limit or sever online connections with friends, family, or colleagues who may be applying for a U.S. visa for fear of being under the government’s watchful eye.  

The Disclosure Requirement hamstrings the ability of visa applicants and their social media connections to freely engage in speech and association online. We hope that the D.C. Circuit reverses the district court’s ruling and remands the case for further proceedings.

Saira Hussain