Outliving Outrage on the Public Interest Internet: the CDDB Story


This is the third in our blog series on the public interest internet: past, present and future.

In our previous blog post, we discussed how in the early days of the internet, regulators feared that without strict copyright enforcement and pre-packaged entertainment, the new digital frontier would be empty of content. But the public interest internet barn-raised to fill the gap—before the fledgling digital giants commercialised and enclosed those innovations. These enclosures did not go unnoticed, however—and some worked to keep the public interest internet alive.

Compact discs (CDs) were the cutting edge of the digital revolution a decade before the web. Their adoption initially followed Lehman’s rightsholder-led transition – where existing publishers led the charge into a new medium, rather than the user-led homesteading of the internet. The existing record labels maintained control of CD production and distribution, and did little to exploit the new tech—but they did profit from bringing their old back catalogues onto the new digital format. The format was immensely profitable, because everyone re-bought their existing vinyl collections on CD. Beyond the improved fidelity of CDs, the music industry had no incentive to add new functionality to CDs or their players. When CD players were first introduced, they were sold exclusively as self-contained music devices—a straight-up replacement for record players that you could plug into speakers or your hi-fi “music centre,” but not much else. They were digital, but in no way online or integrated with any other digital technology.

The exception was the CD playing hardware that was incorporated into the latest multimedia PCs—a repurposing of the dedicated music playing hardware which sent the CD to the PC as a pile of digital data. With this tech, you could use CDs as a read-only data store, a fixed set of data, a “CD-ROM”; or you could insert a CD music disc, and use your desktop PC to read in and play its digital audio files through tinny desktop speakers, or headphones.

The crazy thing was that those music CDs contained raw dumps of audio, but almost nothing else. There was no bonus artist info stored on the CDs; no digital record of the CD title, no JPEG of the CD’s cover art, not even a user-readable filename or two: just 74 minutes of untitled digital sound data, split into separate tracks, like its vinyl forebear. Consequently, a PC with a CD player could read and play a CD, but had no idea what it was playing. About the only additional information a computer could extract from the CD beyond the raw audio was the total number of tracks, and how long each track lasted. Plug a CD into a player or a PC, and all it could tell you was that you were now listening to Track 3 of 12.

Around about the same time as movie enthusiasts were building the IMDb, music enthusiasts were solving this problem by collectively building their own compact disc database—the CD Database (CDDB). Programmer Ti Kan wrote open source client software that would auto-run when a CD was put into a computer, and grab the number of tracks and their lengths. This client would query a public online database (designed by another coder, Steve Scherf) to see if anyone else had seen a CD with the same fingerprint. If no one had, the program would pop up a window asking the PC user to enter the album details themselves, and would upload that information to the collective store, ready for the next user to find. All it took was one volunteer to enter the album info and associate it with the unique fingerprint of track durations; every future CDDB client could then grab the data and display it the moment the CD was inserted, letting its user pick tracks by name, peruse artist details, and so on.
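That fingerprint was remarkably simple. The classic CDDB disc ID packed a digit-sum checksum of the track start times, the disc’s total playing time, and the track count into a single 32-bit number. Here is a simplified sketch in Python of that widely documented scheme (the three-track disc below is hypothetical; real clients read the track offsets, in 75-frames-per-second units, from the disc’s table of contents):

```python
def _digit_sum(n: int) -> int:
    """Sum the decimal digits of n (the CDDB checksum primitive)."""
    s = 0
    while n > 0:
        s += n % 10
        n //= 10
    return s

def cddb_disc_id(track_offsets: list[int], leadout_offset: int) -> int:
    """Compute a classic 32-bit CDDB-style disc ID.

    track_offsets: start position of each track in CD frames
    (75 frames per second, including the standard 2-second lead-in).
    leadout_offset: frame offset of the disc's lead-out.
    """
    # Checksum: digit-sum of each track's start time in whole seconds
    n = sum(_digit_sum(off // 75) for off in track_offsets)
    # Total playing time in seconds, first track start to lead-out
    t = leadout_offset // 75 - track_offsets[0] // 75
    # Pack checksum, playing time, and track count into one 32-bit value
    return ((n % 0xFF) << 24) | (t << 8) | len(track_offsets)

# A hypothetical 3-track disc; the low byte of the ID is the track count
disc_id = cddb_disc_id([150, 15000, 30000], 45000)
print(f"{disc_id:08x}")  # → 08025603
```

Two discs with the same number of tracks and the same track lengths collide, which happened in practice; but for a volunteer-built lookup key that required no cooperation from the record labels, it was good enough.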

The modern internet, buffeted as it is by monopolies, exploitation, and market and regulatory failure, still allows people to organize at low cost, with high levels of informality.

When it started, most users of the CDDB had to precede much of their music-listening time with a short burst of volunteer data entry. But within months, the collective contributions of the Internet’s music fans had created a unique catalogue of current music that far exceeded the information contained even in expensive, proprietary industry databases. Deprived of any useful digital accommodations by the music industry, CD fans, armed with the user-empowering PC and the internet, built their own solution.

This story, too, does not have a happy ending. In fact, in some ways the CDDB is the most notorious tale of enclosure on the early Net. Kan and Scherf soon realised the valuable asset that they were sitting on, and along with the hosting administrator of the original database server, built it into a commercial company, just as the overseers of Cardiff’s movie database had. Between 2000 and 2001, as “Gracenote”, this commercial company shifted from a free service, incorporated by its many happy users into a slew of open source players, to serving hardware companies, which it charged for a CD recognition service. It changed its client software to a closed proprietary software license, attached restrictive requirements to any code that used its API, and eventually blocked clients who did not agree to its license entirely.

The wider CDDB community was outraged, and the bitterness persisted online for years afterwards. Five years later, Scherf defended his actions in a Wired magazine interview. His explanation was the same as that of the IMDb’s founders: that finding a commercial owner and business model was the only way to fund the CDDB as a viable ongoing concern. He noted that other groups of volunteers, notably an alternative service called freedb, had forked the database and client code from a point just before Gracenote locked it up. He agreed that was their right, and encouraged them to keep at it, but expressed scepticism that they would survive. “The focus and dedication required for CDDB to grow could not be found in a community effort,” he told Wired. “If you look at how stagnant efforts like freedb have been, you’ll see what I mean.” By locking down and commercializing the CDDB, Scherf said that he “fully expect[ed] our disc-recognition service to be running for decades to come.”

Scherf may have overestimated the lifetime of CDs, and underestimated the persistence of free versions of the CDDB. While freedb closed last year, Gnudb, an alternative derived from freedb, continues to operate. Its far smaller set of contributors doesn’t cover the latest CD releases as thoroughly, but its data remains open for everyone to use—not just for the remaining CD diehards, but also as a permanent historical record of the CD era’s back catalogue: its authors, its releases, and every single track. Publicly available, publicly collected, and publicly usable, in perpetuity. Whatever criticisms might be laid at the feet of this form of the public interest internet, fragility is not among them. It hasn’t changed much, which may count as stagnation to Scherf—especially compared to the multi-million dollar company that Gracenote has become. But as Gracenote itself was bought up (first by Sony, then by Nielsen), re-branded, and re-focused, its predecessor has distinctly failed to disappear.

Some Internet services do survive and prosper by becoming the largest, or by being bought by the largest. These success stories are very visible, if not organically, then because they can afford marketers and publicists. If we listen exclusively to these louder voices, our assumption would be that the story of the Internet is one of consolidation and monopolization. And if—or perhaps just when—these conglomerates go bad, their failings are just as visible.

But smaller stories, successful or not, are harder to see. When we dive into this area, things become more complicated. Public interest internet services can be engulfed and transformed into strictly commercial operations, but they don’t have to be. In fact, they can persist and outlast their commercial cousins.

And that’s because the modern internet, buffeted as it is by monopolies, exploitation, and market and regulatory failure, still allows people to organize at low cost, with high levels of informality, in a way that can often be more efficient, flexible and antifragile than strictly commercial, private interest services, or the centrally-planned government production of public goods.

Next week: we continue our look at music recognition, and see how public interest internet initiatives can not only hang on as long as their commercial rivals, but continue to innovate, grow, and financially support their communities.

Danny O'Brien

The Enclosure of the Public Interest Internet


This is the second in our blog series on the public interest internet: past, present and future.

It’s hard to believe now, but in the early days of the public internet, the greatest worry of some of its most high-powered advocates was that it would be empty. As the Clinton administration prepared to transition the internet from its academic and military origins to the heart of the promised “national information infrastructure” (NII), the government’s advisors fretted that the United States entertainment and information industries would have no commercial reason to switch from TV, radio, and recorded music. And without Hollywood and the record labels on board, the new digital environment would end up as a ghost mall, devoid of businesses or users.

 “All the computers, telephones, fax machines, scanners, cameras, keyboards, televisions, monitors, printers, switches, routers, wires, cables, networks and satellites in the world will not create a successful NII, if there is not content”, former Patent Office head Bruce Lehman’s notorious 1994 government green paper on intellectual property on the Net warned. The fear was that without the presence of the pre-packaged material of America’s entertainment industry, the nation would simply refuse to go online. As law professor Jessica Litman describes it, these experts’ vision of the Internet was “a collection of empty pipes, waiting to be filled with content.” 

Even as the politicians were drafting new, more punitive copyright laws intended to reassure Hollywood and the record labels (and tempt them into new, uncharted waters), the Internet’s first users were moving in and building anyway. Even with its tiny audience of technologists, first-adopters, and university students, the early net quickly filled with compelling “content,” a free-wheeling, participatory online media that drew ever larger crowds as it evolved.

Even in the absence of music and movies, the first net users built towers of information about them anyway. In rec.arts.movies, the Usenet discussion forum devoted to all things Hollywood, posters had been compiling and sharing lists of their favourite motion picture actors, directors, and trivia since the 1980s. By the time of the Lehman report, the collective knowledge of the newsgroup had outgrown its textual FAQs, and expanded first to a collectively-managed database on Colorado University’s file site, and then onward to one of the very first database-driven websites, hosted on a spare server at Wales’ Cardiff University.

Built in the same barn-raising spirit of the early net, the public interest internet exploits the low cost of organizing online to provide stable, free repositories of user-contributed information. They have escaped an exploited fate as proprietary services owned by a handful of tech giants.

These days, you’ll know that Cardiff Movie Database by another name – the IMDb. The database that had grown out of the rec.arts.movies contributions was turned into a commercial company in 1996 and sold to Amazon in 1998 for around $55 million (equivalent to $88 million today). The Cardiff volunteers, led by one of its original moderators, Col Needham, continued to run the service as salaried employees of an Amazon subsidiary.

The IMDb shows how the original assumptions of Internet growth were turned on their head. Instead of movie production companies leading the way, their own audience had successfully built and monetised the elusive “content” of the information superhighway by themselves—for themselves. The data of the rec.arts.movies database was used by Amazon as the seed to build an exclusive subscription service, IMDbPro, for movie business professionals, and to augment their Amazon Prime video streaming service with quick-access film facts. Rather than needing the movie moguls’ permission to fill the Internet, the Internet ended up supplying information that those moguls themselves happily paid a new, digital mogul for.

But what about those volunteers who gave their time and labor to the collective effort of building this database for everyone? Apart from the few who became employees and shareholders of the commercial IMDb, they didn’t get a cut of the service’s profits. They also lost access to the full fruits of that comprehensive movie database. While you can still download the updated core of the Cardiff database for free, it only covers the most basic fields of the IMDb. It is licensed under a strictly non-commercial license, fenced off with limitations and restrictions. No matter how much you might contribute to the IMDb, you can’t profit from your labor. The deeper info that was originally built from user contributions and supplemented by Amazon has been enclosed: shut away in a proprietary, paywalled property, gated off from the super-highway it rode in on.

It’s a story as old as the net is, and echoes historic stories of the enclosure of the commons. A pessimist would say that this has been the fate of much of the early net and its aspirations. Digital natives built, as volunteers, free resources for everyone. Then, struggling to keep them online in the face of the burdens of unexpected growth, they ended up selling up to commercial interests. Big Tech grew to its monopoly position by harvesting this public commons, and then locking it away.

But it’s not the only story from the early net. Everyone knows, too, the large public projects that somehow managed to steer away from this path. Wikipedia is the archetype, still updated by casual contributors and defiantly unpaid editors across the world, with the maintenance costs of its website comfortably funded by regular appeals from its attached non-profit. Less known, but just as unique, is Open Street Map (OSM), a user-built, freely-licensed alternative to Google Maps, which has compiled from public domain sources and the hard work of its volunteer cartographers one of the most comprehensive maps of the entire earth. 

These are flagships of what we at EFF call the public interest internet. They produce and constantly replenish priceless public goods, available for everyone, while remaining separate from government, those traditional maintainers of public goods. Neither are they commercial enterprises, creating private wealth and (one hopes) public benefit through the incentive of profit. Built in the same barn-raising spirit of the early net, the public interest internet exploits the low cost of organizing online to provide stable, free repositories of user-contributed information. Through careful stewardship, or unique advantages, they have somehow escaped an enclosed and exploited fate as a proprietary service owned by a handful of tech giants.

That said, while Wikipedia and OSM are easy, go-to examples of the public interest internet, they are not necessarily representative of it. Wikipedia and OSM, in their own way, are tech giants too. They run at the same global scale. They struggle with some of the same issues of accountability and market dominance. It’s hard to imagine a true competitor to Wikipedia or OSM emerging now, for instance—even though many have tried and failed. Their very uniqueness means that their influence is outsized. The remote, in-house politics at these institutions has real effects on the rest of society. Both Wikipedia and OSM have complex, often carefully negotiated, large-scale interactions with the tech giants. Google integrates Wikipedia into its searches, cementing the encyclopedia’s position. OSM is used by, and receives contributions from, Facebook and Apple. It can be hard to know how individual contributors or users can affect the governance of these mega-projects or change the course of them. And there’s a recurring fear that the tech giants have more influence than the builders of these projects.

Besides, if there’s really only a handful of popular examples of public good production by the public interest internet, is that really a healthy alternative to the rest of the net? Are these just crocodiles and alligators, a few visible survivors from a previous age of out-evolved dinosaurs, doomed to be ultimately outpaced by sprightlier commercial rivals?

At EFF, we don’t think so. We think there’s a thriving economy of smaller public interest internet projects, which have worked out their own ways to survive on the modern internet. We think they deserve a role and representation in the discussions governments are having about the future of the net. Going further, we’d say that the real dinosaurs are our current tech giants. The small, sprightly, and public-minded public interest internet has always been where the benefits of the internet have been concentrated. They’re the internet’s mammalian survivors, hiding out in the nooks of the net, waiting to take back control when the tech giants are history.

In our next installment, we take a look at one of the most notorious examples of early digital enclosure, its (somewhat) happier ending, and what it says about the survival skills of the public interest internet when a free database of compact discs outlasts the compact disc boom itself.

Danny O'Brien

Introducing the Public Interest Internet


Say the word “internet” these days, and most people will call to mind images of Mark Zuckerberg and Jeff Bezos, of Google and Twitter: sprawling, intrusive, unaccountable. This tiny handful of vast tech corporations and their distant CEOs demand our online attention and dominate the offline headlines. 

But on the real internet, one or two clicks away from that handful of conglomerates, there remains a wider, more diverse, and more generous world. Often run by volunteers, frequently without any obvious institutional affiliation, sometimes tiny, often local, but free for everyone online to use and contribute to, this internet preceded Big Tech, and inspired the earliest, most optimistic vision of its future place in society.

When Big Tech is long gone, a better future will come from the seed of this public interest internet: seeds that are being planted now, and which need everyone to nurture them. 

The word “internet” has been so effectively hijacked by its most dystopian corners that it’s grown harder to even refer to this older element of online life, let alone bring it back into the forefront of society’s consideration. In his work documenting this space and exploring its future, academic, entrepreneur, and author Ethan Zuckerman has named it our “digital public infrastructure.” Hana Schank and her colleagues at the New America think tank have revitalized discussions around what they call “public interest technology.”  In Europe, activists, academics and public sector broadcasters talk about the benefits of the internet’s “public spaces” and improving and expanding the “public stack.” Author and activist Eli Pariser has dedicated a new venture to advancing better digital spaces—what its participants describe as the “New Public”.

Not to be outdone, we at EFF have long used the internal term “the public interest internet.” While these names don’t quite point to exactly the same phenomenon, they all capture some aspect of the original promise of the internet. Over the last two decades, that promise largely disappeared from wider consideration. By fading from view, it has grown underappreciated, underfunded, and largely undefended. Whatever you might call it, we see our mission as not just acting as the public interest internet’s legal counsel when it is under threat, but also championing it when it goes unrecognized.

This blog series, we hope, will serve as a guided tour of some of the less visible parts of the modern public interest internet. None of the stories here (the organizations, collectives, and ongoing projects) have grabbed the attention of the media or congressional committees, at least not as effectively as Big Tech and its moguls. Nonetheless, they remain just as vital a part of the digital space. They not only better represent the spirit and vision of the early internet; they underlie much of its continuing success: a renewable resource that tech monopolies and individual users alike continue to draw from.

When Big Tech is long gone, a better future will come from the seed of this public interest internet: seeds that are being planted now, and which need everyone to nurture them until they’re strong enough to sustain our future in a more open and free society. 

But before we look into the future, let’s take a look at the past, to a time when the internet was made from nothing but the public—and because of that, governments and corporations declared that it could never prosper.

This is the introduction to our blog series on the public interest internet. Read more in the series: 

Danny O'Brien

Surveillance Self-Defense Playlist: Getting to Know Your Phone


We are launching a new Privacy Breakdown of Mobile Phones "playlist" on Surveillance Self-Defense, EFF's online guide to defending yourself and your friends from surveillance by using secure technology and developing careful practices. This guided tour walks through the ways your phone communicates with the world, how your phone is tracked, and how that tracking data can be analyzed. We hope to reach everyone from those who may have a smartphone for the first time, to those who have had one for years and want to know more, to savvy users who are ready to level up.

The operating systems (OS) on our phones weren’t originally built with user privacy in mind or optimized fully to keep threatening services at bay. Along with the phone’s software, different hardware components have been added over time to make the average smartphone a Swiss army knife of capabilities, many of which can be exploited to invade your privacy and threaten your digital security. This new resource attempts to map out the hardware and software components, the relationships between the two, and what threats they can create. These threats can come from individual malicious hackers or organized groups all the way up to government level professionals. This guide will help users understand a wide range of topics relevant to mobile privacy, including: 

  • Location Tracking: Encompassing more than just GPS, your phone can be tracked through cellular data and WiFi as well. Find out the various ways your phone identifies your location.
  • Spying on Mobile Communications: The systems our phone calls were built on were based on a model that didn’t prioritize hiding information. That means targeted surveillance is a risk.
  • Phone Components and Sensors: Today’s modern phone can contain over four kinds of radio transmitters/receivers, including WiFi, Bluetooth, Cellular, and GPS.
  • Malware: Malicious software, or malware, can alter your phone in ways that make spying on you much easier.
  • Pros and Cons of Turning Your Phone Off: Turning your phone off can provide a simple solution to surveillance in certain cases, but can also be correlated with where it was turned off.
  • Burner Phones: Sometimes portrayed as a tool of criminals, burner phones are also often used by activists and journalists. Know the do's and don’ts of having a “burner.”
  • Phone Analysis and Seized Phones: When your phone is seized and analyzed by law enforcement, certain patterns and analysis techniques are commonly used to draw conclusions about you and your phone use.

This isn’t meant to be a comprehensive breakdown of CPU architecture in phones, but rather of the capabilities that affect your privacy more frequently, whether that is making a phone call, texting, or using navigation to get to a destination you have never been to before. We hope to give the reader a bird’s-eye view of how that rectangle in your hand works, take away the mystery behind specific privacy and security threats, and empower you with information you can use to protect yourself.

EFF is grateful for the support of the National Democratic Institute in providing funding for this security playlist. NDI is a private, nonprofit, nongovernmental organization focused on supporting democracy and human rights around the world. Learn more by visiting https://NDI.org.

Alexis Hancock

Foreign Intelligence Surveillance Court Rubber Stamps Mass Surveillance Under Section 702 - Again


As someone once said, “the Founders did not fight a revolution to gain the right to government agency protocols.” Well, it was not just someone: it was Chief Justice John Roberts. He flatly rejected the government’s claim that agency protocols could solve the Fourth Amendment violations created by police searches of our communications stored in the cloud and accessible through our phones.

Apparently, the Foreign Intelligence Surveillance Court (FISC) didn’t get the memo. That’s because, in a recently declassified decision from November 2020, the FISC again found that a series of overly complex but ultimately Swiss-cheese agency protocols -- which are admittedly not even being followed -- resolve the Fourth Amendment problems caused by the massive governmental seizures and searches of our communications currently occurring under FISA Section 702. The annual review by the FISC is required by law -- it’s supposed to ensure that both the policies and the practices of the mass surveillance under Section 702 are sufficient. It failed on both counts.

The protocols themselves are inherently problematic. The law only requires that intelligence officials “reasonably believe” the “target” of an investigation to be a foreigner abroad -- it is immaterial to the initial collection that there is an American, with full constitutional rights, on the other side of a communication.

Justice Roberts was concerned with a single phone seized pursuant to a lawful arrest. The FISC is apparently unconcerned when it rubber-stamps mass surveillance impacting, by the government’s own admission, hundreds of thousands of nonsuspect Americans.

What’s going on here?  

From where we sit, it seems clear that the FISC continues to suffer from a massive case of national security constitutional-itis. That is the affliction (not really, we made it up) where ordinarily careful judges sworn to defend the Constitution effectively ignore the flagrant Fourth Amendment violations that occur when the NSA, FBI (and to a lesser extent, the CIA and NCTC) misuse the justification of national security to spy on Americans en masse. And this malady means that even when the agencies completely fail to follow the court's previous orders, they still get a pass to keep spying.

The FISC decision is disappointing on at least two levels. First, the protocols themselves are not sufficient to protect Americans’ privacy. They allow the government to tap into the Internet backbone and seize our international (and lots of domestic) communications as they flow by -- ostensibly to see if they have been targeted. This is itself a constitutional violation, as we have long argued in our Jewel v. NSA case. We await the Ninth Circuit’s decision in Jewel on the government’s claim that this spying that everyone knows about is too secret to be submitted for real constitutional review by a public adversarial court (as opposed to the one-sided review by the rubber-stamping FISC).  

But even after that, the protocols themselves are swiss cheese when it comes to protecting Americans. At the outset, unlike traditional foreign intelligence surveillance, under Section 702, FISC judges do not authorize individualized warrants for specific targets. Rather, the role of a FISC judge under Section 702 is to approve abstract protocols that govern the Executive Branch’s mass surveillance and then review whether they have been followed.  

The protocols themselves are inherently problematic. The law only requires that intelligence officials “reasonably believe” the “target” of an investigation to be a foreigner abroad -- it is immaterial to the initial collection that there is an American, with full constitutional rights, on the other side of a conversation whose communications are both seized and searched without a warrant. It is also immaterial that the individuals targeted turn out to be U.S. persons. This was one of the many problems that ultimately ended with the decommissioning of the Call Detail Records program, which, despite being Congress’s attempt to rein in the mass surveillance that began under Section 215 of the Patriot Act, still swept up communications metadata in bulk, including illegally and inadvertently collecting millions of call detail records of American persons.

Next, the protocols allow collection for any “foreign intelligence” purpose, which is a much broader scope than merely searching for terrorists. The term encompasses information that, for instance, could give the U.S. an advantage in trade negotiations. Once these communications are collected, the protocols allow the FBI to use the information for domestic criminal prosecutions if related to national security. This is what Senator Wyden and others in Congress have rightly pointed out is a “backdoor” warrantless search. And those are just a few of the problems.

While the protocols are complex and confusing, the end result is that nearly all Americans have their international communications seized initially and a huge number of them are seized and searched by the FBI, NSA, CIA and NCTC, often multiple times for various reasons, all without individual suspicion, much less a warrant.

Second, the government agencies -- especially the FBI -- apparently cannot be bothered to follow even these weak protocols. This means that in practice, we users don’t even get that minimal protection. The FISC decision reports that the FBI has never limited its searches to just those related to national security. Instead, agents query the 702 system for investigations relating to health care fraud, transnational organized crime, violent gangs, domestic terrorism, public corruption, and bribery. And that’s in just the 7 FBI field offices reviewed. This is not a new problem, as the FISC notes, although the court once again seems to think that the FBI just needs to be told to comply and given proper training (which it has failed to provide for years). The court notes that it is likely that other field offices also did searches for ordinary crimes, but because the FBI also failed to do proper oversight, we just don’t know how widespread the practice is.

A federal court would accept no such tomfoolery. ... Yet the FISC is perfectly willing to sign off on the FBI’s failures and the Bureau’s flagrant disregard of its own rulings for year upon year.

Next, the querying system for this sensitive information had been designed to make it hard not to search the 702-collected data, including by requiring agents to opt out (not in) to searching the 702 data and then timing out that opt-out after only thirty minutes. And even then, the agents could just toggle “yes” to search 702 collected data, with no secondary checking prior to those searches. This happened multiple times (that we know of) to allow for searches without any national security justification. The FBI also continued to improperly conduct bulk searches, which are large batch queries using multiple search terms without written justifications as required by the protocols. Even the FISC calls these searches “indiscriminate,” yet it reauthorized the program.  

In her excellent analysis of the decision, Marcy Wheeler lists out the agency excuses that the Court accepted:

  • It took time for them to make the changes in their systems
  • It took time to train everyone
  • Once everyone got trained they all got sent home for COVID 
  • Given mandatory training, personnel “should be aware” of the requirements, even if actual practice demonstrates they’re not
  • FBI doesn’t do that many field reviews
  • Evidence of violations is not sufficient evidence to find that the program inadequately protects privacy
  • The opt-out system for FISA material — which is very similar to one governing the phone and Internet dragnet at NSA until 2011 that also failed to do its job — failed to do its job
  • The FBI has always provided national security justifications for a series of violations involving their tracking system where an Agent didn’t originally claim one
  • Bulk queries have operated like that since November 2019
  • He’s concerned but will require more reporting

And the dog also ate their homework. While more reporting sounds nice, that’s the same thing the court ordered last time, and the time before that. Reporting of problems should lead to something actually being done to stop them.

At this point, it’s just embarrassing. A federal court would accept no such tomfoolery from an impoverished criminal defendant facing years in prison. Yet the FISC is perfectly willing to sign off on the FBI’s and NSA’s failures and the agencies' flagrant disregard of its rulings year after year. Not all FISC decisions are disappointing: in 2017, we were heartened that another FISC judge, fed up, issued requirements that led to the end of the “about” searching of collected upstream data and even its partial destruction. And the extra reporting requirements do give us at least a glimpse, which we wouldn’t otherwise have, into how bad things are.

But this time the FISC has let us all down again. It’s time for the judiciary, whether sitting on the FISC or not, to inoculate itself against the habit of throwing out the Fourth Amendment whenever the Executive Branch invokes national security, particularly when the constitutional violations are so flagrant, long-standing, and pervasive. The judiciary needs to recognize mass spying as unconstitutional and stop what remains of it. Americans deserve better than this charade of oversight.




Related Cases: Jewel v. NSA
Cindy Cohn

The Florida Deplatforming Law is Unconstitutional. Always has Been.

6 days 5 hours ago

Last week, the Florida Legislature passed a bill prohibiting social media platforms from “knowingly deplatforming” a candidate (the Transparency in Technology Act, SB 7072), on pain of a fine of up to $250k per day, unless, I kid you not, the platform owns a sufficiently large theme park. 

Governor DeSantis is expected to sign it into law, having called for laws like this; he cited social media platforms’ de-platforming of Donald Trump as an example of the political bias of what he called “oligarchs in Silicon Valley.” The law is not just about candidates: it also bans “shadow-banning” and cancels cancel culture by prohibiting the censoring of “journalistic enterprises,” with “censorship” including things like posting “an addendum” to content, i.e., fact checks.

This law, like similar previous efforts, is mostly performative, as it will almost certainly be found unconstitutional. Indeed, the parallels with a nearly 50-year-old compelled speech precedent are uncanny. In 1974, in Miami Herald Publishing Co. v. Tornillo, the Supreme Court struck down another Florida statute that attempted to compel the publication of candidate speech.

50 Years Ago, Florida's Similar "Right of Reply" Law Was Found Unconstitutional

At the time, Florida had a dusty "right of reply" law on the books, little used, that gave candidates the right to demand that any newspaper that criticized them print the candidate's reply to the newspaper's charges, at no cost. The Miami Herald criticized Florida House candidate Pat Tornillo and refused to carry Tornillo’s reply. Tornillo sued.

Tornillo lost at the trial court, but found some solace on appeal to the Florida Supreme Court.  The Florida high court held that the law was constitutional, writing that the “statute enhances rather than abridges freedom of speech and press protected by the First Amendment,” much like the proponents of today’s new law argue. 

So off the case went to the US Supreme Court. Proponents of the right of reply raised the same arguments used today—that government action was needed to ensure fairness and accuracy, because “the 'marketplace of ideas' is today a monopoly controlled by the owners of the market.”  

Like today, the proponents argued new technology changed everything. As the Court acknowledged in 1974, “[i]n the past half century a communications revolution has seen the introduction of radio and television into our lives, the promise of a global community through the use of communications satellites, and the specter of a ‘wired’ nation by means of an expanding cable television network with two-way capabilities.”  Today, you might say that a wired nation with two-way communications had arrived in the global community, but you can’t say the Court didn’t consider this concern.

You might wonder why the Florida Legislature would pass a law doomed to failure. Politics, of course.

The Court also accepted that the consolidation of major media had produced “the dominant features of a press that has become noncompetitive and enormously powerful and influential in its capacity to manipulate popular opinion and change the course of events,” and acknowledged the development of what the Court called “advocacy journalism” -- arguments eerily similar to those raised today.

Paraphrasing the arguments made in favor of the law, the Court wrote “The abuses of bias and manipulative reportage are, likewise, said to be the result of the vast accumulations of unreviewable power in the modern media empires. In effect, it is claimed, the public has lost any ability to respond or to contribute in a meaningful way to the debate on issues,” just like today’s proponents of the Transparency in Technology Act.

The Court was not swayed, not because this was dismissed as an issue, but because government coercion could not be the answer. “However much validity may be found in these arguments, at each point the implementation of a remedy such as an enforceable right of access necessarily calls for some mechanism, either governmental or consensual. If it is governmental coercion, this at once brings about a confrontation with the express provisions of the First Amendment.” There is much to dislike about content moderation practices, but giving the government more control is not the answer.

Even if one should decry the lack of responsibility of the media, the Court recognized “press responsibility is not mandated by the Constitution and like many other virtues it cannot be legislated.”  Accordingly, Miami Herald v. Tornillo reversed the Florida Supreme Court, and held the Florida statute compelling publication of candidates' replies unconstitutional.

Since Tornillo, courts have consistently applied it as binding precedent, including applying Tornillo to social media and internet search engines, the very targets of the Transparency in Technology Act (unless they own a theme park). Indeed, the compelled speech doctrine has even been used to strike down other attempts to counter perceived censorship of conservative speakers.1 

Given the strong parallels with Tornillo, you might wonder why the Florida Legislature would pass a law doomed to failure, costing the state the time and expense of defending it in court. Politics, of course. The legislators who passed this bill probably knew it was unconstitutional, but may have seen political value in passing the base-pleasing statute and blaming the courts when it gets struck down.

Politics is also the reason for the much-ridiculed exception for theme park owners, and it’s actually a problem for the law itself. As the Supreme Court explained in Florida Star v. B.J.F., carve-outs like this make a statute even more susceptible to a First Amendment challenge as under-inclusive. Theme parks are big business in Florida, and the law’s definition of social media platform would otherwise fit Comcast (which owns Universal Studios’ theme parks), Disney, and even Legoland. Performative legislation is less politically useful if it attacks a key employer and economic driver of your state. The theme park exception has also raised all sorts of amusing possibilities: a big internet company could address this law by simply purchasing a theme park, which could easily be less expensive than compliance, even with the required minimum of 25 acres and 1 million visitors a year. Much as Section 230 Land would be high on my own must-visit list, striking the law down is the better solution.

The Control that Large Internet Companies Have on our Public Conversations Is An Important Policy Issue

The law is bad, and the legislature should feel bad for passing it, but this does not mean that the control that the large internet companies have over our public conversations isn’t an important policy issue. As we have explained to courts considering the broader issue, if a candidate for office is suspended or banned from social media during an election, the public needs to know why, and the candidate needs a process to appeal the decision. And this is not just for politicians; more often it is marginalized communities that bear the brunt of bad content moderation decisions. It is critical that the social platform companies provide transparency, accountability, and meaningful due process to all impacted speakers, in the US and around the globe, and ensure that the enforcement of their content guidelines is fair, unbiased, proportional, and respectful of all users’ rights.

This is why EFF and a wide range of non-profit organizations in the internet space worked together to develop the Santa Clara Principles, which call upon social media to (1) publish the numbers of posts removed and accounts permanently or temporarily suspended due to violations of their content guidelines; (2) provide notice to each user whose content is taken down or account is suspended about the reason for the removal or suspension; and (3) provide a meaningful opportunity for timely appeal of any content removal or account suspension. 
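As a rough illustration of how the three principles fit together (a sketch of ours; the class and field names are hypothetical, not drawn from the Principles' text), each moderation action would generate a per-user notice, and the aggregate of those records is exactly what a transparency report publishes:

```python
from dataclasses import dataclass


@dataclass
class UserNotice:
    """Principle 2: notice to each affected user, with the reason."""
    user_id: str
    action: str               # e.g. "post_removed" or "account_suspended"
    guideline_violated: str   # the specific content guideline relied on
    appeal_available: bool = True  # Principle 3: a meaningful, timely appeal


def transparency_report(notices: list[UserNotice]) -> dict[str, int]:
    """Principle 1: publish aggregate enforcement numbers."""
    report = {"posts_removed": 0, "accounts_suspended": 0}
    for n in notices:
        if n.action == "post_removed":
            report["posts_removed"] += 1
        elif n.action == "account_suspended":
            report["accounts_suspended"] += 1
    return report


notices = [
    UserNotice("u1", "post_removed", "spam"),
    UserNotice("u2", "account_suspended", "harassment"),
    UserNotice("u3", "post_removed", "spam"),
]
assert transparency_report(notices) == {"posts_removed": 2, "accounts_suspended": 1}
```

The point of the structure is that the per-user notice (Principle 2) and the appeal flag (Principle 3) are recorded at the moment of enforcement, so the aggregate report (Principle 1) falls out for free rather than being reconstructed after the fact.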

  • 1. Provisions like the Transparency in Technology Act’s ban on addendums to posts (such as fact checks or links to authoritative sources) are not covered by the compelled speech doctrine, but rather fail as prior restraints on speech. We need not spend much time on that, as the Supreme Court has roundly rejected prior restraints.
Kurt Opsahl

Facebook Oversight Board Affirms Trump Suspension -- For Now

6 days 11 hours ago

Today’s decision from the Facebook Oversight Board regarding the suspension of President Trump’s account — to extend the suspension for six months and require Facebook to reevaluate in light of the platform’s stated policies — may be frustrating to those who had hoped for a definitive ruling. But it is also a careful and needed indictment of Facebook’s opaque and inconsistent moderation approach that offers several recommendations to help Facebook do better, focused especially on consistency and transparency. Consistency and transparency should be the hallmarks of all content decisions. Too often, neither hallmark is met. Perhaps most importantly, the Board affirms that it cannot and should not allow Facebook to avoid its responsibilities to its users.  We agree.

The decision is long, detailed, and worth careful review. In the meantime, here’s our top-level breakdown:

Today’s decision affirms, once again, that no amount of “oversight” can fix the underlying problem.

First, while the Oversight Board rightly refused to make special rules for politicians, rules we have previously opposed, it did endorse special rules and procedures for “influential users” and newsworthy posts. These rules recognize that some users can cause greater harm than others. On a practical level, every decision to remove a post or suspend an account is highly contextual and often requires highly specific cultural competency. But we agree that special rules for influential users or highly newsworthy content require even greater transparency and the investment of substantial resources.

Specifically, the Oversight Board explains that Facebook needs to document all of these special decisions well; clearly explain how any newsworthiness allowance applies to influential accounts; and clearly explain how it cross-checks such decisions, including its rationale, standards, processes of review, and the criteria for determining which pages to include. Facebook should also report error rates and the thematic consistency of these determinations as compared with its ordinary enforcement procedures.

More broadly, the Oversight Board also correctly notes that Facebook's penalty system is unclear and that it must better explain its strikes and penalties process, and inform users of strikes and penalties levied against them.

We wholeheartedly agree, as the Oversight Board emphasized, that “restrictions on speech are often imposed by or at the behest of powerful state actors against dissenting voices and members of political oppositions” and that  “Facebook must resist pressure from governments to silence their political opposition.” The Oversight Board urged Facebook to treat such requests with special care. We would have also required that all such requests be publicly reported.

The Oversight Board correctly also noted the need for Facebook to collect and preserve removed posts. Such posts are important for preserving the historical record as well as for human rights reporting, investigations, and accountability. 

While today’s decision reflects a notable effort to apply an international human rights framework, we continue to be concerned that an Oversight Board that is US-focused in its composition is not best positioned to help Facebook do better. But the Oversight Board did recognize the international dimension of the issues it confronts, and endorsed the Rabat Plan of Action, from the United Nations Office of the High Commissioner for Human Rights, as a framework for assessing the removal of posts that may incite hostility or violence. It specifically did not apply the First Amendment, even though the events leading to the decision were focused in the US.

Overall, these are good recommendations and we will be watching to see if Facebook takes them seriously. And we appreciate the Oversight Board’s refusal to make Facebook’s tough decisions for it. If anything, though, today’s decision affirms, once again, that no amount of “oversight” can fix the underlying problem: Content moderation is extremely difficult to get right, particularly at Facebook scale.

Corynne McSherry

Proposed New Internet Law in Mauritius Raises Serious Human Rights Concerns

1 week 4 days ago

As debate continues in the U.S. and Europe over how to regulate social media, a number of countries—such as India and Turkey—have imposed stringent rules that threaten free speech, while others, such as Indonesia, are considering them. Now, a new proposal to amend Mauritius’ Information and Communications Technologies Act (ICTA) with provisions to install a proxy server to intercept otherwise secure communications raises serious concerns about freedom of expression in the country.

Mauritius, a democratic parliamentary republic with a population just over 1.2 million, has an Internet penetration rate of roughly 68% and a high rate of social media use. The country’s Constitution guarantees the right to freedom of expression but, in recent years, advocates have observed a backslide in online freedoms.

In 2018, the government amended the ICTA, imposing heavy sentences—as high as ten years in prison—for online messages that “inconvenience” the receiver or reader. The amendment was in turn utilized to file complaints against journalists and media outlets in 2019.

In 2020, as COVID-19 hit the country, the government levied a tax on digital services operating  in the country, defined as any service supplied by “a foreign supplier over the internet or an electronic network which is reliant on the internet; or by a foreign supplier and is dependent on information technology for its supply.”

The latest proposal to amend the ICTA has raised alarm bells amongst local and international free expression advocates, as it would enable government officials who have established instances of “abuse and misuse” to block social media accounts and track down users using their IP addresses.

The amendments are reminiscent of those in India and Turkey in that they seek to regulate foreign social media, but differ in that Mauritius—a far smaller country—lacks the ability to force foreign companies to maintain a local presence. In a consultation paper on the amendments, proponents argue:

Legal provisions prove to be relatively effective only in countries where social media platforms have regional offices. Such is not the case for Mauritius. The only practical solution in the local context would be the implementation of a regulatory and operational framework which not only provides for a legal solution to the problem of harmful and illegal online content but also provides for the necessary technical enforcement measures required to handle this issue effectively in a fair, expeditious, autonomous and independent manner.

While some of the concerns raised in the paper—such as the fact that social media companies do not sufficiently moderate content in the country’s local language—are valid, the solutions proposed are disproportionate. 

A Change.org petition calling on local and international supporters to oppose the amendments notes that “Whether human … or AI, the system that will monitor, flag and remove information shared by users will necessarily suffer from conscious or unconscious bias. These biases will either be built into the algorithm itself, or will afflict those who operate the system.” 

Most concerning, however, is that authorities wish to install a local proxy server that impersonates social media networks, fooling devices and web browsers into sending secure information to the local server instead of to the social media networks, and effectively creating an archive of the social media activity of all users in Mauritius before the traffic is re-sent to the networks’ servers. The plan fails to say how long this information would be archived, or how user data would be protected from breaches.
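To make the mechanism concrete: HTTPS clients normally verify the certificate a server presents, and an interposed proxy must terminate TLS with its own certificate rather than the genuine one. The sketch below (ours, with stand-in byte strings in place of real DER-encoded certificates) shows the fingerprint comparison at the heart of certificate pinning, one way such interception becomes detectable:

```python
import hashlib


def fingerprint(der_cert: bytes) -> str:
    """SHA-256 fingerprint of a DER-encoded certificate."""
    return hashlib.sha256(der_cert).hexdigest()


# Stand-ins for illustration; in practice these would be real certificates.
genuine_cert = b"certificate issued to the real social media network"
proxy_cert = b"certificate minted by the intercepting local server"

# The pin is the fingerprint recorded earlier over a trusted connection.
pinned = fingerprint(genuine_cert)


def intercepted(presented_cert: bytes) -> bool:
    # An interposing proxy cannot hold the genuine server's private key,
    # so it must present its own certificate -- whose fingerprint will
    # not match the pin.
    return fingerprint(presented_cert) != pinned


assert intercepted(genuine_cert) is False
assert intercepted(proxy_cert) is True
```

This is also why such a proxy scheme requires users (or their devices) to be made to trust the interceptor’s certificate, defeating the security guarantees HTTPS was designed to provide.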

Local free expression advocates are calling on the ICTA authorities to “concentrate their efforts in ethically addressing concerns made by citizens on posts that already exist and which have been deemed harmful.” Supporters are encouraged to sign the Change.org petition or submit comment to the open consultation by emailing socialmediaconsultation@icta.mu before May 5, 2021.









Jillian C. York

Tell Congress: Support the Fourth Amendment Is Not For Sale Act

1 week 4 days ago

Every day, your personal information is being harvested by your smartphone applications, sold to data brokers, and used by advertisers hoping to sell you things. But what safeguards prevent the government from shopping in that same data marketplace? Mobile data that is regularly bought and sold, like your geolocation, is information that law enforcement or intelligence agencies would normally have to get a warrant to acquire. But these data brokers don’t ask for a warrant. The U.S. government has been using its purchase of this information as a loophole for acquiring personal information on individuals without a warrant. Now is the time to close that loophole.

EFF is launching a campaign in support of the Fourth Amendment Is Not For Sale Act (H.R. 2738 and S. 1265). This legislation would prevent the government from purchasing information it would otherwise need a warrant to acquire. Tell your senators and representatives that this bill must be passed!

TAKE ACTION

TELL CONGRESS: THE FOURTH AMENDMENT IS NOT FOR SALE

We first wrote about the need for legislation like this in December 2020, after a troubling article in Motherboard. It reported that a Muslim prayer app (Muslim Pro), a Muslim dating app (Muslim Mingle), and many other popular apps had been selling geolocation data about their users to a company called X-Mode, which in turn provided this data to the U.S. military through defense contractors.

This violates the First and Fourth Amendments. Just because your phone apps know where you are does not mean the government should, too. The invasive marketplace for your data needs to be tamed by privacy legislation, not used by the government as an end-run around the warrant requirement. The Supreme Court has decided that our detailed location data is so revealing about our activities and associations that law enforcement must get a warrant in order to acquire it.

Government purchase of location data also threatens to chill people’s willingness to participate in protests in public places, associate with who they want, or practice their religion. History and legal precedent teach us that when the government indiscriminately collects records of First Amendment activities, it can lead to retaliation or further surveillance.

TAKE ACTION

TELL CONGRESS: THE FOURTH AMENDMENT IS NOT FOR SALE

You can read the full text of the bill below:

Matthew Guariglia

Brazil's Bill Repealing National Security Law Has its Own Threats to Free Expression

1 week 4 days ago

The Brazilian Chamber of Deputies is on track to approve a law that threatens freedom of expression and the rights to assemble and protest, all in the stated name of defending the democratic constitutional state. Bill 6764/02 repeals the Brazilian National Security Law (Lei de Segurança Nacional), one of the ominous legacies of the country’s dictatorship, which lasted until 1985. Although there is broad consensus on the harm the National Security Law represents, Brazilian civil groups have stressed that replacing it with a new act, without careful discussion of its grounds, principles, and specific rules, risks rebuilding a framework that serves repressive rather than democratic ends.

The Brazilian National Security Law has a track record of abuses in persecuting and silencing dissent, with vague criminal offenses and provisions targeting speech. After a relatively dormant period, it gained new prominence during President Bolsonaro’s administration. It has served as a legal basis for accusations against opposition leaders, critics, journalists, and even a congressman aligned to Bolsonaro in the country’s current turbulent political landscape.

However, its proposed replacement, Bill 6764/02, raises various concerns, some particularly unsettling for digital rights. Even with alternative drafts trying to untangle them, problems remain.

First, the espionage offense in the bill defines the handover of secret documents to foreign governments as a crime. It's crucial that this and related offenses not reach acts that would raise serious human rights concerns: whistleblowers revealing facts or acts that could imply the violation of human rights, crimes committed by government officials, or other serious wrongdoing affecting public administration; or journalistic and investigative reporting and the work of civil groups and activists that bring to light governments’ unlawful practices and abuses. These acts should be clearly exempted from the offense. Amendments under discussion seek to address these concerns, but there’s no assurance they will prevail in the final text if this new law is approved.

The IACHR’s Freedom of Expression Rapporteur has highlighted how often governments in Latin America classify information on national security grounds without proper assessment and substantiation. The report provides a number of examples from the region of the hurdles this creates for accessing information related to human rights violations and government surveillance. The IACHR Rapporteur stresses the key role of investigative journalists, the protection of their sources, and the need to grant legal protection against reprisal to whistleblowers who expose human rights violations and other wrongdoing. This aligns with the UN Freedom of Expression Rapporteur’s previous recommendations and reinforces the close relationship between democracy and strong safeguards for those who take a stand by unveiling sensitive public interest information. As the UN High Commissioner for Human Rights has already pointed out:

The right to privacy, the right to access to information and freedom of expression are closely linked. The public has the democratic right to take part in the public affairs and this right cannot be effectively exercised by solely relying on authorized information.

Second, the proposal also aims to tackle "fake news" by making “mass misleading communication” a crime against democratic institutions. Although the bill should be strictly tailored to counter exceptionally serious threats, bringing disinformation into its scope potentially targets millions of internet users instead. Disseminating “facts the person knows [to be] untrue” that could put at risk “the health of the electoral process” or “the free exercise of constitutional powers,” using "means not provided by the private messaging application," could lead to up to five years in jail.

We agree with the digital rights groups on the ground that have stressed the provision’s harmful implications for users’ freedom of expression. Criminalizing the spread of disinformation is full of traps: it punishes speech by relying on vague terms (as in this bill) easily twisted to stifle critical voices and those challenging entrenched political power. Repeatedly, joint declarations of the Freedom of Expression Rapporteurs have urged states not to take that road.

Moreover, the provision applies when such messages were distributed using "means not provided by the application." Presuming that the use of such means is inherently malicious poses a major threat to interoperability. The technical ability to plug one product or service into another product or service, even when one service provider hasn’t authorized that use, has been a key driver to competition and innovation. And dominant companies repeatedly abuse legal protections to ward off and try to punish competitors. 

This is not to say we do not care about the malicious spread of disinformation at scale. But given the bill’s specific scope, disinformation should not be part of it, nor should it be addressed without careful attention to unintended consequences. There’s an ongoing debate, and there are other avenues to pursue that align with fundamental rights and rely on joint efforts from the public and private sectors.

Political pressure has hastened the bill's vote: Bill 6764/02 may pass in the Chamber of Deputies within days, with the Senate's approval still pending. We join the call of civil and digital rights groups warning that a rushed approach actually creates greater risks for what the bill is supposed to protect. These and other troubling provisions put freedom of expression on the spot and could also spur government surveillance and repression. These are the risks the defense of democracy should fend off, not reiterate.

Veridiana Alimonti

EFF at 30: Protecting Free Speech, with Senator Ron Wyden

1 week 5 days ago

To commemorate the Electronic Frontier Foundation’s 30th anniversary, we present EFF30 Fireside Chats. This limited series of livestreamed conversations looks back at some of the biggest issues in internet history and their effects on the modern web.

To celebrate 30 years of defending online freedom, EFF was proud to welcome Senator Ron Wyden as our second special guest in EFF’s yearlong Fireside Chat series. Senator Wyden is a longtime supporter of digital rights, and as co-author of Section 230, one of the key pieces of legislation protecting speech online, he’s a well-recognized champion of free speech. EFF’s Legal Director, Dr. Corynne McSherry, spoke with the senator about the fight to protect free expression and how Section 230, despite recent attacks, is still the “single best law for small businesses and single best law for free speech.” He also answered questions from the audience about some of the hot topics that have swirled around the legislation for the last few years. 

You can watch the full conversation here or read the transcript.

On May 5, we’ll be holding our third EFF30 Fireside Chat, on surveillance, with special guest Edward Snowden. He will be joined by EFF Executive Director Cindy Cohn, EFF Director of Engineering for Certbot Alexis Hancock, and EFF Policy Analyst Matthew Guariglia as they weigh in on surveillance in modern culture, activism, and the future of privacy. 

RSVP NOW

Section 230 and Social Movements

Senator Wyden began the fireside chat with a reminder that some of the most important, and often divisive, social issues of the last few years, from #BlackLivesMatter to the #MeToo movement, would likely be censored much more heavily on platforms without Section 230. That’s because the law gives platforms both the power to moderate as they see fit, and partial immunity from liability for what’s posted on those sites, making the speech the legal responsibility of the original speaker.

Section 230...has always been for the person who doesn't have deep pockets

The First Amendment protects most speech online, but without Section 230, many platforms would be unable to host much of this important, but controversial speech because they would be stuck in litigation far more often. Section 230 has been essential for those who “don’t own their own TV stations” and others “without deep pockets” for getting their messages online, Wyden explained. 

Video: https://www.youtube.com/embed/ELSJofIhnRM (privacy info: this embed will serve content from youtube.com)

Wyden also discussed the history of Section 230, which was passed in 1996. “[Senator Chris Cox] and I wanted to make sure that innovators and creators and people who had promising ideas and wanted to know how they were going to get them out - we wanted to make sure that this new concept known as the internet could facilitate that.”

Video: https://www.youtube.com/embed/F916aJbM96Q (privacy info: this embed will serve content from youtube.com)

Misconceptions Around Section 230

Wyden took aim at several of the misconceptions around 230, like the fact that the law is a benefit only for Big Tech. “One of the things that makes me angry...the one [idea] that really infuriates me, is that Section 230 is some kind of windfall for Big Tech. The fact of the matter is Big Tech’s got so much money that they can buy themselves out of any kind of legal scrape. We sure learned that when the first bill to start unraveling Section 230 passed, called SESTA/FOSTA.”

We need that fact-finding so that we make smart technology policy

Video: https://www.youtube.com/embed/twOpQY2htzs (privacy info: this embed will serve content from youtube.com)

Another common misunderstanding around the law is that it mandates platforms to be “neutral.” This couldn’t be further from the truth, Wyden explained: “There’s not a single word in Section 230 that requires neutrality….The point was essentially to let ‘lots of flowers bloom.’ If you want to have a conservative platform, more power to you...If you want to have a progressive platform, more power to you.”

[Embedded video: https://www.youtube.com/embed/EM_gj6ZqCpA (content served from youtube.com)]

How to Think About Changes to Intermediary Liability Laws

All the positive benefit that Section 230 brings to online speech doesn’t mean the law is perfect, however. But before making changes to it, Wyden suggested, “There ought to be some basic fact finding before the Congress just jumps in to making sweeping changes to speech online.” EFF Legal Director Corynne McSherry agreed wholeheartedly: “We need that fact-finding so that we make smart technology policy,” adding that we need look no further than our experience with SESTA/FOSTA and its collateral damage to prove this point.


There are other ways to improve the online ecosystem as well. Asked for his thoughts on better ways to address problems, Senator Wyden was blunt: “The first thing we ought to do is tackle the incredible abuses in the privacy area. Every other week in this country Americans learn about what amounts to yet another privacy disaster.”

[Embedded video: https://www.youtube.com/embed/hDT4J224EB4 (content served from youtube.com)]

Another area where we can improve the online ecosystem is in data sales and collection. Wyden recently introduced a bill, “The Fourth Amendment is Not For Sale,” that would help rein in the problem of apps and commercial data brokers selling things like user location data.

[Embedded video: https://www.youtube.com/embed/usMYK5rKCpA (content served from youtube.com)]

To wrap up the discussion, Senator Wyden took some questions about potential changes to Section 230. He lambasted SESTA/FOSTA, which EFF is challenging in court on behalf of two human rights organizations, a digital library, an activist for sex workers, and a certified massage therapist, as an example of a poorly guided amendment. 

[Embedded video: https://www.youtube.com/embed/cl48SEXjliI (content served from youtube.com)]

Senator Wyden suggested that every time a proposal to amend the law comes up, it should be measured against a rubric of questions about how the change would work and what impact it would have on users. (EFF has its own rubric for laws that would affect intermediary liability for just these purposes.)

[Embedded video: https://www.youtube.com/embed/AWJ6o6jOKgA (content served from youtube.com)]

We thank Senator Wyden for joining us to discuss free speech, Section 230, and the battle for digital rights. Please join us in the continuation of this fireside chat series on May 5 as we discuss surveillance with whistleblower Edward Snowden.

Jason Kelley

Apple’s AppTrackingTransparency is Upending Mobile Phone Tracking

2 weeks ago

Apple’s long-awaited privacy update for iOS is out, and it’s a solid step in the right direction. With the launch of iOS 14.5, hundreds of millions of iPhone users will now interact with Apple’s new AppTrackingTransparency feature. Allowing users to choose what third-party tracking they will or will not tolerate, and forcing apps to request those permissions, gives users more knowledge of what apps are doing, helps protect users from abuse, and allows them to make the best decisions for themselves.

In short, AppTrackingTransparency (or ATT) means that apps are now required to ask your permission if they want to track you and your activity across other apps. The kind of consent interface that ATT offers is not new; it’s similar to the interfaces for other permissions that mobile users are already accustomed to (e.g., when an app requests access to your microphone, camera, or location). It’s normal for apps to be required to request the user’s permission for access to specific device functions or data, and third-party tracking should be no different. You can set your ATT preferences app by app, or globally for all apps.

Much of ATT revolves around your iPhone’s IDFA, or “ID for advertisers.” This 16-byte string of numbers and letters is like a license plate for your iPhone. (Google has the same kind of identifier for Android, called the Android Ad ID; these identifiers are referred to collectively as “ad IDs”). Previously, you could opt out of IDFA’s always-on surveillance deep in the settings of your iPhone; now, ATT means that IDFA settings are more visible, opt-in, and per app. 

The main feature of ATT is the technical control on IDFA, but the framework will regulate other kinds of tracking, too: if an app does not have your permission to “track” you, it is also not allowed to use identifiers like your phone number, for example, to do so. Presumably, this policy-level feature will depend on Apple’s app store review process to be effective.

Ad IDs are often compared to cookies, their tracker-enabling partner on the Web. But there’s a key difference: cookies were designed for, and continue to support, a wide range of user-friendly features. Cookies are the reason you don’t have to log in every time you visit a website, and why your shopping cart doesn’t empty if you leave a website in the middle of a visit. 

Ad IDs, on the other hand, were designed for one purpose and one purpose only: to let third parties track you. Ad IDs were created so that advertisers could access global, persistent identifiers for users without using the IMEI number or MAC address baked into phone hardware, with absolutely no pretense of user-friendliness or “shopping cart” use-case. Simply put: this feature on your phone has never worked in your favor. That’s why we applaud Apple’s efforts to give users more visible and granular choices to turn it off, and in particular ATT’s new requirement that app developers must ask for explicit permission to engage in this kind of tracking.
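To make concrete why a global, persistent ad ID is so tracker-friendly, here is a minimal sketch (with invented app names, events, and broker logic, not any real company's code) of how records from two unrelated apps can be joined into one profile simply because both are keyed by the same device identifier:

```python
# Hypothetical illustration: two unrelated apps report events tagged with the
# device's shared ad ID; a data broker can join them into a single profile.
from collections import defaultdict

weather_app_events = [
    {"ad_id": "EA7583CD-A667-48BC-B806-42ECB2B48606",
     "event": "location", "value": "Portland, OR"},
]
game_app_events = [
    {"ad_id": "EA7583CD-A667-48BC-B806-42ECB2B48606",
     "event": "purchase", "value": "$4.99 coin pack"},
]

def broker_join(*event_streams):
    """Merge per-app event streams into per-device profiles keyed by ad ID."""
    profiles = defaultdict(list)
    for stream in event_streams:
        for ev in stream:
            profiles[ev["ad_id"]].append((ev["event"], ev["value"]))
    return dict(profiles)

profiles = broker_join(weather_app_events, game_app_events)
# One ad ID now links location data from one app to spending data from another.
```

Removing the shared key breaks the join: under ATT, an app that is denied tracking permission receives a zeroed-out IDFA, so its records can no longer be matched against other apps' data this way.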

ATT is only a first step, and it has its weaknesses. It doesn’t do anything about “first-party” tracking, or an app tracking your behavior on that app itself. ATT might also be prone to “notification fatigue” if users become so accustomed to seeing it that they just click through it without considering the choice.

And, just like any other tracker-blocking initiative, ATT may set off a new round in the cat-and-mouse game between trackers and those who wish to limit them: if advertisers and data brokers see the writing on the wall that IDFA and other individual identifiers are no longer useful for tracking iPhone users, they may go back to the drawing board and find sneakier, harder-to-block tracking methods. ATT is unlikely to wipe out nonconsensual tracking in one fell swoop. But moving from a world in which tracking-by-default was sanctioned and enabled by Apple, to one where trackers must actively defy the tech giant, is a big step forward.

Apple is already pushing against the tide by proposing even this modest reform. Its decision to give users a choice to not be tracked has triggered a wave of melodramatic indignation from the tracking industry. In unraveling a tracking knot of its own creation, Apple has picked a fight with some of the most powerful companies and governments in the world.

Looking ahead, the mobile operating system market is essentially a duopoly, and Google controls the larger part of the -opoly. While Apple pushes through new privacy measures like ATT, Google has left its own Ad ID alone. Of the two, Apple is undoubtedly doing more to rein in the privacy abuses of advertising technology. Nearly every criticism that can be made about the state of privacy on iOS goes double for Android. Your move, Google.

Gennie Gebhart

Here Are 458 California Law Enforcement Agencies' Policy Documents All in One Place

2 weeks 1 day ago

Dylan Kubeny, a student at the Reynolds School of Journalism at the University of Nevada, Reno, served as the primary data hunter and co-author on this project. 

At this moment in history, law enforcement agencies in the United States face a long-overdue reevaluation of their priorities, practices, and processes for holding police officers accountable for both unconscious biases and overt abuse of power. 

But any examination of law enforcement requires transparency first: the public’s ability to examine what those priorities, practices, and processes are. While police are charged with enforcing the law, they too have their own rules to follow, and too often, those rules are opaque to the public. An imbalance in access to information is an imbalance of power. 

Today, EFF in partnership with Stanford Libraries' Systemic Racism Tracker project is releasing a data set with links to 458 policy manuals from California law enforcement agencies, including most police departments and sheriff offices and some district attorney offices, school district police departments, and university public safety departments. This data set represents our first attempt to aggregate these policy documents following the passage of S.B. 978, a state law that requires local law enforcement agencies to publish this information online. 

These policy manuals cover everything from administrative duties and record keeping to the use of force and the deployment of surveillance technologies. These documents reveal police officers’ responsibilities and requirements, but they also expose shortcomings, including an overreliance on boilerplate policies generated by a private company. 

Download the data set as a CSV file, or scroll to the bottom to find a catalog of links.

Until a few years ago, many law enforcement agencies in California were reluctant to share their policy documents with the public. While a handful of agencies voluntarily chose to post these records online, the most reliable way to obtain these records was through the California Public Records Act (CPRA), which creates the legal right for everyday people to request information from the government. Most people don't know they have this power, and even fewer know how to exercise it effectively. 

To make these police records more accessible, California State Sen. Steven Bradford sponsored S.B. 978, which says all local law enforcement agencies "shall conspicuously post on their Internet Web sites all current standards, policies, practices, operating procedures, and education and training materials that would otherwise be available to the public if a request was made pursuant to the California Public Records Act.” 

The requirement became fully effective in January 2020, and now the public can visit individual websites to find links to these documents. However, despite the requirement that these records be posted "conspicuously," the links can often be challenging to find. With our new data set, the public now has access to a catalog of hundreds of currently available documents in one place. 

EFF supported SB 978's passage back in 2018 to increase government transparency through internet technology. We are currently collaborating with the Reynolds School of Journalism at the University of Nevada, Reno, to aggregate these policies. Stanford Libraries is using these records to build the Systemic Racism Tracker (SRT), a searchable database that harvests data about institutional practices that harm communities of color. The SRT's goals are to serve as a growing collection of references, documents, and data to support research and education about systemic racism. The SRT also aims to empower people to take action against harmful practices by knowing their rights and identifying, appraising, and connecting with government agencies, non-profit organizations, and grassroots groups that address racism.

"In order to understand, interrogate and work towards changing the very structures of systemic racism in policing, it is vital that we collect both current and historical policy and training manuals," said Felicia Smith, head of Stanford Libraries Learning and Outreach, who created the SRT project.

Although this data set is but the first step in a longer-term project, several elements of concern emerged in our initial analysis.

First and foremost, the most conspicuous pattern across these policies is their connection to Lexipol, a private company that sells boilerplate policies and training materials to law enforcement agencies. Over and over again, the police policies were formatted the same way, used identical language, and included a copyright mark from this company.

Lexipol has come under fire for writing policies that are too vague or permissive and for significantly differing from best practices. More often than not, rather than drafting policies tailored to their own circumstances, agencies simply copied and pasted the standard Lexipol policy. Mother Jones reported that 95% of agencies in California purchased policies or training materials from Lexipol. Our data showed that at least 379 agencies published policies from Lexipol.

This raises questions about whether police are soliciting guidance from the community or policymakers or are simply accepting the recommendations from a private company that is not accountable to the public. 
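The boilerplate pattern described above lends itself to simple automated screening. The sketch below is our own illustration (not the methodology actually used for this data set), with invented sample texts: it flags a manual as likely Lexipol-derived when the text carries the company's copyright mark.

```python
# Illustrative screening for Lexipol boilerplate; the sample manuals are invented.
def looks_like_lexipol(policy_text: str) -> bool:
    """Flag a policy manual that carries a Lexipol copyright mark."""
    text = policy_text.lower()
    return "lexipol" in text and ("copyright" in text or "©" in text)

manuals = {
    "Exampleville Police Department": "300.1 Use of Force ... Copyright Lexipol, LLC",
    "Othertown Police Department": "Chapter 1: Duties ... drafted by the city attorney",
}

flagged = sorted(name for name, text in manuals.items() if looks_like_lexipol(text))
# flagged == ["Exampleville Police Department"]
```

A real analysis would also compare section numbering and phrasing across agencies, since identical structure is as telling as the copyright line itself.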

In addition, we made the following findings: 

  • Although most agencies complied with S.B. 978 and posted at least some materials online, many agencies had still failed to take action a year after the law took effect. In those cases, we filed CPRA requests for the records and asked that they be posted on the agencies' websites. In some instances the agencies followed through, but we are still waiting on some entities, such as the Bell Police Department and the Crescent City Police Department, to upload their records. 
  • While most agencies complied with the requirement to post policies online, only a portion published training materials. In some cases, agencies only published the training session outlines and not the actual training presentations.
  • Link rot undermines transparency. As we conducted our research over just a few months, URLs for policies would change or disappear as agencies updated their policies or relaunched their websites. That is one reason we include archived links in this data set. 
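Archived links are the standard guard against exactly this kind of link rot. As a sketch of the convention (the agency and URL below are hypothetical; the URL scheme is the Wayback Machine's standard `web.archive.org/web/<timestamp>/<url>` form):

```python
def wayback_url(live_url: str, timestamp: str = "20210101") -> str:
    """Build a Wayback Machine link for the capture nearest the given timestamp."""
    return f"https://web.archive.org/web/{timestamp}/{live_url}"

# Hypothetical data-set row pairing a live policy link with its archived copy.
row = {
    "agency": "Exampleville Police Department",
    "policy_url": "https://exampleville.gov/police/sb978-policies.pdf",
}
row["archived_url"] = wayback_url(row["policy_url"])
# If the live URL later breaks, the archived copy remains reachable.
```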

In the coming months, Stanford Libraries aims to introduce a more robust tool that will allow for searching policies across departments and archiving policy changes over time. In the interim, this data set brings the public one step closer to understanding police practices and to holding law enforcement agencies accountable.

SB 978 Policy and Training Catalog 

The table below contains links to the SB 978 materials made available by local law enforcement agencies across California. There is little to no consistency across agencies in how this information is published online. Below you will find links to the primary page where a user would find SB 978 documents. In some cases, this may just be the agency's home page, which includes an SB 978 link in the sidebar. Because we have found that these links break quite often, we have also included an archived version of each link through the Internet Archive's Wayback Machine. We have also included direct links to the policies and training materials; however, in many cases these are the same as the primary page link. 

We used the California Commission on Peace Officers Standards and Training’s list of California law enforcement agencies to prioritize municipal police, sheriff’s offices, university and school district police, and district attorneys in our data collection. Future research will cover other forms of local law enforcement.

Download the data set as a CSV file.

| Primary Law Enforcement Agency Page | Archived Link | Policies | Training Materials |
| --- | --- | --- | --- |
| Alameda County District Attorney | Archived Link | Policy Docs | Not Available |
| Alameda County Sheriff's Office | Archived Link | Policy Docs | Training Docs |
| Alameda Police Department | Archived Link | Policy Docs | Training Docs, 2, 3, 4 |
| Albany Police Department | Archived Link | Policy Docs | Training Docs |
| Alhambra Police Department | Archived Link | Policy Docs | Training Docs |
| Alpine County Sheriff's Department | Archived Link | Policy Docs | Not Available |
| Alturas Police Department | Archived Link | Policy Docs | Not Available |
| Amador County Sheriff's Department | Archived Link | Policy Docs | Not Available |
| American River College Police Department | Archived Link | Policy Docs | Not Available |
| Anaheim Police Department | Archived Link | Policy Docs | Not Available |
| Anderson Police Department | Archived Link | Policy Docs | Training Docs |
| Angels Camp Police Department | Archived Link | Policy Docs | Not Available |
| Antioch Police Department | Archived Link | Policy Docs | Training Docs |
| Apple Valley Unified School District Police Department | Archived Link | Policy Docs | Not Available |
| Arcadia Police Department | Archived Link | Policy Docs | Not Available |
| Arcata Police Department | Archived Link | Policy Docs | Not Available |
| Arroyo Grande Police Department | Archived Link | Policy Docs | Training Docs |
| Arvin Police Department | Archived Link | Policy Docs | Not Available |
| Atascadero Police Department | Archived Link | Policy Docs | Not Available |
| Atherton Police Department | Archived Link | Policy Docs | Training Docs |
| Atwater Police Department | Archived Link | Policy Docs | Not Available |
| Auburn Police Department | Archived Link | Policy Docs | Not Available |
| Avenal Police Department | Archived Link | Policy Docs | Not Available |
| Azusa Police Department | Archived Link | Policy Docs | Not Available |
| Bakersfield Police Department | Archived Link | Policy Docs | Not Available |
| Banning Police Department | Archived Link | Policy Docs | Not Available |
| Barstow Police Department | Archived Link | Policy Docs | Not Available |
| Bay Area Rapid Transit Police Department | Archived Link | Policy Docs | Training Docs |
| Bear Valley Police Department | Archived Link | Policy Docs | Not Available |
| Beaumont Police Department | Archived Link | Policy Docs | Not Available |
| Bell Gardens Police Department | Archived Link | Policy Docs | Not Available |
| Belmont Police Department | Archived Link | Policy Docs | Training Docs |
| Belvedere Police Department | Archived Link | Policy Docs | Not Available |
| Benicia Police Department | Archived Link | Policy Docs | Training Docs, 2, 3 |
| Berkeley Police Department | Archived Link | Policy Docs | Training Docs |
| Beverly Hills Police Department | Archived Link | Policy Docs | Not Available |
| Blythe Police Department | Archived Link | Policy Docs | Not Available |
| Brawley Police Department | Archived Link | Policy Docs | Not Available |
| Brea Police Department | Archived Link | Policy Docs | Training Docs |
| Brentwood Police Department | Archived Link | Policy Docs | Not Available |
| Brisbane Police Department | Archived Link | Policy Docs | Not Available |
| Broadmoor Police Department | Archived Link | Policy Docs | Not Available |
| Buena Park Police Department | Archived Link | Policy Docs | Training Docs |
| Burbank Police Department | Archived Link | Policy Docs | Training Docs |
| Burlingame Police Department | Archived Link | Policy Docs | Training Docs |
| Butte County Sheriff's Department/Coroner | Archived Link | Policy Docs | Not Available |
| Cal Poly University Police | Archived Link | Policy Docs | Training Docs |
| Cal State LA Police Department | Archived Link | Policy Docs | Not Available |
| Calaveras County Sheriff's Department | Archived Link | Policy Docs | Training Docs |
| Calexico Police Department | Archived Link | Policy Docs | Not Available |
| California City Police Department | Archived Link | Policy Docs | Not Available |
| Calistoga Police Department | Archived Link | Policy Docs | Not Available |
| Campbell Police Department | Archived Link | Policy Docs | Training Docs |
| Capitola Police Department | Archived Link | Policy Docs | Not Available |
| Carlsbad Police Department | Archived Link | Policy Docs | Training Docs |
| Carmel Police Department | Archived Link | Policy Docs | Not Available |
| Cathedral City Police Department | Archived Link | Policy Docs | Not Available |
| Central Marin Police Authority | Archived Link | Policy Docs | Not Available |
| Ceres Department of Public Safety | Archived Link | Policy Docs | Not Available |
| Chaffey Community College District Police Department | Archived Link | Policy Docs | Not Available |
| Chico Police Department | Archived Link | Policy Docs | Not Available |
| Chino Police Department | Archived Link | Policy Docs | Training Docs |
| Chowchilla Police Department | Archived Link | Policy Docs | Training Docs |
| Chula Vista Police Department | Archived Link | Policy Docs | Not Available |
| Citrus Community College District Department of Campus Safety | Archived Link | Policy Docs | Not Available |
| Citrus Heights Police Department | Archived Link | Policy Docs | Not Available |
| Claremont Police Department | Archived Link | Policy Docs | Training Docs |
| Clayton Police Department | Archived Link | Policy Docs | Not Available |
| Clearlake Police Department | Archived Link | Policy Docs | Not Available |
| Cloverdale Police Department | Archived Link | Policy Docs | Training Docs |
| Clovis Police Department | Archived Link | Policy Docs | Training Docs |
| Clovis Unified School District Police Department | Archived Link | Policy Docs | Training Docs |
| Coalinga Police Department | Archived Link | Policy Docs | Training Docs |
| Coast Community College District Police Department | Archived Link | Policy Docs | Not Available |
| Colma Police Department | Archived Link | Policy Docs | Training Docs |
| Colton Police Department | Archived Link | Policy Docs | Not Available |
| Colusa County District Attorney | Archived Link | Policy Docs | Not Available |
| Colusa County Sheriff's Department | Archived Link | Policy Docs | Not Available |
| Colusa Police Department | Archived Link | Policy Docs | Not Available |
| Concord Police Department | Archived Link | Policy Docs | Training Docs |
| Contra Costa Community College District Police Department | Archived Link | Policy Docs | Not Available |
| Contra Costa County District Attorney | Archived Link | Policy Docs | Not Available |
| Contra Costa County Sheriff's Department/Coroner | Archived Link | Policy Docs | Not Available |
| Corcoran Police Department | Archived Link | Policy Docs | Not Available |
| Corona Police Department | Archived Link | Policy Docs | Not Available |
| Coronado Police Department | Archived Link | Policy Docs | Training Docs |
| Costa Mesa Police Department | Archived Link | Policy Docs | Training Docs |
| Cosumnes River College Police Department | Archived Link | Policy Docs | Not Available |
| Cotati Police Department | Archived Link | Policy Docs | Not Available |
| Covina Police Department | Archived Link | Policy Docs | Not Available |
| CPSU Pomona Department of Public Safety | Archived Link | Policy Docs | Not Available |
| CSU Bakersfield University Police Department | Archived Link | Policy Docs | Not Available |
| CSU Channel Islands University Police Department | Archived Link | Policy Docs | Not Available |
| CSU Chico University Police Department | Archived Link | Policy Docs | Training Docs |
| CSU Dominguez Hills University Police and Parking | Archived Link | Policy Docs | Not Available |
| CSU East Bay University Police Department | Archived Link | Policy Docs | Not Available |
| CSU Fresno University Police Department | Archived Link | Policy Docs | Not Available |
| CSU Fullerton University Police Department | Archived Link | Policy Docs | Training Docs |
| CSU Long Beach University Police Department | Archived Link | Policy Docs | Not Available |
| CSU Monterey Bay University Police Department | Archived Link | Policy Docs | Training Docs |
| CSU Northridge Department of Police Services | Archived Link | Policy Docs | Training Docs |
| CSU Sacramento Public Safety/University Police Department | Archived Link | Policy Docs | Training Docs |
| CSU San Bernardino University Police Department | Archived Link | Policy Docs | Not Available |
| CSU San José University Police Department | Archived Link | Policy Docs | Not Available |
| CSU San Marcos University Police Department | Archived Link | Policy Docs | Not Available |
| CSU Stanislaus Police Department | Archived Link | Policy Docs | Training Docs |
| Cuesta College Department of Public Safety | Archived Link | Policy Docs | Training Docs |
| Culver City Police Department | Archived Link | Policy Docs | Training Docs |
| Cypress Police Department | Archived Link | Policy Docs | Training Docs |
| Daly City Police Department | Archived Link | Policy Docs | Training Docs |
| Davis Police Department | Archived Link | Policy Docs | Training Docs |
| Del Norte County Sheriff's Department | Archived Link | Policy Docs | Not Available |
| Del Rey Oaks Police Department | Archived Link | Policy Docs | Training Docs |
| Delano Police Department | Archived Link | Policy Docs | Not Available |
| Desert Hot Springs Police Department | Archived Link | Policy Docs | Training Docs |
| Dinuba Police Department | Archived Link | Policy Docs | Not Available |
| Dixon Police Department | Archived Link | Policy Docs | Not Available |
| Dos Palos Police Department | Archived Link | Policy Docs | Not Available |
| Downey Police Department | Archived Link | Policy Docs | Not Available |
| East Bay Regional Parks District Department of Public Safety | Archived Link | Policy Docs | Not Available |
| East Palo Alto Police Department | Archived Link | Policy Docs | Not Available |
| El Cajon Police Department | Archived Link | Policy Docs | Not Available |
| El Camino Community College District Police Department | Archived Link | Policy Docs | Not Available |
| El Centro Police Department | Archived Link | Policy Docs | Not Available |
| El Cerrito Police Department | Archived Link | Policy Docs | Training Docs |
| El Dorado County Sheriff's Department | Archived Link | Policy Docs | Not Available |
| El Monte Police Department | Archived Link | Policy Docs | Not Available |
| El Segundo Police Department | Archived Link | Policy Docs | Training Docs |
| Elk Grove Police Department | Archived Link | Policy Docs | Not Available |
| Emeryville Police Department | Archived Link | Policy Docs | Training Docs |
| Escalon Police Department | Archived Link | Policy Docs | Not Available |
| Escondido Police Department | Archived Link | Policy Docs | Training Docs |
| Etna Police Department | Archived Link | Policy Docs | Not Available |
| Eureka Police Department | Archived Link | Policy Docs | Not Available |
| Exeter Police Department | Archived Link | Policy Docs | Not Available |
| Fairfax Police Department | Archived Link | Policy Docs | Training Docs |
| Fairfield Police Department | Archived Link | Policy Docs | Training Docs |
| Farmersville Police Department | Archived Link | Policy Docs | Not Available |
| Ferndale Police Department | Archived Link | Policy Docs | Not Available |
| Firebaugh Police Department | Archived Link | Policy Docs | Not Available |
| Folsom Lake College Police Department | Archived Link | Policy Docs | Not Available |
| Folsom Police Department | Archived Link | Policy Docs | Training Docs |
| Fontana Police Department | Archived Link | Policy Docs | Training Docs |
| Fort Bragg Police Department | Archived Link | Policy Docs | Training Docs |
| Fortuna Police Department | Archived Link | Policy Docs | Not Available |
| Foster City Police Department | Archived Link | Policy Docs | Not Available |
| Fountain Valley Police Department | Archived Link | Policy Docs | Training Docs |
| Fowler Police Department | Archived Link | Policy Docs | Not Available |
| Fremont Police Department | Archived Link | Policy Docs | Training Docs |
| Fresno County Sheriff's Department | Archived Link | Policy Docs | Training Docs |
| Fresno Police Department | Archived Link | Policy Docs | Not Available |
| Fullerton Police Department | Archived Link | Policy Docs | Not Available |
| Galt Police Department | Archived Link | Policy Docs | Not Available |
| Garden Grove Police Department | Archived Link | Policy Docs | Training Docs |
| Gardena Police Department | Archived Link | Policy Docs | Training Docs |
| Gilroy Police Department | Archived Link | Policy Docs | Not Available |
| Glendale Community College District Police Department | Archived Link | Policy Docs | Not Available |
| Glendale Police Department | Archived Link | Policy Docs | Training Docs |
| Glendora Police Department | Archived Link | Policy Docs | Training Docs |
| Glenn County Sheriff's Department/Coroner | Archived Link | Policy Docs | Not Available |
| Gonzales Police Department | Archived Link | Policy Docs | Not Available |
| Grass Valley Police Department | Archived Link | Policy Docs | Not Available |
| Greenfield Police Department | Archived Link | Policy Docs | Not Available |
| Gridley Police Department | Archived Link | Policy Docs | Not Available |
| Grover Beach Police Department | Archived Link | Policy Docs | Not Available |
| Guadalupe Police Department | Archived Link | Policy Docs | Not Available |
| Gustine Police Department | Archived Link | Policy Docs | Not Available |
| Hanford Police Department | Archived Link | Policy Docs | Not Available |
| Hawthorne Police Department | Archived Link | Policy Docs | Not Available |
| Hayward Police Department | Archived Link | Policy Docs | Not Available |
| Healdsburg Police Department | Archived Link | Policy Docs | Training Docs |
| Hemet Police Department | Archived Link | Policy Docs | Not Available |
| Hercules Police Department | Archived Link | Policy Docs | Training Docs |
| Hermosa Beach Police Department | Archived Link | Policy Docs | Not Available |
| Hillsborough Police Department | Archived Link | Policy Docs | Training Docs |
| Hollister Police Department | Archived Link | Policy Docs | Not Available |
| Humboldt County Sheriff's Department | Archived Link | Policy Docs | Not Available |
| Humboldt State University | Archived Link | Policy Docs | Training Docs |
| Huntington Beach Police Department | Archived Link | Policy Docs | Not Available |
| Huntington Park Police Department | Archived Link | Policy Docs | Training Docs |
| Huron Police Department | Archived Link | Policy Docs | Not Available |
| Imperial Police Department | Archived Link | Policy Docs | Not Available |
| Indio Police Department | Archived Link | Policy Docs | Not Available |
| Inglewood Police Department | Archived Link | Policy Docs | Not Available |
| Inyo County Sheriff's Department | Archived Link | Policy Docs | Not Available |
| Ione Police Department | Archived Link | Policy Docs | Not Available |
| Irvine Police Department | Archived Link | Policy Docs | Training Docs |
| Irwindale Police Department | Archived Link | Policy Docs | Training Docs |
| Jackson Police Department | Archived Link | Policy Docs | Not Available |
| Kensington Police Department | Archived Link | Policy Docs | Not Available |
| Kerman Police Department | Archived Link | Policy Docs | Not Available |
| Kern County Sheriff's Department | Archived Link | Policy Docs | Not Available |
| King City Police Department | Archived Link | Policy Docs | Not Available |
| Kings County Sheriff's Department | Archived Link | Policy Docs | Not Available |
| Kingsburg Police Department | Archived Link | Policy Docs | Not Available |
| La Habra Police Department | Archived Link | Policy Docs | Not Available |
| La Mesa Police Department | Archived Link | Policy Docs | Training Docs |
| La Palma Police Department | Archived Link | Policy Docs | Not Available |
| La Verne Police Department | Archived Link | Policy Docs | Training Docs |
| Laguna Beach Police Department | Archived Link | Policy Docs | Training Docs |
| Lake County Sheriff's Department | Archived Link | Policy Docs | Training Docs |
| Lakeport Police Department | Archived Link | Policy Docs | Not Available |
| Lassen County Sheriff's Department | Archived Link | Policy Docs | Training Docs |
| Lemoore Police Department | Archived Link | Policy Docs | Not Available |
| Lincoln Police Department | Archived Link | Policy Docs | Not Available |
| Lindsay Department of Public Safety | Archived Link | Policy Docs | Not Available |
| Livermore Police Department | Archived Link | Policy Docs | Training Docs |
| Livingston Police Department | Archived Link | Policy Docs | Not Available |
| Lodi Police Department | Archived Link | Policy Docs | Not Available |
| Lompoc Police Department | Archived Link | Policy Docs | Not Available |
| Long Beach Police Department | Archived Link | Policy Docs | Not Available |
| Los Alamitos Police Department | Archived Link | Policy Docs | Not Available |
| Los Altos Police Department | Archived Link | Policy Docs | Training Docs |
| Los Angeles City Department of Recreation and Parks, Park Ranger Division | Archived Link | Policy Docs | Not Available |
| Los Angeles County District Attorney | Archived Link | Policy Docs | Not Available |
| Los Angeles County Probation Department | Archived Link | Policy Docs | Training Docs |
| Los Angeles County Sheriff's Department | Archived Link | Policy Docs | Not Available |
| Los Angeles Police Department | Archived Link | Policy Docs | Training Docs |
| Los Angeles Port Police Department | Archived Link | Policy Docs | Not Available |

Los Angeles School Police Department

Archived Link

Policy Docs

Training Docs

Los Angeles World Airports Police Department

Archived Link

Policy Docs

Not Available

Los Banos Police Department

Archived Link

Policy Docs

Training Docs

Los Gatos/Monte Sereno Police Department

Archived Link

Policy Docs

Not Available

Los Rios Community College District Police Department

Archived Link

Policy Docs

Not Available

Madera County Sheriff's Department

Archived Link

Policy Docs

Not Available

Madera Police Department

Archived Link

Policy Docs

Not Available

Mammoth Lakes Police Department

Archived Link

Policy Docs

Training Docs

Manhattan Beach Police Department

Archived Link

Policy Docs

Training Docs

Manteca Police Department

Archived Link

Policy Docs

Training Docs

Marin Community College District Police Department

Archived Link

Policy Docs

Not Available

Marin County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Marina Department of Public Safety

Archived Link

Policy Docs

Not Available

Martinez Police Department

Archived Link

Policy Docs

Training Docs

Marysville Police Department

Archived Link

Policy Docs

Not Available

McFarland Police Department

Archived Link

Policy Docs

Not Available

Mendocino County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Mendota Police Department

Archived Link

Policy Docs

Training Docs

Menifee Police Department

Archived Link

Policy Docs

Not Available

Menlo Park Police Department

Archived Link

Policy Docs

Not Available

Merced Community College Police Department

Archived Link

Policy Docs

Not Available

Merced County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Merced Police Department

Archived Link

Policy Docs

Training Docs

Mill Valley Police Department

Archived Link

Policy Docs

Training Docs

Milpitas Police Department

Archived Link

Policy Docs

Not Available

MiraCosta Community College District Police Department

Archived Link

Policy Docs

Training Docs

Modesto Police Department

Archived Link

Policy Docs

Training Docs

Modoc County Sheriff's Department

Archived Link

Policy Docs

Not Available

Mono County Sheriff's Department

Archived Link

Policy Docs

Not Available

Monrovia Police Department

Archived Link

Policy Docs

Training Docs

Montclair Police Department

Archived Link

Policy Docs

Training Docs

Montebello Police Department

Archived Link

Policy Docs

Training Docs

Monterey County Sheriff's Department

Archived Link

Policy Docs

Not Available

Monterey Park Police Department

Archived Link

Policy Docs

Training Docs

Monterey Police Department

Archived Link

Policy Docs

Training Docs

Moraga Police Department

Archived Link

Policy Docs

Not Available

Morgan Hill Police Department

Archived Link

Policy Docs

Training Docs

Morro Bay Police Department

Archived Link

Policy Docs

Not Available

Mountain View Police Department

Archived Link

Policy Docs

Training Docs

Mt. Shasta Police Department

Archived Link

Policy Docs

Not Available

Murrieta Police Department

Archived Link

Policy Docs

Training Docs

Napa County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Napa Police Department

Archived Link

Policy Docs

Not Available

Napa Valley College Police Department

Archived Link

Policy Docs

Training Docs

National City Police Department

Archived Link

Policy Docs

Training Docs

Nevada City Police Department

Archived Link

Policy Docs

Not Available

Nevada County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Newark Police Department

Archived Link

Policy Docs

Not Available

Newman Police Department

Archived Link

Policy Docs

Not Available

Newport Beach Police Department

Archived Link

Policy Docs

Training Docs

Novato Police Department

Archived Link

Policy Docs

Not Available

Oakdale Police Department

Archived Link

Policy Docs

Not Available

Oakland Police Department

Archived Link

Policy Docs

Training Docs

Oakley Police Department

Archived Link

Policy Docs

Not Available

Oceanside Police Department

Archived Link

Policy Docs

Training Docs

Oceanside Police Department Harbor Unit

Archived Link

Policy Docs

Training Docs

Ohlone Community College District Police Department

Archived Link

Policy Docs

Not Available

Ontario Police Department

Archived Link

Policy Docs

Training Docs

Orange County District Attorney

Archived Link

Policy Docs

Not Available

Orange County District Attorney, Public Assistance Fraud

Archived Link

Policy Docs

Not Available

Orange County Sheriff's Department/Coroner

Archived Link

Policy Docs

Not Available

Orange Cove Police Department

Archived Link

Policy Docs

Not Available

Orange Police Department

Archived Link

Policy Docs

Training Docs

Orland Police Department

Archived Link

Policy Docs

Training Docs

Oroville Police Department

Archived Link

Policy Docs

Training Docs

Oxnard Police Department

Archived Link

Policy Docs

Training Docs

Pacific Grove Police Department

Archived Link

Policy Docs

Training Docs

Pacifica Police Department

Archived Link

Policy Docs

Training Docs

Palm Springs Police Department

Archived Link

Policy Docs

Not Available

Palo Alto Police Department

Archived Link

Policy Docs

Not Available

Palos Verdes Estates Police Department

Archived Link

Policy Docs

Not Available

Paradise Police Department

Archived Link

Policy Docs

Not Available

Pasadena City College District Police Department

Archived Link

Policy Docs

Training Docs

Pasadena Police Department

Archived Link

Policy Docs

Not Available

Paso Robles Police Department

Archived Link

Policy Docs

Training Docs

Petaluma Police Department

Archived Link

Policy Docs

Not Available

Piedmont Police Department

Archived Link

Policy Docs

Training Docs

Pinole Police Department

Archived Link

Policy Docs

Training Docs

Pismo Beach Police Department

Archived Link

Policy Docs

Training Docs

Pittsburg Police Department

Archived Link

Policy Docs

Not Available

Placentia Police Department

Archived Link

Policy Docs

Not Available

Placer County District Attorney

Archived Link

Policy Docs

Not Available

Placer County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Placerville Police Department

Archived Link

Policy Docs

Not Available

Pleasant Hill Police Department

Archived Link

Policy Docs

Training Docs

Pleasanton Police Services

Archived Link

Policy Docs

Training Docs

Plumas County Sheriff's Department

Archived Link

Policy Docs

Not Available

Pomona Police Department

Archived Link

Policy Docs

Training Docs

Port Hueneme Police Department

Archived Link

Policy Docs

Not Available

Porterville Police Department

Archived Link

Policy Docs

Not Available

Red Bluff Police Department

Archived Link

Policy Docs

Training Docs

Redding Police Department

Archived Link

Policy Docs

Training Docs

Redlands Police Department

Archived Link

Policy Docs

Training Docs

Redondo Beach Police Department

Archived Link

Policy Docs

Training Docs

Redwood City Police Department

Archived Link

Policy Docs

Training Docs

Reedley Police Department

Archived Link

Policy Docs

Training Docs

Rialto Police Department

Archived Link

Policy Docs

Not Available

Richmond Police Department

Archived Link

Policy Docs

Not Available

Ridgecrest Police Department

Archived Link

Policy Docs

Not Available

Rio Dell Police Department

Archived Link

Policy Docs

Not Available

Ripon Police Department

Archived Link

Policy Docs

Not Available

Riverside Community College District Police Department

Archived Link

Policy Docs

Not Available

Riverside County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Riverside Police Department

Archived Link

Policy Docs

Not Available

Rocklin Police Department

Archived Link

Policy Docs

Training Docs

Rohnert Park Department of Public Safety

Archived Link

Policy Docs

Not Available

Roseville Police Department

Archived Link

Policy Docs

Not Available

Ross Police Department

Archived Link

Policy Docs

Not Available

Sacramento County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Sacramento Police Department

Archived Link

Policy Docs

Training Docs

Saddleback Community College Police Department

Archived Link

Policy Docs

Not Available

Saint Helena Police Department

Archived Link

Policy Docs

Not Available

San Benito County Sheriff's Department

Archived Link

Policy Docs

Training Docs

San Bernardino County Sheriff-Coroner

Archived Link

Policy Docs

Training Docs

San Bernardino Police Department

Archived Link

Policy Docs

Training Docs

San Bruno Police Department

Archived Link

Policy Docs

Not Available

San Diego County Probation Department

Archived Link

Policy Docs

Training Docs

San Diego County Sheriff's Department

Archived Link

Policy Docs

Not Available

San Diego Harbor Police Department

Archived Link

Policy Docs

Training Docs

San Diego Police Department

Archived Link

Policy Docs

Training Docs

San Diego State University Police Department

Archived Link

Policy Docs

Training Docs

San Fernando Police Department

Archived Link

Policy Docs

Training Docs

San Francisco County Sheriff's Department

Archived Link

Policy Docs

Training Docs

San Francisco Police Department

Archived Link

Policy Docs

Not Available

San Gabriel Police Department

Archived Link

Policy Docs

Training Docs

San Joaquin County Probation Department

Archived Link

Policy Docs

Training Docs

San Joaquin County Sheriff's Department

Archived Link

Policy Docs

Not Available

San Joaquin Delta College Police Department

Archived Link

Policy Docs

Training Docs

San Jose Police Department

Archived Link

Policy Docs

Training Docs

San Leandro Police Department

Archived Link

Policy Docs

Training Docs

San Luis Obispo County Sheriff's Department

Archived Link

Policy Docs

Training Docs

San Luis Obispo Police Department

Archived Link

Policy Docs

Training Docs

San Marino Police Department

Archived Link

Policy Docs

Training Docs

San Mateo County Sheriff's Office

Archived Link

Policy Docs

Training Docs

San Mateo Police Department

Archived Link

Policy Docs

Training Docs

San Pablo Police Department

Archived Link

Policy Docs

Not Available

San Rafael Police Department

Archived Link

Policy Docs

Training Docs

San Ramon Police Department

Archived Link

Policy Docs

Training Docs

Sand City Police Department

Archived Link

Policy Docs

Not Available

Sanger Police Department

Archived Link

Policy Docs

Not Available

Santa Ana Police Department

Archived Link

Policy Docs

Training Docs

Santa Ana Unified School District Police Department

Archived Link

Policy Docs

Not Available

Santa Barbara County Sheriff's Department

Archived Link

Policy Docs

Not Available

Santa Barbara Police Department

Archived Link

Policy Docs

Training Docs

Santa Clara County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Santa Clara Police Department

Archived Link

Policy Docs

Training Docs

Santa Cruz County District Attorney

Archived Link

Policy Docs

Not Available

Santa Cruz County Sheriff's Department

Archived Link

Policy Docs

Not Available

Santa Cruz Police Department

Archived Link

Policy Docs

Training Docs

Santa Fe Springs Police Services

Archived Link

Policy Docs

Not Available

Santa Maria Police Department

Archived Link

Policy Docs

Training Docs

Santa Monica Police Department

Archived Link

Policy Docs

Training Docs

Santa Paula Police Department

Archived Link

Policy Docs

Not Available

Santa Rosa Police Department

Archived Link

Policy Docs

Training Docs

Sausalito Police Department

Archived Link

Policy Docs

Not Available

Scotts Valley Police Department

Archived Link

Policy Docs

Not Available

Seal Beach Police Department

Archived Link

Policy Docs

Training Docs

Seaside Police Department

Archived Link

Policy Docs

Training Docs

Sebastopol Police Department

Archived Link

Policy Docs

Training Docs

Selma Police Department

Archived Link

Policy Docs

Not Available

Shafter Police Department

Archived Link

Policy Docs

Not Available

Shasta County Sheriff's Department

Archived Link

Policy Docs

Not Available

Sierra County Sheriff's Office

Archived Link

Policy Docs

Not Available

Sierra Madre Police Department

Archived Link

Policy Docs

Not Available

Signal Hill Police Department

Archived Link

Policy Docs

Not Available

Simi Valley Police Department

Archived Link

Policy Docs

Training Docs

Siskiyou County Sheriff's Department

Archived Link

Policy Docs

Not Available

Solano County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Soledad Police Department

Archived Link

Policy Docs

Training Docs

Sonoma County Probation Department

Archived Link

Policy Docs

Training Docs

Sonoma County Sheriff's Office

Archived Link

Policy Docs

Training Docs

Sonoma Police Department

Archived Link

Policy Docs

Training Docs

Sonoma State University Police and Parking Services

Archived Link

Policy Docs

Training Docs

Sonora Police Department

Archived Link

Policy Docs

Not Available

South Gate Police Department

Archived Link

Policy Docs

Not Available

South Lake Tahoe Police Department

Archived Link

Policy Docs

Not Available

South Pasadena Police Department

Archived Link

Policy Docs

Training Docs

South San Francisco Police Department

Archived Link

Policy Docs

Not Available

Southwestern Community College Police Department

Archived Link

Policy Docs

Not Available

Stanford University Department of Public Safety

Archived Link

Policy Docs

Not Available

Stanislaus County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Stockton Police Department

Archived Link

Policy Docs

Not Available

Suisun City Police Department

Archived Link

Policy Docs

Training Docs

Sunnyvale Department of Public Safety

Archived Link

Policy Docs

Not Available

Sutter County Sheriff's Department

Archived Link

Policy Docs

Not Available

Taft Police Department

Archived Link

Policy Docs

Not Available

Tehachapi Police Department

Archived Link

Policy Docs

Not Available

Tehama County Sheriff's Department

Archived Link

Policy Docs

Not Available

Tiburon Police Department

Archived Link

Policy Docs

Not Available

Torrance Police Department

Archived Link

Policy Docs

Not Available

Tracy Police Department

Archived Link

Policy Docs

Training Docs

Trinity County Sheriff's Department

Archived Link

Policy Docs

Not Available

Truckee Police Department

Archived Link

Policy Docs

Training Docs

Tulare County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Tulare Police Department

Archived Link

Policy Docs

Not Available

Tuolumne County Sheriff's Department

Archived Link

Policy Docs

Not Available

Turlock Police Department

Archived Link

Policy Docs

Training Docs

Tustin Police Department

Archived Link

Policy Docs

Training Docs

Twin Rivers Unified School District Police Services

Archived Link

Policy Docs

Not Available

UC Berkeley Police Department

Archived Link

Policy Docs

Not Available

UC Davis Police Department

Archived Link

Policy Docs

Not Available

UC Irvine Police Department

Archived Link

Policy Docs

Training Docs

UC Los Angeles Police Department

Archived Link

Policy Docs

Not Available

UC Merced Police Department

Archived Link

Policy Docs

Not Available

UC Riverside Police Department

Archived Link

Policy Docs

Not Available

UC San Diego Police Department

Archived Link

Policy Docs

Not Available

UC San Francisco Police Department

Archived Link

Policy Docs

Not Available

UC Santa Cruz Police Department

Archived Link

Policy Docs

Not Available

Ukiah Police Department

Archived Link

Policy Docs

Not Available

Union City Police Department

Archived Link

Policy Docs

Not Available

Upland Police Department

Archived Link

Policy Docs

Training Docs

Vacaville Police Department

Archived Link

Policy Docs

Training Docs

Vallejo Police Department

Archived Link

Policy Docs

Training Docs

Ventura County District Attorney

Archived Link

Policy Docs

Not Available

Ventura County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Ventura Police Department

Archived Link

Policy Docs

Training Docs

Vernon Police Department

Archived Link

Policy Docs

Training Docs

Victor Valley College Police Department

Archived Link

Policy Docs

Not Available

Visalia Police Department

Archived Link

Policy Docs

Not Available

Walnut Creek Police Department

Archived Link

Policy Docs

Not Available

Watsonville Police Department

Archived Link

Policy Docs

Not Available

Weed Police Department

Archived Link

Policy Docs

Not Available

West Cities Police Communications Center

Archived Link

Policy Docs

Not Available

West Covina Police Department

Archived Link

Policy Docs

Not Available

West Sacramento Police Department

Archived Link

Policy Docs

Not Available

West Valley-Mission Community College District Police Department

Archived Link

Policy Docs

Not Available

Westminster Police Department

Archived Link

Policy Docs

Not Available

Wheatland Police Department

Archived Link

Policy Docs

Not Available

Whittier Police Department

Archived Link

Policy Docs

Not Available

Williams Police Department

Archived Link

Policy Docs

Not Available

Willits Police Department

Archived Link

Policy Docs

Not Available

Windsor Police Department

Archived Link

Policy Docs

Not Available

Winters Police Department

Archived Link

Policy Docs

Not Available

Woodland Police Department

Archived Link

Policy Docs

Training Docs

Yolo County District Attorney

Archived Link

Policy Docs

Not Available

Yolo County Sheriff's Department

Archived Link

Policy Docs

Not Available

Yreka Police Department

Archived Link

Policy Docs

Not Available

Yuba City Police Department

Archived Link

Policy Docs

Training Docs

Yuba County Sheriff's Department

Archived Link

Policy Docs

Not Available

Dave Maass

Your Service Provider’s Terms of Service Shouldn’t Overrule Your Fourth Amendment Rights

2 weeks 3 days ago

Last week, EFF, ACLU, and ACLU of Minnesota filed an amicus brief in State v. Pauli, a case in the Minnesota Supreme Court, where we argue that cloud storage providers’ terms of service (TOS) can’t take away your Fourth Amendment rights. This is the first case on this important issue to reach a state supreme court. If the lower courts’ rulings stand, anyone in Minnesota who violated any term of a provider’s TOS could lose Fourth Amendment protections over all the files in their account.

The facts of the case are a little hazy, but at some point, Dropbox identified video files in Mr. Pauli’s account as child pornography and submitted the files to the National Center for Missing and Exploited Children (NCMEC), a private, quasi-governmental entity created by statute that works closely with law enforcement on child exploitation issues. After viewing the files, a NCMEC employee then forwarded them with a report to the Minnesota Bureau of Criminal Apprehension. This ultimately led to Pauli’s indictment on child pornography charges. Pauli challenged the search, but the trial court held that Dropbox’s TOS—which notified Pauli that Dropbox could monitor his account and disclose information to third parties if it believed such disclosure was necessary to comply with the law—nullified Pauli’s expectation of privacy in the video files. After the appellate court agreed, Pauli petitioned the state supreme court for review.

The lower courts’ analysis is simply wrong. Under this logic, your Fourth Amendment rights rise or fall based on unilateral contracts with your service providers—contracts that none of us read or negotiate but all of us must agree to so that we can use services that are a necessary part of daily life. As we argued in our brief, a company’s TOS should not dictate your constitutional rights, because terms of service are rules about the relationship between you and your service provider—not you and the government.

Companies draft terms of service to govern how their platforms may be used, and the terms of these contracts are extremely broad. Companies’ TOS control what kind of content you can post, how you can use the platform, and how platforms can protect themselves against fraud and other damage. Actions that could violate a company’s TOS include not just criminal activity, such as possessing child pornography, but also—as defined solely by the provider—actions like uploading content that defames someone or contains profanity, sharing a copyrighted article without permission from the copyright holder, or marketing your small business to all of your friends without their advance consent. While some might find activities such as these objectionable or annoying, they shouldn’t justify the government ignoring your Fourth Amendment right to privacy in your files simply because you store them in the cloud.

Given the vast amount of storage many service providers offer (most offer up to 2 terabytes for a small annual fee), accounts can hold tens of thousands of private and personal files, including photos, messages, diaries, medical records, legal data, and videos—each of which could reveal intimate details about our private and professional lives. Storing these records in the cloud with a service provider allows users to free up space on their personal devices, access their files from anywhere, and share (or not share) their files with others. The convenience and cost savings offered by commercial third-party cloud-storage providers mean that very few of us would take the trouble to set up our own server to try to achieve privately all that we can do with our data when we could store it with a commercial service provider. But this also means that the only way to take advantage of this convenience is to agree to a company’s TOS.

And several billion of us do agree every day. Since its advent in 2007, Dropbox’s user base has soared to more than 700 million registered users. Apple offers free iCloud storage to users of its more than 1.5 billion active phones, tablets, laptops, and other devices around the world. And Google’s suite of cloud services—which includes both Gmail and Google Drive (offering access to stored and shareable documents, spreadsheets, photos, slide presentations, videos, and more)—enjoys 2 billion monthly active users. These users would be shocked to discover that by agreeing to their providers’ TOS, they could be giving up an expectation of privacy in their most private records.

In 2018, in Carpenter v. United States, all nine justices on the Supreme Court agreed that even if we store electronic equivalents of our Fourth Amendment-protected “papers” and “effects” with a third-party provider, we still retain privacy interests in those records. These constitutional rights would be meaningless, however, if they could be ignored simply because a user agreed to and then somehow violated their provider’s TOS.

The appellate court’s ruling in Pauli allows private agreements to trump bedrock Fourth Amendment guarantees for private communications and cloud-stored records. The ruling affects far more than child pornography cases: anyone who violated any term of a provider’s TOS could lose Fourth Amendment protections over all the files in their account.

We hope the Minnesota Supreme Court will reject such a sweeping invalidation of constitutional rights. We look forward to the court’s decision.

Jennifer Lynch

Canada’s Attempt to Regulate Sexual Content Online Ignores Technical and Historical Realities

2 weeks 4 days ago

Canadian Senate Bill S-203, AKA the “Protecting Young Persons from Exposure to Pornography Act,” is another woefully misguided proposal aimed at regulating sexual content online. To say the least, this bill fails to understand how the internet functions and would seriously damage online expression and privacy. It’s bad in a variety of ways, but three specific problems need to be laid out: 1) technical impracticality, 2) competition harms, and 3) privacy and security.

First, S-203 would make any person or company criminally liable any time an underage user engages with sexual content through its service. The law applies even if the person or company believed the user to be an adult, unless the person or company “implemented a prescribed age-verification method.”

Second, the bill seemingly imposes this burden on a broad swath of the internet stack. S-203 would criminalize the acts of independent performers, artists, blogs, social media, message boards, email providers, and any other intermediary or service in the stack that is in some way “for commercial purposes” and “makes available sexually explicit material on the Internet to a young person.” The only meaningful defense a person or company could assert against the financial penalties would be to verify the legal adult age of every user and then store that data.

The bill would likely force many companies to simply eliminate sexual content

Building the technical infrastructure needed for such a vast portion of the internet to “implement a prescribed age-verification method” would be costly and overwhelmingly complicated. It would also introduce many security risks that weren’t previously there. Even if every platform stored data server-side with a robust security posture, processing highly sensitive personally identifiable information (PII) on the client side would create a treasure trove for anyone with a bit of app-exploitation skill. And if the bill did create a market for third-party proprietary age-verification solutions, the financial burden would only advantage the largest players online. Not only that, it’s ahistorical to assume that younger teenagers wouldn’t figure out ways to get past whatever age-verification system is propped up.

Then there’s the privacy angle. It’s ludicrous to expect all adult users to provide private personal information every time they log onto an app that might contain sexual content. How far the privacy intrusion goes may vary with the implementation, but verification schemes in contexts like this generally play out as a cat-and-mouse game that creates surveillance and security threats rather than addressing the original concern. The more a verification system fails, the more privacy-invasive the measures taken to avoid criminal liability become.

Because of the problems of implementing age verification, the bill would likely force many companies to simply eliminate sexual content instead of carrying the huge risk that an underage user will access it. But even a company that wanted to eliminate prohibited sexual content would face significant obstacles in doing so if they, like much of the internet, host user-generated content. It is difficult to detect and define the prohibited sexual content, and even more difficult when the bill recognizes that the law is not violated if such material “has a legitimate purpose related to science, medicine, education or the arts.” There is no automated tool that can make such distinctions; the inevitable result is that protected materials will be removed out of an abundance of caution. And history teaches us that the results are often sexist, misogynist, racist, LGBT-phobic, ableist, and so on. It is a feature, not a bug, that there is no one-size-fits-all way to neatly define what is and isn’t sexual content.

Ultimately, Canadian Senate Bill S-203 is another in a long line of morally patronizing legislation that doesn’t understand how the internet works. Even if there were a way to keep minors away from sexual content, there is no way to do it without vast collateral damage. Sen. Julie Miville-Dechêne, who introduced the bill, stated “it makes no sense that the commercial porn platforms don’t verify age. I think it’s time to legislate.” We gently recommend that next time her first thought be to consult with experts.

Daly Barnett

EFF and ACLU Ask Supreme Court to Review Case Against Warrantless Searches of International Travelers’ Phones and Laptops

2 weeks 4 days ago
Border Officers Accessing Massive Amounts of Information from Electronic Devices

Washington, D.C. —The Electronic Frontier Foundation (EFF), the American Civil Liberties Union, and the ACLU of Massachusetts today filed a petition for a writ of certiorari, asking the Supreme Court to hear a challenge to the Department of Homeland Security’s policy and practice of warrantless and suspicionless searches of travelers’ electronic devices at U.S. airports and other ports of entry.

The lawsuit, Merchant v. Mayorkas, was filed in September 2017 on behalf of several travelers whose cell phones, laptops, and other electronic devices were searched without warrants at the U.S. border. In November 2019, a federal district court in Boston ruled that border agencies’ policies on electronic device searches violate the Fourth Amendment, and required border officers to have reasonable suspicion of digital contraband before they can search a traveler’s device. A three-judge panel at the First Circuit reversed this decision in February 2021.

“Border officers every day make an end-run around the Constitution by searching travelers’ electronic devices without a warrant or any suspicion of wrongdoing,” said EFF Senior Staff Attorney Sophia Cope. “The U.S. government has granted itself unfettered authority to rummage through our digital lives just because we travel internationally. This egregious violation of privacy happens with no justification under constitutional law and no demonstrable benefit. The Supreme Court must put a stop to it.”

“This case raises pressing questions about the Fourth Amendment’s protections in the digital age,” said Esha Bhandari, deputy director of the ACLU’s Speech, Privacy, and Technology Project. “When border officers search our phones and laptops, they can access massive amounts of sensitive personal information, such as private photographs, health information, and communications with partners, family, and friends—including discussions between lawyers and their clients, and between journalists and their sources. We are asking the Supreme Court to ensure that we don’t lose our privacy rights when we travel.”

Every year, a growing number of international travelers are subject to warrantless and suspicionless searches of their personal electronic devices at the U.S. border. These searches are often conducted for reasons that have nothing to do with stopping the importation of contraband or determining a traveler’s admissibility. Border officers claim the authority to search devices for a host of reasons, including enforcement of tax, financial, consumer protection, and environmental laws—all without suspicion of wrongdoing. Border officers also search travelers’ devices if they are interested in information about someone other than the traveler—like a business partner, family member, or a journalist’s source.

The petitioners in this case—all U.S. citizens—include a military veteran, journalists, an artist, a NASA engineer, and a business owner. Several are Muslims and people of color, and none were accused of any wrongdoing in connection with their device searches.

“It’s been frustrating to be subjected to this power-grab by the government,” said Diane Zorri, a college professor, former U.S. Air Force captain, and a plaintiff in the case. “My devices are mine, and the government should need a good reason before rifling through my phone and my computer. I’m proud to be part of this case to help protect travelers’ rights.”

The certiorari petition asks the Supreme Court to overturn the First Circuit’s decision and hold that the Fourth Amendment requires border officers to obtain a warrant based on probable cause before searching electronic devices, or at the least to have reasonable suspicion that the device contains digital contraband.

For more information about Merchant v. Mayorkas go to:
https://www.eff.org/cases/alasaad-v-duke
https://www.aclu.org/cases/alasaad-v-wolf-challenge-warrantless-phone-and-laptop-searches-us-border

For the full petition for writ of certiorari:

https://www.eff.org/document/petition-writ-certiorari-3

Contact: Rebecca Jeschke, Media Relations Director and Digital Rights Analyst, press@eff.org; Kate Lagreca, ACLU of Massachusetts, klagreca@aclum.org; Aaron Madrid Aksoz, ACLU National, media@aclu.org
Rebecca Jeschke

Tell Congress: Federal Money Shouldn’t Be Spent On Breaking Encryption

2 weeks 5 days ago

We don’t need government minders in our private conversations. That’s because private conversations, whether they happen offline or online, aren’t a public safety menace. They’re not an invitation to criminality, or terrorism, or a threat to children, no matter how many times those tired old lines get repeated. 

TAKE ACTION

TELL CONGRESS: DON’T SPEND TAX MONEY TO BREAK ENCRYPTION

Unfortunately, federal law enforcement officials have not stopped asking for backdoor access to Americans’ encrypted messages. FBI Director Christopher Wray did it again just last month, falsely claiming that end-to-end encryption and “user-only access” have “negligible security advantages” but have a “negative effect on law enforcement’s ability to protect the public.”

This year, there’s something we can do about it. Rep. Tom Malinowski (D-NJ) and Rep. Peter Meijer (R-MI) have put forward language that would ban federal money from being used to weaken security standards or introduce vulnerabilities into software or hardware.

Last year, the House of Representatives inserted an amendment in the Defense Appropriations bill that prohibits the use of funds to insert security backdoors. That provision targeted the NSA. This year’s proposal will cover a much broader range of federal agencies. It also includes language that would prevent the government from engaging in schemes like client-side scanning or a “ghost” proposal, which would undermine encryption without technically decrypting data.

Secure and private communications are the backbone of democracy and free speech around the world. If U.S. law enforcement is able to compel private companies to break encryption, criminals and authoritarian governments will be eager to use the same loopholes. There are no magic bullets, and no backdoors that will only get opened by the “good guys.”

It’s important that as many members of Congress as possible sign on as supporters of this proposal. We need to send a strong signal to federal law enforcement that they should, once and for all, stop insisting they should scan all of our messages. To get there, we need your help.

TAKE ACTION

TELL CONGRESS: DON’T SPEND TAX MONEY TO BREAK ENCRYPTION

Joe Mullin

Data Driven 2: California Dragnet—New Data Set Shows Scale of Vehicle Surveillance in the Golden State

2 weeks 5 days ago

This project is based on data processed by student journalist Olivia Ali, 2020 intern JJ Mazzucotelli, and research assistant Liam Harton, based on California Public Records Act requests filed by EFF and dozens of students at the University of Nevada, Reno Reynolds School of Journalism. 

Tiburon, California: a 13-square-mile peninsula town in Marin County, known for its glorious views of the San Francisco Bay and its eclectic retail district. 

What the town's tourism bureau may not want you to know: from the moment you drive into the city limits, your vehicle will be under extreme surveillance. The Tiburon Police Department has the dubious distinction of collecting, mile-for-mile, more data on drivers than any other agency surveyed for a new EFF data set. 

Today, EFF is releasing Data Driven 2: California Dragnet, a new public records collection and data set that shines light on the massive amount of vehicle surveillance conducted by police in California using automated license plate readers (ALPRs)—and how very little of this surveillance is actually relevant to an active public safety interest. 

Download the Data Driven 2: California Dragnet data set.

In 2019 alone, just 82 agencies collected more than 1 billion license plate scans using ALPRs. Yet, 99.9% of this surveillance data was not actively related to an investigation when it was collected. Nevertheless, law enforcement agencies stockpile this data, often for years, and often share the data with hundreds of agencies around the country.  

This means that law enforcement agencies have built massive databases that document our travel patterns, regardless of whether we're under suspicion. With a few keystrokes, a police officer can generate a list of places a vehicle has been seen, with few safeguards and little oversight.  

EFF's dataset also shows for the first time how some jurisdictions—such as Tiburon and  Sausalito in Northern California, and Beverly Hills and Laguna Beach in Southern California—are scanning drivers at a rate far higher than the statewide mean. In each of those cities, an average vehicle will be scanned by ALPRs every few miles it drives.

Tiburon first installed Vigilant Solutions ALPRs at the town's entrance and exit points and downtown about a decade ago. Today, with just six cameras, the system has grown into a massive surveillance program: on average, a vehicle will be scanned by cops once for every 1.85 miles it drives.

Tiburon Police stockpile about 7.7 million license plate scans annually, and yet only 0.01% (1 in 10,000) of those records were related to a crime or other public safety interest when they were collected. The data is retained for a year.

ALPRs are a form of location surveillance: the data they collect can reveal our travel patterns and daily routines, the places we visit, and the people with whom we associate. In addition to the civil liberties threat, these data systems also create great security risks, with multiple known breaches of ALPR data and technology occurring over the last few years. 

EFF sought comment from Tiburon Police Chief Ryan Monaghan, who defended the program via email. "Since the deployment of the ALPRs, our crime data from the five years prior to having the ALPRs as well as the five years after and beyond have shown marked reductions in stolen vehicles and thefts from vehicles and an increase in the recovery of stolen vehicles," he wrote.  

EFF’s public records data set, which builds on a 2016-2017 survey (Data Driven 1), aims to provide journalists, policymakers, researchers, and local residents with data to independently evaluate and understand the state of California's ALPR dragnet. 

What Are Automated License Plate Readers?

A fixed ALPR and a mobile ALPR. Credit: Mike Katz-Lacabe (CC BY)

ALPRs are cameras that snap photographs of license plates and then upload the plate numbers, time/date, and GPS coordinates to a searchable database. This allows police to identify and track vehicles in real time, search the historical travel patterns of any vehicle, and identify vehicles that have been spotted near certain locations.

Cops attach these cameras to fixed locations, like highway overpasses or traffic lights. Law enforcement agencies also install ALPRs on patrol vehicles, allowing police to capture data on whole neighborhoods by driving block-by-block, a tactic known as "gridding." In 2020, the California State Auditor issued a report that found that agencies were collecting large amounts of data without following state law and without addressing some of the most basic cybersecurity and civil liberties concerns when it comes to Californians' data. 

About the Data Set

The Data Driven 2: California Dragnet data set is downloadable as an Excel (.xlsx) file, with the data analysis broken into various tabs. We have also presented selections from the data as a table below. 

The dataset is based on dozens of California Public Records Act requests filed by EFF and students at the Reynolds School of Journalism at the University of Nevada, Reno in collaboration with MuckRock News. Data Driven 2 is a sequel to EFF and MuckRock's 2018 Data Driven report.

To create the data set, we filed more than 100 requests for information under the California Public Records Act. We sought the following records from each agency: 

  1. The number of license plate scans captured by ALPRs per year in 2018 and 2019. These are also called "detections." 
  2. The number of scanned license plates each year for 2018 and 2019 that matched a "hot list," the term of art for a list of vehicles of interest. These matches are called "hits." 
  3. The list of other agencies that the law enforcement agency is exchanging data with, including both ALPR scans and hot lists. 

Most of these public records requests were filed in 2020. For a limited number of requests filed in 2021, we also requested detection and hit information for 2020. The agencies were selected because they had previously provided records for the original report or were known to use an ALPR system that could export the type of aggregate data required for this analysis. Not all agencies provided records in response to our requests. 

The spreadsheet includes links to public records for each agency, along with a table of their statistics. In addition, we have included "Daily Vehicle Miles Travelled" from the California Department of Transportation for help in comparing jurisdictions. 

The dataset covers 89 agencies from all corners of the state. However, the data was not always presented by the agencies in a uniform manner. Only 63 agencies provided comprehensive and separated data for both 2018 and 2019. Other agencies either produced data for incomparable time periods or provided incomplete or clearly erroneous data. In some cases, agencies did not have ALPRs for the full period, having either started or ended their programs mid-year.  (Note: More than 250 agencies use ALPR in California).

In general, our analysis below only includes agencies that provided both 2018 and 2019 data, which we then averaged together. However, we are including all the data we received in the spreadsheet.

Hit Ratio: Most ALPR Data Involves Vehicles Not Under Active Suspicion 

One way to examine ALPR data is to ask whether the data collected is relevant to an active investigation or other public safety interest at the time it is collected. 

Law enforcement agencies create "hot lists" of license plates, essentially lists of vehicles they are actively looking for, for example, because they're stolen, are suspected of being connected to a crime, or belong to an individual under state supervision, such as a sex offender. When an ALPR scans a license plate that matches a hot list, the system issues an alert to the law enforcement agency that the vehicle was sighted. 

Data that is not on a hot list is still stored, often for more than a year depending on the agency's policy, despite a lack of relevance to an active public safety interest. Police have argued they need this data in case one day you commit a crime, at which point they can look back at your historical travel patterns. EFF and other privacy organizations argue that this is a fundamental violation of the privacy of millions of innocent drivers, as well as an enormous cybersecurity risk.
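The "hit ratio" arithmetic used throughout this report is simple to reproduce. A minimal Python sketch, using the aggregate figures reported in this article (the hit count here is implied by the reported 0.05% rate, not a number agencies reported directly):

```python
def hit_ratio(hits: int, detections: int) -> float:
    """Percentage of ALPR detections that matched a hot list."""
    return 100 * hits / detections

# Aggregate figures from this article: roughly 840 million scans per year
# across 63 agencies, of which about 0.05% matched a hot list.
detections_per_year = 840_000_000
hits_per_year = 420_000  # implied by the 0.05% hit rate, for illustration

print(f"{hit_ratio(hits_per_year, detections_per_year):.2f}%")  # prints "0.05%"
```

Flipping the ratio around gives the share of data with no active public safety relevance when collected: 100% minus the hit ratio.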

The 63 agencies that provided us 2018-2019 data collected a combined average of 840,000,000 plate images each year. However, only 0.05% of the data matched a hot list.

Some agencies only provided us data for three months or one year. Other agencies lumped all the data together. While we have left them out of this analysis, their hit ratios closely followed what we saw in the agencies that provided us consistent data. 

The top 15 data-collecting law enforcement agencies accounted for 1.4 billion license plate scans over two years. On average, those 15 law enforcement agencies reported that only 0.05% of the data was on a hot list.

Agency/Records Link | 2018-2019 License Plate Scans | Hit Ratio (Percentage on a Hot List)
San Bernardino County Sheriff's Office | 439,272,149 | 0.05%
Carlsbad Police Department | 161,862,285 | 0.02%
Sacramento Police Department | 142,170,129 | 0.08%
Torrance Police Department | 132,904,262 | 0.04%
Chino Police Department | 83,164,449 | 0.05%
Beverly Hills Police Department | 67,520,532 | 0.03%
Fontana Police Department | 66,255,835 | 0.06%
Contra Costa County Sheriff's Office | 65,632,313 | 0.11%
Claremont Police Department (Link 2) | 45,253,735 | 0.04%
Long Beach Police Department | 44,719,586 | 0.09%
Livermore Police Department | 39,430,629 | 0.04%
Laguna Beach Police Department | 37,859,124 | 0.04%
Pleasant Hill Police Department | 27,293,610 | 0.03%
Merced Police Department | 25,895,158 | 0.04%
Brentwood Police Department | 25,440,363 | 0.07%

In his written response to our finding that Tiburon Police store data for a year, even though 99.9% of the data was not tied to an alert, Chief Monaghan wrote: "Our retention cycle for the scan data is in line with industry-wide standards for ALPRs and does not contain any personal identifying information. Like other agencies that deploy ALPRs, we retain the information to use for investigative purposes only as many crimes are reported after the fact and license plate data captured by the ALPRs can be used as an investigative tool after the fact." 

Monaghan's response contains a few misconceptions worth dispelling. First, while many agencies do store data for one year or more, there is no industry standard for ALPR retention. For example, Flock Safety, a vendor that provides ALPR to many California agencies, deletes data after 30 days. The California Highway Patrol is only allowed to hang onto data for 60 days. According to the National Conference of State Legislatures, Maine has a 21-day retention period and Arkansas a 150-day retention period. In New Hampshire, the law requires deletion after three minutes if the data is not connected to a crime.

Finally, there is a certain irony when law enforcement claims that ALPR data is not personally identifying information, when one of the primary purposes of the data is to assist in identifying suspects. In fact, California's data breach laws explicitly name ALPR as a form of personal information when it is combined with a person's name. It is very easy for law enforcement to connect ALPR data to other data sets, such as a vehicle registration database, to determine the identity of the owner of the vehicle. In addition, ALPR systems also store photographs, which can potentially capture images of drivers' faces.

Indeed, Tiburon's own ALPR policy says that raw ALPR data cannot be released publicly because it may contain confidential information. This is consistent with a California Supreme Court decision that found that the Los Angeles Police Department and Los Angeles County Sheriff's Department could not release unredacted ALPR data in response to a CPRA request because "the act of revealing the data would itself jeopardize the privacy of everyone associated with a scanned plate. Given that real parties each conduct more than one million scans per week, this threat to privacy is significant." The Supreme Court agreed with a lower court ruling that "ALPR data showing where a person was at a certain time could potentially reveal where that person lives, works, or frequently visits. ALPR data could also be used to identify people whom the police frequently encounter, such as witnesses or suspects under investigation."

Scans Per Mile: Comparing the Rate of ALPR Surveillance 

While the agencies listed in the previous section are each collecting a massive amount of data, it can be difficult to interpret how  law enforcement agencies' practices compare to one another. Obviously, some cities are bigger than others, and so we sought to establish a way to measure the proportionality of the ALPR data collected. 

One way to do this is to compare the number of license plate scans to the size of the jurisdiction's population. However, this method may not provide the clearest picture, since many commuters, tourists, and rideshare drivers cross city lines many times a day. In addition, the larger and denser a population is, the fewer people may own vehicles, with more relying on public transit instead. So we ruled out that method.

Another method is to compare the license plate scans to the number of vehicles registered or owned in a city. That runs into a similar problem: people don't always work in the same city where their car is registered, particularly in the San Francisco Bay Area and Los Angeles County. 

So, we looked for some metric that would allow us to compare the number of license plate scans to how much road traffic there is in a city. 

Fortunately, for each city and county in the state, the California Department of Transportation compiles annual data on "Vehicle Miles Traveled" (VMT), a number representing the average number of miles driven by vehicles on roads in the jurisdiction each day. VMT is becoming a standard metric in urban and transportation planning.

By comparing license plate scans to VMT, we can begin to address the question: are some cities disproportionately collecting more data than others? The answer is yes: many cities are collecting data at a far higher rate than others.

There are a few different ways to interpret the rate. For example, in Tiburon, police are collecting on average one license plate scan for every 1.85 miles driven. That means that the average vehicle will be captured every 1.85 miles. To put that in perspective, a driver from outside of town who commutes to and from downtown Tiburon (about four miles each way to the town limits), five days a week, 52 weeks a year, should expect their license plate to be scanned on average 1,124 times annually.  

Another way to look at it is that for every 100 cars that drive one mile, Tiburon ALPRs on average will scan 54 license plates. 
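Those estimates follow directly from the scans-per-mile rate. A minimal sketch, using the rounded 1.85 miles-per-scan figure (so the results differ slightly from the table's values, which are computed from unrounded data):

```python
# Tiburon's average rate: one scan per 1.85 vehicle-miles (rounded).
miles_per_scan = 1.85

# A commuter driving about 4 miles each way, 5 days a week, 52 weeks a year:
annual_miles = 4 * 2 * 5 * 52           # 2,080 miles
annual_scans = annual_miles / miles_per_scan
print(round(annual_scans))              # prints 1124

# Equivalently, scans per 100 vehicle-miles driven:
print(round(100 / miles_per_scan, 1))   # prints 54.1
```

The same two lines of arithmetic reproduce the Sausalito, Laguna Beach, and Beverly Hills rates quoted below by swapping in their miles-per-scan figures.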

Via email, Tiburon Chief Monaghan responded: "In terms of the number of scans compared to VMT, our ALPRS are strategically placed on two main arterial roadways leading in and out of Tiburon, Belvedere, and incorporated sections of Tiburon. As Tiburon is located on a peninsula, there are limited roadways in and out. The roadways where the ALPRs are deployed are not only used by those who live in the area, but also by commuters who use the ferries that operate out of Tiburon, those who work in Tiburon and Belvedere, parents taking their kids to schools in the area, and those visiting."

Tiburon isn't the only agency collecting data at a high rate. 

  • In Sausalito, on average police are capturing a plate for every 2.14 miles a car drives. That's the equivalent of 46 scans per 100 vehicles that drive one mile. 
  • In Laguna Beach, it's one ALPR scan for every three miles. On average, ALPRs will scan 33 plates for every 100 cars that drive a single mile. 
  • In Beverly Hills, it's one ALPR scan for every 4.63 miles, or 21 scans per 100 cars that drive one mile. 

In comparison: Across 60 cities, police collectively scanned on average one plate for every 48 miles driven by vehicles. Tiburon scanned license plates at more than 25 times that rate. 

For this analysis, we only included municipal police that provided data for both 2018 and 2019. We then used those figures to find an average number of daily plates scanned, which we then compared to the cities' average daily VMTs for 2018-2019. 
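That computation can be sketched in a few lines. The scan total below is Tiburon's combined 2018-2019 figure from our records; the daily VMT value is illustrative only (the real figure comes from the Caltrans data included in the spreadsheet):

```python
# Two-year scan total for Tiburon (2018 + 2019 combined), from our records.
scans_2018_2019 = 15_424_890
avg_daily_scans = scans_2018_2019 / 2 / 365

# Hypothetical average daily Vehicle Miles Traveled for the jurisdiction;
# the actual value comes from Caltrans data, not this sketch.
avg_daily_vmt = 39_000

miles_per_scan = avg_daily_vmt / avg_daily_scans
print(round(miles_per_scan, 2))  # ≈ 1.85 with this illustrative VMT
```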

Here are the top 15 municipal police departments, ranked by scans per mile.

Agency/Records Link | Average Number of Vehicle Miles Traveled Per Scan (2018-2019) | Average Number of Scans Per 100 Vehicle Miles Traveled (2018-2019)
Tiburon Police Department (Link 2) | 1.85 miles | 54.11 scans
Sausalito Police Department (Link 2) | 2.14 miles | 46.65 scans
Laguna Beach Police Department | 3.02 miles | 33.12 scans
Beverly Hills Police Department | 4.63 miles | 21.60 scans
Claremont Police Department (Link 2) | 5.99 miles | 16.69 scans
La Verne Police Department | 7.91 miles | 12.64 scans
Carlsbad Police Department | 8.19 miles | 12.21 scans
Chino Police Department | 8.46 miles | 11.83 scans
Torrance Police Department | 9.74 miles | 10.27 scans
Clayton Police Department | 10.22 miles | 9.79 scans
Pleasant Hill Police Department | 10.93 miles | 9.15 scans
Oakley Police Department | 10.97 miles | 9.12 scans
Brentwood Police Department | 11.42 miles | 8.76 scans
Martinez Police Department | 13.59 miles | 7.36 scans

A few caveats about this analysis: 

We must emphasize the term “average.” Road traffic is not even across every street, nor are ALPRs distributed evenly across a city or county. A driver who only drives a half mile along backroads each day may never be scanned. Or if a city installs ALPRs at every entrance and exit to town, as Tiburon has, a driver who commutes to and from the city every day would likely be scanned at a much higher rate.

In addition, many police departments attach ALPRs to their patrol cars. This means they are capturing data on parked cars they pass. Your risk of being scanned by an ALPR does not increase linearly with driving—someone who leaves their car parked all year may still be scanned several times. In many jurisdictions, both the city police and the county sheriff use ALPRs; our data analysis does not cover overlapping data collection. Our intention is not to help drivers determine exactly how often they’ve been scanned but to compare the volume of data collection across municipalities of different sizes. 

Finally, VMT is not an exact measurement but rather an estimate that is based on measuring roadway traffic on major arteries and projecting across the total number of miles of road in the city. 

As such, this ratio presents a useful metric to gauge proportionality broadly and should be interpreted as an estimate or a projection. 

What to Do With This Data

The Data Driven 2 dataset is designed to give a bird’s eye view of the size and scope of data collection through ALPRs. 

However, it does not dive into more granular variations in ALPR programs between agencies, such as the number or type of ALPR cameras used by each agency. For example, some agencies might use 50 stationary cameras, while others may use three mobile cameras on patrol cars and still others use a combination of both. Some agencies may distribute data collection evenly across a city, and others may target particular neighborhoods. 

In many cases, agencies provided us with "Data Sharing Reports" that list everyone with whom they are sharing data. This can be useful for ascertaining whether agencies are sharing data broadly with agencies outside of California or even with federal agencies, such as immigration enforcement. Please note that the data sharing list does change from day-to-day as agencies join and leave information exchange networks. 

Journalists and researchers can and should use our dataset as a jumping-off point to probe deeper.  By filing public records requests or posing questions directly to police leaders, we can find out how agencies are deploying ALPRs and how that impacts the amount of data collected, the usefulness of the data, and the proportionality of the data. Under a California Supreme Court ruling, requesters may also be able to obtain anonymized data on where ALPR data was collected.

Conclusion 

Law enforcement often tries to argue that using ALPR technology to scan license plates is no different than a police officer on a stakeout outside a suspected criminal enterprise who writes down the license plates of every car that pulls up. 

But let's say you lived in a place like Brentwood, where police collect on average 25 million license plate scans a year.

Let's assume that a police observer is able to snap a photo and scribble down the plate number, make, model, and color of a vehicle once every minute (which is still pretty superhuman).  Brentwood would have to add 200 full-time employees to collect as much data manually as they can with their ALPRs.  By comparison: The Brentwood Police Department currently has 71 officers, 36 civilian support workers, and 20 volunteers. The entire city of Brentwood has only 283 employees. 
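The staffing estimate works out like this, assuming a standard 2,080-hour work year (40 hours × 52 weeks, an assumption of ours; the article doesn't state one):

```python
scans_per_year = 25_000_000   # Brentwood's approximate annual collection
scans_per_minute = 1          # the generous "superhuman" manual rate

# Total person-hours needed to log every plate by hand.
hours_needed = scans_per_year / (scans_per_minute * 60)

full_time_hours = 40 * 52     # 2,080 hours per employee per year
employees = hours_needed / full_time_hours
print(round(employees))       # prints 200
```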

If 200 human data recorders were positioned throughout a city, spying on civilians, it's unlikely people would stand for it. This report illustrates how that level of surveillance is indeed occurring in many cities across California. Just because the cameras are more subtle doesn't make them any less creepy or authoritarian.

2018-2019 Detections with Hit Ratio  

Agency/Records Link | 2018-2019 Detections Combined | Hit Ratio (Percentage on a Hot List)
Antioch Police Department | 22,524,415 | 0.10%
Auburn Police Department | 3,190,715 | 0.03%
Bakersfield Police Department | 370,635 | 0.11%
Bell Gardens Police Department | 9,476,932 | 0.02%
Belvedere Police Department | 4,089,986 | 0.01%
Beverly Hills Police Department | 67,520,532 | 0.03%
Brawley Police Department | 870,011 | 0.03%
Brentwood Police Department | 25,440,363 | 0.07%
Buena Park Police Department | 854,156 | 0.04%
Carlsbad Police Department | 161,862,285 | 0.02%
Cathedral City Police Department | 104,083 | 0.06%
Chino Police Department | 83,164,449 | 0.05%
Chula Vista Police Department | 672,599 | 0.03%
Citrus Heights Police Department | 18,804,058 | 0.04%
Claremont Police Department (Link 2) | 45,253,735 | 0.04%
Clayton Police Department | 9,485,976 | 0.02%
Contra Costa County Sheriff's Office | 65,632,313 | 0.11%
Cypress Police Department | 288,270 | 0.04%
Emeryville Police Department | 1,579,100 | 0.05%
Fairfield Police Department | 785,560 | 0.06%
Folsom Police Department | 14,624,819 | 0.03%
Fontana Police Department | 66,255,835 | 0.06%
Fresno Police Department | 3,673,958 | 0.15%
Fullerton Police Department | 742,996 | 0.05%
Galt Police Department | 23,478 | 0.02%
Garden Grove Police Department | 332,373 | 0.24%
Gardena Police Department | 5,762,032 | 0.05%
Imperial Police Department | 23,294,978 | 0.03%
Irvine Police Department | 651,578 | 0.05%
La Habra Police Department | 888,136 | 0.05%
La Mesa Police Department | 1,437,309 | 0.05%
La Verne Police Department | 24,194,256 | 0.03%
Laguna Beach Police Department | 37,859,124 | 0.04%
Livermore Police Department | 39,430,629 | 0.04%
Lodi Police Department | 3,075,433 | 0.05%
Long Beach Police Department | 44,719,586 | 0.09%
Marin County Sheriff's Office | 1,547,154 | 0.04%
Merced Police Department | 25,895,158 | 0.04%
Mill Valley Police Department | 529,157 | 0.12%
Monterey Park Police Department | 2,285,029 | 0.04%
Newport Beach Police Department (Link 2) | 772,990 | 0.04%
Orange County Sheriff's Office | 2,575,993 | 0.09%
Palos Verdes Estates Police | 16,808,440 | 0.03%
Pasadena Police Department | 3,256,725 | 0.03%
Pleasant Hill Police Department | 27,293,610 | 0.03%
Pomona Police Department | 11,424,065 | 0.10%
Redondo Beach Police Department (Link 2) | 18,436,371 | 0.04%
Sacramento Police Department | 142,170,129 | 0.08%
San Bernardino County Sheriff's Office | 439,272,149 | 0.05%
San Diego County Sheriff's Office | 13,542,616 | 0.04%
San Diego Police Department | 138,146 | 0.07%
San Mateo County Sheriff's Office (Link 2) | 4,663,684 | 0.02%
Sausalito Police Department (Link 2) | 15,387,157 | 0.02%
Simi Valley Police Department | 480,554 | 0.11%
Stanislaus County Sheriff's Office | 6,745,542 | 0.07%
Stockton Police Department | 1,021,433 | 0.09%
Tiburon Police Department (Link 2) | 15,424,890 | 0.01%
Torrance Police Department | 132,904,262 | 0.04%
Tracy Police Department | 1,006,393 | 0.06%
Tustin Police Department (Link 2) | 1,030,106 | 0.04%
West Sacramento Police Department | 2,337,027 | 0.05%
Westminster Police Department | 1,271,147 | 0.05%
Yolo County Sheriff's Office | 3,049,884 | 0.02%

Irregular Agencies

These agencies responded to our records requests, but did not provide complete, reliable or directly comparable information.  

Agency/Records Link | Detection Years Used in Hit Ratio | Detections | Hit Ratio (Percentage on a Hot List)
American Canyon Police Department | 2018 | 394,827 | 0.18%
Beaumont Police Department | 2019 | 83,141 | 0.10%
Bell Police Department | 2018 | 806,327 | Data Not Available
Burbank Police Department | 2020 | 364,394 | 0.05%
Coronado Police Department | 2019 | 616,573 | 0.07%
CSU Fullerton Police Department | 2019 | 127,269 | 0.05%
Desert Hot Springs Police Department | Data Not Available | Data Not Available | Data Not Available
El Segundo Police Department | 2020 | 24,797,764 | 0.07%
Fountain Valley Police Department | 2018-2020 | 780,940 | Data Not Available
Glendale Police Department | 2019 | 119,356 | 0.04%
Hemet Police Department | 2019 | 84,087 | 0.10%
Hermosa Beach Police Department | 2019 | 274,577 | 0.51%
Martinez Police Department | 2019 | 12,990,796 | Data Not Available
Modesto Police Department | 2019 | 10,262,235 | 0.06%
Oakley Police Department | 2019 | 8,057,003 | Data Not Available
Ontario Police Department | Jan 19 - Feb 17, 2021 | 2,957,671 | 0.07%
Orange Police Department | 2018 | 387,592 | 0.06%
Palm Springs Police Department | 2019 | 58,482 | 0.29%
Redlands Police Department | 2019 | 4,027,149 | 0.08%
Ripon Police Department | 2019 | 2,623,741 | 0.05%
Roseville Police Department | 2019 | 3,733,042 | 0.03%
San Joaquin County Sheriff's Office | 2018-2020 | 155,105 | 0.04%
San Jose Police Department | 2020 | 1,686,836 | 0.09%
Seal Beach Police Department | 2018 | 38,247 | 0.49%
Woodland Police Department | 2019 | 1,382,297 | 0.05%

Dave Maass

No Digital Vaccine Bouncers

2 weeks 5 days ago

The U.S. is distributing more vaccines, and the population is gradually becoming vaccinated. Returning to everyday activity and movement has become the main focus for many Americans who want to travel or see family.

An increasingly common proposal to get there is digital proof-of-vaccination, sometimes called “Vaccine Passports.” On the surface, this may seem like a reasonable solution. But to “return to normal”, we also have to consider that inequity and problems with access are a part of that normal. Also, these proposals require a new infrastructure and culture of doorkeepers to public places regularly requiring visitors to display a token as a condition of entry. This would be a giant step towards pervasive tracking of our day-to-day movements. And these systems would create new ways for corporations to monetize our data and for thieves to steal our data.

That’s why EFF opposes new systems of digital proof-of-vaccination as a condition of going about our day-to-day lives. They’re not “vaccine passports” that will speed our way back to normal. They’re “vaccine bouncers” that will unnecessarily scrutinize us at doorways and unfairly turn many of us away.

What Are Vaccine Bouncers?

So-called “vaccine passports” are digital credentials pitched as a convenient, accessible way to store and present your medical data, in this case proof that you have been vaccinated. They are not actual passports for international travel, nor are they directly related to the systems we already have in place to prove vaccination. Though proposals vary in their details, all of them would put medical data on routine display in a way that is not typical for our society as a whole.

These schemes require the creation of a vast new electronic gatekeeping system. People will need to download a token to their phone, or in some cases may print that token and carry it with them. Public places will need to acquire devices that can read these tokens. To enter public places, people will need to display their token to a doorkeeper. Many people will be bounced away at the door, because they are not vaccinated, or they left their phone at home, or the system is malfunctioning. This new infrastructure and culture will be difficult to dismantle when we reach herd immunity.
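To make concrete the infrastructure described above (a signed token carried on a phone, a reader device at the door), here is a minimal, purely illustrative sketch. Nothing in it reflects any real proposal: the issuer key, the field names, and the use of an HMAC are all assumptions chosen to keep the example self-contained; a real scheme would use public-key signatures and QR encoding.

```python
import base64
import hashlib
import hmac
import json

# Assumption: a single trusted issuer with a shared secret. Real systems
# would use asymmetric signatures so venues never hold a signing key.
ISSUER_KEY = b"demo-issuer-secret"

def issue_token(record: dict) -> str:
    """Issuer encodes a health record and appends a signature."""
    payload = base64.urlsafe_b64encode(json.dumps(record).encode()).decode()
    sig = hmac.new(ISSUER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def verify_at_door(token: str) -> bool:
    """Doorkeeper device: check the signature, then the vaccination flag."""
    try:
        payload, sig = token.split(".")
    except ValueError:
        return False  # malformed token
    expected = hmac.new(ISSUER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # forged or corrupted token
    record = json.loads(base64.urlsafe_b64decode(payload))
    return record.get("vaccinated") is True

token = issue_token({"name": "Alice", "vaccinated": True})
assert verify_at_door(token)                 # valid token admitted
assert not verify_at_door(token + "x")       # tampered token bounced
assert not verify_at_door("not-a-token")     # malformed token bounced
```

Note how little stands between this and the "panopticon" risk: one extra line inside `verify_at_door` logging `record` and a timestamp would turn a yes/no check into a movement database.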

We already have vaccination documents we need to obtain for international travel to certain countries. But even the World Health Organization (W.H.O.), the entity that issues Yellow Cards to determine if one has had a Yellow Fever vaccine, has come out against vaccine passports.

Requiring people to present their medical data to buy groceries, access public services, and carry out other vital activities raises the question of who will ultimately be barred at the door. A large number of people, not only in the U.S. but worldwide, do not have access to any COVID vaccines. Many others do not have access to mobile phones, or even to the printers required to create the paper QR code that is sometimes suggested as the supposed work-around.

Many of these solutions will also be built by private companies offering smartphone applications. That means they will give rise to new databases of information not protected by any privacy law, with data transmitted daily rather than as a one-time paper proof-of-vaccination submitted to a school. Since we have no adequate federal data privacy law, we are relying on the pinky-promises of private companies to keep our data private and secure.

We’ve already seen mission creep with digital bouncer systems. Years ago, some bars deployed devices that scanned patrons’ identification as a condition of entry. The rationale was to quickly ascertain, and then forget, a narrow fact about patrons: whether they are old enough to buy alcohol, and thus enter the premises. Then these devices started to also collect information from patrons, which bars share with each other. Thus, we are not comforted when we hear people today say: “don’t worry, digital vaccine bouncers will only check whether a person was vaccinated, and will not also collect information about them.” Once the infrastructure is built, it requires just a few lines of code to turn digital bouncers into digital panopticons.

Temporary Measures with Long Term Consequences

When we get to an approximation of normal, what is the plan for vaccine passports? Most proposals are not clear on this point. What will become of that medical data? Will there be a push for making this a permanent part of life?

As with any massive new technological system, it will take significant time and great effort to make the system work. We’ve already seen how easy it is to evade New York’s new vaccine bouncer system, and how other digital COVID systems, due to their flaws, fail to advance public health. Even with the best efforts, by the time the bugs are worked out of a new digital vaccine system for COVID, it may not be helpful to combat the pandemic. There’s no need to rush into building a system that will only provide value to the companies that profit by building it.

Instead, our scarce resources should go to getting more people vaccinated. We are all in this together, so we should be opening up avenues of access for everyone to a better future in this pandemic. We should not be creating more issues, concerns, and barriers with experimental technology that needs to be worked out during one of the most devastating modern global crises of our time.

Alexis Hancock

EFF Sues Proctorio on Behalf of Student It Falsely Accused of Copyright Infringement to Get Critical Tweets Taken Down

2 weeks 6 days ago
Links to Software Code Excerpts in Tweets Are Fair Use

Phoenix, Arizona—The Electronic Frontier Foundation (EFF) filed a lawsuit today against Proctorio Inc. on behalf of college student Erik Johnson, seeking a judgment that he didn’t infringe the company’s copyrights when he linked to excerpts of its software code in tweets criticizing the software maker.

Proctorio, a developer of exam administration and surveillance software, misused the copyright takedown provisions of the Digital Millennium Copyright Act (DMCA) to have Twitter remove posts by Johnson, a Miami University computer engineering undergraduate and security researcher. EFF and co-counsel Osborn Maledon said in a complaint filed today in U.S. District Court, District of Arizona, that Johnson made fair use of excerpts of Proctorio’s software code, and the company’s false claims of infringement interfered with Johnson’s First Amendment right to criticize the company.

“Software companies don’t get to abuse copyright law to undermine their critics,” said EFF Staff Attorney Cara Gagliano. “Using pieces of code to explain your research or support critical commentary is no different from quoting a book in a book review.”

Proctoring apps like Proctorio’s are privacy-invasive software that “watches” students through eye-tracking and face detection for supposed signs of cheating as they take tests or complete schoolwork. The use of these “disciplinary technology” programs has skyrocketed amid the pandemic, raising questions about the extent to which they threaten student privacy and disadvantage students without access to high-speed internet and quiet spaces.

Proctorio has responded to public criticism by attacking people who speak out. The company’s CEO released on Reddit the contents of a student’s chat log, captured by Proctorio, after the student posted complaints about the software on the social network. The company has also sued a remote learning specialist in Canada for posting links to Proctorio’s publicly available YouTube videos in a series of tweets showing how the software tracks “abnormal” eye and head movements it deems suspicious.

Concerned about how much private information Proctorio collects from students’ computers, Johnson, whose instructors have given tests using Proctorio, examined the company’s software, including the files that are downloaded to any computer where the software is installed.

He published a series of tweets in September critiquing Proctorio, linking in three of those tweets to short software code excerpts that demonstrate the extent of the software’s tracking and access to users’ computers. In another tweet, Johnson included a screenshot of a video illustrating how the software is able to create a 360-degree image of students’ rooms that is accessible to teachers and seemingly Proctorio’s agents.

“Copyright holders should be held liable when they falsely accuse their critics of copyright infringement, especially when the goal is plainly to intimidate and undermine them,” said Gagliano. “We’re asking the court for a declaratory judgment that there is no infringement to prevent further legal threats and takedown attempts against Johnson for using code excerpts and screenshots to support his comments.”

For the complaint:
https://www.eff.org/document/johnson-v-proctorio-complaint

For more on proctoring surveillance:
https://www.eff.org/deeplinks/2020/08/proctoring-apps-subject-students-unnecessary-surveillance

Contact: Cara Gagliano, Staff Attorney, cara@eff.org
Karen Gullo