Tell Congress: Support the Fourth Amendment Is Not For Sale Act

1 month 1 week ago

Every day, your personal information is harvested by your smartphone applications, sold to data brokers, and used by advertisers hoping to sell you things. But what safeguards prevent the government from shopping in that same data marketplace? Mobile data that is regularly bought and sold, like your geolocation, is information that law enforcement or intelligence agencies would normally need a warrant to acquire. But data brokers don't ask for a warrant. The U.S. government has been using purchases of this information as a loophole for acquiring personal information on individuals without a warrant. Now is the time to close that loophole.

EFF is launching a campaign in support of the Fourth Amendment Is Not For Sale Act (H.R. 2738 and S. 1265). This legislation would prevent the government from purchasing information it would otherwise need a warrant to acquire. Tell your senators and representatives that this bill must be passed!

TAKE ACTION

TELL CONGRESS: THE FOURTH AMENDMENT IS NOT FOR SALE

We first wrote about the need for legislation like this in December 2020, after a troubling article in Motherboard. It reported that a Muslim prayer app (Muslim Pro), a Muslim dating app (Muslim Mingle), and many other popular apps had been selling geolocation data about their users to a company called X-Mode, which in turn provided this data to the U.S. military through defense contractors.

This violates the First and Fourth Amendments. Just because your phone apps know where you are does not mean the government should, too. The invasive marketplace for your data needs to be tamed by privacy legislation, not used by the government as an end-run around the warrant requirement. The Supreme Court has decided that our detailed location data is so revealing about our activities and associations that law enforcement must get a warrant in order to acquire it.

Government purchase of location data also threatens to chill people’s willingness to participate in protests in public places, associate with who they want, or practice their religion. History and legal precedent teach us that when the government indiscriminately collects records of First Amendment activities, it can lead to retaliation or further surveillance.

TAKE ACTION

TELL CONGRESS: THE FOURTH AMENDMENT IS NOT FOR SALE

You can read the full text of the bill below:

Matthew Guariglia

Brazil's Bill Repealing National Security Law Has its Own Threats to Free Expression

1 month 1 week ago

The Brazilian Chamber of Deputies is on track to approve a bill that threatens freedom of expression and the rights to assemble and protest, all with the stated aim of defending the democratic constitutional state. Bill 6764/02 repeals the Brazilian National Security Law (Lei de Segurança Nacional), one of the ominous legacies of the country's dictatorship, which lasted until 1985. Although there is broad consensus about the harm the National Security Law represents, Brazilian civil groups have stressed that replacing it with a new act, without careful discussion of its grounds, principles, and specific rules, risks rebuilding a framework that serves repressive rather than democratic ends.

The Brazilian National Security Law has a track record of abuses in persecuting and silencing dissent, with vague criminal offenses and provisions targeting speech. After a relatively dormant period, it gained new prominence during President Bolsonaro's administration. It has served as a legal basis for accusations against opposition leaders, critics, journalists, and even a congressman aligned with Bolsonaro in the country's current turbulent political landscape.

However, its proposed replacement, Bill 6764/02, raises various concerns, some particularly unsettling for digital rights. Even with alternative drafts trying to untangle them, problems remain.

First, the bill's espionage offense defines the handover of secret documents to foreign governments as a crime. It is crucial that this and related offenses not apply to acts that would raise serious human rights concerns: whistleblowers revealing facts or acts that could imply the violation of human rights, crimes committed by government officials, or other serious wrongdoing affecting public administration; or journalistic and investigative reporting, and the work of civil groups and activists, that brings to light governments' unlawful practices and abuses. These acts should be clearly exempted from the offense. Amendments under discussion seek to address these concerns, but there is no assurance they will prevail in the final text if the new law is approved.

The IACHR's Freedom of Expression Rapporteur has highlighted how often governments in Latin America classify information on national security grounds without proper assessment and substantiation. The report provides a number of examples from the region of the hurdles this creates for accessing information related to human rights violations and government surveillance. The IACHR Rapporteur stresses the key role of investigative journalists, the protection of their sources, and the need to grant legal protection against reprisal to whistleblowers who expose human rights violations and other wrongdoing. This aligns with the UN Freedom of Expression Rapporteur's previous recommendations and reinforces the close relationship between democracy and strong safeguards for those who take a stand by unveiling sensitive public-interest information. As the UN High Commissioner for Human Rights has already pointed out:

The right to privacy, the right to access to information and freedom of expression are closely linked. The public has the democratic right to take part in the public affairs and this right cannot be effectively exercised by solely relying on authorized information.

Second, the proposal also aims to tackle "fake news" by making "mass misleading communication" a crime against democratic institutions. Although the bill should be strictly tailored to counter exceptionally serious threats, bringing disinformation into its scope potentially targets millions of Internet users. Disseminating "facts the person knows to be untrue" that could put at risk "the health of the electoral process" or "the free exercise of constitutional powers," using "means not provided by the private messaging application," could lead to up to five years in jail.

We agree with the digital rights groups on the ground that have stressed the provision's harmful implications for users' freedom of expression. Criminalizing the spread of disinformation is full of traps. It criminalizes speech by relying on vague terms (as in this bill) that are easily twisted to stifle critical voices and those challenging entrenched political power. Joint declarations of the Freedom of Expression Rapporteurs have repeatedly urged States not to take that road.

Moreover, the provision applies when such messages are distributed using "means not provided by the application." Presuming that the use of such means is inherently malicious poses a major threat to interoperability. The technical ability to plug one product or service into another, even when one service provider hasn't authorized that use, has been a key driver of competition and innovation. And dominant companies repeatedly abuse legal protections to ward off and punish competitors.

This is not to say we do not care about the malicious spread of disinformation at scale. But it should not be part of this bill, given the bill's specific scope, nor should it be addressed without careful attention to unintended consequences. There is an ongoing debate, and other avenues to pursue that align with fundamental rights and rely on joint efforts from the public and private sectors.

Political pressure has hastened the bill's vote. Bill 6764/02 may pass in a few days in the Chamber of Deputies, after which it would move to the Senate for approval. We join civil and digital rights groups in warning that a rushed approach actually creates greater risks for what the bill is supposed to protect. These and other troubling provisions put freedom of expression on the line and could serve to spur government surveillance and repression. These are the risks a defense of democracy should fend off, not reiterate.

Veridiana Alimonti

EFF at 30: Protecting Free Speech, with Senator Ron Wyden

1 month 1 week ago

To commemorate the Electronic Frontier Foundation’s 30th anniversary, we present EFF30 Fireside Chats. This limited series of livestreamed conversations looks back at some of the biggest issues in internet history and their effects on the modern web.

To celebrate 30 years of defending online freedom, EFF was proud to welcome Senator Ron Wyden as our second special guest in EFF’s yearlong Fireside Chat series. Senator Wyden is a longtime supporter of digital rights, and as co-author of Section 230, one of the key pieces of legislation protecting speech online, he’s a well-recognized champion of free speech. EFF’s Legal Director, Dr. Corynne McSherry, spoke with the senator about the fight to protect free expression and how Section 230, despite recent attacks, is still the “single best law for small businesses and single best law for free speech.” He also answered questions from the audience about some of the hot topics that have swirled around the legislation for the last few years. 

You can watch the full conversation here or read the transcript.

On May 5, we’ll be holding our third EFF30 Fireside Chat, on surveillance, with special guest Edward Snowden. He will be joined by EFF Executive Director Cindy Cohn, EFF Director of Engineering for Certbot Alexis Hancock, and EFF Policy Analyst Matthew Guariglia as they weigh in on surveillance in modern culture, activism, and the future of privacy. 

RSVP NOW

Section 230 and Social Movements

Senator Wyden began the fireside chat with a reminder that some of the most important, and often divisive, social issues of the last few years, from #BlackLivesMatter to the #MeToo movement, would likely be censored much more heavily on platforms without Section 230. That’s because the law gives platforms both the power to moderate as they see fit, and partial immunity from liability for what’s posted on those sites, making the speech the legal responsibility of the original speaker.

Section 230...has always been for the person who doesn't have deep pockets

The First Amendment protects most speech online, but without Section 230, many platforms would be unable to host much of this important, but controversial speech because they would be stuck in litigation far more often. Section 230 has been essential for those who “don’t own their own TV stations” and others “without deep pockets” for getting their messages online, Wyden explained. 

Embedded video: https://www.youtube.com/embed/ELSJofIhnRM (this embed serves content from youtube.com)

Wyden also discussed the history of Section 230, which was passed in 1996. ”[Senator Chris Cox] and I wanted to make sure that innovators and creators and people who had promising ideas and wanted to know how they were going to get them out - we wanted to make sure that this new concept known as the internet could facilitate that.” 

Embedded video: https://www.youtube.com/embed/F916aJbM96Q (this embed serves content from youtube.com)

Misconceptions Around Section 230

Wyden took aim at several of the misconceptions around 230, like the claim that the law benefits only Big Tech. "One of the things that makes me angry...the one [idea] that really infuriates me, is that Section 230 is some kind of windfall for Big Tech. The fact of the matter is Big Tech’s got so much money that they can buy themselves out of any kind of legal scrape. We sure learned that when the first bill to start unraveling Section 230 passed, called SESTA/FOSTA."

We need that fact-finding so that we make smart technology policy

Embedded video: https://www.youtube.com/embed/twOpQY2htzs (this embed serves content from youtube.com)

Another common misunderstanding around the law is that it mandates platforms to be “neutral.” This couldn’t be further from the truth, Wyden explained: “There’s not a single word in Section 230 that requires neutrality….The point was essentially to let ‘lots of flowers bloom.’ If you want to have a conservative platform, more power to you...If you want to have a progressive platform, more power to you.“ 

Embedded video: https://www.youtube.com/embed/EM_gj6ZqCpA (this embed serves content from youtube.com)

How to Think About Changes to Intermediary Liability Laws

All the positive benefits for online speech that Section 230 enables don't mean the law is perfect, however. But before making changes to it, Wyden suggested, "There ought to be some basic fact finding before the Congress just jumps in to making sweeping changes to speech online." EFF Legal Director Corynne McSherry agreed wholeheartedly: "We need that fact-finding so that we make smart technology policy," adding that we need look no further than our experience with SESTA/FOSTA and its collateral damage to prove the point.

The first thing we ought to do is tackle the incredible abuses in the privacy area

There are other ways to improve the online ecosystem as well. Asked for his thoughts on better ways to address problems, Senator Wyden was blunt: “The first thing we ought to do is tackle the incredible abuses in the privacy area. Every other week in this country Americans learn about what amounts to yet another privacy disaster.”

Embedded video: https://www.youtube.com/embed/hDT4J224EB4 (this embed serves content from youtube.com)

Another area where we can improve the online ecosystem is data collection and sales. Wyden recently introduced a bill, "The Fourth Amendment Is Not For Sale," that would help rein in the problem of apps and commercial data brokers selling user location data.

Embedded video: https://www.youtube.com/embed/usMYK5rKCpA (this embed serves content from youtube.com)

To wrap up the discussion, Senator Wyden took some questions about potential changes to Section 230. He lambasted SESTA/FOSTA, which EFF is challenging in court on behalf of two human rights organizations, a digital library, an activist for sex workers, and a certified massage therapist, as an example of a poorly guided amendment. 

Embedded video: https://www.youtube.com/embed/cl48SEXjliI (this embed serves content from youtube.com)

Senator Wyden pointed out that every time a proposal to amend the law comes up, several questions should be asked about how the change would work and what impact it would have on users. (EFF has its own rubric of such questions for laws that would affect intermediary liability.)

Embedded video: https://www.youtube.com/embed/AWJ6o6jOKgA (this embed serves content from youtube.com)

We thank Senator Wyden for joining us to discuss free speech, Section 230, and the battle for digital rights. Please join us in the continuation of this fireside chat series on May 5 as we discuss surveillance with whistleblower Edward Snowden.

Jason Kelley

Apple’s AppTrackingTransparency is Upending Mobile Phone Tracking

1 month 2 weeks ago

Apple’s long-awaited privacy update for iOS is out, and it’s a solid step in the right direction. With the launch of iOS 14.5, hundreds of millions of iPhone users will now interact with Apple’s new AppTrackingTransparency feature. Allowing users to choose what third-party tracking they will or will not tolerate, and forcing apps to request those permissions, gives users more knowledge of what apps are doing, helps protect users from abuse, and allows them to make the best decisions for themselves.

In short, AppTrackingTransparency (or ATT) means that apps are now required to ask your permission if they want to track you and your activity across other apps. The kind of consent interface that ATT offers is not new; it's similar to the prompts for other permissions that mobile users are accustomed to (e.g., when an app requests access to your microphone, camera, or location). It's normal for apps to have to request the user's permission for access to specific device functions or data, and third-party tracking should be no different. You can set your ATT preferences app by app, or globally for all apps.

Much of ATT revolves around your iPhone's IDFA, or "ID for advertisers." This 16-byte string of numbers and letters is like a license plate for your iPhone. (Google has the same kind of identifier for Android, called the Android Ad ID; these identifiers are referred to collectively as "ad IDs.") Previously, you could opt out of IDFA's always-on surveillance only deep in your iPhone's settings; now, ATT makes IDFA settings more visible, opt-in, and per-app.
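For illustration, an ad ID like the IDFA is formatted as a standard 128-bit (16-byte) UUID. This quick Python sketch generates a random value with the same shape (the value is made up, not any real device's identifier):

```python
import uuid

# Ad IDs have the shape of a standard UUID: a 36-character
# hex string with dashes, backed by exactly 16 bytes.
sample_ad_id = uuid.uuid4()  # random, illustrative value only

print(sample_ad_id)          # a string like XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX
assert len(sample_ad_id.bytes) == 16   # 16 bytes = 128 bits
assert len(str(sample_ad_id)) == 36    # canonical dashed form
```

Because the value is globally unique and persistent, any two apps (or data brokers) that see the same ad ID know they are looking at the same device, which is exactly what makes it useful for cross-app tracking.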

The main feature of ATT is the technical control on IDFA, but the framework will regulate other kinds of tracking, too: if an app does not have your permission to “track” you, it is also not allowed to use identifiers like your phone number, for example, to do so. Presumably, this policy-level feature will depend on Apple’s app store review process to be effective.

Ad IDs are often compared to cookies, their tracker-enabling partner on the Web. But there’s a key difference: cookies were designed for, and continue to support, a wide range of user-friendly features. Cookies are the reason you don’t have to log in every time you visit a website, and why your shopping cart doesn’t empty if you leave a website in the middle of a visit. 

Ad IDs, on the other hand, were designed for one purpose and one purpose only: to let third parties track you. Ad IDs were created so that advertisers could access global, persistent identifiers for users without using the IMEI number or MAC address baked into phone hardware, with absolutely no pretense of user-friendliness or “shopping cart” use-case. Simply put: this feature on your phone has never worked in your favor. That’s why we applaud Apple’s efforts to give users more visible and granular choices to turn it off, and in particular ATT’s new requirement that app developers must ask for explicit permission to engage in this kind of tracking.

ATT is only a first step, and it has its weaknesses. It doesn’t do anything about “first-party” tracking, or an app tracking your behavior on that app itself. ATT might also be prone to “notification fatigue” if users become so accustomed to seeing it that they just click through it without considering the choice.

And, just like any other tracker-blocking initiative, ATT may set off a new round in the cat-and-mouse game between trackers and those who wish to limit them: if advertisers and data brokers see the writing on the wall that IDFA and other individual identifiers are no longer useful for tracking iPhone users, they may go back to the drawing board and find sneakier, harder-to-block tracking methods. ATT is unlikely to wipe out nonconsensual tracking in one fell swoop. But moving from a world in which tracking-by-default was sanctioned and enabled by Apple, to one where trackers must actively defy the tech giant, is a big step forward.

Apple is already pushing against the tide by proposing even this modest reform. Its decision to give users a choice to not be tracked has triggered a wave of melodramatic indignation from the tracking industry. In unraveling a tracking knot of its own creation, Apple has picked a fight with some of the most powerful companies and governments in the world.

Looking ahead, the mobile operating system market is essentially a duopoly, and Google controls the larger part of the -opoly. While Apple pushes through new privacy measures like ATT, Google has left its own Ad ID alone. Of the two, Apple is undoubtedly doing more to rein in the privacy abuses of advertising technology. Nearly every criticism that can be made about the state of privacy on iOS goes double for Android. Your move, Google.

Gennie Gebhart

Here Are 458 California Law Enforcement Agencies' Policy Documents All in One Place

1 month 2 weeks ago

Dylan Kubeny, a student at the Reynolds School of Journalism at the University of Nevada, Reno, served as the primary data hunter and co-author on this project. 

At this moment in history, law enforcement agencies in the United States face a long-overdue reevaluation of their priorities, practices, and processes for holding police officers accountable for both unconscious biases and overt abuse of power. 

But any examination of law enforcement requires transparency first: the public’s ability to examine what those priorities, practices, and processes are. While police are charged with enforcing the law, they too have their own rules to follow, and too often, those rules are opaque to the public. An imbalance in access to information is an imbalance of power. 

Today, EFF, in partnership with Stanford Libraries' Systemic Racism Tracker project, is releasing a data set with links to 458 policy manuals from California law enforcement agencies, including most police departments and sheriff's offices, as well as some district attorney offices, school district police departments, and university public safety departments. This data set represents our first attempt to aggregate these policy documents following the passage of S.B. 978, a state law that requires local law enforcement agencies to publish this information online.

These policy manuals cover everything from administrative duties and record keeping to the use of force and the deployment of surveillance technologies. These documents reveal police officers’ responsibilities and requirements, but they also expose shortcomings, including an overreliance on boilerplate policies generated by a private company. 

Download the data set as a CSV file, or scroll to the bottom to find a catalog of links.

Until a few years ago, many law enforcement agencies in California were reluctant to share their policy documents with the public. While a handful of agencies voluntarily chose to post these records online, the most reliable way to obtain these records was through the California Public Records Act (CPRA), which creates the legal right for everyday people to request information from the government. Most people don't know they have this power, and even fewer know how to exercise it effectively. 

To make these police records more accessible, California State Sen. Steven Bradford sponsored S.B. 978, which says all local law enforcement agencies "shall conspicuously post on their Internet Web sites all current standards, policies, practices, operating procedures, and education and training materials that would otherwise be available to the public if a request was made pursuant to the California Public Records Act.” 

The requirement became fully effective in January 2020, and now the public can visit individual websites to find links to these documents. However, despite the requirement that these records be posted "conspicuously," the links can often be challenging to find. With our new data set, the public now has access to a catalog of hundreds of currently available documents in one place. 

EFF supported SB 978's passage back in 2018 to increase government transparency through internet technology. We are currently collaborating with the Reynolds School of Journalism at the University of Nevada, Reno, to aggregate these policies. Stanford Libraries is using these records to build the Systemic Racism Tracker (SRT), a searchable database that gathers data about institutional practices that harm communities of color. The SRT's goal is to serve as a growing collection of references, documents, and data to support research and education about systemic racism. The SRT also aims to empower people to take action against harmful practices by knowing their rights and by identifying, appraising, and connecting with government agencies, non-profit organizations, and grassroots groups that address racism.

"In order to understand, interrogate and work towards changing the very structures of systemic racism in policing, it is vital that we collect both current and historical policy and training manuals," said Felicia Smith, head of Stanford Libraries Learning and Outreach, who created the SRT project.

Although this data set is but the first step in a longer-term project, several elements of concern emerged in our initial analysis.

First and foremost, the most conspicuous pattern in these policies is their connection to Lexipol, a private company that sells boilerplate policies and training materials to law enforcement agencies. Over and over again, the police policies were formatted the same, used identical language, and included a copyright mark from the company.

Lexipol has come under fire for writing policies that are too vague or permissive and that differ significantly from best practices. More often than not, rather than drafting policies tailored to the individual agency, agencies simply copied and pasted the standard Lexipol policy. Mother Jones reported that 95% of agencies in California purchased policies or training materials from Lexipol. Our data showed that at least 379 agencies published policies from Lexipol.

This raises questions about whether police are soliciting guidance from the community or policymakers or are simply accepting the recommendations from a private company that is not accountable to the public. 

In addition, we made the following findings: 

  • Although most agencies complied with S.B. 978 and posted at least some materials online, many agencies had still failed to take action even a year after the law took effect. In those cases, we filed CPRA requests for the records and asked that they be posted on the agencies' websites. In some instances the agencies followed through, but we are still waiting on some entities, such as the Bell Police Department and the Crescent City Police Department, to upload their records.
  • While most agencies complied with the requirement to post policies online, only a portion published training materials. In some cases, agencies only published the training session outlines and not the actual training presentations.
  • Link rot undermines transparency. As we conducted our research over just a few months, URLs for policies would change or disappear as agencies updated their policies or relaunched their websites. That is one reason we include archived links in this data set. 
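Archived links are our main defense against that link rot. As a sketch of how a reader might recover an archived copy programmatically, the Internet Archive's Wayback Machine exposes an availability endpoint (https://archive.org/wayback/available?url=...) that returns JSON describing the closest snapshot of a URL. The helper below builds that query and parses the documented response shape; the parsing assumes the standard `archived_snapshots`/`closest` structure:

```python
import json
import urllib.parse

AVAILABILITY_ENDPOINT = "https://archive.org/wayback/available"

def availability_query(url: str) -> str:
    """Build the Wayback availability API query URL for a given page."""
    return AVAILABILITY_ENDPOINT + "?" + urllib.parse.urlencode({"url": url})

def closest_snapshot(response_body: str):
    """Return the URL of the closest archived snapshot, or None.

    `response_body` is the JSON text returned by the availability API,
    e.g. {"archived_snapshots": {"closest": {"available": true, "url": ...}}}.
    """
    data = json.loads(response_body)
    closest = data.get("archived_snapshots", {}).get("closest", {})
    return closest["url"] if closest.get("available") else None
```

Fetch `availability_query(url)` with any HTTP client and pass the response body to `closest_snapshot`; a `None` result means no archived copy was found for that URL.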

In the coming months, Stanford Libraries aims to introduce a more robust tool that will allow for searching policies across departments and archiving policy changes over time. In the interim, this data set brings the public one step closer to understanding police practices and to holding law enforcement agencies accountable.

SB 978 Policy and Training Catalog 

The table below contains links to the SB 978 materials made available by local law enforcement agencies across California. There is little to no consistency across agencies in how this information is published online. Below you will find links to the primary page where a user would find SB 978 documents; in some cases, this may just be the agency's home page with an SB 978 link in the sidebar. Because we have found that these links break quite often, we have also included an archived version of each link through the Internet Archive's Wayback Machine. We have also included direct links to the policies and training materials; however, in many cases this is the same link as the primary page.

We used the California Commission on Peace Officers Standards and Training’s list of California law enforcement agencies to prioritize municipal police, sheriff’s offices, university and school district police, and district attorneys in our data collection. Future research will cover other forms of local law enforcement.

Download the data set as a CSV file.

Primary Law Enforcement Agency Page | Archived Link | Policies | Training Materials

Alameda County District Attorney | Archived Link | Policy Docs | Not Available
Alameda County Sheriff's Office | Archived Link | Policy Docs | Training Docs
Alameda Police Department | Archived Link | Policy Docs | Training Docs, 2, 3, 4
Albany Police Department | Archived Link | Policy Docs | Training Docs
Alhambra Police Department | Archived Link | Policy Docs | Training Docs
Alpine County Sheriff's Department | Archived Link | Policy Docs | Not Available
Alturas Police Department | Archived Link | Policy Docs | Not Available
Amador County Sheriff's Department | Archived Link | Policy Docs | Not Available
American River College Police Department | Archived Link | Policy Docs | Not Available
Anaheim Police Department | Archived Link | Policy Docs | Not Available
Anderson Police Department | Archived Link | Policy Docs | Training Docs
Angels Camp Police Department | Archived Link | Policy Docs | Not Available
Antioch Police Department | Archived Link | Policy Docs | Training Docs
Apple Valley Unified School District Police Department | Archived Link | Policy Docs | Not Available
Arcadia Police Department | Archived Link | Policy Docs | Not Available
Arcata Police Department | Archived Link | Policy Docs | Not Available
Arroyo Grande Police Department | Archived Link | Policy Docs | Training Docs
Arvin Police Department | Archived Link | Policy Docs | Not Available
Atascadero Police Department | Archived Link | Policy Docs | Not Available
Atherton Police Department | Archived Link | Policy Docs | Training Docs
Atwater Police Department | Archived Link | Policy Docs | Not Available
Auburn Police Department | Archived Link | Policy Docs | Not Available
Avenal Police Department | Archived Link | Policy Docs | Not Available
Azusa Police Department | Archived Link | Policy Docs | Not Available
Bakersfield Police Department | Archived Link | Policy Docs | Not Available
Banning Police Department | Archived Link | Policy Docs | Not Available
Barstow Police Department | Archived Link | Policy Docs | Not Available
Bay Area Rapid Transit Police Department | Archived Link | Policy Docs | Training Docs
Bear Valley Police Department | Archived Link | Policy Docs | Not Available
Beaumont Police Department | Archived Link | Policy Docs | Not Available
Bell Gardens Police Department | Archived Link | Policy Docs | Not Available
Belmont Police Department | Archived Link | Policy Docs | Training Docs
Belvedere Police Department | Archived Link | Policy Docs | Not Available
Benicia Police Department | Archived Link | Policy Docs | Training Docs, 2, 3
Berkeley Police Department | Archived Link | Policy Docs | Training Docs
Beverly Hills Police Department | Archived Link | Policy Docs | Not Available
Blythe Police Department | Archived Link | Policy Docs | Not Available
Brawley Police Department | Archived Link | Policy Docs | Not Available
Brea Police Department | Archived Link | Policy Docs | Training Docs
Brentwood Police Department | Archived Link | Policy Docs | Not Available
Brisbane Police Department | Archived Link | Policy Docs | Not Available
Broadmoor Police Department | Archived Link | Policy Docs | Not Available
Buena Park Police Department | Archived Link | Policy Docs | Training Docs
Burbank Police Department | Archived Link | Policy Docs | Training Docs
Burlingame Police Department | Archived Link | Policy Docs | Training Docs
Butte County Sheriff's Department/Coroner | Archived Link | Policy Docs | Not Available
Cal Poly University Police | Archived Link | Policy Docs | Training Docs

Cal State LA Police Department

Archived Link

Policy Docs

Not Available

Calaveras County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Calexico Police Department

Archived Link

Policy Docs

Not Available

California City Police Department

Archived Link

Policy Docs

Not Available

Calistoga Police Department

Archived Link

Policy Docs

Not Available

Campbell Police Department

Archived Link

Policy Docs

Training Docs

Capitola Police Department

Archived Link

Policy Docs

Not Available

Carlsbad Police Department

Archived Link

Policy Docs

Training Docs

Carmel Police Department

Archived Link

Policy Docs

Not Available

Cathedral City Police Department

Archived Link

Policy Docs

Not Available

Central Marin Police Authority

Archived Link

Policy Docs

Not Available

Ceres Department of Public Safety

Archived Link

Policy Docs

Not Available

Chaffey Community College District Police Department

Archived Link

Policy Docs

Not Available

Chico Police Department

Archived Link

Policy Docs

Not Available

Chino Police Department

Archived Link

Policy Docs

Training Docs

Chowchilla Police Department

Archived Link

Policy Docs

Training Docs

Chula Vista Police Department

Archived Link

Policy Docs

Not Available

Citrus Community College District Department of Campus Safety

Archived Link

Policy Docs

Not Available

Citrus Heights Police Department

Archived Link

Policy Docs

Not Available

Claremont Police Department

Archived Link

Policy Docs

Training Docs

Clayton Police Department

Archived Link

Policy Docs

Not Available

Clearlake Police Department

Archived Link

Policy Docs

Not Available

Cloverdale Police Department

Archived Link

Policy Docs

Training Docs

Clovis Police Department

Archived Link

Policy Docs

Training Docs

Clovis Unified School District Police Department

Archived Link

Policy Docs

Training Docs

Coalinga Police Department

Archived Link

Policy Docs

Training Docs

Coast Community College District Police Department

Archived Link

Policy Docs

Not Available

Colma Police Department

Archived Link

Policy Docs

Training Docs

Colton Police Department

Archived Link

Policy Docs

Not Available

Colusa County District Attorney

Archived Link

Policy Docs

Not Available

Colusa County Sheriff's Department

Archived Link

Policy Docs

Not Available

Colusa Police Department

Archived Link

Policy Docs

Not Available

Concord Police Department

Archived Link

Policy Docs

Training Docs

Contra Costa Community College District Police Department

Archived Link

Policy Docs

Not Available

Contra Costa County District Attorney

Archived Link

Policy Docs

Not Available

Contra Costa County Sheriff's Department/Coroner

Archived Link

Policy Docs

Not Available

Corcoran Police Department

Archived Link

Policy Docs

Not Available

Corona Police Department

Archived Link

Policy Docs

Not Available

Coronado Police Department

Archived Link

Policy Docs

Training Docs

Costa Mesa Police Department

Archived Link

Policy Docs

Training Docs

Cosumnes River College Police Department

Archived Link

Policy Docs

Not Available

Cotati Police Department

Archived Link

Policy Docs

Not Available

Covina Police Department

Archived Link

Policy Docs

Not Available

CPSU Pomona Department of Public Safety

Archived Link

Policy Docs

Not Available

CSU Bakersfield University Police Department

Archived Link

Policy Docs

Not Available

CSU Channel Islands University Police Department

Archived Link

Policy Docs

Not Available

CSU Chico University Police Department

Archived Link

Policy Docs

Training Docs

CSU Dominguez Hills University Police and Parking

Archived Link

Policy Docs

Not Available

CSU East Bay University Police Department

Archived Link

Policy Docs

Not Available

CSU Fresno University Police Department

Archived Link

Policy Docs

Not Available

CSU Fullerton University Police Department

Archived Link

Policy Docs

Training Docs

CSU Long Beach University Police Department

Archived Link

Policy Docs

Not Available

CSU Monterey Bay University Police Department

Archived Link

Policy Docs

Training Docs

CSU Northridge Department of Police Services

Archived Link

Policy Docs

Training Docs

CSU Sacramento Public Safety/University Police Department

Archived Link

Policy Docs

Training Docs

CSU San Bernardino University Police Department

Archived Link

Policy Docs

Not Available

CSU San José University Police Department

Archived Link

Policy Docs

Not Available

CSU San Marcos University Police Department

Archived Link

Policy Docs

Not Available

CSU Stanislaus Police Department

Archived Link

Policy Docs

Training Docs

Cuesta College Department of Public Safety

Archived Link

Policy Docs

Training Docs

Culver City Police Department

Archived Link

Policy Docs

Training Docs

Cypress Police Department

Archived Link

Policy Docs

Training Docs

Daly City Police Department

Archived Link

Policy Docs

Training Docs

Davis Police Department

Archived Link

Policy Docs

Training Docs

Del Norte County Sheriff's Department

Archived Link

Policy Docs

Not Available

Del Rey Oaks Police Department

Archived Link

Policy Docs

Training Docs

Delano Police Department

Archived Link

Policy Docs

Not Available

Desert Hot Springs Police Department

Archived Link

Policy Docs

Training Docs

Dinuba Police Department

Archived Link

Policy Docs

Not Available

Dixon Police Department

Archived Link

Policy Docs

Not Available

Dos Palos Police Department

Archived Link

Policy Docs

Not Available

Downey Police Department

Archived Link

Policy Docs

Not Available

East Bay Regional Parks District Department of Public Safety

Archived Link

Policy Docs

Not Available

East Palo Alto Police Department

Archived Link

Policy Docs

Not Available

El Cajon Police Department

Archived Link

Policy Docs

Not Available

El Camino Community College District Police Department

Archived Link

Policy Docs

Not Available

El Centro Police Department

Archived Link

Policy Docs

Not Available

El Cerrito Police Department

Archived Link

Policy Docs

Training Docs

El Dorado County Sheriff's Department

Archived Link

Policy Docs

Not Available

El Monte Police Department

Archived Link

Policy Docs

Not Available

El Segundo Police Department

Archived Link

Policy Docs

Training Docs

Elk Grove Police Department

Archived Link

Policy Docs

Not Available

Emeryville Police Department

Archived Link

Policy Docs

Training Docs

Escalon Police Department

Archived Link

Policy Docs

Not Available

Escondido Police Department

Archived Link

Policy Docs

Training Docs

Etna Police Department

Archived Link

Policy Docs

Not Available

Eureka Police Department

Archived Link

Policy Docs

Not Available

Exeter Police Department

Archived Link

Policy Docs

Not Available

Fairfax Police Department

Archived Link

Policy Docs

Training Docs

Fairfield Police Department

Archived Link

Policy Docs

Training Docs

Farmersville Police Department

Archived Link

Policy Docs

Not Available

Ferndale Police Department

Archived Link

Policy Docs

Not Available

Firebaugh Police Department

Archived Link

Policy Docs

Not Available

Folsom Lake College Police Department

Archived Link

Policy Docs

Not Available

Folsom Police Department

Archived Link

Policy Docs

Training Docs

Fontana Police Department

Archived Link

Policy Docs

Training Docs

Fort Bragg Police Department

Archived Link

Policy Docs

Training Docs

Fortuna Police Department

Archived Link

Policy Docs

Not Available

Foster City Police Department

Archived Link

Policy Docs

Not Available

Fountain Valley Police Department

Archived Link

Policy Docs

Training Docs

Fowler Police Department

Archived Link

Policy Docs

Not Available

Fremont Police Department

Archived Link

Policy Docs

Training Docs

Fresno County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Fresno Police Department

Archived Link

Policy Docs

Not Available

Fullerton Police Department

Archived Link

Policy Docs

Not Available

Galt Police Department

Archived Link

Policy Docs

Not Available

Garden Grove Police Department

Archived Link

Policy Docs

Training Docs

Gardena Police Department

Archived Link

Policy Docs

Training Docs

Gilroy Police Department

Archived Link

Policy Docs

Not Available

Glendale Community College District Police Department

Archived Link

Policy Docs

Not Available

Glendale Police Department

Archived Link

Policy Docs

Training Docs

Glendora Police Department

Archived Link

Policy Docs

Training Docs

Glenn County Sheriff's Department/Coroner

Archived Link

Policy Docs

Not Available

Gonzales Police Department

Archived Link

Policy Docs

Not Available

Grass Valley Police Department

Archived Link

Policy Docs

Not Available

Greenfield Police Department

Archived Link

Policy Docs

Not Available

Gridley Police Department

Archived Link

Policy Docs

Not Available

Grover Beach Police Department

Archived Link

Policy Docs

Not Available

Guadalupe Police Department

Archived Link

Policy Docs

Not Available

Gustine Police Department

Archived Link

Policy Docs

Not Available

Hanford Police Department

Archived Link

Policy Docs

Not Available

Hawthorne Police Department

Archived Link

Policy Docs

Not Available

Hayward Police Department

Archived Link

Policy Docs

Not Available

Healdsburg Police Department

Archived Link

Policy Docs

Training Docs

Hemet Police Department

Archived Link

Policy Docs

Not Available

Hercules Police Department

Archived Link

Policy Docs

Training Docs

Hermosa Beach Police Department

Archived Link

Policy Docs

Not Available

Hillsborough Police Department

Archived Link

Policy Docs

Training Docs

Hollister Police Department

Archived Link

Policy Docs

Not Available

Humboldt County Sheriff's Department

Archived Link

Policy Docs

Not Available

Humboldt State University

Archived Link

Policy Docs

Training Docs

Huntington Beach Police Department

Archived Link

Policy Docs

Not Available

Huntington Park Police Department

Archived Link

Policy Docs

Training Docs

Huron Police Department

Archived Link

Policy Docs

Not Available

Imperial Police Department

Archived Link

Policy Docs

Not Available

Indio Police Department

Archived Link

Policy Docs

Not Available

Inglewood Police Department

Archived Link

Policy Docs

Not Available

Inyo County Sheriff's Department

Archived Link

Policy Docs

Not Available

Ione Police Department

Archived Link

Policy Docs

Not Available

Irvine Police Department

Archived Link

Policy Docs

Training Docs

Irwindale Police Department

Archived Link

Policy Docs

Training Docs

Jackson Police Department

Archived Link

Policy Docs

Not Available

Kensington Police Department

Archived Link

Policy Docs

Not Available

Kerman Police Department

Archived Link

Policy Docs

Not Available

Kern County Sheriff's Department

Archived Link

Policy Docs

Not Available

King City Police Department

Archived Link

Policy Docs

Not Available

Kings County Sheriff's Department

Archived Link

Policy Docs

Not Available

Kingsburg Police Department

Archived Link

Policy Docs

Not Available

La Habra Police Department

Archived Link

Policy Docs

Not Available

La Mesa Police Department

Archived Link

Policy Docs

Training Docs

La Palma Police Department

Archived Link

Policy Docs

Not Available

La Verne Police Department

Archived Link

Policy Docs

Training Docs

Laguna Beach Police Department

Archived Link

Policy Docs

Training Docs

Lake County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Lakeport Police Department

Archived Link

Policy Docs

Not Available

Lassen County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Lemoore Police Department

Archived Link

Policy Docs

Not Available

Lincoln Police Department

Archived Link

Policy Docs

Not Available

Lindsay Department of Public Safety

Archived Link

Policy Docs

Not Available

Livermore Police Department

Archived Link

Policy Docs

Training Docs

Livingston Police Department

Archived Link

Policy Docs

Not Available

Lodi Police Department

Archived Link

Policy Docs

Not Available

Lompoc Police Department

Archived Link

Policy Docs

Not Available

Long Beach Police Department

Archived Link

Policy Docs

Not Available

Los Alamitos Police Department

Archived Link

Policy Docs

Not Available

Los Altos Police Department

Archived Link

Policy Docs

Training Docs

Los Angeles City Department of Recreation and Parks, Park Ranger Division

Archived Link

Policy Docs

Not Available

Los Angeles County District Attorney

Archived Link

Policy Docs

Not Available

Los Angeles County Probation Department

Archived Link

Policy Docs

Training Docs

Los Angeles County Sheriff's Department

Archived Link

Policy Docs

Not Available

Los Angeles Police Department

Archived Link

Policy Docs

Training Docs

Los Angeles Port Police Department

Archived Link

Policy Docs

Not Available

Los Angeles School Police Department

Archived Link

Policy Docs

Training Docs

Los Angeles World Airports Police Department

Archived Link

Policy Docs

Not Available

Los Banos Police Department

Archived Link

Policy Docs

Training Docs

Los Gatos/Monte Sereno Police Department

Archived Link

Policy Docs

Not Available

Los Rios Community College District Police Department

Archived Link

Policy Docs

Not Available

Madera County Sheriff's Department

Archived Link

Policy Docs

Not Available

Madera Police Department

Archived Link

Policy Docs

Not Available

Mammoth Lakes Police Department

Archived Link

Policy Docs

Training Docs

Manhattan Beach Police Department

Archived Link

Policy Docs

Training Docs

Manteca Police Department

Archived Link

Policy Docs

Training Docs

Marin Community College District Police Department

Archived Link

Policy Docs

Not Available

Marin County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Marina Department of Public Safety

Archived Link

Policy Docs

Not Available

Martinez Police Department

Archived Link

Policy Docs

Training Docs

Marysville Police Department

Archived Link

Policy Docs

Not Available

McFarland Police Department

Archived Link

Policy Docs

Not Available

Mendocino County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Mendota Police Department

Archived Link

Policy Docs

Training Docs

Menifee Police Department

Archived Link

Policy Docs

Not Available

Menlo Park Police Department

Archived Link

Policy Docs

Not Available

Merced Community College Police Department

Archived Link

Policy Docs

Not Available

Merced County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Merced Police Department

Archived Link

Policy Docs

Training Docs

Mill Valley Police Department

Archived Link

Policy Docs

Training Docs

Milpitas Police Department

Archived Link

Policy Docs

Not Available

MiraCosta Community College District Police Department

Archived Link

Policy Docs

Training Docs

Modesto Police Department

Archived Link

Policy Docs

Training Docs

Modoc County Sheriff's Department

Archived Link

Policy Docs

Not Available

Mono County Sheriff's Department

Archived Link

Policy Docs

Not Available

Monrovia Police Department

Archived Link

Policy Docs

Training Docs

Montclair Police Department

Archived Link

Policy Docs

Training Docs

Montebello Police Department

Archived Link

Policy Docs

Training Docs

Monterey County Sheriff's Department

Archived Link

Policy Docs

Not Available

Monterey Park Police Department

Archived Link

Policy Docs

Training Docs

Monterey Police Department

Archived Link

Policy Docs

Training Docs

Moraga Police Department

Archived Link

Policy Docs

Not Available

Morgan Hill Police Department

Archived Link

Policy Docs

Training Docs

Morro Bay Police Department

Archived Link

Policy Docs

Not Available

Mountain View Police Department

Archived Link

Policy Docs

Training Docs

Mt. Shasta Police Department

Archived Link

Policy Docs

Not Available

Murrieta Police Department

Archived Link

Policy Docs

Training Docs

Napa County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Napa Police Department

Archived Link

Policy Docs

Not Available

Napa Valley College Police Department

Archived Link

Policy Docs

Training Docs

National City Police Department

Archived Link

Policy Docs

Training Docs

Nevada City Police Department

Archived Link

Policy Docs

Not Available

Nevada County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Newark Police Department

Archived Link

Policy Docs

Not Available

Newman Police Department

Archived Link

Policy Docs

Not Available

Newport Beach Police Department

Archived Link

Policy Docs

Training Docs

Novato Police Department

Archived Link

Policy Docs

Not Available

Oakdale Police Department

Archived Link

Policy Docs

Not Available

Oakland Police Department

Archived Link

Policy Docs

Training Docs

Oakley Police Department

Archived Link

Policy Docs

Not Available

Oceanside Police Department

Archived Link

Policy Docs

Training Docs

Oceanside Police Department Harbor Unit

Archived Link

Policy Docs

Training Docs

Ohlone Community College District Police Department

Archived Link

Policy Docs

Not Available

Ontario Police Department

Archived Link

Policy Docs

Training Docs

Orange County District Attorney

Archived Link

Policy Docs

Not Available

Orange County District Attorney, Public Assistance Fraud

Archived Link

Policy Docs

Not Available

Orange County Sheriff's Department/Coroner

Archived Link

Policy Docs

Not Available

Orange Cove Police Department

Archived Link

Policy Docs

Not Available

Orange Police Department

Archived Link

Policy Docs

Training Docs

Orland Police Department

Archived Link

Policy Docs

Training Docs

Oroville Police Department

Archived Link

Policy Docs

Training Docs

Oxnard Police Department

Archived Link

Policy Docs

Training Docs

Pacific Grove Police Department

Archived Link

Policy Docs

Training Docs

Pacifica Police Department

Archived Link

Policy Docs

Training Docs

Palm Springs Police Department

Archived Link

Policy Docs

Not Available

Palo Alto Police Department

Archived Link

Policy Docs

Not Available

Palos Verdes Estates Police Department

Archived Link

Policy Docs

Not Available

Paradise Police Department

Archived Link

Policy Docs

Not Available

Pasadena City College District Police Department

Archived Link

Policy Docs

Training Docs

Pasadena Police Department

Archived Link

Policy Docs

Not Available

Paso Robles Police Department

Archived Link

Policy Docs

Training Docs

Petaluma Police Department

Archived Link

Policy Docs

Not Available

Piedmont Police Department

Archived Link

Policy Docs

Training Docs

Pinole Police Department

Archived Link

Policy Docs

Training Docs

Pismo Beach Police Department

Archived Link

Policy Docs

Training Docs

Pittsburg Police Department

Archived Link

Policy Docs

Not Available

Placentia Police Department

Archived Link

Policy Docs

Not Available

Placer County District Attorney

Archived Link

Policy Docs

Not Available

Placer County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Placerville Police Department

Archived Link

Policy Docs

Not Available

Pleasant Hill Police Department

Archived Link

Policy Docs

Training Docs

Pleasanton Police Services

Archived Link

Policy Docs

Training Docs

Plumas County Sheriff's Department

Archived Link

Policy Docs

Not Available

Pomona Police Department

Archived Link

Policy Docs

Training Docs

Port Hueneme Police Department

Archived Link

Policy Docs

Not Available

Porterville Police Department

Archived Link

Policy Docs

Not Available

Red Bluff Police Department

Archived Link

Policy Docs

Training Docs

Redding Police Department

Archived Link

Policy Docs

Training Docs

Redlands Police Department

Archived Link

Policy Docs

Training Docs

Redondo Beach Police Department

Archived Link

Policy Docs

Training Docs

Redwood City Police Department

Archived Link

Policy Docs

Training Docs

Reedley Police Department

Archived Link

Policy Docs

Training Docs

Rialto Police Department

Archived Link

Policy Docs

Not Available

Richmond Police Department

Archived Link

Policy Docs

Not Available

Ridgecrest Police Department

Archived Link

Policy Docs

Not Available

Rio Dell Police Department

Archived Link

Policy Docs

Not Available

Ripon Police Department

Archived Link

Policy Docs

Not Available

Riverside Community College District Police Department

Archived Link

Policy Docs

Not Available

Riverside County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Riverside Police Department

Archived Link

Policy Docs

Not Available

Rocklin Police Department

Archived Link

Policy Docs

Training Docs

Rohnert Park Department of Public Safety

Archived Link

Policy Docs

Not Available

Roseville Police Department

Archived Link

Policy Docs

Not Available

Ross Police Department

Archived Link

Policy Docs

Not Available

Sacramento County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Sacramento Police Department

Archived Link

Policy Docs

Training Docs

Saddleback Community College Police Department

Archived Link

Policy Docs

Not Available

Saint Helena Police Department

Archived Link

Policy Docs

Not Available

San Benito County Sheriff's Department

Archived Link

Policy Docs

Training Docs

San Bernardino County Sheriff-Coroner

Archived Link

Policy Docs

Training Docs

San Bernardino Police Department

Archived Link

Policy Docs

Training Docs

San Bruno Police Department

Archived Link

Policy Docs

Not Available

San Diego County Probation Department

Archived Link

Policy Docs

Training Docs

San Diego County Sheriff's Department

Archived Link

Policy Docs

Not Available

San Diego Harbor Police Department

Archived Link

Policy Docs

Training Docs

San Diego Police Department

Archived Link

Policy Docs

Training Docs

San Diego State University Police Department

Archived Link

Policy Docs

Training Docs

San Fernando Police Department

Archived Link

Policy Docs

Training Docs

San Francisco County Sheriff's Department

Archived Link

Policy Docs

Training Docs

San Francisco Police Department

Archived Link

Policy Docs

Not Available

San Gabriel Police Department

Archived Link

Policy Docs

Training Docs

San Joaquin County Probation Department

Archived Link

Policy Docs

Training Docs

San Joaquin County Sheriff's Department

Archived Link

Policy Docs

Not Available

San Joaquin Delta College Police Department

Archived Link

Policy Docs

Training Docs

San Jose Police Department

Archived Link

Policy Docs

Training Docs

San Leandro Police Department

Archived Link

Policy Docs

Training Docs

San Luis Obispo County Sheriff's Department

Archived Link

Policy Docs

Training Docs

San Luis Obispo Police Department

Archived Link

Policy Docs

Training Docs

San Marino Police Department

Archived Link

Policy Docs

Training Docs

San Mateo County Sheriff's Office

Archived Link

Policy Docs

Training Docs

San Mateo Police Department

Archived Link

Policy Docs

Training Docs

San Pablo Police Department

Archived Link

Policy Docs

Not Available

San Rafael Police Department

Archived Link

Policy Docs

Training Docs

San Ramon Police Department

Archived Link

Policy Docs

Training Docs

Sand City Police Department

Archived Link

Policy Docs

Not Available

Sanger Police Department

Archived Link

Policy Docs

Not Available

Santa Ana Police Department

Archived Link

Policy Docs

Training Docs

Santa Ana Unified School District Police Department

Archived Link

Policy Docs

Not Available

Santa Barbara County Sheriff's Department

Archived Link

Policy Docs

Not Available

Santa Barbara Police Department

Archived Link

Policy Docs

Training Docs

Santa Clara County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Santa Clara Police Department

Archived Link

Policy Docs

Training Docs

Santa Cruz County District Attorney

Archived Link

Policy Docs

Not Available

Santa Cruz County Sheriff's Department

Archived Link

Policy Docs

Not Available

Santa Cruz Police Department

Archived Link

Policy Docs

Training Docs

Santa Fe Springs Police Services

Archived Link

Policy Docs

Not Available

Santa Maria Police Department

Archived Link

Policy Docs

Training Docs

Santa Monica Police Department

Archived Link

Policy Docs

Training Docs

Santa Paula Police Department

Archived Link

Policy Docs

Not Available

Santa Rosa Police Department

Archived Link

Policy Docs

Training Docs

Sausalito Police Department

Archived Link

Policy Docs

Not Available

Scotts Valley Police Department

Archived Link

Policy Docs

Not Available

Seal Beach Police Department

Archived Link

Policy Docs

Training Docs

Seaside Police Department

Archived Link

Policy Docs

Training Docs

Sebastopol Police Department

Archived Link

Policy Docs

Training Docs

Selma Police Department

Archived Link

Policy Docs

Not Available

Shafter Police Department

Archived Link

Policy Docs

Not Available

Shasta County Sheriff's Department

Archived Link

Policy Docs

Not Available

Sierra County Sheriff's Office

Archived Link

Policy Docs

Not Available

Sierra Madre Police Department

Archived Link

Policy Docs

Not Available

Signal Hill Police Department

Archived Link

Policy Docs

Not Available

Simi Valley Police Department

Archived Link

Policy Docs

Training Docs

Siskiyou County Sheriff's Department

Archived Link

Policy Docs

Not Available

Solano County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Soledad Police Department

Archived Link

Policy Docs

Training Docs

Sonoma County Probation Department

Archived Link

Policy Docs

Training Docs

Sonoma County Sheriff's Office

Archived Link

Policy Docs

Training Docs

Sonoma Police Department

Archived Link

Policy Docs

Training Docs

Sonoma State University Police and Parking Services

Archived Link

Policy Docs

Training Docs

Sonora Police Department

Archived Link

Policy Docs

Not Available

South Gate Police Department

Archived Link

Policy Docs

Not Available

South Lake Tahoe Police Department

Archived Link

Policy Docs

Not Available

South Pasadena Police Department

Archived Link

Policy Docs

Training Docs

South San Francisco Police Department

Archived Link

Policy Docs

Not Available

Southwestern Community College Police Department

Archived Link

Policy Docs

Not Available

Stanford University Department of Public Safety

Archived Link

Policy Docs

Not Available

Stanislaus County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Stockton Police Department

Archived Link

Policy Docs

Not Available

Suisun City Police Department

Archived Link

Policy Docs

Training Docs

Sunnyvale Department of Public Safety

Archived Link

Policy Docs

Not Available

Sutter County Sheriff's Department

Archived Link

Policy Docs

Not Available

Taft Police Department

Archived Link

Policy Docs

Not Available

Tehachapi Police Department

Archived Link

Policy Docs

Not Available

Tehama County Sheriff's Department

Archived Link

Policy Docs

Not Available

Tiburon Police Department

Archived Link

Policy Docs

Not Available

Torrance Police Department

Archived Link

Policy Docs

Not Available

Tracy Police Department

Archived Link

Policy Docs

Training Docs

Trinity County Sheriff's Department

Archived Link

Policy Docs

Not Available

Truckee Police Department

Archived Link

Policy Docs

Training Docs

Tulare County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Tulare Police Department

Archived Link

Policy Docs

Not Available

Tuolumne County Sheriff's Department

Archived Link

Policy Docs

Not Available

Turlock Police Department

Archived Link

Policy Docs

Training Docs

Tustin Police Department

Archived Link

Policy Docs

Training Docs

Twin Rivers Unified School District Police Services

Archived Link

Policy Docs

Not Available

UC Berkeley Police Department

Archived Link

Policy Docs

Not Available

UC Davis Police Department

Archived Link

Policy Docs

Not Available

UC Irvine Police Department

Archived Link

Policy Docs

Training Docs

UC Los Angeles Police Department

Archived Link

Policy Docs

Not Available

UC Merced Police Department

Archived Link

Policy Docs

Not Available

UC Riverside Police Department

Archived Link

Policy Docs

Not Available

UC San Diego Police Department

Archived Link

Policy Docs

Not Available

UC San Francisco Police Department

Archived Link

Policy Docs

Not Available

UC Santa Cruz Police Department

Archived Link

Policy Docs

Not Available

Ukiah Police Department

Archived Link

Policy Docs

Not Available

Union City Police Department

Archived Link

Policy Docs

Not Available

Upland Police Department

Archived Link

Policy Docs

Training Docs

Vacaville Police Department

Archived Link

Policy Docs

Training Docs

Vallejo Police Department

Archived Link

Policy Docs

Training Docs

Ventura County District Attorney

Archived Link

Policy Docs

Not Available

Ventura County Sheriff's Department

Archived Link

Policy Docs

Training Docs

Ventura Police Department

Archived Link

Policy Docs

Training Docs

Vernon Police Department

Archived Link

Policy Docs

Training Docs

Victor Valley College Police Department

Archived Link

Policy Docs

Not Available

Visalia Police Department

Archived Link

Policy Docs

Not Available

Walnut Creek Police Department

Archived Link

Policy Docs

Not Available

Watsonville Police Department

Archived Link

Policy Docs

Not Available

Weed Police Department

Archived Link

Policy Docs

Not Available

West Cities Police Communications Center

Archived Link

Policy Docs

Not Available

West Covina Police Department

Archived Link

Policy Docs

Not Available

West Sacramento Police Department

Archived Link

Policy Docs

Not Available

West Valley-Mission Community College District Police Department

Archived Link

Policy Docs

Not Available

Westminster Police Department

Archived Link

Policy Docs

Not Available

Wheatland Police Department

Archived Link

Policy Docs

Not Available

Whittier Police Department

Archived Link

Policy Docs

Not Available

Williams Police Department

Archived Link

Policy Docs

Not Available

Willits Police Department

Archived Link

Policy Docs

Not Available

Windsor Police Department

Archived Link

Policy Docs

Not Available

Winters Police Department

Archived Link

Policy Docs

Not Available

Woodland Police Department

Archived Link

Policy Docs

Training Docs

Yolo County District Attorney

Archived Link

Policy Docs

Not Available

Yolo County Sheriff's Department

Archived Link

Policy Docs

Not Available

Yreka Police Department

Archived Link

Policy Docs

Not Available

Yuba City Police Department

Archived Link

Policy Docs

Training Docs

Yuba County Sheriff's Department

Archived Link

Policy Docs

Not Available

Dave Maass

Your Service Provider’s Terms of Service Shouldn’t Overrule Your Fourth Amendment Rights

1 month 2 weeks ago

Last week, EFF, ACLU, and ACLU of Minnesota filed an amicus brief in State v. Pauli, a case in the Minnesota Supreme Court, where we argue that cloud storage providers’ terms of service (TOS) can’t take away your Fourth Amendment rights. This is the first case on this important issue to reach a state supreme court; if the lower courts’ rulings stand, anyone in Minnesota who violated any term of a provider’s TOS could lose Fourth Amendment protections over all the files in their account.

The facts of the case are a little hazy, but at some point, Dropbox identified video files in Mr. Pauli’s account as child sexual assault material and submitted the files to the National Center for Missing and Exploited Children (NCMEC), a private, quasi-governmental entity created by statute that works closely with law enforcement on child exploitation issues. After viewing the files, an NCMEC employee then forwarded them with a report to the Minnesota Bureau of Criminal Apprehension. This ultimately led to Pauli’s indictment on charges for possession of pictorial representations of minors. Pauli challenged the search, but the trial court held that Dropbox’s TOS—which notified Pauli that Dropbox could monitor his account and disclose information to third parties if it believed such disclosure was necessary to comply with the law—nullified Pauli’s expectation of privacy in the video files. After the appellate court agreed, Pauli petitioned the state supreme court for review.

The lower courts’ analysis is simply wrong. Under this logic, your Fourth Amendment rights rise or fall based on unilateral contracts with your service providers—contracts that none of us read or negotiate but all of us must agree to so that we can use services that are a necessary part of daily life. As we argued in our brief, a company’s TOS should not dictate your constitutional rights, because terms of service are rules about the relationship between you and your service provider—not you and the government.

Companies draft terms of service to govern how their platforms may be used, and the terms of these contracts are extremely broad. Companies’ TOS control what kind of content you can post, how you can use the platform, and how platforms can protect themselves against fraud and other damage. Actions that could violate a company’s TOS include not just criminal activity, such as possessing child sexual assault material, but also—as defined solely by the provider—actions like uploading content that defames someone or contains profanity, sharing a copyrighted article without permission from the copyright holder, or marketing your small business to all of your friends without their advance consent. While some might find activities such as these objectionable or annoying, they shouldn’t justify the government ignoring your Fourth Amendment right to privacy in your files simply because you store them in the cloud.

Given the vast amount of storage many service providers offer (most offer up to 2 terabytes for a small annual fee), accounts can hold tens of thousands of private and personal files, including photos, messages, diaries, medical records, legal data, and videos—each of which could reveal intimate details about our private and professional lives. Storing these records in the cloud with a service provider allows users to free up space on their personal devices, access their files from anywhere, and share (or not share) their files with others. The convenience and cost savings offered by commercial third-party cloud-storage providers mean that very few of us would go to the trouble of setting up our own server to achieve privately all that we can do by storing our data with a commercial provider. But this also means that the only way to take advantage of that convenience is to agree to a company’s TOS.

And several billion of us do agree every day. Since its advent in 2007, Dropbox’s user base has soared to more than 700 million registered users. Apple offers free iCloud storage to users of its more than 1.5 billion active phones, tablets, laptops, and other devices around the world. And Google’s suite of cloud services—which includes both Gmail and Google Drive (offering access to stored and shareable documents, spreadsheets, photos, slide presentations, videos, and more)—enjoys 2 billion monthly active users. These users would be shocked to discover that by agreeing to their providers’ TOS, they could be giving up an expectation of privacy in their most private records.

In 2018, in Carpenter v. United States, all nine justices on the Supreme Court agreed that even if we store electronic equivalents of our Fourth Amendment-protected “papers” and “effects” with a third-party provider, we still retain privacy interests in those records. These constitutional rights would be meaningless, however, if they could be ignored simply because a user agreed to and then somehow violated their provider’s TOS.

The appellate court’s ruling in Pauli allows private agreements to trump bedrock Fourth Amendment guarantees for private communications and cloud-stored records. The ruling affects far more than child sexual assault material cases: anyone who violated any term of a provider’s TOS could lose Fourth Amendment protections over all the files in their account.

We hope the Minnesota Supreme Court will reject such a sweeping invalidation of constitutional rights. We look forward to the court’s decision.

Jennifer Lynch

Canada’s Attempt to Regulate Sexual Content Online Ignores Technical and Historical Realities

1 month 2 weeks ago

Canadian Senate Bill S-203, AKA the “Protecting Young Persons from Exposure to Pornography Act,” is another woefully misguided proposal aimed at regulating sexual content online. To say the least, this bill fails to understand how the internet functions, and it would seriously damage online expression and privacy. It’s bad in a variety of ways, but three specific problems need to be laid out: 1) technical impracticality, 2) competition harms, and 3) privacy and security.

First, S-203 would make any person or company criminally liable for any time an underage user engages with sexual content through its service. The law applies even if the person or company believed the user to be an adult, unless the person or company “implemented a prescribed age-verification method.”

Second, the bill seemingly imposes this burden on a broad swath of the internet stack. S-203 would criminalize the acts of independent performers, artists, blogs, social media, message boards, email providers, and any other intermediary or service in the stack that is in some way “for commercial purposes” and “makes available sexually explicit material on the Internet to a young person.” The only meaningful defense against the financial penalties that a person or company could assert would be to verify the legal adult age of every user and then store that data.

The bill would likely force many companies to simply eliminate sexual content

The sheer amount of technical infrastructure it would take for such a vast portion of the internet to “implement a prescribed age-verification method” would be costly and overwhelmingly complicated, and it would introduce security risks that weren’t there before. Even if every platform stored this data server-side with a robust security posture, handling sensitive personally identifiable information (PII) on the client side would create a treasure trove for anyone with modest app-exploitation skills. And if the mandate did create a market for proprietary third-party age-verification services, the financial burden would only advantage the largest players online. It is also ahistorical to assume that teenagers wouldn’t figure out ways to get past whatever age-verification system is put in place.

Then there’s the privacy angle. It’s ludicrous to expect all adult users to hand over private personal information every time they log onto an app that might contain sexual content. Verification schemes vary in how deeply they intrude on privacy, but they generally play out as a cat-and-mouse game that creates new surveillance and security threats rather than addressing the original concern. The more a verification system fails, the more privacy-invasive the measures taken to avoid criminal liability become.

Because of the problems of implementing age verification, the bill would likely force many companies to simply eliminate sexual content instead of carrying the huge risk that an underage user will access it. But even a company that wanted to eliminate prohibited sexual content would face significant obstacles in doing so if they, like much of the internet, host user-generated content. It is difficult to detect and define the prohibited sexual content, and even more difficult when the bill recognizes that the law is not violated if such material “has a legitimate purpose related to science, medicine, education or the arts.” There is no automated tool that can make such distinctions; the inevitable result is that protected materials will be removed out of an abundance of caution. And history teaches us that the results are often sexist, misogynist, racist, LGBT-phobic, ableist, and so on. It is a feature, not a bug, that there is no one-size-fits-all way to neatly define what is and isn’t sexual content.

Ultimately, Canadian Senate Bill S-203 is another in a long line of morally patronizing legislation that doesn’t understand how the internet works. Even if there were a way to keep minors away from sexual content, there is no way to do it without vast collateral damage. Sen. Julie Miville-Dechêne, who introduced the bill, stated, “it makes no sense that the commercial porn platforms don’t verify age. I think it’s time to legislate.” We gently recommend that next time her first thought be to consult with experts.

Daly Barnett

EFF and ACLU Ask Supreme Court to Review Case Against Warrantless Searches of International Travelers’ Phones and Laptops

1 month 2 weeks ago
Border Officers Accessing Massive Amounts of Information from Electronic Devices

Washington, D.C.—The Electronic Frontier Foundation (EFF), the American Civil Liberties Union, and the ACLU of Massachusetts today filed a petition for a writ of certiorari, asking the Supreme Court to hear a challenge to the Department of Homeland Security’s policy and practice of warrantless and suspicionless searches of travelers’ electronic devices at U.S. airports and other ports of entry.

The lawsuit, Merchant v. Mayorkas, was filed in September 2017 on behalf of several travelers whose cell phones, laptops, and other electronic devices were searched without warrants at the U.S. border. In November 2019, a federal district court in Boston ruled that border agencies’ policies on electronic device searches violate the Fourth Amendment, and required border officers to have reasonable suspicion of digital contraband before they can search a traveler’s device. A three-judge panel at the First Circuit reversed this decision in February 2021.

“Border officers every day make an end-run around the Constitution by searching travelers’ electronic devices without a warrant or any suspicion of wrongdoing,” said EFF Senior Staff Attorney Sophia Cope. “The U.S. government has granted itself unfettered authority to rummage through our digital lives just because we travel internationally. This egregious violation of privacy happens with no justification under constitutional law and no demonstrable benefit. The Supreme Court must put a stop to it.”

“This case raises pressing questions about the Fourth Amendment’s protections in the digital age,” said Esha Bhandari, deputy director of the ACLU’s Speech, Privacy, and Technology Project. “When border officers search our phones and laptops, they can access massive amounts of sensitive personal information, such as private photographs, health information, and communications with partners, family, and friends—including discussions between lawyers and their clients, and between journalists and their sources. We are asking the Supreme Court to ensure that we don’t lose our privacy rights when we travel.”

Every year, a growing number of international travelers are subject to warrantless and suspicionless searches of their personal electronic devices at the U.S. border. These searches are often conducted for reasons that have nothing to do with stopping the importation of contraband or determining a traveler’s admissibility. Border officers claim the authority to search devices for a host of reasons, including enforcement of tax, financial, consumer protection, and environmental laws—all without suspicion of wrongdoing. Border officers also search travelers’ devices if they are interested in information about someone other than the traveler—like a business partner, family member, or a journalist’s source.

The petitioners in this case—all U.S. citizens—include a military veteran, journalists, an artist, a NASA engineer, and a business owner. Several are Muslims and people of color, and none were accused of any wrongdoing in connection with their device searches.

“It’s been frustrating to be subjected to this power-grab by the government,” said Diane Zorri, a college professor, former U.S. Air Force captain, and a plaintiff in the case. “My devices are mine, and the government should need a good reason before rifling through my phone and my computer. I’m proud to be part of this case to help protect travelers’ rights.”

The certiorari petition asks the Supreme Court to overturn the First Circuit’s decision and hold that the Fourth Amendment requires border officers to obtain a warrant based on probable cause before searching electronic devices, or at the least have reasonable suspicion that the device contains digital contraband.

For more information about Merchant v. Mayorkas go to:
https://www.eff.org/cases/alasaad-v-duke
https://www.aclu.org/cases/alasaad-v-wolf-challenge-warrantless-phone-and-laptop-searches-us-border

For the full petition for writ of certiorari:

https://www.eff.org/document/petition-writ-certiorari-3

Contact:
Rebecca Jeschke, Media Relations Director and Digital Rights Analyst, press@eff.org
Kate Lagreca, ACLU of Massachusetts, klagreca@aclum.org
Aaron Madrid Aksoz, ACLU National, media@aclu.org
Rebecca Jeschke

Tell Congress: Federal Money Shouldn’t Be Spent On Breaking Encryption

1 month 2 weeks ago

We don’t need government minders in our private conversations. That’s because private conversations, whether they happen offline or online, aren’t a public safety menace. They’re not an invitation to criminality, or terrorism, or a threat to children, no matter how many times those tired old lines get repeated. 

TAKE ACTION

TELL CONGRESS: DON’T SPEND TAX MONEY TO BREAK ENCRYPTION

Unfortunately, federal law enforcement officials have not stopped asking for backdoor access to Americans’ encrypted messages. FBI Director Christopher Wray did it again just last month, falsely claiming that end-to-end encryption and “user-only access” have “negligible security advantages” but have a “negative effect on law enforcement’s ability to protect the public.”

This year, there’s something we can do about it. Rep. Tom Malinowski (D-NJ) and Rep. Peter Meijer (R-MI) have put forward language that would ban federal money from being used to weaken security standards or introduce vulnerabilities into software or hardware.

Last year, the House of Representatives inserted an amendment in the Defense Appropriations bill that prohibits the use of funds to insert security backdoors. That provision targeted the NSA. This year’s proposal will cover a much broader range of federal agencies. It also includes language that would prevent the government from engaging in schemes like client-side scanning or a “ghost” proposal, which would undermine encryption without technically decrypting data.

Secure and private communications are the backbone of democracy and free speech around the world. If U.S. law enforcement is able to compel private companies to break encryption, criminals and authoritarian governments will be eager to use the same loopholes. There are no magic bullets, and no backdoors that will only get opened by the “good guys.”

It’s important that as many members of Congress as possible sign on as supporters of this proposal. We need to send a strong signal to federal law enforcement that they should, once and for all, stop insisting they should scan all of our messages. To get there, we need your help.

TAKE ACTION

TELL CONGRESS: DON’T SPEND TAX MONEY TO BREAK ENCRYPTION

Joe Mullin

Data Driven 2: California Dragnet—New Data Set Shows Scale of Vehicle Surveillance in the Golden State

1 month 2 weeks ago

This project is based on data processed by student journalist Olivia Ali, 2020 intern JJ Mazzucotelli, and research assistant Liam Harton, based on California Public Records Act requests filed by EFF and dozens of students at the University of Nevada, Reno Reynolds School of Journalism. 

Tiburon, California: a 13-square-mile peninsula town in Marin County, known for its glorious views of the San Francisco Bay and its eclectic retail district. 

What the town's tourism bureau may not want you to know: from the moment you drive into the city limits, your vehicle will be under extreme surveillance. The Tiburon Police Department has the dubious distinction of collecting, mile-for-mile, more data on drivers than any other agency surveyed for a new EFF data set. 

Today, EFF is releasing Data Driven 2: California Dragnet, a new public records collection and data set that shines light on the massive amount of vehicle surveillance conducted by police in California using automated license plate readers (ALPRs)—and how very little of this surveillance is actually relevant to an active public safety interest. 

Download the Data Driven 2: California Dragnet data set.

In 2019 alone, just 82 agencies collected more than 1 billion license plate scans using ALPRs. Yet, 99.9% of this surveillance data was not actively related to an investigation when it was collected. Nevertheless, law enforcement agencies stockpile this data, often for years, and often share the data with hundreds of agencies around the country.  

This means that law enforcement agencies have built massive databases that document our travel patterns, regardless of whether we're under suspicion. With a few keystrokes, a police officer can generate a list of places a vehicle has been seen, with few safeguards and little oversight.  

EFF's dataset also shows for the first time how some jurisdictions—such as Tiburon and Sausalito in Northern California, and Beverly Hills and Laguna Beach in Southern California—are scanning drivers at a rate far higher than the statewide mean. In each of those cities, an average vehicle will be scanned by ALPRs every few miles it drives.

Tiburon first installed Vigilant Solutions ALPRs at the town's entrance and exit points and downtown about a decade ago. Today, with just six cameras, the program has evolved into a massive surveillance operation: on average, a vehicle will be scanned by police once for every 1.85 miles it drives.

Tiburon Police stockpile about 7.7 million license plate scans annually, and yet only 0.01%, or 1 in 10,000, of those records were related to a crime or other public safety interest when they were collected. The data is retained for a year.

ALPRs are a form of location surveillance: the data they collect can reveal our travel patterns and daily routines, the places we visit, and the people with whom we associate. In addition to the civil liberties threat, these data systems also create great security risks, with multiple known breaches of ALPR data and technology occurring over the last few years. 

EFF sought comment from Tiburon Police Chief Ryan Monaghan, who defended the program via email. "Since the deployment of the ALPRs, our crime data from the five years prior to having the ALPRs as well as the five years after and beyond have shown marked reductions in stolen vehicles and thefts from vehicles and an increase in the recovery of stolen vehicles," he wrote.  

EFF’s public records data set, which builds on a 2016-2017 survey (Data Driven 1), aims to provide journalists, policymakers, researchers, and local residents with data to independently evaluate and understand the state of California's ALPR dragnet. 

What Are Automated License Plate Readers?

A fixed ALPR and a mobile ALPR. Credit: Mike Katz-Lacabe (CC BY)

ALPRs are cameras that snap photographs of license plates and then upload the plate numbers, time/date, and GPS coordinates to a searchable database. This allows police to identify and track vehicles in real time, search the historical travel patterns of any vehicle, and identify vehicles that have been spotted near certain locations.

Cops attach these cameras to fixed locations, like highway overpasses or traffic lights. Law enforcement agencies also install ALPRs on patrol vehicles, allowing police to capture data on whole neighborhoods by driving block-by-block, a tactic known as "gridding." In 2020, the California State Auditor issued a report that found that agencies were collecting large amounts of data without following state law and without addressing some of the most basic cybersecurity and civil liberties concerns when it comes to Californians' data. 

About the Data Set

The Data Driven 2: California Dragnet data set is downloadable as an Excel (.xlsx) file, with the data analysis broken into various tabs. We have also presented selections from the data as a table below. 

The dataset is based on dozens of California Public Records Act Requests filed by EFF and students at the Reynolds School of Journalism at the University of Nevada, Reno in collaboration with MuckRock News. Data Driven 2 is a sequel to the EFF and MuckRock's 2018 Data Driven report. 

To create the data set, we filed more than 100 requests for information under the California Public Records Act. We sought the following records from each agency: 

  1. The number of license plate scans captured by ALPRs per year in 2018 and 2019. These are also called "detections." 
  2. The number of scanned license plates each year for 2018 and 2019 that matched a "hot list," the term of art for a list of vehicles of interest. These matches are called "hits." 
  3. The list of other agencies that the law enforcement agency is exchanging data with, including both ALPR scans and hot lists. 

Most of these public records requests were filed in 2020. For a limited number of requests filed in 2021, we also requested detection and hit information for 2020. The agencies were selected because they had previously provided records for the original report or were known to use an ALPR system that could export the type of aggregate data required for this analysis. Not all agencies provided records in response to our requests. 

The spreadsheet includes links to public records for each agency, along with a table of their statistics. In addition, we have included "Daily Vehicle Miles Travelled" from the California Department of Transportation for help in comparing jurisdictions. 

The dataset covers 89 agencies from all corners of the state. However, the data was not always presented by the agencies in a uniform manner. Only 63 agencies provided comprehensive and separated data for both 2018 and 2019. Other agencies either produced data for incomparable time periods or provided incomplete or clearly erroneous data. In some cases, agencies did not have ALPRs for the full period, having either started or ended their programs mid-year.  (Note: More than 250 agencies use ALPR in California).

In general, our analysis below only includes agencies that provided both 2018 and 2019 data, which we then averaged together. However, we are including all the data we received in the spreadsheet.

Hit Ratio: Most ALPR Data Involves Vehicles Not Under Active Suspicion 

One way to examine ALPR data is to ask whether the data collected is relevant to an active investigation or other public safety interest at the time it is collected. 

Law enforcement agencies create "hot lists" of license plates, essentially lists of vehicles they are actively looking for, for example, because they're stolen, are suspected of being connected to a crime, or belong to an individual under state supervision, such as a sex offender. When an ALPR scans a license plate that matches a hot list, the system issues an alert to the law enforcement agency that the vehicle was sighted. 

Data that is not on a hot list is still stored, often for more than a year depending on the agency's policy, despite a lack of relevance to an active public safety interest. Police have argued they need this data in case one day you commit a crime, at which point they can look back at your historical travel patterns. EFF and other privacy organizations argue that this is a fundamental violation of the privacy of millions of innocent drivers, as well as an enormous cybersecurity risk.

The 63 agencies that provided us 2018-2019 data collected a combined average of 840,000,000 plate images each year. However, only 0.05% of the data matched a hot list.
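The hit-ratio arithmetic above can be sketched in a few lines of Python. The figures come from the text; the function and variable names are ours:

```python
# Hit ratio: share of ALPR scans that matched a hot list at collection time.
def hit_ratio_percent(total_scans: int, hot_list_hits: int) -> float:
    return 100 * hot_list_hits / total_scans

# Figures reported by the 63 agencies with full 2018-2019 data:
annual_scans = 840_000_000           # average combined scans per year
hits = round(annual_scans * 0.0005)  # 0.05% matched a hot list -> 420,000

print(f"{hit_ratio_percent(annual_scans, hits):.2f}%")        # 0.05%
print(f"{annual_scans - hits:,} scans had no hot-list match")  # 839,580,000
```

Put another way: of every 10,000 scans in this pool, roughly 9,995 captured a vehicle under no active suspicion at all.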

Some agencies only provided us data for three months or one year. Other agencies lumped all the data together. While we have left them out of this analysis, their hit ratios closely followed what we saw in the agencies that provided us consistent data. 

The top 15 data-collecting law enforcement agencies accounted for 1.4 billion license plate scans over two years. On average, those 15 law enforcement agencies reported that only 0.05% of the data was on a hot list.

Agency/Records Link

2018-2019 License Plate Scans

Hit Ratio (Percentage on a Hot List)

San Bernardino County Sheriff's Office

439,272,149

0.05%

Carlsbad Police Department

161,862,285

0.02%

Sacramento Police Department

142,170,129

0.08%

Torrance Police Department

132,904,262

0.04%

Chino Police Department

83,164,449

0.05%

Beverly Hills Police Department

67,520,532

0.03%

Fontana Police Department

66,255,835

0.06%

Contra Costa County Sheriff's Office

65,632,313

0.11%

Claremont Police Department (Link 2)

45,253,735

0.04%

Long Beach Police Department

44,719,586

0.09%

Livermore Police Department

39,430,629

0.04%

Laguna Beach Police Department

37,859,124

0.04%

Pleasant Hill Police Department

27,293,610

0.03%

Merced Police Department

25,895,158

0.04%

Brentwood Police Department

25,440,363

0.07%

In his written response to our finding that Tiburon Police store data for a year, even though 99.9% of the data was not tied to an alert, Chief Monaghan wrote: "Our retention cycle for the scan data is in line with industry-wide standards for ALPRs and does not contain any personal identifying information. Like other agencies that deploy ALPRs, we retain the information to use for investigative purposes only as many crimes are reported after the fact and license plate data captured by the ALPRs can be used as an investigative tool after the fact." 

Monaghan presents a few misconceptions worth dispelling. First, while many agencies do store data for one year or more, there is no industry standard for ALPR retention. For example, Flock Safety, a vendor that provides ALPRs to many California agencies, deletes data after 30 days. The California Highway Patrol is only allowed to hang onto data for 60 days. According to the National Conference of State Legislatures, Maine has a 21-day retention period and Arkansas has a 150-day retention period. In New Hampshire, the law requires deletion after three minutes if the data is not connected to a crime.

Finally, there is a certain irony when law enforcement claims that ALPR data is not personally identifying information, when one of the primary purposes of the data is to assist in identifying suspects. In fact, California's data breach laws explicitly name ALPR as a form of personal information when it is combined with a person's name. It is very easy for law enforcement to connect ALPR data to other data sets, such as a vehicle registration database, to determine the identity of the owner of the vehicle. In addition, ALPR systems also store photographs, which can potentially capture images of drivers' faces.

Indeed, Tiburon's own ALPR policy says that raw ALPR data cannot be released publicly because it may contain confidential information. This is consistent with a California Supreme Court decision that found that the Los Angeles Police Department and Los Angeles County Sheriff's Department could not release unredacted ALPR data in response to a CPRA request because "the act of revealing the data would itself jeopardize the privacy of everyone associated with a scanned plate. Given that real parties each conduct more than one million scans per week, this threat to privacy is significant." The Supreme Court agreed with a lower court ruling that "ALPR data showing where a person was at a certain time could potentially reveal where that person lives, works, or frequently visits. ALPR data could also be used to identify people whom the police frequently encounter, such as witnesses or suspects under investigation."

Scans Per Mile: Comparing the Rate of ALPR Surveillance 

While the agencies listed in the previous section are each collecting a massive amount of data, it can be difficult to see how law enforcement agencies' practices compare to one another. Obviously, some cities are bigger than others, so we sought a way to measure the proportionality of the ALPR data collected.

One way to do this is to compare the number of license plate scans to the size of the jurisdiction's population. However, this method may not provide the clearest picture, since many commuters, tourists, and rideshare drivers cross city lines many times a day. In addition, the larger and denser a population is, the fewer people may own vehicles, with more relying on public transit instead. So we ruled out that method.

Another method is to compare the license plate scans to the number of vehicles registered or owned in a city. That runs into a similar problem: people don't always work in the same city where their car is registered, particularly in the San Francisco Bay Area and Los Angeles County. 

So, we looked for some metric that would allow us to compare the number of license plate scans to how much road traffic there is in a city. 

Fortunately, for each city and county in the state, the California Department of Transportation compiles annual data on "Vehicle Miles Traveled" (VMT), a number representing the average number of miles driven by vehicles on roads in the jurisdiction each day. VMT is becoming a standard metric in urban and transportation planning.

By comparing license plate scans to VMT we can begin to address the question: Are some cities disproportionately collecting more data than others? The answer is yes: Many cities are collecting data at a far higher rate than others.

There are a few different ways to interpret the rate. For example, in Tiburon, police are collecting on average one license plate scan for every 1.85 miles driven. That means that the average vehicle will be captured every 1.85 miles. To put that in perspective, a driver from outside of town who commutes to and from downtown Tiburon (about four miles each way to the town limits), five days a week, 52 weeks a year, should expect their license plate to be scanned on average 1,124 times annually.  

Another way to look at it is that for every 100 cars that drive one mile, Tiburon ALPRs on average will scan 54 license plates. 
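The commuter arithmetic above can be reproduced directly. A minimal sketch: the 1.85 miles-per-scan rate and the four-mile commute are the figures from this report; everything else is plain arithmetic.

```python
# Expected ALPR scans for a hypothetical Tiburon commuter, using the
# report's figure of one scan per 1.85 miles driven.
MILES_PER_SCAN = 1.85

def expected_annual_scans(miles_each_way, days_per_week=5, weeks_per_year=52):
    """Average number of scans per year for a round-trip commute."""
    annual_miles = miles_each_way * 2 * days_per_week * weeks_per_year
    return annual_miles / MILES_PER_SCAN

# ~4 miles each way to the town limits, five days a week, 52 weeks a year
print(round(expected_annual_scans(4)))   # about 1,124 scans per year

# Equivalently: scans per 100 vehicle miles traveled
print(round(100 / MILES_PER_SCAN))       # about 54
```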

Via email, Tiburon Chief Monaghan responded: "In terms of the number of scans compared to VMT, our ALPRS are strategically placed on two main arterial roadways leading in and out of Tiburon, Belvedere, and incorporated sections of Tiburon. As Tiburon is located on a peninsula, there are limited roadways in and out. The roadways where the ALPRs are deployed are not only used by those who live in the area, but also by commuters who use the ferries that operate out of Tiburon, those who work in Tiburon and Belvedere, parents taking their kids to schools in the area, and those visiting."

Tiburon isn't the only agency collecting data at a high rate. 

  • In Sausalito, on average police are capturing a plate for every 2.14 miles a car drives. That's the equivalent of 46 scans per 100 vehicles that drive one mile. 
  • In Laguna Beach, it's one ALPR scan for every three miles. On average, ALPRs will scan 33 plates for every 100 cars that drive a single mile. 
  • In Beverly Hills, it's one ALPR scan for every 4.63 miles, or 21 scans per 100 cars that drive one mile. 

In comparison: Across 60 cities, police collectively scanned on average one plate for every 48 miles driven by vehicles. Tiburon scanned license plates at more than 25 times that rate. 

For this analysis, we only included municipal police that provided data for both 2018 and 2019. We then used those figures to find an average number of daily plates scanned, which we then compared to the cities' average daily VMTs for 2018-2019. 
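The methodology can be sketched in a few lines. Tiburon's combined 2018-2019 detection total appears in this report's detection tables; the average daily VMT figure below is illustrative, not Caltrans' actual number, and is chosen only to show how the ratio is derived.

```python
# Sketch of the miles-per-scan methodology described above.
def miles_per_scan(total_detections, days, avg_daily_vmt):
    """Average vehicle miles traveled per license plate scan."""
    avg_daily_scans = total_detections / days
    return avg_daily_vmt / avg_daily_scans

tiburon = miles_per_scan(
    total_detections=15_424_890,  # 2018 + 2019 combined scans (from this report)
    days=730,                     # two years of data
    avg_daily_vmt=39_100,         # assumed average daily VMT, for illustration
)
print(round(tiburon, 2))          # ~1.85 miles per scan
```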

Here are the top municipal police departments, ranked by average miles driven per scan. 

Agency/Records Link | Average Number of Vehicle Miles Traveled Per Scan (2018-2019) | Average Number of Scans Per 100 Vehicle Miles Traveled (2018-2019)
Tiburon Police Department (Link 2) | 1.85 miles | 54.11 scans
Sausalito Police Department (Link 2) | 2.14 miles | 46.65 scans
Laguna Beach Police Department | 3.02 miles | 33.12 scans
Beverly Hills Police Department | 4.63 miles | 21.60 scans
Claremont Police Department (Link 2) | 5.99 miles | 16.69 scans
La Verne Police Department | 7.91 miles | 12.64 scans
Carlsbad Police Department | 8.19 miles | 12.21 scans
Chino Police Department | 8.46 miles | 11.83 scans
Torrance Police Department | 9.74 miles | 10.27 scans
Clayton Police Department | 10.22 miles | 9.79 scans
Pleasant Hill Police Department | 10.93 miles | 9.15 scans
Oakley Police Department | 10.97 miles | 9.12 scans
Brentwood Police Department | 11.42 miles | 8.76 scans
Martinez Police Department | 13.59 miles | 7.36 scans

A few caveats about this analysis: 

We must emphasize the term “average.” Road traffic is not even across every street, nor are ALPRs distributed evenly across a city or county. A driver who only drives a half mile along backroads each day may never be scanned. On the other hand, if a city installs ALPRs at every entrance and exit to town, as Tiburon has, a driver who commutes to and from the city every day would likely be scanned at a much higher rate. 

In addition, many police departments attach ALPRs to their patrol cars. This means they are capturing data on parked cars they pass. Your risk of being scanned by an ALPR does not increase linearly with driving—someone who leaves their car parked all year may still be scanned several times. In many jurisdictions, both the city police and the county sheriff use ALPRs; our data analysis does not cover overlapping data collection. Our intention is not to help drivers determine exactly how often they’ve been scanned but to compare the volume of data collection across municipalities of different sizes. 

Finally, VMT is not an exact measurement but rather an estimate that is based on measuring roadway traffic on major arteries and projecting across the total number of miles of road in the city. 

As such, this ratio presents a useful metric to gauge proportionality broadly and should be interpreted as an estimate or a projection. 

What to Do With This Data

The Data Driven 2 dataset is designed to give a bird’s eye view of the size and scope of data collection through ALPRs. 

However, it does not dive into more granular variations in ALPR programs between agencies, such as the number or type of ALPR cameras used by each agency. For example, some agencies might use 50 stationary cameras, while others may use three mobile cameras on patrol cars and still others use a combination of both. Some agencies may distribute data collection evenly across a city, and others may target particular neighborhoods. 

In many cases, agencies provided us with "Data Sharing Reports" that list everyone with whom they are sharing data. This can be useful for ascertaining whether agencies are sharing data broadly with agencies outside of California or even with federal agencies, such as immigration enforcement. Please note that the data sharing list does change from day-to-day as agencies join and leave information exchange networks. 

Journalists and researchers can and should use our dataset as a jumping-off point to probe deeper.  By filing public records requests or posing questions directly to police leaders, we can find out how agencies are deploying ALPRs and how that impacts the amount of data collected, the usefulness of the data, and the proportionality of the data. Under a California Supreme Court ruling, requesters may also be able to obtain anonymized data on where ALPR data was collected.

Conclusion 

Law enforcement often tries to argue that using ALPR technology to scan license plates is no different than a police officer on a stakeout outside a suspected criminal enterprise who writes down the license plates of every car that pulls up. 

But let's say you lived in a place like Brentwood, where police collect on average 25 million license plate scans a year.

Let's assume that a police observer is able to snap a photo and scribble down the plate number, make, model, and color of a vehicle once every minute (which is still pretty superhuman).  Brentwood would have to add 200 full-time employees to collect as much data manually as they can with their ALPRs.  By comparison: The Brentwood Police Department currently has 71 officers, 36 civilian support workers, and 20 volunteers. The entire city of Brentwood has only 283 employees. 
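The staffing comparison above works out as follows. A minimal sketch: the one-scan-per-minute rate and the 25 million annual scans are the figures used in the text; the 40-hour work week is an assumption.

```python
# How many full-time human observers would it take to match ALPR volume?
SCANS_PER_MINUTE = 1            # one plate logged per minute (generous)
HOURS_PER_WEEK = 40             # assumed full-time schedule
WEEKS_PER_YEAR = 52

scans_per_employee = SCANS_PER_MINUTE * 60 * HOURS_PER_WEEK * WEEKS_PER_YEAR
annual_alpr_scans = 25_000_000  # Brentwood's approximate annual volume

employees_needed = annual_alpr_scans / scans_per_employee
print(round(employees_needed))  # about 200 full-time employees
```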

If 200 human data recorders were positioned throughout a city, spying on civilians, it's unlikely people would stand for it. This report illustrates that this level of surveillance is indeed occurring in many cities across California. Just because the cameras are more subtle doesn't make them any less creepy or authoritarian.  

2018-2019 Detections with Hit Ratio  

Agency/Records Link | 2018-2019 Detections Combined | Hit Ratio (Percentage on a Hot List)
Antioch Police Department | 22,524,415 | 0.10%
Auburn Police Department | 3,190,715 | 0.03%
Bakersfield Police Department | 370,635 | 0.11%
Bell Gardens Police Department | 9,476,932 | 0.02%
Belvedere Police Department | 4,089,986 | 0.01%
Beverly Hills Police Department | 67,520,532 | 0.03%
Brawley Police Department | 870,011 | 0.03%
Brentwood Police Department | 25,440,363 | 0.07%
Buena Park Police Department | 854,156 | 0.04%
Carlsbad Police Department | 161,862,285 | 0.02%
Cathedral City Police Department | 104,083 | 0.06%
Chino Police Department | 83,164,449 | 0.05%
Chula Vista Police Department | 672,599 | 0.03%
Citrus Heights Police Department | 18,804,058 | 0.04%
Claremont Police Department (Link 2) | 45,253,735 | 0.04%
Clayton Police Department | 9,485,976 | 0.02%
Contra Costa County Sheriff's Office | 65,632,313 | 0.11%
Cypress Police Department | 288,270 | 0.04%
Emeryville Police Department | 1,579,100 | 0.05%
Fairfield Police Department | 785,560 | 0.06%
Folsom Police Department | 14,624,819 | 0.03%
Fontana Police Department | 66,255,835 | 0.06%
Fresno Police Department | 3,673,958 | 0.15%
Fullerton Police Department | 742,996 | 0.05%
Galt Police Department | 23,478 | 0.02%
Garden Grove Police Department | 332,373 | 0.24%
Gardena Police Department | 5,762,032 | 0.05%
Imperial Police Department | 23,294,978 | 0.03%
Irvine Police Department | 651,578 | 0.05%
La Habra Police Department | 888,136 | 0.05%
La Mesa Police Department | 1,437,309 | 0.05%
La Verne Police Department | 24,194,256 | 0.03%
Laguna Beach Police Department | 37,859,124 | 0.04%
Livermore Police Department | 39,430,629 | 0.04%
Lodi Police Department | 3,075,433 | 0.05%
Long Beach Police Department | 44,719,586 | 0.09%
Marin County Sheriff's Office | 1,547,154 | 0.04%
Merced Police Department | 25,895,158 | 0.04%
Mill Valley Police Department | 529,157 | 0.12%
Monterey Park Police Department | 2,285,029 | 0.04%
Newport Beach Police Department (Link 2) | 772,990 | 0.04%
Orange County Sheriff's Office | 2,575,993 | 0.09%
Palos Verdes Estates Police | 16,808,440 | 0.03%
Pasadena Police Department | 3,256,725 | 0.03%
Pleasant Hill Police Department | 27,293,610 | 0.03%
Pomona Police Department | 11,424,065 | 0.10%
Redondo Beach Police Department (Link 2) | 18,436,371 | 0.04%
Sacramento Police Department | 142,170,129 | 0.08%
San Bernardino County Sheriff's Office | 439,272,149 | 0.05%
San Diego County Sheriff's Office | 13,542,616 | 0.04%
San Diego Police Department | 138,146 | 0.07%
San Mateo County Sheriff's Office (Link 2) | 4,663,684 | 0.02%
Sausalito Police Department (Link 2) | 15,387,157 | 0.02%
Simi Valley Police Department | 480,554 | 0.11%
Stanislaus County Sheriff's Office | 6,745,542 | 0.07%
Stockton Police Department | 1,021,433 | 0.09%
Tiburon Police Department (Link 2) | 15,424,890 | 0.01%
Torrance Police Department | 132,904,262 | 0.04%
Tracy Police Department | 1,006,393 | 0.06%
Tustin Police Department (Link 2) | 1,030,106 | 0.04%
West Sacramento Police Department | 2,337,027 | 0.05%
Westminster Police Department | 1,271,147 | 0.05%
Yolo County Sheriff's Office | 3,049,884 | 0.02%
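A hit ratio is simply hot-list hits divided by total detections. A minimal sketch: Antioch's detection count is from the table above, but the raw hit count here is hypothetical, chosen only to illustrate a ratio near its reported 0.10%.

```python
def hit_ratio(hits, detections):
    """Percentage of plate detections that matched a hot list."""
    return 100 * hits / detections

# A 0.10% ratio on 22,524,415 detections implies roughly 22,500 hits
# (the exact hit count was not part of this table).
ratio = hit_ratio(hits=22_500, detections=22_524_415)
print(f"{ratio:.2f}%")   # ~0.10%
```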

Irregular Agencies

These agencies responded to our records requests, but did not provide complete, reliable, or directly comparable information. 

Agency/Records Link | Detection Years Used in Hit Ratio | Detections | Hit Ratio (Percentage on a Hot List)
American Canyon Police Department | 2018 | 394,827 | 0.18%
Beaumont Police Department | 2019 | 83,141 | 0.10%
Bell Police Department | 2018 | 806,327 | Data Not Available
Burbank Police Department | 2020 | 364,394 | 0.05%
Coronado Police Department | 2019 | 616,573 | 0.07%
CSU Fullerton Police Department | 2019 | 127,269 | 0.05%
Desert Hot Springs Police Department | Data Not Available | Data Not Available | Data Not Available
El Segundo Police Department | 2020 | 24,797,764 | 0.07%
Fountain Valley Police Department | 2018-2020 | 780,940 | Data Not Available
Glendale Police Department | 2019 | 119,356 | 0.04%
Hemet Police Department | 2019 | 84,087 | 0.10%
Hermosa Beach Police Department | 2019 | 274,577 | 0.51%
Martinez Police Department | 2019 | 12,990,796 | Data Not Available
Modesto Police Department | 2019 | 10,262,235 | 0.06%
Oakley Police Department | 2019 | 8,057,003 | Data Not Available
Ontario Police Department | Jan 19 - Feb 17, 2021 | 2,957,671 | 0.07%
Orange Police Department | 2018 | 387,592 | 0.06%
Palm Springs Police Department | 2019 | 58,482 | 0.29%
Redlands Police Department | 2019 | 4,027,149 | 0.08%
Ripon Police Department | 2019 | 2,623,741 | 0.05%
Roseville Police Department | 2019 | 3,733,042 | 0.03%
San Joaquin County Sheriff's Office | 2018-2020 | 155,105 | 0.04%
San Jose Police Department | 2020 | 1,686,836 | 0.09%
Seal Beach Police Department | 2018 | 38,247 | 0.49%
Woodland Police Department | 2019 | 1,382,297 | 0.05%

Dave Maass

No Digital Vaccine Bouncers

1 month 2 weeks ago

The U.S. is distributing more vaccines, and the population is gradually becoming vaccinated. Returning to normal activity and movement has become the main focus for many Americans who want to travel or see family.

An increasingly common proposal to get there is digital proof-of-vaccination, sometimes called “Vaccine Passports.” On the surface, this may seem like a reasonable solution. But to “return to normal”, we also have to consider that inequity and problems with access are a part of that normal. Also, these proposals require a new infrastructure and culture of doorkeepers to public places regularly requiring visitors to display a token as a condition of entry. This would be a giant step towards pervasive tracking of our day-to-day movements. And these systems would create new ways for corporations to monetize our data and for thieves to steal our data.

That’s why EFF opposes new systems of digital proof-of-vaccination as a condition of going about our day-to-day lives. They’re not “vaccine passports” that will speed our way back to normal. They’re “vaccine bouncers” that will unnecessarily scrutinize us at doorways and unfairly turn many of us away.

What Are Vaccine Bouncers?

So-called “vaccine passports” are digital credentials pitched as a convenient, accessible way to store and present your medical data: in this case, proof that you have been vaccinated. These are not actual passports for international travel, nor are they directly related to the systems we already have for proving vaccination. Though the details of the proposals vary, all of them would display medical data in a way that is not typical for our society as a whole.

These schemes require the creation of a vast new electronic gatekeeping system. People will need to download a token to their phone, or in some cases may print that token and carry it with them. Public places will need to acquire devices that can read these tokens. To enter public places, people will need to display their token to a doorkeeper. Many people will be bounced away at the door, because they are not vaccinated, or they left their phone at home, or the system is malfunctioning. This new infrastructure and culture will be difficult to dismantle when we reach herd immunity.

We already have vaccination documents we need to obtain for international travel to certain countries. But even the World Health Organization (W.H.O.), the entity that issues Yellow Cards to determine if one has had a Yellow Fever vaccine, has come out against vaccine passports.

Requiring people to present their medical data to go to the grocery store, access public services, and other vital activities calls into question who will be ultimately barred from coming in. A large number of people not only in the U.S., but worldwide, do not have access to any COVID vaccines. Many others do not have access to mobile phones, or even to the printers required to create the paper QR code that is sometimes suggested as the supposed work-around.

Also, many of these solutions will be built by private companies offering smartphone applications. That means they will give rise to new databases of information not protected by any privacy law, with data transmitted on a daily basis, far more frequently than submitting a one-time paper proof-of-vaccination to a school. Since we have no adequate federal data privacy law, we are relying on the pinky-promises of private companies to keep our data private and secure.

We’ve already seen mission creep with digital bouncer systems. Years ago, some bars deployed devices that scanned patrons’ identification as a condition of entry. The rationale was to quickly ascertain, and then forget, a narrow fact about patrons: whether they are old enough to buy alcohol, and thus enter the premises. Then these devices started to also collect information from patrons, which bars share with each other. Thus, we are not comforted when we hear people today say: “don’t worry, digital vaccine bouncers will only check whether a person was vaccinated, and will not also collect information about them.” Once the infrastructure is built, it requires just a few lines of code to turn digital bouncers into digital panopticons.

Temporary Measures with Long Term Consequences

When we get to an approximation of normal, what is the plan for vaccine passports? Most proposals are not clear on this point. What will become of that medical data? Will there be a push for making this a permanent part of life?

As with any massive new technological system, it will take significant time and great effort to make the system work. We’ve already seen how easy it is to evade New York’s new vaccine bouncer system, and how other digital COVID systems, due to their flaws, fail to advance public health. Even with the best efforts, by the time the bugs are worked out of a new digital vaccine system for COVID, it may not be helpful to combat the pandemic. There’s no need to rush into building a system that will only provide value to the companies that profit by building it.

Instead, our scarce resources should go to getting more people vaccinated. We are all in this together, so we should be opening up avenues of access for everyone to a better future in this pandemic. We should not be creating more issues, concerns, and barriers with experimental technology that needs to be worked out during one of the most devastating modern global crises of our time.

Alexis Hancock

EFF Sues Proctorio on Behalf of Student It Falsely Accused of Copyright Infringement to Get Critical Tweets Taken Down

1 month 3 weeks ago
Links to Software Code Excerpts in Tweets Are Fair Use

Phoenix, Arizona—The Electronic Frontier Foundation (EFF) filed a lawsuit today against Proctorio Inc. on behalf of college student Erik Johnson, seeking a judgment that he didn’t infringe the company’s copyrights when he linked to excerpts of its software code in tweets criticizing the software maker.

Proctorio, a developer of exam administration and surveillance software, misused the copyright takedown provisions of the Digital Millennium Copyright Act (DMCA) to have Twitter remove posts by Johnson, a Miami University computer engineering undergraduate and security researcher. EFF and co-counsel Osborn Maledon said in a complaint filed today in U.S. District Court, District of Arizona, that Johnson made fair use of excerpts of Proctorio’s software code, and the company’s false claims of infringement interfered with Johnson’s First Amendment right to criticize the company.

“Software companies don’t get to abuse copyright law to undermine their critics,” said EFF Staff Attorney Cara Gagliano. “Using pieces of code to explain your research or support critical commentary is no different from quoting a book in a book review.”

Proctoring apps like Proctorio’s are privacy-invasive software that “watches” students through eye-tracking and face detection for supposed signs of cheating as they take tests or complete schoolwork. The use of these “disciplinary technology” programs has skyrocketed amid the pandemic, raising questions about the extent to which they threaten student privacy and disadvantage students without access to high-speed internet and quiet spaces.

Proctorio has responded to public criticism by attacking people who speak out. The company’s CEO released on Reddit contents of a student’s chat log captured by Proctorio after the student posted complaints about the software on the social network. The company has also sued a remote learning specialist in Canada for posting links to Proctorio’s publicly available YouTube videos in a series of tweets showing the software tracks “abnormal” eye and head movements it deems suspicious.

Concerned about how much private information Proctorio collects from students’ computers, Johnson, whose instructors have given tests using Proctorio, examined the company’s software, including the files that are downloaded to any computer where the software is installed.

He published a series of tweets in September critiquing Proctorio, linking in three of those tweets to short software code excerpts that demonstrate the extent of the software’s tracking and access to users’ computers. In another tweet, Johnson included a screenshot of a video illustrating how the software is able to create a 360-degree image of students’ rooms that is accessible to teachers and seemingly Proctorio’s agents.

“Copyright holders should be held liable when they falsely accuse their critics of copyright infringement, especially when the goal is plainly to intimidate and undermine them,” said Gagliano. “We’re asking the court for a declaratory judgment that there is no infringement to prevent further legal threats and takedown attempts against Johnson for using code excerpts and screenshots to support his comments.”

For the complaint:
https://www.eff.org/document/johnson-v-proctorio-complaint

For more on proctoring surveillance:
https://www.eff.org/deeplinks/2020/08/proctoring-apps-subject-students-unnecessary-surveillance

Contact: Cara Gagliano, Staff Attorney, cara@eff.org
Karen Gullo

Fighting FLoC and Fighting Monopoly Are Fully Compatible

1 month 3 weeks ago

Are tech giants really damned if they do and damned if they don’t (protect our privacy)?

That’s a damned good question that’s been occasioned by Google’s announcement that they’re killing the invasive third-party tracking cookie (yay!) and replacing it with FLoC, an alternative tracking scheme that will make it harder for everyone except Google to track you (uh, yay?). (You can find out if Google is FLoCing with you with our Am I FLoCed tool.)

Google’s move to kill the third-party cookie has been greeted with both cheers and derision. On the one hand, some people are happy to see the death of one of the internet’s most invasive technologies. We’re glad to see it go, too - but we’re pretty upset to see that it’s going to be replaced with a highly invasive alternative tracking technology (bad enough) that can eliminate the majority of Google’s competitors in the data-acquisition and ad-targeting sectors in a single stroke (worse). 

It’s no wonder that so many people have concluded that privacy and antitrust are on a collision course. Google says nuking the third-party cookie will help our privacy, specifically because it will remove so many of its (often more unethical) ad-tech competitors from the web. 

But privacy and competition are not in conflict.  As EFF’s recent white paper demonstrated, we can have Privacy Without Monopoly. In fact, we can’t settle for anything less.

FLoC is quite a power-move for Google. Faced with growing concerns about privacy, the company proposes to solve them by making itself the protector of our privacy, walling us off from third-party tracking except when Google does it. All the advertisers that rely on non-Google ad-targeting will have to move to Google, and pay for their services, using a marketplace that they’ve rigged in their favor.  To give credit where it is due, the move does mean that some bad actors in the digital ad space may be thwarted. But it’s a very cramped view of how online privacy should work. Google’s version of protecting our privacy is appointing itself the gatekeeper who decides when we’re spied on while skimming from advertisers with nowhere else to go. Compare that with Apple, which just shifted the default to “no” for all online surveillance by apps, period (go, Apple!).

And while here we think Apple is better than Google, that’s not how any of this should work. The truth is, despite occasional counter-examples, the tech giants can’t be relied on to step up to provide real privacy for users when it conflicts with their business models.  The baseline for privacy should be a matter of law and basic human rights, not just a matter of a corporate whim. America is long, long overdue for a federal privacy law with a private right of action. Users must be empowered to enforce privacy accountability, instead of relying on the largesse of the giants or on overstretched civil servants. 

Just because FLoC is billed as pro-privacy and also criticized as anti-competitive, it doesn’t mean that privacy and competition aren’t compatible.  To understand how that can be, first remember the reason to support competition: not for its own sake, but for what it can deliver to internet users. The benefit of well-thought-through competition is more control over our digital lives and better (not just more) choices.

Competition on its own is meaningless or even harmful: who wants companies to compete to see which one can trick or coerce you into surrendering your fundamental human rights, in the most grotesque and humiliating ways at the least benefit to you? To make competition work for users, start with Competitive Compatibility and interoperability - the ability to connect new services to existing ones, with or without permission from their operators, so long as you’re helping users exercise more choice over their online lives.  A competitive internet - one dominated by interoperable services - would be one where you didn’t have to choose between your social relationships and your privacy. When all your friends are on Facebook, hanging out with them online means subjecting yourself to Facebook’s totalizing, creepy, harmful surveillance. 

But if Facebook were forced to be interoperable, then rival services that didn’t spy on you could enter the market, and you could use those services to talk to your friends who are still on Facebook (for reasons beyond your understanding). Done poorly, interoperability could be worse for privacy; done well, it does not have to be. Interoperability is key to smashing monopoly power, and interoperability's benefits depend on strong laws protecting privacy.

With or without interoperability, we need a strong privacy law. Tech companies unilaterally deciding what user privacy means is dangerous, even when they come up with a good answer (Apple) but especially not when their answer comes packaged in a nakedly anticompetitive power-grab (Google). Of course, it doesn’t help that some of the world’s largest, most powerful corporations depend on this unilateral power, and use some of their tremendous profits to fight every attempt to create a strong national privacy law that empowers users to hold them accountable.

Competition and privacy reinforce each other in technical ways, too: lack of competition is the reason online tracking technologies all feed the same two companies’ data warehouses. These companies dominate logins, search, social media and the other areas that the people who build and maintain our digital tools need to succeed. A diverse and competitive online world is one with substantial technical hurdles to building the kinds of personal dossiers on users that today’s ad-tech companies depend on for their profitability. 

The only sense in which “pro-privacy” and “competition” are in tension is the twisted sense implied by FLoC, where “pro-privacy” means “only one company gets to track you and present who you are to others.”  

Of course that’s incompatible with competition.

(What’s more, FLoC won’t even deliver that meaningless assurance. As we note in our original post, FLoC also creates real opportunities for fingerprinting and other forms of re-identification. FLoC is anti-competitive and anti-privacy.)

Real privacy—less data-collection, less data-retention and less data-processing, with explicit consent when those activities take place—is perfectly compatible with competition. It's one of the main reasons to want antitrust enforcement.

All of this is much easier to understand if you think about the issues from the perspective of users, not corporations. You can be pro-Apple (when Apple is laying waste to Facebook’s ability to collect our data) and anti-Apple (when Apple is skimming a destructive ransom from software vendors like Hey). This is only a contradiction if you think of it from Apple’s point of view - but if you think of it from the users’ point of view, there's no contradiction at all.

We want competition because we want users to be in control of their digital lives - to have digital self-determination and choices that support that self-determination. Right now, that means that we need a strong privacy law and a competitive landscape that gives breathing space to better options than Google’s “track everything but in a slightly different way” FLoC.  

As always, when companies have their users’ backs, EFF has the companies’ backs. And as always, the reason we get their backs is because we care about users, not companies.

We fight for the users.

Cory Doctorow

Indian Government's Plans to Ban Cryptocurrency Outright Are A Bad Idea

1 month 3 weeks ago

While Turkey hit the headlines last week with a ban on paying for items with cryptocurrency, the government of India appears to be moving towards outlawing cryptocurrency completely. An unnamed senior government official told Reuters last month that a forthcoming bill this parliamentary session would include the prohibition of the “possession, issuance, mining, trading and transferring [of] crypto-assets.” Officials have subsequently done little to dispel the concern that they are seeking a full cryptocurrency ban: in response to questions by Indian MPs about the timing and the content of a potential Cryptocurrency Act, the Finance Ministry was non-committal, beyond stating that the bill would follow “due process.” 

If the Indian government plans to effectively police its own draconian rules, it would need to seek to block, disrupt, and spy on Internet traffic

If rumors of a complete ban accurately describe the bill, it would be a drastic and over-reaching prohibition that would require draconian oversight and control to enforce. But it would also be in keeping with previous overreactions to cryptocurrency by regulators and politicians in India.

Indian regulators’ involvement with cryptocurrency began four years ago with concerns about consumer safety in the face of scams, Ponzi schemes, and the unclear future of many blockchain projects. The central bank issued a circular prohibiting all regulated entities, including banks, from servicing businesses dealing in virtual currencies. Nearly two years later, the ban was overturned by the Indian Supreme Court on the grounds that it amounted to disproportionate regulatory action in the absence of evidence of harm caused to the regulated entities. A subsequent report in 2019 by the Finance Ministry proposed a draft bill that would have led to a broad ban on the use of cryptocurrency. It’s this bill that commentators suspect will form the core of the new legislation.

The Indian government is worried about the use of cryptocurrency to facilitate illegal activity, but this ignores the many entirely legal uses for cryptocurrencies that already exist and that will continue to develop in the future. Cryptocurrency is naturally more censorship-resistant than many other forms of financial instruments currently available. It provides a powerful market alternative to the existing financial behemoths that exercise control over much of our online transactions today, so that websites engaged in legal (but controversial) speech have a way to receive funds when existing financial institutions refuse to serve them. Cryptocurrency innovation also holds the promise of righting other power imbalances: it can expand financial inclusion by lowering the cost of credit, offering instant transaction resolution, and enhancing customer verification processes. Cryptocurrency can help unbanked individuals get access to financial services.

If the proposed cryptocurrency bill does impose a full prohibition, as rumors suggest, the Indian government should also consider the enforcement regime it would have to create. Many cryptocurrencies, including Bitcoin, offer privacy-enhancing features that make it relatively easy to conceal the geographical location of a transaction. So while India's cryptocurrency users would be prohibited from using local, regulated cryptocurrency services, they could still covertly join the rest of the world's cryptocurrency markets. As the Internet and Mobile Association of India has warned, the result would be that Indian cryptocurrency transactions would move to “illicit” sites that would be far worse at protecting consumers.

Moreover, to effectively police its own draconian rules, the Indian government would need to block, disrupt, and spy on Internet traffic in order to detect or prevent cryptocurrency transactions. Those are certainly powers that past and present Indian administrations have sought; but unless they are truly necessary and proportionate to a legitimate aim, such interference will violate international law, and, if India’s Supreme Court decides they are unreasonable, will once again fail to pass judicial muster.

The Indian government has claimed that it does want to support blockchain technology in general. In particular, the current government has promoted the idea of a “Digital Rupee”, which it expects to be placed on a statutory footing in the same bill that bans private cryptocurrencies. It’s unclear what the two actions have in common. A centrally-run digital currency has no reason to be implemented on a blockchain, a technology that is primarily needed for distributed trust consensus and has little applicability when the government itself is providing the centralized backstop for trust. Meanwhile, legitimate companies and individuals exploring the blockchain for purposes for which it is well-suited will always fear falling afoul of the country’s criminal sanctions—which will, Reuters’ source claims, include ten-year prison sentences in its list of punishments. Such liability would be the severest disincentive to any independent investor or innovator, whether commercial or working in the public interest.

Addressing potential concerns around cryptocurrency by banning the entire technology would be excessive and unjust. It denies Indians access to the innovations that may come from this sector, and, if enforced at all, would require prying into Indians’ digital communications to an unnecessary and disproportionate degree.

Sasha Mathew

Senators Demand Answers on the Dangers of Predictive Policing

1 month 3 weeks ago

Predictive policing is dangerous and yet its use among law enforcement agencies is growing. Predictive policing advocates, and companies that make millions selling technology to police departments, like to say the technology is based on “data” and therefore it cannot be racially biased. But this technology will disproportionately hurt Black and other overpoliced communities, because the data was created by a criminal punishment system that is racially biased. For example, a data set of arrests, even if they are nominally devoid of any racial information, can still be dangerous by virtue of the fact that police make a disparately high number of arrests in Black neighborhoods.

Technology can never predict crime. Rather, it can invite police to regard with suspicion those people who were victims of crime, or live and work in places where crime has been committed in the past. 

For all these reasons and more, EFF has argued that the technology should be banned from being used by law enforcement agencies, and some cities across the United States have already begun to do so. 

Now, a group of federal elected officials is raising concerns about the dangers of predictive policing. Sen. Ron Wyden penned a probing letter to Attorney General Garland asking how the technology is used. He is joined by Rep. Yvette Clarke, Sen. Ed Markey, Sen. Elizabeth Warren, Sen. Jeff Merkley, Sen. Alex Padilla, Sen. Raphael Warnock, and Rep. Sheila Jackson Lee. 

They ask, among other things, whether the U.S. Department of Justice (DOJ) has done any legal analysis to determine whether the use of predictive policing complies with the 1964 Civil Rights Act. It’s clear that the senators and representatives are concerned with the harmful legitimizing effect “data” can have on racially biased policing: “These algorithms, which automate police decisions, not only suffer from a lack of meaningful oversight over whether they actually improve public safety, but are also likely to amplify prejudices against historically marginalized groups.”

The elected officials are also concerned about how many jurisdictions the DOJ has helped to fund predictive policing and the data collection requisite to run such programs, as well as whether these programs are measured in any real way for efficacy, reliability, and validity. This is important considering that many of the algorithms being used are withheld from public scrutiny on the assertion that they are proprietary and operated by private companies. Recently, an audit by the state of Utah found that the state had contracted with a company for surveillance, data analysis, and predictive AI, yet the company actually had no functioning AI and was able to hide that fact inside the black box of proprietary secrets. 

You can read more of the questions the elected officials asked of the Attorney General in the full letter, which you can find below. 

Matthew Guariglia

Video Hearings Tuesday and Wednesday: EFF Will Tell Copyright Office That Consumers Should Have the Freedom to Fix, Modify Digital Devices They Own

1 month 3 weeks ago
DMCA Blocks Consumers from Downloading Apps That Big Tech Companies Don’t Approve Of

San Francisco—On Tuesday, April 20, and Wednesday, April 21, experts from the Electronic Frontier Foundation (EFF) fighting copyright abuse will testify at virtual hearings held by the Copyright Office in favor of exemptions to the Digital Millennium Copyright Act (DMCA) so people who have purchased digital devices—from cameras and e-readers to smart TVs—can repair or modify them, or download new software to enhance their functionality.

The online hearings are part of a rulemaking process held every three years by the Copyright Office to determine whether people are harmed by DMCA “anti-circumvention” provisions, which prohibit anyone from bypassing or disabling access controls built into products by manufacturers to lock down the software that runs them. These provisions are often abused by technology companies to control how their devices are used and stop consumers, innovators, competitors, researchers, and everyday repair businesses from offering new, lower-cost, and creative services.

EFF Staff Attorney Cara Gagliano will testify Tuesday in support of a universal DMCA exemption for the repair and modification of any software-enabled device, including everything from digital cameras and e-readers to automated litterboxes and robotic pets. The Copyright Office’s existing policy of granting exemptions in piecemeal fashion for certain devices every three years is unjustified and completely inadequate—the legal analysis for the exemption, that it’s needed to allow noninfringing uses, is the same across all devices, Gagliano will testify.

EFF Senior Staff Attorney Mitch Stoltz will testify Wednesday in support of expanding the Copyright Office’s “jailbreaking” exception to the anti-circumvention law. In past years, EFF has fought for and won the right to “jailbreak” or “root” personal computing devices including smartphones, tablets, wearables, smart TVs, and smart speakers, allowing people to install the software of their choice on the devices they own without the manufacturer’s permission. This year, Stoltz will urge the Copyright Office to expand that exemption to cover “streaming boxes” and “streaming sticks”—devices that add “smart TV” functionality to an ordinary TV.

WHAT: Virtual Hearings on DMCA Rulemaking

WHEN AND WHERE:
April 20, 7:30 am - 9:30 am (device repair and modification).  To stream via Zoom: https://loc.zoomgov.com/webinar/register/WN__bjnscN1TpiTrIVf19DsRw
April 21, 7:30 am – 9:30 am (“jailbreaking” streaming boxes). To stream via Zoom: https://loc.zoomgov.com/webinar/register/WN_T8S5cKSHQ-ujcVPBHH_ozw

For EFF comments to the Copyright Office:
https://www.eff.org/document/eff-comment-2021-dmca-rulemaking-repair
https://www.eff.org/document/eff-comment-2021-dmca-rulemaking-reply-repair
https://www.eff.org/document/eff-comment-2021-dmca-rulemaking-jailbreaking
https://www.eff.org/document/eff-comment-2021-dmca-rulemaking-reply-jailbreaking-0

For full hearing agendas:
https://www.copyright.gov/1201/2021/public-hearings/hearing-agenda.pdf

For more about DMCA rulemaking and copyright abuse:
https://www.eff.org/issues/dmca-rulemaking
https://www.eff.org/issues/innovation
https://www.eff.org/deeplinks/2017/02/copyright-law-versus-internet-culture

Contact: Cara Gagliano, Staff Attorney, cara@eff.org; Mitch Stoltz, Senior Staff Attorney, mitch@eff.org
Karen Gullo

Proctoring Tools and Dragnet Investigations Rob Students of Due Process

1 month 3 weeks ago

Update, April 16, 2021: The Foundation for Individual Rights in Education (FIRE) points out that Dartmouth has publicly expressed a commitment to upholding free speech and dissent on campus.  The medical school should strive to uphold these policies, and as FIRE argues, they may even be considered contracts that the school has breached with its social media policy that prohibits "disparaging" and "inappropriate" online speech.

Like many schools, Dartmouth College has increasingly turned to technology to monitor students taking exams at home. And while many universities have used proctoring tools that purport to help educators prevent cheating, Dartmouth’s Geisel School of Medicine has gone dangerously further. Apparently working under an assumption of guilt, the university is in the midst of a dragnet investigation of complicated system logs, searching for data that might reveal student misconduct, without a clear understanding of how those logs can be littered with false positives. Worse still, those attempting to assert their rights have been met with a university administration more willing to trust opaque investigations of inconclusive data sets rather than their own students.

The Boston Globe explains that the medical school administration’s attempts to detect supposed cheating have become a flashpoint on campus, exemplifying a worrying trend of schools prioritizing misleading data over the word of their students. The misguided dragnet investigation has cast a shadow over the career aspirations of over twenty medical students.


What’s Wrong With Dartmouth’s Investigation

In March, Dartmouth’s Committee on Student Performance and Conduct (CSPC) accused several students of accessing restricted materials online during exams. These accusations were based on a flawed review of an entire year’s worth of the students’ log data from Canvas, the online learning platform that contains class lectures and information. This broad search was instigated by a single incident of confirmed misconduct, according to a contentious town hall between administrators and students (we've re-uploaded this town hall, as it is now behind a Dartmouth login screen). These logs show traffic between students’ devices and specific files on Canvas, some of which contain class materials, such as lecture slides. At first glance, the logs showing that a student’s device connected to class files would appear incriminating: timestamps indicate the files were retrieved while students were taking exams. 

But after reviewing logs that a student advocate sent to EFF, it is clear to us that there is no way to determine whether this traffic happened intentionally, or instead automatically, as background requests from student devices, such as cell phones, that were logged into Canvas but not in use. In other words, rather than evidence of files being deliberately accessed during exams, the logs could easily reflect the automated syncing of course material to idle devices still logged into Canvas. It’s simply impossible to know from the logs alone whether a student intentionally accessed a file, or whether the pings came from the automatic refresh processes that are commonplace in most websites and online services. Most of us don’t log out of every app, service, or webpage on our smartphones when we’re not using them.

Much like a cell phone pinging a tower, the logs show files being pinged in short bursts, sometimes at the exact second that students were also entering information into the exam, suggesting a non-deliberate process. The logs also reveal that the files accessed were largely irrelevant to the tests in question, again indicating an automated, random process. A UCLA statistician wrote a letter explaining that even an automated process can result in multiple false-positive outcomes. Canvas' own documentation explicitly states that the data in these logs "is meant to be used for rollups and analysis in the aggregate, not in isolation for auditing or other high-stakes analysis involving examining single users or small samples." Given the technical realities of how these background refreshes take place, the log data alone should be nowhere near sufficient to convict a student of academic dishonesty. 
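To make the false-positive problem concrete, here is a minimal sketch, in Python with an invented log format (Canvas's real logs differ), of the kind of naive timestamp filter apparently at work here. Any rule that flags file accesses purely because they fall inside an exam window will flag an idle phone's automatic background sync just as readily as a deliberate lookup:

```python
from datetime import datetime

# Hypothetical access-log entries: (timestamp, file). The burst of
# unrelated files in the same second is the signature of an automated
# refresh, not of a human reading documents mid-exam.
log = [
    ("2021-03-05 09:01:13", "lecture_12_slides.pdf"),
    ("2021-03-05 09:01:13", "lecture_03_notes.pdf"),
    ("2021-03-05 09:01:14", "syllabus.pdf"),
]

exam_start = datetime(2021, 3, 5, 9, 0)
exam_end = datetime(2021, 3, 5, 11, 0)

def flag_during_exam(entries):
    """Naively flag any access inside the exam window -- the flawed
    approach: a timestamp alone cannot distinguish a deliberate lookup
    from an automatic background sync by a logged-in but idle device."""
    flagged = []
    for ts, path in entries:
        when = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
        if exam_start <= when <= exam_end:
            flagged.append((ts, path))
    return flagged

# All three entries are flagged, even though the access pattern looks
# automated -- exactly the kind of false positive described above.
print(flag_during_exam(log))
```

The point of the sketch is that every entry is "suspicious" under such a rule; distinguishing intent requires evidence the logs simply do not contain.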

Along with The Foundation for Individual Rights in Education (FIRE), EFF sent a letter to the Dean of the Medical School on March 30th, explaining how these background connections work and pointing out that the university has likely turned random correlations into accusations of misconduct. The Dean’s reply was that the cases are being reviewed fairly. We disagree.


It appears that the administration is the victim of confirmation bias, turning random correlations into accusations of cheating. The school has admitted in some cases that the log data appeared to have been created automatically, acquitting some students who pushed back. But other students have been sanctioned, apparently based entirely on this spurious interpretation of the log data. Many others are anxiously waiting to hear whether they will be convicted, so they can begin the appeal process, potentially with legal counsel. 

These convictions carry heavy weight, leaving permanent marks on student transcripts that could make it harder for them to enter residencies and complete their medical training. At this level of education, this is not just about being accused of cheating on a specific exam. Being convicted of academic dishonesty could derail an entire career. 

University Stifles Speech After Students Express Concerns Online

Worse still, following posts from an anonymous Instagram account apparently run by students concerned about the cheating accusations and how they were being handled, the Office of Student Affairs introduced a new social media policy.

An anonymous Instagram account detailed some concerns students have with how these cheating allegations were being handled (accessed April 7). As of April 15, the account was offline.

The policy was emailed to students on April 7 but backdated to April 5—the day the Instagram posts appeared. The new policy states that, "Disparaging other members of the Geisel UME community will trigger disciplinary review." It also prohibits social media speech that is not “courteous, respectful, and considerate of others” or speech that is “inappropriate.” Finally, the policy warns, "Students who do not follow these expectations may face disciplinary actions including dismissal from the School of Medicine." 

One might wonder whether such a policy is legal. Unfortunately, Dartmouth is a private institution and so not prohibited by the First Amendment from regulating student speech.

If it were a public university with a narrower ability to regulate student speech, the school would be stepping outside the bounds of its authority if it enforced the social media policy against medical school students speaking out about the cheating scandal. On the one hand, courts have upheld the regulation of speech by students in professional programs at public universities under codes of ethics and other established guidance on professional conduct. For example, in a case about a mortuary student’s posts on Facebook, the Minnesota Supreme Court held that a university may regulate students’ social media speech if the rules are “narrowly tailored and directly related to established professional conduct standards.” Similarly, in a case about a nursing student’s posts on Facebook, the Eleventh Circuit held that “professional school[s] have discretion to require compliance with recognized standards of the profession, both on and off campus, so long as their actions are reasonably related to legitimate pedagogical concerns.” On the other hand, the Sixth Circuit has held that a university can’t invoke a professional code of ethics to discipline a student when doing so is clearly a “pretext” for punishing the student for her constitutionally protected speech.

Although the Dartmouth medical school is immune from a claim that its social media policy violates the First Amendment, it seems that the policy might unfortunately be a pretext to punish students for legitimate speech. Although the policy states that the school is concerned about social media posts that are “lapses in the standards of professionalism,” the timing of the policy suggests that the administrators are sending a message to students who dare speak out against the school’s dubious allegations of cheating. This will surely have a chilling effect on the community to the extent that students will refrain from expressing their opinions about events that occur on campus and affect their future careers. The Instagram account was later taken down, indicating that the chilling effect on speech may have already occurred. (Several days later, a person not affiliated with Dartmouth, and therefore protected from reprisal, reposted many of the original account's posts.)

Students are at the mercy of private universities when it comes to whether their freedom of speech will be respected. Students select private schools based on their academic reputation and history, and don’t necessarily think about a school’s speech policies. Private schools shouldn’t take advantage of this, and should instead seek to sincerely uphold free speech principles.

Investigations of Students Must Start With Concrete Evidence

Though this investigation wasn’t the result of proctoring software, it is part and parcel of a larger problem: educators using the pandemic as an excuse to comb for evidence of cheating in places that are far outside their technical expertise. Proctoring tools and investigations like this one flag students based on flawed metrics and misunderstandings of technical processes, rather than concrete evidence of misconduct. 


Proctoring software that assumes all students take tests the same way—for example, in rooms that they can control, their eyes straight ahead, fingers typing at a routine pace—puts a black mark on the record of students who operate outside the norm. One problem that has been widely documented with proctoring software is that students with disabilities (especially those with motor impairment) are consistently flagged as exhibiting suspicious behavior by software suites intended to detect cheating. Other proctoring software has flagged students for technical snafus such as device crashes and the Internet cutting out, as well as for completely normal behavior that could indicate misconduct only if you squint hard enough.

For the last year, we’ve seen far too many schools ignore legitimate student concerns about inadequate, or overbroad, anti-cheating software. Across the country, thousands of students, and some parents, have created petitions against the use of proctoring tools, most of which (though not all) have been ignored. Students taking the California and New York bar exams—as well as several advocacy organizations and a group of deans—advocated against the use of proctoring tools for those exams. As expected, many of those students then experienced “significant software problems” with the Examsoft proctoring software specifically, causing some to fail. 

Many proctoring companies have defended their dangerous, inequitable, privacy-invasive, and often flawed software tools by pointing out that humans—meaning teachers or administrators—usually have the ability to review flagged exams to determine whether or not a student was actually cheating. That defense rings hollow when those reviewing the results don’t have the technical expertise—or in some cases, the time or inclination—to properly examine them.

Similar to schools that rely heavily on flawed proctoring software, Dartmouth medical school has cast suspicion on students by relying on access logs that are far from concrete evidence of cheating. Simply put: these logs should not be used as the sole evidence for potentially ruining a student’s career. 

The Dartmouth faculty has stated that they will not continue to look at Canvas logs in the future for violations (51:45 into the video of the town hall). That’s a good step forward. We insist that the school also look beyond these logs for the students currently being investigated, and end this dragnet investigation entirely, unless additional evidence is presented.

Jason Kelley

EFF Partners with DuckDuckGo to Enhance Secure Browsing and Protect User Information on the Web

1 month 3 weeks ago
DuckDuckGo Smarter Encryption Will Be Incorporated Into HTTPS Everywhere

San Francisco, California—Boosting protection of Internet users’ personal data from snooping advertisers and third-party trackers, the Electronic Frontier Foundation (EFF) today announced it has enhanced its groundbreaking HTTPS Everywhere browser extension by incorporating rulesets from DuckDuckGo Smarter Encryption.

The partnership represents the next step in the evolution of HTTPS Everywhere, a collaboration with The Tor Project and a key component of EFF’s effort to encrypt the web and make the Internet ecosystem safe for users and website owners.

“DuckDuckGo Smarter Encryption has a list of millions of HTTPS-encrypted websites, generated by continually crawling the web instead of through crowdsourcing, which will give HTTPS Everywhere users more coverage for secure browsing,” said Alexis Hancock, EFF Director of Engineering and manager of the HTTPS Everywhere and Certbot web encryption projects. “We’re thrilled to be partnering with DuckDuckGo as we see HTTPS become the default protocol on the net and contemplate HTTPS Everywhere’s future.”

“EFF’s pioneering work with the HTTPS Everywhere extension took privacy protection in a new and needed direction, seamlessly upgrading people to secure website connections,” said Gabriel Weinberg, DuckDuckGo founder and CEO. “We’re delighted that EFF has now entrusted DuckDuckGo to power HTTPS Everywhere going forward, using our next generation Smarter Encryption dataset.”

When EFF launched HTTPS Everywhere over a decade ago, the majority of web servers used the non-secure HTTP protocol to transfer web pages to browsers, rendering user content and information vulnerable to attacks.

EFF began building and maintaining a crowd-sourced list of encrypted HTTPS versions of websites for a free browser extension, HTTPS Everywhere, which automatically takes users to them. That keeps users’ web searches, pages visited, and other private information encrypted and safe from trackers and data thieves who try to intercept and steal personal information in transit from the browser.
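The core mechanism can be sketched in a few lines of Python. This is an illustrative simplification, not the extension's actual code: the domain list here is invented, while the real rulesets cover millions of sites, often with per-site rewrite rules rather than a simple scheme swap.

```python
from urllib.parse import urlsplit, urlunsplit

# Illustrative stand-in for a Smarter-Encryption-style dataset: hosts
# known to serve a working HTTPS version of their site.
HTTPS_CAPABLE = {"www.eff.org", "duckduckgo.com", "example.com"}

def upgrade(url: str) -> str:
    """Rewrite an http:// URL to https:// when the host is known to
    support it; leave every other URL untouched to avoid breaking
    sites that only work over plain HTTP."""
    parts = urlsplit(url)
    if parts.scheme == "http" and parts.hostname in HTTPS_CAPABLE:
        return urlunsplit(("https",) + parts[1:])
    return url

print(upgrade("http://www.eff.org/https-everywhere"))  # upgraded to https
print(upgrade("http://legacy-intranet.local/"))        # unknown host: unchanged
```

The design choice worth noting is the allowlist: blindly upgrading every URL would break sites without HTTPS support, which is why both HTTPS Everywhere and Smarter Encryption rely on a curated list rather than a universal rewrite.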

Fast forward ten years—the web is undergoing a massive change to HTTPS. Mozilla’s Firefox has an HTTPS-only mode, while Google Chrome is slowly moving toward HTTPS by default.

DuckDuckGo, a privacy-focused search engine, also joined the effort with Smarter Encryption to help users browse securely by detecting unencrypted, non-secure HTTP connections to websites and automatically upgrading them to encrypted connections.

With more domain coverage in Smarter Encryption, HTTPS Everywhere users are provided even more protection. HTTPS Everywhere rulesets will continue to be hosted through this year, giving our partners who use them time to adjust. We will stop taking new requests for domains to be added at the end of May.

To download HTTPS Everywhere:
https://www.eff.org/https-everywhere

For more on encrypting the web:
https://www.eff.org/encrypt-the-web

For more from DuckDuckGo:
https://spreadprivacy.com/eff-adopts-duckduckgo-smarter-encryption/

Contact: Alexis Hancock, Director of Engineering, Certbot, alexis@eff.org; DuckDuckGo, press@duckduckgo.com
Karen Gullo

HTTPS Everywhere Now Uses DuckDuckGo’s Smarter Encryption

1 month 4 weeks ago

Over the last few months, the HTTPS Everywhere project has been deciding what to do in the new landscape of HTTPS support in major browsers. Encrypted web traffic has increased in the last few years, and major browsers have made strides toward making HTTPS the default. This project has shepherded a decade of encrypted web traffic, and we look forward to continuing our efforts to protect people as new developments occur. With that said, we’d like to announce that we have partnered with the DuckDuckGo team to incorporate their Smarter Encryption rulesets into the HTTPS Everywhere web extension.

This is happening for several reasons:

  • Firefox has an HTTPS-Only Mode now.
  • Chrome doesn’t use HTTPS by default, but is slowly moving toward that goal: it now tries HTTPS first for addresses entered in the navigation bar before falling back to HTTP.
  • DuckDuckGo’s Smarter Encryption covers more domains than our current model.
  • Browsers and websites are moving away from issues that created a need for more granular ruleset maintenance.
    • Mixed content is now blocked in major browsers
    • Separate domains for secure connections (e.g. secure.google.com) are now an older habit, further removing the need for granular maintenance of HTTPS Everywhere rulesets
    • Chrome’s Manifest V3 declarativeNetRequest API will impose a ruleset cap on web extensions. Rather than competing with extensions like DuckDuckGo’s, users will receive virtually the same coverage whether they prefer HTTPS Everywhere or DuckDuckGo's Privacy Essentials. We don’t want to create confusion for users about “who to choose” when it comes to getting the best coverage.
    • As HTTPS Everywhere goes into “maintenance mode”, users will have the opportunity to move to DuckDuckGo’s Privacy Essentials or use a browser that has HTTPS by default.
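For context on the Manifest V3 point above: under declarativeNetRequest, HTTPS upgrades are expressed as declarative rules that count against a fixed quota, rather than as extension code inspecting each request. A static rule of roughly this shape (illustrative and abbreviated; `example.com` is a placeholder) asks the browser to upgrade matching requests:

```json
{
  "id": 1,
  "priority": 1,
  "action": { "type": "upgradeScheme" },
  "condition": {
    "urlFilter": "|http://example.com/",
    "resourceTypes": ["main_frame"]
  }
}
```

With one rule needed per covered pattern and a hard cap on rule counts, two extensions shipping near-identical upgrade lists would waste the shared quota, which is part of why consolidating on one dataset makes sense.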

More info on DuckDuckGo’s Smarter Encryption here: https://spreadprivacy.com/duckduckgo-smarter-encryption/

Phases for HTTPS Everywhere’s Rulesets

  • DuckDuckGo Update Channel with Smarter Encryption Rulesets [April 15, 2021].
  • Continue to accept HTTPS Everywhere ruleset changes in the GitHub repo until the end of May 2021.
  • Continue to host HTTPS Everywhere rulesets until the various partners and downstream channels that use them make the needed changes and decisions.
  • Sunset HTTPS Everywhere Rulesets [Late 2021]

Afterward, the HTTPS Everywhere web extension will enter its end-of-life (EOL) stage, on a timeline to be determined after the sunset of the HTTPS Everywhere rulesets is complete. By adding the DuckDuckGo Smarter Encryption update channel, we can give everyone time to adjust and plan.

Thank you for contributing and using this project through the years. We hope you can celebrate with us the monumental effort HTTPS Everywhere has accomplished.

Alexis Hancock

Congress, Don’t Let ISP Lobbyists Sabotage Fiber for All

1 month 4 weeks ago

For the first time, an American president has proposed a plan that wouldn’t just make a dent in the digital divide, it would end it. By deploying federal resources at a level and scale this country has not seen since electrification nearly 100 years ago, the U.S. will again connect every resident to a necessary service. As the pandemic has proven, robust internet access, like water and electricity, is an essential service, and the effort and resources expended are well worth it.

The president’s plan, which matches well with current congressional efforts by Representative James Clyburn and Senator Amy Klobuchar, is welcome news and a boost to efforts by Congress to get the job done. It draws a necessary line: government should only invest its dollars in “future-proof” (i.e., fiber) broadband infrastructure, something we have failed to do for years as subsidies flowed to networks meeting persistently low metrics for what qualifies as broadband. Historically, the low expectations pushed by the telecom industry have wasted a lot of tax dollars. Every state (barring North Dakota) has wasted billions propping up outdated networks. Americans are left with infrastructure unprepared for the 21st century, a terrible ratio of price to internet speed, and one of the largest private broadband provider bankruptcies in history. 

Policymakers are learning from these mistakes, as well as from the demand revealed by the COVID-19 crisis, and this is the year we chart a new course. Now is the time for people to push their elected representatives in Congress to pass the Accessible, Affordable Internet for All Act.

Take Action

Tell Congress: Americans Deserve Fast, Reliable, and Affordable Internet

What Is “Future-Proof" and Why Does Fiber Meet the Definition?

Fiber is the superior medium for 21st-century broadband, and policymakers need to understand that reality when making decisions about broadband policy. No other data transmission medium has the inherent capacity and future potential of fiber, which is why all 21st-century networks, from 5G to low Earth orbit satellites, depend on fiber. However, parts of the broadband access industry keep trying to sell state and federal legislatures on the need to subsidize slower, outdated infrastructure, which diverts money away from fiber and puts it into their pockets.

What allows grifting on that massive scale is the absence of a government mandate for future-proofing in its infrastructure investments. Without a future-proof requirement in law, the government subsidizes whatever meets the lowest speed required today, without a thought about tomorrow. This is arguably one of the biggest reasons so many of our broadband subsidies have gone to waste over the decades, flowing to any qualifying service with no thought for the underlying infrastructure. The President’s endorsement of future-proof broadband infrastructure is a significant and necessary course correction, and it needs to be codified into the infrastructure package Congress is contemplating.

If every dollar the government spends on building broadband infrastructure had to go to infrastructure that meets the needs of today and far into the future, a lot of older, slower, more expensive broadband options would no longer be eligible for government money or qualify as sufficient. At the same time, that same fiber infrastructure can be leveraged to enable dozens of services beyond broadband access, including other data-intensive services, which makes the Accessible, Affordable Internet for All Act’s preference for open access important; it likely should be expanded on, given how fiber lifts all boats. One possible solution is to require government bidders to design networks that enable both wireless and wireline services instead of just one broadband service. Notably, the legislation currently lacks a future-proof requirement for government investments, but given the President’s endorsement, it is our hope that one will be included in any final package that passes Congress, to avoid wastefully enriching capacity-limited (and future-limited) broadband services. Building the roads that enable commerce should be how the government views broadband infrastructure.

We Need Fast Upload Speeds and Fiber, No Matter What Cable Lobbyists Try to Claim

The cable industry unsurprisingly hates the President's proposal to get fiber to everyone, because cable companies are the high-speed monopolists for a vast majority of us and nothing forces them to upgrade except fiber. Fiber has orders of magnitude greater capacity than cable systems and will be able to deliver cheaper high-speed access than cable as demand grows. In the 2020 debate over California's broadband program, when the state was deciding whether to continue subsidizing DSL copper broadband, the cable lobby regularly argued that no one needs the fast uploads fiber provides because, look, no one who uses cable broadband uses much upload (especially the throttled users).

There are a lot of flaws in the premise that asymmetric use of the internet reflects user preference rather than the limits of current standards and cable packages. But the most obvious one is that you need a market of symmetric gigabit (and faster) users before the technology sector will develop widely used applications and services that take advantage of those speeds.

Just like every other innovation we have seen on the internet, if the capacity is delivered, someone comes up with ways to use it. And that multi-gigabit market is coming online, but it will start in China, where an estimated 57% of all multi-gigabit users will reside by 2023 under current projections. China has in fact been laying fiber optics nine times faster than the United States since 2013, and that is a problem for American competitiveness. If we do not start building fiber networks everywhere to catch up within the next five years, the next Silicon Valley built around the gigabit era of broadband access will be in China, not here. It will also mean that next-generation applications and services will simply be unusable by Americans stuck on upload-throttled cable systems. The absence of a major government fiber infrastructure investment effectively means many of us will be stuck in the past while paying monopoly rents to cable.

Fiber Is Not “Too Expensive,” Nor Should Investment Go to Starlink (SpaceX) No Matter What Its Lobbyists Say

The current FCC is still reeling from the fact that the outgoing FCC, at the end of its term, granted nearly $1 billion to Starlink to cover a lot of questionable things, despite the company having historically said it did not need a dollar of subsidy, and despite the fact that it really does not.

But if government money is on the table, it appears clear that Starlink will deploy its lobbyists to argue for its share, even when it needs none of it. That should concern any congressional office that will be lobbied on the broadband infrastructure package. Under the current draft of the Accessible, Affordable Internet for All Act, it seems very unlikely that Starlink's current deployment would qualify for any money, and certainly not if future-proofing were included as a condition of receiving federal dollars.

This is because satellites are inherently capacity-constrained as a means of delivering broadband access. Each satellite must maintain line of sight and can carry only so much traffic, so adding capacity requires more and more satellites to share the burden, and every satellite must ultimately beam down to a fiber-connected base station. That capacity constraint is why Starlink will never be competitive in cities. But Starlink does benefit from a focus on fiber: the more places its base stations can connect satellites to fiber, the more robust its network becomes.

But in the end, there is no way for those satellites to keep up with the increases in capacity that fiber infrastructure will yield, nor are they as long-lasting an investment: each satellite must be replaced on a fairly frequent basis as new ones are launched, while fiber, once laid, remains useful for decades. While Starlink will argue that it is a cheaper solution for rural Americans, the fact is that the number of Americans who cannot feasibly get a fiber line is extremely small. Basically, if you can get an electrical line to a house, you can get it a fiber line.

For policymakers, Starlink's service is best understood as the equivalent of a universal basic income of broadband access: it reaches far and wide and establishes a robust floor. That on its own has a lot of value, and it is a reason why Starlink's effort to expand to boats, trucks, and airplanes is a good thing. But it is not a tool of long-term economic development for rural communities. It should be understood as a lifeline when all other options are exhausted, not the government's primary solution for ending the digital divide. Lastly, given that the satellite constellation is meant to serve customers globally, not just in the United States, it makes no sense for the United States to subsidize a global infrastructure to enrich one private company. The investments need to be in domestic infrastructure.

The Future Is Not 5G, No Matter What Wireless Carrier Lobbyists Say

For years, the wireless industry hyped 5G broadband, resulting in a lot of congressional hearings, countless hours of the FCC focusing its regulatory power on it, a massive merger between T-Mobile and Sprint, and very little to actually show for it today. In fact, early market analysis has found that 5G broadband is making zero profits, mostly because people are not willing to pay that much more on their wireless bill for the new network. The reality is that 5G's future likely lies in non-broadband markets that have yet to emerge. Most importantly, national 5G coverage does not happen without dense fiber networks everywhere. Any infrastructure plan that comes out of the government should avoid making 5G part of its core focus, given that 5G is a derivative benefit of fiber. You can have fiber without 5G, but you can't have 5G without the fiber.

So even as companies like AT&T argue for 5G to be the infrastructure plan, the wireless industry has slowly started to come around to the fact that it's actually the fiber that matters in the end. Last year, the wireless industry acknowledged that 5G and fiber are linked, and even AT&T is now emphasizing that the future is fiber. The risks of putting wireless on par with wires in infrastructure policy are great, as we are seeing now with the Rural Digital Opportunity Fund giving money to potentially speculative gigabit wireless bids instead of proven fiber to the home. This has prompted many Members of Congress to ask the FCC to double-check the applicants before money goes out the door. Rather than repeat the RDOF mistakes, it's best to understand that infrastructure means the foundation that delivers these services, not the services themselves.

We Can Get Fiber to Everyone If We Embrace the Public Model of Broadband Access

The industry across the board refuses to acknowledge that the private model of broadband failed many communities before and during the pandemic, whereas the public model of broadband has soared where it exists. Both President Biden's plan and the Clyburn/Klobuchar legislation emphasize embracing local government and local community broadband networks. The Accessible, Affordable Internet for All Act outright bans states from preventing local governments from building broadband service. The state laws that the bill would repeal were primarily driven by a cable lobby afraid of cheap public fiber access.

By now we know their opposition to community fiber is premised on keeping broadband prices exceedingly high. We also know that if you eliminate the profit motive in delivering fiber, there is almost no place in this country that can't be connected to it. When cooperatives are delivering gigabit fiber at $100 a month to an average of 2.5 people per mile, one is hard-pressed to find what areas are truly out of reach. But equally important is making access affordable for low-income people. Given that we've seen municipal fiber offer ten years of free 100/100 Mbps service to 28,000 students from low-income families at a subsidy cost of $3.50 a month, it seems clear that public fiber is essential both to solving the universal access challenge and to making sure all people can afford to connect to the internet. Whatever infrastructure bill passes Congress, it must fully embrace the public model of access for fiber for all to become a reality.


Ernesto Falcon
EFF's Deeplinks Blog: Noteworthy news from around the internet