Stop Military Surveillance Drones from Coming Home

1 day 14 hours ago

A federal statute authorizes the Pentagon to transfer surveillance technology, among other military equipment, to state and local police. This threatens privacy, free speech, and racial justice.

So Congress should do the right thing and enact Representative Ayanna Pressley’s amendment, the Moratorium on Transfer of Controlled Property to Enforcement Agencies, to H.R. 4350, the National Defense Authorization Act for Fiscal Year 2022 (NDAA22). It would greatly curtail the amount of dangerous military equipment, including surveillance drones, that can be transferred to local and state law enforcement agencies through the Department of Defense’s “1033 program,” which has already placed $7.4 billion in military equipment with police departments since 1990.

The program includes both “controlled” property, such as weapons and vehicles, and “uncontrolled” property, such as first aid kits and tents. Pressley’s amendment would prevent the transfer of all “controlled” property, which includes “unmanned aerial vehicles,” or drones. It also includes:

  • Manned aircraft
  • Wheeled armored vehicles
  • Command and control vehicles
  • Specialized firearms and ammunition under .50 caliber
  • Breaching apparatus
  • Riot batons and shields

Even without the Department of Defense landing drones in our communities, police use of these flying robots is rapidly expanding. Some police departments are so eager to get their hands on drones that they’ve claimed they need them to help fight COVID-19. The Chicago Police Department even launched a massive drone program using only off-the-books money taken through civil asset forfeiture.

We know what will happen if police acquire more and more military surveillance drones. Technology handed out on the condition that it be used only in “extreme” circumstances often ends up being used in everyday acts of over-policing. And police have already used drones to monitor how people exercise their First Amendment-protected rights.

After the New York City Police Department accused one activist, Derrick Ingram, of injuring an officer’s ears by speaking too loudly through his megaphone at a protest, police flew drones by his apartment window, a clear act of intimidation against activists and protestors. The government also flew surveillance drones over multiple protests against police racism and violence during the summer of 2020. When police fly drones over a crowd of protestors, they chill free speech and political expression through fear of reprisal and retribution. Police could easily apply face surveillance technology to footage collected by a drone passing over a crowd, creating a preliminary list of everyone who attended that day’s protest.

With the United States ending its multi-decade occupation of Afghanistan, military equipment once used in warfare is now inching closer to re-deployment onto U.S. streets. The scaling back of military involvement in Iraq coincided with a massive influx of weapons, armed vehicles, and other Department of Defense surplus being fed directly into police departments. We must prevent a repeat of history. 

In 2015, after public outcry against militarized police in Ferguson, Missouri, President Obama made a few reforms to the 1033 program. Specifically, he banned the transfer of tracked armored vehicles, weaponized aircraft and vehicles, firearms and ammunition of .50 caliber or higher, grenade launchers, and bayonets. But this did not go far enough to ensure that the 1033 program will not contribute to the mass surveillance of people on U.S. soil.

We’re calling on the public and members of Congress to support Ayanna Pressley’s amendment, the Moratorium on Transfer of Controlled Property to Enforcement Agencies, to H.R. 4350.

Matthew Guariglia

HTTPS Is Actually Everywhere

1 day 17 hours ago

For more than 10 years, EFF’s HTTPS Everywhere browser extension has provided a much-needed service to users: encrypting their browser communications with websites and making sure they benefit from the protection of HTTPS wherever possible. Since we started offering HTTPS Everywhere, the battle to encrypt the web has made leaps and bounds: what was once a challenging technical argument is now a mainstream standard offered on most web pages. Now HTTPS is truly just about everywhere, thanks to the work of organizations like Let’s Encrypt. We’re proud of EFF’s own Certbot tool, which is Let’s Encrypt’s software complement that helps web administrators automate HTTPS for free.
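As an illustration of how simple HTTPS deployment has become, enabling HTTPS on a typical nginx server with Certbot comes down to a couple of commands (the domain names here are placeholders):

```shell
# Obtain a Let's Encrypt certificate and configure nginx to use it
sudo certbot --nginx -d example.com -d www.example.com

# Certbot packages typically install an automatic renewal timer or cron job;
# a dry run confirms renewal will work when the certificate nears expiry
sudo certbot renew --dry-run
```

Apache users can substitute `--apache`; either way, Certbot handles the certificate request, installation, and renewal automatically.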

The goal of HTTPS Everywhere was always to become redundant. That would mean we’d achieved our larger goal: a world where HTTPS is so broadly available and accessible that users no longer need an extra browser extension to get it. Now that world is closer than ever, with mainstream browsers offering native support for an HTTPS-only mode.

With these simple settings available, EFF is preparing to deprecate the HTTPS Everywhere web extension as we look to new frontiers of secure protocols like SSL/TLS. After the end of this year, the extension will be in “maintenance mode” for 2022. We know many different kinds of users have this tool installed, and we want to give our partners and users the time needed to transition. We will continue to inform users that there are native HTTPS-only browser options before the extension is fully sunset.

Some browsers like Brave have for years used HTTPS redirects provided by HTTPS Everywhere’s Ruleset list. But even with innovative browsers raising the bar for user privacy and security, other browsers like Chrome still hold a considerable share of the browser market. The addition of a native setting to turn on HTTPS in these browsers impacts millions of people.
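For context, HTTPS Everywhere’s upgrades are driven by its Ruleset list: small XML files that map HTTP hosts to HTTPS rewrites. A minimal ruleset for a hypothetical site looks roughly like this (example.com is a placeholder):

```xml
<ruleset name="Example Site">
  <!-- Hosts this ruleset applies to -->
  <target host="example.com" />
  <target host="www.example.com" />

  <!-- Rewrite any http:// URL on those hosts to https:// -->
  <rule from="^http:" to="https:" />
</ruleset>
```

A browser’s native HTTPS-only mode makes this per-site rewriting unnecessary by attempting HTTPS for every site first.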

Follow the steps below to turn on these native HTTPS-only features in Firefox, Chrome, Edge, and Safari and celebrate with us that HTTPS is truly everywhere for users.

Firefox

The steps below apply to Firefox desktop. HTTPS-only for mobile is currently only available in Firefox Developer mode, which advanced users can enable in about:config. 

Preferences > Privacy & Security > Scroll to Bottom > Enable HTTPS-Only Mode
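For those comfortable with about:config, this checkbox corresponds to the `dom.security.https_only_mode` preference, which advanced users can also set in a user.js file (a sketch; the commented-out preference limits enforcement to private windows):

```js
// user.js — enable Firefox's HTTPS-Only Mode in all windows
user_pref("dom.security.https_only_mode", true);

// Alternatively, enforce HTTPS only in private browsing windows:
// user_pref("dom.security.https_only_mode_pbm", true);
```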

Chrome

HTTPS-only in Chrome is available for both desktop and mobile in Chrome 94 (released today!).

Settings > Privacy and security > Security > Scroll to bottom > Toggle “Always use secure connections”

Edge

This is still considered an “experimental feature” in Edge, but is available in Edge 92.

  1. Visit edge://flags/#edge-automatic-https and enable Automatic HTTPS
  2. Hit the “Restart” button that appears to restart Microsoft Edge.

Visit edge://settings/privacy, scroll down, and turn on “Automatically switch to more secure connections with Automatic HTTPS”.

Safari

HTTPS is upgraded by default when possible in Safari 15, recently released September 20th, for macOS Big Sur and macOS Catalina devices. No setting changes are needed from the user.

Updates for Safari 15

Alexis Hancock

Why EFF Flew a Plane Over Apple's Headquarters

1 day 20 hours ago

For the last month, civil liberties and human rights organizations, researchers, and customers have demanded that Apple cancel its plan to install photo-scanning software onto devices. This software poses an enormous danger to privacy and security. Apple has heard the message, and announced that it would delay the system while consulting with various groups about its impact. But in order to trust Apple again, we need the company to commit to canceling this mass surveillance system.

The delay may well be a diversionary tactic. Every September, Apple holds one of its big product announcement events, where Apple executives detail the new devices and features coming out. Apple likely didn’t want concerns about the phone-scanning features to steal the spotlight. 

But we can’t let Apple’s disastrous phone-scanning idea fade into the background, only to be announced with minimal changes down the road. To make sure Apple is listening to our concerns, EFF turned to an old-school messaging system: aerial advertising.  

EFF banner flies over Apple Park, the corporate headquarters of Apple, located in Cupertino, California


During Apple’s event, a plane circled the company’s headquarters carrying an impossible-to-miss message: Apple, don’t scan our phones! The evening before Apple’s event, protestors also rallied nationwide in front of Apple stores. The company needs to hear us, and not just dismiss the serious problems with its scanning plan. A delay is not a cancellation, and the company has also been dismissive of some concerns, referring to them as “confusion” about the new features.

Privacy Is Not For Sale

Apple’s iMessage is one of the preeminent end-to-end encrypted chat clients. End-to-end encryption is what allows users to exchange messages without having them intercepted and read by repressive governments, corporations, and other bad actors. We don’t support encryption for its own sake: we fight for it because encryption is one of the most powerful tools individuals have for maintaining their digital privacy and security in an increasingly insecure world.

Now that Apple’s September event is over, Apple must reach out to groups that have criticized it and seek a wider range of suggestions on how to deal with difficult problems, like protecting children online. EFF, for its part, will be holding an event with various groups that work in this space to share research and concerns that Apple and other tech companies should find useful. While Apple tends to announce big features without warning, that practice is a dangerous one when it comes to making sweeping changes to technology as essential as secure messaging. 

The world, thankfully, has moved toward encrypted communications over the last two decades, not away from them, and that’s a good thing. If Apple wants to maintain its reputation as a pro-privacy company, it must continue to choose real end-to-end encryption over government demands to read users’ communications. Privacy matters now more than ever. It will continue to be a selling point and a distinguishing feature of some products and companies. For now, it’s an open question whether Apple will continue to be one of them.


Jason Kelley

How California’s Broadband Infrastructure Law Promotes Local Choice

5 days 18 hours ago

The legislative session has ended, and Governor Newsom is expected to sign S.B. 4 and A.B. 14 into law. These bills are the final pieces of the state’s new broadband infrastructure program. With an estimated $7.5 billion assembled between federal and state funds, California has the resources it needs to largely close the digital divide in the coming years. The program allows local cities, counties, and local private entities to access infrastructure dollars to solve problems in their own communities, rather than depending on large private multinationals that aren’t willing to make the needed generational investment in infrastructure in most areas of the state.

EFF will explain below why local communities need to take charge, and how the new law will facilitate local choice in broadband. No state has taken this approach before, departing from the old model of handing all the subsidies over to giant corporations. That’s why it’s important for Californians to understand the opportunity now before them.

Why It Has to Be a Local Public, Private, or Public/Private Entity

If the bankruptcy of Frontier Communications has taught us anything, it is the following two lessons. First, large national private ISPs will forgo 21st-century fiber infrastructure in as many places as they can to pad their short-term profits; government subsidies to build in different areas do not change this behavior. Second, the future of broadband access depends on the placement of fiber optic wires. Fiber is an investment in long-term value over short-term profits. EFF’s technical analysis has laid out why fiber optics is future-proof infrastructure: no other transmission medium for broadband even comes close, which makes its deployment essential for a long-term solution.

AT&T and cable companies such as Comcast and Charter are going to try to take advantage of this program by making offers that sound nice. But they will leverage existing legacy infrastructure that is rapidly approaching obsolescence. While they may be able to offer connectivity that’s “good enough for today” at a cheaper price than delivering fiber, there is no future in those older connections. It’s clear that higher upload speeds are becoming the norm, with demand increasing every year. As California’s tech sector begins to embrace distributed work, only communities with 21st-century fiber broadband access will be viable places for those workers to live. Fiber optics’ benefits are clear. The challenge is that fiber’s high upfront construction costs require very long-term financing models to deliver on its promise. Here is how the state’s new program makes that financing possible.

A Breakdown of the New Broadband Infrastructure Program

The infrastructure law has four mechanisms in place to help finance and plan new, local options: a grant program for the unserved; long-term financing designed around public, non-profit, and tribal entities; a state-run middle-mile program; and a state technical assistance program. Let’s get into the weeds on each of them.

Broadband Infrastructure Grant Account – The state is making more than $2 billion (and possibly up to $3.5 billion) available in grants over the coming years to finance, at 100% of the state’s cost, the construction of broadband networks in areas that need them. To qualify, based on federal and state mapping data, an area must meet the following three criteria:

  • It lacks broadband service at speeds of at least 25 Mbps downstream and 3 Mbps upstream (this mostly means people reliant on DSL copper access or less)
  • It lacks latency sufficiently low to allow real-time interactive applications
  • It is not currently receiving money from, and carrying out the objectives of, the Rural Digital Opportunity Fund

To focus the grant funds, priority is placed on areas that do not even have 10 Mbps downstream and 1 Mbps upstream (mostly areas that only have satellite internet). The program is focused on having the state pay the construction costs for people who have no internet access at all, as opposed to those with slow or inadequate access.

Loan Loss Reserve Fund – The State Treasury will establish this fund to enable long-term financing by cities, counties, community service districts, public utilities, municipal utility districts, joint powers authorities, local educational agencies, tribal governments, electrical cooperatives, and non-profits. It will be designed to help these entities obtain very low interest rates with low debt obligations. Think of this program like our mortgage-lending system: 30-year fixed mortgages enable many people to purchase homes, even if they could never gather the cash necessary to make the purchase all at once. Fiber is well-suited for this type of financing vehicle; it will deliver useful speeds for multiple decades and carries lower maintenance costs than other broadband options.

State Open-Access Middle-Mile – The state of California, overseen by the Department of Technology, will deploy fiber infrastructure on an open-access basis—meaning on non-discriminatory terms, accessible by ISPs—with an emphasis on developing rural exchange points. The goal behind this infrastructure is to deliver multi-gigabit capacity to areas building broadband access, and to bring the cost of backhaul capacity to the global internet down to affordable rates. To use an analogy, the state is building the highways that connect communities to the airport—and the world. The option to connect to these internet highways will be available to all comers. So, for example, small local businesses or local townships can connect a fiber line to these facilities to build a local broadband network.

Technical Assistance by the State – Fiber infrastructure is a game-changer on the ground. Echoing the way the federal government advised local governments and communities on the deployment of a similarly revolutionary technology, electricity, the new broadband infrastructure law deputizes the California Public Utilities Commission (CPUC) to provide technical assistance for these plans. The CPUC will help local governments and providers with grant applications to other federal programs and will participate in developing infrastructure plans with county governments.

How All These Programs Work Together to End Reliance on AT&T and Comcast

Any small business, local government, or even a school district will soon have these tools to solve its own problems. It’s important for any local player seeking to build its own broadband solution through these programs to understand that doing it right will take a multi-year effort. The loan loss reserve program will focus on multi-decade repayment plans, giving eligible entities access to billions of loan dollars for future-proof fiber infrastructure. The grants are meant to eliminate the construction burden of delivering access to the most difficult-to-serve populations in pockets throughout the state. But any real effort to build a network will have to include underserved neighbors as well. For those communities, the state will attempt to deliver the best-priced access to bandwidth capacity through its middle-mile program. Doing so will help keep prices as low as is feasible, enabling the delivery of cheap, fast internet in areas that otherwise would never have seen access.

And for any of this to happen, every community needs someone at the local level who is well-versed in how to use the state’s program. That’s where the technical assistance by the state comes in, to help locals navigate the hardest parts of developing a local broadband solution.

Still, no state program can make folks on the ground do the work. That’s why we need people engaged in their communities. If you are tired of relying on big providers that prioritize Wall Street investors over your local community’s needs, and you are motivated to figure out a solution at home, this is your moment. This new law not only had you in mind; it’s counting on you to step up to the plate.

Ernesto Falcon

No, Tech Monopolies Don’t Serve National Security

6 days 14 hours ago

In what appears to be a “throw spaghetti at the wall” approach to stopping antitrust reform targeting Big Tech, a few Members of Congress and a range of former military and intelligence officials wrote a letter asserting that these companies need to be protected for national security reasons. It’s a spurious argument that leverages fear of China to prevent changes desperately needed for consumer choice and innovation.

The argument they make is that gigantic tech companies are the only ones who can innovate and compete with China. But this completely misses the point on innovation. When companies have monopolies, they have no reason to innovate since they have captured the market. There is no need to compete to have the best product when you are the only product. Innovation depends on the best ideas from everyone being put forth to the public.   

Now, we don’t know whether these folks actually believe the argument, or whether they simply expect the rest of us to believe it because they say it. Either way, this letter is really about delaying legislative antitrust action by raising not just fictional concerns but completely bogus takes on how innovation happens on the internet.

This Has Been Tried Before, and It Didn’t Work Then

The irony about the national security argument is that it takes a page straight out of the AT&T monopoly playbook and history. Forty years ago, AT&T was the largest corporation in the world and was facing antitrust action both in Congress and the courts. In a Hail Mary effort to get the Department of Justice to abandon its lawsuit, AT&T lobbyists went to the Department of Defense and convinced them that a monopoly communications network was essential for national security.

Source: New York Times archive, https://www.nytimes.com/1981/04/09/business/weinberger-defends-at-t.html

The plan was to convince then-President Ronald Reagan that he should directly order the Department of Justice to end the case, despite nearly six years of court hearings detailing how AT&T leveraged its monopoly power. In fact, a year prior to the Department of Defense weighing in opposition to further antitrust action, a federal jury had already awarded MCI $1.8 billion in antitrust damages against AT&T.

The situation with Big Tech is similar: like the AT&T monopoly of the past, these companies face antitrust actions on various fronts, and like AT&T, they are attempting to change the narrative and come up with any excuse to avoid the right outcome, which is opening up the tech industry to competition.

Innovation Does Not Come From Big Tech; It Gets Bought by Them

The signers of the letter adopt the view that massive consolidation of the industry is necessary for innovation. The exact opposite is true. Due to the size of these companies and their targeted acquisitions, innovation is either unnecessary or simply bought up. Startups with new ideas aren’t being launched to compete with Google, Facebook, Apple, and Amazon’s services or products, because the lion’s share of investor money has gone toward creating products that Big Tech will pay lots of money to acquire.

Congressional investigations identified this “kill zone” as the area of tech products and services that orbit the dominant platforms' products, such as search in the case of Google or social media in the case of Facebook. In fact, one would be hard-pressed to find a new organic product from Big Tech that didn’t find its origins in buying another company.

After a lengthy investigation by the House Judiciary Committee and Senate hearings into these companies’ merger practices, involving a wide array of experts and industry players, the congressional record is full of evidence that the size of Big Tech is, in fact, suppressing the competition that sparks innovation. Think about how the tech industry used to be a place where previous giants were regularly replaced by the next best thing, which often started as a garage startup. EFF calls this the life cycle of competition, and it has been fading from the tech industry. This is why EFF strongly supports bills such as the ACCESS Act and the Open App Markets Act: they would open up dominant platforms to new entrants and help empower smaller players to innovate without interference again.

It comes as no surprise that 79% of Americans view Big Tech mergers as anti-competitive; the public isn’t fooled. These companies aren’t huge because size gives them some sort of cutting edge; they are huge because size conveys dominance, control, and monopoly profits. The public understands this, but, clearly, some Members of Congress do not.

Ernesto Falcon

What’s Up with WhatsApp Encrypted Backups

6 days 18 hours ago

WhatsApp is rolling out an option for users to encrypt their message backups, and that is a big win for user privacy and security. The new feature is expected to be available for both iOS and Android “in the coming weeks.” EFF has long pointed to unencrypted backups as a huge weakness for WhatsApp and for any messenger that claims to offer end-to-end encryption, and we applaud this improvement. Next, encrypted backups should become the default for all users, not just an option.

Currently, users can choose to periodically back up their WhatsApp message history on iCloud (for iOS phones) or Google Drive (for Android phones), or to never back them up at all. Backing up your messages means that you can still access them if, for example, your phone is lost or destroyed. 

WhatsApp does not have access to these backups, but backup service providers Apple and Google sure do. Unencrypted backups are vulnerable to government requests, third-party hacking, and disclosure by Apple or Google employees. That’s why EFF has consistently recommended that users not back up their messages to the cloud, and that they encourage their friends and contacts to skip it too. Backing up secure messenger conversations to the cloud unencrypted (or encrypted in a way that allows the company running the backup to access message contents) exposes the plaintext to third parties and introduces a significant hole in the protection the messenger can offer.

When encrypted WhatsApp backups arrive, that will change. With fully encrypted backups, Apple and Google will no longer be able to access backed-up WhatsApp content. Instead, WhatsApp backups will be encrypted with a very long (64-digit) encryption key generated on the user’s device. Users in need of a high level of security can save this key directly in their preferred password manager. All others can rely on WhatsApp’s recovery system, which will store the encryption key in a way that WhatsApp cannot access, protected by a password of the user’s choosing.
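WhatsApp hasn’t published full implementation details, but the strength of a 64-digit numeric key is easy to check: it carries log2(10^64), roughly 212 bits, of entropy, far beyond the reach of brute force. A minimal Python sketch of generating such a key (illustrative only, not WhatsApp’s actual code):

```python
import math
import secrets

def generate_backup_key(digits: int = 64) -> str:
    """Generate a random numeric key using a cryptographically secure RNG."""
    return "".join(secrets.choice("0123456789") for _ in range(digits))

key = generate_backup_key()
print(key)                 # a fresh 64-digit string each run

# Entropy of a 64-digit numeric key in bits: 64 * log2(10) ≈ 212.6
print(64 * math.log2(10))
```

The `secrets` module (rather than `random`) matters here: it draws from the operating system’s CSPRNG, which is the appropriate source for key material.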

This privacy win from Facebook-owned WhatsApp is striking in its contrast with Apple, which has been under fire recently for its plans for on-device scanning of photos that minors send on Messages, as well as of every photo that any Apple user uploads to iCloud. While Apple has paused to consider more feedback on its plans, there’s still no sign that it will fix one of its longstanding privacy pitfalls: the lack of end-to-end encryption for iCloud backups. WhatsApp is raising the bar, and Apple and others should follow suit.

Gennie Gebhart

The Catalog of Carceral Surveillance: Patents Aren't Products (Yet)

1 week ago

In EFF’s Catalog of Carceral Surveillance, we explore patents filed by or awarded to prison communication technology companies Securus and Global Tel*Link in the past five years. The dystopian technologies the patents describe are exploitative and dehumanizing. And if the companies transformed their patents into real products, those technologies would pose extreme threats to incarcerated people and their loved ones.

But importantly, patents often precede the actual development or deployment of a technology. Though applications may demonstrate an interest in advancing a particular technology, these intentions don’t always progress beyond the proposal, and many inventions that are described in patent applications don't wind up being built. What we can glean from a patent application is that the company is thinking about the technology and that it might be coming down the pipeline.

In 2019, Platinum Equity, the firm that has owned Securus Technologies since 2017, restructured the company, placing it under the parent company Aventiv. Aventiv claimed it would lead Securus through a transformation process that includes greater respect for human rights. According to Aventiv, many of the patents filed prior to 2019 will remain just ideas, never to be built. Following the publication of our initial Catalog of Carceral Surveillance posts, Aventiv responded with the following statement: “We at Aventiv are committed to protecting the civil liberties of all those who use our products. As a technology provider, we continuously seek to improve and to create new solutions to keep our communities safe.”

Aventiv’s statement goes on to respond to EFF’s post describing a patent filed by Securus that envisions a system for monitoring online purchases made by incarcerated people and their families. The company wrote: “The patent is not currently in development as it was an idea versus a product we will pursue,” and added that to “ensure there is no additional misunderstanding, we will be abandoning this patent and reviewing all open patents to certify that they align with our transformation efforts.”

Aventiv stated, “The patent you reference is 10904297, which was filed in June 2019, prior to our company publicly announcing a multi-year transformation effort."

The company did not offer additional details regarding the other patents of theirs we spotlight, including those focused on drone detection, gaming services, and tablet advertisements. 

The statement concluded: “Our organization is focused on better serving justice-involved people by making our products more accessible and affordable, investing in free educational and reentry programming, and taking more opportunities—just like this one—to listen to consumers. To ensure there is no additional misunderstanding, we will be abandoning this patent and reviewing all open patents to certify that they align with our transformation efforts."

GTL declined to comment for this series.

GTL and Securus were once among the greatest opponents of federal regulation of prison phone calls. They claim to have adjusted their positions: both announced over the summer that they support reforms to create more accessible prison communications, and each began to offer inmates free phone calls and free tablets.

To better understand the potential (but not certain) futures of these companies, EFF created the Catalog of Carceral Surveillance to spotlight the patents that could pave the way toward chilling developments in surveillance.

In the coming months, EFF plans to follow up with Aventiv to hold them to their word and will continue to remind prison technology companies of their responsibilities to the families they serve.

View the Catalog of Carceral Surveillance below. New posts will be added daily.


Beryl Lipton

The Federal Government Just Can’t Get Enough of Your Face

1 week ago

There are more federal facial recognition technology (FRT) systems than there are federal agencies using them, according to the U.S. Government Accountability Office (GAO). Its latest report on current and planned use of FRT by federal agencies reveals that, among the 24 agencies surveyed, there are 27 federal FRT systems. Just three agencies—the U.S. Departments of Homeland Security, Defense, and Justice—use 18 of these systems for, as they put it, domestic law enforcement and national security purposes.

But 27 current federal systems are not enough to satisfy these agencies. The DOJ, DHS, and Department of the Interior also accessed FRT systems “owned by 29 states and seven localities for law enforcement purposes.” Federal agencies further accessed eight commercial FRT systems, including four agencies that accessed the infamous Clearview AI. That’s all just current use. Across federal agencies, there are plans in the next two years to develop or purchase 13 more FRT systems, access two more local systems, and enter two more contracts with Clearview AI.

As EFF has pointed out again and again, government use of FRT is anathema to our fundamental freedoms. Law enforcement use of FRT disproportionately impacts people of color, turns us all into perpetual suspects, and increases the likelihood of false arrest. Law enforcement agencies have also used FRT to spy on protestors.

Clearview AI, a commercial facial surveillance entity used by many federal agencies, extracts the faceprints of billions of unsuspecting people, without their consent, and uses them to provide information to law enforcement and federal agencies. The company is currently being sued in both Illinois state court and federal court for violating the Illinois Biometric Information Privacy Act (BIPA), which requires opt-in consent to obtain someone’s faceprint. Recently, an Illinois state judge allowed the state case to proceed, opening a path for the American Civil Liberties Union (ACLU) to fight Clearview AI’s business model, which trades your privacy for its profit. You can read the judge’s opinion here, and find EFF’s two amicus briefs against Clearview AI here and here.

FRT in the hands of the government erodes the rights of the people. Even so, the federal government’s appetite for your face—through one of its 27 systems or commercial systems such as Clearview AI—is insatiable. Regulation is not sufficient here; the only effective solution to this pervasive problem is a ban on federal use of FRT. Cities across the country, from San Francisco to Minneapolis to Boston, have already passed strong local ordinances to do so.

Now we must go to Congress. EFF supports Senator Markey’s Facial Recognition and Biometrics Technology Moratorium Act, which would ban the federal government’s use of FRT and some other biometric technologies. Join our campaign: contact your members of Congress and tell them to support this ban. The government can’t get enough of your face. Tell them they can’t have it.

Take Action

Tell Congress to Ban Federal Use of Face Recognition

You can find the GAO’s report here.

Chao Liu

Texas’ Social Media Law is Not the Solution to Censorship

1 week ago

The big-name social media companies have all done a rather atrocious job of moderating user speech on their platforms. However, much like Florida's similarly unconstitutional attempt to address the issue (S.B. 7072), Texas' recently enacted H.B. 20 will only make matters worse for Texans and everyone else.

Signed into law by Governor Abbott last week, the Texas law prohibits platforms with more than 50 million users nationwide from moderating user posts based on viewpoint or geographic location. However, as we stated in our friend-of-the-court brief in support of NetChoice and the Computer & Communications Industry Association's lawsuit challenging Florida's law (NetChoice v. Moody), "Every court that has considered the issue, dating back to at least 2007, has rightfully found that private entities that operate online platforms for speech and that open those platforms for others to speak enjoy a First Amendment right to edit and curate that speech."

Inconsistent and opaque content moderation by online media services is a legitimate problem. It continues to result in the censorship of a range of important speech, often disproportionately impacting people who aren’t elected officials. That's why EFF joined with a cohort of allies in 2018 to draft the Santa Clara Principles on Transparency and Accountability in Content Moderation, offering one model for how platforms can begin voluntarily implementing content moderation practices grounded in a human rights framework. Under the proposed principles, platforms would:

  1. Publish the numbers of posts removed and accounts permanently or temporarily suspended due to violations of their content guidelines.
  2. Provide notice to each user whose content is taken down or account is suspended about the reason for the removal or suspension.
  3. Provide a meaningful opportunity for timely appeal of any content removal or account suspension.

H.B. 20 does attempt to mandate some of the transparency measures called for in the Santa Clara Principles. Although these legal mandates might be appropriate as part of a carefully crafted legislative scheme, H.B. 20 is not the result of a reasonable policy debate. Rather it is a retaliatory law aimed at violating the First Amendment rights of online services in a way that will ultimately harm all internet users.

We fully expect that once H.B. 20 is challenged, courts will draw from the wealth of legal precedent and find the law unconstitutional. Perhaps recognizing that H.B. 20 is imperiled for the same reasons as Florida’s law, the Lone Star State this week filed a friend-of-the-court brief in the appeal of a federal court’s ruling that Florida’s law is unconstitutional.

Although Texas’ and Florida’s laws are unconstitutional, social media platforms’ control over our public discourse remains a critical policy issue. It is vitally important that platforms take action to provide transparency, accountability, and meaningful due process to all impacted speakers, and ensure that the enforcement of their content guidelines is fair, unbiased, proportional, and respectful of human rights. 

Nathan Sheard

Lessons From History: Afghanistan and the Dangerous Afterlives of Identifying Data

1 week ago

As the United States pulled its troops out of Afghanistan after a 20-year occupation, byproducts of the prolonged deployment took on new meaning and represented a new chapter of danger for the Afghan people. For two decades, the United States spearheaded the collection of information on the people of Afghanistan, both for commonplace bureaucratic purposes like payroll and employment records, and in massive databases of biometric material accessible through devices called HIIDE. 

HIIDE, the Handheld Interagency Identity Detection Equipment, is a class of devices used to collect biometric data like fingerprints and iris scans and store that information in large, accessible databases. Ostensibly built to track terrorists and potential terrorists, the program was also used to verify the identities of contractors and Afghans working with U.S. forces. The military reportedly had an early goal of getting 80% of the population of Afghanistan into the program. With the Taliban retaking control of the nation, reporting about the HIIDE program prompted fears that the equipment could be seized and used to identify and target vulnerable people. 

Some sources, including those who spoke to the MIT Technology Review, claimed that the HIIDE devices offered only limited utility to any future regimes hoping to use them and that the data they access is stored remotely and therefore less of a concern. They did raise alarms, however, on the wide-reaching and detailed Afghan Personnel and Pay System (APPS), used to pay contractors and employees working for the Afghan Ministry of Interior and Ministry of Defense. This database contains detailed information on every member of the Afghan National Army and Afghan National Police—prompting renewed fears that this information could be used to find people who assisted the U.S. military or Afghan state-building, policing, and counter-insurgency measures. 

There has always been concern and protest over how the U.S. military used this information, but now that concern takes on new dimensions. This is, unfortunately, a side effect of the collection and retention of data on individuals. No matter how secure you think the data is—and no matter how much you trust the current government to use the information responsibly and benevolently—there is always a risk that either priorities and laws will change, or an entirely new regime will take over and inherit that data. 

One of the most infamous examples was the massive trove of information collected and housed by Prussian and other German police and city governments in the early twentieth century. U.S. observers given tours of the Berlin police filing system were shocked to find dozens of rooms filled with files. In total, over 12 million records were kept containing personal and identifying information on people who had been born in, lived in, or traveled through Berlin since the system began. Although the Prussian police were known for political policing and brutal tactics, during the Weimar period between 1918 and 1933 police were lenient toward and even begrudgingly accepting of LGBTQ+ people, at a time when most other countries severely criminalized people with same-sex desires and gender-nonconforming people. 

All of this changed when the Nazis rose to power and seized control of not just the government and economy of a major industrialized nation, but also millions of police files containing detailed information about people, who they were, and where to find them.

The history of the world is filled with stories of information—collected responsibly or not, with intended uses that were benevolent or not—having long afterlives. The information governments collect today could fall into more malevolent hands tomorrow. You don't even need to look abroad to find a government putting information, collected on individuals for entirely different and benevolent purposes, to new and nefarious uses. 

With the afterlives of biometric surveillance and data retention once again threatening people in Afghanistan, we can regrettably add this chapter to the history of the dangers of mass data collection. Better protections on information and its uses can only go so far. In many instances, the only way to ensure that people are not made vulnerable by the misuse of private information is to limit, wherever possible, how much of it is collected in the first place. 

Matthew Guariglia

Surveillance Self-Defense Guides Now Available in Burmese

1 week ago

As part of our goal to expand the impact of our digital security guide, Surveillance Self-Defense (SSD), we recently translated the majority of its contents into Burmese. This repository of resources on circumventing surveillance across a variety of different platforms, devices, and threat models is now available in English, and in whole or in part in 11 other languages: Amharic, Arabic, Spanish, French, Russian, Turkish, Vietnamese, Brazilian Portuguese, Burmese, Thai, and Urdu.

The last year has seen significant numbers of protests by the people of Myanmar against human and digital rights violations by the military, prompted by the recent military coup in the country. Fighting back against human rights violations shouldn’t require you to have a computer science degree, and so our SSD guides help explain, in clear language, how to protect yourself from digital surveillance and unpack key concepts that make doing so easier. These guides offer overviews and recommendations for digital security protection during protests, network circumvention, using VPNs and Tor, using Signal, social media safety, and so on. 

We hope these resources will help those in Myanmar access reliable, up-to-date digital security guidance during a high-stress time, localized to the unique considerations in Myanmar. In addition to this project, we also plan to translate our new mobile phone privacy guide into multiple languages, including Turkish, Russian, and Spanish. We’d like to thank the National Democratic Institute for providing funds for these translations, and Localization Lab for their efforts in completing them.

Jason Kelley

EFF and Allies Urge Council of Europe to Add Strong Human Rights Safeguards Before Final Adoption of Flawed Cross Border Surveillance Treaty

1 week 1 day ago

EFF, European Digital Rights (EDRi), the Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic (CIPPIC), and other civil society organizations have worked closely on recommendations to strengthen human rights protections in a flawed international cross border police surveillance treaty drafted by the Council of Europe (CoE). At a virtual hearing today before the CoE Parliamentary Assembly (PACE) Committee on Legal Affairs and Human Rights, EFF Policy Director for Global Privacy Katitza Rodriguez presented a summary of the concerns we and our partners have about the treaty’s weak privacy and human rights safeguards.

There is much at stake, as the draft Second Additional Protocol to the Budapest Convention on Cybercrime will reshape cross-border law enforcement data-gathering on a global scale. The Protocol’s objectives are to facilitate cross-border investigations between countries with varying legal systems and standards for accessing people’s personal information. In her testimony, the text of which is published in full below, Rodriguez highlighted key shortcomings in the Protocol, and recommendations for fixing them.

EFF Testimony and Statement to Committee on Legal Affairs and Human Rights, Parliamentary Assembly, Council of Europe

At the highest level, the current Protocol should establish clear and enforceable baseline safeguards in cross-border evidence gathering, but fails to do so. Though new police powers are mandatory, corresponding privacy protections are frequently optional, and the Protocol repeatedly defers the details of safeguards to national law in an active attempt to entice states with weaker human rights records to sign on. The result is a net dilution of privacy and human rights on a global scale. But the right to privacy is a universal right. International law enforcement powers should come with detailed legal safeguards for privacy and data protection. When it comes to data protection, Convention 108+ should be the global reference. By its recommendations to the Committee of Ministers, PACE has an opportunity to establish a commonly acceptable legal framework for international law enforcement that places privacy and human rights at its core.

Protecting Online Anonymity


Substantively, we have concerns regarding Article 7 of the Protocol, which permits direct access by law enforcement in one country to subscriber identity information held by a company in another country. In our opinion, Article 7 fails to provide, or excludes, critical safeguards contained in many national laws. For example, Article 7 does not include any explicit restrictions on targeting activities which implicate fundamental rights, such as freedom of expression or association, and prevents Parties from requiring foreign police to demonstrate that the subscriber data they seek will advance a criminal investigation.[1]

We are particularly concerned that Article 7’s explanatory text fails to acknowledge that subscriber data can be highly intrusive. Your IP address can tell authorities what websites you visit and what accounts you used. Police can also request the name and address associated with your IP address in order to link your identity to your online activity, and that can be used to learn deeply intimate aspects of your daily habits. Article 7’s identification power undermines online anonymity in a context that embraces legal systems with widely divergent approaches to criminal justice, including some governments that are autocratic in nature. The resulting threat to journalists, human rights defenders, politicians, political dissidents, whistleblowers and others is indefensible.

This is why we've urged PACE to remove Article 7 entirely from the text of the Protocol. States would still be able to access subscriber data in cross-border contexts, but would instead rely on Article 8, which includes more safeguards for human rights. If Article 7 is retained, we’ve urged additional minimum safeguards, such as:

  • Ensuring that the explanatory text properly acknowledges that access to subscriber data can be highly intrusive.
  • Providing Parties with the option, at least, of requiring prior judicial authorization for requests made under Article 7.
  • Requiring Parties to establish a clear evidentiary basis for Article 7 requests.
  • Ensuring that Article 7 requests provide enough factual background to assess compliance with human rights standards and protected privileges.
  • Requiring notification or consultation with a responding state for all Article 7 demands.
  • Requiring refusal of Article 7 requests where necessary to address a lack of double criminality or to protect legal privileges.
  • Providing the ability to reserve Article 7 in a more nuanced and timely manner.
  • Ensuring that Article 7 demands include details regarding legal remedies and obligations for service provider refusal.

Raising the Bar for Data Protection


When it comes to Article 14’s data protection safeguards, we have asked that the Protocol be amended so that signatories may refuse to apply its most intrusive powers (Articles 6, 7 and 12) when dealing with any other signatory that has not also ratified Convention 108+. We also hope the Parliamentary Assembly will support the Committee of Convention 108’s mission, and remember (or take note) that the Committee of Ministers supports making Convention 108 the global reference for data protection, including in the implementation of this Protocol.

Article 14 itself falls short of modern data protection requirements and, in some contexts, will actively undermine emerging international standards. Two examples:

  • Article 14 fails to require independent oversight of law enforcement investigative activities. For example, many oversight functions can be exercised by government officials housed within the same agencies directing the investigations;
  • Article 14 limits the situations in which biometric data can be considered ‘sensitive’ and in need of additional protection, despite a growing international consensus that biometric data is categorically sensitive.

But even with the weak standards contained in Article 14, signatories are explicitly permitted to bypass these safeguards through various mechanisms, none of which provide any assurance that meaningful privacy protections will be in place. For example, any two or more signatories can enter into an international data protection agreement that will supersede the safeguards outlined in Article 14. The agreement does not need to provide a comparable or adequate level of protection to the default rules.

Signatories can even adopt less protective standards in secret agreements or arrangements and continue to rely on the Protocol’s law enforcement powers. We have therefore recommended that the Protocol be amended to ensure a minimum threshold of privacy protection in Article 14, one which may be supplemented with more rigorous protections but cannot be replaced by weaker standards. This would also help avoid the fragmentation of privacy regimes.

Make Joint Investigative Team Limitations Explicit


Under Article 12, signatories can form joint investigative teams that can bypass core existing frameworks, such as the mutual legal assistance treaty (MLAT) regime, when using highly intrusive cross-border investigative techniques or when transferring personal information between team members.

We have asked that the Protocol be amended so that some of its core intended limitations are made explicit. This is particularly important given that many teams may ultimately be operating with a higher level of informality and driven by police officers without input or supervision from other government bodies typically involved in overseeing cross-border investigations. Specifically, we have asked that the Protocol (or, alternatively, the explanatory text) clearly and unequivocally state that participants in a joint investigative team must not take investigative measures within the territory of another participant in the team and that no participant may violate the laws of another participant of that team.

We also ask that the Protocol be amended so that Parties are obligated to involve their central authorities (and, preferably, the entity responsible for data protection oversight) in the formation and general operation of an investigative team, and that agreements governing investigative teams be made public except to the degree that doing so would threaten investigative secrecy or is necessary to achieve other important public interest objectives.

Karen Gullo

Protestors Nationwide Rally to Tell Apple: "Don't Break Your Promise!"

1 week 1 day ago

Yesterday in San Francisco, Chicago, Boston, New York, and other cities across the U.S., activists rallied in front of Apple stores demanding that the company fully cancel its plan to introduce surveillance software into its devices. In addition to protests at stores organized by EFF and Fight for the Future, EFF also took the message directly to Apple’s headquarters by flying a banner above the campus during its annual iPhone launch event today. 

The last time EFF held a protest at an Apple store, in 2016, it was to support the company’s strong stance in protecting encryption. That year, Apple challenged the FBI’s request to install a backdoor into its operating system. This year, in early August, Apple stunned its many supporters by announcing a set of upcoming features, intended to help protect children, that would create an infrastructure that could easily be redirected to greater surveillance and censorship. These features would pose enormous danger to iPhone users’ privacy and security, offering authoritarian governments a new mass surveillance system to spy on citizens. 

After public pushback in August, Apple announced earlier this month that its scanning program would be delayed. Protestors this week rallied to urge Apple to abandon its program and commit to protecting user privacy and security. Speakers included EFF Activist Joe Mullin and Executive Director Cindy Cohn.

Mullin told the crowd at the San Francisco protest how essential it was that Apple continue its commitment to protecting users: “From San Francisco to Dubai, Apple told the whole world that iPhone is all about privacy,” said Mullin. “But faced with government pressure, they caved. Now 60,000 users have signed a petition telling Apple they refuse to be betrayed.”

Holding signs that read “Don’t Scan My Phone” and “No Spy Phone,” protestors chanted “No 1984, no, Apple—no backdoor!" and “2-4-6-8, stand with users, not the state; 3-5-7-9, privacy is not a crime!”

“We can't be silent while Tim Cook and other Apple leaders congratulate themselves on their new products after they've signed on to a mass surveillance project,” said Mullin.  “No scanners on our phones!”


Apple has said that it will take additional time over the coming months to collect input about its child protection features. Later this month, EFF hopes to begin that conversation with a public event that will bring together representatives from diverse constituencies who rely on encrypted platforms. Discussion will focus on the ramifications of these decisions, what we would like to see changed about the products, and protective principles for initiatives that aim to police private digital spaces. We hope Apple and other tech companies will join us as well. You can find out more soon about this upcoming event by visiting our events page.

Jason Kelley

The Catalog of Carceral Surveillance: Overt and Covert Surveillance of Prisoners Via Telephones and Tablets

1 week 2 days ago

In the novel 1984, George Orwell imagines a technology called the “telescreen,” which not only lets you watch TV but lets a surveillance state watch you. This omnipresent panopticon helped Big Brother keep citizens paranoid and under control. Now, thanks to the work of the notorious prison telephony company Securus, this nightmare can be a reality for millions of prisoners in the United States.

Video visitation devices are a core business of Securus, allowing people in prison to talk to their friends, family, and attorneys over video chat services. While video visitation can be one of several appropriate forms of inmate communication, prisons have used this new service to limit in-person visitation hours or even cut them entirely. These devices also often provide low-quality video chat at excessive prices--prices that can reach hours of inmate labor per minute of chat.

Worst of all, these devices can now record audio and video of imprisoned people surreptitiously, with no outward indicator. When the call is over, the system can also perform biometric identification of the inmate or of anyone the inmate was talking to. The friends and family of imprisoned people are already often subjected to facial recognition when visiting a prison in person, but this represents an even more intense level of surveillance, inviting it into the home of a visitor who is having a video visit with an inmate. 

Securus has also patented a method that mixes overt and covert video surveillance. In its patent document, Securus suggests that its communication devices (such as tablets) could notify the user that they are being recorded during one approved task and then secretly record them during another. The covertly captured user’s face can then be processed with face recognition software. 

Flowchart showing Securus system for capturing audio and video for biometric use.

Securus poses a hypothetical situation where an inmate is attempting to log in to a prison communication device. The inmate may have their picture or video covertly taken if they fail the login attempt a predetermined number of times. That picture or video could then be matched with facial recognition to determine whether someone was trying to log into someone else’s account. 
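The failed-login flow described in the patent can be sketched in a few lines. This is purely an illustration of the patent's description, not Securus' actual code; the threshold value, function names, and return codes are hypothetical:

```python
# Illustrative sketch of the patent's failed-login flow; the threshold,
# names, and return codes are hypothetical, not from Securus' system.
FAILED_ATTEMPT_THRESHOLD = 3  # the patent's "predetermined number"

def handle_login_attempt(account, credentials_ok, prior_failures,
                         capture_image, match_face):
    """Grant or deny a login; after too many failures, covertly capture
    an image and check it against the account holder's enrolled face."""
    if credentials_ok:
        return "granted"
    failures = prior_failures + 1
    if failures >= FAILED_ATTEMPT_THRESHOLD:
        image = capture_image()  # taken with no light or on-screen indicator
        if not match_face(image, account):
            return "flagged"     # suspected use of someone else's account
    return "denied"
```

Note that the punitive branch turns entirely on the face matcher's verdict, which is exactly where false negatives become dangerous.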

Securus also provides the following troubling example: “if a correctional facility wants an inmate to acknowledge that he has read the facility's Inmate Handbook ... or the like, the resident may be required to enter his or her credentials on the resident facing device, after the resident viewed the material on the device. Whereupon, … the resident-facing device may take the inmate's photo at that time, not turning on the light, so the inmate will not know a picture is being taken.”

What if facial recognition or other biometrics return a false negative? Will a prisoner be punished because the algorithm didn’t recognize them one day? Facial recognition algorithms are of course not perfect, and can have especially high failure rates for people of color and women. These patents could lead to false accusations, and punishments ranging from revoking visitation privileges to solitary confinement--making the already draconian U.S. prison system even more cruel. 

Securus intends covert surveillance not just of prisoners interacting with a phone or tablet, but also of prisoners who just happen to be standing nearby. In its patent, Securus suggests: “database may include one or more covert surveillance rules including, for example, …  a maximum threshold number of inmates allowed to be in different areas within the facility at a given time, etc.”

Also, Securus doesn’t limit its biometric surveillance to the face. According to the patent, its system also “may include biometric signatures of inmates (e.g., voice, facial, iris, fingerprint, etc.) and/or of other facility personnel (e.g., correctional officers, staff, etc.).” Given Securus’ lackadaisical attitude towards the privacy of inmates’ friends and families, we fear visitors may end up in this biometric database as well. 

Even the contemporary panopticon of the U.S. prison system is not enough to satiate the prison-industrial complex. With the disturbing technologies it is inventing, Securus is ensuring that people in prison and their families will be subjected to greater and greater intrusions to sustain the insatiable maw of carceral capitalism. 

Cooper Quintin

Geofence Warrants Threaten Civil Liberties and Free Speech Rights in Kenosha and Nationwide

1 week 5 days ago

In the days following the police shooting of Jacob Blake on August 23, 2020, hundreds of protestors marched in the streets of Kenosha, Wisconsin. Federal law enforcement, it turns out, collected location data on many of those protesters. The Bureau of Alcohol, Tobacco and Firearms (ATF) used a series of “geofence warrants” to force Google to hand over data on people who were in the vicinity of—but potentially as far as a football field away from—property damage incidents. These warrants, which police are increasingly using across the country, threaten the right to protest and violate the Fourth Amendment. 

Geofence warrants require companies to provide information on every electronic device in a geographical area during a given time period. ATF used at least 12 geofence warrants issued to Google—the only company known to provide data in response to these warrants—to collect people’s location data during the Kenosha protests. The center of each geographic area was a suspected arson incident. However, the warrants reach broadly and require location data for long periods of time. One of the warrants encompassed a third of a major public park for a two-hour window during the protests. The ATF effectively threw a surveillance dragnet over many protesters, using “general warrants” that violate the Fourth Amendment and threaten the First Amendment right to protest free from government spying.
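Conceptually, responding to a geofence warrant amounts to filtering a location database by a bounding box and a time window. A minimal sketch follows; it is purely illustrative, and the record fields and query shape are assumptions, not how Google actually stores or queries location data:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical location record; the field names are illustrative and do
# not reflect Google's actual Location History schema.
@dataclass(frozen=True)
class LocationRecord:
    device_id: str
    lat: float
    lon: float
    timestamp: datetime

def in_geofence(r, lat_min, lat_max, lon_min, lon_max, start, end):
    """True if a record falls inside the warrant's bounding box and window."""
    return (lat_min <= r.lat <= lat_max
            and lon_min <= r.lon <= lon_max
            and start <= r.timestamp <= end)

def geofence_hits(records, **bounds):
    """Every device with any matching record is swept into the response,
    regardless of its owner's connection to the incident under investigation."""
    return {r.device_id for r in records if in_geofence(r, **bounds)}
```

Because the filter keys only on place and time, a bystander who merely walked through the box during the window is indistinguishable from a suspect, which is why critics compare these requests to general warrants.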

Police can use geofence warrants to collect information on and movements of innocent people at protests. This can include device information, account information, email addresses, phone numbers, and information on Google services used by the device owner, and the data can come from both Android and Apple devices. Someone who goes to a protest and happens to be nearby when a crime occurs may get caught up in a police investigation. Police in Minneapolis, for example, used a geofence warrant during the protests over the killing of George Floyd. The public only learned about it because the dragnet, centered around a property damage incident, caught an innocent bystander filming the protests, and Google notified him (which it doesn’t always do). The police can also use this data to create dossiers on activists and organizers.

In this way, geofence warrants also eliminate anonymity that people may rely on in order to protest or otherwise freely associate in public spaces. Law enforcement’s ability to catalogue the location of peaceful protestors will chill their exercise of their First Amendment rights. This is especially problematic when, as with the August 2020 protests in Kenosha, people are taking to the streets to hold the police themselves accountable.

Google recently published data showing that police have sought at least 20,000 geofence warrants over just the last three years, and the sheer volume of these warrants is increasing exponentially year over year. For example, California law enforcement made 209 geofence warrant requests in 2018, but nearly 2,000 in 2020. Each warrant may result in the disclosure of information on tens or hundreds of devices. The vast majority of these warrants are sought by state and local police, which makes them difficult to track.

Google must start standing up for its users against this massive overreach. In addition to serious harms to privacy and free expression, geofence warrants operate without transparency. After years of pressure, Google has finally provided some limited data. But the vast majority of geofence warrants remain sealed, with no information from Google or law enforcement on their targets, geographic area and length of time, and their purported justifications. As a result, most people have no way of knowing whether they are caught up in one of these dragnets. Such uncertainty further chills the constitutional rights to freely protest and associate.

Matthew Guariglia

The Other 20-Year Anniversary: Freedom and Surveillance Post-9/11

1 week 5 days ago

The twentieth anniversary of the attacks of September 11, 2001 is a good time to reflect on the world we’ve built since then. Those attacks caused incalculable heartbreak, anger, and fear. But by now it is clear that far too many things put into place in the immediate aftermath of the attacks, especially in the areas of surveillance and government secrecy, are deeply problematic for our democracy, privacy, and fairness. It’s time to set things right. 

The public centerpiece of our effort to increase government surveillance in response to the attacks was the passage of the Patriot Act, which will have its own 20th anniversary on October 26. But much more happened, and far too much of it was not revealed until years later. Our government developed a huge and expensive set of secret spying operations that eviscerated the line between domestic and foreign surveillance and swept up millions of non-suspect Americans' communications and records. With some small but critical exceptions, Congress almost completely abdicated its responsibility to check the power of the Executive. Later, the secret FISA court shifted from merely approving specific warrants to acting as a quasi-agency charged with reviewing entire huge secret programs, with neither the knowledge nor the authority to provide meaningful oversight. All of these are a critical part of the legacy of September 11.

Yet even after all of these years, there’s no clear evidence that you can surveil yourself to safety.

Of course, we did not invent national security or domestic surveillance overreach 20 years ago. Since the creation of the Federal Bureau of Investigation in the early twentieth century, and of the National Security Agency in 1952, the federal government has repeatedly been reprimanded and reformed for overreaching and violating constitutionally protected rights. Even before 9/11, the NSA’s FAIRVIEW program forged agreements with telecom companies to monitor phone calls going in and out of the country. But 9/11 gave the NSA the inciting incident it needed to take what it had long wanted: a shift to a collect-it-all strategy inside the U.S. to match, in many ways, the one it had already developed outside the U.S., and the secret governmental support to try to make it happen. As for those of us in the general public, we were told in the abstract that giving up our privacy would make us more secure, even as we were kept in the dark about what that actually meant, especially for the Muslims and other Americans unfairly targeted. 

The surveillance infrastructure forged or augmented in the post-9/11 war on terror is largely still with us. In the United States, in addition to the computer servers, giant analysis buildings, weak or wrong legal justifications, and the secret price tag, one of the lasting and more harmful effects has been on the public itself. Specifically, we are still too often beholden to the mentality that collecting and analyzing enough information can keep a nation safe. Yet even after all of these years, there’s no clear evidence that you can surveil yourself to safety. This is true in general, but it’s especially true for international terrorism threats, which have never been numerous or alike enough to train machine learning models, much less make trustworthy predictions. 

But there is copious evidence of ongoing surveillance metastasis: intelligence fusion centers, the national security apparatus, the Department of Homeland Security, and enhanced border and customs surveillance have all been deputized to do things far afield from their original purpose of preventing another foreign terrorist attack. Even without serious transparency, we know that those powers and tools have been used for political policing, surveilling activists and immigrants, denying entry to people because of their political stances on social media, and putting entire border communities under surveillance.

The news in the past 20 years isn’t all bad, though. We have seen the government end many of the specific methods developed and deployed by the NSA immediately after 9/11. This includes the infamous bulk call detail records program (albeit replaced with an only slightly less problematic program). It also includes the NSA’s metadata collection and the “about” searching done under the UPSTREAM program off of the Internet backbone. We also have cut back on the unlimited gag orders accompanying National Security Letters. Each of these was accomplished through different paths, but none of them exist today as they did immediately after 9/11. We even pushed through some modest reforms of the FISA court.  

But the biggest good news is the growth of encryption across the digital world, from the encrypting of links between the servers of giants like Google, to the Let’s Encrypt project encrypting web traffic, to the rise of end-to-end encrypted tools like Signal and WhatsApp that have given people around the world greater protections against surveillance even as the governments have become more voracious in their appetites for our data. Of course, the fights over encryption continue, but we should note and celebrate our victories when we can. 

Other nefarious programs continue, including the Internet backbone surveillance that EFF has long sought to bring before the public courts in Jewel v. NSA. And in addition to federal surveillance, we’ve seen the “collect it all” mentality filter down to our local police departments, both through massive injections of surveillance technology and through the slow enmeshing of local with federal surveillance. We still do not have a full public accounting of the types and scope of surveillance that have been deployed domestically, much less internationally, although EFF is trying to piece some of it together with our Atlas of Surveillance.

Twenty years is a good long time. We now know more of what our government did in the aftermath, and we know how little safety most of these programs produced, along with the disproportionate impact they had on some of our most vulnerable communities. It’s time to start applying the clear lessons from that time and to continue to uncover, question, and dismantle both the mass surveillance and the unfettered secrecy that were ushered in when we were all afraid. 

Related Cases: Jewel v. NSA
Cindy Cohn

The Catalog of Carceral Surveillance: Voice Recognition and Surveillance

1 week 5 days ago

This post has been updated to provide additional context about patents and patent applications, which are indications of an entity’s interest in a particular product but not proof that the product is currently in development or available for use. You can read more about the role of patents in this series in our post, “The Catalog of Carceral Surveillance: Patents Aren't Products (Yet).”

Prison phone companies have been profiting off the desire for human connection for as long as they’ve been in business. Historically, there’s been one primary instrument for that connection — voice — and only one way to milk it for revenue: by charging exorbitant rates for phone calls. It’s been a profitable business model for both the companies and their partners, the jails and prisons. 

In recent years, though, prison reform advocates and the families of people who are incarcerated, sick of dumping their savings into the maws of these phone providers, have worked to tip this cash cow. They made enough noise that the Federal Communication Commission (FCC) set a cap on per-minute charges on interstate phone calls. 

So two of the largest providers of prison communications have initiated new ways of mining inmates for income.

Prisoners know their calls while in custody are generally monitored. They may also be aware that they’re being recorded (both legally and not so legally). Still, it may shock prisoners that Securus and GTL are working to monetize their ability to eavesdrop on and catalogue the thousands of voices traversing the phone lines of penal facilities in nearly every state, every day.

In the name of security and fraud prevention, these two prison communications companies have developed ways to store and analyze the trove of voices they’ve recorded. The companies create voice prints of people speaking on a prison’s phone lines. The companies claim that, through multi-modal audio mining, these voice prints can be matched to their databases of voices to identify individuals across phone calls and facilities. These systems are already in place throughout the country, including in Arkansas, Florida, and Texas.
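Neither company publishes its matching algorithm, but speaker identification of this kind is commonly built on fixed-length voice embeddings compared by cosine similarity against an enrollment database. The sketch below is purely illustrative: the function names, the 0.75 threshold, and the toy three-dimensional embeddings are all assumptions, not anything drawn from Securus's or GTL's actual systems.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two voice-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify_speaker(call_embedding: np.ndarray, enrolled: dict,
                     threshold: float = 0.75):
    """Return the enrolled ID whose embedding best matches the call,
    or None if no match clears the similarity threshold."""
    best_id, best_score = None, threshold
    for speaker_id, reference in enrolled.items():
        score = cosine_similarity(call_embedding, reference)
        if score >= best_score:
            best_id, best_score = speaker_id, score
    return best_id

# Toy enrollment database (real embeddings have hundreds of dimensions).
enrolled = {
    "inmate_042": np.array([0.9, 0.1, 0.3]),
    "inmate_107": np.array([0.2, 0.8, 0.5]),
}

print(identify_speaker(np.array([0.88, 0.12, 0.31]), enrolled))  # inmate_042
```

The threshold is the crux: set it low and the system "matches" strangers to enrolled voices; set it high and it misses real matches. That trade-off is why the misidentification risk discussed later in this article is inherent to the approach, not just a bug to be fixed.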

These companies are already notorious for extending the expense of prison communications beyond the prison to the support networks and families outside. Now they are working to extend biometric surveillance to that greater carceral community too, using the technology to identify and profile anyone whose voice crosses into a prison. This includes all the parents, children, lovers, and friends of incarcerated people.

In a patent published in January 2021, Securus described collecting audio samples of individuals’ voices both at the moment of intake and while inmates are communicating with people on the outside. Facilities often acquire voice samples by threatening a loss of privileges should an inmate refuse to bow to the surveillance state. 

“Here’s another part of myself that I had to give away again in this prison system,” one inmate recalled in a 2019 article by The Intercept after he was told that failing to help train the system to recognize his voice would result in a loss of his ability to use the phone. As with other efforts to mass collect biometric and personal information, what happens to the data once it’s been collected and stored, including with whom it’s shared and who has access, is still an open question.  

Securus and GTL have other ideas in the works for possible uses, particularly as these voiceprints can be connected to other databases and people, both in and out of prison. 

Securus would like to see automated background checks based on its voice recognition technology. “[D]etainees with criminal records may be released at the end of a short term stay in a holding tank or may be bonded out without being detected,” says Securus.

In another patent, GTL claims that it will be able to use voiceprint verification to identify “unauthorized” callers based on whether a second voice on one end of the phone line differs from the initial, authorized voice. So, if a prisoner’s girlfriend rings in and passes the phone to a child whose voiceprint wasn’t vetted and approved, the phone system can boot the callers from the call altogether. 

Both companies would like to be able to map networks of individuals calling inmates, generating profiles of those who call multiple inmates or who stay in contact with their fellow prisoners once released. 

Global Tel*Link has branded its version as Voice IQ. Securus’s own Investigator Pro claims that “You’ve Never Seen Voice Biometrics Like This.”

With these new patents and initiatives, Securus and Global Tel*Link seek to identify, and almost certainly misidentify, more inmates and their families than ever before, forging new frontiers in the ways America’s prison complex can scrutinize the vulnerable. 

An earlier version of this article incorrectly identified the owner of a patent claiming the ability to identify "unauthorized callers" via voice print verification as Securus, rather than GTL.

Beryl Lipton

Don’t Stop Now: Join EFF, Fight for the Future at Apple Protests Nationwide

1 week 6 days ago

We’re winning—but we can’t let up the pressure. Apple has delayed their plan to install dangerous mass surveillance software onto their devices, but we need them to cancel the program entirely. Next week, just before Apple’s big iPhone launch event, we need your help to make sure the company does the right thing. 

Activists from EFF, Fight for the Future, and other digital civil liberties organizations have planned protests around the country for Monday, September 13, at 6PM PT to demand that Apple completely drop its planned surveillance software program. You can find a list of the protests here. Protests are already planned in Boston, Atlanta, Washington D.C., New York City, and Portland (OR).

EFF will host a protest at San Francisco Union Square, with signs, stickers, and speakers, but you can protest no matter where you are:

RSVP NOW

JOIN THE PROTEST AND TELL APPLE: DON'T SCAN OUR PHONES

Whether you’re a longtime fan of Apple’s products or you’ve never used an iPhone in your life, we must hold companies accountable for the promises they make to protect privacy and security. Apple has found its way to making the right choice in the past, and we know they can do it again. 

So bring a friend, wear your EFF merch, and make your voice heard! We’ve got sign designs ready for you to print below, or you can make your own! And you can always add our custom EFF "I do not consent to the search of this device" lock screen to your phone. 

And to make sure that Apple gets the message that encryption is simply too important to give up on, EFF will also be sending it straight to Apple's headquarters—by flying an aerial banner over the campus during their September 14 iPhone launch event.

On September 7, we delivered nearly 60,000 petitions to Apple. More than 90 organizations across the globe have also urged the company not to implement its phone-scanning plans. We’re pleased Apple is now listening to the concerns of customers, researchers, civil liberties organizations, human rights activists, LGBTQ people, youth representatives, and other groups about the dangers posed by its phone scanning tools. But we can’t let up the pressure until Apple commits, fully, to protecting privacy and security. 

COVID Protocol: We are committed to upholding public health guidelines related to COVID. Please don't attend if you have any COVID symptoms, and we encourage masking and social distancing.

Below you can find printable images for your protest:

Further Reading:

Jason Kelley

The Catalog of Carceral Surveillance: Prison Gaming and AR/VR Services

1 week 6 days ago

This post has been updated to provide additional context about patents and patent applications, which are indications of an entity’s interest in a particular product but not proof that the product is currently in development or available for use. You can read more about the role of patents in this series in our post, “The Catalog of Carceral Surveillance: Patents Aren't Products (Yet).”

No matter how many rights are taken away from people in prison, and no matter how brutally they are treated, the prison industrial complex hasn’t stopped concocting new ways to extract money from prisoners and their families. Securus, in one of its newest patents, granted in February 2021, has imagined a new tool for doing just that: a tablet issued to individual inmates that allows them to make video calls, access information about their case, and pay for temporary access to video games. 

Just because you are in prison doesn’t mean you get a pardon from microtransactions and the insatiable maw of capitalism.

Importantly, patents often precede the actual development or deployment of a technology. Though applications may demonstrate an interest in advancing a particular technology, these intentions don’t always progress beyond the proposal, and many inventions that are described in patent applications don't wind up being built. What we can glean from a patent application is that the company is thinking about the technology and that it might be coming down the pipeline.

In this case, the tablet being dreamed up would be useful for more than just extracting money from prisoners. Securus proposes that it can also be used as a biometric surveillance device. According to the patent: 

“the monitoring system may be configured to collect sensor information from the resident communications device in order to detect unsafe conditions during a gaming session. For instance, the monitoring system may collect sensor information from the resident communications device indicating a level of stress or agitation by the resident during game play. One or more gyroscope sensors included within the resident communications device may be used to detect unsafe handling of the resident communications device. Heart rate and blood pressure information detected by sensors worn by the resident may be transmitted via RFID (Radio Frequency Identification) to the resident communications device.”

If the sensors and biometrics in such a handheld device could be used as indicators of a prisoner’s mood, those observations could also be fed into other prisoner evaluations. We can imagine a scenario in which a prisoner’s excitement or anger while playing a game is misread by the system as a sign that the prisoner is uncooperative or a recidivism risk. Prisons might even use this data in parole hearings or when deciding on punishments and rewards. If Securus, or any company, were to develop a gaming device with such mood monitoring, your “gamer moment” in the digital world could become cause for more punishment in the real one.

Following the publication of this post, Aventiv, the parent company of Securus, provided a statement to EFF that said it is “committed to protecting the civil liberties of all those who use our products” and that the company is “reviewing all open patents to certify that they align with our transformation efforts.”

Global Tel-Link

The gaming world is also potentially lucrative ground for another prison telecommunication company, Global Tel-Link, which has filed a patent for a virtual reality (VR) service for prison inmates. It would, according to their own description, allow the prisoner to, “for a brief time, imagine himself outside or away from the controlled environment.”  

According to the patent, Global Tel-Link envisions this prison VR system as supplementing, or entirely replacing, in-person visits so that inmates could interact with friends and family in a controlled and monitored virtual environment. Global Tel-Link would presumably charge for this service in the same way it charges for all of its other prison communications products.

But prison inmates aren’t the only ones subject to monitoring and technological “assistance” from Global Tel-Link. The company has also filed a patent for Augmented Reality to assist and surveil the prison guards: essentially “Google Glass: Prison Edition.”

An image from Global Tel-Link’s AR device patent

According to the patent, Global Tel-Link’s AR device worn by guards would have myriad functions. It would be able to perform facial recognition on inmates and display vital details such as their name and history. It also purports to display the location of any rogue radio frequency signals that could indicate contraband cell phones. Further, it reportedly has object detection powers and can highlight dangerous or contraband objects such as weapons, or open doors that should be closed. 

The patent is particularly troubling because AR devices already pose many potential privacy problems through their always-on cameras and other sensors. Those civil liberties problems would only be heightened in prisons, where incarcerated individuals already have far less privacy and autonomy than people in public.

The patent also describes how the AR devices could be used to track prison guards. Global Tel-Link states: “In some situations, activities of guards lack monitoring, giving some staff/guards the opportunity to get involved in importation of contraband goods into the controlled environment.” 

Who watches the watchers? Apparently Global Tel-Link does. The company declined to comment for this post.

GTL’s proposed system described in the patent could also monitor whether prison guards are staying on task. According to the patent, the AR system “stores ... a set of criteria defining whether a monitored activity is determined as ‘normal’ and ‘abnormal.’ For example, the set of criteria includes the time range to complete an assignment, the designed path for an assignment, the dwelling time at one location, the heart rate range, the regular presence locations of inmates, etc.”
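The patent's "normal"/"abnormal" determination amounts to simple range checks over monitored metrics. A minimal sketch of that logic, with hypothetical metric names and thresholds (nothing here comes from GTL's actual system), might look like this:

```python
def classify_activity(activity: dict, criteria: dict) -> str:
    """Flag an activity 'abnormal' if any monitored value is missing or
    falls outside its configured (low, high) range; otherwise 'normal'."""
    for metric, (low, high) in criteria.items():
        value = activity.get(metric)
        if value is None or not (low <= value <= high):
            return "abnormal"
    return "normal"

# Hypothetical criteria mirroring the patent's examples: time to complete
# an assignment, dwell time at one location, and heart rate range.
criteria = {
    "minutes_to_complete": (0, 30),
    "dwell_minutes_at_location": (0, 10),
    "heart_rate_bpm": (50, 120),
}

print(classify_activity(
    {"minutes_to_complete": 22,
     "dwell_minutes_at_location": 4,
     "heart_rate_bpm": 88},
    criteria))  # normal
```

Note how crude such a scheme is: a guard who pauses too long in one spot, or whose heart rate rises for any reason, is flagged the same way as one smuggling contraband.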

We are skeptical that Global Tel-Link can fit so many advanced technologies into a wearable device, especially at a price palatable to the operators of public and private prisons. But if the company manages to build what its patent describes, prison guards would join incarcerated people under the omnipresent eye of the panopticon.

Cooper Quintin

EFF Activists To Lead Protest Demanding Apple Cancel iPhone Scanning Program and Keep Its Privacy Promises To Customers

1 week 6 days ago
Demonstrations Planned at Apple Stores in San Francisco, Boston, Portland, and Atlanta

San Francisco—Electronic Frontier Foundation (EFF) activists will lead a protest on Monday, September 13, at 6 pm PT, demanding Apple drop its planned iPhone surveillance software program, which will endanger the privacy and security of its customers and open a backdoor to increased surveillance around the world.

Demonstrators from EFF and Fight For the Future (FFTF) will rally in front of Apple’s flagship store to send a message to the iPhone giant that the program, a shocking about-face for users who have relied on the company’s leadership in privacy and security, must be cancelled.

To make sure that Apple gets the message that encryption is simply too important to give up on, EFF will also be sending it straight to Apple's headquarters—by flying an aerial banner over the campus during their September 14 iPhone launch event.

“Users want the devices they have purchased to work for them—not to spy on them for others,” said Joe Mullin, a policy analyst on EFF’s activism team who will speak at Monday’s protest. “Delaying the program is a step in the right direction, but it is not enough. Apple needs to take the next step to protect its users and abandon the program.”

Protests at Apple Stores, organized by EFF, FFTF, and OpenMedia, are planned in Boston, Portland, Atlanta, and other cities. A map of the locations can be found at https://www.nospyphone.com/#map.

EFF and partners have delivered petitions with 60,000 signatures telling Apple not to scan customers’ phones. In addition, EFF joined the Center for Democracy and Technology (CDT) and more than 90 other organizations in sending a letter urging Apple CEO Tim Cook to stop the company’s plans to weaken privacy and security on Apple’s iPhones and other products.

The iPhone surveillance software will continuously scan user photos to compare them to a secret government-created database of child abuse images. The parental notification scanner uses on-device machine learning to scan messages, then informs a third party, which breaks the promise of end-to-end encryption. Apple’s new surveillance infrastructure will be all too easy for governments to redirect to greater surveillance and censorship.

What:
Don’t Scan Our Phones Protest

Speaker:
EFF Policy Analyst Joe Mullin

Where:
Apple Store
300 Post Street
San Francisco CA 94108

When:
Monday, September 13
6 pm PT

For more about Apple's program:
https://www.eff.org/deeplinks/2021/08/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life

Contact: Joe Mullin, Policy Analyst, joe@eff.org
Karen Gullo