Right to Repair Advocates Have Had Good Victories. We Have To Keep Fighting.

2 months 1 week ago

It’s been a good year for right to repair advocates. Colorado passed an important law to allow wheelchair users access to resources they need to fix their own chairs. The Federal Trade Commission has stepped up enforcement of companies that limit the right to repair. And New York made history by passing the first broad consumer right to repair legislation at the end of 2022, requiring some digital electronics manufacturers to provide access to parts, tools, and information necessary for repairing their products.

Thank you to everyone who wrote in to support these bills, and especially to our allies in the Repair Coalition who lead this fight. Despite these wins, however, it’s important that those who care about the right to repair keep pushing to build on these steps. Because while there are many victories to celebrate, there is still a long way to go. And the hard-won fights for the steps forward we took have exposed just how much opposition there is to the basic idea that you should be able to tinker with your own stuff.

Take the New York law, for example. While it is indisputably a milestone, the law Gov. Kathy Hochul signed took a huge step back from the version of the bill that had passed both houses of New York’s state legislature. It was significantly weakened at the last hurdle. Why? The Times Union (Albany, N.Y.) reported that TechNet, which represents tech industry groups, launched a targeted lobbying assault on the governor, asking her to veto the bill or modify it to exempt specific types of companies from its coverage.

They succeeded in a few major ways. The bill passed by the legislature would have covered all digital electronics, such as phones, tablets, and IT equipment. The law, as modified by the governor, will only cover products made after July 1, 2023. It also walked back the legislature’s language by excluding products sold under “business-to-government” or “business-to-business” contracts. That could mean that schools, hospitals, and other organizations that manage a lot of devices will not benefit from the law. The modifications also added a couple of loopholes, such as one that allows companies to offer assemblies of parts rather than the individual parts. Manufacturers may see this as an invitation to circumvent the spirit of the law by making consumers buy unnecessary bundles of parts rather than just the one they need.

Finally, the law also says companies don’t have to provide materials needed to bypass security features, even though doing so is often a necessary step in the legitimate diagnosis and repair of electronic devices. This provision responds to debunked worries that allowing independent repairers to work on devices is a security risk. We’ve written before about why that’s nonsense. We urge lawmakers in other states who are looking at right to repair bills for 2023 not to fall into the same traps.

Companies know that the right to repair is popular, and the wins this year—especially in New York—show that advocates can rally people like you to tell lawmakers how important it is to the everyday person. Big firms are feeling the pressure. Microsoft, Apple, and even John Deere, which have all opposed the right to repair in the past, have bowed to pressure and made concessions.

Two things, however, show that we still need to push harder. First, voluntary company action is typically either done for public relations or is, at best, the product of compromise, and doesn’t address the problems people actually have. It can also come at a cost. For example, John Deere’s right to repair agreement with the Farm Bureau doesn’t fix all of the issues farmers face and doesn’t do anything to foster competition for repair. It also contains a promise that, in exchange for these half-measures, the organization won’t support any right to repair legislation. Time will tell if John Deere follows through on its side of the deal this time.

Second, the incredible lobbying effort still mobilized against right to repair laws, as in New York, shows that companies will make public promises, but privately don’t want to be held to them. That’s why anyone who cares about the right to repair should take this year as a sign to keep on pushing. Your work is making a difference. We just have to keep going.

Hayley Tsukayama

Fair Use Creep Is A Feature, Not a Bug

2 months 1 week ago

We're taking part in Copyright Week, a series of actions and discussions supporting key principles that should guide copyright policy. Every day this week, various groups are taking on different elements of copyright law and policy, and addressing what's at stake, and what we need to do to make sure that copyright promotes creativity and innovation.

Lawyers, scholars, and activists, including EFF, often highlight Section 512 of the Digital Millennium Copyright Act and Section 230 (originally of the Communications Decency Act) as the legal foundations of the internet. But there’s another, much older, doctrine that’s at least as important: Fair use, which dates back many decades and is codified in law as Section 107 of the Copyright Act. Fair use is, in essence, the right of the public to use a copyrighted work in a variety of circumstances, without the rightsholder’s permission. It’s why a reviewer can quote from the book they’re reviewing, a parody video can include excerpts from a movie, and security researchers can copy a software program in order to test it for malware.

Fair use is essential to the internet for at least two reasons. First, the vast majority of what we do online, from email to texting to viewing images and making TikToks, involves creating, replicating, and/or repurposing copyrighted works. Since copyright is a limited but lengthy monopoly over those works, in theory, using or even viewing them might require a license, now and for many decades to come.

Second, technological innovation rarely means starting from scratch. Instead, developers build on existing technologies, hopefully improving them. But if the technology in question involves code, it is likely copyrightable. If so, that add-on innovation might require a license from the rightsholder, giving them a veto right on technological development.

As digital technologies dramatically (and sometime controversially) expand the reach of copyright, fair use helps ensure that the rights of the public expand as well.

Examples abound. In 2021, for example, the Supreme Court held that Google’s use of certain Java Application Programming Interfaces (APIs) was a lawful fair use. While we argued that the APIs weren’t copyrightable in the first place, the decision gave more legal certainty to software developers’ common practice of using, re-using, and re-implementing software interfaces written by others, a custom that underlies most of the internet and personal computing technologies we use every day. Or consider Authors Guild v. HathiTrust, where the Second Circuit Court of Appeals held that fair use sheltered book digitization. Contrary to the complaints of rightsholders, neither decision has discouraged investment in new creativity.

Today, fair use is helping to defend the efforts of public interest organizations to share culture, ideas, and knowledge in ways that would never have been possible without the internet. In one case, at stake is the ability of librarians to make decisions about how to curate and lend the books in their collections. In another, at stake is access to the law.

In Hachette v. Internet Archive, four of the biggest publishers in the world are trying to shut down Controlled Digital Lending, which allows people to check out digital copies of books for two weeks or less and permits patrons to check out only as many copies as the Archive and its partner libraries physically own. That means that if the Archive and its partner libraries have only one copy of a book, then only one patron can borrow it at a time.
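The core of that "owned-to-loaned" rule is a simple invariant, sketched below in Python. This is an illustrative model only, not the Internet Archive's actual lending software; the class and method names are hypothetical.

```python
# Illustrative sketch of the Controlled Digital Lending invariant:
# simultaneous digital checkouts of a title never exceed the number
# of physical copies the library owns. (Hypothetical code, not the
# Internet Archive's real system.)

class CDLTitle:
    def __init__(self, owned_copies: int):
        self.owned_copies = owned_copies  # physical copies the library holds
        self.checked_out = 0              # digital loans currently outstanding

    def borrow(self) -> bool:
        """Lend one digital copy only if an owned copy is not already on loan."""
        if self.checked_out < self.owned_copies:
            self.checked_out += 1
            return True
        return False  # every owned copy is on loan; the patron must wait

    def return_copy(self) -> None:
        """A patron returns a digital loan, freeing up one owned copy."""
        if self.checked_out > 0:
            self.checked_out -= 1

# With a single owned copy, only one patron can borrow at a time.
book = CDLTitle(owned_copies=1)
assert book.borrow() is True    # first patron gets the loan
assert book.borrow() is False   # second patron must wait
book.return_copy()              # first patron returns the book
assert book.borrow() is True    # the copy is available again
```

However the real system is implemented, the point is the same: the digital loan stands in for the physical copy, one for one.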

Supported by authors, libraries, and scholars, the Internet Archive has explained that CDL is a lawful fair use that serves copyright’s ultimate purpose: enriching our common culture. Through CDL, the Internet Archive is fostering research and learning by helping its patrons access books and by keeping books in circulation when their publishers have lost interest in them. Digital lending also allows patrons to borrow books without having their reading habits tracked by commercial entities, like OverDrive and Amazon, that may not share librarians’ traditional commitment to protecting privacy. Perhaps most importantly, it gives librarians the power to curate their own digital collections, just as they curate their physical collections. If the publishers have their way, however, books, like an increasing amount of other copyrighted works, will only be rented, never owned, available subject to the publishers’ whim.

In ASTM et al v. Public.Resource.Org, three huge industry associations are trying to prevent a tiny nonprofit, Public.Resource.Org, from posting online standards, such as building codes, that have been made into laws. Our laws belong to all of us, and we should be able to find, read, and comment on them free of registration requirements, fees, and other roadblocks. The industry associations insist that because they helped shepherd the volunteers who actually develop those standards, they own and can control access to those laws. As Public Resource explained to a federal appeals court last year, even assuming the standards can be subject to copyright at all, posting them online, for free, to facilitate research and comment, is a quintessential fair use. A lower court has already reached that conclusion, and we expect the appeals court will agree.

The lawsuits are ongoing, but these projects, and the benefits they create, might not exist at all if these nonprofits couldn’t rely on the fair use doctrine.

But even where a use is clearly lawful and fair, efforts to invoke it can be stymied by practical, technical, and legal barriers. Defending fair uses can be expensive. As Professor Larry Lessig once said, “Fair use is the right to hire a lawyer” – and many of us don’t have the resources to do that, nor access to pro bono counsel. Worse, rightsholders often rely on a combination of contracts, technical measures, and legal constraints to prevent or inhibit fair uses. In the gaming space, for example, vendors require users to agree to contracts that forbid them from using add-on services, and do not hesitate to sue third parties who try to provide those services. They put digital locks on games to block efforts to remix or even just preserve games for posterity. And if anyone breaks those digital locks, even for otherwise lawful reasons, they may face a legal claim under Section 1201 of the DMCA.

But this problem goes far beyond traditional creative industries. Manufacturers of everything from medical devices to tractors use the same tactics to prevent independent repair and competitive innovation that are otherwise protected fair uses.

As technology creeps into every facet of our lives, rightsholders will continue to look to copyright to jealously guard their legacy position as gatekeepers. Fortunately for the public, fair use has likewise grown to protect the original purpose of copyright: to encourage forward progress. And no matter what Hollywood or John Deere tells you, that’s a feature, not a bug.

Related Cases: Oracle v. Google; Hachette v. Internet Archive; Freeing the Law with Public.Resource.Org; Authors Guild v. HathiTrust
Corynne McSherry

Have You Tried Turning It Off and On Again: Rethinking Tech Regulation and Creative Labor

2 months 1 week ago

We're taking part in Copyright Week, a series of actions and discussions supporting key principles that should guide copyright policy. Every day this week, various groups are taking on different elements of copyright law and policy, and addressing what's at stake, and what we need to do to make sure that copyright promotes creativity and innovation.

“The creatures outside looked from pig to man, and from man to pig, and from pig to man again; but already it was impossible to say which was which.” -George Orwell, Animal Farm

The Internet Copyright Wars are in their third decade, and despite the billions of dollars and trillions of phosphors spilled on its battlegrounds around the world, precious little progress has been made. A quarter of a century after Napster’s founding, we’re still haunted by the same false binaries that have deadlocked us since the era of 56k modems:

  • Team User v. Team Creator. Creators are users, and not merely because “everything is a remix.” Creative labor builds on the works that came before it. “Genre” is just another word for “works that share a common set of touchstones, norms and assumptions.”
  • Big Tech v. Big Content. Entertainment monopolies aren’t staunch defenders of the creative workers whose labors generate their profits (far from it!) and tech giants aren’t selfless liberators of oppressed artists stuck sharecropping for legacy entertainment companies (not by a long chalk!). No matter whether a giant multinational is a member of the MPA or TechNet, it has the same overriding imperative: to reduce its wage bill and thus retain more earnings for its shareholders.

There is nothing especially virtuous or wicked about either tech companies or entertainment companies. Indeed, in an era in which Google owns the world’s most popular video site; where Amazon and Apple both own movie and television studios; where Microsoft owns multiple game production studios, and where the Big Three music labels own substantial stakes in Spotify, there is no longer a meaningful distinction between “a giant tech company” and “a giant entertainment company.” Both are simply: “a giant company.”

And giant companies are gonna giant company. As paperclip-maximizing artificial life-forms, limited liability corporations are on a remorseless, ceaseless quest for ways of reducing the cost of their inputs, and if payments to creative workers can be squeezed, they will be.

Advanced economies around the world have spent the past 40 years expanding copyright. Today, copyright lasts longer and covers more works than ever, with higher damages and lower bars to securing them than ever. Companies that sell entertainment products are more profitable than ever, and the entertainment sector is larger than ever.

But the share of that income going to creative workers is lower than it has been in generations, and it is continuing to decline.

Even if you think that copyright’s only legitimate purpose is to incentivize creativity, this stinks. No one listens to a song because they loved the record executive who signed the performer’s royalty statement or read a book because they wanted to reward the hard work of the lawyer who drafted the author’s contract. A copyright system that makes intermediaries richer and creative workers poorer is indefensible.

How can more copyright lead to less money for creators? To answer this question, we need to look at the structure of the entertainment and tech sectors. The web has been degraded into “five giant websites, each filled with screenshots of the other four.” 

The entertainment industry is no better, consisting of:

  • Five giant publishers;
  • Four giant movie studios;
  • Three giant record labels (who own three giant music publishers);
  • Two giant ad-tech companies (and two giant app companies);
  • One giant ebook and audiobook retailer.

As these platforms have locked up billions of users inside walled gardens, they have made it all-but-impossible for creators to reach their audiences without first acceding to whatever terms a massive gatekeeper demands.

Under these market conditions, giving a creator extra copyright is like giving a bullied kid extra lunch money: it doesn’t matter how much money you give that kid, the bullies are going to take it all. This is true even - especially - if the bullies use some of that stolen lunch money to pay for a massive global ad campaign exhorting us to think of the poor hungry kids and demanding that we give them even more lunch money.

To create a copyright system that works for creative workers and their audiences, we need to think beyond copyright, to non-copyright policies that would make copyright better.

The fight that matters isn’t tech vs. content—it’s corporate consolidation vs. creative workers and their audiences. We won’t win that fight with ever-more-draconian copyright laws - we’ll win it with interventions that are laser-focused on increasing worker power, blunting corporate power, and transferring cash from the corporate side of the ledger to the creators’ side.

Cory Doctorow

Open Data and the AI Black Box

2 months 1 week ago

We're taking part in Copyright Week, a series of actions and discussions supporting key principles that should guide copyright policy. Every day this week, various groups are taking on different elements of copyright law and policy, and addressing what's at stake and what we need to do to make sure that copyright promotes creativity and innovation.

Artificial Intelligence (AI) grabs headlines with new tools like ChatGPT and DALL-E 2, but it is already here and having major impacts on our lives. Increasingly we see law enforcement, medical care, schools and workplaces all turning to the black box of AI to make life-altering decisions—a trend we should challenge at every turn. 

The vast and often secretive data sets behind this technology, used to train AI with machine learning, come with baggage. Data collected through surveillance and exploitation will reflect systemic biases and be “learned” in the process. In their worst form, the buzzwords of AI and machine learning are used to "tech wash" this bias, allowing the powerful to buttress oppressive practices behind the supposed objectivity of code.

It's time to break open these black boxes. Embracing collaboratively maintained Open Data sets in the development of AI would not only be a boon to transparency and accountability for these tools, but makes it possible for the would-be subjects to create their own innovative and empowering work and research. We need to reclaim this data and harness the power of a democratic and open science to build better tools and a better world.

Garbage in, Gospel out

Machine Learning is a powerful tool, and there are many impressive use cases, like searching for signs of life on Mars or building synthetic antibodies. But at their core, these algorithms are only as "intelligent" as the data they're fed. You know the saying: "garbage in, garbage out." Machine Learning ultimately relies on training data to learn how to make good guesses—the logic behind which is typically unknown even to the developers. But even the best guesses shouldn’t be taken as gospel.

Things turn dire when this veiled logic is used to make life-altering decisions. Consider the impact of predictive policing tools, which are built on a foundation of notoriously inaccurate and biased crime data. This AI-enabled search for "future crimes" is a perfect example of how this new tool launders biased police data into biased policing—with algorithms putting an emphasis on already over-policed neighborhoods. This self-fulfilling prophecy even gets rolled out to predict criminality by the shape of your face. Then when determining cash bail, another algorithm can set the price using data riddled with the same racist and classist biases.

Fortunately, transparency laws let researchers identify and bring attention to these issues. Crime data, warts and all, is often made available to the public. This same transparency is not expected from private actors like your employer, your landlord, or your school.

The answer isn’t simply to make all this data public. Some AI is trained on legitimately sensitive information, even if it is publicly available. These datasets are toxic assets, sourced from a mix of surveillance and compelled data disclosures. Preparation of this data is itself dubious, often relying on armies of highly exploited workers with no avenues to flag issues with the data or its processing. And despite many "secret sauce" claims, anonymizing these large datasets is very difficult, maybe even impossible, and the impacts of a breach would disproportionately fall on the people tracked and exploited to produce them.

Instead, embracing collaboratively maintained open data sets would empower data scientists, who are already experts in transparency and privacy issues pertaining to data, to maintain them more ethically. By pooling resources in this way, consensual and transparent data collection would not only help address these biases, but also unlock the creative potential of open science for the future of AI.

An Open and Empowering Future of AI

As we see elsewhere in Open Access, this removal of barriers and paywalls helps less-resourced people access and build expertise. The result could be an ecosystem where AI doesn’t just serve the haves over the have-nots, but in which everyone can benefit from the development of these tools.

Open Source software has long proven the power of pooling resources and collective experimentation. The same holds true of Open Data—making data openly accessible can expose deficits and let people build on one another's work more democratically. Purposefully biasing data (or "data poisoning") is possible, but this unethical behavior already happens in less transparent systems, where it is harder to catch. While a move towards using Open Data in AI development would help mitigate bias and phony claims, it’s not a panacea; even harmful and secretive tools can be built with good data.

But an open system for AI development, from data, to code, to publication, can bring many humanitarian benefits, like in AI’s use in life-saving medical research. The ability to remix and quickly collaborate on medical research can supercharge the research process and uncover missed discoveries in the data. The result? Tools for lifesaving medical diagnosis and treatments for all peoples, mitigating the racial, gender, and other biases in medical research.

Open Data makes data work for the people. While the expertise and resources needed for machine learning remain a barrier for many, crowd-sourced projects like Open Oversight already empower communities by bringing visibility and transparency to law enforcement. Being able to collect, use, and remix data to make their own tools brings AI research from the ivory towers to the streets and breaks down oppressive power imbalances.

Open Data is not just about making data accessible. It's about embracing the perspectives and creativity of all people to set the groundwork for a more equitable and just society. It's about tearing down exploitative data harvesting and making sure everyone benefits from the future of AI.

Rory Mir

Digital Rights Updates with EFFector 35.1

2 months 1 week ago

It's a new year! There's no better time to keep up with the latest updates on your digital rights. Version 35, issue 1 of our EFFector newsletter is out now. Catch up on the latest EFF news by reading our newsletter or listening to the audio version below. This issue covers a collection of EFF's 2022 Year in Review posts (seriously, there are a lot of them!) as well as some upcoming events EFF will be attending and even new job postings.


EFFECTOR 35.1 - Digital Rights In Review 2022

Make sure you never miss an issue by signing up by email to receive EFFector as soon as it's posted! Since 1990 EFF has published EFFector to help keep readers on the bleeding edge of their digital rights. We know that the intersection of technology, civil liberties, human rights, and the law can be complicated, so EFFector is a great way to stay on top of things. The newsletter is chock full of links to updates, announcements, blog posts, and other stories to help keep readers—and listeners—up to date on the movement to protect online privacy and free expression. 

Thank you to the supporters around the world who make our work possible! If you're not a member yet, join EFF today to help us fight for a brighter digital future.

Christian Romero

US Copyright Term Extensions Have Stopped, But the Public Domain Still Faces Threats

2 months 1 week ago

We're taking part in Copyright Week, a series of actions and discussions supporting key principles that should guide copyright policy. Every day this week, various groups are taking on different elements of copyright law and policy, and addressing what's at stake, and what we need to do to make sure that copyright promotes creativity and innovation.

Every January 1st, we celebrate the creative works that become free to use and adapt as their copyright expires. This year, that includes the iconic sci-fi film “Metropolis,” the first Best Picture Oscar winner “Wings,” and the last of the Sherlock Holmes stories by Arthur Conan Doyle. Along with these famous works, many thousands of cultural artifacts from 1927 and earlier can now be used by artists, educators, and businesses without fear of massive copyright liability—if any copies can be found.

For most of the 21st century, these works have been under legal lock and key. Following the 20-year Sonny Bono Copyright Term Extension Act that Congress passed in 1998, no copyright terms expired in the U.S. until 2019. The cost of locking works away for so long is staggering: researchers estimate that 75% of the films of the silent era have been lost.

The 1998 extension capped several decades of copyright term expansions that ultimately put U.S. copyrights among the longest in the world. Even though the next 20 years will see many more significant works enter the public domain, including Disney’s famous early films like Snow White, Bambi, and Fantasia, the major media and entertainment companies haven’t called for another term extension—and none seems likely.

Why did U.S. copyright terms stop their relentless growth? Because people from all walks of life stood up and said “no more!” The Internet has made everyone a creator and a user of creative work, whether photos, video, music, or prose. Internet users recognized that ever-longer copyright terms impoverish the public conversation and benefit almost no one. Over the past decade, you’ve made your voices heard and made further term extensions toxic for U.S. lawmakers.

The public domain still faces threats. Canada is poised to enact its own 20-year term extension. We can also expect rightsholders with lots of legal firepower, like Disney, to try and stretch trademark law into what the Supreme Court once called “a species of mutant copyright,” to keep others from building on old characters, books, and films.

Copyright terms remain far too long. It will be nearly two decades before a filmmaker making a documentary about the World War II era can use music recordings from the period without facing what the Recording Industry Association of America and other music industry groups have called a “staggeringly complex” licensing process—or else risking massive and unpredictable statutory damages in a copyright suit.

Rather than preserving culture, long and complicated copyright terms keep us from our history. And that cannot be what copyright was meant to do.

Mitch Stoltz

It’s Copyright Week 2023: Join Us in the Fight for Better Copyright Law and Policy

2 months 1 week ago

We're taking part in Copyright Week, a series of actions and discussions supporting key principles that should guide copyright policy. Every day this week, various groups are taking on different elements of copyright law and policy, and addressing what's at stake, and what we need to do to make sure that copyright promotes creativity and innovation.

One of the interesting side effects of the internet is that more people than ever are aware of copyright. Pretty much everyone online has seen some version of the “this media is no longer available due to a copyright claim” notice on something they wanted to see. Copyright affects everything from what entertainment we see to which of our devices we can repair. This is why we must fight for copyright law and policy that serves everyone.

Eleven years ago, a diverse coalition of Internet users, non-profit groups, and Internet companies defeated the Stop Online Piracy Act (SOPA) and the PROTECT IP Act (PIPA), bills that would have forced Internet companies to blacklist and block websites accused of hosting copyright infringing content. These were bills that would have made censorship very easy and harmed legitimate speech, all in the name of copyright enforcement.

Last year, there was a bevy of bad copyright and copyright-related proposals in the U.S. Because thousands of you spoke up, none of them made it into the year-end, must-pass bills in Congress.

But this week isn’t just about stopping bad proposals. It’s about celebrating positive changes for all of us. It’s about right to repair, fair use, and the public domain.

And so, every year, EFF and a number of diverse organizations participate in Copyright Week. Each year, we pick five copyright issues to highlight and advocate for a set of principles of copyright law. This year’s issues are:

  • Monday: Public Domain. The public domain is a crucial resource for innovation and access to knowledge. Copyright should strive to promote, and not diminish, a robust, accessible public domain.
  • Tuesday: Digital Ownership. As the things we buy increasingly exist either in digital form or as devices with software, we also find ourselves subject to onerous licensing agreements and technological restrictions. If you buy something, you should be able to truly own it – meaning you can learn how it works, repair it, remove unwanted features, or tinker with it to make it work in a new way.
  • Wednesday: Open Access. Having a level playing field when accessing the latest information isn’t just good for science, it’s fundamental to human rights worldwide. As we’ve seen in the global response to COVID-19, copyright shouldn’t get in the way of open collaboration and global equity.
  • Thursday: Free Expression and Fair Use. Copyright policy should encourage creativity, not hamper it. Fair use makes it possible for us to comment, criticize, and rework our common culture.
  • Friday: Copyright Enforcement Tools as Censorship. Freedom of expression is a fundamental human right essential to a functioning democracy. Copyright should encourage more speech, not act as a legal cudgel to silence it.

Every day this week, we’ll be sharing links to blog posts and actions on these topics at https://www.eff.org/copyrightweek and at #CopyrightWeek on Twitter.

As we said last year, and the year before that, if you too stand behind these principles, please join us by supporting them, sharing them, and telling your lawmakers you want to see copyright law reflect them.

Katharine Trendacosta

Kurt Opsahl Moves to EFF Special Counsel

2 months 2 weeks ago

Longtime EFFer and Deputy Executive Director and General Counsel Kurt Opsahl will be moving on from the Electronic Frontier Foundation after nearly 20 years, on February 1. But we aren’t going to let him go too far: Kurt will continue on as a Special Counsel of EFF.  Kurt will be joining the Filecoin Foundation as Associate General Counsel for Cybersecurity and Civil Liberties Policy.

Kurt joined EFF in 2004, and has been a key part of nearly every big fight for digital rights since then. Over the years, he established the reporters’ privilege for online journalists, served as our lead attorney helping security researchers on the Coders' Rights Project, fought against copyright trolls and warrantless mass surveillance, and represented a number of companies who challenged secret National Security Letters. He’s been a big part of bringing EFF’s message to the world, talking to reporters about everything from TikTok to Tornado Cash to the metaverse, in outlets ranging from the New York Times to Popular Science to BBC News to the Washington Post.

Many EFF supporters know Kurt from his years spreading the word about EFF at security conferences like DEF CON, one of the oldest and largest hacker conventions in the world. As Special Counsel, Kurt will continue providing pro bono legal advice and counsel to the security community through the Coders' Rights Project, so keep an eye out for him at DEF CON and elsewhere. Thank you so much for your work at EFF until now, Kurt, for your upcoming work as one of our Special Counsel, and, of course, as Quizmaster for our annual Cyberlaw Trivia Night.

Rebecca Jeschke

Beware the Gifts of Dragons: How D&D’s Open Gaming License May Have Become a Trap for Creators

2 months 2 weeks ago

UPDATE: It's been reported as of January 14, 2023 that Wizards of the Coast has backed off on their plans for now. 

According to leaks reported last week, the company that owns Dungeons and Dragons (D&D) is planning to revoke the open license that has, since the year 2000, applied to a wide range of unofficial, commercial products that build on the mechanics of Dungeons and Dragons. The report indicates that this wouldn’t simply be a change going forward, but would affect existing works that relied on the license. The old license would be revoked for existing uses, and people who have relied on it would be forced to adopt new terms or renegotiate with the company, Wizards of the Coast, a subsidiary of game giant Hasbro.

Obviously, this would be a rude and unfair thing to do to people who have accepted the invitation of the open gaming license (OGL) to create new games and stories that build upon Dungeons and Dragons. But would it be legal?

Even more interesting, would revoking the OGL actually give some third parties more freedom to operate, given that the OGL forced them to promise not to do some things that copyright and trademark law otherwise permit?

Let’s find out.

What is an open license?

An open license is an offer to allow people to use your materials in the ways you specify, despite some legal right such as a copyright that would otherwise entitle you to withhold permission. For instance, the Creative Commons Attribution license provides rights to adapt and share a copyrighted work, so long as the user gives you credit, or “attribution.”

If you have a copyrighted work and you want to give people reassurance that they can make use of it, open licenses are a handy way to do that. You might do this because you want your work to be freely shared far and wide or because you want to build a community of creativity.

But an open license only makes sense if the work is actually copyrightable, meaning, you would otherwise have the legal power to stop someone from doing what you want to permit. For instance, if I put together an uncopyrightable phone book composed of bare facts organized alphabetically, then people are already free to use it and there is nothing for me to “license.”

What’s copyrightable about a roleplaying game?

A roleplaying game is like a cross between improvisational acting and playing a board game. It’s called a roleplaying game because players take on fictional personas, or roles, and narrate or act out their actions within the shared fictional narrative. Roleplaying games are typically published in books that describe the rules of play and may also include scenarios and fictional settings to supplement the stories that players invent.

Copyright grants an author a limited monopoly over their creative expression. It doesn’t cover bare facts, mere ideas, systems, or methods. But it does cover the creative way that a person expresses facts, ideas, and so forth, provided that the expression has sufficient creativity. A roleplaying game book often includes both a description of a mechanical system and creative, fictional elements.

When describing a noncopyrightable game mechanic, I might do it in a dry, noncopyrightable way, or I might do it in a creative, copyrightable way.

For example, if I want to describe a magic spell that turns someone invisible in a game, a non-copyrightable way to do it might be:

Invisibility spell: You must speak magic words and touch your target. When you do, they become invisible for one hour. You may end this spell whenever you wish. This spell ends automatically if your target makes an attack or casts a spell.

While there are different word choices that could be made in some places, this is a functional description of how the spell works as a game mechanic. You have to speak, so it doesn’t work if you’re gagged. You have to touch the target, so you need to be close to them. And so on. Functional descriptions aren’t copyrightable.

Here’s a different version:

Kit’s Shroud of Concealment spell: Incant “stars’ blight upon all sight” and touch your target. When you do, the spirits of the constellations descend to wrap them in an unearthly mist that makes them invisible for one hour. You may send the spirits home whenever you wish, ending the spell. The spirits depart automatically if your target makes an attack or casts a spell. This spell was developed by the Sorceress-Lawyer Kit when she negotiated the contract between the Tower of Sorcery and the Constellation Spirits in the year of the Fallen Mountain.

If all that additional text is just fluff with no game consequences, this version probably contains some elements that are copyrightable.

However, if other game elements trigger when spirits are present, or if someone says a rhyme, or based on other fictional elements described here, then the uncopyrightable game system might “merge” with the text here so that it wouldn’t be infringement for someone to reproduce this text entirely. Courts are essentially interested in whether the uncopyrightable elements of the work remain available for the public to use, or if the copyright owner is effectively monopolizing them because there simply aren’t many different ways to describe the uncopyrightable ideas or system.

Contrast this with a five-page narrative about the history of the Sorceress-Lawyers, which would likely be covered by copyright just like any other fictional narrative. It’s also possible to have a copyright in the selection and arrangement of uncopyrightable elements: if my game arranges the spells according to which fictional spellcaster invented them, then I might have a copyright in that particular arrangement, and a user should probably put them in alphabetical order or order of increasing power or something else conventional and functional if they don’t want to risk potential infringement.

The exact lines of copyrightability are going to vary from game to game and even page to page. For now, it suffices to say that there is a lot of text in a roleplaying game that can be shared without infringing copyright, and there’s also text that will have copyright attached to it. And, of course, this legal analysis focuses on US law; there will likely be different considerations elsewhere in the world, particularly in jurisdictions without robust protection for the right to make fair uses of copyrighted works.

What did Wizards of the Coast offer under their Open Gaming License 1.0a?

The version of the Open Gaming License (OGL) that has existed since 2000 is very narrow. It permits use of “the game mechanic and includes the methods, procedures, processes and routines to the extent such content does not embody the Product Identity and is an enhancement over the prior art and any additional content clearly identified as Open Game Content by the Contributor.” You’ll notice that these are the elements that are not copyrightable in the first place. So the only benefit that the OGL offers, legally, is that you can copy verbatim some descriptions of some elements that otherwise might arguably rise to the level of copyrightability.

But if you accept the terms of the OGL (more on that later), you agree not to use a lot of other things that the license defines as “Product Identity,” including “product and product line names, logos and identifying marks including trade dress; artifacts; creatures characters; stories, storylines, plots, thematic elements, dialogue, incidents, language, artwork, symbols, designs, depictions, likenesses, formats, poses, concepts, themes and graphic, photographic and other visual or audio representations; names and descriptions of characters, spells, enchantments, personalities, teams, personas, likenesses and special abilities; places, locations, environments, creatures, equipment, magical or supernatural abilities or effects, logos, symbols, or graphic designs; and any other trademark or registered trademark clearly identified as Product identity by the owner of the Product Identity, and which specifically excludes the Open Game Content.”

For most users, accepting this license almost certainly means you have fewer rights to use elements of Dungeons and Dragons than you would otherwise. For example, absent this agreement, you have a legal right to create a work using noncopyrightable elements of D&D or making fair use of copyrightable elements and to say that that work is compatible with Dungeons and Dragons. In many contexts you also have the right to use the logo to name the game (something called “nominative fair use” in trademark law). You can certainly use some of the language, concepts, themes, descriptions, and so forth. Accepting this license almost certainly means signing away rights to use these elements. Like Sauron’s rings of power, the gift of the OGL came with strings attached.

The primary benefit is that you know under what terms Wizards of the Coast will choose not to sue you, so you can avoid having to prove your fair use rights or engage in an expensive legal battle over copyrightability in court.

Adoption of the OGL

Despite the stinginess of the OGL, it provided legal certainty that many individuals and small game publishers used to make new games and new material for Dungeons and Dragons. Some of these product lines have been around for decades and developed their own following. One of the neat things about a cultural commons is that people can find something they kind of like and then tweak it to be just right for them, rather than settling for a one-size-fits-all approach. House rules and variants had always been a part of roleplaying game culture, and D&D culture specifically, and now the practice had official permission (even if it never needed permission in the first place as a legal matter).

Revocation of the OGL

If the reported leaks are accurate, and if Wizards of the Coast goes ahead with a plan to revoke the OGL, then people who publish and distribute works relying on the OGL will have to re-evaluate their legal position. If they’re doing something that would be copyright infringement absent a license, they may face legal risk.

As a threshold question, can Wizards of the Coast legally revoke their license? Other open licenses like Creative Commons licenses and the GPL are clear that the rights they grant are irrevocable. At the very least, this means that once you rely on the license to make something, you can keep making it and distributing it no matter what the copyright owner says (as long as you comply with the terms of the license).

UPDATE January 11, 2023: As the community has scrutinized Wizards of the Coast's past statements, it's become very clear that Wizards always thought of this as a contract with obligations for both sides (for instance their 2001 OGL FAQ v 1.0). Unlike a bare license without consideration, an offer to contract like this cannot be revoked unilaterally once it has been accepted, under the law of Washington (where they are located) and other states. Since the contract is accepted when someone “uses” the licensed material, then people who relied on the OGL 1.0a have a good argument under contract law that Wizards of the Coast cannot unilaterally withdraw the value that it offered under the contract. This would apply to people who “accepted” the OGL 1.0a by using the relevant material prior to receiving notice that Wizards is rescinding that offer. In short, games that held up their end of the bargain under the OGL 1.0a are entitled to the benefit Wizards of the Coast promised them under that contract. But Wizards can revoke the offer of the OGL 1.0a as to new potential users who haven't yet accepted its terms.

The OGL 1.0a does specifically address new versions and gives the recipient the right to use “any authorized version” of the license “to copy, modify and distribute any Open Game Content originally distributed under any version.” This means that people who accepted OGL 1.0a have the right to use its terms for anything licensed under a subsequent OGL 1.1, so long as the OGL 1.0a remains an “authorized version.” The leaks suggest that Wizards wishes to construe this term to mean “a version that they have, in their full discretion, decided to keep authorizing on any given day,” but a better reading would be that it's any license they have authorized, as opposed to an OGL that wasn't associated with Wizards. This is particularly true since courts construe ambiguity in unilateral contracts against the party that drafted them.

Read on for the original post language in italics, analyzing the OGL as if it were a bare license and explaining the difference between the terms "perpetual" and "irrevocable" in licensing.

The OGL does not say that it is irrevocable, unfortunately. It’s possible that Wizards of the Coast made other promises or statements that will let the beneficiaries of the license argue that they can’t revoke it, but on its face it seems that they can.

Some have pointed to the word “perpetual” to argue that the license is irrevocable, but these are different concepts in the law of licenses. Perpetual means that the license will not expire due to time passing, that’s all. In RPG terms, consider the invisibility spell. “Perpetual” is like the duration; the spell lasts for one hour. But the caster can dismiss it at any time: that’s like revocation. And if the invisible person makes an attack, the spell ends automatically: that’s like a license terminating because of a condition being met, usually breaching the terms of the license. Just like the magic spell, these are three independent concepts.

What Wizards of the Coast can’t do is revoke the license, yet continue to hold users to the restrictions in the OGL. If they revoke it, then the people who have relied on the license are no longer under an obligation to refrain from using “Product Identity” if they do so in ways that are fair use or otherwise permitted under copyright law. And unless they are using actually copyrighted material in a way that would infringe copyright, there may be little incentive to agree to such restrictions, let alone the new restrictions and potential royalty obligations of any new version of the OGL that comes along.

Can a player or publisher avoid the terms of the OGL, old or new?

The OGL 1.0a includes a strange term claiming that you agree to be bound by this contract by “using” the “Open Game Content,” such as the mechanics. Wizards of the Coast wrote D&D’s license to operate like a cursed helm, where you’re doomed the moment you put it on.

Fortunately, that’s not how contracts work. If it were, then in my book I could write a contract saying that you owe me $10,000 if you write a bad review or author a competing book in the same genre. Contracts require an offer, acceptance, and some kind of value in exchange, called “consideration.” If you sell a game, you are inviting the reader to play it, full stop. Any additional obligations require more than a rote assertion. And for many readers, a bunch of legalese buried in one page of a 200-page book wouldn’t even be effective notice that a supposed contract exists.

However, there are a few ways a person might bind themself to this agreement. If you publish a book and say “published pursuant to OGL 1.0a” or something along those lines you’ve pretty clearly agreed to it. You might also arguably have agreed to it as part of signing up for an online account with Wizards of the Coast. There are arguments against the enforcement of clickwrap agreements, particularly ones that restrict your speech rights, but there certainly are clickwrap agreements that do get enforced.

For someone who wants to make a game that is similar mechanically to Dungeons and Dragons, and even announce that the game is compatible with Dungeons and Dragons, it has always been more advantageous as a matter of law to ignore the OGL. Practicality may dictate a different result when up against the legal team of a large corporation, but if the terms of the OGL are revoked and the new OGL proves even more onerous, that might change the calculus for creators going forward.

Lessons for Other Open Licenses and for Fan Creators

Open licenses can involve a lot of legalese that makes them hard for a layperson to understand, but if you’re going to rely on one, or if you want others to rely on your own open license, it’s important to use one that is robust and meets your needs. Licenses like Creative Commons and the GNU General Public License were written to serve the interests of creative communities, rather than a corporation, and it shows. Beware corporate policies about the acceptable use of their copyrighted materials that wind up being restrictions on your fair use rights rather than the grant of meaningful permission.

Kit Walsh

EFF and Partners Call Out Threats to Free Expression in Draft Text as UN Cybersecurity Treaty Negotiations Resume

2 months 2 weeks ago

This week and next, EFF is attending a new round of negotiations over the proposed UN Cybercrime Treaty to raise concerns that draft provisions now on the table include a long list of content-related crimes that pose serious threats to free expression, privacy, and the legitimate activities of journalists, whistleblowers, activists, and others.

In talks starting today and running through January 20 in Vienna, we will fight for users to ensure that strong human rights safeguards are embedded in the treaty, and that proposed criminal offenses are narrow in scope, limited to core cybercrimes, and exclude content-based crimes and crimes that are considered “cyber” just because technology was used to commit them. We will also be fighting against the inclusion of overbroad and undefined concepts that could potentially authorize surveillance measures such as government hacking, as well as any provision that could undermine encryption.

EFF and Privacy International articulated these concerns in a joint submission, delivered to Member States last month, that includes detailed observations and recommendations to limit the scope of the proposed treaty, including limiting criminal conduct to only offenses in which information and communications systems are the direct objects and instruments of the crimes. You can read the full submission here.

Moreover, in a letter released today to the UN Ad Hoc Committee facilitating the negotiations, EFF and more than 74 digital and human rights organizations in more than 45 countries and regions expressed grave concerns that the draft text released by the committee on November 7 calls for Member States to treat various kinds of speech—much of which would be fully protected under international human rights law—as a criminal offense. The text includes a long list of crimes that interfere with protected speech and fail to comply with permissible restrictions on free expression established by international human rights standards.

The letter is part of our year-long push, along with Privacy International, Human Rights Watch, ARTICLE 19, Access Now, Derechos Digitales, and many other allies, to ensure that human rights are baked into the proposed UN treaty, so it’s not used as a tool to censor legal speech, conduct illegal surveillance, violate privacy rights, or stifle free expression.

The proposed UN Cybercrime Treaty has the potential to rewrite criminal laws and procedures around the world, adding new offenses and creating new police powers for both domestic and international investigations, implicating the rights of billions of people worldwide.

Today’s session, the fourth of seven such meetings that started last February, marks the beginning of a critical phase in the negotiations. In three sessions last year, Member States exchanged views on the convention’s objectives, scope, and structure, and proposed what they believe should be key elements of the treaty. This work culminated in the “consolidated negotiating document (CND),” a summary of Member States’ submissions that will serve as a basis for negotiating, starting today, the treaty’s final text.

The session will focus on criminalization provisions, general provisions, and procedural measures and law enforcement. In three final negotiating sessions scheduled over the course of this year, the parties will seek agreement to finalize and approve the draft and send a resolution on it to the UN General Assembly for consideration and adoption in early 2024.

As it stands, the CND contains many troubling provisions. We are particularly concerned that the draft includes such content crimes as “extremism-related offences” (Article 27), “denial, approval, justification or rehabilitation of genocide” (Article 28), and “terrorism-related offences” (Article 29). Provisions criminalizing the distribution of material online “motivated by political, ideological, social, racial, ethnic or religious hatred” should be struck from the proposed text. Further, language making “the spreading of strife, sedition, hatred or racism” via information and communications technologies a crime should also be excluded.

These overly broad, undefined, and subjective terms will undoubtedly sweep up legitimate expression, news reporting, protest speech, and more. We are not alone in these concerns. ARTICLE 19 points out in its comments on the CND text that several content-based Articles do not comply with freedom of expression standards, and the CND conflates “cybercrime” with privacy and data protection concepts, muddying legal frameworks that have been historically and deliberately separated at the national level. ARTICLE 19 urged Member States to seriously reconsider their efforts and make sure the CND does not include provisions that violate the human rights standards the CND text claims to prioritize.

Broad and undefined concepts such as “terrorism” and “extremism” should not be used as a basis to restrict freedom of expression. As we explained in our letter to the committee, there are no uniform definitions of these concepts in international law, and many States take advantage of this ambiguity to justify human rights abuses, such as politically motivated arrests and prosecutions of civil society members, independent media, and opposition parties, among others.

Over the next two weeks, EFF and its allies are committed to working on behalf of users to ensure that any new draft CND to emerge from this session is aligned with the principles and standards that are crucial to protect fundamental rights, including free expression, of those who will be subject to the treaty for decades to come.

Karen Gullo

Last Chance for U.S. Federal Employees to Make a Pledge for EFF!

2 months 3 weeks ago

Calling all U.S. federal employees and retirees: the Combined Federal Campaign (CFC) pledge period is closing on January 14, 2023. Be sure to make a pledge for EFF now to support digital freedoms for every internet user. 

It's easy to donate to EFF through the CFC! Here are the steps:

  1. Scan the QR code below or go to https://GiveCFC.org
  2. Click the DONATE button to give via payroll deduction, credit/debit, or an e-check
  3. Be sure to use our CFC ID #10437

If you are renewing a pledge from last year, you can also easily increase your support by following the steps above! 

In 2021, U.S. federal employees raised over $34,000 for EFF during the CFC pledge period. That support has helped us secure some major victories, including several in the last month:

We’ve got our eyes on even more internet freedom issues in 2023 and we can’t continue without you. Support from federal employees has a tremendous impact on the work that we can do. Support EFF by using our CFC ID #10437 today!

Christian Romero

Data Sanctuary for Abortion and Trans Health Care: 2022 in Review

2 months 3 weeks ago

In the wake of this year’s Supreme Court decision in Dobbs overruling Roe v. Wade, sheriffs and bounty hunters in anti-abortion states will try to investigate and punish abortion seekers based on their internet browsing, private messaging, and phone app location data. We can expect similar tactics from officials in states that have prohibited transgender youths from obtaining gender-affirming health care. Indeed, the Texas governor ordered state child welfare officials to investigate such care as child abuse.

Many states are stepping forward to serve as health care sanctuaries for people seeking abortion or gender-affirming care that is not legal at home. These states must also be data sanctuaries. To be the safest refuge, a state that has data about people who sought abortion or gender-affirming health care must lock down that data, and not disclose it to adversaries who would use it to punish them for seeking that health care.

So it is great news that California Gov. Gavin Newsom recently signed three bills that will help meet these data privacy threats: A.B. 1242, authored by Asm. Rebecca Bauer-Kahan; A.B. 2091, authored by Asm. Mia Bonta; and S.B. 107, authored by Sen. Scott Wiener.

EFF supported all three bills. And we encourage other states to pass similar bills. They create new reproductive and trans health data exemptions from old information disclosure mandates. These laws also place new limits on how courts, government agencies, and businesses handle this data. (You can read here a more detailed explanation of these three new California laws; this post is a summary.)

New exemptions from old mandates. Many states require in-state entities to share data with out-of-state entities. States that respect the rights to abortion and gender-affirming health care must create new exemptions from these old laws, for out-of-state investigations of such health care. The new California bills do this to three old California laws that (1) require certain California digital service providers to treat out-of-state warrants like in-state warrants, (2) require California courts to assist in enforcing out-of-state judicial orders, and (3) require California health care providers to disclose certain kinds of medical information to certain kinds of entities.

New limits on judges. Under the new California laws, state judges cannot authorize wiretaps, pen registers, or search warrants, if they are for the purpose of investigating abortions that are legal in California. Also, state judges now cannot compel someone to identify a person who had an abortion, or issue a subpoena, in connection with an out-of-state investigation of an abortion that is legal in California.

New limits on state agencies. California’s state and local government agencies, including but not limited to law enforcement and prisons, are now barred from disclosing information to an individual or out-of-state agency regarding a person’s abortion or gender-affirming health care.

New limits on communication services. There is a new rule for California corporations, and corporations with principal offices in California, that provide electronic communication services. They shall not, in California, provide information or assistance in response to out-of-state court orders concerning abortions that are legal in California. However, such a corporation is not subject to liability unless it knew or should have known that the court order in question related to such an abortion.

Three cheers for California! These new data sanctuary laws are strong protections for people seeking abortion and transgender health care. Other pro-choice and pro-trans states should enact similar laws.

But more work remains.

Anti-abortion and anti-trans sheriffs will continue to seek information located in the Golden State. California lawmakers must enact new laws as needed. For example, they may need to add new exemptions to an old law that authorizes state courts to command residents to travel out-of-state to testify in criminal proceedings. Eternal vigilance is the price of data sanctuary. States should also be data sanctuaries for immigrants.

Also, Congress and the states must enact comprehensive consumer data privacy legislation that limits how businesses collect, retain, use, and share our data. A great way to stop anti-choice and anti-trans sheriffs from seizing data from businesses is to stop these businesses from collecting and retaining this data in the first place. Legislators should start with Rep. Jacobs’ My Body, My Data bill.

Finally, Congress and the states must limit how law enforcement agencies obtain our data from businesses. For example, police across the country are using “reverse search warrants” to identify all people who used particular keywords in their web searches, and all people who were physically present at a particular geolocation. Anti-abortion sheriffs might use them to identify all people who searched the web for “abortion pill,” or who visited an abortion clinic. These schemes violate the Fourth Amendment. Legislators must ban them; New York State legislators tried to do so last year. Likewise, police across the country are buying detailed location data, often without a warrant, from data brokers who got it from our phone apps. This also violates the Fourth Amendment. Legislators should ban it, too.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2022.

Adam Schwartz

A Year in Internet Surveillance and Resilience: 2022 in Review

2 months 3 weeks ago

This year, we have seen an array of different ways governments around the world have tried to alter basic security on the web for users. Much of this was attempted through legislation, direct network interference, or as a request directly from a government to internet governance authorities. On the other hand, we have also seen new anti-censorship mechanisms assist people so that they can regain access to the wider world, providing hope in really dark times.

EU's Digital Identity Framework

While the European Union’s eIDAS (electronic IDentification, Authentication and trust Services) framework and law is not new and has been in effect since 2014, several amendments proposed in the European Parliament have sparked new conversations and concerns. As a top example, there is a proposed amendment to Article 45 that we believe could fundamentally alter the web trust model as we know it. The amendment would require that web browsers trust third parties designated by the government, without necessary security assurances.

EFF went over the implications and concluded that it is a solution in search of a problem. The proposal would mandate expensive Qualified Web Authentication Certificates (QWACs) for websites, instead of cheaper or free certificates, as the safest option for communication on the web; and in a worst-case scenario, it could leave users vulnerable to malicious activity by government-based Certificate Authorities (or Qualified Trust Service Providers/QTSPs).

On December 6, 2022, the Council of the European Union adopted the original amendment language despite proposals from several committees in the European Parliament that would allow browsers to protect users in light of a security threat by a QTSP. The ultimate decision lies with the Industry, Research and Energy Committee (ITRE), and we urge that the final vote ensure browsers can continue to block certificate authorities that don't meet security standards, especially when the EU itself is facing various issues around democracy in its member states.

Wartime Internet

The Russian invasion of Ukraine brought multiple issues around government blocking, censorship, and security risks, both within Russia and beyond. Inside the country, VPNs and anonymity protocols like Tor have been blocked, most likely to deter dissent and monitor people’s traffic.

Heavy foreign sanctions added another layer to the fragmentation of the Russian internet. As businesses cut ties, certificate authorities stopped issuing new certificates to websites with a Russian top-level domain (like .ru). This created space for the Russian government to step in with its own “Russian Trusted Root CA” to fill the gap, paving the way for the lasting “Splinternet” Russia ultimately aspires to. Meanwhile, the Ukrainian government asked the Internet Corporation for Assigned Names and Numbers (ICANN), the US-based international nonprofit that oversees the global system of internet domain names and IP addresses, to cut Russia’s top-level domains off from the rest of the internet entirely. We explained why granting this request would not just harm wrongdoers but degrade security on the web for everyone. Thankfully, ICANN declined.

Uprising in Iran

On September 13, 2022, Mahsa Amini, a 22-year-old Kurdish woman visiting Tehran with her family, was arrested by “morality” police officers; she died in custody three days later. Since then, large swathes of the Iranian people have sustained protests across the country, and in response the government has blocked many online services. As in Russia, Iran’s efforts to filter domestic online traffic are not new; they are part of an ongoing effort to deter dissent and lock out important information from the outside world. Back in March, EFF joined more than 50 other organizations in a letter urging the Iranian government to rescind the draconian “Regulatory System for Cyberspace Services Bill,” which violates basic rights to privacy and freedom of expression. While the bill has not been ratified, parts of it are suspected to have been implemented already. With recent, proven incidents of internet censorship, the government has already crossed that bridge toward a host of human rights violations.

Anti-Censorship Tools Progress

In Iran, for example, we have seen new forms of blocking aimed at modern protocols and the popular endpoints that support them, such as encrypted DNS and HTTP/3. While we are worried about how creatively governments are evolving to block network traffic, we are also optimistic about developments that help activists get their message out and communicate with others.

One tool that has seen a surge in popularity is Snowflake, which helps people in countries where Tor is blocked connect to it by making their traffic appear innocuous. You can learn how to “become a Snowflake” and help people under censorship connect to the open web in our post. Speaking of Tor, the Tor Browser has also added a new automatic Connection Assist feature that connects to Tor bridges, including Snowflake, if Tor is blocked in your region.

As reports came in that Signal was being blocked in Iran, Signal president Meredith Whittaker put out a call for Signal proxies, with an easy guide on how to create and host one and help people reconnect to the platform securely. While these proxies can be blocked if discovered by government censors, there are ways of discreetly sharing their addresses, as explained in the guide.

Lastly, this year the Open Observatory of Network Interference (OONI) rolled out a new online course, with the human rights training platform Advocacy Assembly, on using OONI’s tools to measure censorship and view real-time data on frequently blocked websites and services like WhatsApp. This could aid open research into more granular cases around the world that might otherwise be missed.

While internet censorship at the governmental level is tough to combat, we hope to see continued innovation that keeps these technologies open and available to the public around the world. Part of that means keeping internet security strong everywhere, not just in countries traditionally thought of as authoritarian. Promoting and defending end-to-end encryption and ubiquitous encryption on the web, even where internet security is strongest, will help where it is at its weakest.

Alexis Hancock

Global Cybercrime and Government Access to User Data Across Borders: 2022 in Review

2 months 3 weeks ago

Since the new UN cybercrime treaty began to take shape in 2022, EFF has been fighting on behalf of users to make sure content-based crimes are excluded from the Treaty, and robust human rights safeguards and rule of law standards are the basis of any final product.

There’s a lot at stake—the proposed UN cybercrime treaty has the potential to rewrite criminal laws around the world, adding new offenses and creating new police powers for both domestic and international investigations, and implicating the rights of billions of people worldwide.

Our push for human rights safeguards in the UN treaty builds on a campaign, ongoing since 2013, to strengthen human rights protections in government investigative powers. In 2017, that effort led us to advocate for changes (through submissions and testimony) to what is now the approved Council of Europe Second Additional Protocol to the Budapest Cybercrime Convention. The Protocol, approved in May 2022, is another instrument expanding cross-border access to potential evidence in criminal investigations.

We raised concerns that the Protocol not only fails to require adequate oversight, but even creates government powers that bypass existing accountability mechanisms. Unfortunately, our core concerns about its weak privacy standards were not addressed, and Member States at the Council of Europe approved it without robust safeguards. Existing signatories of the Budapest Convention have been invited since May 2022 to sign the new Protocol; the United States and 29 other countries have already done so. Next, countries will have to implement its provisions, which for many will require reforms to their domestic criminal law. The Protocol will enter into force once five countries have ratified it.

But we haven’t retreated. As the battle moves to the implementation phase, we released a comprehensive overview and guide about the new Protocol for countries in Latin America, as well as a handy outline of key issues Latin American civil society organizations can raise in urging the government to carefully consider the implications of acceding to the treaty.

2022-2023: The UN Cybercrime Battle Continues

And now a new debate has begun at the United Nations. While the Council of Europe mostly excluded civil society, and even privacy regulators, from timely participation in negotiating and drafting the Protocol, EFF and other human and digital rights organizations have had a seat at the table as the UN convened meetings to begin work on its cybercrime treaty. Civil society successfully persuaded the UN Ad-Hoc Committee overseeing the process to approve the participation of EFF and other nongovernmental organizations, and has advocated for the process to be broadened even further.

While we don’t think the UN Cybercrime Treaty is necessary, we have nevertheless been closely scrutinizing the process and providing constructive analysis, which will continue in 2023. We’ve made clear that human rights must be baked into the treaty so that it doesn’t become a tool to stifle freedom of expression, infringe on privacy and data protection, or endanger vulnerable people and communities. Since January 2022, in presentations at four meetings in New York and Vienna, we’ve asked Member States to better protect human rights in the treaty.

Even before UN negotiators held their first meeting in February, EFF and over 134 organizations and academics from around the world urged members of the Ad-Hoc Committee to include human rights considerations at every step of the drafting process. We told the committee:

The goal should be to combat the use of information and communications technologies for criminal purposes without endangering the fundamental rights of those it seeks to protect.

Because privacy and human rights standards vary dramatically among Member States, we made a statement at the Ad-Hoc Committee’s March meeting expressing concern that the investigative powers adopted in the treaty will accommodate the worst police surveillance practices across participating states. EFF Policy Director for Global Privacy Katitza Rodriguez told the committee:

“There is a real risk that, in an attempt to entice all States to sign a proposed UN cybercrime convention, bad human rights practices will be accommodated, resulting in a race to the bottom.”

Many countries’ early proposals alarmed us; some of the most concerning suggestions can now be found in the first draft text of the treaty, formally entitled “the consolidated negotiating document” (CND), which the Committee published in November. (The CND hasn’t been the subject of negotiations yet, but those negotiations will begin at the next Ad-Hoc Committee session on January 9th.) It includes a range of scary ideas, both for crimes and criminal procedure (that is, both for new offenses and for new law enforcement powers). We've told Member States that the draft includes:

“a long list of offences that are not core cybercrimes, offences that interfere with protected speech and fail to comply with permissible restrictions under international freedom of expression standards, or offences drafted with vague or overbroad language.”

Some proposals present in the draft would essentially call for states to criminalize “using a computer in a crime,” covering actions that are already illegal. We’ve maintained that cybercrimes should be understood as those that specifically target computer systems, and that the treaty should require fraudulent intent on the part of the accused person. EFF’s long experience with computer crime laws in the U.S. has shown, again and again, how dangerous a broadly written law can be when it lacks any requirement of malicious or fraudulent intent and harm. Such laws can be used against anyone who did something with a computer that someone else didn’t like, even with no intent to cause any harm, and are often abused to punish security researchers or journalists.

Other proposals call for states to treat various kinds of speech, much of it fully protected under international human rights law, as cybercrime. We’ve supported the Office of the High Commissioner for Human Rights’ key messages recommending that any “future agreement on cybercrime should avoid including offences based on the content of online expression (“content offences”).”

Some police powers proposed in the CND are also concerning. Our most recent letter on the CND says, in short:

  • New investigative powers should only be available for bona fide investigations of crimes covered by the treaty.
  • By default, people should be able to learn if their data was handed over. Authorities should be able to impose gag orders only when disclosure would pose a demonstrable threat to an ongoing investigation.
  • All new powers should come with matching human rights safeguards—with teeth.
  • General provisions authorizing interception and real-time collection of data should be amended to clarify that they do not authorize hacking into networks and end devices. 
  • The text should not authorize any indiscriminate or indefinite retention of metadata.

Unfortunately, the CND fell short of many of these recommendations. It is overbroad in scope and not restricted to core cybercrimes. It includes provisions that are insufficiently clear and precise and that would criminalize activity in ways inconsistent with international human rights standards and principles.

Meanwhile, the CND’s criminal procedural and law enforcement chapter lacks robust human rights safeguards, while its substantive provisions expand the scope of criminal intent and conduct, threatening to criminalize legitimate activities of journalists, whistleblowers, security researchers, and others. We are particularly concerned about the inclusion of content crimes such as “extremism-related offences” and “terrorism-related offences.”

The CND includes one Article on respect for human rights and the inclusion of gender perspectives. But it does not go far enough to ensure that respect for human rights is carried through the other provisions of the proposed Convention.

While disappointing, the text will go through more revisions in 2023, and we will continue to push for changes. EFF and Human Rights Watch submitted comments to the Committee in December, voicing strong concerns about the CND’s shortcomings and recommending that the Committee tighten its focus, exclude a number of troubling provisions, and strengthen human rights safeguards.

A fourth ad hoc session—the halfway mark in the negotiating process, which aims to conclude sometime in 2024 with the finalization and approval of a draft text of the convention—is scheduled for mid-January 2023. EFF and its allies will be there to ensure that human rights are at the center of the discussions and the next draft is aligned with the principles and standards that are crucial to protect the fundamental rights of those who will be subject to the treaty for decades to come.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2022.

Katitza Rodriguez

Fighting for the Digital Future of Books: 2022 in Review

2 months 4 weeks ago

Informed citizens need comprehensive libraries that meet people where they are.  Today, that means online spaces that welcome everyone to use their resources, invite them to create new and truthful works, and respect the interests of both authors and readers. 

EFF client Internet Archive has created one of those spaces. Through Controlled Digital Lending (“CDL”), the Internet Archive and other nonprofit libraries make and lend digital scans of print books in their collections, at no cost to their patrons.  CDL allows people to check out digital copies of books for two weeks or less, and only permits patrons to check out as many copies as the Archive and its partner libraries physically own. That means that if the Archive and its partner libraries have only one copy of a book, then only one patron can borrow it at a time, just like any other library. Through CDL, the Internet Archive is helping to foster research and learning by helping its patrons access books and by keeping books in circulation when their publishers have lost interest in them.
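The “owned-to-loaned” constraint described above can be sketched in a few lines (an illustrative model, not the Internet Archive’s actual system; real CDL implementations also handle waitlists, loan periods, and DRM-protected files):

```python
# Minimal sketch of the Controlled Digital Lending "owned-to-loaned"
# rule: a title never has more simultaneous digital borrowers than the
# number of physical copies the lending libraries own.

class CDLTitle:
    def __init__(self, owned_copies: int):
        self.owned_copies = owned_copies   # physical copies held
        self.checked_out = 0               # digital loans outstanding

    def borrow(self) -> bool:
        """Lend a digital copy only if an owned physical copy backs it."""
        if self.checked_out < self.owned_copies:
            self.checked_out += 1
            return True
        return False                       # patron must wait, as with print

    def return_copy(self) -> None:
        if self.checked_out > 0:
            self.checked_out -= 1

book = CDLTitle(owned_copies=1)
assert book.borrow()        # first patron gets the single copy
assert not book.borrow()    # second patron must wait, like any library
book.return_copy()
assert book.borrow()        # the copy circulates again after return
```

The sketch makes the legal argument’s factual premise visible: at no point does lending exceed the copies owned, which is why CDL mirrors traditional library lending rather than unlimited digital distribution.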

CDL is fundamentally the same as traditional library lending; it’s just another way of getting a book to the one patron who borrowed it.  But four of the biggest publishers in the world want to shut it down. In 2020, they sued the Internet Archive for copyright infringement. In 2022, both sides filed briefs asking the court to decide the question as a matter of law. Supported by authors, libraries, and scholars, the Internet Archive’s briefs explained that CDL is a lawful fair use that serves copyright’s ultimate purpose: enriching our common culture. The publishers, for their part, claim that the Internet Archive's CDL program harms their ebook licensing market. But their theory has a fundamental flaw: even with unlimited access to years of sales data, they cannot point to a dime they have lost or are likely to lose because of the Internet Archive’s digital lending.

The outcome of this case is likely to define the future of books in the U.S. CDL makes it easier for patrons who live far from a brick-and-mortar library, or who have print disabilities, to access books.  But that's just the beginning. The Internet Archive’s CDL program also helps fight disinformation by facilitating ongoing easy access to authoritative sources for Wikipedia articles.  It helps fight censorship by giving librarians a way to curate and share books banned by local school districts.  Like all library lending, it helps the public discover new works that they love enough to purchase their own copies. Digital lending also makes it possible for patrons to access books without having their reading habits tracked by commercial entities, like OverDrive and Amazon, that may not share librarians’ traditional commitment to protecting privacy. Perhaps most importantly, it gives librarians the power to curate their own digital collections, just as they curate their physical collections.

If the publishers have their way, however, books, like an increasing amount of other copyrighted works, will only be rented, never owned, available subject to the publishers’ whim, on their terms. This is not a hypothetical problem, as students at Georgetown, George Washington University, and the other members of the Washington Research Library Consortium learned last fall, when they discovered that 1,379 books could no longer be borrowed in electronic form. The books had disappeared from those libraries’ virtual shelves because the publisher had decided to stop licensing them to the academic library market. After a public backlash, the publisher backed down, but an ominous message was sent. As more than a thousand authors said in a recent open letter: “We fear a future where libraries are reduced to a sort of Netflix or Spotify for books, from which publishers demand exorbitant licensing fees in perpetuity[.]”

We should all share that fear. Librarians know what their patrons need, which is why they should be in charge of determining what to include in their collections,  now and in the future. Libraries pay publishers under either approach—but digital lending lets libraries make their own decisions about which books to circulate physically, and which to circulate digitally instead. Librarians can continue to maintain permanent collections of books, to preserve those books in their original form for future generations, and to lend them to patrons one at a time, as they have always done.

EFF is proud to defend the Internet Archive in this case, and we will keep fighting for a robust and vibrant digital world for libraries, authors, and readers alike.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2022.

Related Cases: Hachette v. Internet Archive
Corynne McSherry

Seeing Patent Trolls Clearly: 2022 in Review

2 months 4 weeks ago

The internet can be a powerful tool for communicating, collaborating, and finding community. But lawsuits and threats from patent trolls have been an obstacle to the dream of a free and open internet. That’s why EFF has been fighting back against them for more than 15 years. 

Patent trolls are companies that are focused on suing and threatening over patents, not on offering actual goods or services. Very often, they use software patents to sue over basic business processes, like making picture menus or taking event photographs. It’s all too easy to get patents on things like this because the patent system is just a bad fit for software.

In the first three quarters of 2022, patent trolls filed 64% of all patent lawsuits; in the high-tech space, they filed 88%.

Defending Our Progress 

Since EFF started working to improve the patent system, we’ve seen a few big steps forward, like the introduction of inter partes review (IPR) and the 2014 Alice v. CLS Bank Supreme Court ruling.

The Alice precedent, which is more than 8 years old now, has done a good job of knocking out many software patents that never should have been issued in the first place. 

EFF’s “Saved by Alice” project has highlighted some of the small businesses and individuals who have benefited from the more balanced approach courts have taken since the Alice ruling. 

However, patent trolls and companies that extensively license patents have long lobbied against Alice. This year, we saw the attack we knew would come eventually: an extreme bill, the “Patent Eligibility Restoration Act,” that would have eviscerated Alice completely and allowed some of the worst software patents to make a comeback. It would even have legalized the patenting of human genes, a practice ruled out under current Supreme Court precedent.

This bill didn’t advance in the current Congress, in part because many EFF supporters spoke out about it and contacted their Senators. But because patent trolls and other pro-patent extremists have so much to gain by chipping away at Alice, we don’t expect it will be the last attempt to turn back some of the progress we’ve made. 

Two Big Steps Forward for Sunshine in the Patent System 

In September, EFF got a big win in our long-running case seeking to unseal records related to Uniloc’s patent trolling. This case began in 2018 as an effort to understand heavily redacted filings in a patent infringement case between Uniloc, a patent troll, and Apple. 

Thanks to our litigation, the great majority of Uniloc’s previously secret court records are now public. That includes most of a table of licensing agreements that Uniloc used to convince a private equity firm called Fortress to fund its patent-trolling activities. 

In November, EFF got involved in a case where several patent troll companies were under scrutiny by a Delaware federal judge, who was concerned that the true owners of the patent trolls had “perpetrated a fraud on the court.”  The trolls’ lawyers sought to have an appeals court shut down the investigation. EFF filed a brief in that case to explain why it’s critical that judges be allowed to demand more information about the patent troll companies that are utilizing our public courts for their business. The U.S. Court of Appeals for the Federal Circuit accepted EFF’s brief, denied the petition brought by patent troll Nimitz Technologies, and allowed the investigation to proceed. 

Because of cases like the ones EFF got involved in, federal courts are increasingly demanding more disclosure in patent cases, including disclosures about litigation funding. That’s a positive trend. Patent trolls rely on secrecy to perpetuate their business. When the public and elected officials learn more about how they operate, it becomes clear that our patent system needs big changes in order to be a real public benefit. 

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2022.

Joe Mullin

Reproductive Justice and Digital Rights: 2022 in Review

2 months 4 weeks ago

Reproductive justice and safe access to abortion, like so many other aspects of our healthcare, are fundamentally tied to our digital lives. And because it is healthcare, we should be able to keep it private and access information about it, even on a digital device. Actually, especially then: our devices contain a vast amount of highly sensitive personal information, and we all now turn to our phones to find information and share our experiences. Ever since it was first rumored that the Supreme Court would overturn Roe v. Wade, EFF has been working to make sure lawmakers and those seeking abortions know exactly what information resides in the digital world and how it could be shared, or censored, without permission.

That meant that when Dobbs v. Jackson overturned the protections that Roe promised to people seeking abortions and other reproductive healthcare, we were prepared. We were prepared to answer questions about what exactly your phone knows, Google knows, and Facebook knows. And how that information could be obtained. The sudden disappearance of federal protections, combined with a growing number of “bounty laws” targeting support for such care, raises a host of concerns regarding data privacy and online expression. And this expanded threat to digital rights is especially dangerous for BIPOC, lower-income, immigrant, LGBTQ+ people, and other traditionally marginalized communities, and the healthcare providers serving these communities.

The repeal of Roe created a lot of new dangers for people seeking healthcare. This past year, EFF has worked to protect your rights in two main areas: 1) your data privacy and security and 2) your right to free speech.

Data Privacy and Security

With law enforcement looking to punish those who seek abortions, your digital paper trail is now potentially incriminating evidence. Google Maps data can tell police whether you searched for the address of a clinic. Chat logs can show whether you talked about abortion with someone. A digital dragnet can give police the names of everyone in the vicinity of a place suspected of offering abortion services. These are just a few examples of things law enforcement already does in other criminal contexts and can now do with regard to reproductive health. The good news is that EFF has a lot of experience fighting these fights, so our initial efforts focused on protecting the data privacy and security of people seeking, providing, and facilitating abortion access.

First, we assembled data privacy guides for anyone potentially affected: patients seeking reproductive healthcare, clinics and health professionals, and those involved in the abortion access advocacy movements. We also posted a short video on good security practices for those who might be targeted by anti-abortion laws.

We then provided a principled guide for tech companies to respect user privacy and rights to bodily autonomy. And we called on nonprofit organizations to remove trackers from their websites—highlighting the dangers they could create for people seeking information on abortions.

EFF has also been working with legislators on common-sense privacy legislation to protect not only health-related data but the full range of consumer data that could be weaponized against abortion seekers.

At the state level, we’ve supported data sanctuary legislation to prevent data in pro-choice states from being used for abortion ban enforcement in other states. For example, we worked with California lawmakers to pass A.B. 2091 and A.B. 1242—marking a crucial step forward in making California a digital sanctuary state for people seeking reproductive healthcare.

In Congress, EFF supported Rep. Sara Jacobs’ “My Body, My Data” bill, which would limit how businesses collect, retain, use, and share reproductive health data.

We’ve also pushed for legislation that limits law enforcement access to privately held data, particularly through geofence and keyword warrants and purchases from data brokers. We explained what policymakers can do to reduce the risk that automated license plate reader data and location data are used to track abortion seekers and providers. And we called on the Administration to prevent federal aid from being used by state and local law enforcement to investigate and prosecute reproductive health choices.

First Amendment

The criminalization of abortion threatens not only our right to privacy but also our right to speak freely online.

It also makes things very confusing for regular internet users. Almost every platform has rules against “promoting” “illegal” activities. If abortion is legal in some states but not others, how does that affect what can be said and seen across the country? And many states have gone further than simply banning abortion to also policing what people can say and see online.

To mitigate this threat, EFF has warned against anti-choice states’ efforts to restrict the exchange of abortion-related information online. We explained how Texas’s anti-abortion law violates the First Amendment rights to advocate for reproductive choice, to provide basic educational information, and to counsel those considering reproductive decisions. We joined others in successfully opposing a South Carolina bill that would have made it a crime to discuss abortions online. And we emphasized the importance of pressuring platforms to resist government pressure to remove speech that could be interpreted to aid or abet access to abortion care.

Reproductive health has become an increasingly important attack vector for digital rights, and therefore an increasingly important priority for EFF. There’s a lot more to do, and we know this will be a vital part of EFF’s work for years to come.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2022.

Jennifer Pinsof

Schools and EdTech Need to Study Up On Student Privacy: 2022 in Review

2 months 4 weeks ago

In 2022, student privacy gets a solid “C” grade. The trend of schools surveilling students did not let up: more schools than ever are spying on students through EdTech software and other means. There were, however, some small wins, indicative of a growing movement to push back against this encroachment.

In an important decision, a federal judge deemed a remote proctoring “room scan” by a public university, Cleveland State University in Ohio, an unreasonable search under the Fourth Amendment. “Room scans,” in which students are forced to use their device’s camera to give a 360-degree view of everything around the area where they’re taking a test, are among the most invasive aspects of remotely proctored exams. Often these scans are done in a personal residence, and frequently in a private space, like a bedroom.

The district court recognized that room scans provide the government (public schools are government entities) with a window into our homes—a space that “lies at the core of the Fourth Amendment’s protections” and has long been recognized by the Supreme Court as private. Warrantless searches of the home are presumptively unreasonable, with few exceptions, and none of the justifications offered by the university—including its interest in deterring cheating and its assertion that the student could have refused the scan—sufficed to outweigh the student’s privacy interest in this case. Though this decision isn’t binding on other courts, any student of a state school hoping to push back against room scans in particular could now cite it as persuasive precedent. The school is expected to appeal to the Sixth Circuit.

EFF began looking more closely at student activity monitoring software, which is nearly indistinguishable from spyware and is used to filter, block, and flag vast amounts of student activity on school-issued, and sometimes personal, devices. We already know that the machine-learning algorithms in this software routinely misclassify LGBTQ+ content as “Adult” content. We know that Securly flags “Health” sites (like WebMD) as “needs supervision,” and GoGuardian blocks access to reproductive health materials. It isn’t difficult to see the harms that will occur as more anti-trans laws pass and now that the legal right to abortion has been overturned: students who use their devices to research topics such as trans healthcare or abortion could find those devices weaponized against them, potentially resulting in criminal charges. Moreover, there are already examples of these apps outing LGBTQ+ students.
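The overblocking described above is easy to reproduce with the crude keyword matching many filters rely on. A toy illustration (the blocklist and page titles are hypothetical; commercial products use more elaborate, but similarly error-prone, classifiers):

```python
# Toy content filter: naive keyword matching flags legitimate health
# and support resources alongside the content it is meant to catch.
# The blocklist terms here are hypothetical examples.

BLOCKLIST = {"sex", "drugs", "suicide"}

def is_flagged(page_title: str) -> bool:
    """Flag a page if any blocklist term appears in its title."""
    words = page_title.lower().split()
    return any(term in words for term in BLOCKLIST)

# Educational and crisis-support pages trip the same filter as the
# material administrators actually want blocked.
assert is_flagged("Sex education resources for teens")
assert is_flagged("Suicide prevention hotline information")
assert not is_flagged("Algebra homework help")
```

Because the filter has no notion of context or intent, the only way to catch “bad” pages is to also catch the health, identity, and safety resources students may most need, which is exactly the misclassification pattern documented above.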

In 2021, we fought an uphill battle against schools such as Dartmouth, whose administration unjustifiably accused students of cheating based on a misinterpretation of data from Canvas, a “learning management system” (LMS) platform that offers online access to class material. Though the Dartmouth administrators backed down, we’ve since heard from students at other schools facing similar accusations. To continue educating schools on the inaccuracy of LMS activity logs, we called on both Canvas and Blackboard to put clearer disclaimers on their log data and to publicly defend any student accused of misusing these platforms based on similar data misinterpretations. We also asked schools to remove any marks on student records that were based on LMS data, and to adopt a clear policy against using it in the future. We will continue to call for these changes; schools still overvalue the reliability of these logs, and though Canvas and Blackboard have confirmed to us or on their sites that they do not recommend using them for disciplinary investigations, they have not made these disclaimers prominent.

Although many students have gone back to in-person learning, remote proctoring remains a concern because it is not going away. Along with Privacy Rights Clearinghouse, we sponsored legislation in California, the Student Test Taker Privacy Protection Act (STTPPA), that would have directed proctoring companies to follow reasonable data minimization practices, meaning they could not collect, use, retain, or disclose test takers’ personal information except as strictly necessary to provide proctoring services. With the STTPPA codified into law, if a student’s data were processed beyond what was required to proctor the exam, the student would have had the opportunity to take the proctoring company to court. Unfortunately, the bill was weakened before it was signed into law, and it no longer offers individual students the right to sue companies for violations. EFF dropped our support of the bill, because we believe strongly that a private right of action is necessary for consumer privacy laws such as this to have teeth.

In another remote proctoring fight, EFF client Erik Johnson, a Miami University computer engineering undergraduate, reached a settlement in the lawsuit we brought on his behalf against exam surveillance software maker Proctorio, in a victory for fair use of copyrighted material and people’s right to fight back against bad faith Digital Millennium Copyright Act (DMCA) takedowns used to silence critics. Johnson, who is also a security researcher, sued Proctorio after it misused the copyright takedown provisions of the DMCA to remove his comments about security flaws with the service that he posted on Twitter. Under the settlement, Proctorio dropped its copyright claim and other claims it had filed blaming Johnson’s advocacy for damaging its reputation and interfering with its business. In return, Johnson dropped his claims against Proctorio. Johnson’s tweets, which were restored by Twitter through the DMCA’s counter-notice process, will remain up.

In an expansion of our student privacy advocacy, EFF also homed in on our youngest students—those in daycare. EFF’s investigation found that early education and daycare apps have several troubling security risks, and in addition to detailing these flaws, we urged the Federal Trade Commission to look into the issue. Learn more about our fight to improve daycare app privacy in 2022.

We have big plans for the coming year’s fight to protect student privacy, and look forward to further highlighting the problem of student activity monitoring software, in particular. For now, if you’re interested in learning more about protecting your privacy at school, take a look at our Surveillance Self-Defense guide on privacy for students.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2022.

Jason Kelley

Ending the Scourge of Redlining in Broadband Access: 2022 in Review

2 months 4 weeks ago

EFF’s first ask for the incoming Biden Administration on broadband policy was to ban digital redlining by regulating broadband as a public good instead of a private luxury. EFF has extensively researched the state of fiber broadband infrastructure in the United States for years. We’ve identified a disturbing trend: systemic underinvestment in the networks that serve low-income communities. Major broadband providers have been segregating internet users into first-class fiber internet and second-class legacy internet, even in areas where it would be profitable to provide equal access.

This trend did not happen overnight, but rather developed over more than a decade. Fixing it will take years. But the opportunity to address this problem now sits before the Federal Communications Commission (FCC), which Congress, through a new law, has tasked with ending discrimination in broadband access.

A New Federal Law Bans Discrimination in Broadband Infrastructure

Enormous government efforts to end the digital divide have begun. These include the EFF-supported infrastructure law in California. Congress also passed the Infrastructure Investment and Jobs Act, which includes a national broadband infrastructure plan. One key provision that made it through Congress, despite last-ditch efforts by big Internet Service Providers (ISPs) to remove it, was a ban on discrimination in broadband deployment, otherwise known as the digital discrimination rule. If implemented fully, it will transform broadband into a utility akin to water and electricity by prohibiting profiting from discrimination.

At the start of the pandemic, EFF predicted that legacy networks that lack fiber infrastructure would suffer from the increased usage, driven by remote uses such as education. Legacy networks have a physical limit on how much traffic they can handle and are more expensive to maintain. Ask any school district in areas the federal government considers “fully served” (an exceedingly low bar to hit) and they will share countless stories of low-income families with inferior access unable to obtain remote education.

This is because not all broadband access is equal in terms of capacity. When wealthy neighborhoods needed substantially more bandwidth to handle the pandemic, new fiber lines in those neighborhoods handled those increases with almost no stress to the system. Meanwhile, legacy networks were bogged down because they weren’t designed for those increased uses. Our technical analysis of various transmission mediums explains why such a disparity exists between future-proof fiber and legacy options such as copper DSL.

Large ISPs that serve major cities have no excuse for allowing this disparity to continue; it would be profitable for them to fully serve those areas with fiber infrastructure. However, they instead chose to reinvest in wealthier (and, often, whiter) households because of their higher profit potential. ISPs categorized the homes of low-income residents (often people of color) as areas of modest profit potential by comparison. Therefore, they took the revenues from those customers and diverted them to improve the connections of the wealthy. This diversion of profits from low-income households has gone on for years, and it has systemically given many poorer Americans internet that is not only slower but also more expensive, while the wealthy get faster and cheaper access through fiber.

Put another way, ISPs are leaving low-income people with the “used car” of the internet while buying the “new car” for the wealthy. A proper interpretation of the new federal antidiscrimination law will make this illegal. It will also force potentially billions of dollars in revenues back into the networks of low-income people and pay to convert those networks to 21st-century-ready fiber wires.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2022.

Ernesto Falcon

The Year We Got Serious about Tech Monopolies: 2022 in Review

3 months ago

2022 has been a big year for enforcement of the antitrust laws against tech companies, with the five largest (Apple, Google, Meta/Facebook, Amazon, and Microsoft) all facing lawsuits or investigations in the US. Government scrutiny of tech company mergers is on the rise too: the Federal Trade Commission has challenged Meta’s acquisition of VR fitness company Within Unlimited and Microsoft’s purchase of Activision Blizzard. Congress has debated a series of potential new laws to address the harmful effects of market power in Internet markets, and the European Union has actually done it, passing a sweeping new Digital Markets Act.

As they address tech monopolies, courts and enforcement agencies are beginning to acknowledge the interplay of user privacy and security concerns with antitrust, thanks in part to EFF’s advocacy. In February, we explained to a federal appeals court that Apple’s total control over apps on its mobile devices is not necessary to keep users safe, and in fact makes many users less safe.

Antitrust cases against the tech giants still face serious obstacles from a judiciary that’s become increasingly hostile to claims of monopoly abuse. EFF’s brief was filed in Epic Games’ challenge to Apple’s restrictive App Store policies, a case that was thrown out by a district court and is now awaiting an appeals court ruling. The FTC’s challenge to Facebook (now Meta) over its history of acquiring potential competitors like Instagram and WhatsApp has faced similar obstacles. We’ve also seen some small but significant wins, including suits against legal research provider Westlaw and computer gaming giant Valve getting past their initial legal hurdles.

It’s important that antitrust enforcers persist in their efforts because we can’t count on internet platforms and services that don’t face real competition to safeguard users’ rights. Sometimes they fail spectacularly. And even when they do a good job of protecting users, their protection is fickle, able to be stripped away with the whims of a mercurial CEO, or when cooperation with government surveillance suits their business interests.

Even while government enforcers and private parties make their cases in court, smart new laws are needed to address the unsolved problems of tech monopolies. The EU is attempting this in earnest with its new Digital Markets Act, a sweeping regulation that designates some of the largest online platforms as “gatekeepers” and places new obligations on those companies to protect business users’ ability to compete. The DMA includes interoperability requirements for these gatekeepers. Interoperability is a vital tool to empower consumers, but the EU’s decision to focus first on messaging apps like WhatsApp and iMessage raises concerns about those apps’ ability to continue providing secure, end-to-end encryption. In 2023, EFF will be working with the EU’s enforcement arm to protect secure messaging as we fight monopolies.

The United Kingdom is also stepping up its efforts with the creation of a new Digital Markets Unit within its competition authority. EFF has written in support of new enforcement powers for that agency, and to support their investigation into the role of mobile web browsers as platforms for app competition.

Meanwhile, in the US, a slew of proposals for new laws has not led to much progress. The American Innovation and Competition Online Act, the Open App Markets Act, and the ACCESS Act all contain important elements of a new pro-competition regime for online platforms, but none has been enacted yet. The Digital Advertising Act would take a different approach, requiring the various roles within online advertising markets to be filled by independently run companies, removing incentives to cheat and drive down web publishers’ income. It too has stalled in Congress.

One thing that this year’s battles over new competition policy for tech have shown us is that creating new antitrust exemptions for favored industries, no matter how important they are, is not the way to fight tech monopolies. The Journalism Competition and Preservation Act, while billed as a way to fund journalism in the internet age, would only give more market power to highly consolidated media conglomerates and their Big Tech allies. That’s why we’re disappointed that Congress has spent so much legislative time on JCPA that could have been used to perfect and pass important fixes like ACCESS.

We can have privacy, security, and competition too. In fact, we must have competition to protect privacy and security for the long haul. This year, we saw an incredible amount of energy, legal maneuvers, and smart new thinking directed toward fixing tech’s monopoly problem. If we persist, we can win lasting change in 2023 and beyond.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2022.

Mitch Stoltz