The EU’s Digital Markets Act: There Is A Lot To Like, but Room for Improvement

After lengthy consultations and many rumors and leaks, the European Commission has released its public draft of the Digital Markets Act (DMA), which, along with the Digital Services Act (DSA), represents the first major overhaul of EU Internet legislation in the 21st century. Like the DSA, the DMA takes aim at the monopolization of the tech sector and proposes sweeping pro-competition regulations with serious penalties for noncompliance.

It is one world. So #DigitalServiceAct & #DigitalMarketsAct will create safe & trustworthy services while protecting freedom of expression. Give new do’s & don’t to gatekeepers of the digital part of our world - to ensure fair use of data, interoperability & no self-preferences.

— Margrethe Vestager (@vestager) December 15, 2020

The DMA addresses itself to “gatekeeper platforms”: very large tech companies that sit between other businesses and their customers and control “core services,” such as search engines, social networking services, certain messaging services, operating systems, and online intermediation services. Think of how Amazon controls access to customers for the merchants that sell on its platform and the manufacturers who make their products, or how the Android and iPhone app stores serve as chokepoints in delivering mobile software. These companies are gatekeepers both because of their business models and because of their scale: it’s hard to imagine making a successful mobile app without going through the app stores. The DMA identifies core platform services as gatekeepers when they have a significant impact on the EU internal market (e.g., through earnings), hold a strong intermediation position (e.g., through their number of users), and show an entrenched and durable position (e.g., number of years in business).

The DMA’s premise is that gatekeepers are international in nature, and EU member states on their own can’t hope to regulate them; it takes an international body like the EU itself to bring them to heel by forcing them to comply with a list of do’s and don’ts. Hence, gatekeepers will need to proactively implement certain practices and refrain from engaging in certain types of unfair behavior. Certain obligations should be complied with by design, while others may be subject to further specification following a dialogue between the Commission and the gatekeepers concerned.

DMA regulations divide gatekeepers’ businesses into “core services,” for example, selling goods on Amazon or apps in an app store, and “ancillary services,” which are the other ways gatekeepers make money, like payment processing and advertising. In general, the DMA hits hardest at ancillary services, for example, by banning platforms from requiring their business customers to use their payment processors.

The Commission’s draft is just a starting point: it will go through many iterations and amendments before it is put to votes at the European Parliament and the Council of the EU (which represents the governments of EU member states). As starting points go, there’s a lot to like in this document, as well as room for improvement. 

Things We Like

Future proofing: The DMA is designed to be updated from time to time, first to address new anticompetitive practices that haven’t been invented yet, and second to create less stringent rules for companies that aren’t big enough to be gatekeepers yet, but are headed in that direction (from the “Explanatory Memorandum”). The rules for initiating investigations are set out in Chapter IV and the enforcement rules are in Chapter V. 

Real penalties and structural remedies: The DMA provides for fines of up to 10 percent of a gatekeeper’s global annual revenue for violating its rules. So, theoretically, a company like Facebook, which had annual revenue of nearly $71 billion (USD) in 2019, could face a $7.1 billion fine. Ongoing infractions can be punished with “periodic penalty payments” of up to 5% of average global daily revenues. Even more importantly: companies that can’t or won’t stop engaging in monopolistic behavior will face “structural remedies,” like being ordered to sell off whole divisions.

A ban on mixing data: The DMA bans gatekeepers from mixing data from data brokers or their business customers with the data they collect on their own users (this is a widespread practice today, with companies like Facebook and Google linking commercially available data with the data they extract from their own users). This rule also bans gatekeepers from automatically signing users into additional services: that would mean that, for example, logging into Gmail wouldn’t automatically log you into YouTube (Article 5(a)).

Protecting multiple prices and terms: Today, platforms impose far-reaching requirements on their business users; for example, a company can’t sell discounted subscriptions to customers who buy directly rather than through the app store. Under the DMA, gatekeepers would be banned from making rules about the selling prices and terms set by their business customers (Article 5(b)).

No more forced single sign-on: The DMA bans gatekeepers from requiring their business customers to use their own login or identity system (Article 5(e)).

No cross-tying: Under the DMA, gatekeepers are banned from forcing business customers and end-users to sign up for “ancillary services,” meaning you can use Android without having to get a Gmail account, or sell in Apple’s App Store without using Apple’s high-priced payment processor (Article 5(f)).

No spying on business customers: Today, platforms gather data on their business customers’ activities to figure out how to compete with them, such as whether and how to clone their products. Under the DMA, gatekeepers will be banned from using this data to compete with business customers (Article 6(a)).

Let a thousand app stores bloom: The DMA requires gatekeepers to permit third party app stores that compete with their own, but it allows gatekeepers to limit these apps’ ability to interfere with “the integrity of the hardware or operating system” (Article 6(c)).

No lock-in: The DMA bans gatekeepers from “technically restricting” users from switching away from default apps. It also bans gatekeepers from locking users into an ISP (Article 6(e)).

Interoperable add-ons: The DMA requires gatekeepers to allow other “ancillary service providers” (like payment processors, cloud hosts, digital identity providers, and ad-tech sellers) to plug into their core services on the same terms that gatekeepers’ own ancillary services enjoy (Article 6(f)).

Data portability and continuous real-time access: Under the DMA, gatekeepers’ business customers and end-users will have the right both to “data portability” (where the gatekeeper gives you all your data in a giant blob you can take to a rival and upload) and “realtime access” (so you can join a rival’s system that can grab all your new messages and data from the gatekeeper every few minutes) (Article 6(h)).

Businesses can access their own data: The DMA requires gatekeepers to allow their business customers to access the data about their sales, customers, and other commercial activity. The access must be “free of charge,” “high quality, continuous and realtime” (from Article 6(i)).

Fair and nondiscriminatory access to app stores: The DMA requires gatekeepers with app stores to accept businesses’ apps on a “fair and nondiscriminatory” basis (Article 6(k)).

Things We’re Worried About

A ban on national regulation: The DMA prohibits EU member states from passing their own laws or regulations on gatekeeper platforms that go beyond the DMA. This rule endangers laws already under debate in EU member states that go further than the DMA, such as Germany’s excellent proposal for expanded interoperability requirements for gatekeepers (Article 1(5)).

No interoperable “core services”: While the DMA provides for interoperability in ancillary services (payment processing, ad-serving, etc), there is no mention of interop for core services: This means that, for example, Facebook might have to let a competitor offer its own payment processing for Oculus apps, but not offer a competing social media network that interoperates with Facebook. The technical term for this is “weaksauce” (Article 6(f)).

Real-time, but not independent, data-portability: The DMA’s requirement for “realtime data portability” looks good, but users can’t take advantage of it unless they have an account on the gatekeeper service. So if you left Facebook for Diaspora and wanted to stay in touch with your Facebook friends using “realtime data-portability,” you’d have to keep your Facebook account and connect it to Diaspora, which means you’d still be subject to the sprawling garbage-novella of abusive legalese Facebook laughably calls its “terms of service” (Article 6(h)).

Cory Doctorow

European Commission’s Proposed Digital Services Act Got Several Things Right, But Improvements Are Necessary to Put Users in Control

The European Commission is set to release today a draft of the Digital Services Act, the most significant reform of European Internet regulations in two decades. The proposal, which will modernize the backbone of the EU’s Internet legislation—the e-Commerce Directive—sets out new responsibilities and rules for how Facebook, Amazon, and other companies that host content handle and make decisions about billions of users’ posts, comments, messages, photos, and videos.

This is a great opportunity for the EU to reinvigorate principles like transparency, openness, and informational self-determination. Many users feel locked into a few powerful platforms and at the mercy of algorithmic decision systems they don’t understand. It’s time to change this.

We obtained a copy of the 85-page draft and, while we are still reviewing all the sections, we zeroed in on several provisions pertaining to liability for illegal content, content moderation, and interoperability, three of the most important issues that affect users’ fundamental rights to free speech and expression on the Internet.

What we found is a mixed bag with some promising proposals. The Commission got it right setting limits on content removal and allowing users to challenge censorship decisions. We are also glad to see that general monitoring of users is not a policy option and that liability for speech rests with the speaker, and not with platforms that host what users post or share online. But the proposal doesn’t address user control over data or establish requirements that the mega platforms work towards interoperability. Thus, there is space for improvement and we will work with the EU Parliament and the Council, which must agree on a text for it to become law, to make sure that the EU fixes what is broken and puts users back in control.

Content liability and monitoring

The new EU Internet bill preserves the key pillars of the current Internet rules embodied in the EU’s e-Commerce Directive. The Commission followed our recommendation to refrain from forcing platforms to monitor and censor what users say or upload online. It seems to have learned a lesson from recent disastrous Internet legislation like Article 17 of the EU Copyright Directive, which makes platforms police users’ speech.

The draft allows intermediaries to continue to benefit from comprehensive liability exemptions so, as a principle, they will not be held liable for user content. Thanks to a European-style “Good Samaritan” clause, this includes situations where platforms voluntarily act against illegal content. However, the devil lies in the details, and we need to make sure that platforms are not nudged to employ “voluntary” upload filters.

New due-diligence obligations

The DSA sets out new due diligence obligations for flagging illegal content that apply to all providers of intermediary services, and establishes additional obligations tailored to the type and size of online platforms, including the very largest ones.

We said from the start that a one-size-fits-all approach to Internet regulations for social media networks does not work for an Internet that is monopolized by a few powerful platforms. We can therefore only support new due diligence obligations that are matched to the type and size of the platform. The Commission rightly recognizes that the silencing of speech is a systemic risk on very large platforms and that transparency about content moderation can improve the status quo. However, we will carefully analyze other, potentially problematic provisions, such as requiring platforms to report certain types of illegal content to law enforcement authorities. Rules on supervision, investigation, and enforcement deserve in-depth scrutiny from the European Parliament and the Council.

Takedown notices and complaint handling

Here, the Commission has taken a welcome first step towards more procedural justice. Significantly, the Commission acknowledges that platforms frequently make mistakes when moderating content. Recognizing that users deserve more transparency about platforms’ decisions to remove content or close accounts, the draft regulations call for online platforms to provide a user-friendly complaint handling system and restore content or accounts that were wrongly removed.

However, we have concerns that platforms, rather than courts, are increasingly becoming the arbiters of what speech can or cannot be posted online. A harmonized notification system for all sorts of content will increase the risk that a platform becomes aware of the illegality of content and is thus held liable for it.

Interoperability measures are missing

The Commission missed the mark on giving users more freedom and control over their Internet experience, as rules on interoperability are absent from the proposal. That may be addressed in the Digital Markets Act draft proposal. If the EU wants to break the power of platforms that monopolize the Internet, it needs regulations that will enable users to communicate with friends across platform boundaries, or be able to follow their favorite content across different platforms without having to create several accounts.

Court/administrative content takedown orders

The Internet is global, and takedown orders of global reach are immensely unjust and impair users’ freedom. The draft rules address the perils of worldwide takedown orders by requiring that such orders take users’ rights into account and that their territorial scope be limited to what is necessary.

Sanctions

Under the proposed regulations, the largest platforms can be fined up to six percent of their annual revenue for violating rules about hate speech and the sale of illegal goods. Proper enforcement actions and dissuasive sanctions are important tools to change the current digital space that is monopolized by very large platforms. That being said, high fines are only good if the substance of the regulations is good, which we will study in great detail in the next few weeks.

Non-EU platforms

Non-EU platform providers will face compliance duties if their services have a substantial connection to the EU. The proposed rules take particular aim at companies outside the Union, such as those in the U.S., that offer services to EU users. But the criteria for imposing the duties are not clear, and we’re concerned that if non-EU platforms are obligated to have legal representation in the EU, some will decide against offering services in the EU.

Christoph Schmon

Protecting Your Rights to Understand and Innovate on the Tech in Your Life

Every three years, the public has an opportunity to chip away at the harm inflicted by an offshoot of copyright law that doesn’t respect traditional safeguards such as fair use. This law, Section 1201 of the Digital Millennium Copyright Act, impedes speech, innovation, and access to knowledge by threatening huge financial penalties against those who simply access copyrighted works that are encumbered by access restriction technology. To mitigate the obvious harm this law causes, Americans have the right to petition for exemptions to Section 1201, which last for three years before the whole process starts over.

The liability created by Section 1201 can attach even to those who aren’t infringing copyright, because their access is in service of research, education, criticism, remix, or other fair and noninfringing uses. The law allows rightsholders to enforce their business models in ways that have nothing to do with the rights actually granted to copyright holders. A willful and commercial act of circumvention can even result in criminal charges and jail time, and the Department of Justice takes the position that there doesn’t need to be any connection to actual copyright infringement for them to prosecute.

EFF is representing Matthew Green and bunnie Huang in a First Amendment challenge to Section 1201, based on its failure to respect copyright's traditional boundaries, including safeguards like fair use.  At the same time, we're participating in the rulemaking process in hopes of winning some exemptions that will mitigate the law's harms.  In the past, we’ve won exemptions for remix videos, jailbreaking personal computing devices, repairing and modifying car software, security research, and more.

This year, EFF is asking the Librarian of Congress to expand on the 2018 device repair exemption with a broader version that would apply to all software-enabled devices and include non-repair modifications.  In past rulemakings, the government has insisted on drawing arbitrarily narrow classes of devices to exempt; our submissions aim to illustrate the wrongheadedness of this approach. In keeping with that theme, we're also asking the Librarian to clarify that the existing exemption for jailbreaking smart TVs includes video streaming devices like the Roku, Apple TV, and Amazon Fire Stick.

Thank you to everyone who sent in stories about how Section 1201 is interfering in your life! We’re proud to turn your stories into legal arguments that can help improve the state of the law, and we couldn’t do it without you.

Related Cases: 2021 DMCA Rulemaking
Cara Gagliano

Visa and Mastercard are Trying to Dictate What You Can Watch on Pornhub

Pornhub is removing millions of user-uploaded videos. This action comes after a New York Times column accused the website of hosting sexual videos of underage and nonconsenting women. In response to the Times’ article, Visa and Mastercard cut ties with Pornhub, making it impossible for Pornhub to process payments other than through cryptocurrencies. 

Sexual exploitation is a scourge on society that needs resources, education, victim support, and, when necessary, prosecution by responsible authorities to address. Visa and Mastercard are the wrong entities for addressing these problems. Visa and Mastercard do not have the skills, expertise, or position to determine complex issues of digital speech. Nuanced challenges to what content should exist online, and whether moderation policies will inadvertently punish otherwise marginalized voices, are issues that legal experts, human rights experts, lawmakers, and courts in the United States and abroad have been deeply considering for years. The truth is, navigating speech policies in a way that won’t shut down huge swaths of legitimate and worthy speech is hard. And it’s wrong that Visa and Mastercard have the power to—however clumsily—police speech online.

More importantly, as a society, we haven’t given Visa and Mastercard the authority to decide online speech cases. Those companies haven’t been elected or chosen by any electorate in any country. They are here enforcing speech rules that we haven’t adopted in the United States—and, frankly, which would likely violate the U.S. Constitution if they were adopted. And sadly this is not the first time these companies’ decisions have jeopardized speech online.

Visa and Mastercard, acting together, are currently a chokepoint for online payments. This means that every arbitrary policy of these two companies can translate into rules that all websites that want to process payments must follow. Until and unless we create a diverse and robust market of online payment services not reliant on Visa and Mastercard, we have to deal with the fact that these two companies can dictate what you can read online—or, in this case, what porn you’re allowed to watch.

This isn’t a debate over whether Pornhub is predatory. This is a question about what level of censorship power we want to give to payment processors. Ironically, until now some of the most powerful critics of Pornhub’s policies have been the sex workers who also struggle daily with the credit card companies’ rejection of their sites. These companies’ power may be seen as a way to hold Pornhub to account today: but every other day, that power is used to remove the financial freedom of independent sex workers, and of Pornhub’s competitors and potential alternatives to its near-monopoly.

It’s a well-worn idea that pornography is one of the first drivers of adoption for many new technologies. It’s also the domain where censors in the West often choose to flex their powers. Censorship, financial or not, of sex sites gets little push-back, and a great deal of public praise. It justifies the continuing concentration of free speech chokepoints—chokepoints that have always been used against LGBTQ speech, and women’s and minority rights—whenever a moral crusade needs an undemocratic hand. Any website or individual can find itself running afoul of Visa and Mastercard’s moral sensibilities and shut off from receiving online payments. We saw it with WikiLeaks. We saw it with the kink social network Fetlife. We saw it with the independent book publisher Smashwords. And we’ve seen it with countless sex workers.

Those praising Mastercard and Visa’s actions now should recognize that these censorship powers are more often used against those without power. That should scare all of us. We hope those praising the actions against Pornhub will work together to reduce these payment processors’ power over online life, rather than give justifications as to why they should keep it.

Danny O'Brien

Massachusetts Legislators Should Stand With Their Communities and Restore Face Recognition Prohibitions to Police Reform Bill

Before 2020 ends, Massachusetts could become the first state to implement robust state-wide protections from government use of face recognition. As part of a sweeping package of police reform legislation (S. 2963) inspired by protests for police accountability, state legislators in the commonwealth passed a prohibition on government agencies using the technology. Disappointingly, Governor Baker returned the omnibus bill to the legislature with this section entirely stricken.

To protect Massachusetts residents from government use of this dangerous technology, legislators must restore these protections to the bill.

Take Action

Massachusetts: End Government Face Surveillance

Government use of face surveillance threatens privacy, chills free expression, and amplifies historical biases in our criminal justice system. A study conducted for the ACLU of Massachusetts revealed that 79% of Massachusetts voters support a moratorium on government use of face surveillance.

Before being struck from S. 2963 by Governor Baker, Section 26 offered protections to prevent police from using face recognition to track residents as they attend school, visit health providers, and otherwise go about their lives.

EFF supports Section 26 of S. 2963, which would end government use of face surveillance as we know it in the Commonwealth of Massachusetts. Face surveillance is a menace to privacy, free speech, and racial justice. We urge Massachusetts residents to tell their senators and representatives to override Governor Baker's decision to remove Section 26.

Nathan Sheard

IPANDETEC Releases First Report Rating Nicaraguan Telecom Providers’ Privacy Policies

IPANDETEC, a digital rights organization in Central America, today released its first "Who Defends Your Data" (¿Quién Defiende Tus Datos?) report for Nicaragua, assessing how well the country’s mobile phone and Internet service providers (ISPs) are protecting users' personal data and communications. The report follows the series of assessments IPANDETEC has conducted to evaluate the consumer privacy practices of Internet companies in Panama, joining a larger initiative across Latin America and Spain holding ISPs accountable for their privacy commitments.

The organization reviewed six companies: Claro Nicaragua, a subsidiary of the Mexican company America Móvil; Tigo Nicaragua, a subsidiary of Millicom International, headquartered in Luxembourg; Cootel Nicaragua, part of the Chinese Xinwei Group; Yota Nicaragua, a subsidiary of the Russian company Rostekhnologii; IBW, part of IBW Holding S.A., which provides telecom services across Central America; and Ideay, a local Nicaraguan company.

The ¿Quién Defiende Tus Datos? report looks at whether the companies post data protection policies on their website, disclose how much personal data they collect from users, and whether, and how often, they share it with third parties. Companies are awarded stars for transparency in each of five categories, detailed below. Shining a light on these practices allows consumers to make informed choices about what companies they should entrust their data to.

Main Findings

IPANDETEC’s review shows that, with a few exceptions, Nicaragua’s leading ISPs have a long way to go in providing transparency about how they protect user data. Only three of the six companies surveyed—Claro, Tigo, and Cootel—publish privacy policies on their websites and, with the exception of Tigo, the information provided is limited to policies for collecting data from users visiting their websites. Tigo’s policy provides partial information about data collected beyond the company’s website, earning the company a full star. Cootel comes close—its policy refers to its app Mi Cootel (My Cootel), which allows customers to manage and change services under their contracts with the company. Claro and Cootel received half stars in this category.

Claro and Tigo’s parent companies publish more comprehensive data protection policies, but they are not available on the websites of their Nicaraguan subsidiaries and don’t take into account how their practices in Nicaragua comport with the country’s data protection regulations and other laws. In its reply to IPANDETEC’s request for additional information, Tigo reported that it’s working to improve the information available on its local website.

Tigo received a half star for its partial commitment to a policy of requiring court authorization before providing the content of users’ communications to authorities. Claro received a quarter star in this category. The ISP's local policy on requiring court authorization is not clear enough, although the global policy of its parent company America Móvil is explicit on this requirement. Both companies have fallen short in showing a similarly explicit commitment when handing users' metadata (such as names, subject line information, and the creation date of messages) to authorities.

Tigo earned a quarter star for making public guidelines for how law enforcement can access users’ information, primarily on account of Millicom’s global law enforcement assistance policy. The guidelines establish steps that must be taken locally when the company is responding to law enforcement requests, but they are general and not specific to Nicaragua’s legal framework. Moreover, the policy is available only in English on Millicom’s website. Claro’s subsidiaries in Chile and Peru publish such guidelines; unfortunately the company’s Nicaraguan unit does not.

The IPANDETEC report shows that all the top Internet companies in Nicaragua need to step up their transparency game—none publish transparency reports disclosing how many law enforcement requests for user data they receive. Tigo’s parent company Millicom publishes annual transparency reports, but the most recent one didn’t include information about its operations in Nicaragua. Millicom said it plans to include Nicaragua in future reports.

The companies were evaluated on specific criteria listed below. For more information on each company, you can find the full report on IPANDETEC’s website.

Data Protection Policy: Does the company post a data protection policy on its website? Is the policy written in clear and easily accessible language? Does the policy establish the retention period for user data?

Transparency Report: Does the company publish a transparency report? Is the report easily accessible? Does the report list the number of government requests received, accepted, and rejected?   

User Notification: Does the company publicly commit to notifying users, as soon as the law allows, when their information is requested by law enforcement authorities?

Judicial Authorization: Does the company publicly commit to requesting judicial authorization before handing over users’ communications content and metadata?

Law Enforcement Guidelines: Does the company outline public guidelines on how law enforcement can access users’ information?

Conclusion

Digital devices and Internet access allow people to stay connected with family and friends and have access to information and entertainment. But technology users around the world are concerned about privacy. It’s imperative for ISPs in Nicaragua to be transparent about whether, and how, they are safeguarding users’ private information. We hope to see big strides from them in future reports.

Karen Gullo

California Has a New COVID Exposure Notification App

Today California joined dozens of other states and countries in launching its COVID-19 exposure notification app, CA Notify, built on Google and Apple’s Exposure Notification API. Google and Apple’s API is already used in 20 other U.S. states, as well as countries including Germany, the UK, and much of Canada.

These apps use mobile phones’ Bluetooth functionality to determine if a person has come into contact with someone who recently tested positive for the virus. (In iOS, there is no app to download; the “Exposure Notification” feature can be turned on via the settings.) If an app user tests positive for COVID, the app will notify others with the app who have come into contact with them, without giving information about the individual who tested positive. While the Bluetooth technology that powers California’s app and others like it is the most promising approach to COVID exposure notification, there are still important privacy and equity concerns. And, ultimately, COVID tracking apps like these can only be effective if deployed alongside widespread testing and interview-based contact tracing.

Is It Private and Secure?

CA Notify and other apps built on Google and Apple’s API meet several of the key proximity tracking and exposure notification safeguards that EFF has been looking for from the start, including informed, voluntary, opt-in consent and data minimization (both in terms of what data is collected and where it is shared). They also allow users to uninstall the app, turn off the functionality, and opt out at any point. Google and Apple have not yet, however, met all of our standards for information security (including subjecting their system to third-party audits and penetration testing), nor are we aware of any individual app developers publishing transparency reports.

Two additional privacy-protective choices are worth highlighting: Google and Apple’s system does not track users’ location, and it uses a “decentralized” approach to keep all of a user’s identifiers on their device.

First, these apps use Bluetooth to track your proximity to other devices, rather than using GPS data or cell tower data to track your location. This is the right approach. Phone location data is insufficiently granular to identify when two people are close enough together to transmit the virus, but it is detailed enough to expose sensitive information about where you’ve been and what you’ve been doing.

Second, the apps are designed to keep your identifiers on your device (and not, for example, in an inaccessible, centralized government or law enforcement database). If and when a user tests positive, they can choose to enter the diagnosis code provided by their testing provider and upload their identifiers to a publicly accessible registry. These identifiers are random and ephemeral, and thus harder to correlate to a specific person. 
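
To make the decentralized design concrete, here is a minimal, highly simplified sketch in Python of the matching idea described above. It is illustrative only — the class names, the 16-byte identifier size, and the single shared registry are our assumptions, not the actual Google/Apple Exposure Notification protocol: each phone broadcasts short-lived random identifiers, remembers the identifiers it hears nearby, and, only if its owner tests positive and consents, publishes its own identifiers; every phone then checks the public list against its local log, so no central party learns who met whom.

import secrets

class Phone:
    def __init__(self):
        self.my_ids = []        # identifiers this phone has broadcast
        self.heard_ids = set()  # identifiers heard from nearby phones

    def broadcast_id(self) -> bytes:
        rpi = secrets.token_bytes(16)  # fresh, random, ephemeral identifier
        self.my_ids.append(rpi)
        return rpi

    def hear(self, rpi: bytes) -> None:
        self.heard_ids.add(rpi)  # stored locally, never uploaded

    def report_positive(self, registry: set) -> None:
        # Voluntary: done only after entering a diagnosis code.
        registry.update(self.my_ids)

    def check_exposure(self, registry: set) -> bool:
        # Matching happens on the device, against the public registry.
        return not self.heard_ids.isdisjoint(registry)

registry = set()                 # public list of diagnosed identifiers
alice, bob = Phone(), Phone()
bob.hear(alice.broadcast_id())   # Alice and Bob spend time nearby
alice.report_positive(registry)  # Alice later tests positive and consents
print(bob.check_exposure(registry))  # True: Bob is notified, locally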

We’ve outlined theoretical ways that an attacker could abuse the app, such as setting up a Bluetooth beacon to map a user’s detailed routine. Additionally, police may seek data created by proximity apps, which is stored on users’ phones, and could use that to learn about specific associations or interactions. Whether these dangers are outweighed by the benefits of COVID-19 exposure notification is a judgment each user must make, and the relative costs and benefits of the proximity apps themselves remain unknown.

Will It Work?

Proximity tracking apps might be, at most, a small part of a larger public health response to COVID-19, for several reasons. 

First, any benefits of this technology will be unevenly distributed. These apps assume that one smartphone equates to one human. But any app-based or smartphone-based solution will miss the groups least likely to have a mobile phone yet most at risk of COVID-19 and in need of resources: in the United States, that includes elderly people, people without housing, and those living in rural communities. Even if someone has access to a cell phone, that phone might not be an up-to-date iPhone or Android, and many older phones simply won’t have the technology necessary for Bluetooth proximity tracking. Phones can be turned off, left at home, run out of battery, or be set to airplane mode. So even a proximity tracking system with near-universal adoption is going to miss millions of contacts each day, and disproportionately miss communities at higher risk for COVID.

Second, even with widespread adoption, the app will be far from perfect. Bluetooth technology was simply not designed for this. A study of early deployments of the technology in Europe found that an app detected about 50% of true exposures, and also incorrectly triggered exposure notifications for about 50% of nearby devices. It also found that simply changing the person holding a particular phone was enough to cause significant variations in how the app measured exposure. Some of the app’s performance will be dictated by parameters set by local health departments, and it’s possible that California officials can do better than earlier prototypes. And even flawed apps can be useful: pilot studies have suggested that even a relatively small number of people using a relatively inaccurate app can help flatten the curve.

Third and finally, however, even a theoretically best-designed, most privacy-protective, universally adopted app cannot fill the as-yet unmet need for traditional public health measures like testing, contact tracing, PPE for healthcare workers, and widespread social distancing and masking. Imagine it: if you received a notification that you had been exposed, but could not access testing, contact tracing, or isolation guidance and support, that notification would not serve you or the larger public health purpose of fighting the spread of COVID-19. This is why governments and institutions must not rely on this technology as a “silver bullet” to rush reopening, and further must be prohibited from discriminating against people who choose not to use it.

CA Notify and apps like it meet most, but not all, of our standards for exposure notification apps. We hope to see Google, Apple, and developers building on their system embrace additional information security and transparency measures. In the meantime, governments, institutions and users must continue to take seriously the tradeoffs and risks at stake when it comes to COVID exposure notification technology.

UPDATED (12/18/2020): An earlier version of this article stated incorrectly that Google and Apple's Exposure Notification code is not open source. Multiple components are available on GitHub and on Apple's developer site.

Gennie Gebhart

Raid on COVID Whistleblower in Florida Shows the Need to Reform Overbroad Computer Crime Laws and the Risks of Over-Reliance on IP Addresses

Monday’s armed Florida Department of Law Enforcement raid on the Tallahassee, Florida, home of data scientist and COVID whistleblower Rebekah Jones was shocking on many levels. This incident smacks of retaliation against someone claiming to provide the public with truthful information about the most pressing issue facing both Florida and our nation: the spread and impact of COVID-19. It was an act of retaliation that depended on two broken systems that EFF has spent decades trying to fix: first, our computer crime laws are so poorly written and broadly interpreted that they allow for outrageous misuses of police, prosecutorial, and judicial resources; and second, police continue to overstate the reliability of IP addresses as a means of identifying people or locations.

On the first point, it seems that the police asked for, the prosecutors sought, and the Court granted a warrant for a home raid in response to a text message. The message, sent to a group of governmental and nongovernmental people working on tracking COVID, urged members to speak up about the government hiding and manipulating information about the COVID outbreak in Florida.

This isn’t just a one-off misuse: in other cases, we’ve seen the criminalization of “unauthorized” access used to threaten security researchers who investigate the tools we all rely on, prosecute a mother for impersonating her daughter on a social network, threaten journalists seeking to scrape Facebook to figure out what it is doing with our data, and prosecute employees who did disloyal things on company computers. “Unauthorized” access was also used to prosecute our friend Aaron Swartz, and threaten him with decades in jail for downloading academic articles from the JSTOR database. Facing such threats, he committed suicide.  How could a text message urging people to do the right thing ever result in an armed police home raid? Sadly, the answer lies in the vagueness and overbreadth of the Florida Computer Crime law, which closely mirrors the language in the federal Computer Fraud and Abuse Act (laws in many states across the country are likewise based on the CFAA). 

The law makes it a crime, a serious felony, to have “unauthorized access” to a computer. But it doesn’t define what “unauthorized” means. In cases across the country, and in one currently pending before the U.S. Supreme Court called Van Buren, we’ve seen that the lack of a clear definition and boundaries around the word “authorized” causes great harm. Here, based upon the Affidavit in the Rebekah Jones case, the police took the position that sending a single text message to a group that you are not (or are no longer) a part of is “unauthorized” access to a computer, and so is a crime that merits an armed police home raid. This, despite the obvious fact that no harm happened as a result of people getting a single message urging them to do the right thing.

In fact, if you’ve ever shared a password with a family member or asked someone else to log into a service on your behalf or even lied about your age on a dating website, you’ve likely engaged in “unauthorized” access under some court interpretations. We urged the Supreme Court in the Van Buren case to rule that violations of terms of use (as opposed to overcoming technical blocks) can never be criminal CFAA violations. This won’t entirely fix the law, but it will take away some of the most egregious misuses. 

Even with the broader definition of “unauthorized,” though, it’s unclear whether the text message in question was criminal. The Affidavit from the police confirms that the text group shared a single user name and password and some have even said that the credentials were publicly available. Either way, it’s hard to see how the text could have been “unauthorized” if there was no technical or other notice to Ms. Jones that sending a message to the list was not allowed. Yet this wafer-thin reed was accepted by a Court as a basis for a search warrant of Ms. Jones’ family home. 

On the second point, the Affidavit indicates that the police relied heavily on the IP address of the sender of the message to seek a warrant to send armed police to Ms. Jones’ home. The Affidavit fails to state how the police were able to connect the IP address with the physical address, simply stating that they used “investigative resources.” Press reports claim that Comcast, the ISP that handled that IP address, did confirm that Ms. Jones’ home was the customer associated with the IP address, but that isn't stated in the Affidavit. In other cases, the use of notoriously imprecise public reverse IP lookup tools has resulted in raids of the wrong homes, sometimes multiple times, so it is important that the police explain to the Court what they did to confirm the address and not just hide behind “investigative resources.”
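
As a hedged illustration of just how coarse such lookups are, the following Python sketch (standard library only; the example IP is arbitrary) performs the kind of public reverse lookup these tools rely on. The answer is typically an ISP-assigned hostname naming the provider and a rough service region, not a household, let alone an individual.

import socket

def reverse_lookup(ip: str) -> str:
    """Return the PTR hostname registered for an IP address, if any."""
    try:
        hostname, _aliases, _addrs = socket.gethostbyaddr(ip)
        return hostname
    except socket.herror:
        return "no PTR record"

# A residential address usually resolves to something like
# "c-XX-XX-XX-XX.hsd1.fl.comcast.net": it identifies the ISP and a
# rough region, and says nothing about which person or device sent
# a given message over that shared home connection.
print(reverse_lookup("8.8.8.8"))  # "dns.google" -- a well-known, easy case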

EFF has long warned that overreliance on IP addresses as a basis for either the identity or the location of a suspect is dangerous. Police all too often liken an IP address to a “fingerprint,” a misleading comparison that suggests IP-based identifications are far more reliable than they really are. The metaphor really falls apart when you consider that a single IP address used by a home network usually provides Internet connectivity to multiple people with several different digital devices, making it difficult to pinpoint a particular individual. Here, the police did tell the court that Ms. Jones had recently worked for the Florida Department of Health, so the IP address wasn’t the only fact before the court, but it’s still pretty thin for a home invasion warrant, rather than, say, a simple police request that Ms. Jones come in for questioning.

Even if it turns out Florida police were correct in this case – and for now Ms. Jones has denied sending the text – the rest of us should be concerned that IP addresses alone, combined with some undisclosed “investigative resources,” can be the basis for a judge allowing armed police into your home. And it shows that judges must scrutinize both IP address evidence and law enforcement claims about its reliability, along with other supporting evidence, before authorizing search warrants.

This case confirms our serious, ongoing national failure to protect whistleblowers. And in this case - as with Edward Snowden, Reality Winner, Chelsea Manning, and many others - it’s clear that part of protecting whistleblowers means updating our computer crime laws to ensure that they can't be used as ready tools for prosecutorial overreach and misconduct. We also need to continue to educate judges about the unreliability of IP addresses so they require more information than just vague conclusions from police before granting search warrants.

All too often, misunderstandings about computers and digital networks lead to gross miscarriages of justice. But computers and the Internet are here to stay. It’s long past time we ensured that our criminal laws and processes stop relying on outdated and imprecise words like “authorized” and metaphors like “fingerprints,” and instead apply technical rigor when deliberating about technology.

Correction (December 16 2020): An earlier version of this article stated the raid was conducted by the Florida State Police. It was actually the Florida Department of Law Enforcement. Many thanks to the reader who alerted us to this error.

Related Cases: Van Buren v. United States; United States v. David Nosal
Cindy Cohn

Federal and State Antitrust Suits Challenging Facebook’s Acquisitions are a Welcome Sight

Antitrust enforcers charged with protecting us from monopolists have awoken from a decades-long hibernation to finally address something users have known, and been paying for with their private data, for years: Facebook’s acquisitions of rival platforms have harmed social media users by reducing competition, leaving them with fewer choices and creating a personal data-hoovering behemoth whose profiling capabilities only cement its dominance.

Now the government’s enforcers want Facebook broken up. The company’s acquisitions of Instagram in 2012 and WhatsApp in 2014 are at the center of lawsuits filed yesterday by the Federal Trade Commission (FTC) and forty U.S. states and territories that accuse the giant platform of holding and illegally maintaining monopoly power in the “personal social networking” market. Facebook CEO Mark Zuckerberg, the lawsuits allege, strategized that it was better to buy rivals than to compete with them. Acquiring Instagram and WhatsApp, they say, deprives social media users of the benefits of competition—more choice, quality, and innovation.

The suits also focus on how Facebook treats companies that want to interoperate with its services. Facebook has long recognized that the ability to interoperate with an incumbent platform is a powerful anti-monopoly weapon. That’s why, say the lawsuits, Facebook attaches conditions when it allows app developers to use its APIs: they can’t provide services that compete with Facebook’s functions, and they can’t connect with or promote other social networks.

Like many antitrust suits, a key issue will be whether the court accepts the governments' definition of the relevant market that’s being monopolized. In other words, is “personal social networking services" a unique type of service that Facebook dominates? Or does Facebook compete head-to-head with everything from email to television as one player among many? That issue is sure to be hotly contested as the government and states grapple with Facebook about what other companies are part of the relevant market.  

Facebook will probably also argue that its acquisitions were good for consumers and weren’t illegal from an antitrust standpoint because, even if they gave the company market dominance, they led to innovation that benefited users. Because no one can know for sure what would have happened if Instagram and WhatsApp had remained independent, Facebook will argue, the courts can do nothing now.

Tell that to former Instagram and WhatsApp users who saw the platforms they chose over Facebook be subsumed into Facebook’s ecosystem. Those users thought their preferred network, and their data, could be kept separate from Facebook’s; first because they were actually separate, and then because Facebook told them so, only to go back on its word, siphon off their data, and be opaque about the privacy implications to boot.

Antitrust regulators were mostly asleep at the wheel. Meanwhile, Instagram users saw the Instagram Direct logo disappear, replaced by the Facebook Messenger logo. Facebook continues to blur the lines between the two apps, we noted last month, as part of a broader plan to consolidate Instagram Direct, Facebook Messenger, and WhatsApp. In a recent messaging “update,” Facebook encouraged Instagram users to take advantage of new “cross-platform messaging” features that in essence give you Facebook Messenger inside Instagram. But hey, you get innovations like colors in chats and new emojis.

Facebook will also have to defend its 2013 acquisition of VPN maker Onavo, which was specifically called out in the states’ lawsuit. Onavo’s data-gathering features were billed as a way for Facebook customers to keep their web browsing safe. But as it turns out, Facebook was using Onavo to gather intelligence about potential rivals by seeing how many messages users were sending through WhatsApp, which is what led it to buy WhatsApp. Facebook shut down the Onavo service after the practice was revealed. Whoops.

The enforcers aren’t asking that Facebook pay damages in the lawsuits. Rather, they want a court to require Facebook to divest Instagram, WhatsApp, and possibly other acquisitions, and to limit the company’s future mergers and acquisitions.

That’s the right approach. Even though company break-ups are hard to achieve—the last significant technology company to be broken up was AT&T in 1982—spinning off Facebook’s acquisitions could inject competition into a field where it’s been stifled for many years now. Even the pursuit of a break-up and restrictions on future mergers can create needed space for competition in the future. That’s why these lawsuits, though they won’t be easy to win, are a welcome sight.

Mitch Stoltz

Unfiltered: How YouTube’s ‘Content ID’ Helps Shape What We See Online

New EFF Whitepaper Shows How Filters Can Undermine Fair Use for Video Creators

San Francisco – YouTube is supposed to be a vibrant space for new creativity, but in practice creators are sharply hampered by the site’s “Content ID” system. A new whitepaper from the Electronic Frontier Foundation (EFF) takes a deep dive into the confusing process of getting a video past YouTube’s copyright filters, and what the system means for free speech and creativity on the Internet.

“YouTube dominates the online video market, and Content ID dominates video makers’ experiences there,” said EFF’s Katharine Trendacosta, associate director of policy and activism. “Instead of making the best video they can, they have to make the best video that will pass through Content ID—a system that does a clumsy job of finding actual copyright infringement, but does a great job of ensnaring videos that don’t infringe at all.”

“Unfiltered” describes the byzantine process that professional video creators and others must go through to ensure that their video is posted and recommended by YouTube’s algorithm, including dealing with mistaken matches to copyrighted content, arbitrary judgments on how long content clips can be, loss of revenue, and multiple copyright claims on the same piece of music or video. Creators can appeal a Content ID match, but missteps during that appeal process can lead to a formal legal claim under the Digital Millennium Copyright Act (DMCA). That creates more headaches, including the potential loss of an entire YouTube channel.

In addition to a step-by-step description of how Content ID works, “Unfiltered” also includes case studies from popular YouTubers “hbomberguy,” Todd in the Shadows, and Lindsay Ellis—all of whom express their frustration with Content ID.

“The whole system is so confusing that when copyright experts at NYU tried to post a video about copyright to YouTube, they got caught in Content ID too. And they weren’t sure what to do in order to save their channel from a ‘strike’ that could put all of their videos in jeopardy,” said Trendacosta. “But these restrictions aren’t just annoying for creators, they harm culture as a whole. Content ID is a prime example of why automated filtering systems hurt free speech and expression.”

For “Unfiltered: How YouTube’s Content ID Discourages Fair Use and Dictates What We See Online”:
https://www.eff.org/wp/unfiltered-how-youtubes-content-id-discourages-fair-use-and-dictates-what-we-see-online

Contact: Katharine Trendacosta, Associate Director of Policy and Activism, katharine@eff.org
Rebecca Jeschke

Filters Do More Than Just Block Content, They Dictate It

Today, EFF is publishing a new white paper, “How YouTube’s Content ID Discourages Fair Use and Dictates What We See Online.” The paper analyzes the effects of YouTube’s automated copyright filter, Content ID, on the creative economy that has sprung up around the platform. Major platforms like YouTube have used copyright filters that prevent their users from expressing themselves even in ways that are allowed under copyright law. As lobbyists for Big Content—major record labels, big Hollywood studios, and the like—push for the use of broader and stricter filtering technology, it’s important to understand the harms these filters cause and how they lead to unfair shakedowns of online creators.

YouTube is the largest streaming video service and the one that hosts the most user-generated content. As a result, Content ID has an outsized effect on the online video creator ecosystem. There is a terrible, circular logic that traps creators on YouTube. They cannot afford to dispute Content ID matches because that could lead to Digital Millennium Copyright Act (DMCA) notices. They cannot afford DMCA notices because those can lead to copyright strikes. They cannot afford copyright strikes because those could lead to a loss of their account. They cannot afford to lose their account because they cannot afford to lose access to YouTube’s giant audience. And they cannot afford to lose access to that audience because they cannot count on making money from YouTube’s ads alone—they need as many eyes as possible on their videos in order to make money from sponsorships and direct fan contributions, partially because Content ID often diverts advertising money to rightsholders when there is a Content ID match. Which they cannot afford to dispute.

The paper includes a diagram of the full maze of Content ID, capturing just how difficult it can be to navigate.

In addition to a broad overview of the issues with Content ID, this paper also includes case studies of three YouTube-based creators, who explain how they experience Content ID. All conclude that (a) they have no choice but to be on YouTube, (b) they make substantial creative decisions based on Content ID, and (c) they give up or lose money to the system. We hope these interviews make clear the very real harm filters do to expression. When lawmakers, companies, and others call for more and stricter filters, they do so at the expense of a whole new generation of creators.

Read the paper

How YouTube’s Content ID Discourages Fair Use and Dictates What We See Online

Katharine Trendacosta

Dark Caracal: You Missed a Spot

Security researchers at EFF have tracked APTs (Advanced Persistent Threats) targeting civil society for many years now. And while in many cases the “advanced” appellation is debatable, “persistent” is not. Since 2015, EFF has tracked the cyber-mercenaries known as Dark Caracal, a threat actor that has carried out digital surveillance campaigns on behalf of government interests in Kazakhstan and Lebanon.

Recent activity seems to indicate that this actor is active once again. In November of 2019, the group Malware Hunter Team discovered new samples of the Bandook malware, which is associated with Dark Caracal, this time bearing legitimate signing certificates for Windows (issued by the “Certum” certificate authority), which would allow them to run without a warning to the user on any Windows computer. Tipped off by the emergence of new variants of the Bandook Trojan, researchers at Checkpoint found three new variants of Bandook: some expanded (120 commands), some slimmed down (11 commands), and all signed with Certum certificates. The Checkpoint researchers also discovered several new command and control domains in use by Dark Caracal.

In previous campaigns, this actor has displayed impressively lax operational security, enabling researchers to download terabytes of data from their command and control servers. The latest campaign exhibits a somewhat higher level of opsec. Checkpoint reports that targets included “Government, financial, energy, food industry, healthcare, education, IT and legal institutions” in the following countries: Singapore, Cyprus, Chile, Italy, USA, Turkey, Switzerland, Indonesia and Germany.

Recommended Mitigations against Dark Caracal

The Dark Caracal threat actors still seem to rely on phishing and Office-based macros as their primary methods of infection. Because of this, the best step you can take to protect against Dark Caracal is to disable Office macros, on your personal devices and across your entire organization. This is also good basic security hygiene. Standard methods of avoiding phishing attacks are good practice as well. Readers may also take some comfort in the fact that Bandook is currently detected by many, if not most, antivirus products.
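
On Windows, Office macro policy lives in the registry, so it can be set programmatically as well as through the Trust Center or Group Policy. Below is a minimal Python sketch of one way to do this for a single user, assuming Office 2016/2019/365 (registry version key "16.0") and the standard VBAWarnings policy value, where 4 means "disable all macros without notification." Treat it as an illustration rather than a vetted hardening script; organizations should prefer Group Policy.

import winreg

OFFICE_VERSION = "16.0"  # Office 2016/2019/365 (assumption; adjust for your install)
OFFICE_APPS = ["Word", "Excel", "PowerPoint"]

def disable_macros(app):
    # Per-user Office security key; CreateKey opens it, creating it if absent.
    key_path = rf"Software\Microsoft\Office\{OFFICE_VERSION}\{app}\Security"
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, key_path) as key:
        # VBAWarnings = 4: disable all macros without notification.
        winreg.SetValueEx(key, "VBAWarnings", 0, winreg.REG_DWORD, 4)

if __name__ == "__main__":
    for app in OFFICE_APPS:
        disable_macros(app)
        print("Macros disabled for", app)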

The Bandook Trojan

One of the primary signatures of the Dark Caracal threat group is their use of the Bandook Trojan, which is described in the Checkpoint report as follows:

The final payload in this infection chain is a variant of an old full-featured RAT named Bandook. Written in both Delphi and C++, Bandook has a long history, starting in 2007 as a commercially available RAT that was developed by a Lebanese individual nicknamed PrinceAli.

Bandook’s execution flow starts with a loader, written in Delphi, that uses the Process Hollowing technique to create a new instance of an Internet Explorer process and inject a malicious payload into it. The payload contacts the C&C server, sends basic information about the infected machine, and waits for additional commands from the server.

These findings are consistent with what EFF previously published in our Dark Caracal and Operation Manul reports.

We were surprised to see the Checkpoint report when it was released on Thanksgiving as we had been tracking Dark Caracal again as well. Building on Checkpoint’s work, we are publishing additional indicators of compromise we have observed that may be of interest to other security professionals and malware researchers.

Additional Dark Caracal Indicators of Compromise

Hashes:
09187675a604ffe69388014f07dde2ee0a58a9f7b060bff064ce00354fedc091
0c5735e066bfbc774906942e97a6ffc95f36f88b9960c4dd6555247b3dd2cdb0
2106e0eabc23d44bd08332cf0c34f69df36b9e84a360722a7fd4d62c325531d1
211f1638041aa08762a90c15b1aff699d47e4da21429c22b56f8a3103d13b496
27306de878f7ab58462b6b9436474e85c3935a5b285afec93f4b59a62c30dd32
2b54f945f5e3d226d3a09cdfcc41e311b039ceadf277310697672c8c706aa912
2f9ba191689e69e9a4f79b96d66c0fee9626fbd0ea11e89c0129e5d13afe6d76
3c8ad8264d7ce9c681c633265b531abb4cf9b64c2e1a3befadc64e66e1b5632e
4175c7f8854e2152462058a3e2f23a9026477f9b8001923e2c59b195949070f5
4f2ebe6f4fc345041643d5d7045886956fe121272fa70af08498a27135a72d97
520ead3a863d4ea139f93bbad4d149a37ca766b38af0567f1f31a9205613b824
614d0bece286af47db5a9f17d24778b16e30fea10ab8d4c7f0739246b83d8366
6e79e2a567013cbeea1d13f3e6c883e56e66ab36de88802eb1313736c25293ec
76f9615ce6ce6d20a9404b29649a4987a315c6b6fc703fa289da0aae37d39bce
a5b1ba27edee6953fa30771090387a5aca3e4d4541973df9b2e2b535444db5c3
b8c1cb11108d62611ac8035701eea8bb90b55faff2d0a28c23e2dcced176a52f
c4cba54bf57b3bc3bc8f1d71a7d78fcf25eae18d1d96ba4a4fa5eb8d6fb05e08
c852ebf981daa4d17216a569425b6128f5f7f56d746d4aa03ffecef53fb2829b
c8cbded2f6a5792c147a3362a4deb01a54a13fe9b5636367f2bb39084ed6e13a
cdf6413b56618cd641f93c2ca7fa000c486f7f2455daa3e25459e0d1e72ecf45
d50696eec5288f29994aa68b8f38c920f388a934f23855fb516fa94223c29ecc
d6de67f2187d2fad2d88ca3561aa9f9bf3cecf2e303916df0fc892ed97d94ff5
f08cbc1c5bca190bc34f7da3ad022d915758f5eb2c0902b13c44d50129763cf1
f3b54507a82a17f4056baaa3cb24972a2dfa439fa9b04493db100edb191a239f
fe18568eb574d8fa6c7d9bd7d8afa60d39aae0e582aff17c5986f9bab326ec8a

Unpacked Bandook sample:
fabce973a9edff2c62ccb6fdd5b14c422bc215284f952f6b31cc2c7d98856d57

Bandook user-agent:
User-Agent: Mozilla/4.0 (compatible; ALI)
User-Agent: Uploador

Command and Control URLs:
hxxp://blancomed.com/newnjususus1/post.php
hxxp://blancomed.com/newnjususus2/post.php
hxxp://blancomed.com/newnjususus4/post.php
hxxp://blancomed.com/newnjususus5/post.php
hxxp://blombic.com/OSPSPSPS2222292929nnnxnxnxnxnx/add.php
hxxp://blombic.com/uioncby281229hcbc2728hsha11kjddhwqqqqc/add.php
hxxp://www.opwalls.com/tunnel2015/add.php
hxxps://blancomed.com/newnjususus1/post.php
hxxps://blancomed.com/newnjususus2/post.php
hxxps://blancomed.com/newnjususus3/post.php
hxxps://blancomed.com/newnjususus4/post.php
hxxps://blancomed.com/newnjususus5/post.php
hxxps://pronews.icu/aqecva/
hxxps://pronews.icu/aqecva/add.php
hxxps://pronews.icu/raxpafsd/images
hxxps://pronews.icu/phpmyadmin
hxxps://pronews.icu/cgi-bin

hxxp://wbtogm.com
/hc1/
/hc2/
/hc3/
/hc4/
/hc5/
/hc1/images/
/hc1/temp/
/hc1/vk/
/hc1/vk/19160/
/hc1/index.php
/hc1/search.php
/hc1/view.php
/hc1/tv.php
/hc1/test.php
/hc1/log.php
/hc1/get.php
/hc1/config.php
/hc1/cm.php
/hc1/auth.php
/hc1/chrome.php
/hc1/panel.php
/hc1/validate.php
/hc1/pws.php
/hc1/vk.php
/91SD8391AC/cap.abc
/91SD8391AC/pws.abc
/91SD8391AC/extra.abc
/91SD8391AC/tv.abc
/hc4/get.php?action=check
/91SD8391AC/ammyy.abc
/91SD8391AC/89911111111012

Command and Control Domains:
megadeb[.]com
blancomed[.]com
opwalls[.]com
blombic[.]com
wbtogm[.]com
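
To make these indicators easier to act on, here is a minimal Python sketch that sweeps a directory for files matching the Bandook hashes and checks a proxy log for the user-agents and command and control domains listed above. The hash set is truncated here (paste in the full list from this post), and the plain-text, one-line-per-request log format is an assumption; adapt the parsing to whatever your proxy actually emits.

import hashlib
from pathlib import Path

# SHA-256 hashes from the list above (truncated; include the full set in practice).
BANDOOK_SHA256 = {
    "09187675a604ffe69388014f07dde2ee0a58a9f7b060bff064ce00354fedc091",
    "fe18568eb574d8fa6c7d9bd7d8afa60d39aae0e582aff17c5986f9bab326ec8a",
}
BANDOOK_USER_AGENTS = ("Mozilla/4.0 (compatible; ALI)", "Uploador")
C2_DOMAINS = ("megadeb.com", "blancomed.com", "opwalls.com", "blombic.com", "wbtogm.com")

def sha256_of(path):
    # Hash the file in 1 MB chunks so large files don't exhaust memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_files(root):
    # Flag any file whose SHA-256 matches a known Bandook sample.
    for path in Path(root).rglob("*"):
        if path.is_file() and sha256_of(path) in BANDOOK_SHA256:
            print("[!] Hash match (possible Bandook sample):", path)

def scan_log(log_path):
    # Flag log lines containing a Bandook user-agent or a C2 domain.
    with open(log_path, errors="replace") as f:
        for line in f:
            if any(d in line for d in C2_DOMAINS) or any(ua in line for ua in BANDOOK_USER_AGENTS):
                print("[!] Possible Dark Caracal C2 traffic:", line.strip())

if __name__ == "__main__":
    scan_files(".")           # directory to sweep (assumption)
    # scan_log("proxy.log")   # hypothetical proxy log file

A hash or domain hit is a starting point for investigation, not proof of compromise on its own.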

Cooper Quintin

It’s Not Section 230 President Trump Hates, It’s the First Amendment

2 months 4 weeks ago

President Trump’s recent threat to “unequivocally VETO” the National Defense Authorization Act (NDAA) if it doesn’t include a repeal of Section 230 may represent the final attack on online free speech of his presidency, but it’s certainly not the first. The NDAA is one of the “must-pass” bills that Congress passes every year, and it’s absurd that Trump is using it as his at-the-buzzer shot to try to kill the most important law protecting free speech online. Congress must reject Trump’s march against Section 230 once and for all.

Under Section 230, the only party responsible for unlawful speech online is the person who said it, not the website where they posted it, the app they used to share it, or any other third party. It has some limitations—most notably, it does nothing to shield intermediaries from liability under federal criminal law—but at its core, it’s just common-sense policy: if a new Internet startup needed to be prepared to defend against countless lawsuits on account of its users’ speech, startups would never get the investment necessary to grow and compete with large tech companies. 230 isn't just about Internet companies, either. Any intermediary that hosts user-generated material receives this shield, including nonprofit and educational organizations like Wikipedia and the Internet Archive.

Section 230 is not, as Trump and other politicians have suggested, a handout to today’s dominant Internet companies. It protects all of us. If you’ve ever forwarded an email, Section 230 protected you: if a court found that email defamatory, Section 230 would guarantee that you can’t be held liable for it; only the author can.

If you’ve ever forwarded an email, Section 230 protected you.

Two myths about Section 230 have developed in recent years and clouded today’s debates about the law. One says that Section 230 somehow requires online services to be “neutral public forums”: that if they show “bias” in their decisions about what material to show or hide from users, they lose their liability shield under Section 230 (this myth drives today’s deeply misguided “platform vs. publisher” rhetoric). The other myth is that if Section 230 were repealed, online platforms would suddenly turn into “neutral” forums, doing nothing to remove or promote certain users’ speech. Both myths ignore that Section 230 isn’t what protects platforms’ right to reflect any editorial viewpoint in how they moderate users’ speech—the First Amendment to the Constitution is. The First Amendment protects platforms’ right to moderate and curate users’ speech to reflect their views, and Section 230 additionally protects them from certain types of liability for their users’ speech. It’s not one or the other; it’s both.

We’ve written numerous times about proposals in Congress to force platforms to be “neutral” in their moderation decisions. Besides being unworkable, such proposals are clearly unconstitutional: under the First Amendment, the government cannot force sites to display or promote speech they don’t want to display or remove speech they don’t want to remove.

It’s not hard to ascertain the motivations for Trump’s escalating war on Section 230. Even before he was elected, Trump was deeply focused on using the courts to punish companies for insults directed at him. He infamously promised in early 2016 to “open up our libel laws” to make it easier for him to legally bully journalists.

No matter your opinion of Section 230, we should all be alarmed that Trump considers a goofy nickname a security threat.

Trump’s attacks on Section 230 follow a familiar pattern: they always seem to follow a perceived slight by social media companies. The White House issued an executive order earlier this year that would enlist the FCC to write regulations narrowing Section 230’s liability shield, though the FCC has no statutory authority to interpret Section 230. (Today, Congress is set to confirm Trump’s pick for a new FCC commissioner—one of the legal architects of the executive order.) That executive order came after Twitter and Facebook began adding fact checks to his dubious claims about mail-in voting.

But before now, Trump had never taken the step of claiming that “national security” requires him to be able to use the courts to censor critics. That claim came on Thanksgiving, which also happened to be the day that Twitter users started calling him “#DiaperDon” after he snapped at a reporter. Since then, he has frequently tied Section 230 to national security. The right to criticize people in power is one of the foundational rights on which our country is based. No matter your opinion of Section 230, we should all be alarmed that Trump considers a goofy nickname a security threat. Besides, repealing Section 230 would do nothing about the #DiaperDon tweets or any of the claims of mistreatment of conservatives on social media. Even if platforms have a clear political bias, Congress can't enact a law that overrides those platforms’ right to moderate user speech in accordance with that bias.

What would happen if Section 230 were repealed, as the president claims to want? Online platforms would become more restrictive overnight. Before allowing you to post online, a platform would need to gauge the level of legal risk that you and your speech bring on them—some voices would disappear from the Internet entirely. It’s shocking that politicians pushing for a more exclusionary Internet are doing so under the banner of free speech; it’s even more galling that the president has dubbed it a matter of national security.

Our free speech online is too important to be held as collateral in a routine authorization bill. Congress must reject President Trump’s misguided campaign against Section 230.

Elliot Harmon

Senators Express Privacy Concerns Over Proctoring Apps

2 months 4 weeks ago

Last week, five Senators joined the chorus of privacy advocates, students, and teachers expressing concern over surveillance proctoring apps being used to watch students remotely during exams. “You must be able to demonstrate that you are respecting students’ privacy,” the Senators write, and so far, that just doesn’t seem to be the case. Additionally, the letter notes that proctoring app features “have flagged individuals with disabilities or physical conditions, such as tic disorders or muscle reflexes, as suspicious,” and that the apps’ shortcomings “fall heavily on vulnerable communities and perpetuate discriminatory biases.” The full letter [pdf] calls on proctoring companies to respond to these and other concerns, and to describe what processes they are using to alleviate them.

The product is surveillance. There is no improving it.

EFF agrees that these apps pose a serious danger to students’ privacy. Surveillance shouldn’t be a prerequisite for an education. Proctoring apps use monitoring techniques to supposedly determine whether a student is cheating–but in the process, they force students to surrender sensitive biometric information and video recordings of their private spaces. These apps invade students’ biometric and data privacy, and exacerbate existing inequities in educational outcomes, especially for Black students.

While safeguarding student data and improving equity in educational tools are laudable goals, there is a far deeper and more sinister issue at play here—there is a growing student surveillance ecosystem, even beyond these proctoring apps. Other tools that are gaining popularity with school administrations include facial recognition software and applications that monitor student social media activity, such as Bark, Social Sentinel, and GoGuardian. Cloud-based educational platforms and school-provided devices often collect far more information on students than is necessary, store this information indefinitely, and sometimes even upload it to the cloud automatically. Taken as a whole, these apps normalize and codify the use of surveillance in schools. And remote proctoring apps aren’t just being used at the college level–some companies offer their services to high schools, too.

Schools should not be using surveillance tools on their students. We already know that people who are being watched change their behavior and self-censor, and that surveillance breaks down the trust among students, teachers, and school administrations. School should be about learning, making mistakes, and growing academically and as people. This is nearly impossible in a school-enforced panopticon, where every facial expression and movement is monitored by algorithms, with the expectation that students are not to be trusted. We are teaching our students to expect that spying is normal, and that the people they trust to guide their academic success will spy on them.

To Senators Blumenthal, Van Hollen, Smith, Warren, and Booker: Thank you, sincerely, for investigating these apps. In your assessment of proctoring companies’ responses to your letter, we urge you to consider the following: The product is surveillance. There is no improving it.

Jason Kelley

EFF at 30: Saving Encryption, with Technologist Bruce Schneier

2 months 4 weeks ago

To commemorate the Electronic Frontier Foundation’s 30th anniversary, we present EFF30 Fireside Chats. This limited series of livestreamed conversations looks back at some of the biggest issues in Internet history and their effects on the modern web.

To celebrate 30 years of defending online freedom, EFF invited author, security technologist, and EFF board member Bruce Schneier to discuss the future of the "Crypto Wars." This epic battle, raging since the 1990s, pits privacy and security advocates against the U.S. government in a fight over encryption. Governments around the world have grown ever more keen to weaken encryption and acquire backdoor access to private devices and Internet communications.

Killing the EARN IT Act and protecting encryption is top of EFF’s agenda.

EFF has adamantly defended encryption and its widespread use from the early days of Bernstein v. US Department of Justice, the case that established that software source code was speech protected by the First Amendment. Encryption paved the way for ecommerce, rising social movements around the world, and your ability to have a private conversation in an increasingly online world.

The Crypto Wars have continued right up to the present day with the EARN IT Act, a bill that would give unprecedented powers to law enforcement—including the ability to break into our private messages by creating encryption backdoors. Make no mistake: the fundamental security of our devices and the Internet are at stake.

It’s precisely why EFF invited Schneier to peer into encryption’s future at our first EFF30 Fireside Chat with EFF Executive Director Cindy Cohn, Senior Staff Attorney Andrew Crocker, and Senior Staff Technologist Erica Portnoy.

The chat opened with a brief history of “the crypto wars,” which began in the ’90s. The war is ongoing, said Schneier, but we have to win: “As long as [a smart phone or laptop] is in the hands of every single lawmaker, and world leader, and judge, and police officer, nuclear power plant operator, CEO, and voting official, it’s really important that we secure these--the communications and the storage. And that has to win.” 

[Embedded video: https://www.youtube.com/embed/h6fe63428XA. This embed will serve content from youtube.com.]

[Embedded video: https://www.youtube.com/embed/HnRqHCxp9kU. This embed will serve content from youtube.com.]

Thirty years later, this issue isn’t going away. “If you’re law enforcement, you want more and more authority,” said Schneier. The solution lies above law enforcement. “The need for security, for national security, trumps the need for law enforcement. We have to accept that, and implement that...This matters more than ever now.” 

[Embedded video: https://www.youtube.com/embed/LI1NJUYuV5E. This embed will serve content from youtube.com.]

It’s become practically rote for law enforcement to claim that they need lawful access to encrypted devices or messaging to solve crimes. But as Staff Attorney Andrew Crocker pointed out, they’ve had a very hard time coming up with “reliable figures and anything more than anecdotes” about this need. And Schneier added: “We need law enforcement to have good digital forensics...I think law enforcement is just poorly educated. So going to the phone is just what they think. If they had more sophisticated investigative tools, we could have all the security we wanted for our data and our conversations. And they would still be able to have all the crime-solving capability they need. I actually don’t think that there’s a conflict here.” 

[Embedded video: https://www.youtube.com/embed/a6Up9HQJgK4. This embed will serve content from youtube.com.]

Without encryption, free speech would be in serious danger. “Only if we have secure systems can we ensure that dissidents around the world can speak freely,” said Schneier, before quoting former FBI General Counsel James Baker, who suggested in 2019 that it was time for government authorities, including law enforcement, to embrace encryption, as “one of the few mechanisms the United States and its allies can use to more effectively protect themselves from existential cybersecurity threats.”  

[Embedded video: https://www.youtube.com/embed/BQk-60AVpBU. This embed will serve content from youtube.com.]

One common claim from law enforcement is that we can compromise by allowing backdoors into encrypted messages or devices. Though law enforcement calls this idea a ‘compromise,’ it’s not a compromise at all, said Crocker.

[Embedded video: https://www.youtube.com/embed/tkmFjuO2zPo. This embed will serve content from youtube.com.]

For one, Executive Director Cindy Cohn pointed out, any attempt at a ‘compromise’ will be used by the government as recognition that it’s possible. But it isn’t. “We have seen efforts by law enforcement to get well-meaning tech to try to solve this problem, but what they really want is just a talking point in front of Congress that it can be solved. We’ve seen some very well-meaning people be very misused in the political debate because they were game enough to try. Don’t be that person. Don’t fall for that. They’re not interested in your technical prowess--they’re interested in a talking point.” 

[Embedded video: https://www.youtube.com/embed/D-buDps9ShE. This embed will serve content from youtube.com.]

For the technologists and cryptologists, Staff Technologist Erica Portnoy had a similar message: “Don’t try to do it. Even if at the end you explain all the nuances and problems with it, some government official is going to look at it and say ‘this is perfect, let’s implement this now,’ even though you have eight paragraphs on why this won’t work in practice...don’t help them make their arguments.”  

The conversation finished up with a call to action from our Executive Director: we must stop the EARN IT Act, legislation that’s making its way through Congress during the pandemic, which threatens both encryption AND free speech. “Killing the EARN IT Act and protecting encryption is top of EFF’s agenda.” 

[Embedded video: https://www.youtube.com/embed/mNxDHYdYt5g. This embed will serve content from youtube.com.]

Thirty years after EFF’s founding, our mission to protect privacy and free speech online is more important than ever. 

Join us on Thursday, December 10 for the second in our series of EFF30 Fireside Chats when we explore Section 230 and the future of free speech. In this candid livestream, EFF Legal Director Dr. Corynne McSherry and Oregon Senator Ron Wyden, an original framer of Section 230, will discuss why 230 is under fire and what made it the most important law protecting freedom of expression and innovation on the Internet.

RSVP

Aaron Jue

Publisher or Platform? It Doesn't Matter.

2 months 4 weeks ago

“You have to choose: are you a platform or a publisher?”

It’s the question that makes us pull out our hair and roll our eyes. It’s the question that makes us want to shout from the rooftops: “IT DOESN’T MATTER. YOU DON’T HAVE TO CHOOSE.”

We’ll say it plainly here: there is no legal significance to labeling an online service a “platform” as opposed to a “publisher.” Yes. That’s right. There is no legal significance to labeling an online service a “platform.” Nor does the law treat online services differently based on their ideological “neutrality” or lack thereof.

There is no common law or statutory significance to the word “platform.” It is not found in Section 230 at all.

Some of the “You’re a platform!” mania is likely the fault of the companies themselves. Some have used the word “platform” to distinguish themselves, as services that primarily publish user-generated content, from those that primarily publish their own content and/or actively edit and curate the content of others. They self-identified as “platforms” mostly to justify what was perceived as their hands-off approach to content moderation, particularly with respect to decisions not to remove hateful and harassing speech from their sites.

It’s fair to call out the big social media companies for holding themselves out as purely passive conduits (which is what some seem to mean when they call themselves “platforms”) when they actually moderate a ton of user content every day, and pretty much always have. Our work on the Santa Clara Principles reflects the human rights implications of content moderation, even though we support the First Amendment right of intermediaries to curate their sites.

But as a legal cudgel against perceived political bias, which is how the “admit it you’re a publisher not a platform” screed has most frequently been used, it is a meaningless distinction.

When politicians like Sen. Ted Cruz demand that Twitter identify itself as either a “publisher” or a “platform,” they usually make this false distinction in the entirely erroneous context of 47 U.S.C. § 230, the provision of U.S. law that grants broad immunity from liability to online intermediaries when such liability would be based on the speech of others. Rather than enshrine some significance between online “platforms” and “publishers,” Section 230 intentionally nullifies any distinction that might have existed. Contrary to popular misconception, immunity is not a reward for intermediaries that choose the path of total neutrality (whatever that means); nor did Congress enact Section 230 with an expectation that Internet services were or would become completely neutral. Section 230 explicitly grants immunity to all intermediaries, both the “neutral” and the proudly biased. It treats them exactly the same, and does so on purpose.

That’s a feature of Section 230, not a bug.

So online services did not self-identify as “platforms” to mythically gain Section 230 protection—they had that already.

Unlike “publisher” (more on that below), there is no common law or statutory significance to the word “platform.” It is not found in Section 230 at all. The word “platform” doesn’t even appear in any published Section 230 judicial opinions until 2004, and there and in most subsequent cases, the court simply quoted the descriptive language from the parties’ briefs in which it was used mostly as a synonym for “website.” Starting around 2010, courts did start using the word “platform” to describe internet services through which users interacted, much like courts used the terms “portal” or “website” previously and thereafter.

Moreover, regardless of Section 230, it is completely common to be both a “publisher” and a “platform” at the same time—a publisher of your own content and a platform for the content of others. Newspapers have historically done this and continue to do so—a publisher of the articles they write themselves and a platform for the content they publish but did not write themselves—letters to the editor, wire service articles, advertisements, op-eds, etc. And online publications and websites continue to do so now, mixing their own content with that generated by users.

In fact, it is really difficult to find any online service close to the user end (that is, services like social media and email clients with which the user directly and openly interacts) that is solely a conduit for user speech, without any speech of its own. One doesn’t really find pure conduits like this until quite deep in the infrastructure layer of the Internet—like ISPs, domain name services, content delivery networks (CDNs), and email servers. And even at that depth, takedowns are not uncommon.

The specious publisher-platform argument is also historically off-base. There is some historical legal distinction between “publishers” and more passive “distributors” of others’ speech, and “distributors” is perhaps what those who yearn for “neutral platforms” are referring to. But “distributors” was just a subcategory of “publishers,” and both bore liability.

So, what is the legal difference between “publishers” and “distributors”?

One is always a “publisher” of their own words, the stuff they write and say themselves. That is completely uncontroversial. The controversy and confusion arise around republication liability, the idea that you are legally a “publisher” of all statements of others that you republish even if you accurately quote the original speaker and attribute the statement to them. So, if you accurately and directly quote someone in an article you have written, and the quoted statements defame someone, you can be liable for defamation for republishing those statements. This applies to any content in your publication that you did not write yourself, like letters to the editor, advertisements, outside editorial, wire service stories, etc. Legally, you are responsible for all of these statements as if they were your own creations.

This legal concept of republication liability is an old concept inherited from English common law. But it appears that up until 1824, accurate attribution was a full defense.

A subcategory of these “publishers” are “distributors.” Since at least 1837, republication liability has extended also to mere distributors of speech—the 1837 case Day v. Bream dealt with a courier who had delivered a box of libelous handbills—if it could be proved that they knew or should have known about the illegal or tortious content. This “distributor” liability was widely applied to newsstands, booksellers, and libraries. The American version of this knowledge-based “distributor” liability is commonly associated with the US Supreme Court’s 1959 decision in Smith v. California, which found that a bookseller could not be convicted of peddling obscene material unless it could be proven that the bookseller knew of the obscene contents of the book. Outside of criminal law, US courts imposed liability on distributors who simply should have known that they were distributing actionable content.

So “distributor liability” applied to those like booksellers, newsstands, and couriers who merely served as fairly passive conduits for others’ speech, and “publisher liability” applied to those who engaged with the other person’s speech in some way, whether by editing it, modifying it, affirmatively endorsing it, or including it as part of larger original reporting. For the former group, the passive distributors, there could be no liability unless they knew, or should have known, of the libelous material. For the latter group, the publishers were treated the same as the original speakers they quoted.

Because a passive distributor was treated a bit better than a publisher, the law actually disincentivized editing, curating, or reviewing content for any reason.

One of the primary purposes of Section 230 was to remove this disincentive and encourage online intermediaries to actively curate and edit their sites without being so penalized. Former Rep. Chris Cox, one of the co-authors of Section 230, recalls finding it “surpassingly stupid” that before Section 230, courts effectively disincentivized platforms from engaging in any speech moderation. And Congress recognized that even the notice-based liability that attached to distributors created the prospect of the “heckler’s veto,” whereby one who wants the speech censored tells the distributor about it and the distributor removes the speech without devoting any resources to investigating whether the objection had any merit. As we have written (PDF), notice-based liability systems are subject to great abuse and have serious human rights implications.

Congress resolved this problem by getting rid of republication liability altogether. That is the significance of the phrase “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” And the very first federal appellate court to interpret Section 230 made clear that Section 230 got rid of both republisher liability and its subset, distributor liability.

So the very purpose of section 230 was to eliminate any distinction between those who actively select, curate, and edit the speech before distributing it and those who are merely passive conduits for it. For the purpose of Section 230, the only relevant distinction is between an “interactive computer service” and an “information content provider.”

But what about “neutrality”? It’s a bit more confusing, but there has never been any requirement that publishers be politically neutral or otherwise speak, edit, or curate without such decisions reflecting their own beliefs. And “neutrality” certainly doesn’t mean “ideological neutrality”—indeed, the First Amendment protects the right of any speaker to express their ideological and political viewpoints, whether through their own speech, or by choosing to serve as an intermediary for the ideological speech of others—that is, by actively curating the user-generated speech on their sites.

With respect to republication liability, a few courts recognize a “neutral reportage” privilege to republish statements made by reliable speakers about a public controversy. In that context, “neutral” means that the statement is reported without any implication that it is true; the report simply conveys that the statement was made. But the neutral reportage privilege has mostly been rejected by U.S. courts.

With respect to Section 230, the Ninth Circuit gave the provision of “neutral tools” to users as one example of engagement with user speech that would preserve Section 230 protection. That is, one did not become the creator of content by providing “neutral tools” to help a speaker speak. The court did not explain what tools are in fact “neutral,” and there remains confusion as to exactly what “neutral tools” are. But it seems the court meant that the service did not materially contribute to the illegality of the speech. Merely having a viewpoint or taking sides in a controversy would thus not negate Section 230 protection.

Section 230 aside, both “publishers” and “distributors” are liable only when the speech they disseminate is independently illegal or tortious. The mere act of editing or curating your users’ speech is not actionable.

To the contrary, curation and editing is itself speech protected by the First Amendment, as the Supreme Court has held with respect to platforms ranging from newspapers to St. Patrick’s Day parades. And at the center of this constitutional protection is the right to express one’s political views through the curation of others’ speech. As the Supreme Court stated, “the expression of editorial opinion […] lies at the heart of First Amendment protection.”

David Greene

California Legislation to Make Significant Investments in Public Broadband

2 months 4 weeks ago

The California Legislature finished the 2020 session without doing anything to address broadband access in response to the pandemic. While the California Senate sent much-needed legislation to the Assembly, it was not allowed to move forward from there. That meant no help for the more than 1 million students lacking sufficient Internet access to engage in remote education, or for the countless other Californians relying on their home access to engage in remote work. This year, Senator Lena Gonzalez, the original author of S.B. 1130, EFF’s sponsored legislation to convert the California Internet infrastructure program, is back with a new bill to take action, after close work with many of her colleagues. 

This new legislation, S.B. 4 - Broadband for All, takes a different approach than the original S.B. 1130 by creating a new program that will help local governments build their own broadband options. In fact, it enables local governments to make a massive billion-dollar investment in public infrastructure by unlocking the bond market for local communities. This new bond program would enable local governments to secure long-term, low-interest financing in the same way electricity was paid for in deep rural markets. Those investments, designed to give long terms—multiple decades—to repay the bonds, will be in fiber optic infrastructure. This makes the most sense, as fiber optic is the only data infrastructure proven to last that long and remain useful as an asset.

California’s current law (known as the California Advanced Services Fund, or CASF) has failed to meet the digital divide challenge. It discriminates against local community bidders to build broadband infrastructure, favors spending state money on slow, outdated infrastructure, does not cover all rural and low-income Californians, and has been underfunded. A recent study found that California, despite having CASF already, is the state with the largest number of students in the United States who lack sufficient access to broadband. This lays the weaknesses of CASF bare: it is a grant program investing in obsolete infrastructure, at 100% cost to the state, that collects very little actual money to spend.

S.B. 4 remedies this problem by making improvements to the grant program that are more modest than S.B. 1130’s goals, but it updates the way the state collects revenue for the program with an “access line” charge. This revenue amendment is critical, because the state’s original way of collecting money has been hindered by federal deregulation of the telecommunications industry and has not kept pace with the way we use communications infrastructure. Absent this change, it’s likely that funding for this program will continue to decline despite the fact that a growing number of Californians depend on (and pay for) communications services. Arguably the most consequential change S.B. 4 makes is eliminating the expiration date for revenue collection, allowing the state to do more to permanently close the digital divide with large-scale investments every handful of years until every Californian has access to the Internet service they need.

Other changes the bill makes are as follows:

  1. Stabilize and expand California’s Internet infrastructure program (CASF) and allow the state to invest hundreds of millions of dollars in broadband infrastructure every year until the digital divide is closed.
  2. Enable local governments to bond finance more than $1 billion with state support to secure long-term low-interest funding to build local infrastructure.
  3. Build broadband networks that meet the goals of the Governor’s Executive Order with an emphasis on scalability to ensure future proofing in infrastructure financed by the state. This ensures access is built to last for the 21st century.
  4. Direct support towards low-income neighborhoods that lack broadband access.
  5. Expand eligibility for state support to all rural Californian communities.

Now it is time to rally Sacramento to get these changes made into law and start eliminating the digital divide.  

If you represent a California business, non-profit, local elected official, or anchor institution, please consider signing on in support of S.B. 4. EFF will continue to collect supporters on a letter to show the legislature how wide and deep current support is for this legislation. Dozens of organizations and elected officials have endorsed this bill, with more to come. Join us!

Join us!
Support Broadband for All in California

Ernesto Falcon

Podcast Episode: You Bought It, But Do You Own It?

3 months ago
Episode 006 of EFF’s How to Fix the Internet

Chris Lewis joins EFF hosts Cindy Cohn and Danny O'Brien as they discuss how our access to knowledge is increasingly governed by click-wrap agreements that prevent users from ever owning things like books and music, and how this undermines the legal doctrine of “first sale” – which states that once you buy a copyrighted work, it’s yours to resell or give away as you choose. They talk through the ramifications of this shift on society, and also start to paint a brighter future for how the digital world would thrive if we safeguard digital first sale.

In this episode you’ll learn about:

  • The legal doctrine of first sale—in which owners of a copyrighted work can resell it or give it away as they choose—and why copyright maximalists have fought it for so long;
  • The ReDigi case, in which a federal court held that the ReDigi music service, which allows music fans to store and resell music they buy from iTunes, violated copyright law—and why that set us down the wrong path;
  • The need for a movement that can help champion digital first sale and access to knowledge more generally;
  • How digital first sale connects to issues of access to knowledge, and how this provides a nexus to issues of societal equity;
  • Why the shift to using terms of service to govern access to content such as music and books has meant that our access to knowledge is intermediated by contract law, which is often impenetrable to average users;
  • How not having a strong right of digital first sale undermines libraries, which have long benefited from bequests and donations;
  • How getting first sale right in the digital world will help to promote equitable access to knowledge and create a more accessible digital world.

Christopher Lewis is President and CEO at Public Knowledge. Prior to being elevated to President and CEO, Chris served as PK's Vice President from 2012 to 2019, leading the organization's day-to-day advocacy and political strategy on Capitol Hill and at government agencies. During that time he was also a local elected official, serving two terms on the Alexandria City Public School Board. Chris serves on the Board of Directors for the Institute for Local Self Reliance and represents Public Knowledge on the Board of the Broadband Internet Technical Advisory Group (BITAG).

Before joining Public Knowledge, Chris worked in the Federal Communications Commission Office of Legislative Affairs, including as its Deputy Director. He is a former U.S. Senate staffer for the late Sen. Edward M. Kennedy and has over 18 years of political organizing and advocacy experience, including serving as Virginia State Director at GenerationEngage and as the North Carolina Field Director for Barack Obama's 2008 Presidential Campaign, among other roles throughout the campaign. Chris graduated from Harvard University with a Bachelor's degree in Government and lives in Alexandria, VA, where he continues to volunteer and advocate on local civic issues. You can find Chris on Twitter at @ChrisJ_Lewis

Please subscribe to How to Fix the Internet via RSS, Stitcher, TuneIn, Apple Podcasts, Google Podcasts, Spotify, or your podcast player of choice. You can also find the MP3 of this episode on the Internet Archive, and embedded below.

[Embedded audio player: https://archive.org/embed/eff-podcast-episode-6-digital-1st-sale. This embed will serve content from archive.org.]

 

If you have any feedback on this episode, please email podcast@eff.org.

Below, you’ll find legal resources – including links to important cases, books, and briefs discussed in the podcast – as well as a full transcript of the audio.

Resources

Legal Cases

Digital First Sale

Abuses and Failures of Digital Rights Management (DRM) and End User Licensing Agreements (EULAs)

Other Resources

Transcript of Episode 006: You Bought It, But Do You Own It?

Danny O'Brien:

Welcome to How to Fix the Internet with the Electronic Frontier Foundation, the podcast that explores some of the biggest problems we face online right now, problems whose source and solution is often buried in the obscure twists of technological development, societal change, and the subtle details of internet law.

Cindy Cohn:

Hi, everyone. I'm Cindy Cohn, and I am the executive director of the Electronic Frontier Foundation. For these purposes, I'm a lawyer and now I'm apparently also a podcast host.

Danny O'Brien:

I feel podcast hosts should really be higher up on any of those lists. Hi, I'm Danny O'Brien, and I also work at EFF. Welcome to How To Fix the Internet, where in a world full of problems, we take on a few of the digital kind and point the way forward.

Cindy Cohn:

Though this week's episode is really about how the online world and the offline world cross. And Danny, if it's okay for me to rant a little here?

Danny O'Brien:

I can think of no better place.

Cindy Cohn:

A lot of what's wrong with debating how to preserve civil liberties in the online world comes from approaching it as if it's somewhere magically apart from the offline world, but we live in both of those worlds, and they both overlay our lives. So, it doesn't make so much sense to say that we have rights in one part of our life, but not in the other when we're essentially doing the same thing.

Cindy Cohn:

As things move, it's even difficult to figure out where one ends and the other begins, yet too often, powerful forces try to use this shift to get things in the online world that they would never even ask for, much less get in the offline world.

Danny O'Brien:

And I think the topic of this podcast, first sale, is a great example of that. First sale is the American legal doctrine which, as I'm sure we'll hear from our guest, has its roots firmly in the analog world. It was created to protect the intuitive idea that if you buy a book or another creative work, it's yours. You can lend or resell it. You can read it to your friends.

Danny O'Brien:

Without first sale, rights holders or patent owners could claim that they should control some of the use of those works, including banning you from giving them away, selling them at a garage sale, or even just showing them to someone else.

Cindy Cohn:

So, like the best legal principles, the first-sale doctrine is an embodiment of common sense, when you buy something, you own it. But in the digital space, physical sales have been replaced secretly with these click-through licenses and other tiny little words at the bottom of a website that no one reads. But what those words are doing is they're changing what's happening from a sale to a license that's limited in many ways. And what that means is that you're limited in what you can do with the software or other media that you've bought.

Danny O'Brien:

Worse, corporations have tried to twist things even further. So a lot of this is about making sure that we don't lose a right that we've always had before the net came along.

Cindy Cohn:

Exactly. And to help us understand that we've asked Chris Lewis to join us. Chris is the president and CEO of Public Knowledge, the D.C. based public interest group that works at the intersection of copyright, telecommunications, and internet law. We love working with PK at EFF. We really consider you some of our chief partners. Back to Chris though, before that Chris was deputy director of the FCC's Office of Legislative Affairs, and he's a graduate of Harvard University. Welcome, Chris.

Chris Lewis:

Thanks Cindy. It's great to be with you guys.

Cindy Cohn:

Chris, I think of you as someone who's able to look deeply at complex problems, nerdy stuff like copyright online, net neutrality and our topic today, digital first sale. And what I love about working with you is that you get the nuance of the issues, but also the political practicalities and the storytelling about how we fix it. And also, it's just really always fun to work together. I'm also delighted that you joined us on the EFF advisory board.

Chris Lewis:

It's an honor to be invited to be on the advisory board, though. We love working with you guys as well, and it's a great bi-coastal partnership for the public interest.

Cindy Cohn:

So, how did you get interested in digital first sale? I think folks like us are such a rare breed, people passionate about the side of intellectual property that actually stands with the users and not about making sure rock stars can buy their second island. How did you get passionate about this?

Chris Lewis:

Great question, and it alludes to what you were talking about earlier about common sense. I can recall the questions I had in the era that I grew up in. I was a teenager in the '90s. I was in college in the late '90s and early 2000s. And during that time, my favorite uncle, my uncle Joey, who's a talented musician has this awesome record collection, music collection. And he could have given that to me, but I'd probably have to wait for him to pass away in order for him to relinquish it.

Chris Lewis:

Instead, he digitized his own library of music and chose to share it with me. And that was one of the first examples of where I was uncertain if the digital rights laws had been updated for what is a normal handing off or passing off or downstream transfer of goods. When I got to Public Knowledge, as you said, I came from the world of telecom policy. I was at the Federal Communications Commission. And unlike you, Cindy, I'm not a lawyer. I'm an organizer. I'm an organizer and an advocate.

Chris Lewis:

And for me, it's stories like that that are important for people to think about in their own lives when they think about how the law needs to be updated for the digital age. I have to admit, when I got to Public Knowledge, I was introduced to copyright law by some expert lawyers. It's why I love working there. It's why we love working with EFF. We have that in common: lawyers who get into those details of how the law can be adjusted for what are just common sense expectations from average consumers, that you should be able to bequeath your fantastic record collection to your niece or your nephew.

Chris Lewis:

That you should be able to sell a book that you got, secondhand. And yes, there are challenges that the technology poses, but it doesn't mean they can't be overcome. And how we overcome that I think is an exciting topic.

Danny O'Brien:

Let me just pick apart exactly what the theory is around what rights holders want to be able to do in these situations. Because I think for anyone who didn't have one foot in the intellectual property space, it would seem crazy that if somebody sold you an artifact, they would be able to require you not to resell it or to limit what you do with it. So what's the legal theory against having something like first sale?

Chris Lewis:

I think the legal theory is that ... It's hard sometimes for me to think about it from the opposite side, but there is a valid ... I think to be fair, there's a valid argument for the importance of protecting the ability of creative artists to be compensated and to recoup some value for the work that they create, and that's important. But what we often miss, or what folks who don't like the idea of a digital first sale miss, is that there's a balance to copyright law. That it has dual purposes: not just the protection of those artists and their ability to make a living, to have value in their creativity, but also to continue to promote the useful arts and sciences. That's probably a direct quote.

Chris Lewis:

So when first sale was created in the early 1900s, it came out of court cases that recognized that when you buy something, there's a common sense expectation that you can do certain things with it that are common sense. And that balance was struck back then. The first court case that I'm aware of was the Bobbs-Merrill case against Macy's. Macy's had a book that Bobbs-Merrill published. And Macy's wanted to sell it for 11 cents less than the Bobbs-Merrill people wanted.

Chris Lewis:

And they actually printed on the cover of the book that because of the rights owner's wishes, they did not want that book sold for less than a dollar. Macy's was sued for trying to sell it for 89 cents. And the court ruled that once Macy's bought that product, they had the right to resell it. They had the right beyond that first sale to give it away if they want to. And that is common sense, and that does not infringe on the ability for the creator to make a profit off of that first sale. And so that's an important balance.

Danny O'Brien:

So in a way what this is, is it's a way of preventing riders on artifacts. So the idea is that if you have physically bought something, you can't put ifs and buts and controls on that. And I guess the reason why it gets caught up with intellectual property is a lot of intellectual property law, like patents and copyright, is about controlling various kinds of use, even of copies of your material.

Chris Lewis:

Yes, that's right. There are protections for the original creator of the work as well, but this right, the right of first sale, in non-lawyer speak I just like to say: if you bought it, you own it. This right is critically important for basic capitalism, for goods and services moving.

Cindy Cohn:

Let me see if I can say, the thing about the digital space ... Let me try Chris, and correct me if I'm not reading your mind correctly. But in the digital space, one of the things we've been freed from is the artificial constraints of the physical. So this is the Jeffersonian idea. This is an old idea in copyright law that when I give you something, I still have it in the digital world. And Jefferson likened it to a candle. Like when I take a light off of your candle, you still have your candle lit. And he said that that was one of the great things about ethereal works, copyrighted works.

Cindy Cohn:

What happens now, the things that used to be ethereal works like music or software, they used to be locked in a physical thing. And so we would treat them like a physical thing where once you had one CD, that was the one CD you had. But digital works, you can copy multiple times, and we don't have the constraints of the physical world on that. And that should free us up. If you're passionate about something, you want to share it with the world, whether that's the next band or the cool new tool you found, or those other things.

Cindy Cohn:

And the first-sale doctrine protects us in the real world so that we can share the things we love, like your uncle sharing his love of music with you. We just want that same thing to be able to happen in the digital world.

Chris Lewis:

Certainly there's a difference between that physical good and a digital good and how easy it is to copy it. To me, that comes with the power of technology. Technology is more powerful. Digital technology is more powerful in certain ways. Copying is one. But what we also miss when we move from the physical to the digital world is that we're also dealing with information, and we're dealing with knowledge, for lack of a better word. And that good is no longer just a hard good. It is access to information. It is access to knowledge.

Chris Lewis:

And so if we take the values of first sale and transfer them into the digital age, you have to account for the power of the technology, and you have to account for the knowledge that it transmits. And that power to transmit so much knowledge, I think, increases the importance of protecting the principles around digital first sale, but it also makes it more challenging.

Cindy Cohn:

I think that's right. I think that what's going on here is that the folks in the content industry who didn't like first sale in the first place are trying to use this shift to get rid of it. And what we're trying to say is no, those values are really important, whether something's physical or digital. And if the digital world presents some more challenges, then we should just address those challenges and not just throw away the values of your uncle sharing his music with you, or someone being able to find some software at a garage sale and use it.

Cindy Cohn:

So those are the kinds of things that they're just as important in the digital world as they were in the analog world. And it's one of those areas where we shouldn't be drawing a distinction between the two, at least in terms of the values we're trying to protect. We may protect them in different ways, but the values are still strong.

Danny O'Brien:

The limits of first sale seem pretty clear and intuitive in the physical world. And really, they reflect what we might imagine we could do with an object that we owned. But I think isn't it true that in the digital environment, those limits have faded away. And actually if I get a copy of something from somebody, not only can I do a lot of things with it, including not just digitizing and sharing it with my favorite nephew, but with hundreds of thousands of people. It makes it a little bit harder to understand what the limits and what we can do with a product that we own in the digital space.

Chris Lewis:

Yeah, that's right. And it also reminds us of the responsibilities that come with the power of new technology. So when the photocopier was created, yes, a photocopier or a Xerox machine could be used to pirate copyrighted works. But we didn't outlaw photocopiers. We didn't outlaw Xerox machines just because there is power in the technology. What we did was we made sure that we had quality enforcement and also expectations in society of the responsibility that comes with the power of that technology.

Chris Lewis:

And I think that's critically important as we move into a digital space. There are multiple examples of new technologies, before and after our move into the world of the internet, where technologies were challenged. And the understanding was that there is a balance of responsibility in the use of a tool that allows you to do things that should be legal, but that could also be used for things that could be infringing. And I think that's important for us to highlight.

Danny O'Brien:

So what are some examples of attempts by rights holders to place controls on digital artifacts that violate that idea of digital first sale, and are a step too far in the power of those rights holders?

Chris Lewis:

Well, I think probably the best known recent case, from 2013, was the ReDigi case. ReDigi was a service for reselling digital music files. And that case, I think, got this balance of power wrong. The idea was that if you're selling a file, a piece of music, a recording, then as long as you ensure that you sell it to one person and you eliminate ownership on your end, just as you would do with a physical good, that should be legal. That was not upheld in the ReDigi case. ReDigi lost. And a service that could have provided for sales and purchases of used digital goods, if that's a term we could use, was basically outlawed.
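To make the mechanics concrete, here is a minimal, purely illustrative sketch of the "sell it to one person, eliminate your own copy" model described above: the sale only completes if the seller's copy is removed, so exactly one usable copy exists afterward, as with a used CD. The names here are hypothetical and do not come from ReDigi or any real service.

```python
class Collection:
    """A user's collection of owned digital goods (hypothetical model)."""
    def __init__(self):
        self.items = set()

def resell(seller: Collection, buyer: Collection, item: str) -> None:
    """Transfer `item` to exactly one buyer, eliminating the seller's copy,
    mirroring first sale of a physical good."""
    if item not in seller.items:
        raise ValueError("seller does not own this item")
    seller.items.remove(item)  # eliminate ownership on the seller's end...
    buyer.items.add(item)      # ...so only the buyer's copy remains

alice, bob = Collection(), Collection()
alice.items.add("song.mp3")
resell(alice, bob, "song.mp3")
assert "song.mp3" in bob.items and "song.mp3" not in alice.items
```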

Chris Lewis:

This is dangerous. It's dangerous for a society's ability to share ideas. It's not just an academic exercise that we're talking about here. More and more of the world's goods are being moved into a digital-only space. And it's really popular right now, in the middle of this pandemic, to talk about the digital divide with broadband: low-income communities or rural communities may be left out of having access to broadband. But once they get access to broadband, once we all get access to broadband, that's not the end of the digital divide.

Chris Lewis:

Folks could still be excluded from knowledge. That's another digital divide: if access is cost-prohibitive because of the new structures around licensing, or if people are limited from sharing information with each other in commonly accepted ways, like selling a good or bequeathing a good. So I think that ReDigi case really set us off on the wrong path, and it highlights some of the harms that we can have if that idea is duplicated with other technologies.

Danny O'Brien:

To put a point on this, I grew up reading very worn secondhand paper books, mostly science fiction that I bought at secondhand bookstores. And right now as a grownup, I have one of those little libraries outside, and there's a huge traffic in my neighborhood of people leaving books and taking books. And I think what you're describing here is that both of those acts become, if not actively illegal, very difficult to do because there's no way that I can give someone a digital artifact and demonstrate that I destroyed the other copy that I might theoretically have when I gave it to the other person. Is that right?

Chris Lewis:

Exactly. I know Cindy knows this, and Danny, you may not. I spent six years on my local school board here in Alexandria, Virginia, and lived two blocks from the public library. In our school division, which had high rates of poverty, the public library was a place for students to go, not only to get access to the internet if they did not have it, but also to get access to books and to knowledge. And all of those books, whether you're talking about secondhand stores or the public library, which really provides equity in access to knowledge, if those books are not available in a digital space where our students in Alexandria or around the country can get access to them, we're really going to have an increasing digital divide.

Cindy Cohn:

I think that is, to me, the most dangerous piece of this. Of course, I care about people being able to get their brother's record collection or buy a piece of software at a garage sale and still be able to use it. But really this is about knowledge. And it brings up one of the big fights that both Public Knowledge and EFF are working on today: the attack on the Internet Archive, which is trying to make access to all the world's knowledge available to all the world's people.

Cindy Cohn:

There are specific things about that battle that are unique to the Internet Archive, but what's really going on, in a broader scope, is an attempt to limit digital lending in a way that will make it very difficult for people without access to power and complicated systems to have knowledge. And we're already seeing this with school kids who can't afford the books, or the license on the books. I grew up in a pretty remote rural area, and people shared textbooks and other books just because it was hard to get them from the publishers.

Cindy Cohn:

And I think we're headed back to that time when the internet should have freed us from that time. It's exactly backwards.

Chris Lewis:

That's right. Who's going to get to use the power of the technology? Is it going to be limited to a few stakeholders who are going to control that knowledge and information through far more expensive licensing schemes and setups than we had in the old book and public library markets, or are we going to use that technology to benefit everyone?

Cindy Cohn:

And the practical impact of turning everything into a service is that everybody's got to have access to credit cards or licensing schemes, or ongoing arrangements that are pretty hard for people who are already at the margins: people who already have to go to the library to get access, or hang out outside of McDonald's to get access. If you need a service and an account, and probably a credit card, to be able to continue to have access to stuff that you already bought, that access goes away if you don't keep paying monthly.

Cindy Cohn:

That's a huge implication of this shift. And again, the first-sale doctrine in the offline world means that once you get your hands on that book, whether you bought it new, bought it from the first person who bought it or the third person who bought it, or checked it out of the library, that knowledge stays with you. Whereas with these services, things can go away. And as we've learned, a lot of the services that are offering these kinds of things are not operations that exist forever. So the business goes out of business, and the next thing you know, you've lost access to all the information that you have been paying monthly to get access to.

Cindy Cohn:

I think that leaves people really at the mercy of hoping that the entity that they signed up for to get access to knowledge survives bankruptcy. So there's a lot of implications in a bunch of different directions about trying to remove the first sale rights in the digital world.

Chris Lewis:

Right. I think we've seen a retrenching of how people understand ownership, whether you've bought something or whether you're renting it or leasing it. And I think unfortunately, on a lot of digital platforms, we're seeing that twisted, where someone tells you that you're buying a book or buying access to a TV show, and you're really not buying that good. What you're really doing is leasing, or renting access.

Danny O'Brien:

I think we've talked a bit about intuitions here. And it's definitely the point at which people realize that they've lost something: when their intuition about ownership hits the hard wall of what the business thinks or claims they've agreed to. Just a couple of examples from both the far ancient past of the digital world and more recently. I remember people's shock when, due to a licensing disagreement, Amazon started removing copies of George Orwell books that people had purchased and believed were legitimate copies.

Danny O'Brien:

Amazon decided it wasn't a legitimate copy, so they just deleted it from people's library collections. It's really odd to have 1984 disappear in that way. And then more recently, a little more trivially, people who were playing 'Fortnite' suddenly found that, because there was an argument between Apple and Fortnite's publisher, they were stopped from being able to play it going forward. We've talked a little bit about how these ideas of ownership have evaporated, but not too much about the click-through contracts and the licenses that are used to enforce this.

Danny O'Brien:

It is crazy that we click through these multi-page documents that nobody ever reads, but how did that get embedded in the law? How can it be that I can sign something that I didn't read and have it still be enforced against me?

Chris Lewis:

Right. Contract law is a powerful thing, right? And these pop-up terms of service, or EULAs, the end user license agreements, are long. And in a world where everything is digital, you're having to see one, read one, and understand the legal implications of one every day, oftentimes multiple times a day. So I understand the importance of making sure that there are notices and best practices on online services, but at the same time, we need to find ways to be very clear with the consumer about what they're actually purchasing.

Chris Lewis:

And even more so, we need to find and promote ways to rebalance the power of the consumer to actually have control over the things that they've bought. And if we don't, it's going to have a negative impact on access to knowledge. It's going to have a negative impact on the marketplace for consumers as well. We saw this recently in the digital marketplace around tickets and ticketing. Have you been following the BOSS Act that was introduced in Congress?

Cindy Cohn:

No. I suspect my colleagues have been, but I have not.

Chris Lewis:

The quick breakdown is that it is called BOSS, yes, after someone's love of Bruce Springsteen. Congressman Bill Pascrell of New Jersey and Senator Blumenthal introduced it because of Ticketmaster's dominance over consumers in the ticketing marketplace. Ticketmaster not only controls the venues, and therefore controls the services that are offered, but it controls the goods, the ticketing itself. And there are secondhand ticketing sites, but they're often squeezed out because of intellectual property and other agreements around what you can do with a ticket once you purchase it.

Chris Lewis:

And so that's creeping into the digital marketplace for tickets, and it reduces the competition for customers that would lower their prices. It also eliminates the ability to have a secondhand marketplace, just like we talked about with books, where you could sell a ticket if you all of a sudden can't go to a show or a sporting event, or hand it off to someone else. If you can't go, sometimes you give it to your kids, right? So it comes up again and again as everything gets digitized: these agreements, these terms and conditions, need to be clear, but they also need to be fair. And if we need to enact some reforms and laws to achieve that, I think that's a good thing.

Cindy Cohn:

It happened first with some airline tickets. I'm old enough to remember when you could sell or give your airline ticket to someone else. Those days are long gone, and it loses consumers millions and millions of dollars every year, because they end up having to pay fees if they want to change a ticket, or they just let the ticket lie. So there's this love of contract as a way to organize society, without the recognition that it really empowers the powerful and disempowers the less powerful. And in this scenario, the less powerful is you, dear consumer, and the powerful are the companies that are selling you things.

Cindy Cohn:

We had a period in time when everything was contracts, and we moved away from that. And I think it's time to think about that again: putting in a baseline of consumer protection that prevents contracts that really are not fair. And recognizing the power imbalance between you as ... These are called contracts of adhesion: a contract where you don't really get to sit down and negotiate it. You just get handed it, and you either do the thing or you don't do the thing, but you don't really have any bargaining power.

Cindy Cohn:

They're different from the kinds of contracts where you sit down with somebody and you negotiate: "I'll give you this if you give me that." That might happen business to business, or even person to person. Recognizing those differences, and empowering people to really have a baseline of fairness and be able to attack unfairness, is something that is just desperately needed. We're getting a little away from ... What the first-sale doctrine was in the analog world was a way to not let contract law always be the only thing that mattered in the sale of an analog good.

Cindy Cohn:

That's why, when we think about digital first sale, we need something as well that protects us against a world where the only thing that matters is the so-called contract, which isn't really a contract in the way that I think of it, where two people have a meeting of the minds about what's going to happen between them. We need to have some baseline. And there are a couple of ways to do that in the law. One is the way California does it: a bunch of baseline consumer protection laws that you can't go below on certain things.

Cindy Cohn:

The other way to do it is to create independent doctrines, like the first-sale doctrine, that just say you can't contract away somebody's ability to own something when they bought it. And the third way is to really empower consumers to be able to say that something is unfair, at a level that they can't right now; those claims don't really succeed today. So those are three things. I'm the lawyer who litigates, so I think of: well, what could you do in court?

Chris Lewis:

In the same way you're thinking from a lawyer's perspective, I think from an advocate's perspective: how do we help the public demand that the power remains in their hands? And I think we do that through stories. That's why I tell the story of my uncle and his music. That's why we tell the story of what consumers are facing with tickets, and whether, having bought them, they have the right to do what they want with them. These stories are important because that's when it makes sense to an average consumer.

Chris Lewis:

And we have to continue to share these, even if we only get some piecemeal fixes. You talked about the Internet Archive situation and public libraries; Public Knowledge is advocating for a very simple law in Congress to just say that it is not illegal for a library to do what libraries have always done: they bought a book, they own it, they can scan it, and they can lend it out to one customer at a time. Everyone knows their public library. Everyone understands it as a concept. And we need to take these easy-to-understand examples to promote how digital first sale can work across the entire marketplace.

Danny O'Brien:

Yes. Listening to all of these examples, I do feel that we're slipping into losing these rights on an hour-by-hour, minute-by-minute basis. I can see how we get there. For instance, when you're talking about ticket sales, there's a sense in which the biggest concern people have on a day-to-day basis when they buy a ticket is that bots come and buy the tickets first and then scalp people, essentially. So you can understand why the companies might tie those tickets to a particular individual, but I've never thought about the fact that that actually prevents me from being able to give a ticket to a friend. And it gives the company itself a huge monopoly power over that.

Danny O'Brien:

And similarly, even in this conversation, I've been sitting and thinking: these days the books I buy, I often buy as eBooks, but I want to be able to leave my library, my collection, either to my family when I die, or to the public system, which is how the great libraries were always constructed, by people leaving their private libraries to it. And we have to have that as a possibility in the future. With the current situation with eBooks, what happens at the end? Do they just evaporate? Do they have to be buried with me in my grave like an Egyptian pharaoh? It does feel like we have to create some rules about this before we find ourselves in a much more impoverished situation.

Chris Lewis:

And I think there are three ways to attack this that we should keep in mind, because they can work together. Cindy, you noted legal changes: changes in the law, and changes in the understanding of how the law is applied. That's one very important way to set clear expectations and clear protections, and I'm sure there are other specific fixes that we can come up with; I mentioned the one about public libraries. But the public library law that we're suggesting wouldn't even be possible if we weren't also attacking this from the technological perspective. The good old "let's nerd harder."

Chris Lewis:

So if you law harder and you nerd harder, we can come up with a lot of great solutions here. Controlled digital lending, the ability to lend out a book and then retrieve it, and make sure that the person you lent it to does not keep it: that's an innovation in technology. And it allows the power of technology to work for everyone in the way public libraries have always worked. So if we law harder and we nerd harder, and then we set better norms around our society, I think all of those have to work together as a solution.
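As a rough sketch of that one-copy-at-a-time idea, here is a minimal, illustrative model of controlled digital lending, assuming the simple rule that a library lends out no more copies than it owns and retrieves each loan before the copy circulates again. The class and method names are hypothetical, not any real lending system's API.

```python
class LendingLibrary:
    """A hypothetical sketch of controlled digital lending for one title."""
    def __init__(self, title: str, copies_owned: int):
        self.title = title
        self.copies_owned = copies_owned
        self.loans = set()  # patrons currently holding a copy

    def checkout(self, patron: str) -> bool:
        """Lend a copy only if one of the owned copies is free."""
        if len(self.loans) >= self.copies_owned:
            return False  # every owned copy is already out
        self.loans.add(patron)
        return True

    def checkin(self, patron: str) -> None:
        """Retrieve the copy so the borrower does not keep it."""
        self.loans.discard(patron)

lib = LendingLibrary("scanned-book.pdf", copies_owned=1)
assert lib.checkout("patron-a")      # the single owned copy goes out
assert not lib.checkout("patron-b")  # refused: the owned-to-loaned ratio holds
lib.checkin("patron-a")              # copy retrieved from the first patron
assert lib.checkout("patron-b")      # now it can circulate again
```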

Chris Lewis:

People are already doing the norms work, and they don't even know it: when instead of buying access to go to the movies, people are buying access to Netflix or some other streaming service, and then they share their password with their family members, that's a change in norms. And a change in norms that, by the way, many of the streaming companies are okay with. They have not actively stopped it. They haven't tried to stop it through licensing. They probably could, the way the law is interpreted right now, but they've realized that they're adding viewers.

Chris Lewis:

And I think that's a good thing. It shifts the norm from buying to one-time viewing or buying a DVD to buying access to a library of information, a library of digital goods.

Cindy Cohn:

No, I think you're right that a lot of the services are quietly cool with password sharing. One of the things that we have long pointed out, and here we're sliding into other EFF territory, is that the Computer Fraud and Abuse Act and some of the state anti-hacking laws actually do make that illegal, and not just illegal, but criminal. And copyright law itself, of course, has criminal penalties. So I do think this is a longtime issue in copyright law: there are lots of things that violate copyright law, or at least the content owners' view of copyright law, that are actually good for them.

Cindy Cohn:

EFF was co-founded by John Perry Barlow, who wrote songs for the Grateful Dead. And the big insight the Grateful Dead had was that if you let people record your live concerts, which they didn't have to allow, they'll share those tapes around and you end up with more people coming to your concerts. Similarly, if you let people share their passwords, more people get turned on to your service and it'll end up growing your pie. So I would like to see all the things that people do that are technically illegal, but really useful to society, become legal, so they can come out of the shadows.

Danny O'Brien:

I often find that one of the challenges here is that because the responsibility for how these systems are built is so diffuse, with a thousand lawyers, present company excepted, and thousands of vested interests trying to decide how to do this, no one is able to stand up either for the user or just for common sense. So just to give an example: we had a huge fight with Netflix when they were trying to force DRM into web standards. But in the conversations with the people who were advocating for this at venues like the W3C, the main web standards organization, these were technologists.

Danny O'Brien:

And we were sitting down going, "You do know that, first of all, this doesn't work to stop mass piracy. And also it stops your customers from doing perfectly reasonable things with the products that they've purchased from you, like pulling out a little clip to show their friends or sharing it on a big screen for their friends and family." And in the off-the-record conversations I had, they would go, "We would love to do this without DRM. We think it would be far more successful, far more popular. But we're tied to these contracts with the original rights holders, and we can't get out of those, so we have to enforce the impossible."

Danny O'Brien:

I guess this is my question to you, Chris as an activist. When you have this debate over the limits of copyright and the limits of ownership, a lot of the time this ends up being a debate between big tech and the big rights holders, Hollywood. How do we get the voice of the user and the public interest into that debate?

Chris Lewis:

Right. This is something that we just have to continue to grow. I don't know how many times a month I say to folks in Washington, "Please stop saying this is a fight between big tech and big content. It is not. There is a third and far more important perspective: the public interest perspective, what is important for our society." Again, it's that balance of values that has to be included in the policymaking, which, when balanced properly, takes care of the needs, or can take care of the needs, of big tech and big content in the long run, especially if we combine it with smart technological innovations.

Chris Lewis:

So we have a movement to build. The raw politics of it right now is that I don't think we have enough allies and champions in Washington who understand this balance. And we need to continue to organize and bring these stories to them. And the thing is, when they hear the common sense stories, people in Washington are easily converted. Cindy, remember the small but fierce fight we had a few years ago on unlocking cell phones? The reason that worked is that everyone owns a cell phone: every member of Congress, the president.

Chris Lewis:

And so as soon as the public started to say, "It's insane that I can't unlock this digital lock on something that I bought and I own in order to switch to another service or to get it fixed properly, or to sell it to someone else," that was an easy story to tell because everyone understood it. So we just got to continue to do that in order to build more champions and more understanding among policymakers.

Cindy Cohn:

I'm just sitting here nodding my head, because we really do need a movement. The public interest side of this debate is just so tiny compared to the size of the problems that we're trying to address and the size of the people on the other side. And I think that you're completely right. PK, you guys do heroic work making voices heard in Congress, and we do our best to help you, but we're so tiny compared to the size of these fights and the size that we need to be that we really do need to turn this into a movement, and a movement that doesn't go away in between the crises, that's there all the time, because that's what happens on the other side.

Danny O'Brien:

To end on a somewhat positive note, I do remember the amazing shift that happened when members of Congress first started using iPods. We had all of these debates about the terrors of piracy online. And then when folks got iPods, I just remember congresspeople standing up and saying, "You mean it's criminal. These people want to stop me from transferring my CD collection to my iPod? That's insane." And I think that we have a good ... I think time is on our side in many ways, because as a new generation of politicians come up who recognize these intuitions that I think technology users have, I think there's a better chance of encoding those intuitions in law.

Cindy Cohn:

Well, that's a great segue, I think, Danny. So this podcast is "How are we fixing the internet?", Chris, so what does the world look like if we get it right?

Chris Lewis:

Oh, wow. The possibilities are fantastic if we get it right. A world where average users of technology are empowered to have information at their fingertips no matter where they're born as long as they have access to the internet. That's a powerful world. Gosh, I remember coming back from school. I'm not a lawyer. I just went and got an undergraduate degree at Harvard University and I came back home to Virginia. And I remember meeting with other African-American high school students, kids who look like me, who grew up in the same county I grew up in. And they literally said things to me like, "I didn't know that black people went to Harvard."

Chris Lewis:

I'm not joking. And I said, "Kid, you haven't even been across the county, let alone to a place like Boston." The power of the internet to open up the worldview of those kids, or other kids, if they have access to it, is limitless. I can't emphasize that enough.

Cindy Cohn:

Getting digital first sale right is, to me, exactly that. It's the ability to really let people who don't come from privileged backgrounds or privileged places have access to the world's knowledge. And this, to me, goes back to why copyright is in the constitution: it's there to promote the progress of science and the useful arts. If we get this right, we're going to unleash the power of so many people around the world who haven't been able to contribute to promoting the progress of science and the useful arts, because they haven't had access to the information they need to get to the place where they can help us.

Cindy Cohn:

So I feel like we're just unleashing the potential of so many people, especially kids who really haven't been able to participate in growing the sciences and the useful arts.

Chris Lewis:

Right. And the things that they're going to produce are going to be amazing. Being creative requires some spark or inspiration. It's why if you look at the history of American music, the blues built off of Negro spirituals and rock and roll built off of the blues and country. Creativity builds on itself. And if young people don't have access to the world's knowledge, the world's information, they're not going to have that foundation to build off of and to improve and innovate.

Danny O'Brien:

I think that we always think of knowledge as a gift, but it's not going to be much of a gift if we can't give it on, if we can't pass it to the next generation or to people that don't currently have it. So I think the heart of first sale and digital first sale is really that ability to share what you have and pass it on, and for it not to be a dead end.

Cindy Cohn:

Thank you so much, Chris.

Danny O'Brien:

Cindy, that was a great discussion. Really, the thing that stuck with me is once again this idea that debates over copyright aren't really industry deals between big tech and big content. They're actually about what a future society is going to be like. And we have to build them with the public interest and long-term goals in mind. And particularly, the thing I hadn't really gotten about digital first sale is that a lot of it's about posterity and legacy: a lot of it's about having the right to give somebody what you've received.

Cindy Cohn:

I think that that's tremendously important. And of course, the place where that has traditionally happened so much in our society is in libraries. So they're central to the conversation.

Danny O'Brien:

I hadn't really wrapped my head around the fact that if you don't have first sale, if you don't have this idea of "I bought it, so I own it," then you don't get the opportunity to create a system around lending, and lending libraries. But how do we change something like that, a norm that's developing that we want to shift? Chris talked a bit about legal reform, lawmaking reform. And one of the ways I know you can achieve this is by having very specific, targeted copyright reform, rather than these big omnibus copyright bills, so that you can just nudge things along. And I liked that Chris talked about very small bills that you can drop in if we can get support from politicians.

Cindy Cohn:

I think it may be that we have to do this in tiny little bites, like the BOSS Act he talked about and the very narrow protection for libraries. I find that a little frustrating. The other thing, in talking about the court cases, and especially the ReDigi case that Chris mentioned: the courts have gotten some of this wrong. And so there's an opportunity there for them to just get it right. Copyright is in the constitution. It's a constitutional analysis about whether we're promoting science and the useful arts. There's some undoing that we're going to need to do for digital first sale, but we certainly could get there.

Danny O'Brien:

And this is largely a court-created doctrine. Again, I'm not the lawyer, but if you have disagreements between the circuits in the United States, at least, you can bring this to the Supreme Court. You can actually, I guess, create a more systemic view of the whole problem, the whole solution. But again, that needs people to advocate for it, or at least to point out the problems, which comes back to this idea of building a movement. And I know that in the '90s and the 2000s, we did have that free culture movement that recognized that copyright and the issues around copyright were important.

Danny O'Brien:

And I now feel like we're facing the hard end of that fight. We're in a situation where it's not just theoretical anymore. It's really about how people are living their lives. And that requires people to stand up and actually work with other people to figure out what a good strategy might be going forward.

Cindy Cohn:

This is the real genius of Chris: he is an activist, and he thinks about this in the context of building a movement, collecting the stories, and laying the groundwork to get the lawmakers to act, or the courts to recognize the issues at stake beyond big tech and big content, as you started this out with. And that's the place where we're in complete agreement. This is a movement. And the good thing about everybody relying on digital first sale is that we have a lot of people who are impacted by it, so there's an opportunity there. I think the trick for us in building this movement is to make sure people see what they're losing, so that we don't just slide into a world in which it seems normal to license knowledge, as opposed to actually owning it and being able to hand it on to people.

Cindy Cohn:

I also really appreciated Chris's recognition that this movement is one that will be grounded in part in standing up for marginalized voices. The people who are really going to lose out if they don't have the money to continue to pay for access to knowledge. And I appreciate that he really brings that lens to the conversation, because I think it's exactly right.

Danny O'Brien:

Well, Cindy, I don't know whether you've realized this, but actually this was our last episode in this mini-series, which I think technically means we fixed the internet. Although in practice, I think this is only a smattering of the many challenges you and I and the rest of our colleagues have in putting the world to rights.

Cindy Cohn:

I can't believe we reached the end of this. This has been really great fun. And I really want to give a special shout-out to our colleague Rainey Reitman, who carried this along and really made it happen. I appreciate your view that we fixed the internet, but really this podcast is how to fix the internet. And hopefully we've pointed the way to how to fix it, but I think what's come up over and over again in all of these conversations is how much we need you.

Cindy Cohn:

We need the engagement of a broader community, whether we call it a movement or we call it the users, or we call it the bigger community. So there's lots of ways that you can get involved to help us get to that place where we don't need this podcast anymore because the internet's fixed. Of course, I'm the head of the EFF, so we work for tips. And if you think that what we're doing to try to move things forward is good, please donate and join us at EFF, join our part of the movement. We've had lots of friends on this podcast who also are other pieces of the movement. And if any of those called to you, please support them as well.

Cindy Cohn:

Again, the way a movement works is that you don't have to just pick one horse and go for it. You can really support a range of things that interest you. And I certainly hope that you will.

Danny O'Brien:

And if you have any feedback on this particular branch of the digital revolution, let us know at eff.org/podcast. And we hope to see you in the very near future.

Danny O'Brien:

Thanks again for joining us. If you'd like to support the Electronic Frontier Foundation, here are three things you can do today. One, you can hit subscribe in your podcast player of choice. And if you have time, please leave a review; it helps more people find us. Two, please share on social media and with your friends and family. Three, please visit eff.org/podcasts, where you will find more episodes and learn about these issues. You can donate to become a member, and lots more.

Danny O'Brien:

Members are the only reason we can do this work. Plus, you can get cool stuff like an EFF hat, an EFF hoodie, or even a camera cover for your laptop. Thanks once again for joining us. And if you have any feedback on this episode, please email podcast@eff.org. We do read every email. This podcast was produced by the Electronic Frontier Foundation with help from Stuga Studios. Music by Nat Keefe of BeatMower.

 

Rainey Reitman

Missed the RJI-EFF Event on Deciphering Data Privacy? We’ve Got You Covered

3 months ago

EFF recently partnered with the University of Missouri's Donald W. Reynolds Journalism Institute on a virtual event, "Deciphering Data Privacy.” This event aimed to foster a conversation among journalists, technologists, and privacy experts on the ways that we can make the discussion of privacy—how companies view it, how it affects you every day, and how you can stand up for your privacy rights—more understandable to everyone.

The event featured an opening keynote from EFF Project Manager and student privacy activist Lindsay Oliver on the landscape of data privacy, from current threats to pending legislative efforts with a particular focus on the privacy invasions facing students at all grade levels.

[Embedded video: the keynote is available at https://www.youtube.com/embed/46FVtRG_FSM?t=3562. Privacy info: this embed serves content from youtube.com.]

You can see the slides from her talk here. (PDF)

Other events included a fireside chat with Melanie Ensign, CEO/Founder of Discernible, and a panel moderated by Kashmir Hill of the New York Times in conversation with Maddy Varner of The Markup, Dhruv Mehrotra of Gizmodo Media, and Aaron Krolik of the New York Times, discussing some of their favorite privacy pieces.

Journalists can help the public better understand what's happening and what's at stake when it comes to data privacy. We’re proud to work with the University of Missouri to showcase the critical role journalists can play in explaining the incentive structures driving the technology industry, highlighting data privacy harms, and informing the public.

Hayley Tsukayama

Today: Tell Congress Not To Bankrupt Internet Users

3 months ago

We are at a critical juncture in the world of copyright claims. The “Copyright Alternative in Small-Claims Enforcement Act”—the CASE Act—is apparently being considered for inclusion in next week’s spending bill. That is “must pass” legislation—in other words, legislation that is vital to the function of the government and so anything attached to it, related to spending or not, has a good chance of becoming law. The CASE Act could mean Internet users facing $30,000 penalties for sharing a meme or making a video. It has no place in must pass legislation.

PREVENT COPYRIGHT TROLLING

TELL CONGRESS NOT TO TREAT FREE EXPRESSION LIKE A TRAFFIC TICKET

The CASE Act purports to be a simple fix to the complicated problem of online copyright infringement. In reality, it creates an obscure, labyrinthine system that will be easy for the big players to find their way out of. The new “Copyright Claims Board” in the Copyright Office would be empowered to levy large penalties against anyone accused of copyright infringement. The only way out would be to respond to the Copyright Office—in a very specific manner, within a limited time period. Regular Internet users, those who can’t afford the $30,000 this “small claims” board can force them to pay, will be the ones most likely to get lost in the shuffle.

The CASE Act doesn’t create a small-claims court, which might at least have some hard-fought protections for free expression built in. Instead, claims under the CASE Act would be heard by neither judges nor juries, just “claims officers.” And CASE limits appeals, so you may be stuck with whatever penalty the “claims board” decides you owe.

Previous versions of the CASE Act all failed. This version is not an improvement, and Congress has not heard enough from those of us who would be most affected by CASE: regular, everyday Internet users who could end up owing thousands of dollars. Large, well-resourced players will not be affected, as they will have the resources to track notices and simply opt out.

How do we know that the effect of this bill on people who do not have those resources has not been understood? For one thing, Representative Doug Collins of Georgia said in an open hearing that any claim with a $30,000 cap on damages was “truly small.” Of course, for many, many people, often the same people who don’t have lawyers to help them opt out in time, paying those damages would be ruinous.

That’s why we’re asking you to take some time today to call your representatives and tell them how dangerous this bill really is, and that it has no place being snuck through via a completely unrelated “must pass” spending bill.

Tell Congress that a bill like this has no place being added to any spending bill, that it must rise or fall on its own merits, and that there are people who will be harmed and who are speaking out against it.

PREVENT COPYRIGHT TROLLING

TELL CONGRESS NOT TO TREAT FREE EXPRESSION LIKE A TRAFFIC TICKET

Katharine Trendacosta