👎 California's Terrible, No Good, Very Bad Social Media Ban | EFFector 38.9

4 hours ago

We'd all like the internet to be a better place—for kids and adults alike. But in the name of online safety, governments around the world are racing to impose a dangerous new system of control. Are age gates the silver bullet to the internet's problems they're being promoted as? Or are we being sold a bill of goods? We're answering this question and more in our latest EFFector newsletter.

JOIN OUR NEWSLETTER

For over 35 years, EFFector has been your guide to understanding the intersection of technology, civil liberties, and the law. This latest issue covers an attack on VPNs in Utah, a livestream on how to disenshittify the internet, and California's proposed social media ban that could set a dangerous new precedent for online censorship.

Prefer to listen in? EFFector is now available on all major podcast platforms. This time, we're having a conversation with EFF Legislative Analyst Molly Buckley on why social media bans can't sidestep the U.S. Constitution. You can find the episode and subscribe on your podcast platform of choice:

[Embedded podcast player: https://player.simplecast.com/07b61711-d8ff-4483-aee3-21daa5a3ea22] Privacy info. This embed will serve content from simplecast.com


Want to help push back on these misguided regulations? Sign up for EFF's EFFector newsletter for updates, ways to take action, and new merch drops. You can also fuel the fight for privacy and free speech online when you support EFF today!

Christian Romero

[National Polity Protection Law] Rally Held Opposing the "National Flag Desecration Crime": Compelling Patriotism Under Penalty of Law = Eiichi Furukawa

4 hours 12 minutes ago
Work on a "national flag desecration crime" has advanced under the Takaichi administration: deliberations by an LDP project team began at the end of March, and the bill is increasingly likely to be submitted in the current Diet session. Alarmed by this, citizens' groups and teachers who oppose the forced use of the Hinomaru flag and the "Kimigayo" anthem held an emergency rally against the "national flag desecration crime" on the evening of April 11 in Bunkyo Ward, Tokyo [photo]. About 90 people took part, and attorney Toichiro Sawafuji, who serves as defense counsel in the "Kimigayo trials," lectured on the law's meaning and aims. In the lecture, Sawafuji said that the "national flag desecration crime" treats the national flag as a symb..
JCJ

The SECURE Data Act is Not a Serious Piece of Privacy Legislation

5 hours 34 minutes ago

The federal SECURE Data Act is not a serious consumer privacy bill, and its provisions—if enacted—would be a retreat from already insufficient state protections.

Republicans on the House Energy and Commerce Committee released a draft of the bill late last month without bipartisan support. The bill is weaker than congressional proposals in prior years, as well as most of the 21 state consumer privacy laws already on the books.

The bill could wipe out hundreds of state privacy protections.

Most troubling for EFF: the bill would preempt dozens, if not hundreds, of state laws that regulate related topics, and it would not allow consumers to sue to protect their own rights (commonly called a private right of action). And it comes nowhere close to banning online behavioral advertising—a practice that fuels technology companies’ ever-increasing hunt for personal data.

The bill also suffers from many other flaws including weak opt-out defaults, inadequate data minimization requirements, and large definitional loopholes for companies.

Key Provisions

The bill would give consumers some rights to take action to control their personal data—like access, correction, deletion, and limited portability. These rights have become standard in data privacy proposals in recent years.

The bill would also require companies to obtain your consent before processing your sensitive data, or using any of your personal data for a previously undisclosed purpose. Absent your consent, a company couldn’t do these things.

Further, the bill would allow you to opt out of (1) targeted third-party advertising, (2) the sale of your personal data, and (3) profiling of you that has a legal, healthcare, housing, or employment effect. Unfortunately, a company could keep doing these invasive things to you, unless you opted out.

The bill would also require data brokers that make at least 50 percent of their profits from the sale of personal data to register in a public database maintained by the Federal Trade Commission (FTC).

Preemption of Too Many State Laws

Federal privacy laws should allow states to build ever stronger rights on top of the federal floor. Many federal privacy laws allow this, including the Health Insurance Portability and Accountability Act, the Video Privacy Protection Act, and the Electronic Communications Privacy Act.

The SECURE Data Act would not do that. Instead, it would wipe out dozens, if not hundreds, of existing state privacy protections. Section 15 of the bill would preempt any “law, rule, regulation, requirement, standard, or other provision [that] relates to the provisions of this Act.” This would kill the 21 state consumer privacy laws passed in the past few years. These state bills aren’t strong enough, but they are still better than this federal proposal. For example, California maintains a data broker deletion tool and requires companies to comply with automatic opt-out signals—including one that is built into EFF’s Privacy Badger.

Because the SECURE Data Act has provisions that relate to data privacy and security, it could preempt all 50 state data breach laws and many others. It could also preempt state laws related to specific pieces of sensitive data, like bans on the sale of biometric or location information. Some states like California have constitutional provisions that protect an individual’s right to privacy, which can be enforced against companies. That constitutional provision, as well as state privacy torts, could also be in danger if this bill passed.

No Private Enforcement, A New Cure Period, and Vague Security Powers

Strong consumer privacy laws should allow consumers to take companies to court to defend their own rights. This is essential because regulators do not have the resources to catch every violation, and federal consumer enforcement agencies have been gutted during the current administration.

The SECURE Data Act does not have a private right of action. The FTC and state attorneys general have primary enforcement authority. The law also gives companies 45 days to “cure” any violation with no penalty after they are caught.

Moreover, Section 8 of the bill creates a vaguely defined self-regulatory scheme in which companies can apply to be audited by an “independent organization” that will apply a “code of conduct.” Following this code of conduct would give companies a presumption that they are complying with the law. This provision is an implicit acknowledgement that the bill does not provide regulators with any new resources to enforce new protections.

Section 9 of the bill would give the Secretary of Commerce broad power to “take any action necessary and appropriate to support the international flow of personal data,” including assessing “security interests of the United States.” The scope of this amorphous provision is unclear, but it likely does not belong in a consumer protection bill.

Weak Privacy Defaults

Your online privacy should not depend on whether you have the time, patience, and knowledge to navigate a website and turn off invasive tracking. Good privacy laws build in data minimization requirements—meaning there should be a default standard that prevents companies from processing your data for purposes that are not needed to provide you with the service you asked for.

The SECURE Data Act puts the burden on you to opt out of invasive company practices, like targeted third-party advertising, the sale of your personal data, and profiling. The bill at least requires companies to obtain your consent before processing your sensitive data (like selling your precise location). These consent requirements, however, are often an invitation for companies to trick you into clicking a button to give away your rights in hard-to-read policies. Indeed, few people would knowingly agree to let a company sell their personal data to a broker who turns around and sells it to the government.

Section 3 of the bill uses the term “data minimization,” but it is done in name only. The provision does not limit a company’s processing of data to only what is necessary to provide the customer with the good or service they asked for. Instead, the provision limits processing of data to only what a company “disclosed to the customer”—meaning if it is in the confusing privacy policy that nobody reads, it is okay.

And the bill would not even allow you to restrict certain uses of your data. As companies seek more data for AI systems, many internet users do not want their private personal data to be used to train those models. However, the bill makes clear that “nothing in this Act may be construed to restrict” a company from collecting, using, or retaining your data to “develop” or “improve” a new technology.

Other Flawed Definitions and Loopholes

The bill has numerous loopholes that technology companies would exploit if the bill were to become law. Below is just a sampling:

  • Government contractors: Under Section 13(b)(2), government contractors are exempt from the bill, which could be wrongly interpreted to exempt certain data brokers from sale restrictions when those sales are made to the government. This type of exemption could benefit surveillance companies like Clearview AI, which previously argued it was exempt from Illinois’ strict biometric law using a similar contractor exception. This is likely not the authors’ intention, since the definition of sale includes those made “to a government entity.”
  • Sale definition: The definition in Section 16(28) is too narrow. A sale should mean any exchange for monetary “or other valuable” consideration, as in some other privacy laws.
  • Biometric information definition: The definition in Section 16(4) excludes data generated from a photo or video, and the definition excludes face scans not meant to “identify a specific individual.” This could be wrongly interpreted to allow biometric identification from security camera footage, or biometric use for sentiment or demographic analysis.
  • Personal data definition: The definition in Section 16(21) exempts “de-identified data” from the definition of personal data, which could allow companies to do anything with de-identified data because that data is not protected by the law. The problem is that “de-identified” data often is not truly de-identified and can be re-identified.
  • Deletion requests: With regard to data that a company obtained from a third-party, Section 2(d)(5) would treat a consumer’s deletion request merely as an opt-out request. And even if a customer requested deletion, a company might be able to retain the data for research purposes under section 11(a)(9)(A).
  • Profiling definition: Under the definition in Section 16(25), companies could profile so long as the profiling is not “solely automated.” The flimsiest human review would exempt highly automated profiling.

Congress is long overdue to enact a strong comprehensive consumer data privacy law, and we have sketched what it should look like. But the SECURE Data Act is woefully inadequate. In fact, it would cause even more corporate surveillance of our personal information, by wiping out state laws that are more protective than this federal bill. Even worse, this bill would block state legislatures from protecting their residents from the privacy threats of tomorrow that are unforeseeable today. 

Mario Trujillo

[B] Government-Made Hate and "Perceived Public Safety"

1 day ago
In this country today, people who have come from abroad to work are rapidly finding it harder to work and to keep living here. Institutional tightening has intensified since the birth of the Takaichi administration. In other words, xenophobia and anti-foreigner hate driven by politics and public administration are growing stronger. In Saitama Prefecture, where this writer lives, an unthinkable situation has emerged in which xenophobia has spread through local government and local assemblies. The keyword is the unfamiliar concept of "taikan chian" (perceived public safety). (Kazuoki Ohno)
Nikkan Berita

[Mirror of Cinema] Another Nadeshiko Japan: "Eye Contact," With a New Film From the Tokyo Deaflympics = Katsuhiko Suzuki

1 day 4 hours ago
Eye Contact Production Committee. Last November, the "Tokyo 2025 Deaflympics" was held in Japan for the first time. Until now, many people did not know what the Deaflympics were, but with the Tokyo games the event became widely recognized as an "international sports competition for deaf and hard-of-hearing people." It is the Olympics for deaf athletes, held every four years by the International Committee of Sports for the Deaf (ICSD). Director Kazuhiko Nakamura, who has continued making documentaries about the soccer world, including disability soccer, closely followed these games..
JCJ

EFF and 18 Organizations Urge UK Policymakers to Prioritize Addressing the Roots of Online Harm

1 day 9 hours ago

EFF joins 18 organizations in writing a letter to UK policymakers urging them to address the root causes of online harm—rather than undermining the open web through blunt restrictions.

The coalition, which includes Mozilla, Tor Project, and Open Rights Group, warns that proposed measures following the passage of the Children’s Wellbeing and Schools Bill risk fundamentally reshaping the internet in harmful ways. Chief among these proposals are sweeping age-gating requirements and access restrictions that would apply not only to young people, but effectively to all users.

While framed as efforts to protect children online, these policies rely heavily on age assurance technologies that are either inaccurate, privacy-invasive, or both. As the letter notes, mandating such systems across a wide range of services—from social media and video games to VPNs and even basic websites—would force users to verify their identity simply to access the web. This creates serious risks, including expanded surveillance, data breaches, and the erosion of anonymity.

Beyond privacy concerns, the signatories argue that these measures threaten the core architecture of the open internet. Age-gating at scale could fragment the web into a patchwork of restricted jurisdictions, limit access to information, and entrench the dominance of powerful gatekeepers like app stores and platform ecosystems. In doing so, policymakers risk weakening the very qualities—interoperability, accessibility, and openness—that have made the internet a global public resource.

The letter also emphasizes what’s missing from the current policy approach: meaningful efforts to address the underlying drivers of online harm. Many digital platforms are designed to maximize engagement and profit through pervasive data collection and targeted advertising, often at the expense of user safety and autonomy. Rather than imposing access bans, the coalition calls on UK policymakers to hold companies accountable for these systemic practices and to prioritize user rights by design.

Importantly, the signatories highlight that the internet remains a vital space for young people: offering access to information, support networks, and opportunities for expression that may not exist offline. Policies that restrict access risk cutting off these lifelines without meaningfully reducing harm.

The message is clear: protecting users online requires more than heavy-handed restrictions. It demands thoughtful, rights-respecting policies that tackle the business models and design choices driving harm, while preserving the open, global nature of the web.

Jillian C. York

Shut Down Turnkey Totalitarianism

1 day 13 hours ago

William Binney, the NSA surveillance architect-turned-whistleblower, called it the "turnkey totalitarian state." Whoever sits in power gains access to a boundless surveillance empire that scorns privacy and crushes dissent. Politicians will come and go, but you can help us claw the tools of oppression out of government hands.

JOIN EFF

Become a Monthly Sustaining Donor

We must stand strong to uphold your privacy and free expression as democratic principles. With members around the world, EFF is empowered to use its trusted voice and formidable advocacy to protect your rights online. Whether giving monthly or one-time donations, members have helped EFF:

  • Sue to stop warrantless searches of Automated License Plate Reader (ALPR) records, which reveal millions of drivers’ private habits, movements, and associations.

  • Launch Rayhunter, an open source tool that empowers you to help search out cell-site simulators capable of tracking the movements of protestors, journalists, and more.

  • Help journalists see through the spin of "copaganda" with our Selling Safety report, which breaks down how policing technology companies often market their tools with misleading claims.

Right now, the U.S. Congress is on the verge of renewing the international mass spying program known as Section 702, affecting millions. EFF is rallying to cut through the politics and give ordinary people a chance to stop this oppressive surveillance. It’s only possible with help from supporters like you, so join EFF today.

The New EFF Member Gear

Get this year’s new member t-shirt when you join EFF. Aptly titled "Claw Back," the design features an orange boy swatting at the street-level surveillance equipment multiplying in our communities. You might empathize with him, but there’s a better way. Let’s end the law enforcement contracts, harmful practices, and twisted logic that enable mass spying in the first place.

You can also get a brand new set of eleven soft and supple polyglot puffy stickers as a token of thanks. Whether you're a kid or a kid at heart, these nostalgic stickers are perfect for digital devices, lunchboxes, and notebooks alike. Our little Ghostie protects privacy in six languages: Arabic, English, Japanese, Persian, Russian, and Spanish.

And for a limited time, get a Privacy Badger Crewneck sweater to help you browse the web with confidence. The embroidered Privacy Badger mascot appears above characters that say "privacy" because human rights are universal. Millions of people around the world use Privacy Badger, EFF's free tool that devours devious scripts and cookies that twist your web browsing into a commodity for Big Tech, advertisers, and scammers.

Privacy is a human right because it gives you a fundamental measure of security and freedom. We owe it to ourselves to fight the mass surveillance used to control and intimidate people. Let’s do this. Join EFF today with a monthly donation or one-time donation and help claw back your privacy.

____________________

EFF is a member-supported U.S. 501(c)(3) organization. We've received top ratings from the nonprofit watchdog Charity Navigator since 2013! Your donation is tax-deductible as allowed by law.

Aaron Jue

EFF Submission to UK Consultation on Digital ID

2 days 1 hour ago

Last September, the United Kingdom’s Prime Minister Keir Starmer announced plans to introduce a new digital ID scheme in the country. The scheme aims to make it easier for people to prove their identities by creating a virtual ID on personal devices with information like names, date of birth, nationality or residency status, and a photo to verify their right to live and work in the country. 

Since then, EFF has joined UK-based civil society organizations in urging the government to reconsider this proposal. In a December joint letter, sent ahead of Parliament’s debate on a petition signed by 2.9 million people calling for an end to the government’s plans to roll out a national digital ID, EFF and 12 other civil society organizations urged MPs to reject the Labour government’s proposal.

Nevertheless, politicians have continued to explore ways to build out a digital ID system in the country, often fluctuating between different ideas and conceptualisations for such a scheme. In their search for clarity, the government launched a consultation, ‘Making public services work for you with your digital identity,’ seeking views on a proposed national digital ID system in the UK. 

EFF submitted comments to this consultation, focusing on six interconnected issues:

  1. Mission creep
  2. Infringements on privacy rights 
  3. Serious security risks
  4. Reliance on inaccurate and unproven technologies
  5. Discrimination and exclusion
  6. The deepening of entrenched power imbalances between the state and the public

Even the strongest recommended safeguards cannot resolve these issues, nor the core problem: a mandatory digital ID scheme shifts power dramatically away from individuals and toward the state. Such schemes are pursued as a technological solution to offline problems, but instead they allow the state to determine what you can access, not just verify who you are, by functioning as a key to opening—or closing—doors to essential services and experiences.

No one should be coerced—technically or socially—into a digital system in order to participate fully in public life. It is essential that the UK government listen to people in the country and say no to digital ID. 

Read our submission in full here.

Paige Collings

Congress Has Until June to Take Action on 702. Tell Them Not to Drop The Ball

2 days 3 hours ago

There are no excuses for any Member of Congress to support a clean reauthorization of Section 702. Anyone who votes to do so does not take your privacy seriously. Full stop.

Section 702 of the Foreign Intelligence Surveillance Act (FISA) is among the United States’ most infamous mass surveillance programs. Sold to the public as a foreign surveillance tool, it has become a backdoor for law enforcement to search through Americans’ private communications without ever obtaining a warrant. We need to act now to prevent Congress from reauthorizing 702 in a way that ignores the truth: This authority needs to change.

House Speaker Mike Johnson has attempted several times to push re-authorization bills that give us no substantive reforms. We will not fall for fig leaves or shifts in rhetoric. Our demands are common sense: no renewal without real reforms. A simple extension would be a betrayal of every U.S. resident who expects their government to respect their rights and the Constitution.

Your representative needs to hear from you right now, before the April 30 deadline. Contact them today.

Tell them: No vote on any bills that would reauthorize Section 702 without meaningful reform.

Electronic Frontier Foundation

[Publishing World in May] Fewer Brick-and-Mortar Bookstores and the Search for New Store Openings

2 days 4 hours ago
◆ Magazine and book sales totaled 111.8 billion yen (down 7.4% year-on-year). Books accounted for 74.356 billion yen (down 8.4%) and magazines 37.512 billion yen (down 5.4%). Within magazines, monthlies were down 5.5% and weeklies down 4.8%. Return rates: books rose 0.4 points to 25.6%; magazines fell 1.0 point to 40.6%. In-store sales: books down about 2%, literary fiction up about 2%, bunko paperbacks up about 2%, study guides roughly flat year-on-year, business books down about 3%, children's books down about 2%, shinsho up about 4%, book-format comics down about 6%. For magazines, periodicals were down about 2%, ..
JCJ

Getting Digital Fairness Right: EFF's Recommendations for the EU's Digital Fairness Act

2 days 4 hours ago
Digital Fairness in the EU

The next few years will be decisive for EU digital policymaking. With major laws like the Digital Services Act, the Digital Markets Act, and the AI Act now in place, the EU is entering an enforcement era that will show whether these rules are rights-respecting or drift toward overreach and corporate control. With the EU’s proposed Digital Fairness Act (DFA), the Commission is now turning to increasingly visible risks for users, such as dark patterns and exploitative personalization. Its “Digital Fairness Fitness Check” makes clear that existing consumer rules need updating to reflect how digital markets operate today.

But not all proposed solutions point in the right direction. Regulators are already flirting with measures that rely on expanded surveillance, such as age verification mandates—surface-level fixes that risk undermining fundamental rights while offering little more than a false sense of protection.

For EFF, digital fairness means addressing the root causes of harm, not requiring platforms to exert more control over their users. It means safeguarding privacy, freedom of expression, and the rights of users and developers.

If the DFA is to make a real difference, it must tackle structural imbalances. Lawmakers should focus on two interlocking principles. First, prioritize privacy. Reforms should address harms driven by surveillance-based business models, alongside deceptive design practices that impair informed choices. Second, strengthen user sovereignty, which is also a necessary precondition for European digital sovereignty more broadly. Strengthening user sovereignty means taking measures that address user lock-in, coercive contract terms, and manipulative defaults that limit users’ ability to freely choose how they use digital products and services.

Together, these principles would support the EU’s objectives of consistent consumer protection, fair markets, and a more coherent legal framework. If implemented properly, the EU could address power imbalances and build trust in Europe’s digital economy.

Ban Dark Patterns

Dark patterns are practices that impair users’ ability to make informed and autonomous decisions. Many companies deploy these tactics through interface design to steer choices and influence behavior. Their impact goes beyond poor consumer decisions. Dark patterns push users to share personal data they would not otherwise disclose and undermine autonomy by making alternatives harder to access.

The DFA should address this by clearly prohibiting misleading interfaces that distort user choice in commercial contexts. While the Digital Services Act introduced a definition, it only partially bans such practices and leaves gaps across existing consumer law rules. The DFA should close these gaps by, at the very least, introducing explicit prohibitions and clearer enforcement rules, without resorting to design mandates.

Tackle Commercial Surveillance

At the core of digital unfairness lies the pervasive collection and use of personal data. Surveillance and profiling drive many of the harms regulators are trying to address, from dark patterns to exploitative personalization. The DFA should tackle these incentives directly by reducing reliance on surveillance-based business models. These practices are fundamentally incompatible with privacy and fairness, and they distort digital markets by rewarding data exploitation rather than quality of service. At a minimum, the DFA should address unfair profiling and surveillance advertising by strengthening privacy rights and banning pay-for-privacy schemes. Users should not have to trade their data or pay extra to avoid being tracked. Accordingly, the DFA should support the recognition of automated privacy signals by web browsers and mobile operating systems, which give users a better way to reject tracking and exercise their rights. Practices that override such signals through banners or interface design should be considered unfair.
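To make the signal mechanism concrete: Global Privacy Control (GPC), one such automated privacy signal, is transmitted by the browser as a simple HTTP request header, `Sec-GPC: 1`. Below is a minimal sketch of how a web backend might honor it; the helper names (`gpc_opt_out`, `handle_request`) and the specific policy decisions are hypothetical illustrations, not any mandated implementation.

```python
# Minimal sketch of honoring the Global Privacy Control (GPC) signal.
# The GPC spec defines the "Sec-GPC" request header, where only the
# value "1" expresses a valid opt-out preference.
# NOTE: function names and the policy mapping below are hypothetical.

def gpc_opt_out(headers: dict[str, str]) -> bool:
    """Return True if the request carries a GPC opt-out signal.

    Real HTTP header lookup is case-insensitive; a plain dict is
    used here only to keep the sketch self-contained.
    """
    return headers.get("Sec-GPC", "").strip() == "1"

def handle_request(headers: dict[str, str]) -> dict:
    """Map the signal to a (hypothetical) data-use policy."""
    if gpc_opt_out(headers):
        # Treat the signal as an opt-out of data sale and
        # third-party targeted advertising.
        return {"sell_data": False, "targeted_ads": False}
    # Without a signal, fall back to the user's stored preferences
    # (shown as permissive here only to illustrate the contrast).
    return {"sell_data": True, "targeted_ads": True}
```

The point of the sketch is that honoring the signal is technically trivial; what a law like the DFA would add is the obligation to respect it, and a bar on overriding it through banners or interface design.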

Addressing surveillance and profiling also protects children, since many online harms are tied to the collection and exploitation of their data. Systems that serve ads or curate content often rely on intrusive profiling practices, raising concerns about privacy and fairness, particularly when applied to minors. Rather than turning to invasive age verification, the focus should be on limiting data use by default.

Strengthen User Sovereignty

There is a major gap in how EU law addresses user autonomy in digital markets: many digital products and services still restrict what people can do with what they pay for through opaque or one-sided licensing terms, technical protection measures, and remote controls. These mechanisms increasingly limit lawful use, modification, or access after purchase, allowing providers to revoke access, disable functionalities, or degrade performance over time. In practice, this turns ownership into a conditional rental.

Consumers must be able to use and resell digital goods without hidden limitations and with clear licensing terms. Too often, technical and contractual lock-ins, including remote lockouts and unilateral restrictions on functionality, erode that control. Recent legal reforms show that progress is possible. Rules such as those under the Digital Markets Act have begun to curb technical and contractual barriers and promote user choice. However, many restrictions persist.

The DFA must address these practices by targeting unfair post-sale restrictions and strengthening users’ ability to control and switch services. This means setting clear limits on unfair terms and misleading practices, alongside robust transparency on how digital services function over time. It should also strengthen interoperability and support user control, allowing people to access third-party applications and to let trusted applications act on their behalf, reducing lock-in and expanding meaningful choice in how users interact with digital services.

Christoph Schmon