Tech Nonprofits to Feds: Don’t Weaponize Procurement to Undermine AI Trust and Safety

2 hours 34 minutes ago

While the very public fight continues between the Department of Defense and Anthropic over whether the government can punish a company for refusing to allow its technology to be used for mass surveillance, another part of the U.S. government is quietly working to ensure that this dispute will never happen again. How? By rewriting government procurement rules.

Using procurement -- the processes by which governments acquire goods and services -- to accomplish policy goals is a time-honored and often appropriate strategy. The government literally expresses its politics and priorities by deciding where and how it spends its money. To that end, governments can and should give our tax dollars to companies and projects that serve the public interest, such as open-source software development, interoperability, or right to repair. And they should withhold those dollars from those that don't, like shady contractors with inadequate security systems.

New proposed rules from the General Services Administration, the principal agency in charge of acquiring goods, property, and services for the federal government, are supposed to be primarily an effort to implement one policy priority: steering government funds toward "ideologically neutral" American AI innovation. But the new guidelines do far more than that.

As explained in comments filed today with our partners at the Center for Democracy and Technology, the Protect Democracy Project, and the Electronic Privacy Information Center, the GSA’s guidelines include broad provisions that would make AI tools less safe and less useful. If finally adopted, these provisions would become standard components of every federal contract. You can read the full comments here.

The most egregious example is a requirement that contractors and government service providers must license their AI systems to the government for "all lawful purposes." Given the government's loose interpretations of the law, ability to find loopholes to surveil you, and willingness to do illegal spying, we need serious and proactive legal restrictions to prevent it from gobbling up all the personal data it can acquire and using even routine bureaucratic data for punitive ends.

Relatedly, the draft rules require that “AI System(s) must not refuse to produce data outputs or conduct analyses based on the Contractor’s or Service Provider’s discretionary policies.” In other words, if a company’s safety guardrails might prevent responding to a government request, the company must disable those guardrails. Given widespread public concerns about AI safety, it seems misguided, at best, to limit safeguards a company deems necessary.

There are myriad other problems with the draft rules, such as technologically incoherent “anti-Woke” requirements. But the overarching problem is clear: much of this proposal would not serve the overall public interest in using American tax dollars to promote privacy, safety, and responsible technological innovation. The GSA should start over.


Corynne McSherry

[April Developments in Publishing] The Third Great Wave Sweeping Over the Publishing Industry

3 hours 11 minutes ago
◆ JBPA to draft guidelines on generative AI use: The Japan Book Publishers Association, the industry body for publishers, is setting out to create guidelines for publishers' use of generative AI. It will draft guidelines for introducing AI into editorial work such as translation and proofreading performed by publishers. It will also prepare for cases where a book's author uses AI while writing, for example by adding provisions to publishing contracts. The association has established a study group to discuss responses to generative AI, with about 40 of its 381 member companies participating; discussions will include hearings with outside experts. It aims to finalize guidelines for publishers by autumn. ◆..
JCJ

Double Shot of Privacy's Defender in D.C.

4 hours 13 minutes ago

You’re invited on a journey inside the privacy battles that shaped the internet. EFF’s Executive Director Cindy Cohn has tangled with the feds, fought for your data security, and argued before judges to protect our access to science and knowledge on the internet.

Join Cindy at two events in Washington, D.C. on April 13 and 14 discussing her new book: Privacy's Defender: My Thirty-Year Fight Against Digital Surveillance, on sale now. All proceeds from the book benefit EFF. Find the full event details below, and RSVP to let us know if you can make it.

April 13 - With Gigi Sohn at Busboys & Poets

Join American Association of Public Broadband (AAPB) Executive Director Gigi Sohn, in conversation with EFF Executive Director Cindy Cohn, for a discussion about Cindy's work, her new book, and what we're all wondering: Can we have private conversations if we live our lives online?

Privacy's Defender at Busboys & Poets
Busboys & Poets - 14th & V
2021 14th St NW, Washington, DC 20009
Monday, April 13, 2026
6:30 pm to 8:30 pm

Register Now

April 14 - With Women in Security and Privacy (WISP)

Join Women in Security and Privacy (WISP) and EFF for a conversation featuring American University Senior Professorial Lecturer Chelsea Horne and EFF Executive Director Cindy Cohn as they dive into data security, Federal access to data, and your digital rights. 

Privacy's Defender with WISP
True Reformer Building - Lankford Auditorium
1200 U St NW, Washington, DC 20009
Tuesday, April 14, 2026
6:00 pm to 8:30 pm

Register Now

"Privacy’s Defender is a compelling account of a life well lived and an inspiring call to action for the next generation of civil liberties champions."

~Edward Snowden, whistleblower; author of Permanent Record

Can't make it? Look for Cindy at a city (or web connection) near you! Find the latest tour dates on the Privacy’s Defender hub or follow EFF for more.

Part memoir and part legal history for the general reader, Privacy’s Defender is a compelling testament to just how much privacy and free expression matter in our efforts to combat authoritarianism, grow democracy, and strengthen human rights. Thank you for being a part of that fight.

Want to support the cause and get a copy of the new book? New or renewing EFF members can preorder one as their annual gift!

Aaron Jue

Weakening Speech Protections Will Punish All of Us—Not Just Meta

21 hours 28 minutes ago

Recently, a California Superior Court jury found that Meta and YouTube harmed a user through some of the features they offered. And a New Mexico jury concluded that Meta deceived young users into thinking its platforms were safe from predation. 

It’s clear that many people are frustrated by big tech companies and perhaps Meta in particular. We too have been highly critical of them and have pushed for years to end their harmful corporate surveillance. So it’s not surprising that a jury felt like Mark Zuckerberg and his company, along with YouTube, needed to be held accountable. 

While it would be easy to claim that these cases set a legal precedent that should make social media companies fearful, that’s not exactly true. And that’s actually a good thing for the internet and its users. 

These jury trials were just an early step in a long road through the court system. These cases will now go up on appeal, where the courts’ rulings about the First Amendment and immunity under Section 230 will likely get reconsidered. 

As we have argued many times before, the First Amendment protects both user speech and the choices platforms make on how to deliver that speech (in the same way it protects newspapers' right to curate their editorial pages as they see fit). Features on social media sites that are designed to connect users cannot be separated from the users’ speech, which is why courts have repeatedly held that these features are indeed protected. 

So while it may be tempting to celebrate these juries’ decisions as a "win" against big tech, in fact the ramifications of lowering First Amendment and immunity standards on other speakers—ones that members of the public actually like, and do not want to punish—are bad. We can’t create less protective speech rules for Meta and Google alone just because we want them held accountable for something else.

As we have often said, much of the anger against these companies arises from people rightfully feeling that these companies harvest and exploit their data, and monetize their lives for crass economic reasons. We therefore continue to urge Congress to pass a comprehensive national privacy law with a private right of action to address these core concerns.

David Greene

A Baseless Copyright Claim Against a Web Host—and Why It Failed

22 hours 36 minutes ago

Copyright law is supposed to encourage creativity. Too often, it’s used to extract payouts from others.

Higbee & Associates, a law firm known for sending copyright demand letters to website owners, targeted May First Movement Technology, accusing it of infringing a photograph owned by Agence France-Presse (AFP). The claim was baseless. May First didn’t post the photo. It didn’t even own the website where the photo appeared.

May First is a nonprofit membership organization that provides web hosting and technical infrastructure to social justice groups around the world. The allegedly infringing image was posted years ago by one of May First’s members, a human rights group based in Mexico. When May First learned about the copyright complaint, it ensured that the group removed the image.

That should have been the end of it. Instead, the firm demanded payment.

So EFF stepped in as May First’s counsel and explained why AFP and Higbee had no valid claim. After receiving our response, Higbee backed down.

This outcome is a reminder that targets of copyright demands often have strong defenses—especially when someone else posted the material.

Hosting Content Isn’t the Same as Publishing It

Copyright law treats those who create or control content differently from those who simply provide the tools or infrastructure for others to communicate.

In this case, May First provided hosting services but didn’t post the photo. Courts have long recognized that service providers aren’t direct infringers when they merely store material at the direction of users. In those cases, service providers lack “volitional conduct”—the intentional act of copying or distributing the work.

Copyright law also recognizes that intermediaries can’t realistically police everything users upload. That’s why legal protections like the Digital Millennium Copyright Act safe harbors exist. Even outside those safe harbors, courts still shield service providers from liability when they promptly respond to notices.

May First did exactly what the law expects: it notified its member, and the image came down.

A Claim That Should Have Been Withdrawn Much Sooner

The troubling part of this story isn’t just that a demand was sent. It’s that Higbee and AFP continued to demand money and threaten litigation after May First explained that it was merely a hosting provider and had the image removed.

In other words, the claim was built on shaky legal ground from the start. Once May First explained its role, Higbee should have withdrawn its demand. Individuals and small nonprofits shouldn’t need lawyers just to stop aggressive copyright shakedowns.

Statutory Damages Fuel Copyright Abuse

This isn’t an isolated case—it’s a predictable result of copyright law’s statutory damages regime.

Statutory damages can reach $150,000 per work, regardless of actual harm. That enormous leverage incentivizes firms like Higbee to send mass demand letters seeking quick settlements. Even meritless claims can generate revenue when recipients are too afraid, confused, or resource-constrained to fight back.

This hits community organizations, independent publishers, and small service providers that don’t have in-house legal teams especially hard. Faced with the threat of ruinous statutory damages, many just pay what is demanded.

That’s not how copyright law should work.

Know Your Rights

If you receive a copyright demand based on material someone else posted, don’t assume you’re liable.

You may have defenses based on:

  • Your role as a hosting or service provider
  • Lack of volitional conduct
  • Prompt removal of the material after notice
  • The statute of limitations
  • The copyright owner’s failure to timely register the work
  • The absence of actual damages

Every situation is different, but the key point is this: a demand letter is not the same as a valid legal claim.

Standing Up to Copyright Trolls

May First stood its ground, and Higbee abandoned its demand after we explained the law.

But the bigger problem remains. Copyright's statutory damages framework enables aggressive enforcement tactics that target the wrong parties and chill lawful online activity.

Until lawmakers fix these structural incentives, organizations and individuals will keep facing pressure to pay up—even when they’ve done nothing wrong.

If you get one of these demand letters, remember: you may have more rights than it suggests.

Betty Gedlu