EFF Sues DHS and ICE For Records on Subpoenas Seeking to Unmask Online Critics
SAN FRANCISCO – The Electronic Frontier Foundation (EFF) sued the Department of Homeland Security (DHS) and Immigration and Customs Enforcement (ICE) today demanding public records about their use of administrative subpoenas to try to identify their online critics.
Court records and news reports show that in the past year, DHS has used administrative subpoenas to unmask or locate people who have documented ICE's activities in their community, criticized the government, or attended protests. The subpoenas are sent to technology companies to demand information about internet users who are often engaged in protected First Amendment activity.
These subpoenas are dangerous because they don’t require judges’ approval. But they are also unlawful, and the government knows it. When a few users challenged them in court with the help of American Civil Liberties Union affiliates in Northern California and Pennsylvania, DHS withdrew them rather than waiting for a decision.
DHS and ICE have ignored EFF’s public-records requests for documents about the processes behind these subpoenas, so EFF sued Wednesday in the U.S. District Court for the District of Columbia.
“DHS and ICE should not be able to first claim that they have the legal authority to unmask critics and then run from court when users challenge these administrative subpoenas,” said EFF Deputy Legal Director Aaron Mackey. “The public deserves to know what laws the agencies believe give them the power to issue these speech-chilling subpoenas.”
Administrative subpoenas cannot be used to obtain the content of communications, but they have been used to try to obtain basic subscriber information such as name, address, IP address, length of service, and session times. If a technology company refuses to comply, the agency’s only recourse is to drop the subpoena or go to court and try to convince a judge that the request is lawful.
EFF and the ACLU of Northern California in February wrote to Amazon, Apple, Discord, Google, Meta, Microsoft, Reddit, SNAP, TikTok, and X to ask that they insist on court intervention and an order before complying with a DHS subpoena; give users as much notice as possible when they are the target of a subpoena, so the users can seek help; and resist gag orders that would prevent the companies from notifying users who are targets of subpoenas.
And EFF last week asked California’s and New York’s attorneys general to investigate Google for deceptive trade practices for breaking its promise to notify users before handing their data to law enforcement, citing the case of a doctoral student who was targeted with an ICE subpoena after briefly attending a pro-Palestine protest.
EFF in early March filed public-records requests with DHS and ICE for their policies, procedures, guidelines, directives, memos, and legal analyses supporting such use of administrative subpoenas. EFF also requested all Inspector General or oversight records, all approval and issuance procedures for the subpoenas, all records reflecting how many such subpoenas have been issued, all communications with technology companies concerning these demands, all communications regarding specific named targets or programs, and all communications with the Department of Justice regarding such subpoenas.
DHS and ICE have not responded, even though EFF requested expedited processing of its requests, which requires agencies to get back to requesters within 10 days.
“The policies, directives, and authorization records governing the program have not been disclosed,” the complaint notes. “The legal basis asserted by DHS and ICE for using a customs statute to compel disclosure of information about persons engaged in constitutionally protected speech and association has not been made public.”
For the complaint: https://www.eff.org/document/eff-v-dhs-ice-administrative-subpoenas-complaint
For EFF’s letter urging tech companies to protect users: https://www.eff.org/deeplinks/2026/02/open-letter-tech-companies-protect-your-users-lawless-dhs-subpoenas
For EFF’s letter urging state probes of Google: https://www.eff.org/press/releases/eff-state-ags-investigate-googles-broken-promise-users-targeted-government
Between the algorithmisation of territories and the monoculture of data: Are there paths towards AI that respect rights and life?
JVN: Path traversal vulnerability in Ziostation2
JVN: Arbitrary DLL loading vulnerability in the LiveOn Meet client installer and plugin installer for Windows PCs
JVN: Cross-site scripting vulnerability in the DeepL Chrome extension
For AI to work for us, it will have to stop pretending to be us
JVN: CISA ICS Advisory / ICS Medical Advisory (April 21, 2026)
JVN: Multiple vulnerabilities in Silex Technology SD-330AC and AMC Manager
Weekly Report: Alert regarding vulnerabilities in Adobe Acrobat and Reader (APSB26-43)
Call for applications from demonstration organizations for the "Project for the Development and Demonstration of Technologies to Counter False and Misleading Information on the Internet"
FY2025 Supplementary Budget "Regional Society DX Promotion Package Project": Selection results of the first call for applications for "Demonstration Projects (Advanced Communication System Utilization Type)"
Survey on the prevention of disasters caused by embankments, etc. (notification based on the results)
Launch of the dedicated website for the "Broadcast Content Production Transaction and Legal Consultation Hotline"
Announcement of the 56th General Meeting of the Information and Communications Council
Results of the public comment period on draft ministerial ordinances partially amending the fundamental standards for the establishment of radio stations (excluding core broadcasting stations), and the report from the Radio Regulatory Council
Handouts for the 4th meeting of the Youth Protection Working Group of the Study Group on Addressing Issues in Information Distribution in Digital Spaces
"Monthly Survey on Service Industries," February 2026 (preliminary results)
Information on general career-track technical positions (information and communications administration) has been updated
Copyright and DMCA Best Practices for Fediverse Operators
People building the future of the social web — interoperable and decentralized — need to protect themselves against copyright liability. Like anyone who creates and operates platforms for user-uploaded content, the hosts of the decentralized social web can take preventive measures to reduce their legal exposure when a user posts material that violates someone’s copyright.
This post gives an overview of the steps to take. It’s meant for operators of Mastodon and other ActivityPub servers, Bluesky hosts, RSS mirrors, and other decentralized social media services, and for developers of apps for those protocols — but it will apply to other hosts as well. This isn’t legal advice, and can’t substitute for a consultation with a lawyer about your specific circumstances. It focuses on U.S. law — the law may impose different requirements elsewhere. Still, we hope it helps you get started with confidence.
Why should I care? Copyright’s Sword of Damocles
In some circumstances, the operator of a platform that handles user content can be legally responsible for content that infringes copyright. That can happen when the platform operator is directly involved in copying or distributing the copyrighted material, when they promote or knowingly assist the infringement, or when they benefit financially from infringement while being in a position to supervise it. But these judge-made rules are often difficult and uncertain to apply in practice — and the penalties for being found on the wrong side of the law can be severe. Copyright’s “statutory damages” regime allows for massive, unpredictable financial liability. That’s why it’s important to limit your risk.
For Server Operators: Limiting Risk with the DMCA Safe Harbors
If you run a social network server, the safe harbor provisions of the Digital Millennium Copyright Act (DMCA) are an important way to limit your liability risk. The DMCA shields server operators from nearly all forms of copyright liability that can result from “storage at the direction of a user” — in other words, hosting user-uploaded content. But to qualify for this protection, there are steps a server operator has to take.
1. Designate A Contact To Receive Copyright Infringement Notices
First, you’ll need to provide contact information for someone who can receive infringement notices (a “designated agent”). That information needs to be posted in at least two places: on your server in a place visible to users (such as a “DMCA” page or post, or as part of your Terms of Service), and in the U.S. Copyright Office’s “Designated Agent Directory.” To post that information to the directory, you have to create an account at https://www.copyright.gov/dmca-directory/ and pay a small fee. The directory listings expire after three years, and once expired, your safe harbor protection goes away, so it’s important to keep that listing current.
2. Respond Promptly to Notices and Counter-notices
When you receive infringement notices, it’s important to respond to them promptly. Notices are supposed to identify the copyright holder, the copyrighted work they claim was infringed, and the post they claim is infringing. By deleting or disabling access to the posted material, you protect yourself from liability with respect to that material.
The theory behind Section 512 is that hosts don’t have to be in a position of deciding whether a post infringes someone’s copyright — it’s up to the poster, the rights holder, and potentially a court to decide that. A host who takes down posts whenever they receive an infringement notice is well-protected. But it’s equally important to recognize that hosts aren’t required to take down content in response to every notice. Infringement notices are frequently wrong, misguided, abusive, or simply incomplete. Hosts who want to stand up for their users’ speech can choose to disregard infringement notices that seem suspect. While this risks losing the automatic protection of the safe harbor in each instance, it can still be done safely with careful preparation, ideally using a plan crafted with help from a lawyer. Bear in mind that people sending false notices, including by failing to consider whether a post is a fair use before asking a host to take it down, can be liable for damages under the DMCA.
The DMCA also allows the person who posted the material to send a “counter-notification” asserting that they really did have the right to post and that there’s no copyright infringement. Responding to counter-notifications is a good way for a host to demonstrate that they look out for their users. When a host receives a counter-notification, they should forward it on to the person who sent the original takedown notice and let them know that the post will be restored in 10 business days. Then, after that waiting period has elapsed, the host can restore the posted material. Just like with infringement notices, a host isn’t required to honor a counter-notification that appears to be fraudulent, but there’s no penalty for honoring it anyway.
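The counter-notification timeline above has one concrete arithmetic step: the 10-business-day waiting period before restoring a post. A minimal sketch of that calculation follows; `restore_date` is a hypothetical helper name, and counting only weekdays is a simplification (a real policy might also skip federal holidays, and should be checked with a lawyer).

```python
from datetime import date, timedelta

def restore_date(counter_notice_received: date, business_days: int = 10) -> date:
    """Earliest date a host could restore material after forwarding a
    counter-notification, counting only weekdays after the receipt date.
    Simplified sketch: ignores holidays."""
    d = counter_notice_received
    remaining = business_days
    while remaining > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday (0) through Friday (4)
            remaining -= 1
    return d

# A counter-notice received on Monday, April 20, 2026 would clear the
# 10-business-day wait on Monday, May 4, 2026.
print(restore_date(date(2026, 4, 20)))
```

In practice a host would track this date per takedown and queue the restoration rather than computing it ad hoc, but the core of the rule is just this weekday count.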
3. Have A Repeat Infringer Policy
The next requirement is to have a policy of terminating the accounts of “subscribers and account holders” who are “repeat infringers” in “appropriate circumstances,” and to carry out that policy. Yes, that’s a vague requirement. It doesn’t require a “three strikes” policy or any other sports analogy. It just needs to be reasonable. Be sure your policy is spelled out in your website terms or “DMCA” page.
4. Don’t Ignore Known Infringement
Hosts need to take down user posts whenever the host actually knows that the post is infringing. In other words, a host isn’t protected if they ignore takedown notices based on technicalities in the notices, or if they learn about the infringement some other way. But hosts don’t need to actively look for infringement on their servers — only to act when someone notifies them.
5. Don’t Encourage Infringement
Finally, make sure that nothing you post or advertise actively encourages copyright infringement. For example, don’t post examples of users uploading copyrighted music or video without permission, or insinuate that your server is a good place for infringing content.
There are some other technicalities in the DMCA that can affect the safe harbor, which is why it’s always a good idea to consult with a lawyer. But following these steps will help protect you when you run a social media server — or any other kind of user-uploaded content platform.