The 10th round of negotiations (Director-General/Deputy Director-General meeting) for the Japan-China-Korea Free Trade Agreement (FTA) will be held
Study group report compiled: "Toward the Realization of a Vibrant 'Vintage Society'"
An international standard on map information, expected to support integration with automated driving, has been published
The corporate split of Tokyo Electric Power Company has been approved under the Electricity Business Act
[NHK and the Future of Broadcasting Public Lecture] From disinformation in the Hyogo gubernatorial election to TBS's major "Hodo Tokushu" campaign: lecture by TBS's 曺琴袖 = Shinji Kono
[Current Affairs Cartoon] Steadily Preparing for an "Anti-China" War = art by Happo Bijin
[B] "UN Genocide General Assembly" [Western Sahara Latest News] Itsuko Hirata
[Publishing Industry Trends] August-September 2025 = Publishing Section
Companies Must Provide Accurate and Transparent Information to Users When Posts are Removed
This is the third installment in a blog series documenting EFF's findings from the Stop Censoring Abortion campaign. You can read additional posts here.
Imagine sharing information about reproductive health care on social media and receiving a message that your content has been removed for violating a policy intended to curb online extremism. That’s exactly what happened to one person using Instagram who shared her story with our Stop Censoring Abortion project.
Meta’s rules for “Dangerous Organizations and Individuals” (DOI) were supposed to be narrow: a way to prevent the platform from being used by terrorist groups, organized crime, and those engaged in violent or criminal activity. But over the years, we’ve seen these rules applied in far broader—and more troubling—ways, with little transparency and significant impact on marginalized voices.
EFF has long warned that the DOI policy is opaque, inconsistently enforced, and prone to overreach, and others have likewise criticized it for its opacity and its tendency to disproportionately censor marginalized groups.
Samantha Shoemaker's post about Plan C was flagged under Meta's policy on dangerous organizations and individuals
Meta has since added examples and clarifications to this and other policies in its Transparency Center, but its enforcement still leaves users in the dark about what’s allowed and what isn’t.
The case we received illustrates just how harmful this lack of clarity can be. Samantha Shoemaker, an individual sharing information about abortion care, posted straightforward facts about accessing abortion pills. Her posts included:
- A video linking to Plan C’s website, which lists organizations that provide abortion pills in different states.
- A reshared image from Plan C’s own Instagram account encouraging people to learn about advance provision of abortion pills.
- A short clip of women talking about their experiences taking abortion pills.
Instead of allowing her to facilitate informed discussion, Instagram flagged some of her posts under its “Prescription Drugs” policy, while others were removed under the DOI policy—the same set of rules meant to stop violent extremism from being shared.
We recognize that moderation systems—both human and automated—will make mistakes. But when Meta equates medically accurate, harm-reducing information about abortion with “dangerous organizations,” it underscores a deeper problem: the blunt tools of content moderation disproportionately silence speech that is lawful, important, and often life-saving.
At a time when access to abortion information is already under political attack in the United States and around the world, platforms must be especially careful not to compound the harm. This incident shows how overly broad rules and opaque enforcement can erase valuable speech and disempower users who most need access to knowledge.
And when content does violate the rules, it’s important that users are given accurate information as to why. An individual sharing information about health care will understandably be confused or upset by being told that they have violated a policy meant to curb violent extremism. Moderating content responsibly means offering users as much transparency and clarity as possible. As outlined in the Santa Clara Principles on Transparency and Accountability in Content Moderation, users should be able to readily understand:
- What types of content are prohibited by the company and will be removed, with detailed guidance and examples of permissible and impermissible content;
- What types of content the company will take action against other than removal, such as algorithmic downranking, with detailed guidance and examples on each type of content and action; and
- The circumstances under which the company will suspend a user’s account, whether permanently or temporarily.
If you find your content removed under Meta’s policies, you do have options:
- Appeal the decision: Every takedown notice should give you the option to appeal within the app. Appeals are sometimes reviewed by a human moderator rather than an automated system.
- Request Oversight Board review: In certain cases, you can escalate to Meta’s independent Oversight Board, which has the power to overturn takedowns and set policy precedents.
- Document your case: Save screenshots of takedown notices, appeals, and your original post. This documentation is essential if you want to report the issue to advocacy groups or in future proceedings.
- Share your story: Projects like Stop Censoring Abortion collect cases of unjust takedowns to build pressure for change. Speaking out, whether to EFF and other advocacy groups or to the media, helps illustrate how policies harm real people.
Abortion is health care. Sharing information about it is not dangerous—it’s necessary. Meta should allow users to share vital information about reproductive care. The company must also ensure that users are given clear information about how its policies are being applied and how to appeal seemingly wrongful decisions.
This is the third post in our blog series documenting the findings from our Stop Censoring Abortion campaign. Read more in the series: https://www.eff.org/pages/stop-censoring-abortion
Vacancy: Executive Director
We’re recruiting a new Executive Director to lead us into our 35th year and beyond.
The successful candidate will build on the organisation’s rich history and legacy of exposing and opposing state secrecy, surveillance, repression and violence; and supporting and resourcing struggles for rights, liberties, transparency, and democracy.
They will be strategic, cooperative and adaptable, and have strong organisational, coordination and communication skills.
On 19 September we hosted an information session for interested applicants via Zoom. If you would like access to the recording, please email comms [at] statewatch.org.
Apply for the role via the CharityJob website.
Deadline for applications: 11:00 BST, 6 October 2025
JVN: Multiple vulnerabilities in Westermo WeOS 5
JVN: Multiple OS command injection vulnerabilities in multiple Schneider Electric products
JVN: Multiple vulnerabilities in multiple Hitachi Energy products
JVN: Multiple vulnerabilities in multiple Cognex products
JVN: Multiple vulnerabilities in Dover Fueling Solutions ProGauge MagLink LX
Notice: JPCERT/CC Eyes "Commentary: Operation of the Vulnerability Information Handling System and Future Challenges (Part 2) — Handling of Vulnerability Exploitation Information and Remaining Issues"
Shining a Spotlight on Digital Rights Heroes: EFF Awards 2025
It's been a year full of challenges, but also important victories for digital freedoms. From EFF’s new lawsuit against OPM and DOGE, to launching Rayhunter (our new tool to detect cellular spying), to exposing the censorship of abortion-related content on social media, we’ve been busy! But we’re not the only ones leading the charge.
On September 10 in San Francisco, we presented the annual EFF Awards to three courageous honorees who are pushing back against unlawful surveillance, championing data privacy, and advancing civil liberties online. This year’s awards went to Just Futures Law, Erie Meyer, and the Software Freedom Law Center, India.
If you missed the celebration in person, you can still watch the recording! The full event is posted on YouTube and the Internet Archive, and a transcript of the live captions is also available.
SEE THE EFF AWARDS CEREMONY ON YOUTUBE
Looking Back, Looking Ahead
EFF Executive Director Cindy Cohn opened the evening by reflecting on our victories this past year and reiterated how vital EFF’s mission to protect privacy and free speech is today. She also announced her upcoming departure as Executive Director after a decade in the role (and over 25 years of involvement with EFF!). No need to be too sentimental—Cindy isn’t going far. As we like to say: you can check out at any time, but you never really leave the fight.
Cindy then welcomed one of EFF’s founders, Mitch Kapor, who joked that he had been “brought out of cold storage” for the occasion. Mitch recalled EFF’s early days, when no one knew exactly how constitutional rights would interact with emerging technologies—but everyone understood the stakes. “We understood that the matter of digital rights were very important,” he reflected. And history has proven them right.
Honoring Defenders of Digital Freedom
The first award of the night, the EFF Award for Defending Digital Freedoms, went to the Software Freedom Law Center, India (SFLC.IN). Presenting the award, EFF Civil Liberties Director David Greene emphasized the importance of international partners like SFLC.IN, whose local perspectives enrich and strengthen EFF’s own work.
SFLC.IN is at the forefront of digital rights in India—challenging internet shutdowns, tracking violations of free expression with their Free Speech Tracker, and training lawyers across the country. Accepting the award, SFLC.IN founder Mishi Choudhary reminded us: “These freedoms are not abstract. They are fought for every day by people, by organizations, and by movements.”
SFLC.IN founder Mishi Choudhary accepts the EFF Award for Defending Digital Freedoms
Next, EFF Staff Attorney Mario Trujillo introduced the winner of the EFF Award for Protecting Americans’ Data, Erie Meyer. Erie has served as CTO of the Federal Trade Commission and Consumer Financial Protection Bureau, and was a founding member of the U.S. Digital Service. Today, she continues to fight for better government technology and safeguards for sensitive data.
In her remarks, Erie underscored the urgency of protecting personal data at scale: “We need to protect people’s data the same way we protect this country from national security risks. What’s happening right now is like all the data breaches in history rolled into one. ‘Trust me, bro’ is not a way to handle 550 million Americans’ data.”
Erie Meyer accepts the EFF Award for Protecting Americans’ Data
Finally, EFF General Counsel Jennifer Lynch introduced the EFF Award for Leading Immigration and Surveillance Litigation, presented to Just Futures Law. Co-founder and Executive Director Paromita Shah accepted on behalf of the organization, which works to challenge the ways surveillance disproportionately harms people of color in the U.S.
“For years, corporations and law enforcement—including ICE—have been testing the legal limits of their tools on communities of color,” Paromita said in her speech. Just Futures Law has fought back, suing the Department of Homeland Security to reveal its use of AI, and defending activists against surveillance technologies like Clearview AI.
Just Futures Law Executive Director Paromita Shah accepts the EFF Award for Leading Immigration and Surveillance Litigation
Carrying the Work Forward
We’re honored to shine a spotlight on these award winners, who are doing truly fearless and essential work to protect online privacy and free expression. Their courage reminds us that the fight for civil liberties will be won when we work together—across borders, communities, and movements.
Join the fight and donate today
A heartfelt thank you to all of the EFF members worldwide who make this work possible. Public support is what allows us to push for a better internet. If you’d like to join the fight, consider becoming an EFF member—you’ll receive special gear as our thanks, and you’ll help power the digital freedom movement.
And finally, special thanks to the sponsor of this year’s EFF Awards: Electric Capital.
Catch Up From the Event
A reminder: if you missed the event, you can watch the full recording on our YouTube and the Internet Archive. Plus, a special thank you to our photographers, Alex Schoenfeldt and Carolina Kroon. You can see some of our favorite group photos taken during the event, as well as photos of the awardees with their trophies.