Announcement: AI Security Subcommittee (2nd Meeting)
FY2025 Minister for Internal Affairs and Communications Commendation Ceremony for Municipal Mayors, Chairs of Prefectural Assemblies, and Chairs of Municipal Assemblies
Information and Communications Council, ICT Subcommittee, Land Mobile Communications Committee (95th Meeting)
Study Group on "Next-Generation ITS Communications" for the Autonomous Driving Era (3rd Term, 2nd Meeting): Distributed Materials
Information and Communications Council, ICT Subcommittee, Satellite Communications Systems Committee (49th Meeting)
❌ How Meta Is Censoring Abortion | EFFector 37.13
It's spooky season—but while jump scares may get your heart racing, catching up on digital rights news shouldn't! Our EFFector newsletter has got you covered with easy, bite-sized updates to keep you up-to-date.
In this issue, we spotlight new ALPR-enhanced police drones and how local communities can push back; unpack the ongoing TikTok “ban,” which we’ve consistently said violates the First Amendment; and celebrate a privacy win—abandoning a phone doesn't mean you've also abandoned your privacy rights.
Prefer to listen in? Check out our audio companion, where we interview EFF Staff Attorney Lisa Femia, who explains the findings from our investigation into abortion censorship on social media. Catch the conversation on YouTube or the Internet Archive.
Since 1990 EFF has published EFFector to help keep readers on the bleeding edge of their digital rights. We know that the intersection of technology, civil liberties, human rights, and the law can be complicated, so EFFector is a great way to stay on top of things. The newsletter is chock full of links to updates, announcements, blog posts, and other stories to help keep readers—and listeners—up to date on the movement to protect online privacy and free expression.
Thank you to the supporters around the world who make our work possible! If you're not a member yet, join EFF today to help us fight for a brighter digital future.
EFF Is Standing Up for Federal Employees—Here’s How You Can Stand With Us
Federal employees play a key role in safeguarding the civil liberties of millions of Americans. Our rights to privacy and free expression can only survive when we stand together to push back against overreach and ensure that technology serves all people—not just the powerful.
That’s why EFF jumped to action earlier this year, when the U.S. Office of Personnel Management (OPM) handed over sensitive employee data—Social Security numbers, benefits data, work histories, and more—to Elon Musk’s Department of Government Efficiency (DOGE). This was a blatant violation of the Privacy Act of 1974, and it put federal workers directly at risk.
We didn’t let it stand. Alongside federal employee unions, EFF sued OPM and DOGE in February. In June, we secured a victory when a judge ruled we were entitled to a preliminary injunction and ordered OPM to provide an accounting of DOGE’s access to employee records. Your support makes this possible.
Now the fight continues—and your support matters more than ever. The Office of Personnel Management is planting the seeds to undermine and potentially remove the Combined Federal Campaign (CFC), the main program federal employees and retirees have long used to support charities—including EFF. For now, you can still give to EFF through the CFC this year (use our ID: 10437) and we’d appreciate your support! But with the program’s uncertain future, direct support is the best way to keep our work going strong for years to come.
SUPPORT EFF'S WORK DIRECTLY, BECOME A MEMBER!
When you donate directly, you join a movement of lawyers, activists, and technologists who defend privacy, call out censorship, and push back against abuses of power—everywhere from the courts to Congress and to the streets. As a member, you’ll also receive insider updates, invitations to exclusive events, and conversation-starting EFF gear.
Plus, you can sustain our mission long-term with a monthly or annual donation!
Stand with EFF. Protect privacy. Defend free expression. Support our work today.
Related Cases: American Federation of Government Employees v. U.S. Office of Personnel Management

Platforms Have Failed Us on Abortion Content. Here's How They Can Fix It.
This is the eighth installment in a blog series documenting EFF's findings from the Stop Censoring Abortion campaign. You can read additional posts here.
In our Stop Censoring Abortion series, we’ve documented the many ways that reproductive rights advocates have faced arbitrary censorship on Meta platforms. Since social media is the primary—and sometimes the only—way that providers, advocates, and communities can safely and effectively share timely and accurate information about abortion, it’s vitally important that platforms take steps to proactively protect this speech.
Yet, even though Meta says its moderation policies allow abortion-related speech, its enforcement of those policies tells a different story. Posts are being wrongfully flagged, accounts are disappearing without warning, and important information is being removed without clear justification.
So what explains the gap between Meta’s public commitments and its actions? And how can we push platforms to be better—to, dare we say, #StopCensoringAbortion?
After reviewing nearly one hundred submissions and speaking with Meta to clarify its moderation practices, here’s what we’ve learned.
Platforms’ Editorial Freedom to Moderate User Content

First, given the current landscape—with some states trying to criminalize speech about abortion—you may be wondering how much leeway platforms like Facebook and Instagram have to choose their own content moderation policies. In other words, can social media companies proactively commit to stop censoring abortion?
The answer is yes. Social media companies, including Meta, TikTok, and X, have the constitutionally protected First Amendment right to moderate user content however they see fit. They can take down posts, suspend accounts, or suppress content for virtually any reason.
The Supreme Court explicitly affirmed this right in 2024 in Moody v. NetChoice, holding that social media platforms, like newspapers, bookstores, and art galleries before them, have the First Amendment right to edit the user speech that they host and deliver to other users on their platforms. The Court also established that the government has a very limited role in dictating what social media platforms must (or must not) publish. This editorial discretion, whether exercised by individuals, the traditional press, or online platforms, is meant to protect these institutions from government interference and to safeguard the diversity of the public sphere, so that important conversations and movements like this one have the space to flourish.
Meta’s Broken Promises

Unfortunately, Meta is failing to meet even these basic standards. Again and again, its policies say one thing while its actual enforcement says another.
Meta has stated its intent to allow conversations about abortion to take place on its platforms. In fact, as we’ve written previously in this series, Meta has publicly insisted that posts with educational content about abortion access should not be censored, even admitting in several public statements to moderation mistakes and over-enforcement. One spokesperson told the New York Times: “We want our platforms to be a place where people can access reliable information about health services, advertisers can promote health services and everyone can discuss and debate public policies in this space. . . . That’s why we allow posts and ads about, discussing and debating abortion.”
Meta’s platform policies largely reflect this intent. But as our campaign reveals, Meta’s enforcement of those policies is wildly inconsistent. Time and again, users—including advocacy organizations, healthcare providers, and individuals sharing personal stories—have had their content taken down even though it did not actually violate any of Meta’s stated guidelines. Worse, they are often left in the dark about what happened and how to fix it.
Arbitrary enforcement like this harms abortion activists and providers by cutting them off from their audiences, wasting the effort they spend creating resources and building community on these platforms, and silencing their vital reproductive rights advocacy. And it goes without saying that it hurts users, who need access to timely, accurate, and sometimes life-saving information. At a time when abortion rights are under attack, platforms with enormous resources—like Meta—have no excuse for silencing this important speech.
Our Call to Platforms

Our case studies have highlighted that when users can’t rely on platforms to apply their own rules fairly, the result is a widespread chilling effect on online speech. That’s why we are calling on Meta to adopt the following urgent changes.
1. Publish clear and understandable policies.

Too often, platforms’ vague rules force users to guess what content might be flagged in order to avoid shadowbanning or worse, leading to needless self-censorship. To prevent this chilling effect, platforms should strive to offer users the greatest possible transparency and clarity on their policies. The policies should be clear enough that users know exactly what is allowed and what isn’t so that, for example, no one is left wondering how exactly a clip of women sharing their abortion experiences could be mislabeled as violent extremism.
2. Enforce rules consistently and fairly.

If content doesn’t violate a platform’s stated policies, it should not be removed. And, per Meta’s own policies, an account should not be suspended for abortion-related content violations if it has not received any prior warnings or “strikes.” Yet as we’ve seen throughout this campaign, abortion advocates repeatedly face takedowns of posts that fall entirely within Meta’s Community Standards, and even account suspensions. At such a massive scale, this selective enforcement erodes trust and chills entire communities from participating in critical conversations.
3. Provide meaningful transparency in enforcement actions.

When content is removed, Meta tends to give vague, boilerplate explanations—or none at all. Instead, users facing takedowns or suspensions deserve detailed and accurate explanations that state the policy violated, reflect the reasoning behind the actual enforcement decision, and explain how to appeal that decision. Clear explanations are key to preventing wrongful censorship and ensuring that platforms remain accountable to their commitments and to their users.
4. Guarantee functional appeals.

Every user deserves a real chance to challenge improper enforcement decisions and have them reversed. But based on our survey responses, it seems Meta’s appeals process is broken. Many users reported that they do not receive responses to appeals, even when the content did not violate Meta’s policies, and thus have no meaningful way to challenge takedowns. Alarmingly, we found that a user’s best (and sometimes only) chance at success is to rely on a personal connection at Meta to right wrongs and restore content. This is unacceptable. Users should have a reliable and efficient appeal process that does not depend on insider access.
5. Expand human review.

Finally, automated systems cannot always handle the nuance of sensitive issues like reproductive health and advocacy. They misinterpret words, miss important cultural or political context, and wrongly flag legitimate advocacy as “dangerous.” Therefore, we call upon platforms to expand the role that human moderators play in reviewing auto-flagged content violations—especially when posts involve sensitive healthcare information or political expression.
Users Deserve Better

Meta has already made the choice to allow speech about abortion on its platforms, and it has not hesitated to highlight that commitment whenever it has faced scrutiny. Now it’s time for Meta to put its money where its mouth is.
Users deserve better than a system where rules are applied at random, appeals go nowhere, and vital reproductive health information is needlessly (or negligently) silenced. If Meta truly values free speech, it must commit to moderating with fairness, transparency, and accountability.
This is the eighth post in our blog series documenting the findings from our Stop Censoring Abortion campaign. Read more at https://www.eff.org/pages/stop-censoring-abortion
Affected by unjust censorship? Share your story using the hashtag #StopCensoringAbortion. Amplify censored posts and accounts, share screenshots of removals and platform messages—together, we can demonstrate how these policies harm real people.
[Recommended Book] Kenta Yamada, "Like a Rolling Stone: Journalism in Flux and Freedom of Expression Under Strain": sharp, grounded commentary from the field that resists surface-level trends, reviewed by Ken Fujimori (JCJ Representative Committee Member)
[Recommended Book] Mihoko Kobayashi and Kenichi Komatsuda, "The Kiryu City Case: In a City Where Public Assistance Was Distorted": a record of the fight to protect the "last line of defense for life," reviewed by Yasuhiko Shirai (freelance journalist)
JVN: Multiple vulnerabilities in OpenSSL (OpenSSL Security Advisory [30th September 2025])
JVN: Multiple vulnerabilities in Keysight Ixia Vision Product Family
JVN: OS command injection vulnerability in Megasys Enterprises Telenium Online Web Application
JVN: Multiple vulnerabilities in multiple Festo products
JVN: Reliance on undefined, unspecified, or implementation-defined behavior in OpenPLC_V3
[B] Junior High School Begins! ~ Ciao! Italy Newsletter
Security Alert: Multiple vulnerabilities in Cisco ASA and FTD (CVE-2025-20333, CVE-2025-20362) (updated)
Gate Crashing: An Interview Series
There is a lot of bad on the internet and it seems to only be getting worse. But one of the things the internet did well, and is worth preserving, is nontraditional paths for creativity, journalism, and criticism. As governments and major corporations throw up more barriers to expression—and more and more gatekeepers try to control the internet—it’s important to learn how to crash through those gates.
In EFF's interview series, Gate Crashing, we talk to people who have used the internet to take nontraditional paths to the very traditional worlds of journalism, creativity, and criticism. We hope it's both inspiring to see these people and enlightening for anyone trying to find voices they like online.
Our mini-series will drop an episode each month, closing out 2025 in style.
- Episode 1: Fanfiction Becomes Mainstream – Launching October 1*
- Episode 2: From DIY to Publishing – Launching November 1
- Episode 3: A New Path for Journalism – Launching December 1
Be sure to mark your calendar or check our socials on drop dates. If you have a friend or colleague who might be interested in watching our series, please forward this link: eff.org/gatecrashing
For over 35 years, EFF members have empowered attorneys, activists, and technologists to defend civil liberties and human rights online for everyone.
Tech should be a tool for the people, and we need you in this fight.
* This interview was originally published in December 2024. No changes have been made.