["NHK and the Future of Broadcasting" Public Lecture] From Disinformation in the Hyogo Gubernatorial Election to the "Hodo Tokushu" Major Campaign: Lecture by TBS's Cho Kun-su = Shinji Kono

2 days 21 hours ago
  The third session of the public lecture series "NHK and the Future of Broadcasting," co-sponsored by the JCJ, titled "What Is Happening at TBS's 'Hodo Tokushu,'" was held on July 20 at Rikkyo University, featuring a lecture by Cho Kun-su, the program's former editor-in-chief. Cho opened the session by telling participants: "I want you to leave here knowing how SNS, as an online medium, is trying to change politics, or, to put it more strongly, how it is trying to destroy democracy." Regarding last November's Hyogo gubernatorial election, "Hodo Tokushu" pursued its reporting in connection with the public-interest whistleblower protection system, examining "the side of the elected Governor Saito and..
JCJ

[B] "UN Genocide General Assembly" [Western Sahara Latest News]  Itsuko Hirata

4 days 20 hours ago
On September 16, 2025, the UN's Independent International Commission of Inquiry announced at a press conference at the United Nations in Geneva that the attacks on Palestinians in Gaza constitute genocide as defined under international law. My reaction: "Why are they only saying this now!" Israel's UN ambassador pushed back, calling the finding "a fabrication relying entirely on false information from Hamas." Israeli Prime Minister Netanyahu declared, "There is no famine in Gaza. We are slaughtering Hamas fighters; we are not killing civilians," and continues to bomb Gaza and carry on the genocide today.
日刊ベリタ

[Publishing Industry Trends] August–September 2025 = Publishing Division

4 days 20 hours ago
◆ First half: 773.7 billion yen (down 2.1%). The Research Institute for Publications announced that estimated combined sales of print and digital publications for the first half of 2025 (January–June) came to 773.7 billion yen (down 2.1% year on year). Of this, print publications accounted for 492.6 billion yen (down 5.4%), though the institute's estimate for print excludes the growing volume of direct transactions between publishers and bookstores, as well as direct sales by publishers. Within print, books accounted for 313.2 billion yen (down 1.5%) and magazines for 179.5 billion yen (down 11.4%). Magazines comprise monthlies (everything other than weeklies, including mooks and comi..
JCJ

Companies Must Provide Accurate and Transparent Information to Users When Posts are Removed

4 days 21 hours ago

This is the third installment in a blog series documenting EFF's findings from the Stop Censoring Abortion campaign. You can read additional posts here. 

Imagine sharing information about reproductive health care on social media and receiving a message that your content has been removed for violating a policy intended to curb online extremism. That’s exactly what happened to one person using Instagram who shared her story with our Stop Censoring Abortion project.

Meta’s rules for “Dangerous Organizations and Individuals” (DOI) were supposed to be narrow: a way to prevent the platform from being used by terrorist groups, organized crime, and those engaged in violent or criminal activity. But over the years, we’ve seen these rules applied in far broader—and more troubling—ways, with little transparency and significant impact on marginalized voices.

EFF has long warned that the DOI policy is opaque, inconsistently enforced, and prone to overreach. Others have likewise criticized the policy for its opacity and its propensity to disproportionately censor marginalized groups.

Samantha Shoemaker's post about Plan C was flagged under Meta's policy on dangerous organizations and individuals

Meta has since added examples and clarifications in its Transparency Center to this and other policies, but their implementation still leaves users in the dark about what’s allowed and what isn’t.

The case we received illustrates just how harmful this lack of clarity can be. Samantha Shoemaker, an individual sharing information about abortion care, posted straightforward facts about accessing abortion pills. Her posts included:

  • A video linking to Plan C’s website, which lists organizations that provide abortion pills in different states.

  • A reshared image from Plan C’s own Instagram account encouraging people to learn about advance provision of abortion pills.

  • A short clip of women talking about their experiences taking abortion pills.
Information Provided to Users Must Be Accurate

Instead of allowing her to facilitate informed discussion, Instagram flagged some of her posts under its “Prescription Drugs” policy, while others were removed under the DOI policy—the same set of rules meant to stop violent extremism from being shared.

We recognize that moderation systems—both human and automated—will make mistakes. But when Meta equates medically accurate, harm-reducing information about abortion with “dangerous organizations,” it underscores a deeper problem: the blunt tools of content moderation disproportionately silence speech that is lawful, important, and often life-saving.

At a time when access to abortion information is already under political attack in the United States and around the world, platforms must be especially careful not to compound the harm. This incident shows how overly broad rules and opaque enforcement can erase valuable speech and disempower users who most need access to knowledge.

And when content does violate the rules, it’s important that users are given accurate information about why. An individual sharing information about health care will understandably be confused or upset by being told that they have violated a policy meant to curb violent extremism. Moderating content responsibly means offering users as much transparency and clarity as possible. As outlined in the Santa Clara Principles on Transparency and Accountability in Content Moderation, users should be able to readily understand:

  • What types of content are prohibited by the company and will be removed, with detailed guidance and examples of permissible and impermissible content;
  • What types of content the company will take action against other than removal, such as algorithmic downranking, with detailed guidance and examples on each type of content and action; and
  • The circumstances under which the company will suspend a user’s account, whether permanently or temporarily.
What You Can Do if Your Content is Removed

If you find your content removed under Meta’s policies, you do have options:

  • Appeal the decision: Every takedown notice should give you the option to appeal within the app. Appeals are sometimes reviewed by a human moderator rather than an automated system.
  • Request Oversight Board review: In certain cases, you can escalate to Meta’s independent Oversight Board, which has the power to overturn takedowns and set policy precedents.
  • Document your case: Save screenshots of takedown notices, appeals, and your original post. This documentation is essential if you want to report the issue to advocacy groups or in future proceedings.
  • Share your story: Projects like Stop Censoring Abortion collect cases of unjust takedowns to build pressure for change. Speaking out, whether to EFF and other advocacy groups or to the media, helps illustrate how policies harm real people.

Abortion is health care. Sharing information about it is not dangerous; it is necessary. Meta should allow users to share vital information about reproductive care. The company must also ensure that users are provided with clear information about how its policies are being applied and how to appeal seemingly wrongful decisions.

This is the third post in our blog series documenting the findings from our Stop Censoring Abortion campaign. Read more in the series: https://www.eff.org/pages/stop-censoring-abortion   

Jillian C. York

Vacancy: Executive Director

5 days 2 hours ago

We’re recruiting a new Executive Director to lead us into our 35th year and beyond.

The successful candidate will build on the organisation’s rich history and legacy of exposing and opposing state secrecy, surveillance, repression and violence; and supporting and resourcing struggles for rights, liberties, transparency, and democracy.

They will be strategic, cooperative and adaptable, and have strong organisational, coordination and communication skills.

On 19 September we hosted an information session for interested applicants via Zoom. If you would like access to the recording, please email comms [at] statewatch.org.

Apply for the role via the CharityJob website.

Deadline for applications: 11:00 BST, 6 October 2025

Statewatch