[B] Study session opposing base construction: "We will not permit the landfill of Oura Bay through an unjust 'proxy execution'"

1 month 1 week ago
In the lawsuit the national government filed to carry out a "proxy execution" (daishikkō), approving design changes on behalf of Okinawa Prefecture for the construction work to relocate the U.S. military's Futenma Air Station from Ginowan City to Henoko in Nago City, the Supreme Court on February 29 decided to dismiss the appeal by Okinawa Prefecture, which had refused to approve the changes. (Toshiya Oguri)
Nikkan Berita

EFF’s Submission to Ofcom’s Consultation on Illegal Harms

1 month 1 week ago

More than four years after it was first introduced, the Online Safety Act (OSA) was passed by the U.K. Parliament in September 2023. The Act seeks to make the U.K. “the safest place” in the world to be online and provides Ofcom, the country’s communications regulator, with the power to enforce this.

EFF has opposed the Online Safety Act since it was first introduced. It will lead to a more censored, locked-down internet for British users. The Act empowers the U.K. government to undermine not just the privacy and security of U.K. residents, but internet users worldwide. We joined civil society organizations, security experts, and tech companies to unequivocally ask for the removal of clauses that require online platforms to use government-approved software to scan for illegal content. 

Under the Online Safety Act, websites and apps that host content deemed "harmful" to minors will face heavy penalties; the problem, of course, is that views on what counts as "harmful" vary, in the U.K. as in every other society. Soon, U.K. government censors will make that decision. 

The Act also requires mandatory age verification, which undermines the free expression of both adults and minors. 

Ofcom recently published the first of four major consultations seeking information on how internet and search services should approach their new duties on illegal content. While we continue to oppose the Act itself, we are engaging with Ofcom to limit the damage to our most fundamental rights online. 

EFF recently submitted information to the consultation, reaffirming our call on policymakers in the U.K. to protect speech and privacy online. 

Encryption 

For years, we opposed a clause contained in the then Online Safety Bill allowing Ofcom to serve a notice requiring tech companies to scan all of their users for child abuse content. We are pleased to see that Ofcom's recent statements note that the Online Safety Act will not apply to end-to-end encrypted messages. Encryption backdoors of any kind are incompatible with privacy and human rights. 

However, there are places in Ofcom's documentation where this commitment can and should be clearer. In our submission, we affirmed the importance of ensuring that people can use and benefit from encryption regardless of the size and type of the online service. The commitment not to scan encrypted data must be firm, whatever the size of the service or the encrypted services it provides. For instance, Ofcom has suggested that "file-storage and file-sharing" may be subject to a different risk profile when mandating scanning. But encrypted "communications" are not significantly different from encrypted "file-storage and file-sharing."

In this context, Ofcom should also take note of the recent milestone judgment in PODCHASOV v. RUSSIA (Application no. 33696/19), in which the European Court of Human Rights (ECtHR) ruled that weakening encryption can lead to general and indiscriminate surveillance of the communications of all users, and violates the human right to privacy. 

Content Moderation

An earlier version of the Online Safety Bill enabled the U.K. government to directly silence user speech and imprison those who publish messages it doesn't like. It also empowered Ofcom to levy heavy fines or even block access to sites that offend people. We were happy to see this clause removed from the bill in 2022, but many problems with the OSA remain. Our submission on illegal harms affirmed the importance of ensuring that users: have greater control over what content they see and interact with, are equipped with knowledge about how various controls operate and how to use them to their advantage, and have the right to anonymity and pseudonymity online.

Moderation mechanisms must not interfere with users' freedom of expression rights, and moderators should receive ample training and materials to ensure cultural and linguistic competence in content moderation. When moderators are placed under time pressure to make determinations, companies often remove more than necessary to avoid potential liability, and are incentivized to use automated technologies for content removal and upload filters. These are notoriously inaccurate and prone to overblocking legitimate material. Moreover, the moderation of terrorism-related content is prone to error, and any new mechanism such as hash matching or URL detection must be subject to expert oversight. 

Next Steps

Throughout this consultation period, EFF will continue contributing to and monitoring Ofcom's drafting of the regulation. And we will continue to hold the U.K. government accountable to the international and European human rights protections to which it is a signatory.

Read EFF's full submission to Ofcom

Paige Collings

Eswatini strives for digital sovereignty amid technological advancements

1 month 1 week ago

In Eswatini, two key legislative measures impact internet governance: the Computer Crime and Cyber Crime Act of 2022 and the Data Protection Act of 2022. Concerns loom over potential implications of these statutes, particularly in terms of press freedom and freedom of expression on social media.

lori

[Publishing Topics] Bookstores expanding overseas, and the outlook for bookstore promotion measures = Publishing Section

1 month 1 week ago
◆ Akira Toriyama, creator of the manga Dragon Ball, died of an acute subdural hematoma on March 1 at the age of 68. Our heartfelt condolences. Toriyama debuted in 1978 with Wonder Island, and went on to produce hit works such as Dr. Slump and Dragon Ball, which were adapted into anime and made him a world-famous manga artist whose work remains beloved today. ◆ Publication sales for January 2024 totaled 73.1 billion yen (down 5.8% year on year): books 45.7 billion yen (down 3.5%), magazines 27.3 billion yen (down 9.5%). Monthly magazines 21.9 billion yen (down 10.0%), weekly..
JCJ