Establishment of a new "Accommodation Tax" in Niseko Town, Hokkaido
Renewal of Aomori Prefecture's "Nuclear Fuel Material Handling Tax"
Renewal of Ibaraki Prefecture's "Nuclear Fuel Handling Tax"
Renewal of Dazaifu City, Fukuoka Prefecture's "History and Culture Environment Tax"
Announcement of the 36th Policy Evaluation Council (held jointly with the 38th Policy Evaluation System Subcommittee) (March 15, 2024)
Summary of Minister Matsumoto's press conference following the Cabinet meeting
Damage situation and related information concerning the 2024 Noto Peninsula Earthquake (Report No. 87)
Roundtable on Information and Communications Infrastructure and Its Utilization for Realizing Vibrant Local Communities (3rd meeting)
Information and Communications Council, Information and Communications Technology Subcommittee (178th meeting): distributed materials, summary of proceedings, and minutes
19th Statistical Standards Subcommittee
Information and Communications Council, Information and Communications Technology Subcommittee, ITU Subcommittee, Telecommunications Systems Committee (19th meeting): meeting announcement
Economic Security Working Group (2nd meeting): distributed materials and summary of proceedings
Selection of fire departments to conduct the FY2024 pilot project on expediting and streamlining emergency services using the My Number Card
Launch of the "Study Group on Fire and Disaster Prevention Measures in Light of the Large-Scale Fire in Wajima City"
Environmental Dispute Coordination Commission Secretariat, General Affairs Division: recruitment information for part-time staff
[B] Study session opposing base construction: "We will not allow the reclamation of Oura Bay through an unjust 'proxy execution'"
EFF’s Submission to Ofcom’s Consultation on Illegal Harms
More than four years after it was first introduced, the Online Safety Act (OSA) was passed by the U.K. Parliament in September 2023. The Act seeks to make the U.K. “the safest place” in the world to be online and provides Ofcom, the country’s communications regulator, with the power to enforce this.
EFF has opposed the Online Safety Act since it was first introduced. It will lead to a more censored, locked-down internet for British users. The Act empowers the U.K. government to undermine not just the privacy and security of U.K. residents, but internet users worldwide. We joined civil society organizations, security experts, and tech companies to unequivocally ask for the removal of clauses that require online platforms to use government-approved software to scan for illegal content.
Under the Online Safety Act, websites and apps that host content deemed "harmful" to minors will face heavy penalties. The problem, of course, is that views vary on what type of content is "harmful," in the U.K. as in every other society. Soon, U.K. government censors will make that decision.
The Act also requires mandatory age verification, which undermines the free expression of both adults and minors.
Ofcom recently published the first of four major consultations seeking information on how internet and search services should approach their new duties on illegal content. While we continue to oppose the concept of the Act, we are continuing to engage with Ofcom to limit the damage to our most fundamental rights online.
EFF recently submitted information to the consultation, reaffirming our call on policymakers in the U.K. to protect speech and privacy online.
For years, we opposed a clause contained in the then Online Safety Bill allowing Ofcom to serve a notice requiring tech companies to scan their users, all of them, for child abuse content. We are pleased to see that Ofcom's recent statements note that the Online Safety Act will not apply to end-to-end encrypted messages. Encryption backdoors of any kind are incompatible with privacy and human rights.
However, there are places in Ofcom's documentation where this commitment can and should be clearer. In our submission, we affirmed the importance of protecting people's right to use and benefit from encryption, regardless of the size and type of the online service. The commitment not to scan encrypted data must be firm, whatever the size of the service or the encrypted services it provides. For instance, Ofcom has suggested that "file-storage and file-sharing" may be subject to a different risk profile for mandating scanning. But encrypted "communications" are not significantly different from encrypted "file-storage and file-sharing."
In this context, Ofcom should also take note of the milestone judgment in PODCHASOV v. RUSSIA (Application no. 33696/19), in which the European Court of Human Rights (ECtHR) ruled that weakening encryption can lead to general and indiscriminate surveillance of the communications of all users, and violates the human right to privacy.
An earlier version of the Online Safety Bill enabled the U.K. government to directly silence user speech and imprison those who publish messages it doesn't like. It also empowered Ofcom to levy heavy fines on, or even block access to, sites that offend people. We were happy to see this clause removed from the bill in 2022, but many problems with the OSA remain. Our submission on illegal harms affirmed the importance of ensuring that users have greater control over what content they see and interact with, are equipped with knowledge about how various controls operate and how to use them to their advantage, and retain the right to anonymity and pseudonymity online.
Moderation mechanisms must not interfere with users' freedom of expression, and moderators should receive ample training and materials to ensure cultural and linguistic competence in content moderation. When time-related pressure is placed on moderators to make determinations, companies often remove more content than necessary to avoid potential liability, and are incentivized to rely on automated removal technologies and upload filters, which are notoriously inaccurate and prone to overblocking legitimate material. Moreover, moderation of terrorism-related content is especially error-prone, and any new mechanism such as hash matching or URL detection must be subject to expert oversight.
Throughout this consultation period, EFF will continue contributing to and monitoring Ofcom's drafting of the regulation. And we will continue to hold the U.K. government accountable to the international and European human rights protections to which it is a signatory.
Read EFF's full submission to Ofcom.
Eswatini strives for digital sovereignty amid technological advancements
In Eswatini, two key legislative measures impact internet governance: the Computer Crime and Cyber Crime Act of 2022 and the Data Protection Act of 2022. Concerns loom over potential implications of these statutes, particularly in terms of press freedom and freedom of expression on social media.