[Announcement] Report from the front lines of the major military buildup on Miyakojima and Ishigakijima: Okinawa Jump Night report now on the website = Isao Yoshihara

3 months 2 weeks ago
Okinawa Jump Night (OJN) is a group of members within the Japan Congress of Journalists (JCJ) with an especially strong interest in the situation in Okinawa. With base reinforcement and military expansion in the Nansei Islands being widely promoted and turned into policy, OJN decided it had to see conditions on the ground for itself and carried out reporting trips to Miyakojima and Ishigakijima. Summaries appeared in the December 2022 and February through May 2023 issues of this paper. Since then, OJN has held numerous online meetings to discuss and examine the material gathered. Members who could not take part in the on-site reporting also joined these meetings, and many new..
JCJ

States Attack Young People’s Constitutional Right to Use Social Media: 2023 Year in Review

3 months 2 weeks ago

Legislatures in more than half of the states targeted young people’s use of social media this year, with many of the proposals also blocking adults’ ability to access the same sites. State representatives introduced dozens of bills that would limit young people’s use of some of the most popular sites and apps, either by requiring the companies to introduce or amend their features or data usage for young users, or by forcing those users to get permission from parents, and in some cases, share their passwords, before they can log on. Courts blocked several of these laws for violating the First Amendment—though some may go into effect later this year.

How did we get to a point where state lawmakers are willing to censor large parts of the internet? In many ways, California’s Age Appropriate Design Code Act (AADC), passed in September 2022, set the stage for this year’s battle. EFF asked Governor Newsom to veto that bill before it was signed into law: despite its good intentions in seeking to protect the privacy and well-being of children, it runs the risk, like many of the bills that followed it this year, of imposing surveillance requirements and content restrictions on a broader audience than intended. A federal court blocked the AADC earlier this year, and California has appealed that decision.

Fourteen months after California passed the AADC, it feels like a dam has broken: we’ve seen dangerous social media regulations for young people introduced across the country, and passed in several states, including Utah, Arkansas, and Texas. The severity and individual components of these regulations vary. Like California’s, many of these bills would introduce age verification requirements, forcing sites to identify all of their users, harming both minors’ and adults’ ability to access information online. We oppose age verification requirements, which are the wrong approach to protecting young people online. No one should have to hand over their driver’s license, or, worse, provide biometric information, just to access lawful speech on websites.

A Closer Look at State Social Media Laws Passed in 2023

Utah enacted the first child social media regulation this year, S.B. 152, in March. The law prohibits social media companies from providing accounts to a Utah minor, unless they have the express consent of a parent or guardian. We requested that Utah’s governor veto the bill.

We identified at least four reasons to oppose the law, many of which apply to other states’ social media regulations. First, young people have a First Amendment right to information that the law infringes upon. With S.B. 152 in effect, the majority of young Utahns will find themselves effectively locked out of much of the web absent their parents’ permission. Second, the law dangerously requires parental surveillance of young people’s accounts, harming their privacy and free speech. Third, the law endangers the privacy of all Utah users, as it requires many sites to collect and analyze private information, like government-issued identification, for every user, to verify ages. And fourth, the law interferes with the broader public’s First Amendment right to receive information by requiring that all users in Utah tie their accounts to their age, and ultimately, their identity, and will lead to fewer people expressing themselves, or seeking information online.

The law passed despite these problems, as did Utah’s H.B. 311, which creates liability for social media companies should they, in the view of Utah lawmakers, create services that are addictive to minors. H.B. 311 is unconstitutional because it imposes a vague and unscientific standard for what might constitute social media addiction, potentially creating liability for core features of a service, such as letting you know that someone responded to your post. Both S.B. 152 and H.B. 311 are scheduled to take effect in March 2024.

Arkansas passed a law similar to Utah's S.B. 152 in April, which requires users of social media to prove their age or obtain parental permission to create social media accounts. A federal court blocked the Arkansas law in September, ruling that the age-verification provisions violated the First Amendment because they burdened everyone's ability to access lawful speech online. EFF joined the ACLU in a friend-of-the-court brief arguing that the statute was unconstitutional.

Texas, in June, passed a regulation similar to the Arkansas law, which would ban anyone under 18 from having a social media account unless they receive consent from parents or guardians. The law is scheduled to take effect in September 2024.

Given the strong constitutional protections for people, including children, to access information without having to identify themselves, federal courts have blocked the laws in Arkansas and California. The Utah and Texas laws are likely to suffer the same fate. EFF has warned that such laws were bad policy and would not withstand court challenges, in large part because applying online regulations specifically to young people often forces sites to use age verification, which comes with a host of problems, legal and otherwise. 

To that end, we spent much of this year explaining to legislators that comprehensive data privacy legislation is the best way to hold tech companies accountable in our surveillance age, including for harms they do to children. For an even more detailed account of our suggestions, see Privacy First: A Better Way to Address Online Harms. In short, comprehensive data privacy legislation would address the massive collection and processing of personal data that is the root cause of many problems online, and it is far easier to write data privacy laws that are constitutional. Laws that lock online content behind age gates can almost never withstand First Amendment scrutiny because they frustrate all internet users’ rights to access information and often impinge on people’s right to anonymity.

Of course, states were not alone in their attempt to regulate social media for young people. Our Year in Review post on similar federal legislation that was introduced this year covers that fight, which was successful. Our post on the UK’s Online Safety Act describes the battle across the pond. 2024 is shaping up to be a year of court battles that may determine the future of young people’s access to speak out and obtain information online. We’ll be there, continuing to fight against misguided laws that do little to protect kids while doing much to invade everyone’s privacy and speech rights.

This blog is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.

Jason Kelley

Fighting European Threats to Encryption: 2023 Year in Review 

3 months 2 weeks ago

Private communication is a fundamental human right. In the online world, the best tool we have to defend this right is end-to-end encryption. Yet throughout 2023, politicians across Europe attempted to undermine encryption, seeking to access and scan our private messages and pictures. 

But we pushed back in the EU, and so far, we’ve succeeded. EFF spent this year fighting hard against an EU proposal (text) that, if it became law, would have been a disaster for online privacy in the EU and throughout the world. In the name of fighting online child abuse, the European Commission, the EU’s executive body, put forward a draft bill that would allow EU authorities to compel online services to scan user data and check it against law enforcement databases. The proposal would have pressured online services to abandon end-to-end encryption. The Commission even suggested using AI to rifle through people’s text messages, leading some opponents to call the proposal “chat control.”

EFF has been opposed to this proposal since it was unveiled last year. We joined together with EU allies and urged people to sign the “Don’t Scan Me” petition. We lobbied EU lawmakers and urged them to protect their constituents’ human right to have a private conversation—backed up by strong encryption. 

Our message broke through. In November, a key EU committee adopted a position that bars mass scanning of messages and protects end-to-end encryption. It also bars mandatory age verification, which would have amounted to a mandate to show ID before you get online; age verification can erode a free and anonymous internet for both kids and adults. 

We’ll continue to monitor the EU proposal as attention shifts to the Council of the EU, the second decision-making body of the EU. Despite several Member States still supporting widespread surveillance of citizens, there are promising signs that such a measure won’t get majority support in the Council. 

Make no mistake—the hard-fought compromise in the European Parliament is a big victory for EFF and our supporters. The governments of the world should understand clearly: mass scanning of people’s messages is wrong, and at odds with human rights.

A Wrong Turn in the U.K.

EFF also opposed the U.K.’s Online Safety Bill (OSB), which passed and became the Online Safety Act (OSA) this October, after more than four years on the British legislative agenda. The stated goal of the OSB was to make the U.K. the world’s “safest place” to use the internet, but the bill’s more than 260 pages actually outline a variety of ways to undermine our privacy and speech. 

The OSA requires platforms to take action to prevent individuals from encountering certain illegal content, which will likely mandate the use of intrusive scanning systems. Even worse, it empowers the British government, in certain situations, to demand that online platforms use government-approved software to scan for illegal content. The U.K. government has said that scanning will be limited to specific categories of content. In one of the final OSB debates, a representative of the government noted that orders to scan user files “can be issued only where technically feasible,” as determined by the U.K. communications regulator, Ofcom.

But as we’ve said many times, there is no middle ground on content scanning and no “safe backdoor” if the internet is to remain free and private. Either all content is scanned and all actors—including authoritarian governments and rogue criminals—have access, or no one does.

Despite our opposition, in which we worked closely with civil society groups in the UK, the bill passed in September with its anti-encryption measures intact. But the story doesn't end here. The OSA remains vague about what exactly it requires of platforms and users alike. Ofcom must now take the OSA and, over the coming year, draft regulations to operationalize the legislation.

The public understands better than ever that government efforts to “scan it all” will always undermine encryption, and prevent us from having a safe and secure internet. EFF will monitor Ofcom’s drafting of the regulation, and we will continue to hold the UK government accountable to the international and European human rights protections to which it is a signatory.

This blog is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.

Christoph Schmon

“Say What You See”: an AI tool that teaches you how to write image-generation prompts for AI

3 months 2 weeks ago
Although it launched in November, 9to5Google has covered Google Arts & Culture's “Say What You See.” Say What You See is a game-style learning tool in which you look at an AI-generated image and try to write a prompt that could generate a similar one. When you enter a prompt of up to 120 characters and generate an image, a match percentage is displayed, along with hints for raising it. Prompts currently support English only, and some of the displayed images can be hard to interpret. Level 1 requires a 50% match to pass, and higher levels require higher match rates. What do Slashdot readers think?


Related stories:
Microsoft quietly releases Copilot app for Android (December 29, 2023)
OpenAI publishes tips for writing ChatGPT prompts (December 19, 2023)
ChatGPT now shows a terms-of-service warning when asked to repeat a word forever (December 6, 2023)
Brazilian city council unknowingly passed an AI-generated ordinance (December 5, 2023)
Universal Music and other music publishers sue the developer of an AI model that outputs lyrics nearly identical to originals (October 25, 2023)
Google releases “Viola the Bird,” a music app that plays a one-stringed, cello-like instrument (July 16, 2023)
Google Arts & Culture's “Haiku Imagined” turns haiku into audiovisual works with AI (May 10, 2023)
Will prompt engineers drive out programmers? (May 9, 2023)
Google Arts & Culture releases two apps about classical music (December 16, 2022)
Google's “Manga Out of the Box” teaches the history of manga and its works (March 29, 2022)
“Pet Portraits,” a new feature in the Google Arts & Culture app (November 12, 2021)

headless

First, Let’s Talk About Consumer Privacy: 2023 Year in Review

3 months 2 weeks ago

Whatever online harms you want to alleviate on the internet today, you can do it better—with a broader impact—if you enact strong consumer data privacy legislation first. That is a grounding principle that has informed much of EFF’s consumer protection work in 2023.

While consumer privacy will not solve every problem, it is superior to many other proposals that attempt to address issues like child mental health or foreign government surveillance. That is true for two reasons: well-written consumer privacy laws address the root source of corporate surveillance, and they can withstand constitutional scrutiny.

EFF’s work on this issue includes: (1) advocating for strong comprehensive consumer data privacy laws; (2) fighting bad laws; (3) protecting existing sectoral privacy laws.

Advocating for Strong Comprehensive Consumer Data Privacy


This year, EFF released a report titled “Privacy First: A Better Way to Address Online Harms.” The report listed the key pillars of a strong privacy law (such as a ban on online behavioral advertising and data minimization) and how these principles can help address current issues (like protecting children’s mental health or reproductive health privacy).

We highlighted why data privacy legislation is a form of civil rights legislation and why adtech surveillance often feeds government surveillance.

And we made the case that well-written privacy laws can be constitutional when they regulate the commercial processing of personal data that is private and not a matter of public concern, and when the law is tailored to serve the government’s interests in privacy, free expression, security, and guarding against discrimination.

Fighting Bad Laws Based in Censorship of Internet Users


We filed amicus briefs in lawsuits challenging laws in Arkansas and Texas that required internet users to submit to age verification before accessing certain online content. These challenges continue to make their way through the courts, and they have so far been successful. We plan to do the same in a case challenging California’s Age Appropriate Design Code, while cautioning the court not to cast doubt on important privacy principles.

We filed a similar amicus brief in a lawsuit challenging Montana’s TikTok ban, where a federal court recently ruled that the law violated users’ First Amendment rights to speak and to access information online, and the company’s First Amendment rights to select and curate users’ content.

Protecting Existing Sectoral Laws


EFF is also gearing up to file an amicus brief supporting the constitutionality of the federal law called the Video Privacy Protection Act, which limits how video providers can sell or share their users’ private viewing data with third-party companies or the government. While we think a comprehensive privacy law is best, we support strong existing sectoral laws that protect data like video watch history, biometrics, and broadband use records.

This blog is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.

Mario Trujillo

Fighting For Your Digital Rights Across the Country: Year in Review 2023

3 months 2 weeks ago

EFF works every year to improve policy in ways that protect your digital rights in states across the country. Thanks to the messages of hundreds of EFF members, we've spoken up for digital rights this year from Sacramento to Augusta.

Much of EFF's state legislative work has, historically, been in our home state of California—also often the most active state on digital civil liberties issues. This year, the Golden State passed several laws that strengthen consumer digital rights.

Two major laws we supported stand out in 2023. The first is S.B. 244, authored by California Sen. Susan Eggman, which makes it easier for individuals and independent repair shops to access materials and parts needed for maintenance on electronics and appliances. That means that Californians with a broken phone screen or a busted washing machine will have many more options for getting them fixed. Although some electronics, such as video game consoles, are not included, the law still raises the bar for other right-to-repair bills.

S.B. 244 is one of the strongest right-to-repair laws in the country, doggedly championed by a group of advocates led by the California Public Interest Research Group, and we were proud to support it.

Another significant win comes with the signing of S.B. 362, also known as the CA Delete Act, authored by California Sen. Josh Becker. Privacy Rights Clearinghouse and Californians for Consumer Privacy led the fight on this bill, which builds on the state's landmark data privacy law and makes it easier for Californians to control their data through the state's data broker registry.

In addition to these wins, several other California bills we supported are now law. These include a measure that will broaden protections for immigration status data and one to facilitate better broadband access.

Health Privacy Is Data Privacy

States across the country continue to legislate at the intersection of digital privacy and reproductive rights. Both in California and beyond, EFF has worked with reproductive justice activists, medical practitioners, and other digital rights advocates to ensure that data from apps, electronic health records, law enforcement databases, and social media posts are not weaponized to prosecute those seeking or aiding those who seek reproductive or gender-affirming care. 

While some states are directly targeting those who seek this type of health care, other states are taking different approaches to strengthen protections. In California, EFF supported a bill that passed into law—A.B. 352, authored by CA Assemblymember Rebecca Bauer-Kahan—which extended the protections of California's health care data privacy law to apps such as period trackers. Washington, meanwhile, passed the "My Health, My Data Act"—H.B. 1155, authored by WA Rep. Vandana Slatter—that, among other protections, prohibits the collection of health data without consent. While EFF did not take a position on H.B. 1155, we do applaud the law's opt-in consent provisions and encourage other states to consider similar bills.

Consumer Privacy Bills Could Be Stronger

Since California passed the California Consumer Privacy Act in 2018, several states have passed their own versions of consumer privacy legislation. Unfortunately, many of these laws have been more consumer-hostile and business-friendly than EFF would like to see. In 2023, eight states—Delaware, Florida, Indiana, Iowa, Montana, Oregon, Tennessee, and Texas—passed broad consumer privacy bills of their own.

EFF did not support any of these laws, many of which can trace their lineage to a weak Virginia law we opposed in 2021. Yet not all of them are equally bad.

For example, while EFF could not support the Oregon bill after a legislative deal stripped it of its private right of action, the law is a strong starting point for privacy legislation moving forward. Though it has its flaws, it is unique among state privacy laws in requiring businesses to share the names of actual third parties, rather than simply the categories of companies that have your information. So, instead of knowing a "data broker" has your information and hitting a dead end in following your own data trail, you can know exactly where to file your next request. EFF participated in a years-long process to bring that bill together, and we thank the Oregon Attorney General's office for their work to keep it as strong as it is.

EFF also wants to give plaudits to Montana for another bill—a strong genetic privacy bill passed this year. The bill is a good starting point for other states, and shows Montana is thinking critically about how to protect people from overbroad data collection and surveillance.

Of course, one post can't capture all the work we did in states this year. In particular, the curious should read our Year in Review post specifically focused on children’s privacy, speech, and censorship bills introduced in states this year. But EFF was able to move the ball forward on several issues this year—and will continue to fight for your digital rights in statehouses from coast to coast.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.

Hayley Tsukayama

In the Trenches of Broadband Policy: 2023 Year In Review

3 months 2 weeks ago

EFF has long advocated for affordable, accessible, future-proof internet access for all. Nearly 80% of Americans already consider internet access to be as essential as water and electricity. As our work, health services, education, entertainment, and social lives increasingly have an online component, we cannot accept a future where the quality of your internet access—and so the quality of your connection to these crucial facets of your life—is determined by geographic, socioeconomic, or otherwise divided lines.

Lawmakers recognized this during the pandemic and set in motion once-in-a-generation opportunities to build the future-proof fiber infrastructure needed to close the digital divide once and for all.

As we exit the pandemic, however, that dedication is wavering. Monopolistic internet service providers (ISPs), with business models that created the digital divide in the first place, are doing everything they can to maintain control over the broadband market—including stopping the construction of any infrastructure they do not control. Further, while some government agencies are continuing to make rules to advance equitable and competitive access to broadband, others have not. Regardless, EFF will continue to fight for the vision we’ve long advocated.

New York City Abandons Revolutionary Fiber Plan 

This year, New York City Mayor Eric Adams turned his back on the future of broadband accessibility for New Yorkers.

In 2020, then-Mayor Bill de Blasio unveiled New York City’s Internet Master Plan to deliver broadband to low-income New Yorkers by investing in public fiber infrastructure. Public fiber infrastructure would have been an investment in New York City’s future, a long-term solution to permanently bridge the digital divide and bring affordable, accessible, future-proof service to New Yorkers for generations to come. This kind of public infrastructure, especially if provisioned on an open and affordable basis, dramatically lowers barriers to entry, which in turn creates competition, lower prices, and better customer service in the market as a whole.

Mayor Eric Adams not only abandoned this plan, but subsequently introduced a three-year, $90 million subsidy plan called Big Apple Connect. Instead of building physical infrastructure to bridge the digital divide for decades to come, New York City will now subsidize NYC’s oligopolist ISPs, Charter Spectrum and Altice, to continue doing business as usual. This does nothing to address the needs of underinvested communities whose legacy networks physically cannot handle a fast connection. All it does is put taxpayer dollars into corporate pockets instead of into infrastructure that actually serves the people.

The Adams administration even asked a cooperatively run, community-based ISP that had been part of the Internet Master Plan and had already installed fiber infrastructure to dismantle its network so the city could further contract with the big ISPs.

California Wavers On Its Commitments

New York City is not the only place public commitment to bridging the digital divide has wavered. 

In 2021, California invested nearly $7 billion to bring affordable fiber infrastructure to all Californians. As part of this process California’s Department of Technology was meant to build 10,000 miles of middle-mile fiber infrastructure, the physical foundation through which community-level last mile connections would be built to serve underserved communities for decades to come.

Unfortunately, in August the Department of Technology not only reduced the number of miles to be built but also cut off entire communities that had traditionally been underserved. Despite fierce community pushback, the Department of Technology stuck to their revised plans and awarded contracts accordingly.

Governor Newsom has promised to restore the lost miles in 2024—a promise EFF and California community groups intend to hold him to—but the fact remains that the reduction of miles should not have been carried out the way it was.

FCC Rules on Digital Discrimination and Rulemaking on Net Neutrality

On the federal level, the Federal Communications Commission finally received its fifth commissioner, Anna Gomez, in September of this year, allowing it to begin its rulemaking on net neutrality and promulgate rules on digital discrimination. We submitted comments on the net neutrality proceeding, advocating for a return to light-touch, targeted, and enforceable net neutrality protections for the whole country.

On digital discrimination, EFF applauds the Commission for adopting a disparate-treatment as well as a disparate-impact standard. Companies can now be found liable for digital discrimination not only when they intentionally treat communities differently, but when the impact of their decisions—regardless of intent—affects a community differently. Further, for the first time the Commission recognized the link between historic redlining in housing and digital discrimination, making the connection between the historic underinvestment in lower-income communities of color and the continued underinvestment by the monopolistic ISPs.

Next year will bring more fights around broadband implementation. The questions will be who gets funding, whether and where infrastructure gets built, and whether long-neglected communities will finally be heard and brought into the 21st century or left behind by public neglect or private greed. The path to affordable, accessible, future-proof internet for all will require the political will to invest in physical infrastructure and hold incumbents to nondiscrimination rules that preserve speech and competition online.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.

Chao Liu

[B] “God Is in the Rubble” [Western Sahara Latest News] Itsuko Hirata

3 months 2 weeks ago
“Where is God?” cried the women of Gaza toward the cameras, their black abayas (traditional garments) covered in dust and ash. “Where on earth is God?” pleaded an elderly man in Gaza. Behind them, Israeli military bombs explode and Israeli sniper bullets graze past their heads. The people of Gaza appeal to the cameras at the risk of their lives. They appeal to the people of the world. They appeal to God. In Gaza, roughly 200 people a day are being massacred by the Israeli military. More than 90% of Gaza's 2.3 million residents have been driven from their homes and become displaced.
Nikkan Berita

[JCJ Okinawa Statement] Protesting the Minister of Land, Infrastructure, Transport and Tourism's proxy execution of approval for design changes to the new Henoko base

3 months 2 weeks ago
Regarding construction of the new base at Henoko in Nago City, Okinawa Prefecture, Minister of Land, Infrastructure, Transport and Tourism Tetsuo Saito carried out a proxy execution of the approval for design changes. This is an unprecedented exercise of state power under the Japanese Constitution and an act that violates Okinawa's self-government. It absolutely cannot be accepted. Approval of the reclamation of public waters is the prefecture's authority. The Local Autonomy Act defines the relationship between the national government and local governments as one of "equality and cooperation." Proxy execution, in which the state takes away the prefecture's authority, is inherently contrary to the spirit of the law and should be a "final and exceptional measure" used only in extremely limited circumstances. The people of Okinawa, through a prefectural referendum directly asking whether the new base should be built, and through disputes over the new base construction..
JCJ

Pact for the Future: Joint civil society input

3 months 2 weeks ago

APC and others joined together to provide civil society input to the zero draft for the Pact for the Future, an action-oriented outcome document that will be negotiated and endorsed by UN member countries in the lead-up to and during the Summit of the Future in September 2024.

flavia

Microsoft quietly releases Copilot app for Android

3 months 2 weeks ago
Microsoft appears to have quietly released the Android version of its Copilot app, "Microsoft Copilot," on Google Play (Neowin, The Verge, and Ghacks articles). Microsoft has already built Copilot features into various Android apps and also offers the "Bing: Chat with AI & GPT-4" app on Google Play, but the Android Microsoft Copilot uses GPT-4 and DALL-E 3 to "provide fast, complex, and accurate answers." Its "main features" are "a versatile AI assistant to boost productivity" and an Image Creator that improves design efficiency; in addition to drafting emails, summarizing text, "writing stories and scripts," and "translating, proofreading, and optimizing multilingual content," it generates images from text prompts, "elevating your creativity to an engaging new level." Slashdot readers might want to give "the future experience with AI" a try.


Related stories:
Visual Studio 2022 preview adds AI feature that suggests identifier names (December 26, 2023)
Study finds Microsoft Copilot answers election-related questions correctly less than one-third of the time (December 17, 2023)
Copilot now available in Microsoft Edge developer tools (December 16, 2023)
Microsoft begins offering Copilot preview for Windows 10 as well (December 3, 2023)
Copilot's recommended hardware for Windows 12 may be hard to meet on existing laptops (November 7, 2023)
How to get Bing Chat to solve CAPTCHAs (October 7, 2023)
New features of Windows 11 version 22H2 roll out in the Release Preview channel (September 24, 2023)
Microsoft pledges to take legal responsibility, including paying damages, for commercial customers' issues arising from Copilot use (September 9, 2023)
Word and Excel meet "GPT-4": "Microsoft 365 Copilot" announced (March 20, 2023)
Microsoft announces AI-powered new features for Bing and Edge (February 9, 2023)
"GitHub Copilot violates open source licenses": OSS developers file class action (November 7, 2022)
Microsoft releases iOS version of Copilot app as well (January 5, 2024)
NYT sues Microsoft and OpenAI over unauthorized use of content for AI (December 30, 2023)
"Say What You See": an AI tool that teaches you how to write image-generation prompts for AI (December 30, 2023)

headless

[B] At this rate, even WHO's humanitarian aid will collapse: Director-General Tedros warns over the situation in Gaza

3 months 2 weeks ago
The Middle East Monitor reported on December 28 that Tedros Adhanom Ghebreyesus, Director-General of the World Health Organization (WHO), a United Nations agency, urged the international community to act urgently to address the grave threats facing the people of Gaza. His statement points out that the crisis threatens not only the lives of the people of Gaza but also the aid workers who continue relief efforts to remove that threat, and that humanitarian operations are in a critical state. (Kazuoki Ohno)
Nikkan Berita

[Focus] Of the ¥5.1 billion in development cooperation fees due to Chuo Ward from the Harumi athletes' village condominium construction, ¥3.7 billion was cut: the administration's puzzling favoritism becomes an issue = Masahiro Hashizume

3 months 2 weeks ago
The condominium complex (HARUMI FLAG) under construction on the former Harumi Olympic athletes' village site—land over which Tokyo residents have sued, arguing it was inexcusable for the Tokyo Metropolitan Government to sell it to developers at a roughly 90% discount—has been completed. This new town, with 5,632 homes for sale and rent combined, will house about 12,000 people. Large elementary and junior high schools, a branch government office, and a major mixed-use facility housing a certified childcare center, library, health center, and senior center will be built. Residents begin moving in next January, but Chuo Ward's favorable treatment of the developers has once again become an issue. Chuo Ward..
JCJ

Protecting Students from Faulty Software and Legislation: 2023 Year in Review

3 months 2 weeks ago

Lawmakers, school districts, educational technology companies, and others keep rolling out legislation and software that threatens students’ privacy, free speech, and access to social media, in the name of “protecting” children. At EFF, we fought back against this overreach and demanded accountability and transparency.

Bad bills and invasive monitoring systems, though sometimes well-meaning, hurt students rather than protect them from the perceived dangers of the internet and social media. We saw many efforts to bar young people, including students, from digital spaces, censor what they are allowed to see and share online, and monitor and control when and how they can do it. This makes it increasingly difficult for them to access information about everything from gun violence and drug abuse to politics and LGBTQ+ topics, all because some software or elected official considers these topics “harmful.”

In response, we doubled down on exposing faulty surveillance software, long a problem in many schools across the country. We launched a new project called the Red Flag Machine, an interactive quiz and report demonstrating the absurd inefficiency—and potential dangers—of student surveillance software that schools across the country use and that routinely invades the privacy of millions of children.

We’ll continue to fight student surveillance and censorship, and we are heartened to see students fighting back.

The project grew out of our investigation of GoGuardian, computer monitoring software used in about 11,500 schools to surveil about 27 million students—mostly in middle and high school—according to the company. The software allows school officials and teachers to monitor students’ computers and devices, talk to them via chat or webcam, block sites considered “offensive,” and get alerts when students access content that the software, or the school, deems harmful or explicit.

Our investigation showed that the software inaccurately flags massive amounts of useful material. It flagged sites about Black authors and artists, the Holocaust, and the LGBTQ+ rights movement, as well as the official Marine Corps fitness guide and the bios of the cast of Shark Tank. Bible.com was flagged because the text of Genesis 3 contained the word “naked.” We found thousands more examples of mis-flagged sites.
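To see why this kind of flagging fails so often, consider a minimal sketch of naive keyword matching over page text. The word list and sample passage below are invented for illustration; they are not GoGuardian’s actual list or algorithm, which we did not have access to.

```python
# A hypothetical keyword-based content flagger. Matching bare words
# with no sense of context is what produces false positives like a
# Bible passage being flagged for the word "naked".

FLAG_WORDS = {"naked", "drugs", "violence"}  # invented example list

def flag_page(text: str) -> set[str]:
    """Return which flag words appear anywhere in the page text."""
    words = {w.strip(".,;:!?'\"").lower() for w in text.split()}
    return FLAG_WORDS & words

genesis_3 = ("And the eyes of them both were opened, "
             "and they knew that they were naked")
print(flag_page(genesis_3))  # prints {'naked'} -- scripture gets flagged
```

The sketch has no notion of topic, intent, or audience: any page containing a listed word is treated the same, whether it is explicit content, a health resource, or Genesis 3.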

EFF built the Red Flag Machine to expose the ludicrous results of GoGuardian’s flagging algorithm. In addition to reading our research about the software, you can take a quiz that presents websites flagged by the software, and guess which of five possible words triggered the flag. The results would be funny if they were not so potentially harmful.

Congress Takes Aim At Students and Young People

Meanwhile, Congress this year resurrected the Kids Online Safety Act (KOSA), a bill that would increase surveillance and restrict access to information in the name of protecting children online—including students. KOSA would give state attorneys general the power to decide what content on many popular online platforms is dangerous for young people, and would enable censorship and surveillance. Sites would likely be required to block important educational content, often made by young people themselves, about how to deal with anxiety, depression, eating disorders, substance use disorders, physical violence, online bullying and harassment, sexual exploitation and abuse, and suicidal thoughts. We urged Congress to reject this bill and encouraged people to tell their senators and representatives that KOSA would censor the internet without helping kids.

We also called out the brazen Eyes on the Board Act, which aims to end social media use entirely in schools. This heavy-handed bill would cut some federal funding to any school that doesn’t block all social media platforms. We can understand the desire to ensure students are focusing on schoolwork when in class, but this bill tells teachers and school officials how to do their jobs, and imposes unnecessary censorship.

Many schools already don’t allow device use in the classroom and block social media sites and other content on school-issued devices. Too much social media is not a problem that teachers and administrators need the government to correct—they already have the tools and know-how to address it.

Unfortunately, we’ve seen a slew of state bills that also seek to control what students and young people can access online. There are bills in Texas, Utah, Arkansas, Florida, and Montana, to name just a few, and keeping up with all this bad legislation is like a game of whack-a-mole.

Finally, teachers and school administrators are grappling with whether generative AI use should be allowed, and whether they should deploy detection tools to find students who have used it. We think the answer to both is no. AI detection tools are highly inaccurate and carry significant risks of falsely flagging students for plagiarism. Meanwhile, AI use is growing rapidly and will likely have a significant impact on students’ lives and futures. Students should be learning about and exploring generative AI now to understand its benefits and flaws. Demonizing it only deprives them of knowledge about a technology that may change the world around us.

We’ll continue to fight student surveillance and censorship, and we are heartened to see students fighting back against efforts that supposedly protect children but actually give the government control over who gets to see what content. It has never been more important for young people to defend our democracy, and we’re excited to join them.

If you’re interested in learning more about protecting your privacy at school, take a look at our Surveillance Self-Defense guide on privacy for students.

This blog is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.

Karen Gullo