[B] Invisible Persecution: Kurds Living in Japan

1 day 4 hours ago
To mark World Refugee Day on June 20, Kurdish support groups including the "Association to Support Kurdish Refugee Mr. M" held a lecture, "The Persecution of Kurds in Turkey," in Jimbocho, Tokyo, on June 14. From June 16 to 22, a photo exhibition focusing on Kurdish refugees, "Japan and Turkey: Kurdish Refugees Today," followed at the Hitotsubashi Gallery of the Japan Education Hall in Jimbocho. (Kai Fujigaya)
Nikkan Berita

[Recommended Book] Hiroko Aihara, "Fukushima, Noto, and What Comes Next: Living After the Disasters, the Stories of 13 People" ― Precious, Painful Lessons from the People of the Disaster Areas = Mitsutaka Sakamoto (Journalist)

1 day 6 hours ago
Natural disasters destroy hometowns, flatten livelihoods, and crush the spirit. How, at such times, do people pick themselves up and begin to walk again? The Great East Japan Earthquake and the Noto Peninsula earthquake: against the backdrop of these two disaster zones, the Fukushima-based author carefully traces the stories of 13 people devoting themselves to recovery. A tanka by Yukiko Mihara, a poet from Namie, a town near the Fukushima Daiichi nuclear plant, captures it: "Touch on the subject of the nuclear plant, and it is easy to know a person's true self." Fear of radioactive contamination tore people's hearts apart: mutual suspicion, division, and finally silence. But through the efforts of Mihara and others to reclaim their hometown, the mood...
JCJ

[Opinion] 80 Years After the War: Constitutional Revisionists Grow Impatient, at Odds with Public Opinion; Writing the SDF into the Constitution and an Emergency Powers Clause, Pushed by Prime Minister Ishiba, with Ishin and the DPP as Well = Editorial Department

2 days 5 hours ago
On May 3, Constitution Memorial Day, marking 80 years since the end of the war and 78 years since the Constitution of Japan took effect, Prime Minister Ishiba sent a video message to the revisionist rally, the "Open Constitution Forum," held in opposition to the "Grand Constitution Rally" that drew 38,000 people to Ariake Disaster Prevention Park in Tokyo. Citing Japan's increasingly severe security environment, he said he would "give top priority to emergency-response provisions and to writing the Self-Defense Forces into the Constitution," and that "the party will work so that debate in the constitutional commissions of both houses advances further and the Diet initiates an amendment at an early date," highlighting that this year's LDP action plan includes the drafting of amendment language. In addition to the LDP and Komeito, Ishin...
JCJ

[B] On the Climate Change Problem (Revisited), by Eiichiro Ochiai

2 days 18 hours ago
I have posted here several times on the claim that climate change is driven by human-emitted CO2, and lately reporting on the climate-crisis narrative seems to have grown more intense. Japan is now in a stretch of days around 35°C, prompting calls that this is global warming and that decarbonization must be accelerated further. Let us take a look at this situation.
Nikkan Berita

Flock Safety’s Feature Updates Cannot Make Automated License Plate Readers Safe

2 days 21 hours ago

Two recent statements from the surveillance company—one addressing Illinois privacy violations and another defending the company's national surveillance network—reveal a troubling pattern: when confronted by evidence of widespread abuse, Flock Safety has blamed users, downplayed harms, and doubled down on the very systems that enabled the violations in the first place.

Flock's aggressive public relations campaign to salvage its reputation comes as no surprise. Last month, we described how investigative reporting from 404 Media revealed that a sheriff's office in Texas searched data from more than 83,000 automated license plate reader (ALPR) cameras to track down a woman suspected of self-managing an abortion. (It's worth noting that this scenario might have been avoided had Flock taken action when it was first warned about this threat three years ago.)

Flock calls the reporting on the Texas sheriff's office "purposefully misleading," claiming the woman was searched for as a missing person at her family's request rather than for her abortion. But that ignores the core issue: this officer used a nationwide surveillance dragnet (again: over 83,000 cameras) to track someone down, and used her suspected healthcare decisions as a reason to do so. Framing this as concern for her safety plays directly into anti-abortion narratives that depict abortion as dangerous and traumatic in order to justify increased policing, criminalization, control—and, ultimately, surveillance.

As if that weren't enough, the company has also come under fire for how its ALPR network data is being actively used to assist in mass deportation. Despite U.S. Immigration and Customs Enforcement (ICE) having no formal agreement with Flock Safety, public records revealed "more than 4,000 nation and statewide lookups by local and state police done either at the behest of the federal government or as an 'informal' favor to federal law enforcement, or with a potential immigration focus." The network audit data analyzed by 404 Media exposed an informal data-sharing environment that creates an end-run around oversight and accountability measures: federal agencies can access the surveillance network through local partnerships without the transparency and legal constraints that would apply to direct federal contracts.

Flock Safety is adamant this is "not Flock's decision," and by implication, not their fault. Instead, the responsibility lies with each individual local law enforcement agency. In the same breath, they insist that data sharing is essential, loudly claiming credit when the technology is involved in cross-jurisdictional investigations—but failing to show the same attitude when that data-sharing ecosystem is used to terrorize abortion seekers or immigrants. 

Flock Safety: The Surveillance Social Network

In growing from a 2017 startup to a $7.5 billion company "serving over 5,000 communities," Flock gave individual agencies wide latitude to set and regulate their own policies. In effect, this approach offered cheap surveillance technology with minimal restrictions, leaving major decisions and actions in the hands of law enforcement while the company scaled rapidly.

And they have no intention of slowing down. Just this week, Flock launched its Business Network, facilitating unregulated data sharing among its private-sector security clients. "For years, our law enforcement customers have used the power of a shared network to identify threats, connect cases, and reduce crime. Now, we're extending that same network effect to the private sector," Flock Safety's CEO announced.

Flock Safety wooing law enforcement officers at the 2023 International Chiefs of Police Conference.

The company is building out a new mass surveillance network using the exact template that ended with the company having to retrain thousands of officers in Illinois on how not to break state law—the same template that made it easy for officers to do so in the first place. Flock's continued integration of disparate surveillance networks across the public and private spheres—despite the harms that have already occurred—is due in part to the one thing it has gotten really good at over the past couple of years: facilitating a surveillance social network.

Employing marketing phrases like "collaboration" and "force multiplier," Flock encourages as much sharing as possible, going as far as to claim that network effects can significantly improve case closure rates. They cultivate a sense of shared community and purpose among users so that those users opt into good-faith sharing relationships with other law enforcement agencies across the country. But it's precisely that social layer that creates uncontrollable risk.

The possibility of human workarounds at every level undermines any technical safeguards Flock may claim. Search term blocking relies on officers accurately labeling search intent—a system easily defeated by entering vague reasons like "investigation" or incorrect justifications, whether intentionally or not. And, of course, words like "investigation" or "missing person" can mean virtually anything, offering no meaningful basis for oversight of how and for what the system is being used. Moving forward, sheriff's offices looking to avoid negative press can surveil abortion seekers or immigrants with ease, so long as they enter vague, innocuous-sounding reasons.

The same can be said for case number requirements, which depend on manual entry and can easily be circumvented by reusing legitimate case numbers for unauthorized searches. Audit logs only track inputs, not contextual legitimacy. And Flock's proposed AI-driven audit alerts, which at best might flag suspicious activity after searches (and harm) have already occurred, rely on local agencies to self-monitor misuse—despite their demonstrated inability to do so.

And, of course, even the most restrictive department policy may not be enough. Austin, Texas, had implemented one of the most restrictive ALPR programs in the country, and the program still failed: the city's own audit revealed systematic compliance failures that rendered its guardrails meaningless. The company's continued appeal to "local policies" means nothing when Flock's data-sharing network does not account for how law enforcement policies, regulations, and accountability vary by jurisdiction. You may have a good relationship with your local police, who solicit your input on what their policy looks like; you don't have that same relationship with hundreds or thousands of other agencies with whom they share their data. So if an officer on the other side of the country violates your privacy, it’d be difficult to hold them accountable. 

ALPR surveillance systems are inherently vulnerable to both technical exploitation and human manipulation. These vulnerabilities are not theoretical—they represent real pathways for bad actors to access vast databases containing millions of Americans' location data. When surveillance databases are breached, the consequences extend far beyond typical data theft—this information can be used to harass, stalk, or even extort. The intimate details of people's daily routines, their associations, and their political activities may become available to anyone with malicious intent. Flock operates as a single point of failure that can compromise—and has compromised—the privacy of millions of Americans simultaneously.

Don't Stop de-Flocking

Rather than addressing legitimate concerns about privacy, security, and constitutional rights, Flock has only promised updates that fall short of meaningful reforms. These software tweaks and feature rollouts cannot assuage the fear engendered by the massive surveillance system it has built and continues to expand.

A typical specimen of Flock Safety's automated license plate readers.

Flock's insistence that what's happening with abortion criminalization and immigration enforcement has nothing to do with them—that these are just red-state problems or the fault of rogue officers—is concerning. Flock designed the network that is being used, and the public should hold them accountable for failing to build in protections from abuse that cannot be easily circumvented.

Thankfully, that's exactly what's happening: cities like Austin, San Marcos, Denver, Norfolk, and San Diego are pushing back. And it's not nearly as hard a choice as Flock would have you believe: Austinites are weighing the benefits of a surveillance system that generates a hit less than 0.02% of the time against the possibility that scanning 75 million license plates will result in an abortion seeker being tracked down by police, or an immigrant being flagged by ICE in a so-called "sanctuary city." These are not hypothetical risks; they are already happening.

Given how pervasive, sprawling, and ungovernable ALPR sharing networks have become, the only feature update we can truly rely on to protect people's rights and safety is no network at all. And we applaud the communities taking decisive action to dismantle this surveillance infrastructure.

Follow their lead: don't stop de-flocking.

Sarah Hamid

Today's Supreme Court Decision on Age Verification Tramples Free Speech and Undermines Privacy

3 days 1 hour ago

Today’s decision in Free Speech Coalition v. Paxton is a direct blow to the free speech rights of adults. The Court ruled that “no person—adult or child—has a First Amendment right to access speech that is obscene to minors without first submitting proof of age.” This ruling allows states to enact onerous age-verification rules that will block adults from accessing lawful speech, curtail their ability to be anonymous, and jeopardize their data security and privacy. These are real and immense burdens on adults, and the Court was wrong to ignore them in upholding Texas’ law.  

Importantly, the Court's reasoning applies only to age-verification rules for certain sexual material, and not to age limits in general. We will continue to fight against age restrictions on online access more broadly, such as on social media and specific online features.  

Still, the decision has immense consequences for internet users in Texas and in other states that have enacted similar laws. The Texas law forces adults to submit personal information over the internet to access entire websites that hold some amount of sexual material, not just the pages or portions of those sites that contain the sexual material itself. Many sites that cannot reasonably implement age-verification measures, for reasons such as cost or technical requirements, will likely block users living in Texas and other states with similar laws altogether.

Many users will not be comfortable sharing private information to access sites that do implement age verification, for reasons of privacy or concern for data breaches. Many others do not have a driver’s license or photo ID to complete the age verification process. This decision will, ultimately, deter adult users from speaking and accessing lawful content, and will endanger the privacy of those who choose to go forward with verification. 

What the Court Said Today 

In the 6-3 decision, the Court ruled that Texas’ HB 1181 is constitutional. This law requires websites that Texas decides are composed of “one-third” or more of “sexual material harmful to minors” to confirm the age of users by collecting age-verifying personal information from all visitors—even to access the other two-thirds of material that is not adult content.   

In 1997, the Supreme Court struck down a federal online age-verification law in Reno v. American Civil Liberties Union. In that case the court ruled that many elements of the Communications Decency Act violated the First Amendment, including part of the law making it a crime for anyone to engage in online speech that is "indecent" or "patently offensive" if the speech could be viewed by a minor. Like HB 1181, that law would have resulted in many users being unable to view constitutionally protected speech, as many websites would have had to implement age verification, while others would have been forced to shut down.  

In Reno and in subsequent cases, the Supreme Court ruled that laws that burden adults’ access to lawful speech are subject to the highest level of review under the First Amendment, known as strict scrutiny. This level of scrutiny requires a law to be very narrowly tailored, meaning it must use the least speech-restrictive means available to the government.

That all changed with the Supreme Court’s decision today.  

The Court now says that laws burdening adults’ access to sexual materials that are obscene to minors are subject to a less-searching level of First Amendment review, known as intermediate scrutiny. Under that lower standard, the Texas law does not violate the First Amendment, and the Court did not need to address arguments that there are less speech-restrictive ways of reaching the same goal—for example, encouraging parents to install content-filtering software on their children’s devices.

The court reached this decision by incorrectly assuming that online age verification is functionally equivalent to flashing an ID at a brick-and-mortar store. As we explained in our amicus brief, this ignores the many ways in which verifying age online is significantly more burdensome and invasive than doing so in person. As we and many others have previously explained, unlike with in-person age-checks, the only viable way for a website to comply with an age verification requirement is to require all users to upload and submit—not just momentarily display—a data-rich government-issued ID or other document with personal identifying information.  

This leads to a host of serious anonymity, privacy, and security concerns—all of which the majority failed to address. A person who submits identifying information online can never be sure if websites will keep that information or how that information might be used or disclosed. This leaves users highly vulnerable to data breaches and other security harms. Age verification also undermines anonymous internet browsing, even though courts have consistently ruled that anonymity is an aspect of the freedom of speech protected by the First Amendment.    

The Court sidestepped its previous online age verification decisions by claiming the internet has changed too much to follow the precedent from Reno that requires these laws to survive strict scrutiny. Writing in dissent, Justice Kagan rejected that premise, pointing to “the majority’s claim—again mistaken—that the internet has changed too much to follow our precedents’ lead.”

But the majority argues that past precedent does not account for the dramatic expansion of the internet since the 1990s, which has led to easier and greater internet access and larger amounts of content available to teens online. The majority’s opinion entirely fails to address the obvious corollary: the internet’s expansion also has benefited adults. Age verification requirements now affect exponentially more adults than they did in the 1990s and burden vastly more constitutionally protected online speech. The majority's argument actually demonstrates that the burdens on adult speech have grown dramatically larger because of technological changes, yet the Court bizarrely interprets this expansion as justification for weaker constitutional protection. 

What It Means Going Forward 

This Supreme Court broke a fundamental agreement between internet users and the state that has existed since the internet’s inception: the government will not stand in the way of people accessing First Amendment-protected material. There is no question that multiple states will now introduce laws similar to Texas’s. Two dozen already have, though not all are in effect. At least three of those states set no threshold on the percentage of covered material before the law applies—a sweeping restriction that reaches every site containing any material the state believes the law covers. These laws will force U.S.-based adult websites to implement age verification or block users in those states, as many did in the past when similar laws were in effect.

Research has found that, rather than submit to verification, people will choose a variety of other paths: using VPNs to make it appear that they are outside the state, or turning to similar sites that don’t comply with the law, often because those sites operate in a different country. While many users will simply forgo the content as a result, others may accept the risk, at their peril.

We expect some states to push the envelope in terms of what content they consider “harmful to minors,” and to expand the types of websites covered by these laws, either through updated language or threats of litigation. Even if these attacks are struck down, operators of sites that involve sexual content of any type may be under threat, especially if that content is politically divisive. We worry that the point of some of these laws will be to deter queer folks and others from accessing lawful speech and finding community online by requiring them to identify themselves. We will continue to fight to protect against the disclosure of this critical information and for people’s ability to maintain their anonymity.

EFF Will Continue to Fight for All Users’ Free Expression and Privacy 

That said, the ruling does not give states or Congress the green light to impose age-verification regulations on the broader internet. The majority’s decision rests on the fact that minors do not have a First Amendment right to access sexual material that would be obscene to them. In short, adults have a First Amendment right to access those sexual materials, while minors do not. Although we believe it was wrong to do so, the majority ruled that because Texas is blocking minors from speech they have no constitutional right to access, the age-verification requirement only incidentally burdens adults’ First Amendment rights.

But the same rationale does not apply to general-audience sites and services, including social media. Minors and adults have coextensive rights to both speak and access the speech of other users on these sites because the vast majority of the speech is not sexual materials that would be obscene to minors. Lawmakers should be careful not to interpret this ruling to mean that broader restrictions on minors’ First Amendment rights, like those included in the Kids Online Safety Act, would be deemed constitutional.  

Free Speech Coalition v. Paxton will have an effect on nearly every U.S. adult internet user for the foreseeable future. It marks a worrying shift in the ways that governments can restrict access to speech online. But that only means we must work harder than ever to protect privacy, security, and free speech as central tenets of the internet.  

Aaron Mackey

[B] "War and Peace" [Latest News from Western Sahara], by Itsuko Hirata

3 days 4 hours ago
"Hey, NATO members, how about joint weapons-development projects with America?" So went U.S. President Trump's sales pitch for the war business at the NATO summit. The stealth bombers the U.S. used to attack Iran's nuclear facilities can no longer be called state-of-the-art weapons, and he seems busy clearing out old inventory. At the same time, he hopes the world will take up the nickname he coined for himself, "the president of peace," repeats it himself at every turn, and has his eye on the Nobel Peace Prize. Please stop weighing war against peace to make money.
Nikkan Berita

[Symposium Announcement] Harassment in the Media Industry and Citizens' "Right to Know", Sunday, July 6, 1:00 p.m. to 4:30 p.m., Zensuido Kaikan = Makoto Nishimura, Chair of the Central Executive Committee, Japan Federation of Newspaper Workers' Unions (Shimbun Roren)

3 days 6 hours ago
Shimbun Roren (the Japan Federation of Newspaper Workers' Unions) and its Tokyo regional federation will hold a symposium titled "Harassment in the Media Industry and Citizens' 'Right to Know'" on Sunday, July 6. The newspaper industry is mired in a serious slump, with rising turnover and chronic hiring difficulties. Part of the reason is structural: with circulation falling, the industry's future is hard to see. But another major factor is that harassment and gender inequality still run rampant in newsrooms. Harassment and gender discrimination make workplaces stifling and closed, which in turn inhibits free and diverse thinking and...
JCJ

Georgia Court Rules for Transparency over Private Police Foundation

3 days 6 hours ago

A Georgia court has decided that the private non-profit Atlanta Police Foundation (APF) must comply with public records requests under the Georgia Open Records Act for some of the functions it performs on behalf of the Atlanta Police Department. This is a major win for transparency in the state.

The lawsuit was brought last year by the Atlanta Community Press Collective (ACPC) and Electronic Frontier Alliance member Lucy Parsons Labs (LPL). It concerns the APF’s refusal to disclose records about its role as leaseholder and manager of the site of so-called Cop City, the Atlanta Public Safety Training Center at the heart of a years-long battle that pitted local social and environmental movements against the APF. We’ve previously written about how APF and similar groups fund police surveillance technology, and how the Atlanta Police Department spied on the social media of activists opposed to Cop City.

This is a big win for transparency and for local communities who want to maintain their right to know what public agencies are doing. 

Police foundations often provide resources to police departments that help them avoid public oversight, and the Atlanta Police Foundation leads the way with its maintenance of the Loudermilk Video Integration Center and its role in Cop City, which will be used by public agencies including the Atlanta Police Department and others.

ACPC and LPL were represented by attorneys Joy Ramsingh, Luke Andrews, and Samantha Hamilton, who had won the release of some materials this past December. The plaintiffs had earlier been represented by the University of Georgia School of Law First Amendment Clinic.

The win comes at just the right time. Last summer, the Georgia Supreme Court ruled that private contractors working for public entities are subject to open records laws. The Georgia state legislature then passed a bill to make it harder to file public records requests against private entities. Following this month’s ruling, the Atlanta Police Foundation still has time to appeal the decision, but failing that, it will have to begin complying with public records requests by the beginning of July.

We hope that this will help ensure transparency and accountability when government agencies farm out public functions to private entities, so that local activists and journalists will be able to uncover materials that should be available to the general public. 

José Martinez