Responding to ShotSpotter, Police Shoot at Child Lighting Fireworks

1 month 2 weeks ago

This post was written by Rachel Hochhauser, an EFF legal intern

We’ve written multiple times about the inaccurate and dangerous “gunshot detection” tool, ShotSpotter. A recent near-tragedy in Chicago adds to the growing pile of evidence that cities should drop the product.

On January 25, while responding to a ShotSpotter alert, a Chicago police officer opened fire on an unarmed “maybe 14 or 15” year old child in his backyard. Three officers approached the boy’s house, with one asking “What you doing bro, you good?” They heard a loud bang, later determined to be fireworks, and shot at the child. Fortunately, no physical injuries were recorded. In initial reports, police falsely claimed that they fired at a “man” who had fired on officers.

In a subsequent assessment of the event, the Chicago Civilian Office of Police Accountability (“COPA”) concluded that “a firearm was not used against the officers.” Chicago Police Superintendent Larry Snelling placed all attending officers on administrative duty for 30 days and is investigating whether the officers violated department policies.

ShotSpotter is the largest company producing and distributing audio gunshot detection systems for U.S. cities and police departments. Currently, it is used by 100 law enforcement agencies. The system relies on sensors positioned on buildings and lamp posts, which purportedly detect the acoustic signature of a gunshot. The information is then forwarded to humans who purportedly have the expertise to verify whether the sound was gunfire (and not, for example, a car backfiring), and whether to deploy officers to the scene.

ShotSpotter claims that its technology is “97% accurate,” a figure produced by the marketing department and not by engineers. The recent Chicago shooting shows otherwise. Indeed, a 2021 study in Chicago found that, over a period of 21 months, ShotSpotter resulted in police acting on dead-end reports over 40,000 times. Likewise, the Cook County State’s Attorney’s office concluded that ShotSpotter had “minimal return on investment” and resulted in arrests for only 1% of proven shootings, according to a recent CBS report. The technology is predominantly used in Black and Latinx neighborhoods, contributing to the over-policing of these areas. Police responding to ShotSpotter alerts arrive at the scene expecting gunfire; they are on edge and therefore more likely to draw their firearms.

Finally, these sensors invade the right to privacy. Even in public places, people often have a reasonable expectation of privacy and therefore a legal right not to have their voices recorded. But these sound sensors risk capturing and leaking private conversations. In People v. Johnson in California, a court held such recordings from ShotSpotter to be admissible evidence.

In February, Chicago’s Mayor announced that the city would not be renewing its contract with ShotSpotter. Many other cities have cancelled or are considering cancelling use of the tool.

This technology endangers lives, disparately impacts communities of color, and encroaches on the privacy rights of individuals. It has a history of false positives and poses clear dangers to pedestrians and residents. It is urgent that these inaccurate and harmful systems be removed from our streets.

Adam Schwartz

[Focus] The US and China Vying for Hegemony: What Are Their Diplomatic Weapons? — Masahiro Hashizume

1 month 2 weeks ago
The United States and China are each courting countries around the world, trying to pull them into their own camps. But the two powers do not pursue hegemony in the same way. In an online lecture on March 2 co-hosted by the JCJ and the Japan AALA (Asia-Africa-Latin America) Solidarity Committee, Hiroshi Onishi, professor emeritus of Keio University and Kyoto University, explained the differences between the two countries’ diplomacy. “The United States no longer has the power to reach into and control the Global South (a collective term for the emerging and developing countries of Asia, Africa, and Latin America in the Southern Hemisphere, such as India, Brazil, Thailand, and South Africa),” he began..
JCJ

Cops Running DNA-Manufactured Faces Through Face Recognition Is a Tornado of Bad Ideas

1 month 2 weeks ago

In keeping with law enforcement’s grand tradition of taking antiquated, invasive, and oppressive technologies, making them digital, and then calling it innovation, police in the U.S. recently combined two existing dystopian technologies in a brand new way to violate civil liberties. A police force in California recently employed the new practice of taking a DNA sample from a crime scene, running it through a service provided by US company Parabon NanoLabs that guesses what the perpetrator’s face looked like, and plugging this rendered image into face recognition software to build a suspect list.

Parts of this process aren't entirely new. On more than one occasion, police forces have been found to have fed images of celebrities into face recognition software to generate suspect lists. In one case from 2017, the New York Police Department decided its suspect looked like Woody Harrelson and ran the actor’s image through the software to generate hits. Further, software provided by US company Vigilant Solutions enables law enforcement to create “a proxy image from a sketch artist or artist rendering” to enhance images of potential suspects so that face recognition software can match these more accurately.

Since 2014, law enforcement has also sought the assistance of Parabon NanoLabs—a company that alleges it can create an image of the suspect’s face from their DNA. Parabon NanoLabs claims to have built this system by training machine learning models on the DNA data of thousands of volunteers paired with 3D scans of their faces. It is currently the only company offering phenotyping, and only in concert with a forensic genetic genealogy investigation. The process has yet to be independently audited, and scientists have affirmed that predicting face shapes—particularly from DNA samples—is not possible. But this has not stopped law enforcement officers from seeking to use it, or from running these fabricated images through face recognition software.

Simply put: police are using DNA to create a hypothetical and not at all accurate face, then using that face as a clue on which to base investigations into crimes. Not only is this full dice-roll policing, it also threatens the rights, freedom, and even the life of whoever is unlucky enough to look a little bit like that artificial face.

But it gets worse.

In 2020, a detective from the East Bay Regional Park District Police Department in California asked to have a rendered image from Parabon NanoLabs run through face recognition software. This 3D rendering, called a Snapshot Phenotype Report, predicted that—among other attributes—the suspect was male, had brown eyes, and fair skin. Found in police records published by Distributed Denial of Secrets, this appears to be the first reporting of a detective running an algorithmically generated rendering based on crime-scene DNA through face recognition software. This puts a second layer of speculation between the actual face of the suspect and the product the police are using to guide investigations and make arrests. Not only is the artificial face a guess, now face recognition (a technology known to misidentify people) will create a “most likely match” for that face.

These technologies, and their reckless use by police forces, are an inherent threat to our individual privacy, free expression, information security, and social justice. Face recognition tech alone has an egregious history of misidentifying people of color, especially Black women, as well as failing to correctly identify trans and nonbinary people. The algorithms are not always reliable, and even if the technology somehow had 100% accuracy, it would still be an unacceptable tool of invasive surveillance capable of identifying and tracking people on a massive scale. Combining this with fabricated 3D renderings from crime-scene DNA exponentially increases the likelihood of false arrests, and exacerbates existing harms on communities that are already disproportionately over-surveilled by face recognition technology and discriminatory policing. 

There are no federal rules that prohibit police forces from undertaking these actions. And despite the detective’s request violating Parabon NanoLabs’ terms of service, there is seemingly no way to ensure compliance. Pulling together criteria like skin tone, hair color, and gender does not give an accurate face of a suspect, and deploying these untested algorithms without any oversight places people at risk of being a suspect for a crime they didn’t commit. In one case from Canada, Edmonton Police Service issued an apology over its failure to balance the harms to the Black community with the potential investigative value after using Parabon’s DNA phenotyping services to identify a suspect.

EFF continues to call for a complete ban on government use of face recognition—because otherwise these are the results. How much more evidence do lawmakers need that police cannot be trusted with this dangerous technology? How many more people need to be falsely arrested, and how many more reckless schemes like this one need to be perpetrated, before legislators realize this is not a sustainable method of law enforcement? Cities across the United States have already taken the step to ban government use of this technology, and Montana has specifically recognized a privacy interest in phenotype data. Other cities and states need to catch up, or Congress needs to act, before more people are hurt and our rights are trampled.

Paige Collings

[B] “I can’t die! I Cannot Die in Peace” [Western Sahara Latest News]

1 month 2 weeks ago
Appearing on Seth Meyers’ NBC talk show, 81-year-old US President Biden mocked his 77-year-old rival in the presidential race, Trump: “He’s about as old as I am, but he can’t even remember his wife’s name.” Trump reportedly called his wife Melania “Mercedes” in a speech at the Conservative Political Action Conference (CPAC) on February 23. Trump has previously called Biden “incompetent” and “cognitively impaired,” but such traits in Biden apparently do not stem from age alone. Ordinary Americans, it seems, are growing fed up with this kind of election campaign.
日刊ベリタ

EFF and 34 Civil Society Organizations Call on Ghana’s President to Reject the Anti-LGBTQ+ Bill 

1 month 2 weeks ago

MPs in Ghana’s Parliament voted to pass the country’s draconian ‘Promotion of Proper Human Sexual Rights and Ghanaian Family Values Bill’ on February 28th. The bill now heads to Ghana’s President Nana Akufo-Addo to be signed into law. 

EFF has joined 34 civil society organizations to demand that President Akufo-Addo veto the Family Values Bill.

The legislation criminalizes being LGBTQ+ or an ally of LGBTQ+ people, and also imposes custodial sentences on users and social media companies as punishment for vague, ill-defined offenses like promoting “change in public opinion of prohibited acts” on social media. This would effectively ban all speech and activity, online and offline, that even remotely supports LGBTQ+ rights.

The letter concludes:

“We also call on you to reaffirm Ghana’s obligation to prevent acts that violate and undermine LGBTQ+ people’s fundamental human rights, including the rights to life, to information, to free association, and to freedom of expression.”

Read the full letter here.

Paige Collings

“America’s Glaring Double Standard” — Kenji Utsunomiya

1 month 2 weeks ago
The fighting between the Israeli military and the Islamist group Hamas in the Gaza Strip of the Palestinian territories, which began on October 7 last year, marks five months on March 7. Gaza health authorities have announced that the death toll in Gaza has exceeded 30,000, with more than 70,000 wounded. Seventy percent of the dead are women and children […]
admin

[B] Kenji Nozoe’s “Records of the Forced Conscription of Koreans to Akita Prefecture, Part 1”: Exploring the Work of Kenji Nozoe

1 month 2 weeks ago
At the end of January this year, a memorial monument to Koreans in the “Gunma no Mori” park in Takasaki City, Gunma Prefecture, was removed by administrative enforcement by the prefecture. The monument had been erected by a citizens’ group to commemorate Koreans who were forcibly brought to Japan under Japanese colonial rule and ultimately died there. Its removal by public authority is nothing less than an act of stealing history from people’s eyes. The Japanese state as a whole now appears to be pressing forward with historical revisionism. At such a time, we thought it would be meaningful to introduce, through this paper, part of the work of Kenji Nozoe, who based himself in Akita and unearthed and brought to light the history of the forced conscription of Chinese and Koreans. We will serialize “The Forced Conscription of Koreans to Akita Prefecture: 52 Sites, Photographs, and Maps,” which was probably Nozoe’s last work. It is a booklet of only 64 pages, edited and written by Kenji Nozoe and published by the Akita Prefecture Investigation Team on the Truth of the Forced Conscription of Koreans, for which Nozoe served as secretary-general. It was published in July 2015, priced at 600 yen, and printed as a booklet by a local print shop. (Kazuoki Ohno)
日刊ベリタ