The Legal Case Against Ring’s Face Recognition Feature
Amazon Ring’s upcoming face recognition tool has the potential to violate the privacy rights of millions of people and could result in Amazon breaking state biometric privacy laws.
Ring plans to introduce a feature to its home surveillance cameras called “Familiar Faces,” to identify specific people who come into view of the camera. When turned on, the feature will scan the faces of all people who approach the camera to try and find a match with a list of pre-saved faces. This will include many people who have not consented to a face scan, including friends and family, political canvassers, postal workers, delivery drivers, children selling cookies, or maybe even some people passing on the sidewalk.
Many biometric privacy laws across the country are clear: Companies need your affirmative consent before running face recognition on you. In at least one state, ordinary people, with the help of attorneys, can challenge Amazon’s data collection. Where that is not possible, state privacy regulators should step in.
Sen. Ed Markey (D-Mass.) has already called on Amazon to abandon its plans and sent the company a list of questions. Ring spokesperson Emma Daniels answered written questions posed by EFF, which can be viewed here.
What is Ring’s “Familiar Faces”?
Amazon describes “Familiar Faces” as a tool that “intelligently recognizes familiar people.” It says this tool will provide camera owners with “personalized context of who is detected, eliminating guesswork and making it effortless to find and review important moments involving specific familiar people.” Amazon plans to release the feature in December.
The feature will allow camera owners to tag particular people so Ring cameras can automatically recognize them in the future. In order for Amazon to recognize particular people, it will need to perform face recognition on every person who steps in front of the camera. Even if a camera owner does not tag a particular face, Amazon says it may retain that biometric information for up to six months. Amazon said it does not currently use the biometric data for “model training or algorithmic purposes.”
In order to biometrically identify you, a company typically will take your image and extract a faceprint by taking tiny measurements of your face and converting that into a series of numbers that is saved for later. When you step in front of a camera again, the company takes a new faceprint and compares it to a list of previous prints to find a match. Other forms of biometric tracking can be done with a scan of your fingertip, eyeball, or even your particular gait.
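To make the matching step concrete, here is a minimal sketch of how a faceprint comparison can work, assuming (as described above) that a faceprint is a fixed-length vector of numbers produced by some face recognition model. The names, vector values, and distance threshold below are invented for illustration; this is not Ring’s actual implementation.

```python
import numpy as np

# Hypothetical enrolled faceprints: each is a fixed-length embedding vector
# produced by some face recognition model (names and values are made up).
enrolled = {
    "alice": np.array([0.12, -0.48, 0.33, 0.91]),
    "bob":   np.array([-0.70, 0.05, 0.44, -0.22]),
}

def match_faceprint(probe, enrolled, threshold=0.6):
    """Return the enrolled name whose faceprint is closest to `probe`,
    or None if no saved faceprint is within `threshold` (Euclidean distance)."""
    best_name, best_dist = None, float("inf")
    for name, saved in enrolled.items():
        dist = np.linalg.norm(probe - saved)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

# A new face scan is converted into the same kind of vector and compared
# against every saved faceprint to look for a match.
probe = np.array([0.10, -0.50, 0.30, 0.90])
print(match_faceprint(probe, enrolled))  # -> alice
```

The privacy-relevant point is visible in the loop: every probe, whether or not it ultimately matches anyone, must first be converted into a faceprint and compared against the saved list.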
Amazon has told reporters that the feature will be off by default and that it would be unavailable in certain jurisdictions with the most active biometric privacy enforcement—including the states of Illinois and Texas, and the city of Portland, Oregon. The company would not promise that this feature will remain off by default in the future.
Why is This a Privacy Problem?
Your biometric data, such as your faceprint, are some of the most sensitive pieces of data that a company can collect. Associated risks include mass surveillance, data breach, and discrimination.
Today’s feature to recognize your friend at your front door can easily be repurposed tomorrow for mass surveillance. Ring’s close partnership with police amplifies that threat. For example, in a city dense with face recognition cameras, the entirety of a person’s movements could be tracked with the click of a button, or all people at a particular location could be identified. A recent and unrelated public-private partnership in New Orleans unfortunately shows that mass surveillance through face recognition is not some far-flung concern.
Amazon has already announced a related tool called “search party” that can identify and track lost dogs using neighbors’ cameras. A tool like this could be repurposed for law enforcement to track people. At least for now, Amazon says it does not have the technical capability to comply with law enforcement demands for a list of all cameras in which a person has been identified, though it does comply with other law enforcement demands.
In addition, data breaches are a perpetual concern with any data collection. Biometrics magnify that risk because your face cannot be reset, unlike a password or credit card number. Amazon says it processes and stores biometrics collected by Ring cameras on its own servers, and that it uses comprehensive security measures to protect the data.
Face recognition has also been shown to have higher error rates with certain groups—most prominently with dark-skinned women. Similar technology has also been used to make questionable guesses about a person’s emotions, age, and gender.
Will Ring’s “Familiar Faces” Violate State Biometric Laws?
Any Ring collection of biometric information in states that require opt-in consent poses huge legal risk for the company. Amazon already told reporters that the feature will not be available in Illinois and Texas—strongly suggesting its feature could not survive legal scrutiny there. The company said it is also avoiding Portland, Oregon, which has a biometric privacy law that similar companies have avoided.
Its “Familiar Faces” feature will necessarily require its cameras to collect a faceprint of every person who comes into view of an enabled camera, to try to find a match. It is impossible for Amazon to obtain consent from everyone—especially people who do not own Ring cameras. It appears that Amazon will try to offload some consent requirements onto individual camera owners themselves. Amazon says it will provide in-app messages to customers, reminding them to comply with applicable laws. But Amazon—as a company itself collecting, processing, and storing this biometric data—could have its own consent obligations under numerous laws.
Lawsuits against similar features highlight Amazon’s legal risks. In Texas, Google paid $1.375 billion to settle a lawsuit that alleged, among other things, that Google’s Nest cameras "indiscriminately capture the face geometry of any Texan who happens to come into view, including non-users." In Illinois, Facebook paid $650 million and shut down its face recognition tools that automatically scanned Facebook photos—even the faces of non-Facebook users—in order to identify people to recommend tagging. Later, Meta paid another $1.4 billion to settle a similar suit in Texas.
Many states aside from Illinois and Texas now protect biometric data. Washington passed a biometric privacy law in 2017, though the state has never enforced it. In 2023, Washington passed an even stronger law protecting biometric privacy, which allows individuals to sue on their own behalf. And at least 16 states have recently passed comprehensive privacy laws that often require companies to obtain opt-in consent for the collection of sensitive data, which typically includes biometric data. For example, in Colorado, a company that jointly with others determines the purpose and means of processing biometric data must obtain consent. Maryland goes further: such companies are essentially prohibited from collecting or processing biometric data from bystanders.
Many of these comprehensive laws have numerous loopholes and can only be enforced by state regulators—a glaring weakness facilitated in part by Amazon lobbyists.
Nonetheless, Ring’s new feature provides regulators a clear opportunity to step up to investigate, protect people’s privacy, and test the strength of their laws.