[B] New coalition "SCEF" of pro-democracy forces and ethnic minority organizations launched in Myanmar

16 hours 34 minutes ago
Min Aung Hlaing, the former commander-in-chief of Myanmar's armed forces who staged the country's 2021 military coup, assumed the presidency this month. In response, pro-democracy organizations active in the country have formed a new body, the Council for Promoting the Establishment of a Federal Democratic State of Myanmar (SCEF). (Kai Fujigaya)
Nikkan Berita

Stop New York's Attack on 3D Printing

16 hours 46 minutes ago

New York's proposed 2026-2027 budget currently includes provisions that will require all 3D printers sold in the state to run print-blocking censorware—software that surveils every print for forbidden designs. This policy would also create felony charges for possessing or sharing certain design files. The vote on the state budget could happen as early as next week, so New Yorkers need to act fast and demand that their Assemblymembers and Senators strip this provision from the budget.

Take action

Tell Your Representative to Stand with Creators

State legislators across the US are rushing to regulate 3D-printed firearms under the syllogism “something must be done; there, I've done something.” The most reckless of these proposals is a mandate for manufacturers to implement print blocking on all 3D printers. We, and other experts, have already pointed out that this algorithmic print blocking is simply unfeasible and will only serve to stifle competition, free expression, and privacy. While the harm will fall hardest on the creative communities lawfully using these printers, every New Yorker will be impacted by this blow to innovation.

This policy is unfortunately buried in Part C of New York State’s proposed budget for the 2026-2027 fiscal year (S.9005 / A.10005), which is urgently moving toward a vote after extensive delays. It’s also bundled with a policy that would allow felony charges to be brought against researchers and journalists for sharing design files restricted by the state. The worst of these impacts won’t be known until after it is negotiated behind closed doors, with no safeguards for creative expression or privacy.

Researchers and Journalists Could Face Felony Charges

Part C Subpart A of the budget includes two particularly concerning provisions: §§ 2.10 and 2.11. These threaten Class E felony charges for distributing or possessing design files that would produce firearm parts on a 3D printer or CNC machine.

The first provision, 2.10, makes it a felony to sell or distribute files that can produce major firearm components to someone who is not a federally and NY-licensed gunsmith. Under 2.11, it’s also a felony to possess these files if you intend to illegally print a firearm or share them with someone you believe is not permitted to own or smith a firearm.

A journalist reporting on 3D-printed guns. A researcher studying printable firearms. An artist incorporating parts into a new work commenting on gun culture. Under these provisions merely sharing a print file with any of them could result in criminal charges, even if no one involved intends to assemble a firearm.

Criminalizing information doesn’t work. Someone intent on illegally printing a firearm is already subject to charges for that act. Adding felony liability for simply possessing a file or design piles on additional charges while doing nothing to stop printing. New charges for someone distributing these files won’t make them inaccessible to lawbreakers, but they will have a chilling effect on legitimate and entirely legal work. 

Unsurprisingly, a similar law was proposed and subsequently scrapped in Colorado due to First Amendment concerns. We recommend New York do the same.

Mandated Surveillance, Less Access

Part C Subpart B would require every 3D printer and CNC machine sold in New York to include algorithms that scan your design files and block prints the system identifies as producing firearm components. Furthermore, all sales and deliveries of these machines must be made face-to-face. 

Unlike other bills we have seen, there are no exceptions to this mandate. These restrictions apply to sales to researchers, commercial manufacturers, and—oddly enough—federally and state-licensed gunsmiths.

Applying these restrictions to CNC machine sellers is particularly absurd. These cousins of 3D printers, which make 3D objects by removing material, often cost tens of thousands of dollars and are used by commercial manufacturers. Automotive, aerospace, and medical manufacturers, among many other industries, will be subject to the in-person sales requirement, the surveillance risk, and all the other problems these print-blocking algorithms introduce.

Even limiting the focus to individual buyers—hobbyists and artists who use these machines at home—this restriction to face-to-face sales comes with its own issues. Beyond unnecessarily complicating the use of printers in the state, this barrier to access will hit rural New Yorkers the hardest. People in rural or remote locations stand to benefit from the time and money saved by printing useful parts at home. With this restriction, they will need to drive to one of the few retailers who actually sell this equipment and settle for the models they stock.

That is, if sellers continue to stock these printers despite the risk. Subpart B §§ 2.3 and 2.5 open sellers, including anyone on the second-hand market, up to liability for selling out-of-date printers. Meanwhile, buyers hoping to illegally print firearms can simply build their own printer with widely available equipment.

The Law Won’t Work as Advertised 

Here’s what makes Subpart B of the New York budget particularly reckless: the technology it mandates is not capable of doing what it is supposed to do.

There is very little detail provided about requirements for the mandated algorithms. What the bill does outline boils down to this: the algorithms must evaluate print files to determine whether they would produce a firearm or illegal firearm parts, and if so, block the print. In an attempt to enable this, New York state would also create and maintain a library of forbidden files with tightly restricted access. 

We’ve already gone over why this idea simply won’t work. Design files are trivially easy to modify, split into segments, or otherwise alter to evade pattern detection. Even if printers fully rendered and analyzed the print with cloud-based AI, any number of design or post-print tricks can be used to dodge detection. Meanwhile, such fuzzy AI interpretation will rapidly increase the percentage of lawful prints censored. 
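The fragility of fingerprint-style detection is easy to demonstrate. As a minimal sketch (the file bytes below are hypothetical stand-ins for a real design file, not an actual format check), appending a single meaningless byte to a file changes its cryptographic hash entirely, so any blocklist keyed on file fingerprints simply stops matching:

```python
import hashlib

# A stand-in for a design file's bytes (hypothetical content).
original = b"solid part\n  facet normal 0 0 1\nendsolid part\n"

# A trivial, geometry-preserving tweak: one byte of trailing whitespace.
modified = original + b" "

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(modified).hexdigest()

# The meaningless byte changes the fingerprint completely, so a
# blocklist of known-file hashes no longer recognizes the file.
print(h1 == h2)  # False
```

More sophisticated geometric matching faces the same arms race: splitting a part into segments, scaling it, or adding cosmetic features defeats pattern matching just as cheaply.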

Firearms aren’t a highly specific design like paper currency; these proposed algorithms are futilely attempting to block an infinite number of designs capable of—or that can be made capable of—the few simple mechanical functions that make up a firearm. 

As we’ve said before: the internet always routes around censorship. Anyone determined to print a prohibited object has straightforward workarounds. The people who get surveilled and blocked are the people trying to follow the law.

The bill aims to enforce this impossible mandate by creating a working group to define the actual technical requirements of enforcement—but only after the law passes. This group has no peer review requirements, so it could easily be loaded with profiteers or incumbent manufacturers who are already lining up to participate. These incumbents stand to profit from shutting out new competitors, locking users into their devices and sellers into their platforms, subjecting both to the type of enshittification seen with Digital Rights Management (DRM) software. There are also no safeguards in the law to prevent the most surveillance-heavy approaches to print scanning, or to stop this censorship infrastructure from being further weaponized against lawful speech.

On the other hand, unbiased experts in open-source manufacturing in the working group can at best pause the clock by showing such algorithms are unfeasible. That is, until a new snake oil company comes along to restart it. 

New York Won't Be the Last Stop 

New York is one of the largest consumer markets in the country. When it mandates a feature in hardware, manufacturers hardly ever build a New York-only version. They build the New York version and sell it globally. A print-blocking mandate adopted in New York will become the national standard in practice.

New Yorkers deserve more than this rush job buried in a budget bill. This is an unfeasible tech solution, built without the consumer protections any serious policy proposal would require, and it creates new costs and inconveniences amidst a protracted annual budget process. It also threatens First Amendment protections. This policy will take shape behind closed doors, without consumer guardrails, and risks the worst outcomes for the grassroots innovation and creativity these machines enable. Worse still, these practices could become the norm across other states and among 3D-printer manufacturers worldwide.

Your representatives could vote on this ill-conceived measure in the next week. If you're a New Yorker, email your legislators now, and tell them to strip this measure from the budget today.

Take action

Tell Your Representative to Stand with Creators

Rory Mir

Stop New York’s Surveillance and Censorship Mandate Before It’s Too Late

17 hours 3 minutes ago

New York’s state budget could pass within days. Buried deep in the text is a provision that has nothing to do with balancing the books. Part C of the budget bill would require every 3D printer sold in New York to run surveillance software that scans every design file you create, and blocks anything an algorithm flags as a potential firearm component. A separate provision would expose researchers, journalists, and educators to felony charges simply for possessing or sharing certain design files.

This isn’t a niche issue for hobbyists. This is the state mandating censorship infrastructure on your devices, and handing printer companies a way to lock in and surveil users. It’s a direct attack on grassroots innovation and user control.

New York is one of the largest consumer markets in the country. When it mandates technology in hardware, manufacturers don’t build a New York-only version. They build one version and sell it everywhere. What passes in Albany won’t stay in New York.

The vote could happen within days. If you’re a New Yorker, contact your legislators now.

Electronic Frontier Foundation

Tell Congress: Don't Let Anyone Own The Law

18 hours 40 minutes ago

Court after court has recognized that no one can own the text of the law. But the Pro Codes Act is a deceptive power grab that will help giant industry associations ration access to huge swaths of U.S. laws. Tell Congress not to fall for it.

Electronic Frontier Foundation

How Push Notifications Can Betray Your Privacy (and What to Do About It)

20 hours 36 minutes ago

A phone’s push notifications can contain a significant amount of information about you, your communications, and what you do throughout the day. They’re important enough to government investigations that Apple and Google now both require a judge’s order to hand details about push notifications over to law enforcement, and even with that requirement Apple shares data on hundreds of users. More recently, we also learned from a 404 Media report that law enforcement forensic extraction tools can unearth the text from deleted notifications, including those from secure messaging tools, like Signal. The good news is that you can mitigate some of this risk. 

There are two points where notifications may betray your privacy: when they’re transmitted over cloud servers and once they land on the device. Let’s start with the cloud. It might seem like push notifications come directly from an app, but they are typically routed through Apple’s or Google’s servers first (depending on whether you use iOS or Android). According to a letter sent to the Department of Justice by Senator Wyden, the content of those notifications may be visible to Apple and Google, and at the very least the companies collect some metadata about which apps send a notification and when. App providers have to make the decision to hide the content from Apple and Google and implement that functionality; Signal is one app that does this.

Then, once the notifications land on your phone, depending on your settings, the notification content may be visible on your lock screen without needing to unlock the device. This can be dangerous if you lose your device, someone steals it, or it’s confiscated by law enforcement. 

You may clear notifications after looking at them. But it turns out the content of notifications gets recorded in your device’s internal storage, which makes it susceptible to recovery with certain types of forensic tools. Notification content may even persist after the app is deleted, if the OS doesn’t fully purge the app’s notification data.

We still have a lot of unanswered questions about how the notification databases work on devices. We do not know how long notifications are stored, or whether they’re backed up to the cloud; if backups are enabled and not end-to-end encrypted, the cloud provider could access the content of messages, and those backups could also be vulnerable to law enforcement demands for data.

Which is all to say that there are myriad ways that law enforcement can access the content or metadata of push notifications. Let’s fix that.

Consider the Strongest Notification Protections for Your Secure Messaging Apps

Secure chat tools are designed to keep the content of the messages safe inside the app. So, for secure chat apps like WhatsApp and Signal, that means the company that makes those apps cannot see the content of your messages, and they’re only accessible on your and your recipients’ devices. Once messages land on a device, it’s still important to consider some privacy precautions, particularly with notifications. 

Signal
Signal offers three levels of information to include in notifications, all of which are pretty self-explanatory:

  • Name, Content, and Actions (Name and message on Android) shows the entirety of a message as well as who sent it (on iPhone you can also slide to reply, mark as read, or call back). 
  • Name only shows just the name of the sender. 
  • No Name or Content (No name or message on Android) will only show that you have a message from Signal, not who sent it or what it’s about. 

To change your settings:

  • On iPhone: Tap your profile picture, then Settings > Notifications > Show.
  • On Android: Tap your profile picture, then Notifications > Show

WhatsApp
WhatsApp only has one option for this, and it’s currently limited to iPhone, but you can at least tell the app not to include the content of a message in the notification:

  • Open WhatsApp for iPhone, tap the “You” bar, then Notifications, and disable the Show preview option.

Check your other apps to see if they offer similar settings.

Limit Your Notifications Device-Wide

Since Apple and Google manage push notifications for their respective devices, they also have some visibility into certain data. Push notification data can include certain types of metadata, like which app sent a notification and when, as well as the account ID associated with the phone. In some cases, Apple and Google may have access to unencrypted content, including the content of the text in a notification or other information from the app itself. 

For most app notifications, there’s no simple way to figure out what metadata might be gleaned from a notification, or whether the notification content is encrypted. But some app developers have described details along these lines. For example, Signal president Meredith Whittaker explained on social media how the Signal app handles notifications entirely on-device. Searching online for an app name along with “notification privacy,” “notification encryption,” or “notification metadata” may help answer your questions, or you may need to dig around in support forums for the app.

It’s also good to reconsider whether any app should be sending you notifications to begin with. Beyond reducing daily distractions and the level of chaos on your lockscreen, limiting which apps can send notifications, and what content is visible in them, improves your privacy: less metadata for the companies to gather, and less content viewable by anyone with physical access to your device.

To check and change your settings on iPhone

  • Open Settings > Notifications.
  • With the Show Previews option, you can choose whether to show the content of notifications on the lock screen: “Always” (doesn’t require unlocking the device), “When Unlocked” (requires unlocking), or “Never” (notifications won’t show any details, just that you have a notification from an app). 
  • Alternatively, you can scroll down and change these settings per app. Just tap the app name, then the Show Previews menu, and choose how you’d like them to appear. Or, if you’ve decided you don’t want notifications from that app at all, uncheck the Allow Notifications option.

To check and change your settings on Android
The core version of Android relies on app developers to provide specific settings rather than controlling them platform-wide.

  • Open Settings > Notifications > App notifications to disable notifications from any app completely. Some apps may also offer internal notification options for specific types of notices, like new messages, that you can control in the app itself. Tap an app name, then tap the Additional settings in the app option to potentially customize it more.
  • You can also experiment with the sensitive content setting. This is up to the developer to set properly, but when done so, most notifications will require at least unlocking the device to see them. Open Settings > Notifications > Notifications on lock screen and disable “Show sensitive content.”

Control What Notifications AI Tools Can Access

In an attempt to make notifications easier to skim, both Android and iOS offer optional AI tools that summarize the content of notifications. On an individual app level, WhatsApp offers this as well. Some of these summarization tools, like Apple’s, run on the device, while others, like WhatsApp’s, do not. This can be a lot to keep track of, and sending data off-device may create some level of risk for some messages.

Since this is a bit more complicated, we have another blog post that walks through the steps to take to protect messaging from accidentally ending up in AI tools built into Apple and Google's devices. For WhatsApp specifically, we have a blog detailing when you might want to turn on the app’s “Advanced Chat Privacy” feature, which can disable summaries for both yourself and others in the chat.

Balancing security, privacy, and usability with something like push notifications is a complicated task. At the very least, Apple and Google should better ensure that the content of these notifications isn’t transmitted over their servers in plain text. The companies also need to make sure that device operating systems don’t back up the notification database to the cloud, and that all notification data is purged when an app is deleted.

We appreciate that apps like Signal allow you to control what’s visible with notifications on a per-app basis, and we’d like to see this level of granularity of choices in other secure messaging tools, like WhatsApp. Likewise, more apps should handle push notifications similarly to the way Signal does, where a ping is sent to wake up the app to check for messages, and the content of that message is never sent across servers.
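To illustrate the difference, here is a minimal sketch using Apple's documented APNs payload keys (`aps`, `alert`, `content-available`); the sender name and message text are hypothetical. A content-bearing push carries the message itself through the push servers, while a Signal-style ping carries nothing readable and leaves the app to fetch the encrypted message on its own:

```python
# What crosses the push servers differs by design. Payload shapes follow
# Apple's APNs payload keys; the values are hypothetical examples.

# A content-bearing push: the message text itself transits the servers.
revealing = {
    "aps": {"alert": {"title": "Alex", "body": "Meet at 6?"}}
}

# A content-free "ping": only a silent wake-up signal transits the
# servers; the app then connects directly to fetch the real message.
content_free = {
    "aps": {"content-available": 1}
}

print("body" in str(revealing))     # True: message text is in the payload
print("body" in str(content_free))  # False: nothing to read in transit
```

Only metadata (which app was pinged, and when) is inherent to the second approach; everything else stays between the app and its own servers.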

Thorin Klosowski

[Book Guide] Recommended Books for April, by Taku Hagiyama (writer)

21 hours 18 minutes ago
A selection of nonfiction titles (listed in order of publication; prices exclude tax). ◆ "Michiko Ishimure: For the 70th Anniversary of the Official Confirmation of Minamata Disease," KAWADE Yume Mook, expanded new edition, published April 6, 1,600 yen. The year 2026 marks 70 years since Minamata disease was officially confirmed. The writer Michiko Ishimure (1927–2018) supported the patients' movement and, through works such as "Kugai Jōdo" (Paradise in the Sea of Sorrow), continued to confront Minamata disease and the problems of modern civilization. Following her death in 2018, the Bungei Bessatsu tribute issue "In Memoriam Michiko Ishimure: Farewell, Kotodama of the Shiranui Sea" has been reissued with numerous additions..
JCJ