Programming

How Rust Finally Got a Specification - Thanks to a Consultancy's Open-Source Donation (rustfoundation.org) 16

As Rust approaches its 10th anniversary, "there is an important piece of documentation missing that many other languages provide," notes the Rust Foundation.

While there's documentation and tutorials — there's no official language specification: In December 2022, an RFC was submitted to encourage the Rust Project to begin working on a specification. After much discussion, the RFC was approved in July 2023, and work began.

Initially, the Rust Project specification team (t-spec) was interested in creating the document from scratch using the Rust Reference as a guiding marker. However, the team knew there was already an external Rust specification being used successfully for compiler qualification purposes — the FLS.

Thank Berlin-based Ferrous Systems, a Rust consultancy which assembled that description "some years ago," according to a post on the Rust blog: They've since been faithfully maintaining and updating this document for new versions of Rust, and they've successfully used it to qualify toolchains based on Rust for use in safety-critical industries. [The Rust Foundation notes it's part of the consultancy's "Ferrocene" Rust compiler/toolchain.] Seeing this success, others have also begun to rely on the FLS for their own qualification efforts when building with Rust.
The Rust Foundation explains: The FLS provides a structured and detailed reference for Rust's syntax, semantics, and behavior, serving as a foundation for verification, compliance, and standardization efforts. Since Rust did not have an official language specification back then, nor a plan to write one, the FLS represented a major step toward describing Rust in a way that aligns with industry requirements, particularly in high-assurance domains.
And the Rust Project is "passionate about shipping high quality tools that enable people to build reliable software at scale," adds the Rust blog. So... It's in that light that we're pleased to announce that we'll be adopting the FLS into the Rust Project as part of our ongoing specification efforts. This adoption is being made possible by the gracious donation of the FLS by Ferrous Systems. We're grateful to them for the work they've done in assembling the FLS, in making it fit for qualification purposes, in promoting its use and the use of Rust generally in safety-critical industries, and now, for working with us to take the next step and to bring the FLS into the Project.

With this adoption, we look forward to better integrating the FLS with the processes of the Project and to providing ongoing and increased assurances to all those who use Rust in safety-critical industries and, in particular, to those who use the FLS as part of their qualification efforts.

More from the Rust Foundation: The t-spec team wanted to avoid potential confusion from having two highly visible Rust specifications in the industry and so decided it would be worthwhile to try to integrate the FLS with the Rust Reference to create the official Rust Project specification. They approached Ferrous Systems, which agreed to contribute its FLS to the Rust Project and allow the Rust Project to take over its development and management... This generous donation will provide a clearer path to delivering an official Rust specification. It will also empower the Rust Project to oversee its ongoing evolution, providing confidence to companies and individuals already relying on the FLS, and marking a major milestone for the Rust ecosystem.

"I really appreciate Ferrous taking this step to provide their specification to the Rust Project," said Joel Marcey, Director of Technology at the Rust Foundation and member of the t-spec team. "They have already done a massive amount of legwork...." This effort will provide others who require a Rust specification with an official, authoritative reference for their work with the Rust programming language... This is an exciting outcome. A heartfelt thank you to the Ferrous Systems team for their invaluable contribution!

Marcey said the move allows the team "to supercharge our progress in the delivery of an official Rust specification."

And the co-founder of Ferrous Systems, Felix Gilcher, also sounded excited. "We originally created the Ferrocene Language Specification to provide a structured and reliable description of Rust for the certification of the Ferrocene compiler. As an open source-first company, contributing the FLS to the Rust Project is a logical step toward fostering the development of a unified, community-driven specification that benefits all Rust users."
AI

Has the Decline of Knowledge Worker Jobs Begun? (boston.com) 101

The New York Times notes that white-collar workers have faced higher unemployment than other groups in the U.S. over the past few years — along with slower wage growth.

Some economists wonder if this trend might be irreversible... and partly attributable to AI: After sitting below 4% for more than two years, the overall unemployment rate has topped that threshold since May... "We're seeing a meaningful transition in the way work is done in the white-collar world," said Carl Tannenbaum, the chief economist of Northern Trust. "I tell people a wave is coming...." Thousands of video game workers lost jobs last year and the year before... Unemployment in finance and related industries, while still low, increased by about a quarter from 2022 to 2024, as rising interest rates slowed demand for mortgages and companies sought to become leaner....

Overall, the latest data from the Federal Reserve Bank of New York show that the unemployment rate for college grads has risen 30% since bottoming out in September 2022 (to 2.6% from 2%), versus about 18% for all workers (to 4% from 3.4%). An analysis by Julia Pollak, chief economist of ZipRecruiter, shows that unemployment has been most elevated among those with bachelor's degrees or some college but no degree, while unemployment has been steady or falling at the very top and bottom of the education ladder — for those with advanced degrees or without a high school diploma. Hiring rates have slowed more for jobs requiring a college degree than for other jobs, according to ADP Research, which studies the labor market....
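
The relative increases quoted above follow directly from the underlying rates. As a quick sketch (using only the rates quoted in the article):

```python
# Relative rise in unemployment rates, per the figures quoted above.
grads_before, grads_after = 2.0, 2.6   # college graduates, Sept. 2022 trough vs. latest
all_before, all_after = 3.4, 4.0       # all workers over the same period

grads_rise = (grads_after - grads_before) / grads_before * 100
all_rise = (all_after - all_before) / all_before * 100

# grads_rise is 30%; all_rise is roughly 17.6%, i.e. the article's "about 18%"
print(f"college grads: +{grads_rise:.1f}%  all workers: +{all_rise:.1f}%")
```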

And artificial intelligence could reduce that need further by increasing the automation of white-collar jobs. A recent academic paper found that software developers who used an AI coding assistant improved a key measure of productivity by more than 25% and that the productivity gains appeared to be largest among the least experienced developers. The result suggested that adopting AI could reduce the wage premium enjoyed by more experienced coders, since it would erode their productivity advantages over novices... [A]t least in the near term, many tech executives and their investors appear to see AI as a way to trim their staffing. A software engineer at a large tech company who declined to be named for fear of harming his job prospects said that his team was about half the size it was last year and that he and his co-workers were expected to do roughly the same amount of work by relying on an AI assistant. Overall, the unemployment rate in tech and related industries jumped by more than half from 2022 to 2024, to 4.4% from 2.9%.

"Some economists say these trends may be short term in nature and little cause for concern on their own," the article points out (with one economist noting the unemployment rate is still low compared to historical averages).

Harvard labor economist Lawrence Katz even suggested the slower wage growth could reflect the discount that these workers accepted in return for being able to work from home.

Thanks to Slashdot reader databasecowgirl for sharing the article.
Cloud

Microsoft Announces 'Hyperlight Wasm': Speedy VM-Based Security at Scale with a WebAssembly Runtime (microsoft.com) 18

Cloud providers like the security of running things in virtual machines "at scale" — even though VMs "are not known for having fast cold starts or a small footprint..." noted Microsoft's Open Source blog last November. So Microsoft's Azure Core Upstream team built an open source Rust library called Hyperlight "to execute functions as fast as possible while isolating those functions within a VM."

But that was just the beginning... Then, we showed how to run Rust functions really, really fast, followed by using C to [securely] run JavaScript. In February 2025, the Cloud Native Computing Foundation (CNCF) voted to onboard Hyperlight into their Sandbox program [for early-stage projects].

[This week] we're announcing the release of Hyperlight Wasm: a Hyperlight virtual machine "micro-guest" that can run wasm component workloads written in many programming languages...

Traditional virtual machines do a lot of work to be able to run programs. Not only do they have to load an entire operating system, they also boot up the virtual devices that the operating system depends on. Hyperlight is fast because it doesn't do that work; all it exposes to its VM guests is a linear slice of memory and a CPU. No virtual devices. No operating system. But this speed comes at the cost of compatibility. Chances are that your current production application expects a Linux operating system running on the x86-64 architecture (hardware), not a bare linear slice of memory...

[B]uilding Hyperlight with a WebAssembly runtime — wasmtime — enables any programming language to execute in a protected Hyperlight micro-VM without any prior knowledge of Hyperlight at all. As far as program authors are concerned, they're just compiling for the wasm32-wasip2 target... Executing workloads in the Hyperlight Wasm guest isn't just possible for compiled languages like C, Go, and Rust, but also for interpreted languages like Python, JavaScript, and C#. The trick here, much like with containers, is to also include a language runtime as part of the image... Programming languages, runtimes, application platforms, and cloud providers are all starting to offer rich experiences for WebAssembly out of the box. If we do things right, you will never need to think about whether your application is running inside of a Hyperlight Micro-VM in Azure. You may never know your workload is executing in a Hyperlight Micro VM. And that's a good thing.

While a traditional virtual-device-based VM takes about 125 milliseconds to load, "When the Hyperlight VMM creates a new VM, all it needs to do is create a new slice of memory and load the VM guest, which in turn loads the wasm workload. This takes about 1-2 milliseconds today, and work is happening to bring that number to be less than 1 millisecond in the future."

And there's also double security due to Wasmtime's software-defined runtime sandbox within Hyperlight's larger VM...
AI

First Trial of Generative AI Therapy Shows It Might Help With Depression 42

An anonymous reader quotes a report from MIT Technology Review: The first clinical trial of a therapy bot that uses generative AI suggests it was as effective as human therapy for participants with depression, anxiety, or risk for developing eating disorders. Even so, it doesn't give a go-ahead to the dozens of companies hyping such technologies while operating in a regulatory gray area. A team led by psychiatric researchers and psychologists at the Geisel School of Medicine at Dartmouth College built the tool, called Therabot, and the results were published on March 27 in the New England Journal of Medicine. Many tech companies are building AI therapy bots to address the mental health care gap, offering more frequent and affordable access than traditional therapy. However, challenges persist: poorly worded bot responses can cause harm, and forming meaningful therapeutic relationships is hard to replicate in software. While many bots rely on general internet data, researchers at Dartmouth developed "Therabot" using custom, evidence-based datasets. Here's what they found: To test the bot, the researchers ran an eight-week clinical trial with 210 participants who had symptoms of depression or generalized anxiety disorder or were at high risk for eating disorders. About half had access to Therabot, and a control group did not. Participants responded to prompts from the AI and initiated conversations, averaging about 10 messages per day. Participants with depression experienced a 51% reduction in symptoms, the best result in the study. Those with anxiety experienced a 31% reduction, and those at risk for eating disorders saw a 19% reduction in concerns about body image and weight. These measurements are based on self-reporting through surveys, a method that's not perfect but remains one of the best tools researchers have.

These results ... are about what one finds in randomized control trials of psychotherapy with 16 hours of human-provided treatment, but the Therabot trial accomplished it in about half the time. "I've been working in digital therapeutics for a long time, and I've never seen levels of engagement that are prolonged and sustained at this level," says [Michael Heinz, a research psychiatrist at Dartmouth College and Dartmouth Health and first author of the study].
Science

A New Image File Format Efficiently Stores Invisible Light Data (arstechnica.com) 11

An anonymous reader quotes a report from Ars Technica: Imagine working with special cameras that capture light your eyes can't even see -- ultraviolet rays that cause sunburn, infrared heat signatures that reveal hidden writing, or specific wavelengths that plants use for photosynthesis. Or perhaps using a special camera designed to distinguish the subtle visible differences that make paint colors appear just right under specific lighting. Scientists and engineers do this every day, and they're drowning in the resulting data. A new compression format called Spectral JPEG XL might finally solve this growing problem in scientific visualization and computer graphics. Researchers Alban Fichet and Christoph Peters of Intel Corporation detailed the format in a recent paper published in the Journal of Computer Graphics Techniques (JCGT). It tackles a serious bottleneck for industries working with these specialized images. These spectral files can contain 30, 100, or more data points per pixel, causing file sizes to balloon into multi-gigabyte territory -- making them unwieldy to store and analyze.

[...] The current standard format for storing this kind of data, OpenEXR, wasn't designed with these massive spectral requirements in mind. Even with built-in lossless compression methods like ZIP, the files remain unwieldy for practical work as these methods struggle with the large number of spectral channels. Spectral JPEG XL utilizes a technique used with human-visible images, a math trick called a discrete cosine transform (DCT), to make these massive files smaller. Instead of storing the exact light intensity at every single wavelength (which creates huge files), it transforms this information into a different form. [...]
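
As a rough illustration of why a DCT helps here: a smooth spectral curve concentrates nearly all of its energy in a few low-order coefficients, so the rest can be discarded or coarsely quantized. The sketch below shows that idea on an invented 31-band curve; it is a generic DCT-II demonstration, not the codec described in the paper:

```python
import math

def dct2(x):
    """Forward DCT-II (unnormalized): smooth inputs yield rapidly decaying coefficients."""
    N = len(x)
    return [sum(x[n] * math.cos(math.pi / N * (n + 0.5) * k) for n in range(N))
            for k in range(N)]

def idct2(X):
    """Inverse transform (scaled DCT-III) reconstructing the original samples."""
    N = len(X)
    return [X[0] / N + 2.0 / N * sum(X[k] * math.cos(math.pi / N * (n + 0.5) * k)
                                     for k in range(1, N))
            for n in range(N)]

# A smooth, hypothetical 31-band reflectance curve for a single pixel.
spectrum = [0.2 + 0.5 * math.sin(n / 30 * math.pi) for n in range(31)]
coeffs = dct2(spectrum)

# Lossy step: keep only the first 8 of 31 coefficients, zero the rest.
kept = coeffs[:8] + [0.0] * (len(coeffs) - 8)
approx = idct2(kept)
max_err = max(abs(a - b) for a, b in zip(spectrum, approx))
print(f"max reconstruction error with 8/31 coefficients: {max_err:.4f}")
```

Keeping roughly a quarter of the coefficients reproduces the curve with only a small error, which is the same energy-compaction property JPEG exploits for visible images.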

According to the researchers, the massive file sizes of spectral images have reportedly been a real barrier to adoption in industries that would benefit from their accuracy. Smaller files mean faster transfer times, reduced storage costs, and the ability to work with these images more interactively without specialized hardware. The results reported by the researchers seem impressive -- with their technique, spectral image files shrink by 10 to 60 times compared to standard OpenEXR lossless compression, bringing them down to sizes comparable to regular high-quality photos. They also preserve key OpenEXR features like metadata and high dynamic range support.
The report notes that broader adoption "hinges on the continued development and refinement of the software tools that handle JPEG XL encoding and decoding."

Some scientific applications may also see JPEG XL's lossy approach as a drawback. "Some researchers working with spectral data might readily accept the trade-off for the practical benefits of smaller files and faster processing," reports Ars. "Others handling particularly sensitive measurements might need to seek alternative methods of storage."
Oracle

Oracle Health Breach Compromises Patient Data At US Hospitals 5

A breach of legacy Cerner servers at Oracle Health exposed patient data from multiple U.S. hospitals and healthcare organizations, with threat actors using compromised customer credentials to steal the data before it had been migrated to Oracle Cloud. Despite confirming the breach privately, Oracle Health has yet to publicly acknowledge the incident. BleepingComputer reports: Oracle Health, formerly known as Cerner, is a healthcare software-as-a-service (SaaS) company offering Electronic Health Records (EHR) and business operations systems to hospitals and healthcare organizations. After being acquired by Oracle in 2022, Cerner was merged into Oracle Health, with its systems migrated to Oracle Cloud. In a notice sent to impacted customers and seen by BleepingComputer, Oracle Health said it became aware of a breach of legacy Cerner data migration servers on February 20, 2025.

"We are writing to inform you that, on or around February 20, 2025, we became aware of a cybersecurity event involving unauthorized access to some amount of your Cerner data that was on an old legacy server not yet migrated to the Oracle Cloud," reads a notification sent to impacted Oracle Health customers. Oracle says that the threat actor used compromised customer credentials to breach the servers sometime after January 22, 2025, and copied data to a remote server. This stolen data "may" have included patient information from electronic health records. However, multiple sources told BleepingComputer that it was confirmed that patient data was stolen during the attack.

Oracle Health is also telling hospitals that it will not notify patients directly and that it is the hospitals' responsibility to determine whether the stolen data violates HIPAA laws and whether they are required to send notifications. However, the company says it will help identify impacted individuals and provide templates to help with notifications.
Software

'Apple Needs a Snow Sequoia' (ofb.biz) 85

uninet writes: The same year Apple launched the iPhone, it unveiled a massive upgrade to Mac OS X known as Leopard, sporting "300 New Features." Two years later, it did something almost unheard of: it released Snow Leopard, an upgrade all about how little it added and how much it took away. Apple needs to make it snow again. Current releases of macOS Sequoia and iOS/iPadOS 18 are riddled with easily reproducible bugs in high-traffic areas, the author argues, suggesting Apple's engineers aren't using their own software. Messages can't reliably copy text, email connections randomly fail, and Safari frequently jams up. Even worse are the baffling design decisions, like burying display arrangement settings and redesigning Photos with needless margins and inconsistent navigation.

Apple's focus on the Vision Pro while AI advances raced ahead has left them scrambling to catch up, the author argues, with Apple Intelligence features now indefinitely delayed. The author insists that Apple's products still remain better than Windows or Android alternatives -- but "least bad" isn't the premium experience Apple loyalists expect. With its enormous resources, Apple could easily have teams focus on cleaning up existing software while simultaneously developing AI features.

Further reading: 'Something Is Rotten in the State of Cupertino'.
The Courts

Qualcomm Launches Global Antitrust Campaign Against Arm (tomshardware.com) 29

An anonymous reader quotes a report from Tom's Hardware: Qualcomm has reportedly filed secret complaints against Arm with the European Commission, the U.S. Federal Trade Commission (FTC), and the Korea Fair Trade Commission. Qualcomm argues that Arm's open licensing approach helped build a robust hardware and software ecosystem. However, this ecosystem is under threat now as Arm moves to restrict that access to benefit its chip design business, namely compute subsystems (CSS) reference designs for client and datacenter processors and custom silicon based on CSS for large-scale clients.

Qualcomm has presented its case to the EC, U.S. FTC, and Korea FTC behind closed doors and through formal filings, so it does not comment on the matter now. Arm rejected the accusations, stating that it is committed to innovation, competition, and upholding contract terms. The company called Qualcomm's move an attempt to shift attention from a wider commercial dispute between the two companies and use regulatory pressure for its benefit.

Indeed, the antitrust complaints align with Qualcomm's arguments in a recent legal clash with Arm in Delaware. Qualcomm won that trial, as the court ruled that the company did not break the terms of its architecture license agreement (ALA) and technology license agreement (TLA) by acquiring Nuvia and using its IP in its Snapdragon X processors for client PCs. Arm said it would seek a retrial. However, Qualcomm seems to want to ensure that it will have access to Arm's instruction set architecture and technologies by filing complaints with antitrust regulators.

Businesses

VMware Sues Siemens For Allegedly Using Unlicensed Software (theregister.com) 25

VMware has sued industrial giant Siemens AG's US operations for alleged use of unlicensed software and accused it of changing its story during negotiations. From a report: The case was filed last Friday in the US District Court for the District of Delaware. VMware's complaint [PDF] alleges that Siemens AG's US operations used more VMware software than it had licensed. Siemens's use of VMware became contentious when it tried to arrange extended support for some products.

On September 9, 2024, Siemens apparently produced a list of the VMware software it used and "demanded that VMware accept a purchase order to provide maintenance and support services for the listed products." The complaint states that the list mentioned VMware deployments that "far exceeded the number of licenses it [Siemens] had actually purchased."

Android

Google Will Develop the Android OS Fully In Private 20

An anonymous reader quotes a report from Android Authority: No matter the manufacturer, every Android phone has one thing in common: its software base. Manufacturers can heavily customize the look and feel of the Android OS they ship on their Android devices, but under the hood, the core system functionality is derived from the same open-source foundation: the Android Open Source Project. After over 16 years, Google is making big changes to how it develops the open source version of Android in an effort to streamline its development. [...] Beginning next week, all Android development will occur within Google's internal branches, and the source code for changes will only be released when Google publishes a new branch containing those changes. As this is already the practice for most Android component changes, Google is simply consolidating its development efforts into a single branch.

This change will have minimal impact on regular users. While it streamlines Android OS development for Google, potentially affecting the speed of new version development and bug reduction, the overall effect will likely be imperceptible. Therefore, don't expect this change to accelerate OS updates for your phone. This change will also have minimal impact on most developers. App developers are unaffected, as it pertains only to platform development. Platform developers, including those who build custom ROMs, will largely also see little change, since they typically base their work on specific tags or release branches, not the main AOSP branch. Similarly, companies that release forked AOSP products rarely use the main AOSP branch due to its inherent instability.

External developers who enjoy reading or contributing to AOSP will likely be dismayed by this news, as it reduces their insight into Google's development efforts. Without a GMS license, contributing to Android OS development becomes more challenging, as the available code will consistently lag behind by weeks or months. This news will also make it more challenging for some developers to keep up with new Android platform changes, as they'll no longer be able to track changes in AOSP. For reporters, this change means less access to potentially revealing information, as AOSP patches often provide insights into Google's development plans. [...] Google will share more details about this change when it announces it later this week. If you're interested in learning more, be sure to keep an eye out for the announcement and new documentation on source.android.com.
Android Authority's Mishaal Rahman says Google is "committed to publishing Android's source code, so this change doesn't mean that Android is becoming closed-source."

"What will change is the frequency of public source code releases for specific Android components," says Rahman. "Some components like the build system, update engine, Bluetooth stack, Virtualization framework, and SELinux configuration are currently AOSP-first, meaning they're developed fully in public. Most Android components like the core OS framework are primarily developed internally, although some features, such as the unlocked-only storage area API, are still developed within AOSP."
The Internet

Open Source Devs Say AI Crawlers Dominate Traffic, Forcing Blocks On Entire Countries (arstechnica.com) 64

An anonymous reader quotes a report from Ars Technica: Software developer Xe Iaso reached a breaking point earlier this year when aggressive AI crawler traffic from Amazon overwhelmed their Git repository service, repeatedly causing instability and downtime. Despite configuring standard defensive measures -- adjusting robots.txt, blocking known crawler user-agents, and filtering suspicious traffic -- Iaso found that AI crawlers continued evading all attempts to stop them, spoofing user-agents and cycling through residential IP addresses as proxies. Desperate for a solution, Iaso eventually resorted to moving their server behind a VPN and creating "Anubis," a custom-built proof-of-work challenge system that forces web browsers to solve computational puzzles before accessing the site. "It's futile to block AI crawler bots because they lie, change their user agent, use residential IP addresses as proxies, and more," Iaso wrote in a blog post titled "a desperate cry for help." "I don't want to have to close off my Gitea server to the public, but I will if I have to."

Iaso's story highlights a broader crisis rapidly spreading across the open source community, as what appear to be aggressive AI crawlers increasingly overload community-maintained infrastructure, causing what amounts to persistent distributed denial-of-service (DDoS) attacks on vital public resources. According to a comprehensive recent report from LibreNews, some open source projects now see as much as 97 percent of their traffic originating from AI companies' bots, dramatically increasing bandwidth costs, service instability, and burdening already stretched-thin maintainers.

Kevin Fenzi, a member of the Fedora Pagure project's sysadmin team, reported on his blog that the project had to block all traffic from Brazil after repeated attempts to mitigate bot traffic failed. GNOME GitLab implemented Iaso's "Anubis" system, requiring browsers to solve computational puzzles before accessing content. GNOME sysadmin Bart Piotrowski shared on Mastodon that only about 3.2 percent of requests (2,690 out of 84,056) passed their challenge system, suggesting the vast majority of traffic was automated. KDE's GitLab infrastructure was temporarily knocked offline by crawler traffic originating from Alibaba IP ranges, according to LibreNews, citing a KDE Development chat. While Anubis has proven effective at filtering out bot traffic, it comes with drawbacks for legitimate users. When many people access the same link simultaneously -- such as when a GitLab link is shared in a chat room -- site visitors can face significant delays. Some mobile users have reported waiting up to two minutes for the proof-of-work challenge to complete, according to the news outlet.
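
For readers unfamiliar with proof-of-work challenges of the kind Anubis issues: the server hands the browser a challenge string and a difficulty, and the browser must find a nonce whose hash clears that difficulty. Verifying a solution takes one hash; producing one takes many, which is cheap for a single visitor but expensive for a crawler making millions of requests. A minimal sketch of the scheme (Anubis's actual protocol, hash construction, and parameters differ):

```python
import hashlib

def solve(challenge: str, difficulty_bits: int) -> int:
    """Brute-force a nonce so SHA-256(challenge:nonce) starts with `difficulty_bits` zero bits."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") >> (256 - difficulty_bits) == 0:
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int, difficulty_bits: int) -> bool:
    """Server-side check: a single hash, regardless of how hard the solve was."""
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") >> (256 - difficulty_bits) == 0

# 12 bits of difficulty means ~4,096 hashes on average to solve, one hash to verify.
challenge = "example-session-token"
nonce = solve(challenge, 12)
print("nonce:", nonce, "valid:", verify(challenge, nonce, 12))
```

Raising the difficulty scales the solver's cost exponentially while the verifier's cost stays constant, which is why the delay mobile users report grows so sharply when operators turn the difficulty up.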

Microsoft

Microsoft's Many Outlooks Are Confusing Users 64

The Register's Richard Speed reports: Baffled by the plethora of Outlook options out there? You aren't alone. Microsoft veteran Scott Hanselman posted a list of some more variants that could be used to do the same thing. It's a problem common to several Microsoft products. A file needs to be opened, but which app should be used? Should it be Outlook New, or Outlook (New)? With tongue firmly in cheek, Hanselman listed some more options: Outlook (Zero Sugar), Outlook (Caffeine Free), and so on. Hanselman, Developer Community veep at Microsoft, also included Outlook '95, although to our mind the peak came with the version of Outlook in Office 97. A happier, more trusting time when security was less important.

While users can create multiple Outlook profiles to store email account details and data locations, Hanselman's post on Bluesky highlights an issue facing many users of Microsoft's software: which incarnation of the application to use. Teams users often find themselves presented with a variety of applications -- Microsoft Teams and Microsoft Teams (Personal), for example, can often appear side by side in the system tray. [...]

There is a cautionary tale about what happened when a soft drinks company tried to replace a well-liked product with a "new" version and renamed the previous preferred version as "classic." The list posted by Hanselman -- who is also notable for tips on managing Microsoft's personal information manager -- is amusing, but also highlights the perils of having multiple, similarly functioning options to do the same thing, and the potential for confusing users.
Businesses

'I Won't Connect My Dishwasher To Your Stupid Cloud' (jeffgeerling.com) 272

A software engineer discovered that his newly purchased Bosch 500 series dishwasher locks basic functionality behind cloud connectivity, reigniting concerns about internet-dependent home appliances. Jeff Geerling found that features like rinse cycle, delayed start and eco mode on his $1,000 dishwasher require connecting to WiFi and creating an account with "Home Connect," Bosch's cloud service.

Geerling criticized the approach as potentially part of planned obsolescence, noting that without a current subscription fee, the company will likely either shutter the service or introduce payments for previously standard features.
AI

OpenAI CEO Altman Says AI Will Lead To Fewer Software Engineers (stratechery.com) 163

OpenAI CEO Sam Altman believes companies will eventually need fewer software engineers as AI continues to transform programming. "Each software engineer will just do much, much more for a while. And then at some point, yeah, maybe we do need less software engineers," Altman told Stratechery.

AI now handles over 50% of code authorship in many companies, Altman estimated, a significant shift that's happened rapidly as large language models have improved. The real paradigm shift is still coming, he said. "The big thing I think will come with agentic coding, which no one's doing for real yet," Altman said, suggesting that the next breakthrough will be AI systems that can independently tackle larger programming tasks with minimal human guidance.

While OpenAI continues hiring engineers for now, Altman recommended that high school graduates entering the workforce "get really good at using AI tools," calling it the modern equivalent of learning to code. "When I was graduating as a senior from high school, the obvious tactical thing was get really good at coding. And this is the new version of that," he said.
AI

AlexNet, the AI Model That Started It All, Released In Source Code Form (zdnet.com) 8

An anonymous reader quotes a report from ZDNet: There are many stories of how artificial intelligence came to take over the world, but one of the most important developments is the emergence in 2012 of AlexNet, a neural network that, for the first time, demonstrated a huge jump in a computer's ability to recognize images. Thursday, the Computer History Museum (CHM), in collaboration with Google, released for the first time the AlexNet source code written by University of Toronto graduate student Alex Krizhevsky, placing it on GitHub for all to peruse and download.

"CHM is proud to present the source code to the 2012 version of Alex Krizhevsky, Ilya Sutskever, and Geoffery Hinton's AlexNet, which transformed the field of artificial intelligence," write the Museum organizers in the readme file on GitHub. Krizhevsky's creation would lead to a flood of innovation in the ensuing years, and tons of capital, based on proof that with sufficient data and computing, neural networks could achieve breakthroughs previously viewed as mainly theoretical.
The Computer History Museum's software historian, Hansen Hsu, published an essay describing how he spent five years negotiating with Google to release the code.
Businesses

Software Maker SAP Becomes Europe's Largest Company (msn.com) 34

An anonymous reader quotes a report from Reuters: German software company SAP overtook Danish healthcare company Novo Nordisk as Europe's largest company by market capitalization on Monday. At 0900 GMT, SAP had a market cap of $340 billion, slightly more than Novo Nordisk, according to Reuters calculations using LSEG Workspace data. SAP is Europe's largest software maker, providing business application software used by companies for finance, sales, supply chain and other functions.

Its shares have surged in recent years, in part due to optimism that its cloud business will be a major beneficiary of recent investment in generative artificial intelligence. While SAP shares are up 7% so far in 2025, underperforming the broader European STOXX 600 index, which is up 8.3% year-to-date, they have clocked a total return of 160% since the end of 2022, far outperforming the STOXX 600's 28%. In contrast, Novo Nordisk shares have underperformed the market in recent months after data from trials of its experimental next-generation obesity drug Cagrisema disappointed investors.

Portables (Apple)

Software Engineer Runs Generative AI On 20-Year-Old PowerBook G4 (macrumors.com) 55

A software engineer successfully ran Meta's Llama 2 generative AI model on a 20-year-old PowerBook G4, demonstrating how well-optimized code can push the limits of legacy hardware. MacRumors' Joe Rossignol reports: While hardware requirements for large language models (LLMs) are typically high, this particular PowerBook G4 model from 2005 is equipped with a mere 1.5GHz PowerPC G4 processor and 1GB of RAM. Despite this 20-year-old hardware, my brother was able to achieve inference with Meta's LLM model Llama 2 on the laptop. The experiment involved porting the open-source llama2.c project, and then accelerating performance with a PowerPC vector extension called AltiVec. His full blog post offers more technical details about the project.
Open Source

FaunaDB Shuts Down But Hints At Open Source Future (theregister.com) 13

FaunaDB, a serverless database combining relational and document features, will shut down by the end of May due to unsustainable capital demands. The company plans to open source its core technology, including its FQL query language, in hopes of continuing its legacy within the developer community. The Register reports: The startup pocketed $27 million in VC funding in 2020 and boasted that 25,000 developers worldwide were using its serverless database. However, last week, FaunaDB announced that it would sunset its database services. FaunaDB said it plans to release an open-source version of its core database technology. The system stores data in JSON documents but retains relational features like consistency, support for joins and foreign keys, and full schema enforcement. Fauna's query language, FQL, will also be made available to the open-source community. "Driving broad based adoption of a new operational database that runs as a service globally is very capital intensive. In the current market environment, our board and investors have determined that it is not possible to raise the capital needed to achieve that goal independently," the leadership team said.

"While we will no longer be accepting new customers, existing Fauna customers will experience no immediate change. We will gradually transition customers off Fauna and are committed to ensuring a smooth process over the next several months," it added.
EU

Is WhatsApp Being Ditched for Signal in Dutch Higher Education? (dub.uu.nl) 42

For weeks Signal has been one of the three most-downloaded apps in the Netherlands, according to a local news site. And now "Higher education institutions in the Netherlands have been looking for an alternative," according to DUB (an independent news site for the Utrecht University community): Employees of the Utrecht University of Applied Sciences (HU) were recently advised to switch to Signal. Avans University of Applied Sciences has also been discussing a switch... The National Student Union is concerned about privacy. The subject was raised at last week's general meeting, as reported by chair Abdelkader Karbache, who said: "Our local unions want to switch to Signal or other open-source software."
Besides being open source, Signal is a non-commercial nonprofit, the article points out — though its proponents suggest there's another big difference. "HU argues that Signal keeps users' data private, unlike WhatsApp." Cybernews.com explains the concern: In an interview with the Dutch newspaper De Telegraaf, Meredith Whittaker [president of the Signal Foundation] discussed the pitfalls of WhatsApp. "WhatsApp collects metadata: who you send messages to, when, and how often. That's incredibly sensitive information," she says.... The only information [Signal] collects is the date an account was registered, the time when an account was last active, and hashed phone numbers... Information like profile name and the people a user communicates with is all encrypted... Metadata might sound harmless, but it couldn't be further from the truth. According to Whittaker, metadata is deadly. "As a former CIA director once said: 'We kill people based on metadata'."
WhatsApp's metadata also includes IP addresses, TechRadar noted last May: Other identifiable data such as your network details, the browser you use, ISP, and other identifiers linked to other Meta products (like Instagram and Facebook) associated with the same device or account are also collected... [Y]our IP can be used to track down your location. As the company explained, even if you keep the location-related features off, IP addresses and other collected information like phone number area codes can be used to estimate your "general location."

WhatsApp is required by law to share this information with authorities during an investigation...

[U]nder scrutiny is how Meta itself uses these precious details for commercial purposes. Again, this is clearly stated in WhatsApp's privacy policy and terms of use. "We may use the information we receive from [other Meta companies], and they may use the information we share with them, to help operate, provide, improve, understand, customize, support, and market our Services and their offerings," reads the policy. This means that yes, your messages are always private, but WhatsApp is actively collecting your metadata to build your digital persona across other Meta platforms...

The article suggests using a VPN with WhatsApp and turning on its "advanced privacy feature" (which hides your IP address during calls) and managing the app's permissions for data collection. "While these steps can help reduce the amount of metadata collected, it's crucial to bear in mind that it's impossible to completely avoid metadata collection on the Meta-owned app... For extra privacy and security, I suggest switching to the more secure messaging app Signal."

The article also includes a cautionary anecdote. "It was exactly a piece of metadata — a Proton Mail recovery email — that led to the arrest of a Catalan activist."

Thanks to long-time Slashdot reader united_notions for sharing the article.
GNU is Not Unix

FSF Holds Live Auction of 'Historically Important' Free Software Memorabilia 6

In 30 minutes the Free Software Foundation holds a live auction of memorabilia to celebrate their upcoming 40th anniversary. "By moving out of the FSF office, we got to sort through all the fun and historically important memorabilia and selected the best ones," they announced earlier — and 25 items will be up for bids. (To participate in the live auction, you must register in advance.)

"This is your chance to get your very own personal souvenir of the FSF," explains an 11-page auction booklet, "from original GNU art to a famous katana and the Internet Hall of Fame medal of the FSF's founder." That's right... a katana. Once upon a time, this 41-inch blade turned heads at the FSF's tech team office. Donated by FSF friends and fans of the XKCD webcomic #225, it became a lighthearted "weapon" in the war for user freedom. As RMS himself is anti-violence, he made a silly joke by examining the katana closely instead of brandishing it, symbolizing that software freedom can be defended with wit. In a legendary photo, this was perceived as if he sniffed the blade. Between the etched dragon on the scabbard and the wavy hamon on the blade, it's as flashy as it is symbolic — especially if you like taking on proprietary software with style (and a dash of humor).
The auction is intended "to entrust some of the historically important free software memorabilia that were in the FSF's office and archive to the free software community instead of locking them away in a storage unit where no one can enjoy them.

"Hopefully, this way some of these unique items will be displayed in galleries or on the walls of free software enthusiasts. All auction proceeds will go towards the FSF's mission to promote computer user freedom."

And speaking of user freedom, here's how they described the Internet Hall of Fame medal: When Richard M. Stallman, the founder of the FSF, was inducted into the Internet Hall of Fame, it was the ultimate nod to free software's immense impact on the Internet... The medal is shiny, and the frame is fancy, but the real radiance is the recognition that the Internet might look much more locked down and dull without those original free software seeds. Hang it on your wall, and you'll be reminded that hacking for user freedom can change the world.
