Space

Conflicting Values For Hubble Constant Not Due To Measurement Error, Study Finds (arstechnica.com) 64

Jennifer Ouellette reports via Ars Technica: Astronomers have made new measurements of the Hubble Constant, a measure of how quickly the Universe is expanding, by combining data from the Hubble Space Telescope and the James Webb Space Telescope. Their results confirmed the accuracy of Hubble's earlier measurement of the constant's value, according to their recent paper published in The Astrophysical Journal Letters, with implications for a long-standing discrepancy in values obtained by different observational methods known as the "Hubble tension."

There was a time when scientists believed the Universe was static, but that changed with Albert Einstein's general theory of relativity. Alexander Friedmann published a set of equations showing that the Universe might actually be expanding in 1922, with Georges Lemaitre later making an independent derivation to arrive at that same conclusion. Edwin Hubble confirmed this expansion with observational data in 1929. Prior to this, Einstein had been trying to modify general relativity by adding a cosmological constant in order to get a static universe from his theory; after Hubble's discovery, legend has it, he referred to that effort as his biggest blunder.
The article notes how scientists have employed different methods to calculate the Hubble Constant, including observing nearby celestial objects, analyzing gravitational waves from cosmic events, and examining the Cosmic Microwave Background (CMB). However, these approaches yield differing values, highlighting the challenge in pinning down the constant precisely. A recent effort involved making additional observations of Cepheid variable stars, correlating them with the Hubble data. The results further confirmed the accuracy of the Hubble data.

"We've now spanned the whole range of what Hubble observed, and we can rule out a measurement error as the cause of the Hubble Tension with very high confidence," said co-author and team leader Adam Riess, a physicist at Johns Hopkins University. "Combining Webb and Hubble gives us the best of both worlds. We find that the Hubble measurements remain reliable as we climb farther along the cosmic distance ladder. With measurement errors negated, what remains is the real and exciting possibility that we have misunderstood the Universe."
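The distance ladder at the heart of this result feeds a simple relation: the Hubble Constant is the slope of recession velocity versus distance (v = H0 × d). As a rough illustration only — the galaxy distances and velocities below are invented for demonstration and are not from the SH0ES data — the fit reduces to a least-squares slope through the origin:

```python
# Illustrative sketch (not the actual SH0ES pipeline): H0 is estimated as
# the slope of recession velocity (km/s) versus distance (Mpc).
# The data points below are invented, chosen to sit near ~73 km/s/Mpc.
distances_mpc = [20.0, 50.0, 110.0, 230.0, 400.0]
velocities_kms = [1460.0, 3650.0, 8030.0, 16790.0, 29200.0]

def hubble_constant(distances, velocities):
    """Least-squares slope through the origin for v = H0 * d."""
    numerator = sum(d * v for d, v in zip(distances, velocities))
    denominator = sum(d * d for d in distances)
    return numerator / denominator

h0 = hubble_constant(distances_mpc, velocities_kms)
print(f"H0 = {h0:.1f} km/s/Mpc")
```

The real tension is that ladder-based values like this cluster near 73 km/s/Mpc while CMB-based inferences land near 67 — and the new result argues the gap isn't a measurement artifact.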
Space

US Intelligence Officer Explains Roswell, UFO Sightings (cnn.com) 43

CNN's national security analyst interviewed a U.S. intelligence officer who worked on the newly-released Defense report debunking UFO sightings — physicist Sean Kirkpatrick. He tells CNN "about two to five percent" of UFO reports are "truly anomalous."

But CNN adds that "he thinks explanations for that small percentage will most likely be found right here on Earth..." This is how Kirkpatrick and his team explain the Roswell incident, which plays a prominent role in UFO lore. That's because, in 1947, a U.S. military news release stated that a flying saucer had crashed near Roswell Army Air Field in New Mexico. A day later, the Army retracted the story and said the crashed object was a weather balloon. Newspapers ran the initial saucer headline, followed up with the official debunking, and interest in the case largely died down. Until 1980, that is, when a pair of UFO researchers published a book alleging that alien bodies had been recovered from the Roswell wreckage and that the U.S. government had covered up the evidence.

Kirkpatrick says his office dug deep into the Roswell incident and found that in the late 1940s and early 1950s, there were a lot of things happening near the Roswell Airfield. There was a spy program called Project Mogul, which launched long strings of oddly shaped metallic balloons. They were designed to monitor Soviet nuclear tests and were highly secret. At the same time, the U.S. military was conducting tests with other high-altitude balloons that carried human test dummies rigged with sensors and zipped into body-sized bags for protection against the elements. And there was at least one military plane crash nearby with 11 fatalities.

Echoing earlier government investigations, Kirkpatrick and his team concluded that the crashed Mogul balloons, the recovery operations to retrieve downed test dummies and glimpses of the charred aftermath of that real plane crash likely combined into a single false narrative about a crashed alien spacecraft...

Since 2020, the Pentagon has standardized, de-stigmatized and increased the volume of reporting on UFOs by the U.S. military. Kirkpatrick says that's the reason the closely covered and widely-mocked Chinese spy balloon was spotted in the first place last year. The incident shows that the U.S. government's policy of taking UFOs seriously is actually working.

The pattern keeps repeating. "Kirkpatrick says, his investigation found that most UFO sightings are of advanced technology that the U.S. government needs to keep secret, of aircraft that rival nations are using to spy on the U.S. or of benign civilian drones and balloons." ("What's more likely?" asked Kirkpatrick. "The fact that there is a state-of-the-art technology that's being commercialized down in Florida that you didn't know about, or we have extraterrestrials?")

But the greatest irony may be that "stories about these secret programs spread inside the Pentagon, got embellished and received the occasional boost from service members who'd heard rumors about or caught glimpses of seemingly sci-fi technology or aircraft. And Kirkpatrick says his investigators ultimately traced this game of top-secret telephone back to fewer than a dozen people... [F]or decades, UFO true believers have been telling us there's a U.S. government conspiracy to hide evidence of aliens. But — if you believe Kirkpatrick — the more mundane truth is that these stories are being pumped up by a group of UFO true believers in and around government."
Open Source

OpenTTD (Unofficial Remake of 'Transport Tycoon Deluxe' Game) Turns 20 (openttd.org) 17

In 1995 Scottish video game designer Chris Sawyer created the business simulator game Transport Tycoon Deluxe — and within four years, Wikipedia notes, work began on an open-source remake that's still being actively developed. "According to a study of the 61,154 open-source projects on SourceForge in the period between 1999 and 2005, OpenTTD ranked as the 8th most active open-source project to receive patches and contributions. In 2004, development moved to their own server."

Long-time Slashdot reader orudge says he's been involved for almost 25 years. "Exactly 21 years ago, I received an ICQ message (look it up, kids) out of the blue from a guy named Ludvig Strigeus (nicknamed Ludde)." "Hello, you probably don't know me, but I've been working on a project to clone Transport Tycoon Deluxe for a while," he said, more or less... Ludde made more progress with the project [written in C] over the coming year, and it looks like we even attempted some multiplayer games (not too reliable, especially over my dial-up connection at the time). Eventually, when he was happy with what he had created, he agreed to allow me to release the game as open source. Coincidentally, this happened exactly a year after I'd first spoken to him, on the 6th March 2004...

Things really got going after this, and a community started to form with enthusiastic developers fixing bugs, adding in new features, and smoothing off the rough edges. Ludde was, I think, a bit taken aback by how popular it proved, and even rejoined the development effort for a while. A read through the old changelogs reveals just how many features were added over a very short period of time. Quick wins like higher vehicle limits came in very quickly, and support for TTDPatch's NewGRF format started to be functional just four months later. Large maps, improved multiplayer, better pathfinders, improved TTDPatch compatibility, and of course, ports to a great many different operating systems, such as Mac OS X, BeOS, MorphOS and OS/2. It was a very exciting time to be a TTD fan!

Within six years, ambitious projects to create free replacements for the original TTD graphics, sounds and music sets were complete, and OpenTTD finally had its 1.0 release. And while we may not have the same frantic addition of new features we had in 2004, there have still been massive improvements to the code, with plenty of exciting new features over the years, with major releases every year since 2008. The move to GitHub in 2018 and the release of OpenTTD on Steam in 2021 have also re-energised development efforts, with thousands of people now enjoying playing the game regularly. And development shows no signs of slowing down, with the upcoming OpenTTD 14.0 release including over 40 new features!

"Personally, I would like to say thank you to everyone who has supported OpenTTD development over the past two decades..." they write, adding "Finally, of course, I'd like to thank you, the players! None of us would be here if people weren't still playing the game.

"Seeing how the first twenty years have gone, I can't wait to see what the next twenty years have in store. :)"
Security

Linux Variants of Bifrost Trojan Evade Detection via Typosquatting (darkreading.com) 19

"A 20-year-old Trojan resurfaced recently," reports Dark Reading, "with new variants that target Linux and impersonate a trusted hosted domain to evade detection." Researchers from Palo Alto Networks spotted a new Linux variant of the Bifrost (aka Bifrose) malware that uses a deceptive practice known as typosquatting to mimic a legitimate VMware domain, which allows the malware to fly under the radar. Bifrost is a remote access Trojan (RAT) that's been active since 2004 and gathers sensitive information, such as hostname and IP address, from a compromised system.

There has been a worrying spike in Bifrost Linux variants during the past few months: Palo Alto Networks has detected more than 100 instances of Bifrost samples, which "raises concerns among security experts and organizations," researchers Anmol Murya and Siddharth Sharma wrote in the company's newly published findings.

Moreover, there is evidence that cyberattackers aim to expand Bifrost's attack surface even further, using a malicious IP address associated with a Linux variant hosting an ARM version of Bifrost as well, they said... "As ARM-based devices become more common, cybercriminals will likely change their tactics to include ARM-based malware, making their attacks stronger and able to reach more targets."
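The typosquatting trick described above — a malicious domain that mimics a legitimate one — can be screened for with a simple edit-distance check. A minimal sketch, with the caveat that the domains below are made up for illustration and are not the actual Bifrost infrastructure reported by Palo Alto Networks:

```python
# Minimal typosquatting screen (illustrative only; the domains below are
# invented, not from the Bifrost report). A candidate domain within a
# small edit distance of a trusted domain -- but not equal to it -- is
# flagged as a likely look-alike.
def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def looks_like_typosquat(candidate: str, trusted: str, max_dist: int = 2) -> bool:
    return candidate != trusted and edit_distance(candidate, trusted) <= max_dist

print(looks_like_typosquat("vmfare.com", "vmware.com"))   # True: one letter off
print(looks_like_typosquat("example.org", "vmware.com"))  # False: unrelated
```

Real detection pipelines go further (homoglyphs, keyboard-adjacency models, certificate transparency monitoring), but the core idea is the same: near misses against trusted names are suspicious.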

Anime

Akira Toriyama, Creator of Dragon Ball Manga Series, Dies Aged 68 (theguardian.com) 40

Longtime Slashdot reader AmiMoJo shares a report from The Guardian: Akira Toriyama, the influential Japanese manga artist who created the Dragon Ball series, has died at the age of 68. He died on March 1 from an acute subdural haematoma. The news was confirmed by Bird Studio, the manga company that Toriyama founded in 1983.

"It's our deep regret that he still had several works in the middle of creation with great enthusiasm," the studio wrote in a statement. "Also, he would have many more things to achieve." The studio remembered his "unique world of creation". "He has left many manga titles and works of art to this world," the statement read. "Thanks to the support of so many people around the world, he has been able to continue his creative activities for over 45 years." [...]

Based on an earlier work titled Dragon Boy, Dragon Ball was serialized in 519 chapters in Weekly Shonen Jump from 1984 to 1995 and birthed a blockbuster franchise including an English-language comic book series, five distinct television adaptations -- with Dragon Ball Z the most familiar to western audiences -- and spin-offs, over 20 different films and a vast array of video games. The series -- a kung fu take on the shonen (or young adult) manga genre -- drew from Chinese and Hong Kong action films as well as Japanese folklore. It introduced audiences to the now-instantly familiar Son Goku -- a young martial arts trainee searching for seven magical orbs that will summon a mystical dragon -- as well as his ragtag gang of allies and enemies.
You can learn more about Toriyama via the Dragon Ball Wiki.

The Associated Press, Washington Post, and New York Times, among others, have all reported on his passing.
Social Networks

Threads' API Is Coming in June (techcrunch.com) 17

In 2005 Gabe Rivera was a compiler software engineer at Intel — before starting the tech-news aggregator Techmeme. And last year his Threads profile added the words "This is a little self-serving, but I want all social networks to be as open as possible."

On Friday, Threads engineer Jesse Chen posted that it was Rivera's post asking for an API, made when Threads launched, that "convinced us to go for it." And Techmeme just made its first post using the API, according to Chen. The Verge reports: Threads plans to release its API by the end of June after testing it with a limited set of partners, including Hootsuite, Sprinklr, Sprout Social, Social News Desk, and Techmeme. The API will let developers build third-party apps for Threads and allow sites to publish directly to the platform.
More from TechCrunch: Engineer Jesse Chen posted that the company has been building the API for the past few months. The API currently allows users to authenticate, publish threads and fetch the content they post through these tools. "Over the past few months, we've been building the Threads API to enable creators, developers, and brands to manage their Threads presence at scale and easily share fresh, new ideas with their communities from their favorite third-party applications," he said...

The engineer added that Threads is looking to add more capabilities to APIs for moderation and insights gathering.

Space

The Desert Planet In 'Dune' Is Plausible, According To Science (sciencenews.org) 51

The desert planet Arrakis in Frank Herbert's science fiction novel Dune is plausible, says Alexander Farnsworth, a climate modeler at the University of Bristol in England. According to Science News, the world would be a harsh place for humans to live, and they probably wouldn't have to worry about getting eaten by extraterrestrial helminths. From the report: For their Arrakis climate simulation, which you can explore at the website Climate Archive, Farnsworth and colleagues started with the well-known physics that drive weather and climate on Earth. Using our planet as a starting point makes sense, Farnsworth says, partly because Herbert drew inspiration for Arrakis from "some sort of semi-science of looking at dune systems on the Earth itself." The team then added nuggets of information about the planet from details in Herbert's novels and in the Dune Encyclopedia. According to that intel, the fictional planet's atmosphere is similar to Earth's with a couple of notable differences. Arrakis has less carbon dioxide in the atmosphere than Earth -- about 350 parts per million on the desert planet compared with 417 parts per million on Earth. But Dune has far more ozone in its lower atmosphere: 0.5 percent of the gases in the atmosphere compared to Earth's 0.000001 percent.

All that extra ozone is crucial for understanding the planet. Ozone is a powerful greenhouse gas, about 65 times as potent at warming the atmosphere as carbon dioxide is, when measured over a 20-year period. "Arrakis would certainly have a much warmer atmosphere, even though it has less CO2 than Earth today," Farnsworth says. In addition to warming the planet, so much ozone in the lower atmosphere could be bad news. "For humans, that would be incredibly toxic, I think, almost fatal if you were to live under such conditions," Farnsworth says. People on Arrakis would probably have to rely on technology to scrub ozone from the air. Of course, ozone in the upper atmosphere could help shield Arrakis from harmful radiation from its star, Canopus. (Canopus is a real star also known as Alpha Carinae. It's visible in the Southern Hemisphere and is the second brightest star in the sky. Unfortunately for Dune fans, it isn't known to have planets.) If Arrakis were real, it would be located about as far from Canopus as Pluto is from the sun, Farnsworth says. But Canopus is a large white star calculated to be about 7,200 degrees Celsius. "That's significantly hotter than the sun," which runs about 2,000 degrees cooler, Farnsworth says. But "there's a lot of supposition and assumptions they made in here, and whether those are accurate numbers or not, I can't say."

The climate simulation revealed that Arrakis probably wouldn't be exactly as Herbert described it. For instance, in one throwaway line, the author described polar ice caps receding in the summer heat. But Farnsworth and colleagues say it would be far too hot at the poles, about 70° C during the summer, for ice caps to exist at all. Plus, there would be too little precipitation to replenish the ice in the winter. High clouds and other processes would warm the atmosphere at the poles and keep it warmer than lower latitudes, especially in the summertime. Although Herbert's novels have people living in the midlatitudes and close to the poles, the extreme summer heat and bone-chilling -40° C to -75° C temperatures in the winters would make those regions nearly unlivable without technology, Farnsworth says. Temperatures in Arrakis' tropical latitudes would be relatively more pleasant at 45° C in the warmest months and about 15° C in colder months. On Earth, high humidity in the tropics makes it far warmer than at the poles. But on Arrakis, "most of the atmospheric moisture was essentially removed from the tropics," making even the scorching summers more tolerable. The poles are where clouds and the paltry amount of moisture gather and heat the atmosphere. But the tropics on Arrakis pose their own challenges. Hurricane force winds would regularly sandblast inhabitants and build dunes up to 250 meters tall, the researchers calculate. It doesn't mean people couldn't live on Arrakis, just that they'd need technology and lots of off-world support to bring in food and water, Farnsworth says. "I'd say it's a very livable world, just a very inhospitable world."
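To get a feel for why so little CO2 can coexist with so much warming, it helps to put the article's ozone figures side by side with its CO2 figures. The back-of-envelope below uses only the numbers quoted above; note that real radiative forcing does not scale linearly with concentration, so this gives a sense of scale, not a climate result:

```python
# Back-of-envelope comparison using the figures quoted in the article.
# Caveat: greenhouse warming is NOT linearly additive in concentration,
# so the "CO2-equivalent" here is a crude illustration only.
earth_co2_ppm = 417.0      # Earth's atmospheric CO2 (article figure)
arrakis_co2_ppm = 350.0    # Arrakis' atmospheric CO2 (article figure)

# Convert the quoted percentages of the atmosphere to parts per million.
earth_ozone_ppm = 0.000001 / 100 * 1_000_000    # 0.000001% of atmosphere
arrakis_ozone_ppm = 0.5 / 100 * 1_000_000       # 0.5% of atmosphere
ozone_potency = 65.0       # x CO2 warming over a 20-year period (article)

print(f"Arrakis ozone: {arrakis_ozone_ppm:,.0f} ppm "
      f"vs Earth's {earth_ozone_ppm:.2f} ppm")
print(f"Naive CO2-equivalent of Arrakis' ozone: "
      f"{arrakis_ozone_ppm * ozone_potency:,.0f} ppm "
      f"(vs its {arrakis_co2_ppm:.0f} ppm of actual CO2)")
```

Even this crude scaling makes the point: Arrakis' 5,000 ppm of low-level ozone utterly dwarfs its modest CO2 deficit relative to Earth.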

Wikipedia

Rogue Editors Started a Competing Wikipedia That's Only About Roads (gizmodo.com) 57

An anonymous reader quotes a report from Gizmodo: For 20 years, a loosely organized group of Wikipedia editors toiled away curating a collection of 15,000 articles on a single subject: the roads and highways of the United States. Despite minor disagreements, the US Roads Project mostly worked in harmony, but recently, a long-simmering debate over the website's rules drove this community to the brink. Efforts at compromise fell apart. There was a schism, and in the fall of 2023, the editors packed up their articles and moved over to a website dedicated to roads and roads alone. It's called AARoads, a promised land where the editors hope, at last, that they can find peace. "Roads are a background piece. People drive on them every day, but they don't give them much attention," said editor Michael Gronseth, who goes by Imzadi1979 on Wikipedia, where he dedicated his work to Michigan highways, specifically. But a road has so much to offer if you look beyond the asphalt. It's the nexus of history, geography, travel, and government, a seemingly perfect subject for the hyper-fixations of Wikipedia. "But there was a shift about a year ago," Gronseth said. "More editors started telling us that what we're doing isn't important enough, and we should go work on more significant topics." [...]

The Roads Project had a number of adversaries, but the chief rival is a group known as the New Page Patrol, or the NPP for short. The NPP has a singular mission. When a new page goes up on Wikipedia, it gets reviewed by the NPP. The Patrol has special editing privileges and if a new article doesn't meet the website's standards, the NPP takes it down. "There's a faction of people who feel that basically anything is valid to be published on Wikipedia. They say, 'Hey, just throw it out there! Anything goes.' That's not where I come down," said Bil Zeleny, a former member of the NPP who goes by onel5969 on Wikipedia, a reference to the unusual spelling of his first name. At his peak, Zeleny said he was reviewing upwards of 100,000 articles a year, and he rejected a lot of articles about roads during his time. After years of frustration, Zeleny felt he was seeing too many new road articles that weren't following the rules -- entire articles that cited nothing other than Google Maps, he said. Enough was enough. Zeleny decided it was time to bring the subject to the council.

Zeleny brought up the problem on the NPP discussion forum, sparking months of heated debate. Eventually, the issue became so serious that some editors proposed an official policy change on the use of maps as a source. Rule changes require a process called "Request for Comment," where everyone is invited to share their thoughts on the issue. Over the course of a month, Wikipedia users had written more than 56,000 words on the subject. For reference, that's about twice as long as Ernest Hemingway's novel The Old Man and the Sea. In the end, the roads project was successful. The vote was decisive, and Wikipedia updated its "No Original Research" policy to clarify that it's ok to cite maps and other visual sources. But this, ultimately, was a victory with no winners. "Some of us felt attacked," Gronseth said. On the US Roads Project's Discord channel, a different debate was brewing. The website didn't feel safe anymore. What would happen at the next request for comment? The community decided it was time to fork. "We don't want our articles deleted. It didn't feel like we had a choice," he said.

The Wikipedia platform is designed for interoperability. If you want to start your own Wiki, you can split off and take your Wikipedia work with you, a process known as "forking." [...] Over the course of several months, the US Roads Project did the same. Leaving Wikipedia was painful, but the fight that drove the roads editors away was just as difficult for people on the other side. Some editors embroiled in the roads fights deleted their accounts, though none of these ex-Wikipedians responded to Gizmodo's requests for comment. Bil Zeleny was among the casualties. After almost six years of hard work on the New Page Patrol, he reached the breaking point. The controversy had pushed him too far, and Zeleny resigned from the NPP. [...] AARoads actually predates Wikipedia, tracing its origins all the way back to the prehistoric internet days of the year 2000, complete with articles, maps, forums, and a collection of over 10,000 photos of highway signs and markers. When the US Roads Project needed a new home, AARoads was happy to oblige. It's a beautiful resource. It even has backlinks to relevant non-roads articles on the regular Wikipedia. But for some, it isn't home.
"There are members who disagree with me, but my ultimate goal is to fork back," said Gronseth. "We made our articles license-compatible, so they can be exported back to Wikipedia someday if that becomes an option. I don't want to stay separate. I want to be part of the Wikipedia community. But we don't know where things will land, and for now, we've struck out on our own."
AI

AI-Generated Articles Prompt Wikipedia To Downgrade CNET's Reliability Rating (arstechnica.com) 54

Wikipedia has downgraded tech website CNET's reliability rating following extensive discussions among its editors regarding the impact of AI-generated content on the site's trustworthiness. "The decision reflects concerns over the reliability of articles found on the tech news outlet after it began publishing AI-generated stories in 2022," adds Ars Technica. Futurism first reported the news. From the report: Wikipedia maintains a page called "Reliable sources/Perennial sources" that includes a chart featuring news publications and their reliability ratings as viewed from Wikipedia's perspective. Shortly after the CNET news broke in January 2023, Wikipedia editors began a discussion thread on the Reliable Sources project page about the publication. "CNET, usually regarded as an ordinary tech RS [reliable source], has started experimentally running AI-generated articles, which are riddled with errors," wrote a Wikipedia editor named David Gerard. "So far the experiment is not going down well, as it shouldn't. I haven't found any yet, but any of these articles that make it into a Wikipedia article need to be removed." After other editors agreed in the discussion, they began the process of downgrading CNET's reliability rating.

As of this writing, Wikipedia's Perennial Sources list currently features three entries for CNET broken into three time periods: (1) before October 2020, when Wikipedia considered CNET a "generally reliable" source; (2) between October 2020 and present, when Wikipedia notes that the site was acquired by Red Ventures in October 2020, "leading to a deterioration in editorial standards" and saying there is no consensus about reliability; and (3) between November 2022 and January 2023, when Wikipedia considers CNET "generally unreliable" because the site began using an AI tool "to rapidly generate articles riddled with factual inaccuracies and affiliate links."

Futurism reports that the issue with CNET's AI-generated content also sparked a broader debate within the Wikipedia community about the reliability of sources owned by Red Ventures, such as Bankrate and CreditCards.com. Those sites published AI-generated content around the same period of time as CNET. The editors also criticized Red Ventures for not being forthcoming about where and how AI was being implemented, further eroding trust in the company's publications. This lack of transparency was a key factor in the decision to downgrade CNET's reliability rating.
A CNET spokesperson said in a statement: "CNET is the world's largest provider of unbiased tech-focused news and advice. We have been trusted for nearly 30 years because of our rigorous editorial and product review standards. It is important to clarify that CNET is not actively using AI to create new content. While we have no specific plans to restart, any future initiatives would follow our public AI policy."
Open Source

'Paying People To Work on Open Source is Good Actually' 40

Jacob Kaplan-Moss, one of the lead developers of Django, writes in a long post that he says has come from a place of frustration: [...] Instead, every time a maintainer finds a way to get paid, people show up to criticize and complain. Non-OSI licenses "don't count" as open source. Someone employed by Microsoft is "beholden to corporate interests" and not to be trusted. Patreon is "asking for handouts." Raising money through GitHub sponsors is "supporting Microsoft's rent-seeking." VC funding means we're being set up for a "rug pull" or "enshittification." Open Core is "bait and switch."

None of this is hypothetical; each of these examples are actual things I've seen said about maintainers who take money for their work. One maintainer even told me he got criticized for selling t-shirts! Look. There are absolutely problems with every tactic we have to support maintainers. It's true that VC investment comes with strings attached that often lead to problems down the line. It sucks that Patreon or GitHub (and Stripe) take a cut of sponsor money. The additional restrictions imposed by PolyForm or the BSL really do go against the Freedom 0 ideal. I myself am often frustrated by discovering that some key feature I want out of an open core tool is only available to paid licensees.

But you can criticize these systems while still supporting and celebrating the maintainers! Yell at A16Z all you like, I don't care. (Neither do they.) But yelling at a maintainer because they took money from a VC is directing that anger in the wrong direction. The structural and societal problems that make all these different funding models problematic aren't the fault of the people trying to make a living doing open source. It's like yelling at someone for shopping at Dollar General when it's the only store they have access to. Dollar General's predatory business model absolutely sucks, as do the governmental policies that lead to food deserts, but none of that is on the shoulders of the person who needs milk and doesn't have alternatives.
United States

Wildfires Threaten Nuclear Weapons Plant In Texas (independent.co.uk) 68

An anonymous reader quotes a report from The Independent: Wildfires sweeping across Texas briefly forced the evacuation of America's main nuclear weapons facility as strong winds, dry grass and unseasonably warm temperatures fed the blaze. Pantex Plant, the main facility that assembles and disassembles America's nuclear arsenal, shut down its operations on Tuesday night as the Windy Deuce fire roared towards the Potter County location. Pantex re-opened and resumed operations as normal on Wednesday morning. Pantex is about 17 miles (27.36 kilometers) northeast of Amarillo and some 320 miles (515 kilometers) northwest of Dallas. Since 1975 it has been the US's main assembly and disassembly site for its atomic bombs. It assembled the last new bomb in 1991. "We have evacuated our personnel, non-essential personnel from the site, just in an abundance of caution," said Laef Pendergraft, a spokesperson for National Nuclear Security Administration's Production Office at Pantex. "But we do have a well-equipped fire department that has trained for these scenarios, that is on-site and watching and ready should any kind of real emergency arise on the plant site."
Open Source

Cloudflare Makes Pingora Rust Framework Open-Source (phoronix.com) 5

Michael Larabel reports via Phoronix: Back in 2022 Cloudflare announced they were ditching Nginx for an in-house, Rust-written software called Pingora. Today Cloudflare open-sourced that framework under an Apache 2.0 license. Pingora is a Rust async multi-threaded framework for building programmable network services. It has long been used internally within Cloudflare and is capable of sustaining a great deal of traffic; it is now being open-sourced to help build infrastructure outside of Cloudflare. The Pingora Rust code is available on GitHub.
Power

Are Corporate Interests Holding Back US Electrical Grid Expansion? (ieee.org) 133

Long-time Slashdot reader BishopBerkeley writes: Though it does not come as much of a surprise, a new study highlighted in IEEE Spectrum delves into how corporate profit motives are preventing the upgrading and the expansion of the U.S. electrical grid. The full report can be downloaded here from the source [the nonprofit economic research group NBER].

Besides opening up the market to competition, utilities don't want to lose control over regional infrastructure, writes IEEE Spectrum. "[I]nterregional lines threaten utility companies' dominance over the nation's power supply. In the power industry, asset ownership provides control over rules that govern energy markets and transmission service and expansion. When upstart entities build power plants and transmission lines, they may be able to dilute utility companies' control over power-industry rules and prevent utilities from dictating decisions about transmission expansion."

The article begins by noting that "The United States is not building enough transmission lines to connect regional power networks. The deficit is driving up electricity prices, reducing grid reliability, and hobbling renewable-energy deployment." Utilities can stall transmission expansion because out-of-date laws sanction these companies' sweeping control over transmission development... One of the main values of connecting regional networks is that it enables, and is in fact critical for, incorporating renewable energy... Plus, adding interregional transmission for renewables can significantly reduce costs for consumers. Such connections allow excess wind and solar power to flow to neighboring regions when weather conditions are favorable and allow the import of energy from elsewhere when renewables are less productive.

Even without renewables, better integrated networks generally lower costs for consumers because they reduce the amount of generation capacity needed overall and decrease energy market prices. Interregional transmission also enhances reliability, particularly during extreme weather...

Addressing the transmission shortage is on the agenda in Washington, but utility companies are lobbying against reforms.

The article points out that now investors and entrepreneurs "are developing long-distance direct-current lines, which are more efficient at moving large amounts of energy over long distances, compared with AC," and also "sidestep the utility-dominated transmission-expansion planning processes."

They're already in use in China, and are also becoming Europe's preferred choice...
Moon

Moon Landing's Payloads Include Archive of Human Knowledge, Lunar Data Center Test, NFTs (medium.com) 75

In 2019 a SpaceX Falcon 9 rocket launched an Israeli spacecraft carrying a 30-million page archive of human civilization to the moon. Unfortunately, that spacecraft crashed. But thanks to this week's moon landing by the Odysseus, there's now a 30-million page "Lunar Library" on the moon — according to a Medium post by the Arch Mission Foundation.

"This historic moment secures humanity's cultural heritage and knowledge in an indestructible archive built to last for up to billions of years." Etched onto thin sheets of nickel, called NanoFiche, the Lunar Library is practically indestructible and can withstand the harsh conditions of space... Some of the notable content includes:


The Wikipedia. The entire English Wikipedia containing over 6 million articles on every branch of knowledge.
Project Gutenberg. Portions of Project Gutenberg's library of over 70,000 free eBooks containing some of our most treasured literature.
The Long Now Foundation's Rosetta Project archive of over 7,000 human languages and The Panlex datasets.
Selections from the Internet Archive's collections of books and important documents and data sets.
The SETI Institute's Earthling Project, featuring a musical compilation of 10,000 vocal submissions representing humanity united.
The Arch Lunar Art Archive containing a collection of works from global contemporary and digital artists in 2022, recorded as NFTs.
David Copperfield's Magic Secrets — the secrets to all his greatest illusions — including how he will make the Moon disappear in the near future.
The Arch Mission Primer — which teaches a million concepts with images and words in 5 languages.
The Arch Mission Private Library — containing millions of pages as well as books, documents and articles on every subject, including a broad range of fiction and non-fiction, textbooks, periodicals, audio recordings, videos, historical documents, software sourcecode, data sets, and more.
The Arch Mission Vaults — private collections, including collections from our advisors and partners, and a collection of important texts and images from all the world's religions including the great religions and indigenous religions from around the world, collections of books, photos, and a collection of music by leading recording artists, and much more content that may be revealed in the future...


We also want to recognize our esteemed advisors, and our many content partners and collections including the Wikimedia Foundation, the Long Now Foundation, The SETI Institute Earthling Project, the Arch Lunar Art Archive project, Project Gutenberg, the Internet Archive, and the many donors who helped make the Lunar Library possible through their generous contributions. This accomplishment would not have happened without the collaborative support of so many...

We will continue to send backups of our important knowledge and cultural heritage — placing them on the surface of the Earth, in caves and deep underground bunkers and mines, and around the solar system as well. This is a mission that continues as long as humanity endures, and perhaps even long after we are gone, as a gift for whoever comes next.

Space.com has a nice rundown of the other new payloads that just landed on the moon. Some highlights:
  • "Cloud computing startup Lonestar's Independence payload is a lunar data center test mission for data storage and transmission from the lunar surface."
  • LRA is a small hemisphere of light-reflectors built to serve as a precision landmark to "allow spacecraft to ping it with lasers to help them determine their precise distance..."
  • ROLSES is a radio spectrometer for measuring the electron density near the lunar surface, "and how it may affect radio observatories, as well as observing solar and planetary radio waves and other phenomena."
  • "Artist Jeff Koons is sending 125 miniature stainless steel Moon Phase sculptures, each honoring significant human achievements across cultures and history, to be displayed on the moon in a cube."
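The ranging technique the LRA enables is basic laser time-of-flight: a spacecraft fires a pulse at the reflector, times the echo, and halves the round-trip light time to get the one-way distance. A minimal sketch in Python (the function name and the sample timing value are illustrative, not from the mission):

```python
# Time-of-flight laser ranging, as used with retroreflectors like the LRA:
# distance = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_round_trip(t_round_trip_s: float) -> float:
    """One-way distance in meters from a laser pulse's round-trip time."""
    return C * t_round_trip_s / 2.0

# A pulse returning after ~2.56 s has traveled to the Moon and back,
# giving a one-way distance of roughly 3.84e8 m.
moon_distance_m = range_from_round_trip(2.56)
```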

United States

US Court Stalls Energy Dept Demand For Cryptocurrency Mining Data (semafor.com) 103

"Crypto mines will have to start reporting their energy use in the U.S.," wrote the Verge in January, saying America's Energy department would "begin collecting data on crypto mines' electricity use, following criticism from environmental advocates over how energy-hungry those operations are."

But then "constitutional freedoms" group New Civil Liberties Alliance (founded with seed money from the Charles Koch Foundation) objected. And "on behalf of its clients" — the Texas Blockchain Council and Colorado bitcoin mining company Riot Platforms — the group said it "looks forward to derailing the Department of Energy's unlawful data collection effort once and for all."

While America's Energy department said the survey would take 30 minutes to complete, the complaint argued it would take 40 hours. According to the judge, the complaint "alleged three main sources of irreparable injury..."

- Nonrecoverable costs of compliance with the Survey
- A credible threat of prosecution if they do not comply with the Survey
- The disclosure of proprietary information requested by the Survey, thus risking disclosure of sensitive business strategy

But more importantly, the survey was implemented under "emergency" provisions, which the judge said is only appropriate when "public harm is reasonably likely to result if normal clearance procedures are followed."

Or, as Semafor.com puts it, the complaint was "seeking to push off the reporting deadline, on the grounds that the survey was rushed through...without a public comment period." The judge, Alan Albright, granted the request late Friday night, blocking the [U.S. Energy Information Administration] from collecting survey data or requiring bitcoin companies to respond to it, at least until a more comprehensive injunction hearing scheduled for Feb. 28. The ruling also concludes that the plaintiffs are "likely to succeed in showing that the facts alleged by the U.S. Energy Information Administration to support an emergency request fall far short of justifying such an action."
The U.S. Department of Energy is now...
  • Restrained from requiring Plaintiffs or their members to respond to the Survey
  • Restrained from collecting data required by the Survey
  • "...and shall sequester and not share any such data that Defendants have already received from Survey respondents."

Thanks to long-time Slashdot reader schwit1 for sharing the news.


Books

Darwin Online Has Virtually Reassembled the Naturalist's Personal Library 24

Jennifer Ouellette reports via Ars Technica: Famed naturalist Charles Darwin amassed an impressive personal library over the course of his life, much of which was preserved and cataloged upon his death in 1882. But many other items were lost, including more ephemeral items like unbound volumes, pamphlets, journals, clippings, and so forth, often only vaguely referenced in Darwin's own records. For the last 18 years, the Darwin Online project has painstakingly scoured all manner of archival records to reassemble a complete catalog of Darwin's personal library virtually. The project released its complete 300-page online catalog -- consisting of 7,400 titles across 13,000 volumes, with links to electronic copies of the works -- to mark Darwin's 215th birthday on February 12.

"This unprecedentedly detailed view of Darwin's complete library allows one to appreciate more than ever that he was not an isolated figure working alone but an expert of his time building on the sophisticated science and studies and other knowledge of thousands of people," project leader John van Wyhe of the National University of Singapore said. "Indeed, the size and range of works in the library makes manifest the extraordinary extent of Darwin's research into the work of others."
Games

How One Developer Earned Over $300K From Games Made in 30 Minutes (theguardian.com) 70

An anonymous reader shares a report: "The first one, I'll be honest, probably took seven or eight hours," says TJ Gardner. "But the subsequent ones -- Stroke the Beaver, for example -- would have taken about half an hour." Gardner is the creator of the "Stroke" video games, available to download from the PlayStation Store for $4 a pop. Each one features a different animal -- cats, dogs and hamsters, along with less cuddly creatures such as snakes and fish -- and they all follow the same blueprint.

When you start the game, an image of the animal appears against a plain blue background. In the top left-hand corner of the screen are the words "Strokes 0." You press X to stroke the animal. The animal flashes briefly. The number in the corner goes up by 1. After 25 strokes, you are rewarded with a bronze trophy. Keep going until you hit 2,000 strokes, and you will receive a platinum award. That's it. There is no animation; there are no sound effects. Just a picture of an animal under a Creative Commons licence from Wikipedia, and some lo-fi acoustic beats looping endlessly in the background. No running, no jumping, no guns, no baddies, no special moves or power-ups or puzzles. Are the Stroke games even video games at all?
The Stroke games, launched in September 2022, have been downloaded more than 120,000 times, amassing nearly $350,000 in sales. Sony takes a 30% cut for hosting the games on the PlayStation Store, leaving Gardner with a pre-tax profit of about $240,000.
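The games' entire mechanic, as described above, reduces to a counter with fixed trophy thresholds. A minimal Python sketch (only the bronze and platinum tiers the article mentions are modeled; all names are illustrative):

```python
BRONZE_AT = 25      # bronze trophy after 25 strokes
PLATINUM_AT = 2000  # platinum award after 2,000 strokes

def trophies_earned(strokes: int) -> list[str]:
    """Return the trophies unlocked after a given number of strokes."""
    earned = []
    if strokes >= BRONZE_AT:
        earned.append("bronze")
    if strokes >= PLATINUM_AT:
        earned.append("platinum")
    return earned

# The revenue figures are consistent with the stated 30% store cut:
gross_sales = 350_000
developer_share = gross_sales * (1 - 0.30)  # ~245,000 -- "about $240,000" pre-tax
```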
Open Source

VC Firm Sequoia Capital Begins Funding More Open Source Fellowships (techcrunch.com) 15

By 2022 the VC firm Sequoia Capital had about $85 billion in assets under management, according to Wikipedia. Its successful investments include Google, Apple, PayPal, Zoom, and Nvidia.

And now the VC firm "plans to fund up to three open source software developers annually," according to TechCrunch, which notes it is "a continuation of a program it debuted last year." The Silicon Valley venture capital firm announced the Sequoia Open Source Fellowship last May, but it was initially offered on an invite-only basis with a single recipient to shout about so far. Moving forward, Sequoia is inviting developers to apply for a stipend that will cover their costs for up to a year so they can work full-time on the project — without giving up any equity or ownership.... "The open source world is to some extent divided between the projects that can be commercialized and the projects that are very important, very influential, but just simply can't become companies," said Sequoia partner Bogomil Balkansky. "For the ones that can become great companies, we at Sequoia have a long track record of partnering with them and we will continue partnering with those founders and creators."

And this is why Sequoia is making two distinct financial commitments to two different kinds of open source entities, using grants to support foundational projects that might be instrumental to one of the companies it's taking a direct equity stake in. "In order for Sequoia to succeed, and for our portfolio of companies that we partner with to succeed, there is this vital category of open source developer work that must be supported in order for the whole ecosystem to work well," Balkansky added. From today, Sequoia said it will accept applications from "any developer" working on an open source project, with considerations made on a "rolling basis" moving forward. Funding will include living expenses paid through monthly installments lasting up to a year, allowing the developer to focus entirely on the project without worrying about how to put food on the table.

Spotify, Salesforce and even Bloomberg have launched their own grant programs too, the article points out.

"But these various funding initiatives have little to do with pure altruism. The companies ponying up the capital typically identify the open source software they rely on most, and then allocate funds accordingly..."
AI

Thanks to Machine Learning, Scientists Finally Recover Text From The Charred Scrolls of Vesuvius (sciencealert.com) 45

The great libraries of the ancient classical world are "legendary... said to have contained stacks of texts," writes ScienceAlert. But from Rome to Constantinople, Athens to Alexandria, only one collection survived to the present day.

And here in 2024, "we can now start reading its contents." A worldwide competition to decipher the charred texts of the Villa of Papyri — an ancient Roman mansion destroyed by the eruption of Mount Vesuvius — has revealed a timeless infatuation with the pleasures of music, the color purple, and, of course, the zingy taste of capers. The so-called Vesuvius challenge was launched a few years ago by computer scientist Brent Seales at the University of Kentucky with support from Silicon Valley investors. The ongoing 'master plan' is to build on Seales' previous work and read all 1,800 or so charred papyri from the ancient Roman library, starting with scrolls labeled 1 to 4.

In 2023, the annual gold prize was awarded to a team of three students, who recovered four passages containing 140 characters — the longest extractions yet. The winners are Youssef Nader, Luke Farritor, and Julian Schilliger. "After 275 years, the ancient puzzle of the Herculaneum Papyri has been solved," reads the Vesuvius Challenge Scroll Prize website. "But the quest to uncover the secrets of the scrolls is just beginning...." Only now, with the advent of X-ray tomography and machine learning, can their inky words be pulled from the darkness of carbon.

A few months ago students deciphered a single word — "purple," according to the article. But "That winning code was then made available for all competitors to build upon." Within three months, passages in Latin and Greek were blooming from the blackness, almost as if by magic. The team with the most readable submission at the end of 2023 included both previous finders of the word 'purple'. Their unfurling of scroll 1 is truly impressive and includes more than 11 columns of text. Experts are now rushing to translate what has been found. So far, about 5 percent of the scroll has been unrolled and read to date. It is not a duplicate of past work, scholars of the Vesuvius Challenge say, but a "never-before-seen text from antiquity."

One line reads: "In the case of food, we do not right away believe things that are scarce to be absolutely more pleasant than those which are abundant."

Thanks to davidone (Slashdot reader #12,252) for sharing the article.
Crime

Zeus, IcedID Malware Kingpin Faces 40 Years In Prison (theregister.com) 39

Connor Jones reports via The Register: A Ukrainian cybercrime kingpin who ran some of the most pervasive malware operations faces 40 years in prison after spending nearly a decade on the FBI's Cyber Most Wanted List. Vyacheslav Igorevich Penchukov, 37, pleaded guilty this week in the US to two charges related to his leadership role in both the Zeus and IcedID malware operations that netted millions of dollars in the process. Penchukov's plea will be seen as the latest big win for US law enforcement in its continued fight against cybercrime and those that enable it. However, authorities took their time getting him in 'cuffs. [...]

"Malware like IcedID bleeds billions from the American economy and puts our critical infrastructure and national security at risk," said US attorney Michael Easley for the eastern district of North Carolina. "The Justice Department and FBI Cyber Squad won't stand by and watch it happen, and won't quit coming for the world's most wanted cybercriminals, no matter where they are in the world. This operation removed a key player from one of the world's most notorious cybercriminal rings. Extradition is real. Anyone who infects American computers had better be prepared to answer to an American judge."

This week, he admitted one count of conspiracy to commit a Racketeer Influenced and Corrupt Organizations (RICO) Act offense relating to Zeus, and one count of conspiracy to commit wire fraud in relation to IcedID. Each count carries a maximum sentence of 20 years. His sentencing date is set for May 9, 2024.
Zeus malware, a banking trojan that formed a botnet for financial theft, caused over $100 million in losses before its 2014 dismantlement. Its successor, SpyEye, incorporated enhanced features for financial fraud. Despite the 2014 takedown of Zeus, Penchukov moved on to lead IcedID, a similar malware first found in 2017. IcedID evolved from banking fraud to ransomware, severely affecting the University of Vermont Medical Center in 2020 with over $30 million in damages.

Slashdot Top Deals