Robotics

Humanoid Robots Start Sorting Luggage In Tokyo Airport Test Amid Labor Shortage (arstechnica.com) 36

An anonymous reader quotes a report from Ars Technica: Humanoid robots are getting a new gig as baggage handlers and cargo loaders at Tokyo's Haneda Airport -- part of a Japan Airlines experiment to address a human labor shortage as airport visitor numbers have surged in recent years. The demonstration, set to launch in May 2026, could eventually test humanoid robots in a wide range of airport tasks, including cleaning aircraft cabins and possibly handling ground support equipment such as baggage carts, according to a Japan Airlines press release. The trials are scheduled to run until 2028, which suggests that travelers flying into or out of Tokyo may spot some of the robots at work.

[...] Japan Airlines is interested in testing whether humanoid robots powered by some of the latest AI models can adapt more readily to human work environments -- such as airports -- without requiring dedicated workstations or other significant workplace modifications. The airline's subsidiary, JAL Ground Service, has teamed up with GMO AI & Robotics Corporation to oversee the demonstration. The Japanese companies will test the G1 robot and Walker E robot from Chinese companies Unitree Robotics and UBTECH Robotics, according to The Asia Business Daily. Humanoid robots still typically cost tens of thousands of dollars per unit despite Chinese robotics manufacturers scaling up mass production, although the baseline Unitree G1 model costs as little as $13,500.

A new video from an apparently staged demonstration in an aircraft hangar shows one of the humanoid robots tottering up to a large, metal cargo container and making a vague pushing gesture. But the cargo container only begins to move once a human worker starts the conveyor belt to move the container toward the aircraft. Presumably, the robots will need to put in much more effective work if they're to prove as productive as human airport workers. Having robots working directly alongside humans will also introduce new safety considerations for airports like Haneda, Japan's second-largest airport, with flights arriving approximately every two minutes. The first step in the pilot program will involve identifying which airport areas will be safest for humanoid robots.

Iphone

How Will Apple Change Under Its New CEO? (9to5mac.com) 45

How will Apple change in September under its new CEO — former hardware chief John Ternus? The blog Geeky Gadgets is already expecting "significant updates to the iPhone over the next three years," as well as streamlined internal engineering (plus durability enhancements and high-capacity batteries).

2026: Foldable display
2027: Bezel-less iPhone 20 (celebrating the iPhone's 20th anniversary)

CNET's web sites (which include ZDNET, PCMag, Mashable and Lifehacker) are even hosting a contest "to see which of our readers can make the best Apple predictions for 2026. Answer five questions in any of our three rounds of the contest to be entered to win [$applePrize] in September."

But the blog 9to5Mac already has a list of new upcoming Apple products, courtesy of Bloomberg's Mark Gurman (who appeared on the TBPN podcast this week "to talk about Apple's CEO transition, what to expect from John Ternus, and more"). As part of the conversation, Gurman said: "There are six major Apple products in development right now, six major new product categories." Here's the full list he shared:

1. AI AirPods
2. Smart glasses
3. Pendant
4. Smart display
5. Tabletop robot
6. Security camera

[...] Gurman has reported on the Pendant before as a new AI wearable that's an alternative to AI AirPods and Glasses. All three products are expected to rely heavily on a paired iPhone for Siri and other AI features. The smart display ('HomePad'), tabletop robot, and security camera are all brand new Apple Home products.

The AI features arrive "thanks to the revamped Apple Foundation Models trained by Google Gemini," reports the AppleInsider blog (citing Gurman's Power On newsletter at Bloomberg). The smart doorbell camera will include "an Apple Intelligence-upgraded version of the facial recognition already included with HomeKit Secure Video. Today, HSV can utilize the Apple Home admin's tagged faces in their Photos app to label people that are viewed on the camera. When a known person rings the doorbell, Siri will announce them by name over the HomePod chime."
AI

Is AI Cannibalizing Human Intelligence? A Neuroscientist's Way to Stop It (wsj.com) 22

The AI industry is largely failing to ask a key design question, argues theoretical neuroscientist/cognitive scientist Vivienne Ming. Are their AI products building human capacity or consuming it?

In the Wall Street Journal, Ming shares her experiment testing which group performed best at predicting real-world events (compared against forecasters on the prediction market Polymarket) — AI, humans, or human-AI hybrid teams. The human groups performed poorly, relying on instinct or whatever information had come across their feeds that morning. The large AI models — ChatGPT and Gemini, in this case — performed considerably better, though still short of the market itself. But when we combined AI with humans, things got more interesting. Most hybrid teams used AI for the answer and submitted it as their own, performing no better than the AI alone. Others fed their own predictions into AI and asked it to come up with supporting evidence. These "validators" had stumbled into a classic confirmation-bias loop: the sycophancy that leads chatbots to tell you what you want to hear, even if it isn't true. They ended up performing worse than an AI working solo.

But in roughly 5% to 10% of teams, something different emerged. The AI became a sparring partner. The teams pushed back, demanding evidence and interrogating assumptions. When the AI expressed high confidence, the humans questioned it. When the humans felt strongly about an intuition, they asked the AI to come up with a counterargument... These teams reached insightful conclusions that neither a human nor a machine could have produced on its own. They were the only group to consistently rival the prediction market's accuracy. On certain questions, they even outperformed it...

We are building AI systems specifically designed to give us the answer before we feel the discomfort of not having it. What my experiment suggests is that the human qualities most likely to matter are not the feel-good ones. They're the uncomfortable ones: the capacity to be wrong in public and stay curious; to sit with a question your phone could answer in three seconds and resist the urge to reach for it. To read a confident, fluent response from an AI and ask yourself, "What's missing?" rather than default to "Great, that's done." To disagree with something that sounds authoritative and to trust your instinct enough to follow it. We don't build these capacities by avoiding discomfort. We build them by choosing it, repeatedly, in small ways: the student who struggles through a problem before checking the answer; the person who asks a follow-up question in a conversation; the reader who sits with a difficult idea long enough for it to actually change their mind. Most AI chatbots today default to easy answers, which is hurting our ability to think critically.

I call this the Information-Exploration Paradox. As the cost of information approaches zero, human exploration collapses. We see it in students who perform better on AI-assisted tasks and worse on everything afterward. We see it in developers shipping more code and understanding it less. We are, in ways that feel like progress, slowly optimizing ourselves out of the loop.

The author just published a book called "Robot-Proof: When Machines Have All The Answers, Build Better People." They suggest using AI to "explore uncertainty... before you accept an AI's answer, ask it for the strongest argument against itself."

And they're also urging new performance benchmarks for AI-human hybrid teams.
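For the yes/no forecasting questions in Ming's experiment, the standard accuracy metric for probabilistic predictions is the Brier score; the article doesn't name its metric, so that choice, like the forecast numbers below, is an assumption for illustration. A minimal sketch:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between predicted probabilities and binary
    outcomes (1 = event happened, 0 = it didn't). Lower is better:
    0.0 is perfect, and always guessing 50% scores 0.25."""
    assert len(forecasts) == len(outcomes)
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Made-up forecasts for three yes/no questions (the first two resolved yes):
outcomes = [1, 1, 0]
human    = [0.5, 0.5, 0.5]    # pure instinct, hedging everything
ai_only  = [0.7, 0.8, 0.4]
sparring = [0.9, 0.85, 0.1]   # hybrid team that interrogated assumptions

print(brier_score(human, outcomes))     # 0.25
print(brier_score(ai_only, outcomes))   # ~0.097
print(brier_score(sparring, outcomes))  # ~0.014
```

A benchmark like this makes the article's ordering concrete: hedging humans sit at the 0.25 coin-flip baseline, and sharper, better-calibrated hybrid forecasts drive the score toward zero.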
Robotics

Ping-Pong Robot Makes History By Beating Top-Level Human Players (reuters.com) 29

Sony AI's autonomous table-tennis robot Ace has become the first robot to compete against top-level human players. Reuters reports: Ace, created by the AI research division of the Japanese company Sony, is the first robot to attain expert-level performance in a competitive physical sport, one that requires rapid decisions and precision execution, the project's leader said. Ace did so by employing high-speed perception, AI-based control and a state-of-the-art robotic system. There have been various ping-pong-playing robots since 1983, but until now they were unable to rival highly skilled human competitors. Ace changed that with its performances against human elite-level and professional players in matches following the rules of the International Table Tennis Federation, the sport's governing body, and officiated by licensed umpires.

The project's goal was not only to compete at table tennis but to develop insights into how robots can perceive, plan and act with human-like speed and precision in dynamic environments. In matches detailed in the study, Ace in April 2025 won three of five matches against elite players and lost two against professional players, the top skill level in the sport. Sony AI said Ace has since beaten professional players, in December 2025 and again last month.
"The success of Ace, with its perception system and learning-based control algorithm, suggests that similar techniques could be applied to other areas requiring fast, real-time control and human interaction -- such as manufacturing and service robotics, as well as applications across sports, entertainment and safety-critical physical domains," said Peter Durr, director of Sony AI Zurich and leader for Sony AI's project Ace.

The findings have been published in the journal Nature.
Robotics

Robots Beat Human Records At Beijing Half-Marathon (techcrunch.com) 90

An anonymous reader quotes a report from TechCrunch: The winning runner at a Beijing half-marathon for humanoid robots finished the race today in 50 minutes and 26 seconds -- significantly faster than the human world record of 57 minutes recently set by Jacob Kiplimo. [...] [T]he winning time is a massive improvement over last year's race, when the fastest robot finished in two hours and 40 minutes.

The Associated Press reports that this year's winner was built by Chinese smartphone maker Honor. It seems the winning robot wasn't actually the fastest, as a different Honor robot finished in 48 minutes and 19 seconds. But that one was remote controlled -- the 50:26 robot was autonomous and won due to weighted scoring. About 40% of participating robots competed autonomously, while the remaining 60% were remote controlled, according to Beijing's E-Town tech hub. Not all of them did as well as Honor's robots, with one robot falling at the starting line and another hitting a barrier.
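The race's weighted-scoring formula wasn't published; this minimal sketch assumes a flat 10% time discount for autonomous entrants, which is one weighting that would flip the ordering of the two Honor robots' finish times:

```python
AUTONOMY_BONUS = 0.10  # assumed discount for illustration; the real weights are unpublished

def adjusted_seconds(finish_seconds, autonomous):
    """Weighted score: raw finish time, discounted for autonomy. Lower wins."""
    return finish_seconds * (1 - AUTONOMY_BONUS if autonomous else 1.0)

remote = adjusted_seconds(48 * 60 + 19, autonomous=False)  # 48:19, remote-controlled
auto   = adjusted_seconds(50 * 60 + 26, autonomous=True)   # 50:26, autonomous winner
print(auto < remote)  # True: the autonomous robot's adjusted score is lower
```

Under this assumed weighting, the autonomous 50:26 finish scores about 2723 adjusted seconds versus 2899 for the faster remote-controlled run.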

Robotics

Boston Dynamics' Robot Dog Can Now Read Gauges, Spot Spills, and Reason (ieee.org) 91

Boston Dynamics has integrated a Google DeepMind AI model into its robotic dog Spot, giving it more autonomous reasoning for industrial inspections like spotting spills and reading gauges. Spot can also now recognize when to call on other AI tools. IEEE Spectrum reports: Boston Dynamics is one of the few companies to commercially deploy legged robots at any appreciable scale; there are now several thousand hard at work. Today the company is announcing that its quadruped robot Spot is now equipped with Google DeepMind's Gemini Robotics-ER 1.6, a high-level embodied reasoning model that brings usability and intelligence to complex tasks.

[T]he focus of this partnership is on one of the very few applications where legged robots have proven themselves to be commercially viable: inspection. That is, wandering around industrial facilities, checking to make sure that nothing is imminently exploding. With the new AI onboard, Spot is now able to autonomously look for dangerous debris or spills, read complex gauges and sight glasses, and call on tools like vision-language-action models when it needs help understanding what's going on in the environment around it.
"Advances like Gemini Robotics-ER 1.6 mark an important step toward robots that can better understand and operate in the physical world," Marco da Silva, vice president and general manager of Spot at Boston Dynamics, says in a press release. "Capabilities like instrument reading and more reliable task reasoning will enable Spot to see, understand, and react to real-world challenges completely autonomously."

You can watch a demo of Spot's new capabilities on YouTube.
United States

Robot Birds Deployed by Park to Attract Real Birds - Built By High School Students (wyofile.com) 23

"Robotic bird decoys are being deployed at Grand Teton National Park," reports Interesting Engineering, "to influence the behavior of real sage grouse and help restore a declining population." Robotics mentor Gary Duquette describes the machines as "kind of a Frankenbird." (SFGate shows one of the robot birds charging up with a solar panel... "Recorded breeding calls are played at the scene, with clucking and cooing beginning at 5 a.m. each day.")

Duquette builds the birds with a team of high school students, telling WyoFile that at school they "don't really get to experience real-world problems" where failures lurk. So while their robot birds may cost $150 in parts, the practical experience the students get "is priceless." Spikes in the electric currents burned out servo motors as the season of sagebrush serenades loomed, Duquette said. "The kids had to learn the difference between voltage and amperage...." To resolve the problem, the team wired a voltage converter in line with the Arduino controller and other elements on an electronic breadboard. "We pulled through and got it done in time," he said...

A noggin fabricated by a 3D printer tops the robo-grouse. Wyoming Game and Fish staffers in Pinedale supplied grouse wings from hunter surveys, and body feathers came from fly-tying supplies at an angling store. Packaging foam from a Hello Fresh meal kit replicates white breast feathers, accented by yellow air sacs...

The Independent wonders if more national parks would be visited by robot birds... During this year's breeding season, which runs through mid-May, researchers are using trail cameras to track whether real sage grouse respond to the robotic displays and return to the restored lek sites. If successful, officials say similar robotic systems could eventually be used in other national parks facing wildlife management challenges.
AI

Researchers Build a Talking Robot Guide Dog to Help Visually Impaired People Navigate (studyfinds.com) 27

"Only about 2% of visually impaired people in the United States use guide dogs," notes StudyFinds.com, "partly because breeding and training takes years and fewer than half the dogs in training actually graduate."

But someday there could be another option: What if you could ask your guide dog where the nearest water fountain is and hear it answer back, complete with directions and an estimated walk time? Researchers at the State University of New York at Binghamton have built a robotic guide dog that can do something close to that, holding simple back-and-forth conversations about navigation with its handler, describing the surrounding environment, and talking through route options as it leads the way... Their work, presented at the 40th Annual AAAI Conference on Artificial Intelligence, pairs a large language model, a system that understands and generates language, with a navigation planner. Together, the two let the robot understand open-ended requests, suggest destinations, and adjust plans on the fly.
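The pairing the researchers describe — a language model on top of a navigation planner — can be sketched minimally. Everything here is hypothetical (the floor map, walking speed, and reply format are made up, and the LLM's job of mapping an open-ended request onto a goal node is stubbed out); only the planner half is shown:

```python
from collections import deque

# Hypothetical floor map: nodes are landmarks, edge weights are meters.
FLOOR = {
    "entrance": {"hallway": 10},
    "hallway": {"entrance": 10, "water fountain": 6, "elevator": 12},
    "water fountain": {"hallway": 6},
    "elevator": {"hallway": 12},
}
WALK_SPEED_MPS = 1.0  # assumed handler walking speed

def plan_route(start, goal):
    """Breadth-first search for a fewest-hops path; returns (path, meters)."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            meters = sum(FLOOR[a][b] for a, b in zip(path, path[1:]))
            return path, meters
        for nxt in FLOOR[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None, None

def answer(goal, start="entrance"):
    """Spoken reply with directions and an estimated walk time. In the real
    system an LLM would first parse 'where's the nearest water fountain?'
    into the goal node passed in here."""
    path, meters = plan_route(start, goal)
    if path is None:
        return "I can't find a route there."
    minutes = meters / WALK_SPEED_MPS / 60
    return f"Go via {' -> '.join(path)}: about {meters} m, roughly {minutes:.1f} min."

print(answer("water fountain"))
# Go via entrance -> hallway -> water fountain: about 16 m, roughly 0.3 min.
```

The division of labor is the point: the language model handles open-ended phrasing and suggestions, while a deterministic planner guarantees the route and timing it reads back.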
Thanks to Slashdot reader fjo3 for sharing the article.
AI

OpenAI Calls For Robot Taxes, Public Wealth Fund, and 4-Day Workweek To Tackle AI Disruption 118

OpenAI is proposing (PDF) sweeping policy changes to help manage the societal disruption caused by advanced AI, including taxes on automated labor, a public wealth fund, and experiments with a four-day workweek. The company said the policy document offered a series of "initial ideas" to address the risk of "jobs and entire industries being disrupted" by the adoption of AI tools. Business Insider reports: Among the core policy suggestions is a public wealth fund, which would see lawmakers and AI companies work together to invest in long-term assets linked to the AI boom, with returns distributed directly to citizens. Another is that the government should encourage and incentivize employers to experiment with four-day workweeks with no loss in pay and offer "benefits bonuses" tied to productivity gains from new AI tools.

The policy document also suggests lawmakers modernize the tax system and shift the tax base to corporate income and capital gains, rather than relying on labor income and payroll taxes that could be hit by a wave of AI-powered job losses. It also recommends taxes related to automated labor. OpenAI also called for the accelerated expansion of the US's electricity grid, which is already feeling the strain from a wave of data center construction and energy demand for training ever more powerful AI models.
Sci-Fi

'Project Hail Mary': Real Space Science, Real Astrophotography (wcvb.com) 71

Project Hail Mary has now grossed $300.8 million globally after earning another $54.1 million this weekend from 86 markets, reports Variety, noting that after just nine days it's now Amazon MGM's highest-grossing film ever.

And last weekend it had the best opening for a "non-franchise" movie in three years, adds the Associated Press — the best since 2023's Oppenheimer: Project Hail Mary, which cost nearly $200 million to produce... is on an enviable trajectory. Its second weekend hold was even better than that of Oppenheimer, which collected $46.7 million in its follow-up frame.
But the movie is based on a book by The Martian author Andy Weir, described by one news outlet as "a former software engineer and self-proclaimed 'lifelong space nerd'... known for his realistic and clear-eyed approach to scientifically technical stories." Project Hail Mary has plenty of real science in it, whether it be space mathematics, physics, or astrobiology... The film's namesake project even comprises the space programs of other nations, such as Roscosmos from Russia, the Chinese space program, and the European Space Agency...

The story relies on work NASA has done regarding exoplanets, or planets outside our solar system... [This includes a nearby star named Tau Ceti, approximately 12 light-years from Earth, which is orbited by four planets — two once thought to be in "the habitable zone," where liquid water can exist.] Tau Ceti has long been a setting used by sci-fi authors and storytellers. Isaac Asimov used it for his Robot series. Arthur C. Clarke's "Rama" spacecraft came across a mysterious tetrahedron in the Tau Ceti system. Authors Ursula K. Le Guin and Kim Stanley Robinson also set stories there, and it serves as the extrasolar setting of the 1968 Jane Fonda film Barbarella. Most recently, the Bungie video game Marathon is set in the far-off system, serving as part of the background story for the extraction shooter, about a large-scale plan to colonize the Tau Ceti system.

The movie also mentions 40 Eridani A, according to the article, a real star about 16 light-years away that was said to be orbited by the fictional planet Vulcan, home to Star Trek's Mr. Spock. It's also mentioned in Frank Herbert's Dune as the star system of the planets Ix and Richese ("noted for their machine culture and miniaturisation," according to the Stellar Australis site's "Project Dune" page).

And in a video on IMAX's YouTube channel, the film's directors explain how for a crucial scene they used non-visible-light photography, which is also an important part of modern astronomy. "Even the credits incorporate real astrophotography into the final moments," the article points out, using the work of award-winning Australian astrophotographer Rod Prazeres. "The only difference between his work of capturing space data in images and what ended up on the big screen was that he gave them 'starless versions' of his photographs to make it easier to place credit text over them."

Prazeres wrote on his web site that he was touched the producers "wanted the real thing... In a world where CGI and AI are everywhere, it meant a lot..."
Robotics

This Friendly Robot Just Installed 100 MW of Solar Power (electrek.co) 55

Utility-scale solar construction... by robots! It's "one of the largest real-world demonstrations," notes Electrek, with 100 MW of capacity installed by the "Maximo" robots from AES, one of the world's top power companies.

Maximo uses AI "to automate the heavy lifting of solar panels and accelerate solar installation," according to their web page, which shows a video of Maximo at work installing a vast field of solar panels in Kern County, California. With assistance from Nvidia, the Maximo team could "develop, test and refine robotic capabilities through physics-based simulation and AI driven modeling before deploying updates in the field," reports Electrek, and they're aiming for a full GW of solar generating capacity: After completing the first half of the Bellefield complex last summer, Maximo engineers went into a higher gear, with the latest version 3.0 robots consistently surpassing an installation rate of one module per minute, with construction crews installing as many as 24 solar panel modules per hour, per person. If that sounds fast, that's because it is. At full tilt, the latest Maximo robot-equipped crews have nearly doubled the output of traditional installation methods at similar solar locations throughout Southern California.

"Reaching 100 MW is an important milestone for Maximo and for the role robotics can play in solar construction," explains Chris Shelton, president of Maximo. "It demonstrates that field robotics can move beyond experimentation and deliver consistent results at utility scale. As solar deployment continues to accelerate globally, technologies that improve installation speed, quality and reliability will become increasingly important...."

Like just about every other business that demands a high degree of physical labor, the construction industry is facing huge labor shortages, making machines like Maximo that provide real efficiency gains welcome additions to the job site.

"The combination of AI, vision, robotics and simulation driven engineering reduced development and validation timelines," the Maximo team said in a statement, "and increased confidence in field performance as the robotic fleet scaled."
Robotics

Melania Trump Welcomes Humanoid Robot At White House Summit 94

Longtime Slashdot reader theodp writes: In Melania and the Robot, the New York Times reports on First Lady Melania Trump's inaugural Fostering the Future Together Coalition Summit, which brought together international leaders, First Spouses from around the world, tech leaders, educators, and nonprofits to collaborate on practical solutions that expand access to educational tools while strengthening protections for children in digital environments (Day 2 WH summary). The Times begins:

"On Wednesday, Mrs. Trump appeared at the White House alongside Figure 3, a humanoid, A.I.-powered robot whose uses, according to the company that makes it, include fetching towels, carrying groceries and serving champagne. But Mrs. Trump joins tech executives and some researchers in envisioning a world beyond robot butlery. She is interested in how these robots could cut it as educators. Both clad in shades of white, the first lady and the visiting robot walked into a gathering of first spouses from around the world, a group that included Sara Netanyahu of Israel, Olena Zelenska of Ukraine, and Brigitte Macron of France. The dulcet tones from a (presumably human) military orchestra played as the first lady and her guest entered the event. Both lady and robot extolled the virtues of further integrating robots into the educational and social lives of children. In the history of modern first-lady initiatives, which have included building a national book festival (Laura Bush), reshuffling the food pyramid (Michelle Obama) and advocating for free community college (Jill Biden), Mrs. Trump's involvement of a humanoid robot in education policy was a first."

"Figure 3 delivered brief remarks and delivered salutations in several languages. With its sleek black-and-white appearance, Figure 3 would fit right in with the first lady's branding aesthetic, which includes a self-titled coffee table book and movie, not least because the name "MELANIA" was emblazoned on the side of its glossy plastic head. After Figure 3 teetered gingerly away, Mrs. Trump looked around the room and told them that the future looked a lot like what they had just witnessed. 'The future of A.I. is personified,' she told her audience. 'It will be formed in the shape of humans. Very soon artificial intelligence will move from our mobile phones to humanoids that deliver utility.' She invited her guests to envision a future in which a robot philosopher educated children."
AI

Canada's Immigration Rejected Applicant Based On AI-Invented Job Duties (thestar.com) 73

New submitter haroldbasset writes: Canada's Immigration Department rejected an applicant because the duties of her current job did not match the Canadian work experience she had claimed, but the Department's AI assistant had invented that work experience. She has been working in Canada as a health scientist -- she has a Ph.D. in the immunology of aging -- but the AI genius instead described her as "wiring and assembling control circuits, building control and robot panels, programming and troubleshooting." "It's believed to be the first time that the department explicitly referred to the use of generative AI to support application processing in immigration refusals," reports the Toronto Star. "The disclaimer also noted that all generated content was verified by an officer and that generative AI was not used to make or recommend a decision."

The applicant's lawyer was shocked at "how any human being could make this decision." "Somehow, it hallucinated my client's job description," he said. "I would love to see what the officer saw. Something seriously went wrong here."

The applicant's refusal came just as Canada's Immigration Department released its first AI strategy, which frames artificial intelligence as a way to improve efficiency, service delivery, and program integrity. The department says it has long used digital tools like analytics and automation to flag fraud risks and triage applications, and is now also experimenting with generative AI for tasks such as research, summarizing, and analysis. In this case, however, the department insisted the decision was made by a human officer and that generative AI was not involved in the final decision.
Transportation

Trapped! Inside a Self-Driving Car During an Anti-Robot Attack (seattletimes.com) 139

A man crossing the street one San Francisco night spotted a self-driving car — and decided to confront its passenger, 37-year-old tech worker Doug Fulop. The New York Times reports the man yelled that "he wanted to kill Fulop and the other two passengers for giving money to a robot." A taxi driver would have simply driven away. But Fulop's vehicle had no driver — it was a self-driving Waymo... Self-driving cars are designed to stop moving if a person is nearby. People can take advantage of that function to harass and threaten their passengers.... It was unsettling to be trapped inside a Waymo during an attack, Fulop said. "If he had kept hammering on one window instead of alternating, I'm sure he would have eventually broken through," he said. The attacker did not appear to be on drugs or otherwise impaired, but seemed to be overtaken by extreme anger at the self-driving car, Fulop said.

It did not seem safe to get out and run, he added, since the man was trying to open the locked doors and said he wanted to kill the passengers. They called 911 and Waymo's support line, Fulop said. Waymo told them that it would not manually direct the car away if someone was standing nearby, and that the passengers would be OK with the doors locked. The car's software does not allow riders to jump into the driver's seat and take over during an incident. The attack lasted around six minutes. By then, bystanders had begun cheering on the man, Fulop said. That distracted the man, who moved far enough away from the car that it could finally drive away...

Fulop said he had stopped using Waymo for a time after the January attack and would avoid the service at night unless the company changed its policy of not intervening when a hostile person threatened riders. "As passengers, we deserve more safety than that if someone is trying to attack us," he said. "This can't be the policy to be trapped there."

The article recalls other incidents — including a 2024 video showing three women screaming as their autonomous taxi is spray-painted by vandals. And technology author and speaker Anders Sorman-Nilsson says that in Los Angeles, five men on e-bikes surrounded his Waymo and forced it to stop. The author felt safe inside the vehicle, according to the Times, which adds: "He felt reassured knowing that Waymo's many exterior cameras were recording the men. After around five minutes, he said, they gave up and rode away."
Robotics

Amazon Plans to Test Four-Legged Robots on Wheels for Deliveries (cnbc.com) 20

CNBC reports: Amazon has acquired Rivr, a Swiss robotics company developing machines for "doorstep delivery," the company confirmed Thursday... It announced the deal in a notice sent to third-party delivery contractors... "We believe this technology, when working alongside your [delivery associates], has the potential to further improve safety outcomes and the overall customer experience, particularly in the last steps of the delivery process...." In its notice to delivery service partner owners, Amazon said Rivr's technology, which includes a four-legged robot on wheels, will allow it to research and test how the devices can be integrated into delivery operations, including "helping [delivery associates] carry packages from delivery vehicles to customer doorsteps."
Earth

'Pokemon Go' Players Unknowingly Trained Delivery Robots With 30 Billion Images 57

More than 30 billion images captured by Pokemon Go players have helped train a visual mapping system developed by Niantic. The technology is now being used to guide delivery robots from Coco Robotics through city streets where GPS often struggles. Popular Science reports: This week, Niantic Spatial, part of the team behind Pokemon Go, announced a partnership with Coco Robotics, a company that makes short-distance delivery robots for food and groceries. Soon, those robot couriers will scoot around sidewalks using Niantic's Visual Positioning System (VPS) -- a navigation tool that can reportedly pinpoint location down to a few centimeters just by looking at nearby buildings and landmarks. Niantic trained that VPS model on more than 30 billion images captured by Pokemon Go users, and claims it will help robots operate in areas where GPS falls short. [...]

Instead of helping users navigate the way that GPS does, VPS determines where someone is based on their surroundings. That makes Pokemon Go particularly useful as a data source, because players had to physically travel to specific locations and point their phones at various angles. That mapping effort got a significant boost in 2020, when the app added what it called "Field Research," a feature prompting players to scan real-world statues and landmarks with their cameras in exchange for in-game rewards. A portion of the data also reportedly came from areas known as "Pokemon battle arenas." Whether players knew it or not, those scans were creating 3D models of the real world that would eventually power the Niantic model. More data means better accuracy, and because Niantic was collecting images of the same locations from many different users, it could capture the same spots across varying weather conditions, lighting, angles, and heights. [...]

The idea is that Coco's robots can use VPS and four cameras mounted around the machine to get a far more precise read on their surroundings. In turn, the well-equipped robot will deliver food on time. On a broader level, Niantic says its partnership with Coco Robotics is part of a longer-term effort to build a "living map" of the world that updates as new data becomes available. Once VPS-equipped delivery robots hit the streets, they will collect even more info that can be fed back into the model to bolster its accuracy further. This kind of continuous, real-world data collection is already central to how self-driving vehicle companies like Waymo and Tesla operate, and is a large part of why that technology has improved so significantly in recent years.
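The retrieval idea behind a system like VPS can be illustrated with a toy sketch. This is a hedged illustration of image-retrieval localization in general, not Niantic's actual pipeline: images are assumed to be reduced to embedding vectors, and a query image is located by matching it against a database of previously mapped, geo-tagged embeddings (here both are just random stand-in data).

```python
import numpy as np

# Illustrative sketch of retrieval-based visual positioning.
# All data here is synthetic; a real system would use learned
# image embeddings and real survey coordinates.
rng = np.random.default_rng(0)

# "Map": unit-normalized embeddings of previously scanned scenes,
# each tagged with 2D coordinates from the crowd-sourced survey.
map_embeddings = rng.normal(size=(1000, 128))
map_embeddings /= np.linalg.norm(map_embeddings, axis=1, keepdims=True)
map_coords = rng.uniform(-100, 100, size=(1000, 2))

def localize(query_embedding):
    """Return the coordinates of the map entry most similar to the query."""
    q = query_embedding / np.linalg.norm(query_embedding)
    similarities = map_embeddings @ q          # cosine similarity to every map entry
    best = int(np.argmax(similarities))        # nearest neighbor in embedding space
    return map_coords[best]

# A robot's camera view near map entry 42: its embedding plus small noise.
query = map_embeddings[42] + rng.normal(scale=0.01, size=128)
estimate = localize(query)
```

The point of the sketch is why more coverage helps: the denser and more varied the mapped embeddings (different weather, lighting, angles), the more likely a noisy query still lands nearest its true location.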
Robotics

Qualcomm's New Arduino Ventuno Q Is an AI-Focused Computer Designed For Robotics (engadget.com) 25

Qualcomm and Arduino have unveiled the Arduino Ventuno Q, a new AI-focused single-board computer built for robotics and edge systems. Engadget reports: Called the Arduino Ventuno Q, it uses Qualcomm's Dragonwing IQ8 processor along with a dedicated STM32H5 low-latency microcontroller (MCU). "Ventuno Q is engineered specifically for systems that move, manipulate and respond to the physical world with precision and reliability," the company wrote on the product page. The Ventuno Q is more sophisticated (and expensive) than Arduino's usual AIO boards, thanks to the Dragonwing IQ8 processor, which includes an 8-core Arm Cortex CPU, an Adreno 623 GPU and a Hexagon Tensor NPU that can hit up to 40 TOPS. It also comes with 16GB of LPDDR5 RAM, along with 64GB of eMMC storage and an M.2 NVMe Gen 4 slot to expand that. Other features include Wi-Fi 6, Bluetooth 5.3, 2.5Gbps Ethernet and USB camera support.

The Ventuno Q includes Arduino App Lab, with pre-trained AI models including LLMs, VLMs, ASR, gesture recognition, pose estimation and object tracking, all running offline. It's designed for AI systems that run entirely offline, like smart kiosks, healthcare assistants and traffic flow analysis, along with edge AI vision and sensing systems. It also supports a full robotics stack, combining vision processing with deterministic motor control for precise sensing and manipulation. It's also well suited to education and research in areas like computer vision, generative AI and prototyping at the edge, according to Arduino.
Further reading: Up Next for Arduino After Qualcomm Acquisition: High-Performance Computing
Robotics

Could Home-Building Robots Help Fix the Housing Crisis? (cnn.com) 120

CNN reports on a company called Automated Architecture (AUAR) which makes "portable" micro-factories that use a robotic arm to produce wooden framing for houses (the walls, floors and roofs): Co-founder Mollie Claypool says the micro-factories will be able to produce the panels quicker, cheaper and more precisely than a timber framing crew, freeing up carpenters to focus on the construction of the building... The micro-factory fits into a shipping container which is sent to the building site along with an operator. Inside the factory, a robotic arm measures, cuts and nails the timber into panels up to 22 feet (6.7 meters) long, keeping gaps for windows and doors, and drilling holes for the wiring and plumbing. The contractor then fits the panels by hand.

One micro-factory can produce the panels for a typical house in about a day — a process which, according to Claypool, would take a normal timber framing crew four weeks — and is able to produce framing for buildings up to seven stories tall... She says their service is 30% cheaper than a standard timber framing crew, and up to 15% cheaper than buying panels from large factories and shipping them to a site... She adds that the precision of the micro-factories means that the panels fit together tightly, reducing the heat loss of the final home, making them more energy efficient.

AUAR currently has three micro-factories operating in the US and EU, with five more set to be delivered this year... AUAR has raised £7.7 million ($10.3 million) to date, and is expanding into the US, where a lack of housing and preference for using wood makes it a large potential market.

There are other companies producing wooden or modular housing components, the article points out. But despite the automation, the company's co-founder insists to CNN that "Automation isn't replacing jobs. Automation is filling the gap." The UK's Construction Industry Training Board found that the country will need 250,000 more workers by 2028 to meet building targets, yet in 2023 more people left the industry than joined it.
Medicine

Robotic Surgery Performed Remotely on Patient 1,500 Miles Away (bbc.com) 30

"A surgeon in London says he has performed the UK's first long-distance robotic operation," reports the BBC, "on a patient located 1,500 miles (2,400km) away..." Leading robotic urological surgeon Professor Prokar Dasgupta said it felt "almost as if I was there" as he carried out a prostate removal on [62-year-old] Paul Buxton... It is hoped that remote robotic surgery could spare future patients the "vast expense and inconvenience" of travelling for treatment, and help deliver better healthcare to people in more remote locations... Buxton had expected to be put on an NHS waiting list after receiving a shock prostate cancer diagnosis just after Christmas, but he "jumped at the chance" to be the first patient to undergo the treatment remotely as part of a trial. "A lot of people actually said to me: 'You're not going to do it, are you?'

"I thought, I'm giving something back here," he said...

The operation was performed from The London Clinic using a robot equipped with a 3D HD camera and four arms, all controlled through a console with a delay of only 0.06 seconds. The console in the UK was connected to the robot in Gibraltar via fibre-optic cables, with a backup 5G link. A team in Gibraltar remained on standby in case the connection failed, but it held throughout the procedure...

Dasgupta will perform the procedure again on 14 March, which will be live-streamed to 20,000 world-leading urological surgeons at the European Association of Urology congress. He added: "I think it is very, very exciting, the humanitarian benefit is going to be significant."

The U.K.'s National Health Service "is prioritising local robotic-assisted surgery," the article points out, "aiming for 500,000 robot-supported operations a year by 2035."

Thanks to Slashdot reader fjo3 for sharing the article.
Robotics

OpenAI's Former Research Chief Raises $70M to Automate Manufacturing With AI (msn.com) 22

"OpenAI's former chief research officer is raising $70 million for a new startup building an AI and software platform to automate manufacturing," reports the Wall Street Journal, citing "people familiar with the matter.

"Arda, the new startup co-founded by Bob McGrew, is raising at a valuation of $700 million, according to people familiar with the matter...." Arda is developing an AI and software platform, including a video model that can analyze footage from factory floors and use it to train robots to run factories autonomously, the people said. The company's software will coordinate machines and humans across the entire production process, from product design and manufacturability to finished goods coming off the line.

The startup's goal is to make manufacturing cost-effective in the West, reducing reliance on China as geopolitical and national security concerns rise... At OpenAI, McGrew was tasked with training robots to do tasks in the physical world, according to his LinkedIn. McGrew was also one of the earliest employees at Palantir.
