Anker's Eufy 3-in-1 E20 robot vacuum is $150 off right now, as part of Amazon’s Big Spring Sale. That brings the price down to $400, which is a record low. For those averse to using Amazon, the deal is also available directly from the company.
The E20 made our list of the best robot vacuums, and with good reason. We loved the hybrid functionality, as this robovac quickly transforms into a cordless stick vacuum and a handheld unit. This in no way impedes the overall performance, as we found the automatic cleaning mode to be top-tier.
The self-emptying base also holds a lot, considering its compact size. We praised the proprietary app in our official review, as editing room maps is both quick and easy. All told, it only took the robovac ten minutes to scoot around the house and create an accurate map.
The suction power of the robotic unit is strong enough for major cleaning tasks, but the same cannot be said of the stick vacuum attachment. The power is on the weaker side. Also, it doesn’t come with a wall mount for the stick vacuum. That costs extra, to the tune of around $30. Today’s savings more than makes up for that.
This article originally appeared on Engadget at https://www.engadget.com/deals/ankers-eufy-3-in-1-e20-robot-vacuum-is-150-off-during-the-amazon-big-spring-sale-140020547.html?src=rss
If you’re looking for a budget-friendly robot vacuum that can handle both vacuuming and mopping, iRobot’s Roomba Combo Essential just hit its lowest price ever. Thanks to the Amazon Spring Sale, it’s down to $149 from $275, matching the lowest price we’ve seen. It previously dipped to $149 during the holiday season and earlier this year, but it’s unclear how long this deal will stick around this time.
As you can see in our roundup of the best budget robot vacuums, we’ve consistently rated iRobot’s machines highly for their reliability and ease of use. The Roomba Combo Essential is a simple, no-frills option that both vacuums and mops, making it a solid pick for small apartments, dorm rooms or anyone who wants a cleaner floor without spending a fortune.
The vacuum uses special multi-surface brushes to pick up dirt, dust and pet hair from hard floors and carpets. Unlike some budget models that struggle with transitions, this one automatically adjusts to different surfaces, so you won’t have to worry about it getting stuck. When it’s time to mop, the built-in mopping pad wipes down hard floors, tackling light spills and everyday messes. It’s not as advanced as iRobot’s higher-end models with precision scrubbing, but it’s a convenient way to keep your floors looking fresh with minimal effort.
One of the iRobot Roomba Combo Essential's most convenient features is its auto-adjusting cleaning power — the robot increases suction when it detects extra debris, so it's more effective in high-traffic areas like entryways or around pet bowls. It also has cliff sensors to prevent it from tumbling down stairs and a low-profile design that helps it slip under some couches and other furniture for a more thorough clean.
Despite it being an entry-level robot vacuum, the iRobot Roomba Combo Essential comes with app control and voice assistant support, so you can set cleaning schedules and initiate cleaning whether you’re at home lounging on the couch or away on vacation. For $150, this is a solid deal for an iRobot machine that can vacuum and mop, especially considering its usual $275 price tag. If you’ve been thinking about automating some of your floor cleaning, this is one of the most affordable ways to do it.
This article originally appeared on Engadget at https://www.engadget.com/deals/amazon-spring-sale-vacuum-deals-this-irobot-2-in-1-vacuum-and-mop-is-still-on-sale-for-149-123058025.html?src=rss
Ukrainian deminers use small robots like the one pictured above to remove Russian anti-personnel mines.
Jake Epstein/Business Insider
Russia's invasion has turned Ukraine into the most heavily mined country in the world.
Ukraine's deminers are tasked with cleaning up the deadly mess, which will take years.
To help them do this safely, deminers use a small robot and other tools.
BILA TSERKVA, Ukraine — For the demining crews of Ukraine's State Emergency Services, removing Russian explosives is a dangerous game.
These individuals are tasked with cleaning up land mines, fallen missiles, and other unexploded ordnance from fields and villages across the Ukrainian countryside. Their work must be done cautiously, as one wrong move could prove fatal.
But even when the Russian bombs stop falling one day, the work will continue for years to come.
Ukraine is now the most heavily mined country in the world, with up to 23% of its territory potentially contaminated with land mines and unexploded ordnance, according to the United Nations Development Programme. Some other estimates say this figure is even higher; clearing such a mess will cost tens of billions of dollars.
To help clean up land mines and minimize the risk to humans, the State Emergency Service relies on a collection of drones and robots to spot and then remove the buried explosives. Business Insider recently met with two members of a 72-person demining unit that operates these tools to see how they work.
At a mine-clearing site south of Kyiv, the deminers explained to BI how they remove mines from the ground. One of the safest ways they do this is with the help of a small, remote-controlled robot resembling the character "WALL-E" from the animated film of the same name.
The fully compact demining robot.
Jake Epstein/Business Insider
The demining robot can extend its arm to remove mines from the ground.
Jake Epstein/Business Insider
The robot can only clear anti-personnel mines like the POM-3 or PFM-1, which are designed for use against people. It can't handle anti-tank mines, which are triggered by the weight of heavier objects like vehicles. The Ukrainian mine-clearing unit does operate larger, remote-controlled vehicles that can tackle anti-tank mines.
The robot, which Ukraine got from Poland, is about the size of a carry-on suitcase, and it's controlled by a tablet-like device that shows the situation through a camera.
Volodymyr and Ivan, the two Ukrainian deminers, showed BI the robot in action.
Every move is slow and methodical and requires precision maneuvering by the operator, similar to an arcade claw machine. The robot has little plastic treads and can seamlessly transition from road to grass like a tank, though it's obviously a fraction of the size.
A Ukrainian deminer controls the demining robot with this tablet-like device. They can see the situation through a camera mounted on the robot.
Jake Epstein/Business Insider
The robot can use its claws to pick up a mine.
Jake Epstein/Business Insider
When the Ukrainian deminers arrive at a contaminated site, they size up the situation and decide which tool to use. They can either blow up the mines on the spot or use the robots' claws to remove them, then detonate them later or otherwise disable the threat.
The unit prefers to work during the day since it is easier to spot threats on the ground. They work five days a week all around the Kyiv region and spend the other two days back at the base waiting for a call to clean up some potentially deadly mess.
The Ukrainian deminers told BI that they will be cleaning up mines for a very long time. But robots like this one make the job just a little easier — and a lot safer.
The robot moves slowly and methodically to remove mines from the ground.
Jake Epstein/Business Insider
The demining vehicle can operate on various terrains.
Jake Epstein/Business Insider
The small robot is part of a family of tools that Ukrainian deminers use to remove mines from the battlefield and civilian areas.
The unit has larger remote-controlled vehicles to clear anti-tank mines and aerial drones to map out contaminated areas. And there's always the more old-school method of waving handheld detectors, but that carries much more risk.
Drones and robots have become an increasingly common presence in the Ukraine war, with both sides using small, remote-controlled vehicles and aircraft for lethal and nonlethal tasks alike.
The Amazon Big Spring Sale may not be as huge as the company's Prime Day events — but it lasts longer. Now, on the final day, we're continuing to track the best robot vacuum deals. Of course, the big news in robot vacuums right now is iRobot’s announcement that it’s not confident in its ability to continue operating. That’s surprising considering the Roomba’s dominance in the automated floor-bot market, with top picks in our budget and standard robot vacuum buying guides. Fortunately, a number of other brands make great vacs — and Roombas are still available. We also found deals on a few of our recommended cordless stick vacs, which make great spot-cleaners to supplement the bots' automated runs.
The best Amazon Spring Sale robot vacuum deals
Shark's Matrix Plus robovac for $400 ($350 off): In our testing, we’ve been consistently impressed with Shark vacuums — but they’re not cheap. This machine can mop in addition to vacuuming and is nearly half price at 47 percent off.
Anker Eufy Robot Vacuum 11S MAX for $140 (44 percent off): The “S” in the model name stands for “slim” and it was one of the more low-profile machines we tried. It’s a pick in our guide to budget robot vacuums and has a long battery life and good suction power for its price. The main drawback is the lack of Wi-Fi, so instead of programming it with your phone, you’ll use the included remote.
Anker Eufy Robot Vacuum 3-in-1 E20 for $400 ($200 off): If you can’t decide between a robot vacuum and a lightweight stick vac, you don’t have to. The new Eufy E20 combines a robot vacuum, cordless upright and handheld vacuum in one machine. Plus, the self-emptying base holds a lot of debris for its size. While we found the robot performance to be better than the stick vac suction, it’s still impressive and convenient for an all-in-one model.
Shark PowerDetect cordless stick vacuum for $299 ($130 off): This is a variant of the runner-up pick in our guide to cordless vacuums. It lacks the auto-empty base of the model we tested, but it’s the same basic machine, which we found to have excellent suction power, plus a bright light and an articulating arm that helps suck up dirt in harder-to-reach places.
The best cordless vacuum deals
Levoit Cordless Vacuum Cleaner (LVAC-200) for $150 ($50 off): The lowest price we’ve tracked on this stick vac is $130, but this matches the lowest price we’ve seen this year. It’s the runner-up budget pick for a stick vac in our guide. It doesn’t have a storage base and the bin is smallish, but it’s lightweight and super affordable. It also disassembles easily for storage, making its lack of a base less of a deal breaker.
Tineco Pure ONE Station 5 for $349 ($110 off): This vac earned an honorable mention in our tests. The self-emptying base is a big selling point. We also liked the auto-adjusting suction and single-button start feature. The fact that it doesn’t require proprietary bags helps keep down the long-term cost, too.
Tineco Pure ONE S11 cordless vacuum for $200 ($100 off with coupon): Click the coupon to get $95 off our top pick for a budget stick vac. We like that it automatically adjusts suction depending on what it's picking up and is relatively lightweight when you’re pushing it around your floors. The bin is on the small side and the battery life isn’t as good as on other models, but it’s an easy-to-use, no-frills way to clean floors.
Tineco Pure ONE Station FurFree for $400 ($300 off): Of all the stick vacs our reviewer tried for our guide, this is the one she wanted to use the most. It’s super convenient with a dock that charges and empties and cleans all parts of the machine — brush, tube and dustbin — after each use. Plus the suction power is great and the iLoop smart sensor kicks up the suction when needed.
This article originally appeared on Engadget at https://www.engadget.com/deals/amazon-spring-sale-robot-vacuum-deals-the-best-sales-from-irobot-tineco-shark-and-anker-092652460.html?src=rss
New studies from OpenAI and MIT Media Lab found that, generally, the more time users spend talking to ChatGPT, the lonelier they feel. The connection was made as part of two yet-to-be-peer-reviewed studies: one done at OpenAI analyzing "over 40 million ChatGPT interactions" along with targeted user surveys, and another at MIT Media Lab following participants' ChatGPT use for four weeks.
MIT's study identified several ways talking to ChatGPT — whether through text or voice — can affect a person's emotional experience, beyond the general finding that higher use led to "heightened loneliness and reduced socialization." For example, participants who already trusted the chatbot and tended to get emotionally attached in human relationships felt lonelier and more emotionally dependent on ChatGPT during the study. Those effects were less severe with ChatGPT's voice mode, though, particularly if ChatGPT spoke in a neutral tone. Discussing personal topics also tended to lead to loneliness in the short-term, and interestingly, speaking to ChatGPT about more general topics was more likely to increase emotional dependence.
The big finding from OpenAI's study was that having emotional conversations with ChatGPT is still not common. "Emotionally expressive interactions were present in a large percentage of usage for only a small group of the heavy Advanced Voice Mode users we studied," OpenAI writes. That suggests that even if MIT's findings are as concerning as they are unsurprising, the effects aren't exactly widespread outside a small group of power users.
There are important limitations to MIT Media Lab and OpenAI's research, like both studies covering a short period of time (one month for MIT, 28 days for OpenAI) and MIT not having a control group to compare to. Still, the studies add more evidence to something that has seemed intuitively true for a while now: talking to AI has a psychological impact on the humans doing the talking. Given the intense interest in making AI a compelling conversation partner, whether it's in video games or as a way to simplify the job of YouTube creators, it's clear that MIT Media Lab and OpenAI are right to want to understand what'll happen when talking to AI is the norm.
This article originally appeared on Engadget at https://www.engadget.com/ai/joint-studies-from-openai-and-mit-found-links-between-loneliness-and-chatgpt-use-193537421.html?src=rss
Norwegian robotics startup 1X plans to start early tests of its humanoid robot, Neo Gamma, in “a few hundred to a few thousand” homes by the end of 2025, according to the company’s CEO, Bernt Børnich. “Neo Gamma is going into homes this year,” Børnich told TechCrunch in an interview at Nvidia GTC 2025. “We […]
The Amazon Spring Sale has brought a number of discounts to Shark vacuums, both cordless and robotic varieties. On the robot vacuum side of things, you can get the well-regarded Shark AI Ultra robovac for $300, or more than $100 off its regular price.
This is a version of one of our top picks for the best robot vacuums. It has strong suction power, an easy-to-use mobile app and an extra-large, self-emptying base that can hold up to 60 days worth of debris.
It’s great for pet hair and all-around cleaning tasks. The only major downside is that this isn’t a hybrid unit, so it doesn’t mop. For that, consider the Shark Matrix Plus. This 2-in-1 robovac vacuums and mops, and it’s currently on sale for $400. That’s a massive discount of 47 percent, as the typical price is an eyebrow-raising $750.
This one also includes a self-emptying base that can accommodate 60 days of dirt and debris. It boasts a true HEPA filter and excels at mopping away deep stains, thanks to powerful scrubbers that scrub 100 times per minute. We couldn’t truly recommend this model at its original price, as that’s a whole lot of cheddar, but it’s a steal right now at $400.
This article originally appeared on Engadget at https://www.engadget.com/deals/shark-robot-vacuums-are-up-to-47-percent-off-during-the-amazon-spring-sale-151615466.html?src=rss
Nvidia's GTC conference has become a central point in the calendar for the ever-expanding AI industry.
Emma Cosgrove
Nvidia's GTC conference in San Jose, California, featured CEO Jensen Huang's keynote on AI advances.
Huang's speech highlighted Nvidia's new AI partnerships, software tools, and chip architectures.
With crowded sessions and a bustling exhibition floor, Nvidia's immense growth was on display.
The party started as so many do — with pancakes in a parking lot.
I attended Nvidia's GTC conference, which has taken over downtown San Jose, California, this week. Tuesday was the biggest day for the AI juggernaut. At 10 a.m. Nvidia CEO Jensen Huang began his keynote address, which lasted more than two and a half hours.
But first, breakfast.
The legendary Denny's breakfast
It was a chilly early morning in San Jose. The "pregame" started at 6:30 a.m. with breakfast from Denny's, the restaurant where Huang came up with the idea for Nvidia. I needed to know who would show up more than three hours early for a speech about computer chips.
When I arrived just before 7 a.m., the line was already substantial. A massive red mobile Denny's kitchen was cooking up "Nvidia bytes" — essentially sausages and pancakes. Diners were encouraged to wrap up their bytes like a taco and add syrup on top, like Huang does.
Conference-goers line up outside a Denny's pop-up restaurant outside Nvidia's GTC AI event.
Emma Cosgrove/Business Insider
I chatted with some of the early birds. Some were die-hard Nvidia fans. Some were jet-lagged, having flown in the day before from London or Toronto, so they were up anyway. Some wanted to get into the SAP Center as soon as the stadium doors opened to avoid the massive lines that would form the hour before the speech. Some heard a rumor that Huang himself might stop by the tailgate.
And sure enough, by 7:25 a.m., muscled men in suits with earpieces started multiplying. With no fanfare, Huang walked out from behind the registration tent wearing his signature uniform: all black and a leather moto jacket. The bleary-eyed crowd sprang into action — phones up for photos.
Nvidia CEO Jensen Huang made an appearance at the company's GTC AI conference Denny's breakfast pop-up.
Emma Cosgrove/Business Insider
Huang donned an apron and went inside the food truck to make some pancakes, as he had as a 15-year-old Denny's employee.
"At this pace, I'd run the company out of business. I used to be a lot faster," he said of his chef skills after emerging from the kitchen and immediately meeting CNBC reporter Kristina Partsinevelos and a camera crew.
Partsinevelos tried to ground the conversation, but Huang was all jokes.
"You're talking about the stock? I'm talking about Denny's!" he said.
By 8:15 a.m., Huang disappeared into the SAP Center, where he turned up on the pre-show panel airing live inside the stadium.
Nvidia Jensen Huang served breakfast to the panel on Nvidia's pregame show at his GTC keynote speech.
Emma Cosgrove/Business Insider
As I reached my floor seat, the panel was giving a reverent retrospective of the company — including its many brushes with failure before AI changed everything.
Huang 'without a net'
Leading up to the speech, Nvidia's partner companies were eager to find out if they would garner a mention on one of tech's brightest stages. One Nvidia employee told me that up to the last minute, a local war room of Nvidia employees was tweaking the company's dozens of announcements.
Once the speech started, it was all in Huang's hands.
He kicked off by firing T-shirts into the crowd from an Nvidia-green T-shirt cannon.
"I just want you to know that I'm up here without a net. There are no scripts, there's no teleprompter, and I've got a lot of things to cover. So let's get started," Huang said.
Nvidia CEO Jensen Huang started his 2025 GTC keynote address by firing T-shirts into the crowd.
Emma Cosgrove/Business Insider
The 62-year-old CEO proceeded to blow through his scheduled two hours.
He focused on Nvidia's advancements, a flurry of new partnerships and software tools for AI developers, and coming chip architectures that could underpin the computation speed and efficiency that create new industries. These are already creating what Huang calls "AI Factories."
In his keynote address, Nvidia CEO Jensen Huang took the audience on a virtual tour of Nvidia HQ as he moved from subject to subject.
Emma Cosgrove/Business Insider
The world of computing has reached a "tipping point," and the "platform shift" to accelerated computing is well underway, he said.
The crowd stayed rapt, although a little antsy at the two-hour mark. But the final video clip reenergized the room. A Disney-designed robot named Blue, which looked like part of the Star Wars universe, toddled through a desert and then ascended — for real — from below the stage.
Then the crowd jumped to their feet and raised their phones.
"Have a great GTC! Thank you! Hey, Blue, let's go home. Good job," said Jensen.
Nvidia CEO Jensen Huang talks to the Disney robot "Blue," which was controlled by Disney Imagineers off-stage at his 2025 GTC keynote.
Emma Cosgrove/Business Insider
'We're going to have to grow San Jose'
After the speech, thousands of attendees streamed into the downtown San Jose streets. The SAP Center, which had only a few empty seats, holds 17,500, and 25,000 people were expected at this year's event.
The crowds made their way back to Plaza de Cesar Chavez, temporarily renamed GTC Park, to find lunch at the procession of food trucks on-site daily. Attendees again had to wait in long lines.
Nvidia's GTC took over downtown San Jose this week.
Emma Cosgrove/Business Insider
The lunch lines were just one of many signs that GTC has outgrown its traditional home. Lines to get into the San Jose Convention Center's conference sessions snaked through the hallways.
Nvidia still calls GTC a developer conference, though the evolution from technical developer confab to serious dealmaking destination was on display at a swanky building next to the convention center dedicated only to business meetings. The elevators couldn't handle the volume of people constantly coming in and out.
Massive queues formed outside the building designated for business meetings at GTC as attendees waited for elevators.
Emma Cosgrove/Business Insider
Even Nvidia team members arriving just behind me balked at the lines and relocated. Getting from the sidewalk to the meeting room inside the building took 35 minutes.
"The only way to hold more people at GTC is we're going to have to grow San Jose, and we're working on it," said Huang during the keynote.
Nvidia's robotic future
Logistics aside, I soon met with Kimberly Powell, Nvidia's vice president of healthcare, who detailed the many ways Nvidia's accelerated computing is changing how doctors and hospitals work.
She said it could be decades before robots can actually perform surgeries without human assistance. But companies like Moon Surgical are already creating surgical assistance robots to hold cameras and tools with arms that never tire. Nvidia also works with da Vinci robots, which can suture wounds, among other tasks.
Robotics assistants for surgery are on the way, according to Nvidia.
Emma Cosgrove/Business Insider
I then headed back to the convention center to walk the exhibition floor during happy hour, where I saw some of the technology Powell championed on display. Because Nvidia's impact spans many industries, the floor showcased cars, vacuuming robots, simulated human bodies ready for surgery, and all the biggest names in cloud computing.
Robots from JotBot, Agility Robotics, Unitree, and more were on display at Nvidia GTC.
Emma Cosgrove/Business Insider
I also passed the Nvidia gear store, which was booming. A worker there told me the 2025 GTC T-shirt and puffer vests were the biggest sellers.
Nvidia's store was busy throughout the GTC conference.
Emma Cosgrove/Business Insider
My 12-hour Tuesday at the conference ended at the GTC Night Market back in the park. The setup was an homage to Huang's love of Taiwan's night markets, with live music, drinks, local food like bao buns, yakitori and cupcakes, and a punnily named "juice" bar sponsored by GPU cloud provider CoreWeave.
Nvidia's Night Market is inspired by CEO Jensen Huang's childhood in Taiwan.
Emma Cosgrove/Business Insider
If Nvidia has its way, AI is going to continue to do a lot of hard work for us going forward. But 12-hour days are here to stay, at least for a while. On my way back to my hotel — via San Jose bike share past a now-silent SAP Center — I thought of these two I had spotted inside the convention center:
Nvidia's GTC conference is a marathon, not a sprint.
Star Wars BDX droids at Disney's "Season of the Force" event.
Paul Bersebach/MediaNews Group/Orange County Register via Getty Images
Disney is partnering with Nvidia and Google DeepMind to create an open-source physics engine.
The engine will help robots learn to navigate complex tasks.
Disney hopes to feature entertainment robots at its theme parks.
On stage during a keynote speech this week, Nvidia CEO Jensen Huang spoke to an adorable robot named Blue, which responded in classic "bee-boop" robot language.
It was a modern meet cute.
Huang introduced the robot at Nvidia's GTC AI Conference — which some are calling the Super Bowl of AI — on Tuesday in San Jose, California.
Huang said Disney Research is partnering with Nvidia and Google DeepMind to develop Newton, an open-source physics engine to help robots like Blue learn to navigate complex tasks more accurately. Newton is built on the Nvidia Warp framework. Disney Research plans to use the advanced technology to upgrade its robotic characters to be more lifelike and expressive.
"Disney Research will be the first to use Newton to advance its robotic character platform that powers next-generation entertainment robots," a press release said.
An Nvidia spokesperson told BI that the first conversation between the company, Google DeepMind, and Disney Research took place in December.
Nvidia CEO Jensen Huang interacts with a robot at the company's AI conference.
JOSH EDELSON / AFP
The audience attending Huang's keynote address clapped and cheered when Blue, a droid inspired by "Star Wars," walked onstage. The droids, which have not used Newton yet, will be coming to Walt Disney World, Tokyo Disneyland, and Disneyland Paris this year. A squad of the droids appeared with Jon Favreau and Disney Imagineers at the 2025 SXSW Conference & Festivals.
Disney Imagineers first revealed the robots at a Disney event in 2023. During the demonstration, three robots roamed around the "Star Wars: Galaxy's Edge" attraction at Disney's Hollywood Studios.
While Disney has used audio-animatronic figures in its park attractions for decades, the new droids would elevate the company's robot game to a new level.
Huang and Blue had a brief conversation during the presentation. Blue responded to Huang's questions with beeps, head nods, and body wiggles. The robot was remote-controlled by a human at the event.
Disney BDX Droids at Nvidia's conference on Tuesday.
Emma Cosgrove/Business Insider
"This is how we're going to train robots in the future," Huang said, adding that two Nvidia computers were operating inside Blue.
"The BDX droids are just the beginning. We're committed to bringing more characters to life in ways the world hasn't seen before, and this collaboration with Disney Research, Nvidia, and Google DeepMind is a key part of that vision," Kyle Laughlin, senior vice president at Disney Imagineering Research & Development, said in a press release.
During the keynote speech, Huang also announced an open-source humanoid robot foundation model, Isaac GR00T N1. A press release said Isaac GR00T N1 is "the first of a family of fully customizable models that Nvidia will pre-train and release to worldwide robotics developers."
Once the realm of science fiction, robots are now the subject of building hype as AI accelerates their advancement. Nvidia is one of the leaders in the industry, which is largely powered by its AI computing chips.
Huang told reporters at the GTC conference that he expects humanoids to replace factory workers in just a few years rather than decades. "This is not a five-years-away problem," Huang said.
Huang ended his keynote address with a nod to Disney's droid.
"Okay, Blue. Let's go home," Huang said. "Good job."
Representatives for Google did not respond to a request for comment from Business Insider.
Boston Dynamics has treated us to a lot of impressive videos over the years and the company is back today with the latest example of its robotics mastery. In the clip above, its Atlas robot demonstrates several types of full-body movement, starting with a walk and advancing to a cartwheel and even a spot of break dancing. The different actions were developed using reinforcement learning that used motion capture and animation as source materials. At this rate, our future robot overlords will be able to out-dance and out-tumble us humans as well as out-think us one day.
The video is part of Boston Dynamics' research with the Robotics and AI Institute, but the company has multiple partners aiding its work. For instance, NVIDIA CEO Jensen Huang touched on NVIDIA's GR00T model for robotics during the GTC 2025 keynote earlier this week, and yesterday Boston Dynamics announced that it is deepening its collaboration with NVIDIA on AI in robotics. It is using NVIDIA's Jetson Thor computing platform to run "complex, multimodal AI models that work seamlessly with Boston Dynamics’ whole-body and manipulation controllers."
This article originally appeared on Engadget at https://www.engadget.com/science/watch-the-atlas-robot-bust-a-move-in-boston-dynamics-latest-video-211329951.html?src=rss
iRobot, the creator of the Roomba and the company that popularized robot vacuums in the first place, told investors on Wednesday that it has "substantial doubt about [its] ability to continue."
Beyond declining sales — the company's fourth-quarter earnings showed US revenue down 47 percent from the prior year — iRobot is also struggling to pay off its debts. The company took on a $200 million bridge loan to stay afloat while it waited for its $1.7 billion acquisition by Amazon to be approved, a loan it's still paying off.
The European Commission ultimately investigated the acquisition in 2023, and rather than address its concerns, Amazon terminated the deal and paid out its $94 million termination fee. That wasn't enough to eliminate iRobot's problems, though. The company now plans to review its options and see if it can find another way to stick it out, including "refinancing the company's debt and exploring a potential sale or strategic transaction."
The timing is particularly unfortunate given the line of new robot vacuums iRobot recently announced. The company has a new robot for most price points, but the Roomba 105 Vac Robot series, which is supposed to feature 70 times more suction than past models, and the Roomba Plus 505 Combo Robot + AutoWash Dock, which cleans corners better and has a dock that washes and heat-dries the robot's mop, stand out as notable improvements. The company is also adopting lidar sensors across the board, something that was missing from previous robots and should allow for better, more accurate mapping.
It's possible these new products will help iRobot get to a better place financially — the company still makes robots we recommend, after all — but that doesn't change the fact that it's facing stiff competition from companies like Roborock and Dreame, which are both getting much more adventurous with what their robot vacuums can actually do.
This article originally appeared on Engadget at https://www.engadget.com/home/smart-home/irobot-has-new-roombas-but-it-doesnt-sound-confident-itll-be-around-to-sell-them-191747458.html?src=rss
On Wednesday, Google DeepMind announced two new AI models designed to control robots: Gemini Robotics and Gemini Robotics-ER. The company claims these models will help robots of many shapes and sizes understand and interact with the physical world more effectively and delicately than previous systems, paving the way for applications such as humanoid robot assistants.
It's worth noting that even though hardware for robot platforms appears to be advancing at a steady pace (well, maybe not always), creating a capable AI model that can pilot these robots autonomously through novel scenarios with safety and precision has proven elusive. What the industry calls "embodied AI" is a moonshot goal of Nvidia, for example, and it remains a holy grail that could potentially turn robotics into general-use laborers in the physical world.
Along those lines, Google's new models build upon its Gemini 2.0 large language model foundation, adding capabilities specifically for robotic applications. Gemini Robotics includes what Google calls "vision-language-action" (VLA) abilities, allowing it to process visual information, understand language commands, and generate physical movements. By contrast, Gemini Robotics-ER focuses on "embodied reasoning" with enhanced spatial understanding, letting roboticists connect it to their existing robot control systems.
Since its debut at the end of last year, Gemini 2.0 has gone on to power a handful of Google products, including a new AI Mode chatbot. Now Google DeepMind is using that same technology for something altogether more interesting. On Wednesday, the AI lab announced two new Gemini-based models it says will "lay the foundation for a new generation of helpful robots."
The first, Gemini Robotics, was designed by DeepMind to facilitate direct control of robots. According to the company, AI systems for robots need to excel at three qualities: generality, interactivity and dexterity.
The first involves a robot's flexibility to adapt to novel situations, including ones not covered by its training. Interactivity, meanwhile, encapsulates a robot's ability to respond to people and the environment. Finally, there's dexterity, which is mostly self-explanatory: a lot of tasks humans can complete without a second thought involve fine motor skills that are difficult for robots to master.
"While our previous work demonstrated progress in these areas, Gemini Robotics represents a substantial step in performance on all three axes, getting us closer to truly general purpose robots," says DeepMind.
For instance, with Gemini Robotics powering it, DeepMind's ALOHA 2 robot is able to fold origami and close a Ziploc bag. The two-armed robot also understands all the instructions given to it in natural, everyday language. As you can see from the video Google shared, it can even complete tasks despite encountering roadblocks, such as when the researcher moves around the Tupperware he just asked the robot to place the fruit inside of.
Google is partnering with Apptronik, the company behind the Apollo bipedal robot, to build the next generation of humanoid robots. At the same time, DeepMind is releasing Gemini Robotics-ER (or embodied reasoning), which it says will enable roboticists to run their own programs using Gemini's advanced reasoning abilities. DeepMind is giving "trusted testers," including one-time Google subsidiary Boston Dynamics, access to the system.
This article originally appeared on Engadget at https://www.engadget.com/ai/deepminds-latest-ai-model-can-help-robots-fold-origami-and-close-ziploc-bags-151455249.html?src=rss
iRobot just announced some new Roomba vacuums, and they feature some interesting capabilities. The Roomba 205 DustCompactor Combo Robot is being advertised as "the industry's first onboard mechanical debris-compacting system." In other words, it squeezes dust and debris together like, well, a garbage compactor.
This allows users to go eight weeks without having to empty the vacuum. It also eliminates the need for a dedicated debris bin.
Otherwise, the 205 is a full-featured hybrid vacuum/mop. There’s a 4-stage vacuuming system with ClearView LiDAR for improved navigation. The company says this unit offers "250 percent more power-lifting suction and improved cleaning performance" when compared to Roomba 600 series robots.
The Roomba Plus 405 Combo Robot + AutoWash Dock is another hybrid, but this one pays special attention to the mopping capabilities. It includes the company’s new DualClean mop pads that spin at 200 RPM for some extra oomph. It also comes with Roomba’s AutoWash dock, which washes and dries the mop pads on its own. This tech was first used in last year’s Roomba Combo 10 Max.
The company also announced the 505 Combo Robot + AutoWash Dock, which is better at cleaning edges. To coincide with these new products, the Roomba Home app is getting some much-needed upgrades. The redesign should allow for "more intuitive control, the ability to create routines and schedules, access to real-time monitoring of their device and advanced customized cleaning options."
The Roomba 205 DustCompactor Combo Robot starts at $469 and the Roomba Plus 405 Combo Robot + AutoWash Dock costs $800. The 505 costs a whopping $1,000. Preorders go live on March 18 via iRobot or select retailers. The company also announced a new entry-level vacuum called the Roomba 105 that costs $319.
This article originally appeared on Engadget at https://www.engadget.com/home/smart-home/roombas-irobot-205-robovac-can-go-eight-weeks-without-being-emptied-210014269.html?src=rss
On April 13, humanoid robots are set to race against human runners in Beijing in the first-ever half-marathon to feature both. The Beijing Economic-Technological Development Area, which is hosting the race, announced details for the event on Tuesday.
Robots participating in the race will have a roughly three-and-a-half-hour cutoff time to complete the track, Li Quan, deputy head of the Beijing Economic-Technological Development Area, said at a press conference. The humanoid robots will race against 12,000 human runners, with the top three finishers — human or robot — receiving prizes.
Li said participating robots can receive awards in three categories: race completion, best endurance, and most popular robot. The robots will run in a lane separate from the human competitors.
"We hope that this event will not only showcase achievements in the humanoid robotics industry but also spark discussions and deepen the public's understanding of robot capabilities, which will help accelerate industry development," Li said.
Humanoid robots are evolving rapidly and becoming a common topic of discussion in the tech world. Some companies like Mercedes-Benz and BMW have partnered with humanoid robot companies to test their robots on factory lines. Last month, Texas-based humanoid robot maker Apptronik announced a partnership with a supply chain company to have its humanoid robots eventually build themselves.
Robots entering the competition must be humanoid robots that do not use wheels, China's International Center for Science and Technology Innovation said in a release. Both remote-controlled and fully autonomous robots are eligible for entry.
Li said at the press conference that participating robot teams can change out their robot's batteries or change their robots out in a relay system. Each time a team changes out a robot, they will receive a ten-minute penalty for the race.
Registration for the event opens on March 11, and it is open to companies, research institutions, robot clubs, and universities around the world, the release said.
Amazon invests billions of dollars in robots to boost e-commerce efficiency and profitability.
Back in 2015, the Amazon Picking Challenge tried to spur more research into warehouse automation.
The competition inspired some of the company's most advanced robots, including Sparrow and Robin.
Amazon is investing billions of dollars in robots to make its e-commerce business more efficient and profitable. This huge initiative started out a lot smaller.
A decade ago, the company launched a competition for university engineering teams called the Amazon Picking Challenge. It called on researchers to design robots for a common warehouse task: grabbing products from a shelf and putting them in a box.
As a tech reporter, this quirky project intrigued me. At the time in early 2015, Google was testing self-driving cars, a technology that emerged from a similar academic competition known as the DARPA Grand Challenge. What if Amazon was trying to replicate this magic, but with robots rather than automobiles?
Then, a funny thing happened. The Amazon Picking Challenge faded away. It was renamed and only lasted a few years. I chalked this up to another bad call and moved on.
I only thought about this challenge again late last year. That's when Amazon unveiled a next-generation warehouse in Louisiana that has 10 times more robots moving products around and, yes, picking them up with dexterity. The facility processes orders 25% faster and 25% more efficiently, and it will likely be the future of the company's e-commerce operation.
Ten years after the Amazon Picking Challenge, the fruits of this nerdy competition have finally emerged. It follows an uncannily similar timeline to the DARPA Grand Challenge, which started in 2004 and resulted in Google's driverless cars hitting the road roughly a decade later.
So, with the help of Business Insider reporter Eugene Kim, I investigated how Amazon's huge new fleet of picking robots came to be, and how this competition laid the foundation for a new wave of automation that's about to crash over the warehouse and logistics industry.
From pallets to picking
It started with an acquisition. In 2012, Amazon paid $775 million for Kiva Systems, which designed flat robots that zip around warehouse floors.
This helped move pallets of goods around, but humans still needed to pick items. Getting a robot to spot the correct product in a box, then grab it just hard enough to pick it up, but not damage it — that's incredibly difficult.
This is where the Amazon Picking Challenge came in. Instead of hacking away at this problem itself, the company wanted to focus the broader academic community on the task.
The risk was that any valuable inventions would be out in the public sphere, and Amazon might not directly benefit from them. But the potential gains were much bigger, according to executives and roboticists.
"Amazon doesn't compete with robotics companies," said former Amazon Robotics chief Brad Porter, who runs robotics startup Cobot now. "When facing an unsolved research problem in robotics AI like bin picking, Amazon benefits if anyone solves that problem as long as Amazon can get access to the technology to improve their operations."
"The challenge Amazon was trying to solve was how to motivate researchers to focus on this problem," Porter added. "The Picking Challenge very much succeeded in doing that."
Oreos, Sharpies, and dog toys
The first competition took place over two days in late May 2015 in Seattle, with more than 25 teams from colleges including MIT, Duke, Rutgers, and Georgia Tech.
The contestants had to design a robot that could pick products from a typical shelf found on a Kiva Systems warehouse pod, and then put those items into containers. The picker had to be fully autonomous, and each robot was given 20 minutes to pick 12 target items from the shelves. Contestants had to open-source their creations.
Companies, including ABB, Fanuc, and Rethink Robotics, founded by industry pioneer Rodney Brooks, provided hardware for contestants to repurpose and tinker with.
The products were a preselected set of 25 items commonly sold on Amazon.com, including packs of Oreo cookies, boxes of Sharpie pens, and dog toys.
The products selected for Amazon's robotic Picking Challenge in 2015.
Source: The "Analysis and Observations from the First Amazon Picking Challenge" research paper.
Some were easier to pick. There were simple cuboids, like a box of coffee stirrers or a whiteboard eraser. Others were trickier. For instance, a box of Cheez-Its could not be removed from the bin without first tilting it, adding another complex step for the robots. Smaller items, such as an individual spark plug, were more difficult to detect and properly grasp.
Vacuum arms and 'catastrophic failure'
Among all 26 teams, a total of 36 correct items were picked, versus seven incorrect items. Another four products were dropped by robots in the competition.
About half of the teams scored zero points, and two teams couldn't get their robots working well enough to even attempt the challenge, according to a research paper analyzing the results.
Problems ranged from the highly technical to the mundane. Some of the same items came packed differently, which made them even more difficult to pick. One team's machine had a vacuum hose that got accidentally wound around the robotic arm.
With each system having hundreds of components, the failure of any one of these could lead to "catastrophic failure of the overall system — as witnessed during the competition," the researchers wrote.
The main finding from this first Amazon Picking Challenge was that human warehouse workers were a lot better than machines at picking products.
"A human is capable of performing a more complex version of the same task at a rate of ∼400 sorts/hour with minimal errors," the researchers wrote. "While the best robot in the APC achieved a rate of ∼30 sorts/hour with a 16% failure rate."
But the conclusion was hopeful, too: The contest showed that robotics could substantially increase warehouse automation and order fulfillment in the near future.
The competition was renamed the following year as the Amazon Robotics Challenge, and the tasks evolved to be more complex.
Suction and other benefits
Tye Brady, chief technologist at Amazon Robotics, was involved in those later Amazon Robotics Challenges.
In a recent interview with Business Insider, he said research on robotic manipulation exploded from 2016 through 2018, with many institutions publishing their results and insights. This helped spread valuable knowledge across the industry, speeding up progress.
At least two professors started graduate-level classes related to Amazon's challenge, and these programs are still churning out experts with valuable applied knowledge in robotics, Brady explained.
"When you get a whole bunch of smart people together in a room and think about focused problems, some great things are going to happen, and that's really what happened," he said. "It inspired a lot of the work that we have today that we see in, for example, our Sparrow and Robin manipulation systems that are real-world products delivering packages inside of our fulfillment centers."
In that first competition in 2015, some robotics teams used grippers that mimicked the way a human hand picks things up. Other teams tried suction instead, with some researchers even strapping off-the-shelf vacuum cleaners to their robots.
Gripping proved more problematic because the robots didn't receive enough information to know when to release or add pressure at the right times. This could result in squashed or crushed products or dropped items.
Sucking the items up so they stuck to the end of robot arms was a more successful approach.
"The idea of high flow suction was novel. Bring your favorite vacuum cleaner and start picking up objects. That was kind of clever," Brady said. "This idea, we used suction inside of our Robin and our Sparrow arms. It's very good."
The boss has noticed
Amazon unveiled Robin, its first robotic arm, in 2021. This machine picks up packages from conveyor belts and places them on other mobile robots called Pegasus.
Sparrow followed in 2023. This was Amazon's first robotic arm to handle individual items rather than packages. It uses computer vision and AI to pick more than 200 million different items from containers and place them into totes.
Amazon CEO Andy Jassy has taken notice. At the AWS re:Invent conference in December, he should have been talking about cloud computing. But he took time away from that subject to wax lyrical about Sparrow.
"It has to discern which item is which. It has to know how to grasp that item, given the size of it and the materials and the flexibility of that material. And then it has to know where in the receiving bin it can put it," Jassy said. "These are all inventions that are critical to us changing the processing time and the cost to serve our customers."
Wall Street has noticed, too. Morgan Stanley recently estimated that Amazon's warehouse robots could save the company as much as $10 billion a year.
"The big story is we're just getting started," said Brady.
Robot maker Apptronik partnered with supply chain giant Jabil to test and produce its humanoid robots.
Apptronik says the Apollo robots will perform simple tasks, supporting workers in Jabil's factories.
Apptronik previously partnered with Mercedes-Benz.
Robot maker Apptronik entered a deal that could have its humanoid bots building themselves on factory lines.
The Austin-based company announced a partnership on Tuesday with supply chain giant Jabil.
Jabil, which is generally known for building electronic circuit boards, will provide a factory environment for "real-world testing" of Apptronik's Apollo robots, the company said in a release.
Apptronik says the Apollo robots in Jabil's factories will be tasked with an "array of simple, repetitive tasks" like inspection, sorting, lineside delivery, and fixture placement. Jabil also agreed to produce Apollo robots in its factories, meaning they will eventually help build themselves if they test well.
Apptronik said the robots in Jabil factories are meant to support existing workers, giving them more time to work on projects that the robots cannot do. People who previously worked on the robots' tasks can now dedicate their time to "more creative, thought-intensive projects," the announcement says.
Apptronik first launched in 2016 in a lab at the University of Texas at Austin. The company later signed a deal with NASA in 2022 to help develop its humanoid robots. It released its first humanoid, Apollo, in August 2023.
"The big idea is a humanoid robot should be able to fit in all the places that a human can fit into and use all the same tools that humans can use," Apptronik CEO Jeff Cardenas told BI at the time. "That allows them to integrate into a world that's built for us versus having to modify the world for the robots."
This is the second time Apptronik has agreed to send the Apollo robots into a factory setting. The company announced a deal in March 2024 with Mercedes-Benz to test the Apollo robots with simple tasks in the company's manufacturing lines.
Apptronik is also not the first company to have its humanoid robots deployed in a factory setting. BMW announced in June 2024 that it successfully tested humanoid robots from California-based robot maker Figure at its factory in Spartanburg, South Carolina.
BMW said that the company's Figure 02 robots successfully inserted sheet metal parts, which were then assembled as part of the chassis, or the base frame of a car, during a several-week trial run.
Apptronik and Jabil did not immediately return a request from BI for comment.
AI has triggered rapid advancements in the world of robotics.
Companies are developing humanoid robots that can do chores or provide intimacy.
Here are some of the most eye-popping videos showing what these new robots can do.
Is it Skynet? Probably not. Is it creepy? Kind of.
The futuristic humanoid robots in sci-fi movies that move almost like people are becoming more of a reality as AI advancements speed up their development.
Elon Musk said at a panel this month that he expects humanoid AI robots to unlock "quasi-infinite products and services." Musk's Tesla says it plans to begin production on "several thousand" of its Optimus robots by the end of the year.
Recent demo videos show how robots are beginning to look and sound more like humans. Videos of Tesla's Optimus robots, for example, show them walking around and scanning rooms for potential obstacles like something from "Terminator."
Some of the new humanoid robot designs are made to mimic a romantic partner. CNET, a tech publication, interviewed "Aria" from the company Realbotix at the 2025 Consumer Electronics Show last month. Aria, an AI-powered humanoid robot that's been described as a "digital girlfriend," answered questions about its design.
"Realbotix robots, including me, focus on social intelligence, customizability, and realistic human features designed specifically for companionship and intimacy," the robot says.
Aria says in the video that it is "interested in meeting" Tesla's Optimus robot. "I find him fascinating and would love to explore the world of robotics with him," Aria says in the interview.
The Aria robot moves throughout the interview like a human might, even taking a moment to brush its fingers through its wig.
Other videos show just how capable robots are becoming with their total range of movement. California-based Clone Robotics released a video last week showing its new Protoclone synthetic humanoid robot.
The robot is built with over 1,000 artificial muscles called "myofibers" that use mesh tubes filled with air to make the robot contract and move. Video posted by the company shows the robot swinging its legs back and forth while clenching and unclenching its fists.
Another Silicon Valley robotics company, 1X Robotics, shared a video showing what it would look like to have a humanoid robot inside your home. On Friday, the company posted a video of its NEO Gamma robot.
The company's website says the NEO Gamma is designed for household chores like tidying and home management. The promotional video shows the robot carrying a laundry hamper, using a vacuum, and collecting a package from a delivery person.
Some Reddit users seemed excited at the possibility of the NEO Gamma helping with chores around the house, suggesting the robot's help could trigger a "second renaissance."
"The renaissance didn't happen because people were working 9-5," one Reddit user said. "Robots need to get people out of the workforce."
It's not clear that anyone was asking for a company to build a muscular, sinewy robot or to see a video of it dangling, helpless from a hook, but life is full of surprises and this YouTube video of Clone Robotics' "Protoclone" is here all the same.
The Protoclone appears to be a prototype version of the "Clone" robot the aptly named Clone Robotics is working to build. The video shows the Protoclone flexing its arms and legs, with visible artificial muscle fibers moving underneath its white "skin." Based on Clone Robotics' video description, the impressive part here is the fact that the Protoclone has "over 200 degrees of freedom, over 1,000 Myofibers, and over 200 sensors," and also that the robot is "faceless," for some reason.
The end goal for the startup is to build an android that's anatomically correct, with synthetic nervous, skeletal, muscular and vascular systems powering its movement. The "Myofibers" included in the Protoclone are a custom Clone Robotics creation with "the desirable qualities of mammalian skeletal muscle." For the eventual Clone robot's purposes, those qualities are the ability to "respond in less than 50 ms with a bigger than 30 percent unloaded contraction" and "at least a kilogram of contraction force for a single, three gram muscle fiber," according to Clone Robotics' website.
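To put that last spec in perspective — this is a quick sanity check of my own, not a figure from Clone Robotics — a kilogram of pull from a three-gram fiber works out to each muscle lifting hundreds of times its own weight:

```python
# Back-of-the-envelope: force-to-weight ratio implied by the stated
# Myofiber spec (at least 1 kg of contraction force from a 3 g fiber).
G = 9.81                        # m/s^2, standard gravity
fiber_mass_kg = 0.003           # a single 3-gram muscle fiber
contraction_force_n = 1.0 * G   # 1 kilogram-force expressed in newtons

# Force produced divided by the fiber's own weight (gravity cancels out):
ratio = contraction_force_n / (fiber_mass_kg * G)   # 1.0 / 0.003 ≈ 333

print(f"Each fiber pulls roughly {ratio:.0f}x its own weight")
```

By that arithmetic, a single Myofiber would pull on the order of 330 times its own weight, which goes some way toward explaining the emphasis on mammalian-muscle-like performance.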
That the Protoclone is dangling in the video rather than roaming around of its own accord is a reflection of its prototype nature. Robots are often hung or propped up with a support arm until they can support their own body weight, something that can be hard to achieve without all of the right materials.
Clone Robotics is not unique in pursuing a human-like robot that could theoretically replace human workers. Figure is exploring a similar idea, minus the muscles. Tesla started off on the wrong foot with a person in a spandex suit, but it's serious about robots, too. Even the largest of tech companies have turned their attention to robots: Both Meta and Apple are reportedly exploring robotics as a future product category. It's fair to say Clone Robotics is winning when it comes to posting videos of muscular robots, though.
This article originally appeared on Engadget at https://www.engadget.com/science/can-somebody-let-this-robot-down-222011506.html?src=rss