A routine tinkering project by a tech-savvy consumer has exposed a sweeping security failure in Chinese-made robot vacuums, revealing that thousands of devices worldwide could be quietly feeding detailed maps, video, and audio from inside private homes to anyone with the right access.
According to Breitbart, the incident centers on DJI Romo robot vacuums, after a user unintentionally gained control over more than 6,700 internet-connected units while trying to make his own vacuum work with a PlayStation controller, as reported by Tom's Hardware. The vulnerability granted access not only to remote control functions but also to highly sensitive data, including detailed floor plans, live camera feeds, and microphone audio, raising fresh concerns about the national-security and privacy implications of Chinese-made smart devices operating inside Western homes.
The discovery was made by Sammy Adoufal, an AI strategist who used Claude Code to reverse engineer the communication protocol between his personal DJI Romo vacuum and the company's servers. His stated goal was modest and entirely local: he simply wanted to enable control of his own device using a PlayStation controller, but the process instead yielded credentials that opened the door to thousands of machines deployed across multiple continents.
Adoufal stressed that he did not engage in what most people would recognize as hacking or cyber intrusion. "I didn't infringe any rules, I didn't bypass, I didn't crack, brute force, whatever," Adoufal said, explaining that he merely extracted the private token from his own Romo vacuum, which unexpectedly granted access to live servers in the United States, Europe, and China.
Some security researchers have speculated that the ease of this access suggests the presence of a deliberate backdoor built into the system by the Chinese manufacturer, potentially enabling covert surveillance. While such suspicions are difficult to prove definitively, the fact that a lone user, working on a personal project, could stumble into global control of thousands of devices underscores how fragile consumer privacy becomes when foreign-made, cloud-connected hardware is allowed unfettered access to homes.
Once he realized the scale and seriousness of the flaw, Adoufal chose not to exploit the access and instead immediately notified DJI of the vulnerability. The company responded by rolling out several updates that addressed the primary issue without requiring any action from end users, pushing fixes to secure affected devices and block further unauthorized access.
Yet Adoufal has warned that the core problem has not been fully resolved and that additional security concerns remain. Among them is the ability to stream video feeds from DJI Romo devices without a security PIN, as well as another serious but undisclosed vulnerability that he has chosen not to detail publicly for safety reasons.
He also noted that the issue is not limited to encryption during transmission between devices and servers. According to his findings, all data collected by the robot vacuums is stored in plain text on the servers, making it trivially readable to anyone who manages to gain server access, a practice that would be unthinkable under stricter, privacy-focused standards but is disturbingly common in poorly regulated smart home ecosystems.
Breitbart News has previously documented the dangers posed by internet-connected appliances such as robot vacuums, which often trade away privacy in the name of convenience and AI training. One woman was horrified when a picture of her sitting on the toilet was posted to Facebook by foreign gig workers labeling images for iRobot, maker of the popular Roomba robot vacuum, a stark example of how intimate moments can be harvested and exposed without consent.
An investigation by the MIT Technology Review revealed that gig workers in Venezuela were asked to label items in photographs of home interiors taken by the Roomba vacuum, some of which included people with visible faces. The workers subsequently posted at least 15 images to social media groups, including pictures of a child and of a woman using the restroom, and the incident is believed not to be isolated, as labelers reportedly receive routine access to private photos, videos, and audio.
In response to that scandal, iRobot terminated its agreement with one of its data annotation partners, Scale AI, attempting to distance itself from the misuse of customers' images. However, iRobot CEO Colin Angle stated in a LinkedIn post that making such images available was necessary for training the company's object-recognition algorithms, dismissing concerns about human gig workers seeing test users' images and faces, a justification that highlights the broader clash between Big Tech's data-hungry AI ambitions and the basic expectation of privacy inside one's own home.