Researchers create robot capable of cleaning your room and doing your laundry
Researchers found that the robot “successfully puts away 85 per cent of objects in real-world test scenarios”
A group of researchers has created a new robot that can help clean your room and do your laundry.
The researchers, who are based at several universities including Princeton and Columbia, have unveiled a new robot fit for everyday cleaning: TidyBot. After it’s given specific commands, the device can effectively pick items up from the floor and place them where they belong, per the team’s paper.
Before using TidyBot, the researchers turned to a “text-based benchmark dataset” where people wrote out certain commands. They then asked large language models (LLMs), specifically GPT-3, to follow these instructions.
Some commands that people input were: “yellow shirts go in the drawer, dark shirts go in the closets, and white socks go in the drawer”. From there, the LLMs summarised these specific examples, which became: “light-coloured clothes go in the drawer and dark-coloured clothes go in the closet”.
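The summarisation step described above can be sketched as a prompt-construction routine. This is a minimal, hypothetical sketch in Python; the function name and prompt wording are illustrative, not TidyBot’s actual code:

```python
def build_summarisation_prompt(examples):
    """Format a user's example placements into a few-shot prompt
    asking an LLM (such as GPT-3) to infer a general placement rule."""
    lines = [f"- {obj} go in the {place}" for obj, place in examples]
    return (
        "Summarise the following object placements into one general rule:\n"
        + "\n".join(lines)
        + "\nRule:"
    )

# Example placements like those the researchers collected from users.
examples = [
    ("yellow shirts", "drawer"),
    ("dark shirts", "closet"),
    ("white socks", "drawer"),
]
prompt = build_summarisation_prompt(examples)
print(prompt)
```

The LLM’s completion of the `Rule:` line would then be the generalised preference (e.g. “light-coloured clothes go in the drawer”).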
There were a total of 24 scenarios spread across four different rooms. During each scenario, there were two to five potential places where objects could be put. To help researchers gauge the LLMs’ memorisation skills and ability to follow commands, objects were also labelled as seen or unseen. The success of the approach was defined by how many objects were placed in the correct spots.
In the benchmark dataset, the work with the LLMs ultimately led to a “generalised rule for where objects belong, for a specific user”.
This strategy was also applied to laundry, as researchers gave the LLMs commands like “put clothes in the laundry basket”.
Researchers also noted that the underlying evidence found that LLMs “are a good match” for the “generalised requirements” of personal robots.
The results showed the approach achieved an “accuracy of 91.2 percent on unseen objects throughout all scenarios”, as noted in the paper.
The researchers went on to test this approach with TidyBot and found that it “successfully puts away 85 per cent of objects in real-world test scenarios”.
The paper specified that before TidyBot starts cleaning, users have to “provide a handful of example placements for specific objects”. The tasks would then be passed on to and summarised by the LLMs.
From there, “the robot will then carry out the cleanup task by repeatedly picking up objects, identifying them, and moving them to their target” spots.
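The pick-identify-place loop the paper describes can be illustrated with a toy rule table. The rule set and helper names below are hypothetical stand-ins, assuming a simple mapping from an inferred object category to its target spot:

```python
# Toy preference rules, as might be summarised by the LLM from user examples.
RULES = {
    "light-coloured clothes": "drawer",
    "dark-coloured clothes": "closet",
}

def categorise(obj):
    """Stand-in for TidyBot's visual identification step."""
    return "dark-coloured clothes" if "dark" in obj else "light-coloured clothes"

def tidy(objects):
    """Repeatedly 'pick up' each object, identify it, and record
    where it should be moved according to the inferred rules."""
    placements = {}
    for obj in objects:
        placements[obj] = RULES[categorise(obj)]
    return placements

print(tidy(["dark shirt", "white sock"]))
# {'dark shirt': 'closet', 'white sock': 'drawer'}
```

In the real system, the lookup would be replaced by perception and motion planning; the sketch only shows the control flow of the cleanup loop.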
On TidyBot’s official website, there are multiple videos of the robot in action as it sorts through clothes on the floor and puts them in laundry bins. Another video shows the machine sorting out trash before putting it into garbage and recycling bins.
The website also noted that there’s still work to be done on TidyBot in order for it to cater to each person’s strategies when cleaning.
“A key challenge is determining the proper place to put each object, as people’s preferences can vary greatly depending on personal taste or cultural background,” the site reads. “For instance, one person may prefer storing shirts in the drawer, while another may prefer them on the shelf. We aim to build systems that can learn such preferences from just a handful of examples via prior interactions with a particular person.”
“We show that robots can combine language-based planning and perception with the few-shot summarization capabilities of large language models (LLMs) to infer generalised user preferences that are broadly applicable to future interactions,” the website added.