A Cornell University-led team developed modular robots that can perceive their surroundings, make decisions, and autonomously assume different shapes in order to perform various tasks.
“This is the first time modular robots have been demonstrated with autonomous reconfiguration and behavior that is perception-driven,” said Hadas Kress-Gazit, associate professor in the Sibley School of Mechanical and Aerospace Engineering and principal investigator on the project. “We are creating a modular system that is able to do different tasks autonomously. By changing the high-level task, it totally changes its behavior.”
The results of this research were published in Science Robotics.
The robots consist of wheeled, cube-shaped modules that can detach and reattach to form new shapes with different capabilities. The modules, developed by researchers at the University of Pennsylvania, have magnets to attach to each other, and Wi-Fi to communicate with a centralized system.
These interchangeable modules are connected to a sensor module, which is equipped with multiple cameras and a small computer for collecting and processing data about its surroundings. The robot’s software includes a high-level planner to direct its actions and reconfiguration, as well as perception algorithms that can map, navigate, and classify the environment.
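The architecture described above amounts to a sense-plan-act loop: the sensor module reports what it sees, and the high-level planner chooses a shape and behavior in response. The toy sketch below illustrates that flow under stated assumptions; every function name and the miniature "environment" are hypothetical placeholders, not the team's actual software.

```python
# Toy sense-plan-act sketch of the architecture described above.
# All names here (perceive, plan, the environment dict) are
# illustrative assumptions, not the Cornell team's real API.

def perceive(environment):
    """Stand-in for the camera/mapping pipeline: report what is visible."""
    return {"objects_seen": environment["objects"]}

def plan(task, percept, current_shape):
    """Stand-in for the high-level planner: pick a shape and behavior."""
    if task == "retrieve" and percept["objects_seen"]:
        shape = "Proboscis"   # long front arm, e.g. for narrow spaces
        behavior = "pickUp"
    else:
        shape = "Car"
        behavior = "drive"
    reconfigure = shape != current_shape
    return shape, behavior, reconfigure

environment = {"objects": ["pink block"]}
shape, behavior, reconfigure = plan("retrieve", perceive(environment), "Car")
print(shape, behavior, reconfigure)  # Proboscis pickUp True
```

The key design point the article emphasizes is that the operator supplies only the task; the shape change is a decision the planner makes, not an instruction it receives.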
In earlier work, the researchers created an open-source online tool where users could create, simulate, and test designs for robot configurations and behaviors. They populated the library by hosting design competitions and inviting students to invent and test different shapes.
The library now consists of 57 possible robot configurations, such as Proboscis (with a long arm in front), Scorpion (modules arranged in perpendicular lines, with a horizontal row in front), and Snake (modules in a single line), and 97 behaviors, such as pickUp, highReach, drive, or drop. Once the robot is given a task, its high-level planner searches the library for shapes and behaviors that meet the current needs.
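One way to picture that library search: each configuration offers a set of behaviors, and the planner keeps only the configurations that cover everything the task needs. The sketch below is a minimal illustration; the configuration and behavior names come from the article, but the data structure and matching logic are assumptions, not the published system.

```python
# Hypothetical sketch of the library lookup described above. The
# configuration names (Car, Proboscis, Scorpion, Snake) and behavior
# names (drive, pickUp, drop) appear in the article; which behaviors
# each configuration actually supports is an illustrative guess.

LIBRARY = {
    "Car":       {"drive", "drop"},
    "Proboscis": {"drive", "pickUp"},
    "Scorpion":  {"drive", "pickUp", "drop"},
    "Snake":     {"drive"},
}

def configurations_for(required_behaviors):
    """Return the configurations offering every required behavior."""
    return [name for name, behaviors in LIBRARY.items()
            if required_behaviors <= behaviors]  # subset test

# A retrieval task needing both driving and picking up might yield:
print(configurations_for({"drive", "pickUp"}))  # ['Proboscis', 'Scorpion']
```

In the real system the search spans 57 configurations and 97 behaviors, and the planner also weighs the perceived environment, but the selection principle is the same.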
Other modular robot systems have successfully performed specific tasks in controlled environments, but these robots are the first to demonstrate fully autonomous behavior and reconfigurations based on the task and an unfamiliar environment, Kress-Gazit said.
“I want to tell the robot what it should be doing, what its goals are, but not how it should be doing it,” she said. “I don’t actually prescribe, ‘Move to the left, change your shape.’ All these decisions are made autonomously by the robot.”
The team proved the effectiveness of its system with three experiments. In the first, a robot was instructed to find, retrieve, and deliver all pink and green objects to a designated zone marked with a blue square on the wall. The robot used the “Car” configuration to explore, then reshaped itself into “Proboscis” to retrieve a pink object from a narrow pathway, and finally returned to the “Car” configuration to deliver its haul.
The modular, shape-shifting robot created by a team led by Cornell University.
In the second experiment, the robot was charged with placing a circuit board in a mailbox marked with pink tape at the top of a set of stairs. In the third, it was instructed to place a postage stamp high on the box—essentially the same task, but requiring different behaviors in different environments.
Researchers found the hardware and low-level software were most prone to error. The second experiment, for instance, took 24 attempts before succeeding, with the stairs posing a particular challenge. If such issues are resolved, robots like these could be used for any job that requires maneuvering in changing terrain, such as cleaning up after an earthquake or other natural disaster, in which a robot might need to enter cracks and crevices in buildings, said Kress-Gazit.
“Modular robots in general are just fascinating systems, because you’re not restricted by one shape, so there’s a lot of flexibility,” she said. “The hardware is still in research stages, but if we had commercial modular robots, they would be very useful for anything where the environment changes significantly and the robot should adapt to its environment as well.”
The paper was co-authored with Mark Campbell, the John A. Mellowes ’60 Professor of Mechanical Engineering; mechanical engineering doctoral students Jonathan Daudelin and Gangyuan Jing; and Professor Mark Yim and doctoral student Tarik Tosun of the University of Pennsylvania.
The work was funded by the National Science Foundation.