Algorithm Uses Evolution To Design Robots

Imagine you’re running a race. To complete it, your body needs to be strong, and your brain needs to keep track of the route, control your pace, and prevent you from tripping.

The same is true for robots. To accomplish tasks, they need both a well-designed body and a “brain,” or controller. Engineers can use various simulations to improve a robot’s control and make it smarter. But there are few ways to optimize a robot’s design at the same time.

Unless the designer is an algorithm.

Thanks to advances in computing, it’s finally possible to write software programs that optimize both design and control simultaneously, an approach known as co-design. Though there are established platforms to optimize control or design, most co-design researchers have had to devise their own testing platforms, and these are usually very computationally intensive and time-consuming.

To help solve this problem, Jagdeep Bhatia, an undergraduate researcher at MIT, and his colleagues created a 2D soft-robotics co-design simulation system called Evolution Gym. They presented the system at this year’s Conference on Neural Information Processing Systems and detailed it in a new paper.

“Basically, we tried to make like a really simple and fast simulator,” said Bhatia, the lead author of the paper. “And on top of that, we built like a bunch of tasks for these robots to do.”

In Evolution Gym, 2D soft robots are made up of colored cells, or voxels. Different colors represent different types of simple components: either soft or rigid material, and either horizontal or vertical actuators. The result is robots that are patchworks of colored squares, moving through video game-like environments. Because the simulation is 2D and simply designed, it doesn’t need much computational power.
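For illustration, a voxel robot of this kind can be thought of as a small grid of integers, one label per cell. This is only a sketch: the specific labels and the 5×5 grid size are assumptions for the example, not necessarily Evolution Gym’s exact encoding.

```python
# Hypothetical encoding of a 2D voxel robot: each integer labels a cell type.
# The labels and grid size here are illustrative assumptions.
EMPTY, RIGID, SOFT, H_ACT, V_ACT = 0, 1, 2, 3, 4

robot = [
    [H_ACT, H_ACT, H_ACT, H_ACT, H_ACT],
    [RIGID, SOFT,  SOFT,  SOFT,  RIGID],
    [RIGID, SOFT,  EMPTY, SOFT,  RIGID],
    [V_ACT, EMPTY, EMPTY, EMPTY, V_ACT],
    [V_ACT, EMPTY, EMPTY, EMPTY, V_ACT],
]

# A design can only move if it actually contains actuator cells.
has_actuators = any(cell in (H_ACT, V_ACT) for row in robot for cell in row)
```

A design algorithm would then mutate or regenerate entries in such a grid to propose new robot bodies.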

As the name suggests, the researchers structured the system to mimic the biological process of evolution. Rather than generate individual robots, it generates populations of robots with slightly different designs. The system uses bi-level optimization: an outer loop and an inner loop. The outer loop handles design optimization: the system generates a number of different designs for a given task, such as walking, jumping, climbing, or catching things. The inner loop handles control optimization.


“It’ll take each of those designs, it’ll optimize the controller for it in Evolution Gym on a particular task,” said Bhatia. “And then it’ll return a score back for each of those designs back to the design optimization algorithm and say, this is how well the robot performed with the optimal controller.”

In this way, the system generates multiple generations of robots based on a task-specific “reward” score, keeping elements that maintain and increase this reward. The researchers developed more than 30 tasks for the robots to attempt to perform, rated easy, medium, or hard.
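The bi-level loop described above can be sketched in a few lines. Everything here is illustrative: the function names, the toy scoring, and the mutation scheme are assumptions, not Evolution Gym’s actual API, and the inner loop is a stand-in for real control optimization.

```python
import random

def optimize_controller(design, task):
    """Inner loop stand-in: real control optimization (e.g. reinforcement
    learning) would go here. Returns a toy 'reward' score."""
    return sum(design) + random.random()

def co_design(task, population_size=10, generations=5, survival=0.5):
    # Outer loop: evolve a population of candidate designs
    # (each design is a toy list of 5 component labels).
    population = [[random.randint(0, 3) for _ in range(5)]
                  for _ in range(population_size)]
    for _ in range(generations):
        # Score each design by how well it performs with its controller.
        scored = sorted(population,
                        key=lambda d: optimize_controller(d, task),
                        reverse=True)
        survivors = scored[:max(1, int(survival * population_size))]
        # Refill the population with mutated copies of survivors.
        children = []
        while len(survivors) + len(children) < population_size:
            parent = random.choice(survivors)
            children.append([v if random.random() < 0.8
                             else random.randint(0, 3) for v in parent])
        population = survivors + children
    return population[0]  # best design from the final generation

best = co_design("walking")
```

The key structural point matches the article: the design optimizer never sees the simulation directly; it only sees the reward each design earned with its optimized controller.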

“If your task is walking, in this case, you would like the robot to move as fast as possible within the amount of time,” said Wojciech Matusik, a professor of electrical engineering and computer science at MIT and senior author of the paper.

The researchers found that the system was highly effective for many of the tasks, and that the algorithm-designed robots worked better than human-designed ones. The system came up with designs that humans never could, generating complex patchworks of materials and actuators that were highly effective. The system also independently came up with some animal-like designs, though it had no previous knowledge of animals or biology.

On the other hand, no robot design could effectively complete the hardest tasks, such as lifting and catching items. There could be a number of reasons for this, including that the populations the program selected to evolve were not diverse enough, said Wolfgang Fink, an associate professor of engineering at the University of Arizona who was not involved in the project.

“Diversity is the key,” he said. “If you don’t have the diversity, then you get rapidly nice successes, but you level off very likely sub-optimally.” In the MIT researchers’ most effective algorithm, the percentage of robots that “survived” each generation started at 60 percent and decreased gradually toward zero over time.
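A decaying survival fraction like the one described above could be sketched as a simple linear schedule. The exact schedule the MIT researchers used isn’t specified here, so this is only an illustration of the idea.

```python
# Hypothetical linear schedule: the fraction of designs kept each
# generation decays from 60 percent toward zero over the run.
def survival_rate(generation, total_generations, start=0.6, end=0.0):
    frac = generation / max(1, total_generations - 1)
    return start + (end - start) * frac

rates = [survival_rate(g, 10) for g in range(10)]
```

Keeping many designs early preserves diversity; culling harder later concentrates effort on the best lineages.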

Evolution Gym’s simplistic, 2D designs also do not lend themselves well to being adapted into real-life robots. Nevertheless, Bhatia hopes that Evolution Gym can be a resource for researchers and can enable them to develop new and exciting algorithms for co-design. The program is open-source and free to use.

“I think you can still gain a lot of valuable insights from using Evolution Gym and proposing new algorithms and creating new algorithms within it,” he said.