Soft robots learn to grip tools with the right force

Over the past 50 years, robots have become very effective at working in tightly controlled spaces such as car assembly lines. But the world is not a predictable assembly line. Interacting with the myriad objects and environments beyond the factory gates is trivial for humans, yet extremely difficult for robots.

Roboticists are working toward machines that can perform a wide variety of tasks the way a human can. Toward that goal, researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) have teamed up with a group from the Toyota Research Institute. They designed a system that can grasp tools and apply the appropriate force for a given task, such as scraping up liquid or writing a word with a pen.

Called Series Elastic End Effectors (SEED), the system uses soft bubble grippers with built-in cameras to sense how the grippers deform as they apply force to a tool. It works in all six degrees of freedom: the object can be moved left and right, up and down, and back and forth, as well as rolled, pitched, and yawed. The bubble grippers use a learned model to exert precisely the right amount of force for a tool's correct use, and a closed-loop controller uses this visuotactile feedback to adjust the position of the robot arm until the desired force is achieved.
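The series-elastic idea behind that closed loop can be sketched in a few lines. In the sketch below, the names, the gain, and the single-step update are illustrative assumptions, not the authors' controller: the wrench (force plus torque) on the tool is inferred from how much the compliant gripper deforms, and the arm is nudged in the direction that shrinks the wrench error.

```python
import numpy as np

def force_control_step(stiffness, tool_pose, gripper_pose, desired_wrench, gain=0.1):
    """One hypothetical step of series-elastic force control in 6D.

    stiffness:      (6, 6) stiffness matrix of the soft gripper
    tool_pose:      (6,) measured tool pose (x, y, z, roll, pitch, yaw)
    gripper_pose:   (6,) commanded gripper/arm pose
    desired_wrench: (6,) desired force/torque on the tool
    """
    # Series-elastic principle: the applied wrench is read off the
    # deformation of the compliant element times its stiffness.
    deformation = gripper_pose - tool_pose
    measured_wrench = stiffness @ deformation

    # Move the arm so as to reduce the wrench error.
    wrench_error = desired_wrench - measured_wrench
    correction = gain * np.linalg.solve(stiffness, wrench_error)
    return gripper_pose + correction, measured_wrench
```

Run in a loop, this drives the measured wrench toward the commanded one; the compliant element is what makes force readable from geometry in the first place.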

With SEED, every measurement the robot senses is a live 3D image of the grippers, allowing real-time tracking of how the grippers change shape around an object. The researchers used these images to reconstruct the tool's pose, and the robot used a learned model to map the tool's pose to the measured force.
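One standard way to recover a held object's pose from 3D imagery like this is to align a reference point cloud of the undeformed scene with the current one via the Kabsch algorithm. The sketch below is a generic illustration of that idea, not the SEED pipeline:

```python
import numpy as np

def estimate_tool_pose(ref_points, cur_points):
    """Kabsch alignment: find rotation R and translation t such that
    cur_points ≈ ref_points @ R.T + t, given (N, 3) corresponding points."""
    ref_c = ref_points - ref_points.mean(axis=0)
    cur_c = cur_points - cur_points.mean(axis=0)
    # Cross-covariance of the centered clouds, then SVD.
    H = ref_c.T @ cur_c
    U, S, Vt = np.linalg.svd(H)
    # Guard against a reflection (determinant -1) solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cur_points.mean(axis=0) - R @ ref_points.mean(axis=0)
    return R, t
```

Given point correspondences, this yields the rigid transform of the tool in closed form, which a downstream model can then map to force.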

The learned model is obtained from the robot's previous experience, in which it perturbs the grippers against a force-torque sensor to determine their stiffness. Once the robot senses a force, it compares it with the force the user commands, the researchers say, and then moves in the direction that reduces the error, all in 6D space. The research team implemented the system on a robot arm to put it to the test in a series of trials.
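That calibration step amounts to fitting a stiffness model from recorded (deformation, wrench) pairs. A minimal sketch, assuming a linear model and a least-squares fit (the actual procedure in the paper may differ):

```python
import numpy as np

def fit_stiffness(deformations, wrenches):
    """Fit a linear 6D stiffness model wrench ≈ K @ deformation.

    deformations: (N, 6) gripper deformations recorded while perturbing
                  the gripper against a force-torque sensor
    wrenches:     (N, 6) corresponding measured forces/torques
    Returns the (6, 6) stiffness matrix K.
    """
    # Least squares solves deformations @ K.T ≈ wrenches for K.T.
    K_T, *_ = np.linalg.lstsq(deformations, wrenches, rcond=None)
    return K_T.T
```

With enough perturbations spanning all six axes, the fit recovers the gripper's stiffness, which the controller then uses to translate deformation into force.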

SEED applied just the right amount of force to wipe liquid off a flat surface in a squeegee task. The robot also successfully wrote "MIT" with a pen and applied the right amount of torque to drive in a screw. While SEED knew it had to command a given force or torque for each task, there is an upper limit to the force it can exert: if it grips too hard, the object inevitably slips.

The system currently assumes a very specific tool geometry (the tool must be cylindrical), and there are still many limitations on how it generalizes to new types of shapes. The researchers are now working on generalizing the framework to different shapes so that it can handle arbitrary tools in the wild.

Journal reference:

  1. H.J. Terry Suh, Naveen Kuppuswamy, Tao Pang, Paul Mitiguy, Alex Alspach, Russ Tedrake. SEED: Series Elastic End Effectors in 6D for Visuotactile Tool Use. arXiv, 2022; DOI: 10.48550/arXiv.2111.01376
