Engineering Team Develops Gripper Robot That Doesn't Need Training
News

Researchers installed the soft robotic gripper on a Fetch Robotics robot in their lab. Credit: University of California San Diego

How many robots does it take to screw in a light bulb? The answer: just one, assuming you’re talking about a new robotic gripper developed by engineers at the University of California San Diego.

The engineering team has designed and built a gripper that can pick up and manipulate objects without needing to see them and without prior training. The gripper is unique because it brings together three different capabilities: it can twist objects, it can sense objects, and it can build models of the objects it's manipulating. This allows the gripper to operate in low-light, low-visibility conditions, for example.

The engineering team, led by Michael T. Tolley, a roboticist at the Jacobs School of Engineering at UC San Diego, presented the gripper at the International Conference on Intelligent Robots and Systems (or IROS) Sept. 24 to 28 in Vancouver, Canada.

Researchers tested the gripper on an industrial Fetch Robotics robot and demonstrated that it could pick up, manipulate and model a wide range of objects, from lightbulbs to screwdrivers.

“We designed the device to mimic what happens when you reach into your pocket and feel for your keys,” said Tolley.

The gripper has three fingers. Each finger is made of three soft flexible pneumatic chambers, which move when air pressure is applied. This gives the gripper more than one degree of freedom, so it can actually manipulate the objects it’s holding. For example, the gripper can turn screwdrivers, screw in lightbulbs and even hold pieces of paper, thanks to this design.

In addition, each finger is covered with a smart, sensing skin. The skin is made of silicone rubber, where sensors made of conducting carbon nanotubes are embedded. The sheets of rubber are then rolled up, sealed and slipped onto the flexible fingers to cover them like skin.

The conductivity of the nanotubes changes as the fingers flex, which allows the sensing skin to record and detect when the fingers are moving and coming into contact with an object. The data the sensors generate are transmitted to a control board, which puts the information together to create a 3D model of the object the gripper is manipulating.  It’s a process similar to a CT scan, where 2D image slices add up to a 3D picture.
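The idea of turning resistance changes into a shape estimate can be illustrated with a minimal sketch. This is not the team's actual code; the baseline resistance, gauge factor, and grid sizes below are assumptions chosen for illustration. It shows the two steps the paragraph describes: inferring strain from a conductivity (resistance) change, and stacking 2D contact "slices" into a coarse 3D occupancy model, loosely analogous to assembling CT slices.

```python
# Hedged sketch: flex sensing + slice-stacking reconstruction.
# R0 and GAUGE_FACTOR are illustrative assumptions, not measured values.
import numpy as np

R0 = 100.0          # assumed baseline sensor resistance (ohms)
GAUGE_FACTOR = 2.0  # assumed sensitivity: (dR/R0) per unit strain

def strain_from_resistance(r):
    """Infer strain from a sensor reading, using dR/R0 = GF * strain."""
    return (r - R0) / (R0 * GAUGE_FACTOR)

def occupancy_model(slices):
    """Stack 2D boolean contact maps (one per height step) into a 3D grid."""
    return np.stack(slices, axis=0)

# Example: three 4x4 contact maps recorded as the fingers close on an object
slices = [np.zeros((4, 4), dtype=bool) for _ in range(3)]
slices[1][1:3, 1:3] = True  # contact detected at mid-height
model = occupancy_model(slices)

print(model.shape)                    # (3, 4, 4)
print(strain_from_resistance(104.0))  # 0.02
```

A real system would fuse many such readings over time and calibrate each sensor individually, but the slice-stacking structure is the same.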

The breakthroughs were possible because of the team’s diverse expertise and their experience in the fields of soft robotics and manufacturing, Tolley said.

Next steps include adding machine learning and artificial intelligence to data processing so that the gripper will actually be able to identify the objects it’s manipulating, rather than just model them. Researchers also are investigating using 3D printing for the gripper’s fingers to make them more durable.

This article has been republished from materials provided by the University of California San Diego. Note: material may have been edited for length and content. For further information, please contact the cited source.
