The next step forward in robotics is dexterity


While drones and driverless cars occupy the headlines, it is robot dexterity that will likely have an even bigger impact on business and everyday life.

“Robot manipulation is the next shoe to drop,” says Robert Platt, computer science professor and head of the Helping Hands robotics lab at Northeastern. “Imagine a robot that can do things with its hands in the real world—anything from defusing a bomb to doing your laundry. This has been a dream in the research community for decades, but now we’re finally getting to the point where it could actually happen.”

Recent progress in robot perception, machine learning and big data has brought us to the threshold of a huge leap in robots’ ability to perform fine motor tasks and act in uncontrolled environments, says Platt. The difference is between robots that can perform repetitive tasks in a structured lab environment and a new era of humanoid robots that can do significant work in the real world.

Why motor skills lag behind

There is an irony in robotics and artificial intelligence known as Moravec’s paradox: what is difficult for humans is relatively easy for robots, and what is easy for humans is almost impossible for robots.

We can program a robot with the computational capability to defeat an international chess champion, but we struggle to give it the dexterity of a 2-year-old child. Identifying and grasping a pencil in a random pile of stationery is nearly impossible for a robot, and asking one to open a door and walk into a room, as demonstrated at a recent international robotics competition, can produce comic results.

Humans have evolved their visual, sensory and motor skills over millions of years; these complex abilities are so deeply rooted in human circuitry that we perform them unconsciously. In contrast, endeavors such as mathematics, science and financial analysis are recent developments for humans, and they are far easier for engineers to replicate.

Despite the enormous challenge, Platt argues that autonomous robots are ready to make a big jump in their ability to manipulate unfamiliar objects.

For example, Platt and his team at the Helping Hands lab trained a robot to find, grasp and remove unfamiliar objects from a disordered pile with 93 percent accuracy. Achieving this required significant advances in machine learning, perception and control.

The researchers used a technique called reinforcement learning, in which the robot learns through trial and error. They created a simulated world where the robot could practice picking up and manipulating objects in virtual reality. When the robot did what the researchers wanted, grabbing an object from the pile, it was given a reward. This technique allows the robot to develop skills in a virtual environment and then apply them in the real world.
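The trial-and-error loop described above can be sketched in miniature with tabular Q-learning, one standard form of reinforcement learning. Everything below is a hypothetical toy, not the lab’s actual setup: a gripper moves along five bins, one bin holds an object, and the “grasp” action earns a reward only when the gripper is over that bin.

```python
import random

random.seed(0)  # reproducible toy run

# Toy world (invented for illustration): 5 bins, object in bin 3.
N_BINS, OBJECT_BIN = 5, 3
ACTIONS = ["left", "right", "grasp"]

def step(state, action):
    """One environment step: returns (next_state, reward, episode_done)."""
    if action == "left":
        return max(state - 1, 0), 0.0, False
    if action == "right":
        return min(state + 1, N_BINS - 1), 0.0, False
    # "grasp" ends the episode; it pays off only over the object
    return state, (1.0 if state == OBJECT_BIN else -0.1), True

def train(episodes=2000, alpha=0.5, gamma=0.9, eps=0.2):
    """Tabular Q-learning with an epsilon-greedy exploration policy."""
    q = {(s, a): 0.0 for s in range(N_BINS) for a in ACTIONS}
    for _ in range(episodes):
        s = random.randrange(N_BINS)
        for _ in range(20):  # cap episode length
            if random.random() < eps:
                a = random.choice(ACTIONS)        # explore
            else:
                a = max(ACTIONS, key=lambda act: q[(s, act)])  # exploit
            s2, r, done = step(s, a)
            target = r if done else r + gamma * max(q[(s2, b)] for b in ACTIONS)
            q[(s, a)] += alpha * (target - q[(s, a)])
            s = s2
            if done:
                break
    return q

q = train()
# After training, the greedy action over the object bin is "grasp".
print(max(ACTIONS, key=lambda a: q[(OBJECT_BIN, a)]))  # prints: grasp
```

The same reward-driven update rule, scaled up with neural networks and a physics simulator in place of this five-bin world, is what lets a simulated robot carry learned grasping skills into reality.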

Advances in depth perception were also essential to enabling robots to work in uncontrolled environments. Previously, robots saw the world as a flat field of seemingly random colors. With 3D perception, they can pick out individual objects in a cluttered scene.
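A minimal, hypothetical sketch (not the lab’s perception pipeline) shows why depth matters: objects that blend together in a color image stand out in a depth image as regions closer than the table, and a simple flood fill can separate them into distinct objects.

```python
# Invented example: a tiny depth image, values in arbitrary units.
# The table sits at depth 9; anything closer is part of an object.
BACKGROUND = 9

depth = [
    [9, 9, 9, 9, 9, 9],
    [9, 5, 5, 9, 9, 9],
    [9, 5, 5, 9, 4, 9],
    [9, 9, 9, 9, 4, 9],
    [9, 9, 9, 9, 9, 9],
]

def count_objects(depth):
    """Count connected regions of pixels closer than the background."""
    rows, cols = len(depth), len(depth[0])
    seen, objects = set(), 0
    for r in range(rows):
        for c in range(cols):
            if depth[r][c] < BACKGROUND and (r, c) not in seen:
                objects += 1
                stack = [(r, c)]  # flood-fill one object's pixels
                while stack:
                    y, x = stack.pop()
                    if (y, x) in seen or not (0 <= y < rows and 0 <= x < cols):
                        continue
                    if depth[y][x] >= BACKGROUND:
                        continue
                    seen.add((y, x))
                    stack += [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
    return objects

print(count_objects(depth))  # prints: 2
```

In a flat color image the same two objects might be indistinguishable from the table or from each other; the depth channel makes the segmentation almost trivial.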

While vision is a great tool for guiding gross movements, fine motor skills require a sense of touch.

“Think of what you can do with gloves on,” explains Platt. “You can open the garage door, grab a shovel, and clear the driveway. But if you need to unlock the garage first, you need to take your gloves off to insert the key.”

As part of a grant from NASA, Platt’s laboratory has recently built a robot hand with tactile sensors and developed new algorithms to interpret the data.

“In order to insert a key into a lock, the robot needs to know exactly how it’s holding the key, down to the millimeter,” says Platt. “Our algorithms can localize these kinds of grasped objects very accurately.”
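As a toy illustration of tactile localization, consider a strip of pressure sensors along a fingertip: a grasped key presses on neighboring sensors, and the pressure-weighted centroid pins down its position more finely than the sensor spacing. The spacing and readings below are invented for the example; Platt’s actual algorithms are far more sophisticated.

```python
# Hypothetical fingertip: pressure sensors spaced 5 mm apart.
SPACING_MM = 5.0
pressures = [0.0, 0.1, 0.8, 0.6, 0.05, 0.0]  # simulated readings per sensor

def localize(pressures, spacing_mm):
    """Estimate the contact position (mm) as a pressure-weighted centroid."""
    total = sum(pressures)
    return sum(i * spacing_mm * p for i, p in enumerate(pressures)) / total

print(round(localize(pressures, SPACING_MM), 2))  # prints: 11.94
```

Even with sensors 5 mm apart, the weighted average resolves the contact to a fraction of a millimeter, which hints at how tactile data can support tasks like fitting a key into a lock.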

Platt’s laboratory demonstrated these new capabilities by grasping a USB connector and inserting it into a port. While this may not seem like a big result, it is a critical step toward robots that can perform precise manipulation tasks like changing the battery in a mobile phone.

What’s next?

As with any nascent technology, such as radar, the telephone or the internet, the practical applications of robot dexterity are hard to predict. Here are a few possibilities:


Platt’s Helping Hands lab, in collaboration with the University of Massachusetts and the Crotched Mountain rehabilitation facility in New Hampshire, is building a wheelchair with a robotic arm that can fetch objects around the house or perform simple household tasks. This could allow elderly people and people with disabilities to continue living independently in their own homes.

Platt is also interested in adapting this technology for everyday use.

“We hear a lot about the Alexa-style assistants that can answer questions by accessing the internet. But these assistants can’t do anything physical,” says Platt. “We want to equip these devices with a robotic body so you can say, ‘Alexa, get the newspaper,’ or ‘Alexa, clean up Jimmy’s room.’”


Engineering professor Hanumant Singh, in collaboration with Platt, is building a mobile, golf-cart-sized robot equipped with a robotic arm that can move around Northeastern’s campus independently and perform simple manipulation tasks such as taking out the trash.


This type of robot could take on similar duties in conflict zones and be used for dangerous operations such as defusing land mines. For example, Platt and his group recently received a grant from the Office of Naval Research to develop fundamental manipulation technologies for use aboard warships.

Hazardous waste

Engineering professor Taskin Padir and his team received a grant from the Department of Energy to adapt NASA’s Valkyrie robot for hazardous waste disposal. There are more than a dozen sites across the United States where radioactive waste was buried in tunnels during the Cold War. For autonomous robots to locate, retrieve and store this waste in secure containers, they will need fine motor skills and the ability to work in unfamiliar surroundings.


Funded by a grant from the National Science Foundation, engineering professor Peter Whitney is working with researchers at Stanford University to create a robot that can perform MRI-guided surgery.

Space exploration

Platt is working with NASA researchers to develop robotic capabilities for handling soft objects on future NASA missions.

“Robots that work flawlessly in the lab break down quickly when they’re placed in unfamiliar situations,” says Platt. “Our goal is to develop the underlying algorithms that will allow them to be more reliable in the real world. Ultimately, this will fundamentally change the way we think about robots, allowing them to become partners with humans rather than just machines that work in far away factories.”