Dactyl – OpenAI’s Robot Hand Trained Itself Without Any Human Assistance (Video + Research Paper)

Pranav Dar 07 May, 2019 • 2 min read

Overview

  • OpenAI has designed a system called Dactyl that powers a robot hand to manipulate and reorient objects
  • Dactyl trains itself entirely in simulation, without any human demonstrations or assistance
  • It uses the same reinforcement learning techniques that power the popular OpenAI Five system

 

Introduction

Picking up an object and examining it may be a trivial task for humans, but don’t tell a machine that! Teaching a computer to detect objects, pick them up and manipulate them has turned out to be way harder than anybody had initially imagined. What a toddler a few months old can do is something that takes years of training for a machine to learn (that’s just one simple example of why we are nowhere near general artificial intelligence).

Robot hands have become a favourite showcase for machine learning researchers. And OpenAI, always at the cutting edge of AI research, has trained a robot hand that can manipulate objects with mind-boggling dexterity. The system, which OpenAI is calling Dactyl, has been trained entirely using round after round of simulations. Dactyl learns to do tasks from scratch using the same reinforcement learning techniques that power the popular OpenAI Five system.
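To make “learning from round after round of simulations” a little more concrete, here is a deliberately tiny, hypothetical sketch of trial-and-error learning in a toy simulator. Dactyl’s real training uses large-scale PPO (the algorithm behind OpenAI Five) with a neural-network policy and a physics simulation of the hand; the one-parameter policy and crude random search below are illustrative stand-ins only, meant to show that no human labels or demonstrations are involved:

```python
# A tiny, hypothetical sketch of "learning from simulation alone".
# NOTE: this is NOT Dactyl's code. OpenAI trains a neural-network policy
# with PPO across many simulations; a crude random search over a
# one-parameter policy stands in here purely for illustration.
import numpy as np

rng = np.random.default_rng(0)

def simulated_return(gain, episodes=10, steps=25):
    """Average reward of the policy `action = gain * error` in a toy simulator.

    The 'task' is a 1-D stand-in for reorientation: drive the angular error
    between the object's current and target orientation toward zero.
    """
    total = 0.0
    for _ in range(episodes):
        error = rng.uniform(-np.pi, np.pi)      # random starting orientation
        for _ in range(steps):
            action = gain * error               # policy: rotate toward the goal
            error -= action                     # toy 'physics' of the hand
            total += -abs(error)                # reward: closeness to the goal
    return total / episodes

# Trial and error: keep whichever policy parameters score best in simulation.
gain, best = 0.0, simulated_return(0.0)
for _ in range(300):                            # "round after round of simulations"
    candidate = gain + rng.normal(0.0, 0.2)
    score = simulated_return(candidate)
    if score > best:
        gain, best = candidate, score

print(f"learned gain ~ {gain:.2f} (a gain near 1.0 cancels the error in one step)")
```

Running the snippet, the policy parameter drifts toward a value that wipes out the orientation error, found purely by scoring candidate behaviours in simulation, which is the core idea behind training Dactyl without human input.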

The task OpenAI’s researchers gave Dactyl was to reorient a given object (such as a lettered block) in its palm so that a newly requested orientation is reached each time. Three cameras monitor the hand and the object, while the position and movement of the fingertips are tracked in real time. As more and more simulations were run, Dactyl converged on human-like manipulation strategies to achieve the desired results. Again, none of this was labelled or taught; it emerged purely from the simulations.
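The write-up above doesn’t spell out how “a new orientation is reached” gets verified, so here is a small, assumed formalisation: treat each goal as a target orientation, take the block’s current orientation as estimated from the cameras, and declare success when the rotation between the two (expressed as unit quaternions) falls below a small angle threshold. The 0.4-radian threshold and the function names are illustrative assumptions, not quoted from the paper:

```python
# Hypothetical success check for the reorientation task: compare the
# estimated and target orientations of the block as unit quaternions.
import numpy as np

def rotation_angle(q_current, q_target):
    """Smallest rotation angle (radians) taking q_current to q_target.

    Both arguments are unit quaternions in (w, x, y, z) order.
    """
    q_current = np.asarray(q_current, dtype=float)
    q_target = np.asarray(q_target, dtype=float)
    dot = abs(np.dot(q_current, q_target))        # |cos(half-angle)|, sign-invariant
    return 2.0 * np.arccos(np.clip(dot, 0.0, 1.0))

def goal_reached(q_current, q_target, threshold_rad=0.4):
    """True once the block is within ~23 degrees of the requested orientation."""
    return rotation_angle(q_current, q_target) < threshold_rad

# Example: the block is still rotated 90 degrees about z relative to the goal.
q_goal = np.array([1.0, 0.0, 0.0, 0.0])                             # identity orientation
q_now = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])  # 90 deg about z
print(goal_reached(q_now, q_goal))   # False: still ~1.57 rad away from the goal
```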

The image below, posted by OpenAI, shows how they built the system:

OpenAI’s blog post and research paper cover Dactyl in more technical detail. You can also check out the video below to see the robot hand in action:

 

Our take on this

This may seem like niche research at first glance, but it might be a first step towards general AI. Sure, we have seen plenty of robot hands before, but what makes Dactyl different is that it isn’t programmed to perform any single task. Place any object in that hand, and it will learn by itself how to change its orientation.

This goes to show that robots can learn human-like behaviour. It will take a lot more experimentation and research to perfect this and make it useful in practical scenarios, but the first stepping stone has been laid.

 

Subscribe to AVBytes here to get regular data science, machine learning and AI updates in your inbox!

 


