Quipt

Taming Industrial Robots

 

Quipt is a gesture-based control software that facilitates new, more intuitive ways to communicate with industrial robots. Using wearable markers and a motion capture system, Quipt gives industrial robots basic spatial behaviors for interacting closely with people. Wearable markers on the hand, around the neck, or elsewhere on the body let a robot see and respond to you in a shared space. This lets you and the robot safely follow, mirror, and avoid one another as you collaborate.

See how Quipt was developed step by step in the Instructable: Instructables.com/id/How-to-Tame-Your-Robot/


Quipt augments an ABB IRB 6700 industrial robot by giving it eyes into its environment. Using a Vicon motion capture system, Quipt receives streaming motion capture data and reformats it into corresponding movement commands for the 6700. Movement commands are generated using our open-source library, Robo.Op (see it on GitHub). Quipt also visualizes debugging data in an Android app, so a human collaborator has a mobile, continuous view of what the robot is seeing.
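
As a rough sketch of that flow (illustrative Python only – Robo.Op itself is a Java/Processing library, and every name and threshold below is a hypothetical stand-in), the software reads marker positions off the stream, filters out jitter, and forwards only targets the controller can keep up with:

```python
import time

# Illustrative sketch of Quipt's data flow: mocap frames in, move commands out.
# stream_mocap_frames() and send_move_command() are hypothetical stand-ins for
# the Vicon data stream and a Robo.Op-style command to the ABB controller.

def stream_mocap_frames():
    """Simulated Vicon stream: marker positions in mocap coordinates (mm)."""
    for frame in [(850.0, 120.0, 1400.0), (855.0, 118.0, 1402.0), (900.0, 90.0, 1380.0)]:
        yield frame
        time.sleep(0.01)  # real mocap systems stream at roughly 100-250 Hz

def send_move_command(x, y, z):
    """Stand-in for streaming a Cartesian target to the IRB 6700."""
    print(f"MOVE -> x={x:.1f} y={y:.1f} z={z:.1f}")

MIN_STEP_MM = 5.0  # ignore sub-threshold jitter so the controller isn't flooded

last = None
for x, y, z in stream_mocap_frames():
    if last is None or max(abs(a - b) for a, b in zip((x, y, z), last)) >= MIN_STEP_MM:
        send_move_command(x, y, z)
        last = (x, y, z)
```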

Moving out of the factory

Industrial robots are truly incredible CNC machines – not just for their speed, power, and precision, but also for their adaptability. Unlike other CNC machines, when you put a tool on the end of the robot, you completely transform what it can do: put a sprayer on it, and it becomes a painting robot; put a gripper on it, and it becomes a material handling robot; put a welder on it, and it becomes a spot welding robot.

This adaptability has made the industrial robot a key piece of infrastructure for factory automation over the past 50 years. But despite their adaptability, industrial robots are fairly dumb machines: they have little-to-no awareness of the environment outside of their programmed tasks. This is one of the main reasons why industrial robots have thrived only in highly controlled environments, like factories. They need places where unpredictable objects (a.k.a. people) are strictly separated from their work zones.  

But an industrial robot's adaptability is useful beyond the factory. Putting a film camera onto an industrial robot gives a director precise, complex, and repeatable camera moves. Putting a loader onto an industrial robot gives a construction worker a way to move heavier quantities of materials. Putting a light onto an industrial robot gives a photographer more precise control of a scene's ambiance. While these are somewhat mundane use cases, they tease out some of the biggest challenges for bringing industrial robots outside of the factory: because they are blind to the world, they are very dangerous to use; and because they require highly technical skill to program, they are very difficult to use.

On a standard factory line, industrial robots are programmed to repeat a single task 24 hours a day, 7 days a week. People are strictly separated from the robot work zone.

As these machines move out of the factory and into live, ever-changing environments, we'll need them to see and understand how we are occupying the same shared space.

Quipt works towards mitigating these two technical challenges, and demonstrates a way for industrial robots to be safer and easier to use in uncontrolled settings.  Instead of programming a robot through code, it models human-robot communication on body language. It gives an industrial robot spatial awareness and spatial behaviors for interacting closely with people. This lets a person's intuitive pointing, posturing, and movements guide the robot's actions. 

Motion Capture

Quipt uses motion capture to track a person working with the robot. Passive markers made from retroreflective tape are worn on the body and are given a global position and orientation by the mocap system's tracking software. Quipt parses the streaming mocap data into a size and format the robot can handle. Aligning the world coordinates of the motion capture system with the world coordinates of the robot gives an accurate reference frame for letting the robot 'see' what the mocap system senses.
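
One common way to express that alignment is a rigid transform measured during calibration. A minimal numpy sketch, with a made-up calibration matrix standing in for the real one:

```python
import numpy as np

# Mocap-to-robot alignment expressed as a 4x4 rigid transform (rotation +
# translation). This matrix is a made-up example – a 90-degree rotation
# about Z plus an offset – not Quipt's actual calibration.
T_ROBOT_FROM_MOCAP = np.array([
    [0.0, -1.0, 0.0, 1200.0],
    [1.0,  0.0, 0.0, -450.0],
    [0.0,  0.0, 1.0,    0.0],
    [0.0,  0.0, 0.0,    1.0],
])

def mocap_to_robot(p_mocap):
    """Map a marker position (mm) from mocap world coordinates into the robot's base frame."""
    p = np.append(np.asarray(p_mocap, dtype=float), 1.0)  # homogeneous coordinates
    return (T_ROBOT_FROM_MOCAP @ p)[:3]

print(mocap_to_robot([850.0, 120.0, 1400.0]))  # the marker as the robot 'sees' it
```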

Spatial Behaviors

With motion capture data integrated and the two coordinate systems aligned, the industrial robot now has an awareness of where a person is in space. At this point, Quipt can tell the robot how the person is moving and how it should move in response. Quipt uses three primitive spatial behaviors to guide the robot's movements: follow, mirror, and avoid. These three movement modes are the basic components of how two people interact with one another while working together in a shared space. Giving these behaviors to the robot provides the human counterpart with an intuitive understanding of the robot's movements and where it is going next. 

In follow mode, the robot moves to a point offset along the normal of the worn marker.

In mirror mode, the robot moves and reorients itself to 'look' where the marker is facing.

In avoid mode, the robot maintains a minimum distance from the marker by moving up and away.
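
Each behavior reduces to a simple target calculation in the shared robot frame. A sketch of all three, with illustrative distances and numpy arrays standing in for the real parameters:

```python
import numpy as np

# Sketches of the three spatial behaviors as target calculations in the shared
# robot frame. All names and distances are illustrative, not Quipt's actual
# parameters; inputs are numpy arrays in mm.

FOLLOW_OFFSET_MM = 400.0   # stand-off distance along the marker's normal
AVOID_RADIUS_MM  = 600.0   # minimum allowed distance between tool and marker

def follow_target(marker_pos, marker_normal):
    """Follow: hold a point offset along the normal of the worn marker."""
    n = marker_normal / np.linalg.norm(marker_normal)
    return marker_pos + FOLLOW_OFFSET_MM * n

def mirror_aim(marker_pos, marker_normal, tool_pos):
    """Mirror: return the direction the tool should 'look' – toward the
    spot the marker is facing (a point projected out along its normal)."""
    aim = (marker_pos + 1000.0 * marker_normal) - tool_pos
    return aim / np.linalg.norm(aim)

def avoid_target(marker_pos, tool_pos):
    """Avoid: retreat up and away whenever the marker gets too close."""
    away = tool_pos - marker_pos
    dist = np.linalg.norm(away)
    if dist >= AVOID_RADIUS_MM:
        return tool_pos                                  # already safe: hold position
    if dist < 1e-6:                                      # degenerate overlap: go straight up
        return marker_pos + np.array([0.0, 0.0, AVOID_RADIUS_MM])
    direction = away / dist + np.array([0.0, 0.0, 0.5])  # bias the retreat upward
    direction /= np.linalg.norm(direction)
    return marker_pos + AVOID_RADIUS_MM * direction

marker = np.array([800.0, 0.0, 1100.0])
normal = np.array([0.0, -1.0, 0.0])
tool   = np.array([700.0, -300.0, 1300.0])
print(follow_target(marker, normal), avoid_target(marker, tool))
```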

The three debugging screens above show different ways the robot responds as a person moves into its work zone. The active work zone (highlighted in blue) is an added safety measure: when a person steps outside the work zone, no movement commands are sent to the robot.
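
A sketch of that gate (the zone bounds below are made up, not Quipt's actual zone): targets only stream while the tracked marker is inside the zone, so the robot simply holds its last pose otherwise.

```python
# Illustrative work-zone gate in the robot base frame (mm).
ZONE_MIN = (-1500.0, -1500.0, 0.0)
ZONE_MAX = ( 1500.0,  1500.0, 2500.0)

def in_work_zone(p):
    return all(lo <= v <= hi for v, lo, hi in zip(p, ZONE_MIN, ZONE_MAX))

def maybe_send(target, marker_pos, send):
    # Safety gate: if the person steps outside the zone, no command is sent
    # and the robot holds its last pose.
    if in_work_zone(marker_pos):
        send(*target)

maybe_send((500.0, 0.0, 1200.0), (300.0, 200.0, 1600.0),
           send=lambda x, y, z: print(f"MOVE -> {x} {y} {z}"))
```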

Taking the 'industrial' out of Industrial Robots

We are excited for a future where industrial robots continue to move out of industrial settings – these new settings bring design challenges that have yet to be explored in traditional automation. Automation that removes the human entirely from the equation is reaching a point of diminishing returns. The next step is to create ways for these machines to augment our abilities, not replace them. Reimagining the interfaces that connect us to an industrial robot not only changes how we use these machines, but also has the potential to transform what we do with robotic arms.



Sponsors

Quipt was developed at the Applied Research Laboratory at Autodesk Pier 9.