Hello There!

 

Hello Future Collaborators – Welcome to madlab.cc! Below is a snapshot of what's going on here. We've put together a smattering of relevant projects, academic research, and some works-in-progress. Links to full project briefs are provided in each synopsis ... Enjoy!

A little about MADLAB.CC

  • MADLAB.CC is headed by Madeline Gannon, a Researcher / Designer / Educator at Carnegie Mellon University.
  • We are a design collective exploring computational approaches to design, craft, and interaction.
  • The work below explores the intersection of design, robotics, computer science, and human-computer interaction.
  • We are especially passionate about developing future interfaces for fabrication machines, like 3D printers and industrial robots.  

 

See a full list of publications here.


Works


REVERBERATING ACROSS THE DIVIDE

See full project brief here.

Reverberating Across the Divide reconnects digital and physical contexts through a custom chronomorphologic modeling environment. The modeling interface uses a three-phase workflow (3D scanning, 3D modeling, and 3D printing) to enable a designer to craft intricate digital geometries around pre-existing physical contexts.

Chronomorphology, like its nineteenth-century counterpart chronophotography, is a composite recording of an object's movement. Instead of a photograph, however, the recording medium here is a full three-dimensional model of the object: a virtual creature simulated within a digital environment. This virtual creature exists as a 3D printable module; it is constructed as a closed mesh, with a spring skeleton that prevents self-intersections. The composite chronomorphologic model (of the virtual creature over time) retains these printable properties at each time-step. Therefore, no matter how intricate or complex, the digital geometry will always export as a valid, 3D printable mesh.

The chronomorphologic modeling environment facilitates the rapid generation of baroque and expressive spatial forms that both respond to and expand on existing physical contexts. By mediating 3D scanning and 3D printing through the modeling environment, the designer has a streamlined workflow for oscillating between virtual and analog environments. This ease of moving between digital design and physical production provides a framework for rapidly exploring how subtle changes in the virtual environment, physical environment, or designer's gestures can create dynamic variation in the formal, material, and spatial qualities of a generated design.
This project was supported in part by funding from the Carnegie Mellon University Frank-Ratchye Fund For Art @ the Frontier (http://studioforcreativeinquiry.org). Music: "Portofino" by Teengirl Fantasy (http://teengirlfantasy.angelfire.com).
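The printable-mesh guarantee above hinges on the spring skeleton: nodes repel one another whenever they drift closer than a minimum separation, so the composite geometry never self-intersects or drops below a printable feature size. A minimal sketch of that idea (a toy 2D relaxation step with assumed parameters, not Reverb's actual solver):

```python
import math

def relax(points, rest_length=1.0, stiffness=0.1):
    """One spring-relaxation step over a toy 2D point set: any pair of
    nodes closer than rest_length is pushed apart along the line between
    them. Illustrative only; the layout and constants are assumptions,
    not the project's actual solver."""
    moved = [list(p) for p in points]
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            dx = points[j][0] - points[i][0]
            dy = points[j][1] - points[i][1]
            d = math.hypot(dx, dy)
            if 0 < d < rest_length:
                # Push both nodes apart, proportional to the violation.
                push = stiffness * (rest_length - d) / d
                moved[i][0] -= dx * push
                moved[i][1] -= dy * push
                moved[j][0] += dx * push
                moved[j][1] += dy * push
    return moved
```

Iterating a step like this until no pair violates the minimum separation is one way to keep generated geometry above a printer's minimum wall thickness at every time-step.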

Reverb is a fabrication-aware design system that captures a user's mid-air gestures to 3D model around 3D scanned physical contexts. This system captures a designer's expressive, gestural input to generate intricate, aesthetically striking, wearable forms that are easy to design, print, and place back into context.

TAKEAWAYS

Reverb illustrates how we can use expressive gesture in combination with purposeful design by embedding physical constraints, machine constraints, and design aesthetics into an intuitive virtual modeling environment. 

We can generate intricate, ready-to-fabricate and ready-to-wear digital geometry by:

  1. Embedding intelligence into 3D scans of the body so they actively assist the ergonomics of created forms.
  2. Encoding material and machine constraints directly into digital geometry.
  3. Using animate, interactive geometry to balance precise control with expressive gesture.

Tactum: Skin-centric design & fabrication

See full project brief here.

Tactum is an augmented modeling tool that lets you design 3D printed wearables directly on your body. It uses depth sensing and projection mapping to detect and display touch gestures on the skin. A person can simply touch, poke, rub, or pinch the geometry projected onto their arm to customize ready-to-print, ready-to-wear forms.

Technical Details: In its current iteration, Tactum uses computer vision and projection mapping to detect interactions with the body. Tracking and gesture recognition are done with a Leap Motion Controller, and visual feedback is projected onto the forearm using a Casio XJA251 projector. We extract a user's natural gestures (gestures that don't require specific training) to drive a body-based 3D modeling environment. A person can touch, poke, rub, pinch, grab, and twist the digital geometry projected onto their body.

Fabrication & Ergonomic Fit: Since this base geometry is generated from 3D data of the arm, any design created through Tactum is inherently built to fit each individual user's body. Additionally, technical 3D printing constraints are embedded within the geometry; this means that no matter how much you manipulate the digital geometry, every design generated through Tactum is guaranteed to be 3D printable.

For more information, see the full project page at madlab.cc/tactum. Tactum was developed in collaboration with Autodesk Research (http://www.autodeskresearch.com/), and with support from the Frank-Ratchye STUDIO for Creative Inquiry at Carnegie Mellon University (http://studioforcreativeinquiry.org/). Music by Broke For Free (https://soundcloud.com/broke-for-free/bonobo-recurring-remix).


Tactum was created in collaboration with Autodesk Research and supported in part by Carnegie Mellon University's Frank-Ratchye STUDIO for Creative Inquiry.


Robo.Op: Opening industrial robots for creative use

See full project brief here.

Robo.Op is an open hardware / open software platform for hacking industrial robots (IRs). Robo.Op makes it cheaper and easier to customize your IR for creative use, so you can explore the fringes of industrial robotics. The toolkit is made up of a modular physical prototyping platform, a simplified software interface, and a centralized hub for sharing knowledge, tools, and code.

Robo.Op lets artists and designers with access to an industrial robot:

  1. Develop affordable custom tools for creative robotics.
  2. Share tools and knowledge across robot platforms.
  3. Bypass the expensive, proprietary software and hardware currently offered by robotics companies.

Find out more at http://peopleplusrobots.github.io/robo-op and http://madlab.cc/robo-op. Robo.Op was started by Madeline Gannon (http://madlab.cc), Zack Jacobson-Weaver (http://enartdezark.blogspot.com), and Mauricio Contreras (http://www.andrew.cmu.edu/user/mchernan).

In general, industrial robots are programmed through static 'off-line' methods: you either (a) jog the robot with a joystick and record points in space as you move it, or (b) hard-code coordinates and commands into a pre-baked file. Although this is adequate for repetitive manufacturing tasks, it is extremely limiting for creative robotics. The trial-and-error nature of off-line programming does not facilitate fluid interaction between designer, robot, and environment, and it becomes particularly time-consuming and tedious once tasks grow complex.

Robo.Op is an open hardware and software project that makes customizing your IR for 'on-line' live control easier and cheaper. The toolkit is made up of a modular physical prototyping platform, a simplified software interface, and a centralized hub for sharing knowledge, tools, and code.

It simplifies an industrial robot's machine language into easy-to-understand commands. For example, a 'Move' command written in RAPID, the proprietary language for ABB robots, collapses to a single equivalent command in the Robo.Op framework.
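As a rough sketch of that wrapping idea (illustrative only: the function name is hypothetical, the robtarget is abbreviated, and Robo.Op's actual API and command names may differ), a simple move call can generate the verbose RAPID instruction on the user's behalf:

```python
def move(x, y, z, speed=100):
    """Format a simplified move command as the RAPID-style MoveL
    instruction it stands in for. The robtarget is abbreviated for
    readability; a real RAPID MoveL also carries configuration and
    external-axis data. This is a sketch, not Robo.Op's actual API."""
    return f"MoveL [[{x},{y},{z}],[1,0,0,0]], v{speed}, fine, tool0;"

# The user writes a plain call; the framework emits the machine language.
print(move(600, 200, 800))
```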

It also facilitates multiple modes of communication between its programming interface and an industrial robot. Integrating TCP/IP, Serial, and OSC communication into the framework extends industrial robotics to a number of previously unconnected software and hardware platforms.
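A minimal sketch of the TCP/IP stream described above (the port, listener, and command string here are placeholders, not Robo.Op's actual protocol): a client pushes a command to a robot-side listener over a socket.

```python
import socket
import threading

def start_listener():
    """Stand-in for a robot-side TCP listener; accepts one connection
    and records the command it receives. Real robot controllers differ."""
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))  # OS-assigned port for this demo
    srv.listen(1)
    received = []

    def run():
        conn, _ = srv.accept()
        received.append(conn.recv(1024).decode())
        conn.close()
        srv.close()

    t = threading.Thread(target=run)
    t.start()
    return srv.getsockname()[1], received, t

def send_command(port, command):
    """Stream one command string to the listener over TCP."""
    with socket.create_connection(("127.0.0.1", port)) as s:
        s.sendall(command.encode())

port, received, t = start_listener()
send_command(port, "move 600 200 800")
t.join()
```

The same pattern generalizes to Serial or OSC transports: the programming interface stays the same, and only the stream carrying the commands changes.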

 

Takeaways

Robo.Op makes advanced programming of industrial robots accessible to a wider audience. It makes these closed automation systems open for interaction and real-time control by:

  1. Simplifying arcane machine language into more legible, accessible commands.
  2. Separating a robot program from its proprietary machine language, enabling easier integration with sensors and actuators.
  3. Extending industrial robotics to a number of previously unconnected software and hardware platforms through its multiple communication streams.

 

Interactive industrial robotics

ROBO•MASSEUSE

Robo•Masseuse adapts an ABB IRB140 into a back massager! We created a smart end-effector – embedded with pressure sensors – that lets a person lean into the robot to control a massage routine. A project from Madeline Gannon & Zack Jacobson-Weaver.

The robot knows which direction a user is leaning by comparing values streaming from the triangulated pressure sensors. The Arduino-enabled end effector then updates the 'massage routine' to reorient and press harder or softer into the user's back.
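One simple way to sketch that comparison (the pad layout and math here are assumptions for illustration, not the project's actual firmware): treat the three readings as weights on the pads' positions and take the pressure-weighted centroid as the lean direction.

```python
import math

# Hypothetical layout: three pressure pads at the corners of an
# equilateral triangle on the end-effector (unit circle, 2D).
SENSOR_POSITIONS = [
    (0.0, 1.0),                 # top pad
    (-math.sqrt(3) / 2, -0.5),  # lower-left pad
    (math.sqrt(3) / 2, -0.5),   # lower-right pad
]

def lean_direction(readings):
    """Estimate lean direction as the pressure-weighted centroid of the
    pad positions, returned as a unit vector. An even load (user centered)
    yields (0, 0). A sketch, not the Robo.Masseuse firmware."""
    total = sum(readings)
    x = sum(r * px for r, (px, _) in zip(readings, SENSOR_POSITIONS)) / total
    y = sum(r * py for r, (_, py) in zip(readings, SENSOR_POSITIONS)) / total
    norm = math.hypot(x, y)
    if norm < 1e-9:
        return (0.0, 0.0)
    return (x / norm, y / norm)
```

Pressing mostly on the top pad, for example, produces a direction pointing toward that pad, which the massage routine can then follow.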

A thick silicone head softens the direct pressure from the robot onto the back, providing comfortable nodules for the massage.

More recent developments

INTERACTIVE LIGHT TRACING

Showing a few different ways to use Android & Twitter to interact with an industrial robot. Example code and template files will soon be uploaded to http://peopleplusrobots.github.io/robo-op/. Learn more about how we can push industrial robots away from automation and towards interaction at www.madlab.cc/robo-op.

 

This concludes your tour of madlab.cc!

Any questions?

info@madlab.cc