Hello There!
A little about MADLAB.CC
- MADLAB.CC is headed by Madeline Gannon, a Researcher / Designer / Educator at Carnegie Mellon University.
- We are a design collective exploring computational approaches to design, craft, and interaction.
- The work below explores the intersection of design, robotics, computer science, and human-computer interaction.
- We are especially passionate about developing future interfaces for fabrication machines, like 3D printers and industrial robots.
See a full list of publications here.
Works
REVERBERATING ACROSS THE DIVIDE
See full project brief here.
Reverb is a fabrication-aware design system that captures a user’s mid-air gestures to 3D-model around 3D-scanned physical contexts. The system turns a designer’s expressive, gestural input into intricate, aesthetically striking, wearable forms that are easy to design, print, and place back into context.
TAKEAWAYS
Reverb illustrates how we can use expressive gesture in combination with purposeful design by embedding physical constraints, machine constraints, and design aesthetics into an intuitive virtual modeling environment.
We can generate intricate, ready-to-fabricate and ready-to-wear digital geometry by:
- Embedding intelligence into 3D scans of the body so they actively assist the ergonomics of created forms.
- Encoding material and machine constraints directly into digital geometry (see the sketch after this list).
- Using animate, interactive geometry to balance precise control with expressive gesture.
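To make the second point concrete, here is a minimal sketch of what encoding machine constraints into geometry can look like: a strut of printed geometry is clamped to a printer's minimum feature size and overhang limit as the user sculpts it. The class, names, and numbers are illustrative assumptions, not Reverb's actual code.

```java
// Hypothetical sketch: machine constraints baked into the geometry itself.
// The minimum feature size and overhang limit are illustrative values.
public class PrintableStrut {
    static final double MIN_RADIUS_MM = 0.6;   // e.g. nozzle-limited feature size
    static final double MAX_OVERHANG_DEG = 45; // unsupported-overhang limit

    double radiusMm;    // strut thickness the user is sculpting
    double overhangDeg; // angle of the strut axis from vertical

    /** Clamp the strut so it stays printable no matter how it is gestured. */
    void enforceMachineConstraints() {
        radiusMm = Math.max(radiusMm, MIN_RADIUS_MM);
        overhangDeg = Math.min(overhangDeg, MAX_OVERHANG_DEG);
    }
}
```

Because the constraint lives inside the geometry rather than in a post-hoc check, every form the gesture produces is already printable.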
TACTUM: SKIN-CENTRIC DESIGN & FABRICATION
See full project brief here.
Tactum is an augmented modeling tool that lets you design 3D printed wearables directly on your body. It uses depth sensing and projection mapping to detect and display touch gestures on the skin. A person can simply touch, poke, rub, or pinch the geometry projected onto their arm to customize ready-to-print, ready-to-wear forms.
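As a rough illustration of the sensing step (not Tactum's actual code), the sketch below registers a touch wherever the live depth frame sits a few millimeters above a baseline scan of the arm. The thresholds and names are assumptions.

```java
// Hypothetical depth-based touch detection, in the spirit of Tactum's sensing.
// A fingertip pressing on the skin shows up as a small positive height above
// the baseline arm scan; anything higher is a hover, not a touch.
public class TouchDetector {
    static final int TOUCH_MIN_MM = 3;   // just above the skin surface
    static final int TOUCH_MAX_MM = 25;  // beyond this, treat as hovering

    /** Returns a mask of pixels where a fingertip is pressing near the skin. */
    static boolean[] detectTouch(int[] liveDepthMm, int[] baselineArmMm) {
        boolean[] touching = new boolean[liveDepthMm.length];
        for (int i = 0; i < liveDepthMm.length; i++) {
            int heightAboveSkin = baselineArmMm[i] - liveDepthMm[i];
            touching[i] = heightAboveSkin >= TOUCH_MIN_MM
                       && heightAboveSkin <= TOUCH_MAX_MM;
        }
        return touching;
    }
}
```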
Tactum was created in collaboration with Autodesk Research and supported in part by Carnegie Mellon University's Studio for Creative Inquiry.
Robo.Op: Opening industrial robots for creative use
See full project brief here.
In general, industrial robots are programmed through static 'off-line' methods: you either (a) jog the robot with a joystick and record points in space, or (b) hard-code coordinates and commands into a baked file. Although this is adequate for repetitive manufacturing tasks, it is extremely limiting for creative robotics. The trial-and-error nature of off-line programming does not facilitate fluid interaction between designer, robot, and environment, and it becomes particularly time-consuming and tedious once tasks grow complex.
Robo.Op is an open hardware and software project that makes customizing your industrial robot for 'on-line' live control easier and cheaper. The toolkit is made up of a modular physical prototyping platform, a simplified software interface, and a centralized hub for sharing knowledge, tools, and code.
It simplifies an industrial robot’s machine language into easy-to-understand commands. For example, a ‘Move’ command written in RAPID, the proprietary language for ABB robots, reduces to a single readable call in the Robo.Op framework, as sketched below.
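Here is that contrast in miniature. The commented RAPID line is a genuine MoveL instruction; the Robo.Op-style call beneath it is hypothetical, with class and method names chosen for illustration rather than taken from the toolkit's exact API.

```java
// A minimal sketch of the contrast described above.
public class MoveExample {
    public static void main(String[] args) {
        // RAPID (ABB's proprietary language), inline robtarget form:
        //   MoveL [[300,0,500],[1,0,0,0],[0,0,0,0],[9E9,9E9,9E9,9E9,9E9,9E9]],
        //         v100, fine, tool0;

        // The same intent in a simplified, Robo.Op-style interface:
        Robot robot = new Robot();
        robot.moveTo(300, 0, 500); // x, y, z in mm
    }
}

// Stand-in for the toolkit's robot interface, so the sketch compiles.
class Robot {
    void moveTo(double x, double y, double z) {
        System.out.printf("move to (%.0f, %.0f, %.0f)%n", x, y, z);
    }
}
```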
It also facilitates multiple modes of communication between its programming interface and an industrial robot. Integrating TCP/IP, Serial, and OSC communication into the framework extends industrial robotics to a number of previously unconnected software and hardware platforms.
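For a flavor of what 'on-line' control over one of these streams can look like, the sketch below sends a plain-text move command to a hypothetical server on the robot controller over TCP/IP. The host, port, and message format are illustrative assumptions.

```java
// Hypothetical 'on-line' control over TCP/IP: the design environment streams
// commands to a server assumed to be running on the robot controller.
import java.io.PrintWriter;
import java.net.Socket;

public class RobotLink {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("192.168.125.1", 5000); // controller IP
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true)) {
            // Stream a simple, plain-text move command to the robot-side server.
            out.println("MOVETO 300 0 500");
        }
    }
}
```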
TAKEAWAYS
Robo.Op makes advanced programming of industrial robots accessible to a wider audience. It makes these closed automation systems open for interaction and real-time control by:
- Simplifying arcane machine language into more legible, accessible commands.
- Separating a robot program from its proprietary machine language, enabling easier integration with sensors and actuators.
- Extending industrial robotics to a number of previously unconnected software and hardware platforms through its multiple communication streams.
Interactive industrial robotics
ROBO•MASSEUSE
Robo•Masseuse adapts an ABB IRB140 into a back massager! We created a smart end-effector – embedded with pressure sensors – that lets a person lean into the robot to control a massage routine. A project from Madeline Gannon & Zack Jacobson-Weaver.
The robot knows which direction a user is leaning by comparing the values streaming from the triangulated pressure sensors. The Arduino-enabled end-effector then updates the 'massage routine' to reorient and press harder or softer into the user's back.
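A minimal sketch of that comparison step, assuming three sensors at the corners of a triangle (the layout and scaling are illustrative, not the project's actual firmware): the pressure-weighted average of the sensor directions gives the lean direction, and the total pressure can drive how hard the robot presses back.

```java
// Hypothetical lean detection from three triangulated pressure sensors.
public class LeanDetector {
    // Unit vectors from the pad's center to each sensor.
    static final double[][] SENSOR_DIRS = {
        { 0.0,   1.0 },   // top sensor
        {-0.87, -0.5 },   // lower-left sensor
        { 0.87, -0.5 }    // lower-right sensor
    };

    /** Pressure-weighted average of the sensor directions = lean vector. */
    static double[] leanDirection(double[] pressures) {
        double x = 0, y = 0, total = 0;
        for (int i = 0; i < pressures.length; i++) {
            x += SENSOR_DIRS[i][0] * pressures[i];
            y += SENSOR_DIRS[i][1] * pressures[i];
            total += pressures[i];
        }
        // Normalize by total pressure; total pressure itself can map to
        // how hard the massage routine presses back.
        return total > 0 ? new double[]{x / total, y / total}
                         : new double[]{0, 0};
    }
}
```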
A thick silicone head softens the direct pressure from the robot onto the back, providing comfortable nodules for the massage.
More Recent Developments
INTERACTIVE LIGHT TRACING
Covered by Prosthetic Knowledge
This concludes your tour of madlab.cc!
Any questions?
info@madlab.cc