Robo.Op: SITREP!

Time for a Robo.Op situation report ... A brand new Getting Started Tutorial is now live on GitHub.

Over the past few weeks, we've been making a big push towards creating some fun demos as example projects for the software side of Robo.Op. So far we've:

  • Created an Android app that moves the robot around in a 2D plane (a minimal command-streaming sketch follows this list).
  • Created an Android app that sends what you draw on the device to the robot, so it can redraw it much, much bigger! We're also using a webcam to capture, track, and visualize its movements.
  • Connected Twitter to the robot, so you can tweet at @tweetbotIRL and the robot will draw out your message (this also uses a webcam to track and draw out the robot's movements).
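
Under the hood, Robo.Op bridges prototyping tools and the robot controller over a network connection. The sketch below is a minimal illustration of that idea, not Robo.Op's actual API: the host address, port, and plain-text "MOVE x y" message format are hypothetical stand-ins.

```java
import java.io.PrintWriter;
import java.net.Socket;

// Minimal sketch of streaming move commands to a robot hub over TCP.
// The address, port, and "MOVE x y" format are hypothetical stand-ins,
// not Robo.Op's actual protocol.
public class MoveDemo {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("192.168.125.1", 5005); // robot controller (assumed address)
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true)) {
            // Trace a small square in the robot's 2D work plane (mm).
            int[][] path = {{0, 0}, {100, 0}, {100, 100}, {0, 100}, {0, 0}};
            for (int[] p : path) {
                out.println("MOVE " + p[0] + " " + p[1]); // one command per target point
            }
        }
    }
}
```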

These demos show a few different ways to use Android and Twitter to control an industrial robot, and how a webcam can capture and trace its movements. Example code and template files will soon be uploaded to http://peopleplusrobots.github.io/robo-op/. Learn more about how we can push industrial robots away from automation and towards interaction at www.madlab.cc/robo-op. Music: https://soundcloud.com/podingtonbear/dust-in-sunlight

Libraries used with Robo.Op: Ketai (for p5 to Android), twitter4j (for Twitter to p5), and OpenCV for Processing (for point tracking).
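
As a sketch of how the Twitter side can work: twitter4j's streaming API can filter for mentions of @tweetbotIRL and hand each tweet's text off for drawing. The sendToRobot() helper below is a hypothetical placeholder for the Robo.Op call, and OAuth credentials are assumed to live in a twitter4j.properties file.

```java
import twitter4j.FilterQuery;
import twitter4j.Status;
import twitter4j.StatusAdapter;
import twitter4j.TwitterStream;
import twitter4j.TwitterStreamFactory;

// Listens for tweets mentioning @tweetbotIRL and forwards their text to the
// robot. sendToRobot() is a hypothetical stand-in for the Robo.Op call.
public class TweetListener {
    public static void main(String[] args) {
        // Reads OAuth credentials from twitter4j.properties on the classpath.
        TwitterStream stream = new TwitterStreamFactory().getInstance();
        stream.addListener(new StatusAdapter() {
            @Override
            public void onStatus(Status status) {
                System.out.println("@" + status.getUser().getScreenName() + ": " + status.getText());
                sendToRobot(status.getText()); // hypothetical: queue the message for drawing
            }
        });
        // Track only tweets that mention the bot.
        stream.filter(new FilterQuery().track("@tweetbotIRL"));
    }

    static void sendToRobot(String message) {
        // Placeholder: convert the text to toolpaths and stream them to the robot.
    }
}
```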

Some more serious, in-depth testing below:

Check out this post from Prosthetic Knowledge for a few more (much nicer) GIFs!

For Robo.Op's full project brief, click here.

Open Hardware Summit 2014 – Rome

We were honored to be invited to speak at this year's event, and delighted to meet so many inspiring people making the world a better place through Open Hardware. The OHS will be posting video of the talks soon, but you can also read a summary in this recap from Making Society.

We also had a wonderful conversation with Alicia Soria of Postdigital Node on how robotics can blur boundaries between analog and digital domains:

We also want to give a final special thanks to Maker Faire Rome for sponsoring the summit; the organizers of this year's event (Addie Wagenknecht, Simone Cicero, and Nahid Alam); and the Ada Initiative, an organization supporting women in open tech and culture, for granting Madeline one of this year's Ada Lovelace Fellowships.

From Industrial Robotics to Creative Robotics

Industrial robots (IRs) are expensive, closed, proprietary machines. Traditionally, these specialized CNC machines were limited to industrial & manufacturing applications. They are extremely useful in scenarios where it is safer, cheaper, faster, or more efficient to use a robot instead of a human: not only can IRs lift heavy equipment, handle dangerous materials, and move with extreme precision at rapid speeds, they can also work 24/7 with minimal maintenance.  

The impact of industrial automation has been felt in the US workforce for over half a century. In 1961, UNIMATE, the first industrial robot, joined the General Motors assembly line to automate dangerous processes in die-casting. Although mechanized manufacturing had been around for over 200 years, the integration of UNIMATE marked the first time a robot, a programmable arm, shared the same workspace as a person. Rather than augmenting or streamlining human labor, it replaced us. UNIMATE signaled the larger cultural implications for the future of robotics: robotics would no longer be a subject of science fiction or laboratory research; it would be an everyday reality of the workplace.

George Devol's UNIMATE: early generation arm in 1961 (left) and the robotic workforce on GM's assembly line in 1972 (right).

The past decade has seen dramatic job displacement in the global manufacturing workforce. Industrialized nations are adding industrial robots to their workforces at much higher rates than human workers (see bottom-left figure). Moreover, experts anticipate that these job displacement trends will spread to workforces outside of manufacturing. A 2014 Pew Research Center report – AI, Robotics, and the Future of Jobs – surveyed nearly 2,000 experts on the near-future impact of robotic advances and AI on the global workforce. The three overarching concerns from respondents were:

  1. Impacts from automation have thus far impacted mostly blue-collar employment; the coming wave of innovation threatens to upend white-collar work as well.
  2. Certain highly-skilled workers will succeed wildly in this new environment—but far more may be displaced into lower paying service industry jobs at best, or permanent unemployment at worst.
  3. Our educational system is not adequately preparing us for work of the future, and our political and economic institutions are poorly equipped to handle these hard choices.

CHANGING THE STATUS QUO

While the momentum pushing us towards an automated workforce seems daunting, now is the time to reframe the existing relationships between people and industrial robots. The current model within industrial manufacturing places IRs as adversaries ("they are coming for our jobs," as described by Wired, Scientific American, MIT Technology Review, and countless other technology magazines). However, adapting IRs for uses outside of manufacturing domains can change industrial robots from adversaries into collaborators. Instead of developing ways for IRs to replace human labor, a collaborative model can create ways for IRs to augment, amplify, and extend human capabilities and creativity.

Existing creative applications of industrial robots tend to fall into three categories: fabrication, hybrid art, and interaction & telepresence. Below are brief descriptions of each, and a few notable example projects.

FABRICATION

Industrial robots are used to revive high-skill crafts that have been supplanted by industrial manufacturing and mass production processes: for example, carpentry, masonry, or plastering.

HYBRID ART  

Industrial robots are used in combination with traditional artistic practices: for example, drawing, sculpting, painting, or performance.

INTERACTION & TELEPRESENCE

Industrial robots are used to intervene in social settings or as a physical proxy for remote users.

Multi-Material 3D Printing
Notes and Lessons Learned

I'm very fortunate to have access to a multi-material 3D printer for the coming months. Thus far, I've only worked with powder- and plastic-based 3D printers, so I'm eager to find the advantages and limitations of resin-based printers. For the next few months I'll be working with an Objet260 Connex, a polyjet printer that allows you to mix two materials to create up to 14 different material properties.

As my first test of the machine and workflow, I decided to try to recreate a small material sample from Neri Oxman and Iris van Herpen's Anthozoa Cape & Skirt. The original was printed on a similar machine, although with different materials than what I have available.

I created a simplified parametric model of a polyp colony, using Tango Black (rubber) for the base and tops, and Vero Clear (rigid plastic) for the shells.

If you have Rhino5 and Grasshopper, you can download this .zip file to play around with the script.  Adding/moving points will change the shape of your colony.

A few things I learned from the printing process:

  1. Although you can mix the two materials to create 14 material properties, there's no default way to actually blend them. So if you want a gradient from soft to hard, or light to dark, you need to plan your digital geometry to get the desired material effect (see the sketch after this list).
  2. Objets use a ton of support material. It's very difficult to remove from complex, porous, or hollow geometry, and you risk damaging any delicate detail in the process.
  3. Provide decent surface area when changing materials. As you can see in the images below, the smaller polyps lost their rubber tops during cleaning, while the larger ones survived. All the polyps have a strong connection to the rubber base.
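
On planning geometry for a gradient: since the printer exposes a fixed palette of digital materials rather than a continuous blend, one workaround is to slice the model into bands and assign each band the nearest material step. The sketch below is a hypothetical illustration of that banding math; the height range and the soft-to-hard mapping are assumptions, not settings from my print.

```java
// Approximating a soft-to-hard gradient with discrete material bands.
// 14 digital materials are available; we divide the gradient height into
// 14 bands and assign each band a mix value t in [0, 1].
public class GradientBands {
    public static void main(String[] args) {
        int steps = 14;              // digital materials available on the Connex
        double zMin = 0, zMax = 30;  // height range of the gradient, in mm (assumed)
        for (int i = 0; i < steps; i++) {
            double lo = zMin + (zMax - zMin) * i / steps;
            double hi = zMin + (zMax - zMin) * (i + 1) / steps;
            // t runs from 0 (pure TangoBlack, soft) to 1 (pure VeroClear, rigid)
            double t = (i + 0.5) / steps;
            System.out.printf("band %2d: z in [%5.2f, %5.2f) mm -> material mix t = %.2f%n",
                              i, lo, hi, t);
        }
    }
}
```

In practice this means modeling the gradient region as 14 separate solids, one per band, and assigning each its own material in the print software.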

project 41 Invited critic @ FIU SOA

MADLAB will be traveling to Miami, FL this July as an invited critic for project 41: a tale of cities … that share one street. Presentations will review the nomadic design studio's investigations into novel techniques in urban documentation and speculative mobile infrastructure. Project 41 is a coordinated studio between the Department of Urban Speculation at the University of Illinois Chicago, Andrew SantaLucia, and Malik Benjamin for Florida International University's 4th Year Accelerated Masters Architecture Studio.


interactive projection mapping KINECT HACK

Project by Marynel Vázquez and Madeline Gannon | kinecting the virtual to the physical world. For more, see http://www.madlab.cc/?p=741. Produced at Carnegie Mellon University for Golan Levin's Interactive Art and Computational Design course; see more Kinect projects from the course at http://golancourses.net/2011spring/projects/project-3-interaction/

This speed project with Marynel Vázquez was inspired by the work of a talented community of artists and designers who are using video mapping as a medium to reinterpret and transform banal, expected environments. The work of Pablo Valbuena was a strong influence on our explorations, and we sought to introduce dynamic interactivity to augmented sculpture as our novel addition to this community.

With the relative novelty of the Kinect, our response to its inherent screen-based interaction was to pull it back out into the physical realm. Developed in C++ with openFrameworks and OpenNI, we use the depth-mapping capabilities of the Kinect to find the participant's hand and position it as the light source of the physical model. In effect, their hand becomes the sun, lighting or dimming our abstracted cityscape and blurring the border between virtual and actual.
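
To make the lighting idea concrete: once the depth camera yields a hand position, each surface of the projected model can be shaded with simple Lambertian (diffuse) lighting, with the hand as the light. The sketch below illustrates only that shading step; the hand coordinates are a stand-in for what OpenNI hand tracking would supply, and our actual implementation is in C++/openFrameworks.

```java
// Hand-as-light shading sketch: brightness at a surface point is the
// Lambertian term max(0, n . L), where L points from the surface to the hand.
public class HandLight {

    // Diffuse intensity at point p with unit normal n, lit from lightPos.
    static double shade(double[] p, double[] n, double[] lightPos) {
        double dx = lightPos[0] - p[0];
        double dy = lightPos[1] - p[1];
        double dz = lightPos[2] - p[2];
        double len = Math.sqrt(dx * dx + dy * dy + dz * dz);
        double dot = (n[0] * dx + n[1] * dy + n[2] * dz) / len;
        return Math.max(0, dot);
    }

    public static void main(String[] args) {
        double[] hand = {0.3, 0.8, 0.5}; // tracked hand position (stand-in for OpenNI output)
        double[] roof = {0.0, 0.2, 0.0}; // a rooftop point on the model
        double[] up   = {0.0, 1.0, 0.0}; // its surface normal
        System.out.println("brightness = " + shade(roof, up, hand));
        // Moving the hand overhead brightens the roof; moving it low dims it,
        // just like the sun over the cityscape.
    }
}
```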

This project has been featured on kinecthacks.com, kinecthacks.net, and kinect-hacks.com.