RCed Mindstorm: CrazyShitCorner ETECTURE #DAD 2013

Little transformers zipping around the corridors and making Karlsruhe "unsafe"… does that sound like a new Hollywood blockbuster? Not necessarily: visitors to the ETECTURE #DAD2013 also ran into them in the CrazyShitCorner!

The idea behind it

The intention was to use the Lego Mindstorms robot as a telepresence device at #DAD2013: the robot would "stand in" for its operators on site while they controlled it interactively from a remote location. The goal was for the robot to move around the entire building without any direct line of sight. The operator then had to interact and communicate with the people around the robot, since he could not only see and speak through it but could also be seen and heard himself.

What actually happened

A data connection had to be set up between the robot and the operator. This connection had to carry video and audio over a large distance, as well as the remote-control commands. For the robot to interact with other people, video and audio also had to flow in both directions between the robot and the operator.

First, the robot was assembled in the usual way. To provide the video and audio link, the team decided to mount a smartphone on the robot; the smartphone then used Skype to exchange images and sound with the operator's notebook. Through the smartphone the operator could be seen and heard and could thus communicate with people on site. To fasten the smartphone to the robot, a bracket developed in-house was attached, holding the phone at the front of the robot with gripping arms.

The remote-control data was transmitted via WLAN, and the operator steered the robot using the notebook's keyboard.

However, there is no built-in way to send input from the browser directly to the robot, so an intermediate layer was needed. It handled the communication between the browser and the robot and forwarded the keyboard input from the notebook to the robot over WLAN. For this purpose, a C# program with the following functionality was created (a rough sketch follows the list):

  • Interface for connection with the robot
  • WebSocket server for communication with the browser
  • Logic to handle the incoming WebSocket requests appropriately
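
The post doesn't include the program itself, so the following is only a minimal sketch of how those three parts could be wired together. It assumes Fleck, a small open-source C# WebSocket server library (the post doesn't say which library was actually used), and reduces the robot connection to a hypothetical NxtLink placeholder; the port number is likewise just illustrative.

```csharp
using System;
using Fleck;

// Placeholder for the link to the NXT brick. The real program would open a
// Bluetooth or USB connection here and translate Drive()/Gripper() calls
// into motor commands; those details are omitted in this sketch.
class NxtLink
{
    public void Drive(int leftPower, int rightPower)
    {
        Console.WriteLine("Drive L={0} R={1}", leftPower, rightPower);
    }

    public void Gripper(bool open)
    {
        Console.WriteLine(open ? "Gripper open" : "Gripper closed");
    }
}

class Program
{
    static void Main()
    {
        var robot = new NxtLink();

        // WebSocket server the browser page connects to over WLAN
        // (port 8181 is arbitrary).
        var server = new WebSocketServer("ws://0.0.0.0:8181");
        server.Start(socket =>
        {
            socket.OnOpen = () => Console.WriteLine("Browser connected");
            socket.OnClose = () => robot.Drive(0, 0);   // stop if the page disconnects
            socket.OnMessage = message => HandleCommand(robot, message);
        });

        Console.WriteLine("Bridge running; press Enter to quit.");
        Console.ReadLine();
    }

    static void HandleCommand(NxtLink robot, string command)
    {
        // Dispatching the individual commands is sketched after the
        // list of control options below.
    }
}
```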

The browser provided the following control options (how they could map onto robot commands is sketched after the list):

  • Movement via arrow keys or buttons
  • Set four speed levels
  • Open/close gripping arms
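
The empty HandleCommand stub from the sketch above could then map these controls onto robot actions roughly as follows. The command strings ("forward", "speed1", "grip-open" and so on) are made up for illustration; the post doesn't document the actual protocol between the browser and the C# program.

```csharp
static int speed = 50;   // currently selected speed level (one of four)

static void HandleCommand(NxtLink robot, string command)
{
    switch (command)
    {
        // Movement via arrow keys or buttons
        case "forward": robot.Drive(speed, speed);   break;
        case "back":    robot.Drive(-speed, -speed); break;
        case "left":    robot.Drive(-speed, speed);  break;
        case "right":   robot.Drive(speed, -speed);  break;
        case "stop":    robot.Drive(0, 0);           break;

        // Four speed levels
        case "speed1": speed = 25;  break;
        case "speed2": speed = 50;  break;
        case "speed3": speed = 75;  break;
        case "speed4": speed = 100; break;

        // Gripping arms
        case "grip-open":  robot.Gripper(true);  break;
        case "grip-close": robot.Gripper(false); break;
    }
}
```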

jQuery, a free JavaScript library, was used to capture these key and button events in the browser.

And so a little Lego friend took to the floor and mingled amongst the people, stopping now and again for a chat, and providing a great deal of fun not just for the team behind the controls.

