Greetings!

If any of the info on this website was useful for your projects or made your head spin with creative ideas, and you would like to share a token of your thanks, a donation would be massively appreciated!

Your donation will go to my Robotics Fund, straight towards more sensors, actuators, books, and journeys. It takes a lot to keep building these robots, and I’m really thankful for everyone who helps encourage me along the way.



“A wise robot once said to me, through Serial.println, that robots teach us about ourselves.”

All posts in Robot

Maker Faire Ottawa

Screen Shot 2014-08-15 at 10.24.29 AM

We showed our robots at the amazing Maker Faire Ottawa! Many people were interested in the robots and became inspired to try building their own.

If you did not have the chance to make it to Ottawa, here is a video of our table!

Photos of our table:

ottawa_maker_faire_11

ottawa_maker_faire_12

Here are the statistics from what I’ve heard: there were about 4,000 people on the first day and 3,000 on the second. They beat the total attendance from 2010 in the first hour. Maker Faire Ottawa is the fastest-growing Mini Maker Faire. And if Commander Hadfield had been in Ottawa, he would have visited too! Maybe next year ;D

We were also interviewed on CBC – read more here.

We had two RoboBrrds there: the new yellow one, Coolios, and the black one, Spikey. Coolios works with a sonar sensor, though it was a little buggy, which made it come across as a very hyper robot. Spikey works with RoboBrrd Dance, the iPad app we developed.

ottawa_maker_faire_23

Kids always enjoy interacting with RoboBrrd!

ottawa_maker_faire_18

(^ Thx to whoever took this photo, great shot!)

ottawa_maker_faire_31

(^ Photo cred @edgarmtoro – Thx!)

At one point, kids were lined up to use the RoboBrrd Dance app. How cool is that?!

We added some new RoboBrrd Kits to the store. Check them out if you want to get started building!

We also had the Automatic Food Slicer Robot in action, slicing some playdoh.

ottawa_maker_faire_17

Pretty much as expected, older adults were interested in this robot. I hope I can make it more stable in the future, so that they can buy or make one and it will actually help them. That would be cool.

We recently finished off a portion of this project for entry into the Hackaday Space Prize. We’ll be blogging about this later, but for now you can view all the information here.

ottawa_maker_faire_24

Kids also enjoyed interacting with AFSR, mostly because we were using the cool Hover gesture board. It takes a little time to figure out how far and how fast to wave your hand for the gestures, but once they get it, they can control the robot very easily.

One of the best ideas we heard: We could use this robot to slice the crusts off bread! :D

The badge for Maker Faire Ottawa was absolutely stunning:

ottawa_maker_faire_30

Here are some nice tweets from makers!




We were also displaying Clyde the robot lamp! Some of the backers of Fabule’s Kickstarter recently received their lamp too. Stay tuned for more info about what we are going to be making with our Clyde- it will be exciting!

A few weeks prior to the Maker Faire, we received a huge box from Intel. With our Intel Galileo 2, we will be making Cognito Collobot. The goal is to make a robot that can give a TED talk. This is for the new TED XPrize. It will be challenging, but we are going to try.

Our display board also had two panels detailing my Moonshot-In-Progress project, for the people who were really interested in it. In terms of Google Solve For X, here are the three main points:

Huge Problem: With the rise of the aging population, there will be more need for assistance in their homes. The physical objects that surround them will become problematic as motor ability decreases.

Breakthrough Technology: (Work in progress) “The LED of motors”: something that can be soldered to a PCB and, when given power, can actuate. Different patterns could perform various actuations. There could be an abundance of actuators!

Radical Solution: When motors are as available as LEDs, we could add them to everything. With software, we could manipulate all the objects around us like they are fluid — even have the objects able to sense and automatically move based on previous patterns.

Everything around us would no longer be inanimate physical objects, but instead ones that are alive and can adapt to our needs and the environment.

Right now I have two main ideas on how this could possibly work. I still have some more reading and learning to do, but I will be working on this. Watching the Solve For X videos has been very inspiring.

Has no one else on this planet been bugged by the fact that we can’t just tell things to move? It takes a very long time to add motors to everything. We should just have motor tape, or something similarly accessible.

We still have to work out the idea more, but this is a crazy goal that we will chase and strive to achieve some day. ;)

We also displayed the parts from the Laser Level Teardown. A couple of people were interested in this.

Maybe in four years from now, we will be sponsors for Maker Faire Ottawa. This sounds like a great goal. :)

If you are looking to support my work in some way, back my fan-funding campaign on Patreon and check out the RoboBrrd store!

This was a great Maker Faire. Thanks to everyone for making it a huge success. Special thanks to Britta, Remco, Olivier, Amos and Naomi! Without your help I would not have been able to show the robots here, so thanks. :)

ottawa_maker_faire_21

Tentacle Mechanism (work in progress 3)

tentacle_mech_more_1

The tentacle mechanism has come a long way since the last update. More LEDs, a better behaviour, and a mount for the electronics and wall. Plus some fun long exposure photos ;)

Here is a video clip where you can see it in action:

Also, the tentacle mechanism appeared on the Adafruit Show and Tell!

Check it out here at the 2:55-7:06 mark!

The behaviour for the tentacle mechanism has been difficult to figure out.

Our first attempt was to use the outputs to communicate how many times the button should be pressed. At first it was fun, but then it just became confusing. Sort of like: ‘Why is this thing doing this thing, and what can I do to change it?’. You can watch a video of this here.

The next attempt was to use the ultrasonic sensor and have a different action for each distance threshold. There are also ‘mini-actions’ that occur on the changes between these distances. So when you are interacting with it, the ‘dances’ the tentacle does will be similar, but the introduction to each dance, the LEDs blinking, will be different.

But the code does more than control the robot. There are ‘debug’ statements where the robot is saying things, which give some context as to what the robot thinks is happening.

Screen Shot 2013-08-02 at 4.05.02 PM

So as you can see, this robot has some sort of creepy obsession with distance.

And it gets even more interesting when the human goes away:

Screen Shot 2013-08-02 at 4.05.24 PM

As for actually displaying the text to the humans, it might be nice to have a tiny OLED display some distance away from the robot that only lights up after some amount of interaction. This way the humans will pay more attention to the tentacle moving at first, then notice the display and keep interacting.

What are all these ‘be’ functions about in the code? Those are the ‘mini-actions’ mentioned above. They just blink the LEDs in certain patterns and such. In a future robot, this will be more involved with the social drives/mind/emotions.

Taking long exposures of the tentacle moving has been quite fun. Here are some of my favourites:

tentacle_mech_more_2

tentacle_mech_more_4

tentacle_mech_more_6

We’re working on documenting it; there were a lot of lessons learned while building this! ;)

Buddy 4000 + BLE App (work in progress)

We’ve long been awaiting the day when communicating with our robots from an iOS device would involve less jumping through hoops! BLE on the newer iOS devices is pretty sweet.

Here is a video demo of our app interacting with Buddy 4000 using BLE!

We’ve been working on this off and on for a few months. Actually, half of the core functionality (sending data from the iOS device to the robot) was finished a while ago. The BLE module we were using wasn’t configured properly to send data from the robot to the iOS device, and we didn’t have the TI CC debugger needed to re-program the BlueGiga chip, so yeah… When we heard that @sectorfej had new modules in his InMojo store, we quickly bought one!

Here is the BLE module (wires are 5V, GND, TX, RX):

ble_buddy_wip 004

There’s a great BGLib library for this module that has all sorts of features packed into it. There wasn’t much documentation about sending data… here’s how to do it:

    ble112.ble_cmd_attributes_write(20, 0, data_len_var, data_var);

The number 20 is the hard part. We couldn’t figure out where to find the info about this number… so we iterated from 0-49 to find it! You might have to do the same for yours. Just keep an eye on Xcode for when data is received on the app side, and narrow down the numbers until you find the one that works.

We didn’t show this in the video, but we use sending data from the robot to the app for triggering sounds. Specifically owl and fart sounds. (Yes, this might be the most complex fart app to date). It works better with RoboBrrd, as it has sensors that can be used to trigger the sounds.

Anyway, most of the app’s ‘core functionality’ involves this (below) and the communication. We can even make it auto-connect to a particular device (as you saw in the video), which makes the experience even more seamless.

ble_buddy_wip 003

Here is how auto-connection is done. We save the UUID and name of the device, then check if we see that specific one:

    - (void)connectToDefault {
        // ok, let's try this
        if (connected) return; // already connected, don't do anything

        NSUserDefaults *userDefaults = [NSUserDefaults standardUserDefaults];
        NSString *givenUUID = [userDefaults objectForKey:defaultDeviceUUIDKey];
        NSString *givenName = [userDefaults objectForKey:defaultDeviceNameKey];
        if (givenUUID == nil || givenName == nil) return; // nothing saved yet

        CFUUIDRef zeeUUID = CFUUIDCreateFromString(kCFAllocatorDefault, (CFStringRef)givenUUID);

        for (CBPeripheral *periph in allPeripherals) {
            // compare UUID contents with CFEqual -- the == operator would only
            // compare the CFUUIDRef pointers, which are not guaranteed to match
            if (periph.UUID != NULL && CFEqual(periph.UUID, zeeUUID)) {
                NSLog(@"same UUID");
                if ([periph.name isEqualToString:givenName]) {
                    NSLog(@"same name - let's try to connect");
                    self.peripheral = periph;
                    [bleManager retrievePeripherals:[NSArray arrayWithObject:self.peripheral]];
                }
            }
        }

        CFRelease(zeeUUID); // CFUUIDCreateFromString follows the Create rule
    }

Sometimes specific BLE modules have certain service UUIDs and characteristic UUIDs that you can only send data to. We’ve never experienced a problem with ‘spamming’ everything (yet), but we built in this feature just in case. This is when data is being sent from the app to the robot.

    CBUUID *uuidService;
    CBUUID *uuidChar;

    int ss = [selectedShield intValue];

    switch (ss) {
        case 0: // any
            uuidService = [CBUUID UUIDWithString:roboBrrdServiceUUID];
            uuidChar = [CBUUID UUIDWithString:roboBrrdCharacteristicTXUUID];
            break;
        case 1: // kst
            uuidService = [CBUUID UUIDWithString:kstServiceUUID];
            uuidChar = [CBUUID UUIDWithString:kstCharacteristicTXUUID];
            break;
        case 2: // dr kroll
            uuidService = [CBUUID UUIDWithString:drkrollServiceUUID];
            uuidChar = [CBUUID UUIDWithString:drkrollCharacteristicTXUUID];
            break;
        case 3: // redbear
            uuidService = [CBUUID UUIDWithString:redbearServiceUUID];
            uuidChar = [CBUUID UUIDWithString:redbearCharacteristicTXUUID];
            break;
        default:
            break;
    }

    for (CBService *aService in self.peripheral.services) {
        if ([aService.UUID isEqual:uuidService] || ss == 0) {
            for (CBCharacteristic *aCharacteristic in aService.characteristics) {
                if ([aCharacteristic.UUID isEqual:uuidChar] || ss == 0) {
                    [self.peripheral writeValue:sendData forCharacteristic:aCharacteristic type:CBCharacteristicWriteWithResponse];
                }
            }
        }
    }
This also allows us to do certain actions for different shields. While going through the example code for the RedBear BLE shield, we noticed it needed a ‘reset’ (or something). We haven’t tested this yet, but hopefully it will make the RedBear one work:

    if ([selectedShield intValue] == 3) { // redbear shield is weird

        CBUUID *uuidService = [CBUUID UUIDWithString:redbearServiceUUID];
        CBUUID *uuidChar = [CBUUID UUIDWithString:redbearResetRXUUID];
        unsigned char bytes[] = {0x01};
        NSData *d = [[NSData alloc] initWithBytes:bytes length:1];

        for (CBService *aService in self.peripheral.services) {
            if ([aService.UUID isEqual:uuidService]) {
                for (CBCharacteristic *aCharacteristic in aService.characteristics) {
                    if ([aCharacteristic.UUID isEqual:uuidChar]) {
                        [self.peripheral writeValue:d forCharacteristic:aCharacteristic type:CBCharacteristicWriteWithResponse];
                    }
                }
            }
        }
    }

The above is called whenever data is received by the app, e.g. in:

    peripheral:didUpdateValueForCharacteristic:error:

When it is all working, it’s really fun to interact with the robot in this way!

ble_buddy_wip 002

There are still some wonky things that happen. For example, if you send too much data, or send it all at the same time, things can break. Sometimes we don’t even know what we did and it will just disconnect (though thanks to our code, it re-connects quickly and without interrupting the user). This happens infrequently, so it might just be odd edge cases.

Special thanks to @macisv, who at SecondConf last year taught me lots about BLE and let me experiment with it! And of course @sectorfej for making this great module that we used :)

Now for the hard part: completing and releasing it. It’s kind of weird; even though we use this quite often, we have kind of come to dislike this interaction (pressing and holding buttons) because it’s quite boring. So we’re not sure yet if this one will be finished, or if we’ll try something else, perhaps with more gestures and such.

ble_buddy_wip 001

More fun coding ahead! :D