If any of the info on this website was useful for your projects or made your head spin with creative ideas, and you would like to share a token of your appreciation, a donation would be massively appreciated!

Your donation will go to my Robotics Fund, straight towards more sensors, actuators, books, and journeys. It takes a lot to continue building these robots, and I’m really thankful to everyone who helps encourage me along the way.



“A wise robot once said to me through Serial.println that robots teach us about ourselves.”

All posts in Robot

Robot Base Prototyping

This is the first part of the initial prototype of my final project for Fab Academy. It’s a continuation of my Solve for X moonshot robot idea.

Fab Academy is a global distributed class for learning how to make (almost) anything at Fab Labs around the world. I’m participating remotely; my remote guru is Shawn Wallace from AS220 in Providence, Rhode Island. He actually gave RoboBrrd a Maker Faire Editor’s Choice ribbon in the past! Cool! I will be visiting EchoFab in Montreal, Quebec to do some of the lab work too.

Check out my Fab Academy page for this week: Computer Aided Design. So far it is quite a neat experience even as a remote student. I’m psyched for FA all the time, what a cool opportunity! It will be what I make of it, so I’m keeping my eyes open to try new things and challenge myself.

As I learn more each week at Fab Academy, I’ll be able to improve on the final project design for next time. The reason why I’m starting now is to hopefully have a simple demo done for Maker Faire Bay Area or even RoboGames.

Alright, finally, let’s bring on the images and captions from the prototyping process!

Working on creating a base for connecting multiple robots together. There are a few requirements that I had in mind: mainly, it has to be able to fold up, and it has to be lightweight.

Here’s a cross section of the base. It uses internal 3D-printed hinges.


The hinge design is parametric. We actually got the dimensions right for 3D printing on the first try:
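For reference, the kind of parameter math behind a print-in-place hinge looks something like this. This is a minimal C++ sketch with made-up parameter names and a generic FDM clearance, not the actual CAD file from the project:

```cpp
// Hypothetical parametric hinge: given a pin diameter and a printer
// clearance, derive the other knuckle dimensions. The 0.4 mm clearance
// used in the example below is a common FDM starting point, not the
// value from the real design.
struct HingeParams {
    double pinDiameter;   // mm, the hinge pin
    double clearance;     // mm, gap between pin and knuckle bore
    double wallThickness; // mm, material around the bore
};

// The bore must clear the pin on both sides.
double knuckleBore(const HingeParams& h) {
    return h.pinDiameter + 2.0 * h.clearance;
}

// Outer diameter adds a wall on both sides of the bore.
double knuckleOuterDiameter(const HingeParams& h) {
    return knuckleBore(h) + 2.0 * h.wallThickness;
}
```

For example, a 3.0 mm pin with 0.4 mm clearance and 1.2 mm walls gives a 3.8 mm bore and a 6.2 mm knuckle. The nice thing about keeping it parametric is that a failed test print only costs one changed number, not a remodel.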



Added littleBits to the ends as the prototype module connectors. Here’s what it looks like folded up:


Assembling it! (Cool sounds in this Vine!)

Here’s what it looks like all assembled with links holding it in place:



Connecting it in different ways:


This was a neat surprise: the modules do not have to be flat along a surface. The connectors are on a piece that swivels, so here you can see they are at 90 degrees to each other:


What if one of the links was bent? That might be cool. I tried to print this:


Had a filament jam:


Finally it worked (had to print with supports, lame), but conceptually I messed up on the part. Derp derp.


That’s all for now. The next step I’m working on is the first robot to be placed on this base, followed by creating a spec for the controller board and interface board. (So that I can order the components early and they will hopefully arrive on time!)

Maker Faire Ottawa


We showed our robots at the amazing Maker Faire Ottawa! There were many people who were interested in the robots, and became inspired to try building their own.

If you did not have the chance to make it to Ottawa, here is a video of our table!

Photos of our table:



Here are the statistics from what I’ve heard: there were about 4,000 people on the first day, and 3,000 on the second. They beat the total attendance from 2010 in the first hour. Maker Faire Ottawa is the fastest growing Mini Maker Faire. And if Commander Hadfield had been in Ottawa, he would have visited too! Maybe next year ;D

We were also interviewed on CBC – read more here.

We had two RoboBrrds there: this new yellow one, Coolios, and the black one, Spikey. Coolios runs off a sonar sensor, though it was a little buggy, which made it come across as a very hyper robot. Spikey works with the iPad app we developed, RoboBrrd Dance.


Kids always enjoy interacting with RoboBrrd!


(^ Thx to whoever took this photo, great shot!)


(^ Photo cred @edgarmtoro – Thx!)

At one point in time, kids were lined up to use the RoboBrrd Dance app. How cool is that?!

We added some new RoboBrrd Kits to the store. Check them out if you want to get started building!

We also had the Automatic Food Slicer Robot in action, slicing some playdoh.


Pretty much as expected, older adults were interested in this robot. I hope I can make it more stable in the future, so that they can buy or make one and it will help them. That would be cool.

We recently finished off a portion of this project for entrance into the Hackaday Space Prize. We’ll be blogging about this later, but for now you can view all the information here.


Kids also enjoyed interacting with AFSR. This is mostly because we were using the cool Hover gesture board. It takes a little time to figure out how far and fast to wave your hand for the gestures, but once people get it, they can control the robot very easily.
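Roughly, the gesture handling boils down to mapping each swipe event to a slicer command. Here’s a hedged C++ sketch of that idea; the real Hover board reports its own event codes over I2C, and the gesture and command names below are placeholders, not the robot’s actual firmware:

```cpp
#include <string>

// Placeholder gesture codes; the real Hover board has its own event bytes.
enum class Gesture { SwipeLeft, SwipeRight, SwipeUp, SwipeDown, None };

// Map a recognized gesture to a slicer command string.
// The mapping here is purely illustrative.
std::string commandFor(Gesture g) {
    switch (g) {
        case Gesture::SwipeLeft:  return "carriage_left";
        case Gesture::SwipeRight: return "carriage_right";
        case Gesture::SwipeDown:  return "blade_down";
        case Gesture::SwipeUp:    return "blade_up";
        default:                  return "stop"; // unknown gesture: do nothing risky
    }
}
```

Keeping the gesture-to-command mapping in one small function like this makes it easy to retune when you watch people struggle with a particular swipe at the table.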

One of the best ideas we heard: We could use this robot to slice the crusts off bread! :D

The badge for Maker Faire Ottawa was absolutely stunning:


Here are some nice tweets from makers!

We were also displaying Clyde the robot lamp! Some of the backers of Fabule’s Kickstarter recently received their lamp too. Stay tuned for more info about what we are going to be making with our Clyde- it will be exciting!

A few weeks prior to the Maker Faire, we received a huge box from Intel. With our Intel Galileo 2, we will be making Cognito Collobot. The goal is to make a robot that can give a TED talk. This is for the new TED XPrize. It will be challenging, but we are going to try.

Our display board also had two panels detailing my Moonshot-In-Progress project, for the people who were really interested in it. In terms of Google Solve For X, here are the three main points:

Huge Problem: As the population ages, there will be more need for assistance in people’s homes. The physical objects that surround them will become problematic as motor ability decreases.

Breakthrough Technology: (Work in progress) “The LED of motors” — something that can be soldered to a PCB and, when given power, can actuate. Different patterns could perform various actuations. There could be an abundance of actuators!

Radical Solution: When motors are as available as LEDs, we could add them to everything. With software, we could manipulate all the objects around us like they are fluid — even have the objects able to sense and automatically move based on previous patterns.

Everything around us would no longer be inanimate physical objects, but instead ones that are alive and can adapt to our needs and the environment.

Right now I have two main ideas on how this might work. I still have some more reading and learning to do, but I will be working on this. Watching the Solve For X videos has been very inspiring.

Has no one else on this planet been bugged by the fact that we can’t just tell things to move? It takes a very long time to add motors to everything. We should just have motor tape — or something similarly accessible.

We still have to work out the idea more, but this is a crazy goal that we will chase and strive to achieve some day. ;)

We also displayed the parts from the Laser Level Teardown. A couple of people were interested in this.

Four years from now, maybe we will be sponsors for Maker Faire Ottawa. That sounds like a great goal. :)

If you are looking to support my work in some way, back my fan-funding campaign on Patreon and check out the RoboBrrd store!

This was a great Maker Faire. Thanks to everyone for making it a huge success. Special thanks to Britta, Remco, Olivier, Amos and Naomi! Without your help I would not have been able to show the robots here, so thanks. :)


Tentacle Mechanism (work in progress 3)


The tentacle mechanism has come a long way since the last update. More LEDs, a better behaviour, and a mount for the electronics and wall. Plus some fun long exposure photos ;)

Here is a video clip where you can see it in action:

Also, the tentacle mechanism appeared on the Adafruit Show and Tell.

Check it out here at the 2:55-7:06 mark!

The behaviour for the tentacle mechanism has been difficult to figure out.

Our first attempt was to use the outputs to communicate how many times the button should be pressed. At first it was fun, but then it just became confusing. It was sort of like: ‘Why is this thing doing this thing, and what can I do to change it?’ You can watch a video of this here.

The next attempt was to use the ultrasonic sensor and have a different action for each distance threshold. There are also ‘mini-actions’ that occur on the changes between these distances. So when you are interacting with it, the ‘dances’ that the tentacle does will be similar, but the introduction to that dance (the LEDs blinking) will be different.
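In code, the threshold idea amounts to bucketing the sensor reading, with a mini-action firing whenever the reading crosses from one bucket to another. This is a minimal C++ sketch of that structure; the threshold values and action names are invented for illustration and are not the tentacle’s actual firmware:

```cpp
// Bucket an ultrasonic distance reading (in cm) into an action.
// Thresholds and names here are illustrative only.
enum class Dance { Close, Mid, Far, Idle };

Dance actionForDistance(int cm) {
    if (cm < 20)  return Dance::Close; // human right up at the robot
    if (cm < 60)  return Dance::Mid;   // human within interaction range
    if (cm < 150) return Dance::Far;   // human visible but distant
    return Dance::Idle;                // nobody around
}

// A 'mini-action' (e.g. the intro LED blink before a dance) fires
// whenever the reading crosses between buckets.
bool miniActionTriggered(int prevCm, int currCm) {
    return actionForDistance(prevCm) != actionForDistance(currCm);
}
```

On the robot, the main loop would read the sensor, call something like `actionForDistance`, and only kick off the intro blink when the bucket actually changes, so small jitter in the reading doesn’t retrigger the dance.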

But the code does more than control the robot. There are ‘debug’ statements where the robot says things, which give some context as to what the robot thinks is happening.

(Screenshot of the robot’s debug output)

So as you can see, this robot has some sort of creepy obsession with distance.

And it gets even more interesting when the human goes away:

(Screenshot of the debug output after the human leaves)

As for actually displaying the text to the humans, it might be nice to have a tiny OLED display some distance away from the robot that only lights up after a certain amount of interaction. This way the humans will pay attention to the tentacle moving at first, then notice the display and keep interacting.

What are all these ‘be’ functions in the code about? Those are the ‘mini-actions’ mentioned above. They just blink the LEDs in certain patterns and such. In a future robot, this will be tied in more with the social drives/mind/emotions.
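As a sketch of what a ‘be’ mini-action might look like with the Arduino I/O stripped out: instead of calling digitalWrite directly, you can think of it as generating frames of on/off states for the LED strip. The pattern below is made up for illustration, not one of the robot’s actual patterns:

```cpp
#include <vector>

// Generate an alternating on/off blink pattern for a strip of LEDs.
// Each frame holds one bool per LED; the phase flips every frame,
// so the on/off LEDs swap places as the animation advances.
std::vector<std::vector<bool>> beAlternate(int numLeds, int numFrames) {
    std::vector<std::vector<bool>> frames;
    for (int f = 0; f < numFrames; ++f) {
        std::vector<bool> frame(numLeds);
        for (int i = 0; i < numLeds; ++i) {
            frame[i] = ((i + f) % 2 == 0); // even positions on, shifting each frame
        }
        frames.push_back(frame);
    }
    return frames;
}
```

On the microcontroller, each frame would be written out to the LED pins with a short delay between frames; separating “what the pattern is” from “how it gets written to pins” makes it easier to swap patterns in and out as the behaviour evolves.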

Taking long exposures of the tentacle moving has been quite fun. Here are some of my favourites:




I’m working on documenting it; there were a lot of lessons learnt while building this! ;)