Before the summit started, we were sitting at the sculpture robo-busking for votes! At that point, Ian came over and wanted to do an interview! It was an excellent interview, and he uploaded it really quickly at the summit so we could get more votes for the scholarship! Thanks Ian!
I actually did go to some of the talks! Specifically, the ones in the morning before the break. The Arduino Team’s keynote was really really great!
After that, I sort of hung around the cafeteria area showing off Learning Pet! A lot of people said they would vote, which was really great! After the crowd died down, I went into the cafeteria area to watch the stream and maybe work on some ADK stuff.
That was when the creator of ThingSpeak himself caught me and said Hello! ThingSpeak is a really cool Internet of Things website. It’s relatively small and new, which is why I like it compared to the others.
He told me about the location data parameter in the API. I never knew this existed! Then I was wondering how to get the location from Mac OS, and whether there was actually a framework for that. It turned out that there was (Core Location), and it has been around since 10.6! I never knew this! Making it work was really great; the only part that caught us up was checking whether it had actually worked, because the feed's XML file goes from oldest to newest, so the new entry is all the way at the bottom.
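For the curious, here is roughly what an update with location attached looks like. This is a hedged sketch: the endpoint and parameter names (`key`, `field1`, `lat`, `long`) are how I remember the ThingSpeak API documenting them, so double-check against the docs before relying on it.

```java
// Sketch of building a ThingSpeak channel update URL that includes location.
// Assumption: ThingSpeak's update endpoint accepts key, field1, lat, and long
// as query parameters -- verify against the current API docs.
public class ThingSpeakUpdate {
    static String buildUpdateUrl(String apiKey, double field1, double lat, double lon) {
        return "https://api.thingspeak.com/update"
            + "?key=" + apiKey
            + "&field1=" + field1
            + "&lat=" + lat
            + "&long=" + lon;
    }

    public static void main(String[] args) {
        // Hypothetical API key and a made-up sensor reading at Montreal-ish coordinates
        System.out.println(buildUpdateUrl("ABC123", 42.0, 45.5, -73.6));
    }
}
```

You would then fetch this URL with whatever HTTP client you like; the location shows up alongside the data point in the channel feed.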
That was when I spotted David Cuartielles from the Arduino team; I waved, and he joined the table. We were talking about Learning Pet, and it turned out that he was the one who created the Processing ADK Tool! Wow! What a coincidence!
I told him about all of the bugs and asked how I could fix them. He showed me the code for the ADK Tool and walked me through how to build it in Eclipse! Building a tool for Processing is a little different, because you have to tell ant that some things are already pre-compiled, so it doesn't try to rebuild them.
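For flavor, this is roughly the pattern in an ant build file. To be clear, this is a hypothetical sketch and not the actual Processing ADK Tool `build.xml` (the `lib`, `src`, and `bin` names are illustrative): the pre-built jars go on a classpath that `javac` references instead of recompiling.

```xml
<!-- Hypothetical sketch, not the real build script: jars in lib/ are
     shipped pre-compiled, so ant only puts them on the classpath
     rather than trying to build them from source. -->
<path id="precompiled.classpath">
  <fileset dir="lib" includes="*.jar"/>
</path>

<target name="build">
  <javac srcdir="src" destdir="bin"
         classpathref="precompiled.classpath"
         includeantruntime="false"/>
</target>
```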
I played around with the code for a while and sort of got used to the way things work. There are some places where it will be tricky to do exactly what I want.
We also tried to figure out why there are four parameters on the Arduino side and only three on the Android App side. It turns out that the Arduino is the one telling the Android what App it needs, rather than the other way around. This means that, of course, the Arduino side needs the description and website parameters, which I guess makes more sense in retrospect.
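In other words, the accessory announces extra strings beyond what the app's filter matches on. In the stock Android accessory API the filter matches on manufacturer, model, and version, while the description and URI exist so the phone can point the user at the right app when none is installed. A plain-Java toy illustration of that idea (not the actual `android.hardware.usb` API, and the names are illustrative):

```java
// Toy illustration: why the accessory (Arduino) side carries extra strings.
// The app's filter only matches on manufacturer, model, and version; the
// description and website/URI travel from the accessory so the phone can
// tell the user which app to install, and where to get it.
public class AccessoryMatch {
    // accessory = {manufacturer, model, version, description, uri}
    // filter    = {manufacturer, model, version}
    static boolean matches(String[] accessory, String[] filter) {
        return accessory[0].equals(filter[0])
            && accessory[1].equals(filter[1])
            && accessory[2].equals(filter[2]);
    }

    public static void main(String[] args) {
        String[] accessory = {"RoboBrrd", "LearningPet", "1.0",
                              "A mini RoboBrrd for education", "http://example.com"};
        String[] filter = {"RoboBrrd", "LearningPet", "1.0"};
        System.out.println(matches(accessory, filter)); // true
    }
}
```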
I’ll definitely be helping out more with this Processing ADK Tool stuff. The thing that motivates me the most is that when I first got the ADK and Android, I figured that this should be about 10x easier and 50x quicker than making an iOS App. It wasn’t, and many other people feel the same way, but now it is my goal to make it so.
We did listen to some of the talks while we were heads-down hacking and learning the code. They were really good! I didn't manage to get to the breakout sessions, but they were all sort of scattered and I wasn't listening to the directions anyway… playing with the code was more fun.
The Demo session was fun, lots of people loved Learning Pet and also said that they voted for it! However, when they announced the winners, Learning Pet didn’t place in the top three. I really appreciate everyone voting, though. To be honest and somewhat egotistical, I think Learning Pet’s documentation was the best and most complete. No one even came close!
Here is a video by johngineer about Learning Pet! Thanks johngineer!
The one thing that I would improve, though, is to make the organizers a little more friendly towards everyone, and not just primarily towards the sponsors. Yes, it is important to make the sponsors feel good, since without them there wouldn't be an event, but it is also important to make the people at the summit itself feel welcome. For example, at the demo session one of the organizers was talking with sponsors in front of my demo area, goofing around and taking photos, but never bothered to say hello or ask about my project. It was sort of uncool and unmakerly (if that's a word). The way I think of it is… you might as well be friendly to everyone, because we are all in this together!
All in all, the Open Hardware Summit was great for connecting with some of the people I have met online! It also turned out to be a great learning experience for building tools for Processing, and for seeing how the Processing ADK Tool actually compiles with API v10 rather than v7 (it is literally just changing the number, hahaha)!
Introducing Learning Pet, a mini RoboBrrd with a very large theme: education! Learning Pet enriches lessons by creating a physical interface to interact with the virtual world.
We demonstrate a number sorting game, where the student interacts with the robot to blast the virtual UFOs with the lowest value. Correct answers are celebrated with a wing flap, and each level-up with a dance. We use the Accessory Development Kit to interface with mobile devices while away from the computer.
Learning Pet's webpage has all of the detailed information about the hardware, software, and design. There is also a handy checklist at the top so that, at a glance, you can quickly see the important facts.
I have been coding up an App storm the past few weeks!
First off, I got an actual phone that runs the Android OS! The user experience is sort of meh, though I guess that's a personal opinion sort of thing. What they did on the Android OS was separate some functions out to the hardware. I have to admit, it's a bit confusing to go from the mindset that everything will be accessible via the screen to pressing actual buttons. The menu button is great, though.
Making Android Apps is done in Java (yay, my first language!) through Eclipse. It will be fun to do some of the advanced things on this platform, like Augmented Reality. I would love to Augment explosions, so you can blow stuff up with your mind (er, smartphone), without harming anything. Imagine how much more peaceful the world would be with such an App?!
Next, KiloWhatt a la iPad: Split Screen View
The split screen view is really important in iPad Apps. KiloWhatt is, right now, the only energy management App for iPad, so making it Universal is quite important. Making split views is, of course, trickier than it seems. There are some really bad gotchas that can get you. The one that got me was auto-rotation; it was sneaky (you have to have auto-rotation enabled for all of the views).
All that is left in this version is to get all the information appearing, and make some popup view controllers. I missed the deadline for the App submission before Christmas, so I think I will work on this more for a few weeks to improve everything, like adding in the easy-peasy plist sorting capabilities.
Sorting data in Cocoa is breathtaking. Imagine you have an NSArray of NSDictionary…
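To give a sense of the pattern without pasting Objective-C, here is the same idea translated into plain Java as an analogy: a list of dictionaries sorted by one key, the way an NSSortDescriptor sorts an NSArray of NSDictionary. The entries and key names are made up for illustration.

```java
import java.util.*;

// Analogy in plain Java (not Cocoa itself): sort a list of "dictionaries"
// by one key, like sorting an NSArray of NSDictionary with an
// NSSortDescriptor on that key.
public class DictSort {
    // Sort descending by the "kWh" key, roughly equivalent to
    // [NSSortDescriptor sortDescriptorWithKey:@"kWh" ascending:NO]
    static List<Map<String, Object>> sortByKwhDesc(List<Map<String, Object>> rows) {
        rows.sort((a, b) -> Double.compare((Double) b.get("kWh"), (Double) a.get("kWh")));
        return rows;
    }

    public static void main(String[] args) {
        List<Map<String, Object>> rows = new ArrayList<>();
        rows.add(new HashMap<String, Object>(Map.of("name", "Fridge", "kWh", 3.2)));
        rows.add(new HashMap<String, Object>(Map.of("name", "Lamp",   "kWh", 0.1)));
        rows.add(new HashMap<String, Object>(Map.of("name", "Heater", "kWh", 9.5)));

        sortByKwhDesc(rows);
        System.out.println(rows.get(0).get("name")); // Heater
    }
}
```

In Cocoa the descriptor does the key lookup and comparison for you, which is exactly why it feels breathtaking compared to writing the comparator by hand.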
This is using Cocos2d for iPhone. It is amazing. This actually runs at 60fps on an iPhone 4, and rarely drops to 58fps. I was trying to use Core Animation for a lot of things, like a Crazy Stars revamped, but Cocos2d is definitely better to use. They have lots of documentation, examples, and friendly people on IRC who help you out.
What I want to do with this App is add in some Box2d, so that I can fling the particle around and it will be bouncing all over the place! It would also be really neat if I can add some satellites around each of the emitters. I really enjoy the way single pixels show up on a retina display, they are tiny and crisp, so it would be interesting to see those flowing around the emitters.
Special shout out to Ken aka Retro for sending me 20 chips that I can use as the biocore in BEAM robots! Oh, and these hilarious Robot Monkey tissues!
To finish off, here’s a nice photo I took a few weeks ago: