Archive for December, 2011

SLAM IT TO ME BABY and all that Jazz

Posted in Software on December 28, 2011 by asteriondaedalus

Mobile Robot Course
Introduction to Mobile Robotics (engl.) – Autonomous Mobile Systems.  You will need Octave (the MATLAB clone).

Do read also Simultaneous Localisation and Mapping (SLAM): part 1 and part 2.

Handy reading if you are working with monster trucks as the basic robotic vehicle (Ackermann Drive).  Calculating circular paths to meet waypoints.
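By way of a scratch-pad example, here is the arc-to-waypoint geometry in Python (bicycle-model steering; the wheelbase and waypoint numbers are made up for illustration, not from the linked article):

import math

def arc_to_waypoint(wx, wy, wheelbase):
    """Circular arc from the vehicle origin (heading along +x) to waypoint (wx, wy).

    Returns (radius, steering_angle).  Radius is signed: positive = turn left.
    """
    d2 = wx * wx + wy * wy              # squared distance to the waypoint
    if abs(wy) < 1e-9:                  # waypoint dead ahead: drive straight
        return float('inf'), 0.0
    radius = d2 / (2.0 * wy)            # circle tangent to the heading, through the waypoint
    steering = math.atan(wheelbase / radius)   # Ackermann (bicycle model) steer angle
    return radius, steering

# Example: waypoint 2 m ahead, 0.5 m to the left, 26 cm wheelbase (monster-truck scale)
print(arc_to_waypoint(2.0, 0.5, 0.26))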

A more down to earth piece talks about waypoints, dead-reckoning et al.
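For what it is worth, the dead-reckoning update itself is tiny.  A minimal sketch, assuming you already get distance travelled and heading change per interval from the encoders/gyro (the sample values below are invented):

import math

def dead_reckon(pose, distance, dtheta):
    """Update (x, y, theta) given distance travelled and heading change
    over one odometry interval (simple dead reckoning, midpoint heading)."""
    x, y, theta = pose
    x += distance * math.cos(theta + dtheta / 2.0)
    y += distance * math.sin(theta + dtheta / 2.0)
    theta = (theta + dtheta) % (2.0 * math.pi)
    return (x, y, theta)

pose = (0.0, 0.0, 0.0)
for step in [(0.10, 0.0), (0.10, 0.05), (0.10, 0.05)]:   # made-up odometry readings
    pose = dead_reckon(pose, *step)
print(pose)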

Here is an interesting piece that talks about subsumption.  Same chap talks about odometry.
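As a reminder of how little machinery subsumption needs, a toy sketch (the behaviours and sensor keys are mine, not from the linked article):

def avoid(sensors):
    """Highest priority: back off if the bumper is hit."""
    if sensors.get('bump'):
        return ('reverse', 0.5)
    return None

def cruise(sensors):
    """Lowest priority: just drive forward."""
    return ('forward', 1.0)

BEHAVIOURS = [avoid, cruise]   # ordered highest priority first

def arbitrate(sensors):
    for behaviour in BEHAVIOURS:
        command = behaviour(sensors)
        if command is not None:      # a higher layer subsumes the ones below it
            return command

print(arbitrate({'bump': False}))   # -> ('forward', 1.0)
print(arbitrate({'bump': True}))    # -> ('reverse', 0.5)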

Seek and you shall …

Posted in Agent, Arduino, Prototyping, Software on December 4, 2011 by asteriondaedalus

… well, find Seek.

A pivotal paper on avatar behaviors caught my attention (see here).

I started to look at pulling the basics out of the C++ templates and working them into basic behaviors for use on Arduino/Processing, then came across someone who had already “ported” it (see here).

Ideally then, ignoring the graphics and using “real” sensor input (rather than simulated), one should have a basic set of vehicular behaviors (running, say, on the Romeo) with the Jason app, running on the Android phone, selecting behaviors.
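For reference, the seek rule itself is only a few lines.  A minimal Python sketch – the speed/force limits and names are mine, not taken from the C++ templates or the port:

import math

MAX_SPEED = 1.0   # made-up limits for illustration
MAX_FORCE = 0.2

def seek(position, velocity, target):
    """Classic 'seek' steering: desired velocity points at the target at full speed;
    the steering force is the (truncated) difference from the current velocity."""
    dx, dy = target[0] - position[0], target[1] - position[1]
    dist = math.hypot(dx, dy) or 1e-9
    desired = (dx / dist * MAX_SPEED, dy / dist * MAX_SPEED)
    steer = (desired[0] - velocity[0], desired[1] - velocity[1])
    mag = math.hypot(*steer)
    if mag > MAX_FORCE:                  # truncate to the maximum steering force
        steer = (steer[0] / mag * MAX_FORCE, steer[1] / mag * MAX_FORCE)
    return steer

print(seek((0.0, 0.0), (0.0, 0.0), (5.0, 2.0)))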

Work to go but things falling into one’s lap to spur one on what!

DOH!!

Posted in Air, Software, Vision on December 3, 2011 by asteriondaedalus

How did that escape me!

I just remembered.  The AR-Drone has a down-looking camera that will also be a target of omni-directional obstacle avoidance.


There has been some work marrying the ARDrone with OpenCV, so some of the way is already smoothed.  Interestingly, most experiments have used optical flow as an avoidance scheme with the front camera – rather than what is ostensibly a simpler approach with a little hardware hack and gentler software.
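For the record, the flow-balance idea is simple enough to sketch: compare the average flow magnitude on the left and right halves of the front-camera image and steer away from the busier side (closer obstacles loom faster).  A rough OpenCV sketch only – the camera index and the bare left/right comparison are placeholders, not something tested on the drone:

import cv2
import numpy as np

cap = cv2.VideoCapture(0)                     # webcam standing in for the front camera
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag = np.linalg.norm(flow, axis=2)        # per-pixel flow magnitude
    half = mag.shape[1] // 2
    left, right = mag[:, :half].mean(), mag[:, half:].mean()
    print('steer right' if left > right else 'steer left')
    prev_gray = gray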

PS a little hand is at …

Actually!!

Posted in Chase That Dog!!, Prototyping, Software, Vision on December 3, 2011 by asteriondaedalus

I have a Sony Bloggy (one of the first models with the rotating camera) and anyone who knows Bloggy knows it has a panoramic lens attachment.

Did you know that you can do optical avoidance relatively simply with a panoramic view (http://www.pirobot.org/blog/0004/)?
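To make that concrete, unrolling the panoramic “donut” image into a flat strip is a one-function job in OpenCV.  A minimal sketch – the image centre and inner/outer radii would have to be measured for the actual Bloggy lens; the numbers and filenames here are guesses:

import cv2
import numpy as np

def unwarp_panoramic(img, cx, cy, r_inner, r_outer, out_w=720, out_h=120):
    """Unroll the 'donut' from an omni-directional lens into a rectangular strip.
    Centre (cx, cy) and radii must be measured for the real lens (made up here)."""
    thetas = np.linspace(0, 2 * np.pi, out_w, endpoint=False)
    radii = np.linspace(r_inner, r_outer, out_h)
    theta_grid, r_grid = np.meshgrid(thetas, radii)
    map_x = (cx + r_grid * np.cos(theta_grid)).astype(np.float32)
    map_y = (cy + r_grid * np.sin(theta_grid)).astype(np.float32)
    return cv2.remap(img, map_x, map_y, cv2.INTER_LINEAR)

donut = cv2.imread('omni_frame.jpg')                 # hypothetical captured frame
strip = unwarp_panoramic(donut, 320, 240, 40, 230)   # guessed centre and radii
cv2.imwrite('panorama_strip.jpg', strip)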

So, I will need to see what needs to be done to use the Bloggy, or the omni-directional lens from the Bloggy with another camera.  Since the lens is designed to work with the Bloggy camera I am guessing it is tuned to that distance, which may require some machining of the lens to fit it to, say, the Android phone.

I will use the PC as the remote smarts for the moment, as the proof of concept will be using RoboRealm Vision for Machines (http://www.roborealm.com).

First good thing.  If you do your research, like any good little integrator, you’ll discover some quite expensive omni-directional setups (>$1K).  So, the Bloggy lens (at around $100) is a bonus.

So if you can use the Bloggy itself, streaming to your BeagleBoard running Linux, that would be one option.  It would take some work, what with having to find/write drivers etc.

The second option is to modify the lens base to work with the Android phone camera and use IP WebCam to pump images up to a client (Android or PC) for processing.  (This will be the approach for the prototyping.)
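On the PC side, grabbing the IP WebCam stream into OpenCV should be about this much code.  A sketch only – the address and port are whatever the app reports on your WLAN (the one below is made up):

import cv2

STREAM_URL = 'http://192.168.1.23:8080/video'   # hypothetical phone address

cap = cv2.VideoCapture(STREAM_URL)               # pull the MJPEG stream from the phone
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow('phone camera', frame)            # stand-in for the real processing
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()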

Mind you, Snapdragon S2 based phones will drop in price soon enough (makes me wonder why you would bother with a BeagleBoard if an Android phone plus Arduino would give you more for less) – my S1 based phone was $70.

So busy with work but here are updates to date

Posted in Chase That Dog!!, Ground, Prototyping, Vision on December 3, 2011 by asteriondaedalus

ARDrone now in possession.  Second hand from a pal who has moved to China for work.  Good luck D!

I am buying a house so stocked up on a few things in the meantime.  9DOF Razor etc.

I got a DF Bluetooth for the ROMEO as the USB Host board covered too much around the edges.  The initial experiments will be from the laptop anyway (which now sports dual-boot Ubuntu and Windoze).

The first experiments I am working towards are using IP WebCam (on Android) and Cambozola to produce something like the functionality of the WowWee Rovio (basic video streaming and remote control), before moving on to using OpenCV to experiment with optical flow.  OpenCV is slowly being ported (under open source by TI – see here and here, otherwise go here, otherwise here) to the OMAP devices to accelerate it on the ARM side of the world – so no need to rush to install it on Android just yet.

[Qualcomm Snapdragon S2 and above have an already tuned, fast CV library (go here). Pity my phones are S1 😦 ]

The Android phone (acting as the video streaming service) will be taped to the front of the indoor carriage as there is no real need to put any more dings in the ARDrone (just yet).

The neat thing about IP WebCam, from an integration point of view, is that it could just as easily feed a video stream to an application on the very phone that is serving the video.  A nice way of decoupling the functionality.

Also grabbed a MEGA IO shield on sale.  It has an SD card pinout (I already have the SD card breakout) and an XBEE site.  This will be great for experimenting with the OpenMAV protocol when moving to the outdoor carriage.  It will also be used for the prototyping work – so the more capable boards don’t get handled as much OR charred if something goes wrong 😉