Archive for the Software Framework Category
So, I plugged in my recently acquired endoscopic camera to see what it might look like through the omni lens. Of course, I didn’t expect it to work out too well because of the LED light which would flood the lens.
I installed VS2012 (it doesn’t like my Windows 7, apparently, but seems to run any old way).
So, I poked around the examples from openFrameworks and found one that was a simple frame grabber. I had to modify it to point at my camera (there is another pesky camera definition on my system, which I think is left over from an old clapped-out USB camera that crashed the computer when I tried running it; it looks like it never installed properly, as usual).
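Since the grabber tweak boils down to picking the right device index and skipping the dud entry, here is a minimal sketch of that probe-and-pick logic in Python. The device names and the "broken" blacklist string are made up for illustration; in openFrameworks the equivalent is enumerating with listDevices() and calling setDeviceID() before initGrabber().

```python
# Sketch: pick a capture device by name from an enumerated list, skipping
# known-bad entries (e.g. a stale driver left by a dead USB camera).
# Device names here are illustrative, not from the actual system.

def pick_device(devices, preferred, blacklist=()):
    """devices: list of (index, name) pairs. Return the index of the
    preferred device, avoiding blacklisted names; fall back to the
    first non-blacklisted device if the preferred one is absent."""
    fallback = None
    for idx, name in devices:
        if any(bad in name for bad in blacklist):
            continue  # skip the clapped-out camera's stale entry
        if preferred in name:
            return idx
        if fallback is None:
            fallback = idx
    return fallback

devices = [(0, "Legacy USB Cam (broken driver)"), (1, "USB Endoscope")]
print(pick_device(devices, "Endoscope", blacklist=("broken",)))  # -> 1
```

If you were doing the same thing in Python with OpenCV rather than openFrameworks, the chosen index would go straight into cv2.VideoCapture(idx).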
Anyway, the result is not earth shattering:
BUT this framework works on Linux, Windows and Android, so I can design and implement on my Windows box before porting to either the Android device or the BeagleBone. Pretty neat.
First I will take the OS X Bloggie-unwrapping example and work it towards a Windows visual bumper.
This is running on my new (second-hand) 64-bit Linux box.
This is stage 1.
Stage 2 was to buzz out my BeagleBone development environment. So far I have installed Eclipse and the ARMv7 toolchain onto the 64-bit Linux box and built a small binary to dump onto the BeagleBone. Unfortunately, I don’t appear to be able to configure the Linux host to “see” the ttyUSBx ports (having tried two different approaches). So, I’m waiting on user-group help (and you know how I feel about that).
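For what it’s worth, a quick stdlib-only Python sketch for checking whether the host can actually see and open the ttyUSBx nodes (paths and output are obviously system-dependent):

```python
# Sketch: report on /dev/ttyUSB* visibility and permissions (Linux).
import glob
import os
import stat

def port_status(path):
    """Return a small report on a serial device node."""
    if not os.path.exists(path):
        return {"exists": False}
    st = os.stat(path)
    return {
        "exists": True,
        "is_char_dev": stat.S_ISCHR(st.st_mode),  # real device node?
        "readable": os.access(path, os.R_OK),
        "writable": os.access(path, os.W_OK),
    }

# Probe whatever USB-serial nodes are present (fall back to the usual name).
for p in glob.glob("/dev/ttyUSB*") or ["/dev/ttyUSB0"]:
    print(p, port_status(p))
```

If the node exists but isn’t writable, the usual culprit on Debian-flavoured hosts is group permissions: the device nodes are typically owned by the dialout group, so adding the user to that group (or writing a udev rule) is the common fix; whether that is the issue here is another matter.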
In the meantime, I guess I still have to sort out which parts of MOOS-IvP I want to run on the BeagleBone (the non-visual apps, generally), so I got the thing running on my Linux host to get a feel for what can be pared back. I might try setting up my 32-bit Linux box to “pretend” to be the embedded environment to get the ball rolling while I sort out the cross-compilation environment.
MOOS-IvP runs as a separate environment from the vehicle computer, so I am pairing it with an AIO board which will do all the motor control, the 10-DOF sensing, etc.
The code for the AIO (a port of AP2) is MAVLink-literate, so I am thinking of bridging to the AIO via Python (which comes with the BeagleBone). There are various Pythonesque libraries I am toying with for application backbones, including SPADE2, ZeroRPC (which sits atop ZeroMQ) and iPOPO, or even a combination of all three depending on needs. Certainly, I am looking at ZeroRPC to act as a bridge out to the Synapse SNAP bits and bobs, since the SNAP mesh will replace the MAVLink mesh aspects.
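As a sanity check on the bridging idea, here is a minimal sketch of the shape such a bridge might take: a plain Python object that caches decoded MAVLink-style messages and exposes getters, which ZeroRPC could then serve over ZeroMQ to the rest of the system. The class name and message fields are invented for illustration, and the actual MAVLink decode (e.g. via pymavlink) is left out.

```python
# Sketch: cache flattened MAVLink-style messages behind an RPC-friendly
# object. AioBridge and the ATTITUDE fields below are illustrative only.

def mavlink_to_state(msg_type, fields):
    """Flatten a received MAVLink message into a plain dict for RPC."""
    return {"type": msg_type, **fields}

class AioBridge:
    def __init__(self):
        self._state = {}  # latest message of each type, keyed by type name

    def update(self, msg_type, fields):
        """Called from the MAVLink receive loop with each decoded message."""
        self._state[msg_type] = mavlink_to_state(msg_type, fields)

    def get_state(self, msg_type):
        """RPC-callable: return the last-known message of this type."""
        return self._state.get(msg_type, {})

bridge = AioBridge()
bridge.update("ATTITUDE", {"roll": 0.01, "pitch": -0.02, "yaw": 1.57})
print(bridge.get_state("ATTITUDE")["yaw"])  # -> 1.57

# Serving it with ZeroRPC would look roughly like this (not run here):
#   import zerorpc
#   server = zerorpc.Server(bridge)
#   server.bind("tcp://0.0.0.0:4242")
#   server.run()
```

The appeal of this shape is that the SNAP-mesh side only ever talks to the RPC surface, so the MAVLink plumbing underneath can be swapped without touching the rest.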
ZeroC has a distributed computing framework (Ice) that I came across recently. Originally, I was experimenting with AllJoyn, since it came from Qualcomm, a source of powerful graphics hardware etc. But AllJoyn needs work to build distributed computing on top of it, though it does offer ad hoc, local comms.