Archive for the Software Category
So, I played with Lisp years ago, and other functional languages.
Somewhere along the line the notion of thinking functionally became an anti-pattern – at least from the OO camp.
So, functional thinking has returned! Apparently.
I have been watching from the background; it was always going to come back, IMHO.
OO tried to be the panacea. But its biggest problem, I felt, was that it obscured the functional model of the application.
We still end up with a functional model, even if it's a tacit one.
In fact, sequence diagrams, activity diagrams and even robustness diagrams in UML – not to mention BPMN – are all functional modelling approaches.
The problem is you need the right orientedness for the problem expression.
It harks back to my chattering on Jason vs Jack in the Agent Oriented world. Jack obfuscates the Agent Orientedness of the problem, making you develop with Java OO libraries.
DSLs are in vogue, even if it's just @nnotations in Java or Python – allowing you to create your own “orientedness”.
Now, don’t get me started on parallel computing – recall I used to code in OCCAM on Transputers decades ago. It has been driven somewhat by the multi-core world but, of course, they haven’t architected for parallelism – just stuck lots of cores on a substrate hoping someone would sort out the coding problem.
So, how do we do it? We don’t use OCCAM!? Although Go! is a great alternative, I am thinking. Still, no one gets behind mature ideas like Plan 9?! We just start afresh with the same stale old ideas.
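For what it's worth, the OCCAM/CSP channel idea (which Go's channels echo) can be mimicked in plain Python with threads and a queue – a rough sketch of the communicating-processes style, not a real CSP runtime:

```python
import threading
import queue

def producer(ch):
    # Send values down the channel, then a sentinel to "close" it.
    for i in range(5):
        ch.put(i)
    ch.put(None)

def consumer(ch, out):
    # Receive until the sentinel, accumulating a running sum.
    total = 0
    while True:
        v = ch.get()
        if v is None:
            break
        total += v
    out.append(total)

ch = queue.Queue()   # plays the role of an OCCAM/Go channel
results = []
t1 = threading.Thread(target=producer, args=(ch,))
t2 = threading.Thread(target=consumer, args=(ch, results))
t1.start(); t2.start()
t1.join(); t2.join()
print(results[0])  # 10
```

The two threads share nothing except the channel – which is the whole discipline OCCAM enforced and multi-core vendors left as an exercise for the reader.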
Worth a look!
… it turned out to be a couple of hours, as I found I could connect to the BBB from the Windoze box but not from the 64-bit Debian box??
Turned out the BBB, like the BB, is a little dodgy and needs to be reset as you are not guaranteed an Ethernet connection on startup.
Makes it a nuisance for embedded applications, I imagine.
In any event, finally got around to doing hello world from Eclipse.
Just a start. Now the fun begins.
This is running on my new (second hand) 64-bit LINUX box.
This is stage 1.
Stage 2 was to buzz out my Beaglebone development environment. So far I have installed Eclipse and the ARM7 development environment onto the 64 bit LINUX box and built a small binary to dump onto the Beaglebone. Unfortunately, I don’t appear to be able to configure the LINUX host to “see” the ttyUSBx ports (having tried two different approaches). So, waiting on user group help (and you know how I feel about that).
In the meantime I guess I still have to sort out what parts of MOOS-IVP I want to run on the Beaglebone (the non-visual apps, generally), so I got the thing running on my LINUX host to get a feel for what can be pared back. Might try setting up my 32-bit LINUX box to “pretend” to be the embedded environment to get the ball rolling while I sort out the cross-compilation environment.
MOOS-IVP runs as a separate environment to the vehicle computer, so I am pairing it up with an AIO which will do all the motor control and the 10-DOF sensing etc.
The code for the AIO (from a port of AP2) is MAVLINK literate, so I am thinking of bridging to the AIO via Python (which comes with the Beaglebone). There are various Pythonesque libraries I am toying with for application backbones, including SPADE2, ZeroRPC (which sits atop ZeroMQ) and iPOPO – or even a combination of all three depending on needs. Certainly, I am looking at ZeroRPC to act as a bridge out to the Synapse SNAP bits and bobs, since the SNAP MESH will replace the MAVLINK mesh aspects.
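The shape of that bridge, sketched here with the standard library's xmlrpc so it runs anywhere – the real thing would swap in zerorpc.Server/zerorpc.Client over ZeroMQ, and the heartbeat/set_throttle methods are made-up stand-ins for MAVLINK-derived calls:

```python
# Stand-in for the ZeroRPC bridge idea using only the standard
# library. The exposed methods are hypothetical examples of what a
# MAVLINK-literate AIO bridge might serve.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

class AIOBridge:
    def heartbeat(self):
        return "AIO alive"
    def set_throttle(self, pct):
        return f"throttle={pct}%"

# Bind to an ephemeral port and serve in a background thread.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
port = server.server_address[1]
server.register_instance(AIOBridge())
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client side – on the Beaglebone, the laptop, wherever.
client = ServerProxy(f"http://127.0.0.1:{port}")
hb = client.heartbeat()
th = client.set_throttle(40)
print(hb, th)  # AIO alive throttle=40%
server.shutdown()
```

With ZeroRPC the server/client pair looks much the same but talks msgpack over ZeroMQ sockets, which is what makes it attractive as the hop out to the SNAP side.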
Go figure – now that I am chafing at the bit I am sucking in Python modules and trying to compile this and that and …
Of course, you can’t hook all you want to into one installation of Python. Some muck is 32bit, some muck is 64bit.
And of course some of the useful network based muck needs to be on a POSIX compliant Linux and not your WINDOZE box.
Because I wanted to use mucky 32-bit SNAP and SNAPpy (and especially Connect), it turns out I can’t have PyCUDA to port some of the mucky real-time SLAM examples, because that needs 64-bit muck. And that’s on Windoze, so there go some of the neater RPC and parallel programming tools.
I was even tempted to bang together either an ATOM motherboard I have around the place or a couple of dual-core AMD64 boxes with Linux, so I can have a 32-bit LINUX and a 64-bit LINUX.
Of course the LINUX leaning makes sense because of ROS, but I was actually looking at other distributed cache and linda-like tuple spaces as I don’t get the impression the ROS topics give you much more than that (and I wanted something that ran on 32bit without LINUX).
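By a Linda-like tuple space I mean something along these lines – a minimal, non-blocking sketch (a real tuple space would block in `in_`/`rd` until a match appears, and ROS topics layer pub/sub on roughly this kind of shared store):

```python
# Minimal Linda-like tuple space: out() writes tuples, in_() takes
# (removes) the first tuple matching a template, rd() reads without
# removing. None in a template acts as a wildcard.
class TupleSpace:
    def __init__(self):
        self.space = []

    def out(self, tup):
        self.space.append(tup)

    def _match(self, template, tup):
        return len(template) == len(tup) and all(
            t is None or t == v for t, v in zip(template, tup))

    def in_(self, template):
        for tup in self.space:
            if self._match(template, tup):
                self.space.remove(tup)
                return tup
        return None  # a real tuple space would block here

    def rd(self, template):
        for tup in self.space:
            if self._match(template, tup):
                return tup
        return None

ts = TupleSpace()
ts.out(("pose", 1.0, 2.0))
ts.out(("battery", 87))
print(ts.rd(("pose", None, None)))  # ('pose', 1.0, 2.0)
print(ts.in_(("battery", None)))    # ('battery', 87)
print(ts.rd(("battery", None)))     # None
```

Which is about all the coordination the rovers need, and it runs anywhere Python does – no 32-bit/64-bit or LINUX-only strings attached.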
Most of this is because I am looking at offloading processing to a server over wifi so that you can get away with less smarts on the indoor/outdoor rovers. Especially since the SNAP Portal is running Python2.7/32bit I was restricted to that on the laptop. So a pure Python RPC is in order to skip between the laptop and the ATOM running 32bit LINUX/ROS.
Mind you, as exciting as all that sounds, it hasn’t been – going through the trial and error of trying to build things from the mucky instructions people write.
The one thing I learnt years ago, working in Defence on software maintenance, is that no one did good help files. With the advent of Open Source it’s all gone downhill further.
Still, all this is to match the RPC of SNAPpy with Python-based RPC across the server applications.
Especially if this pesky OpenCV compiles without muckiness on my laptop. They talk about Python bindings, but you need the bin directory filled; there was talk of Windoze binaries but they don’t appear in the download I chose. Otherwise import cv works fine, just no blinking module calls. None of this is clear, and the gist of the problem is the depth and breadth of a priori knowledge you appear to need to get it up and running.
I have looked at two AHRS firmwares for the Razor. One of them doesn’t appear to have a means to calibrate, but also appears to claim it isn’t needed. The nuisance: the yaw reading (compass) didn’t seem to behave the way I would have expected.
It appeared that the north-south axis wasn’t at 90 degrees to the east-west axis.
Now, I found an EMI meter (so-called) for my iPhone and … well I should have known, I was trying to calibrate the compass while the sensor sat on the top of my laptop.
There was beer involved.
Mental note to self: use a longer USB cable and find a spot at least half a meter from the laptop.
Still, I am happier with the other version of the AHRS, as the on-screen behavior seems to equate (mostly) with the behavior of the sensor. It still needs calibration, so there’s still a little work to do.
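For that calibration step, the usual starting point is hard-iron correction: spin the sensor through a full rotation, take the midpoint of the min/max on each magnetometer axis as the offset, and subtract it before computing heading. A sketch with made-up sample data (the heading convention here is simplified – no tilt compensation):

```python
import math

def hard_iron_offsets(samples):
    """Hard-iron offset per axis: midpoint of the observed min/max."""
    xs, ys = zip(*samples)
    return ((max(xs) + min(xs)) / 2.0, (max(ys) + min(ys)) / 2.0)

def heading_deg(mx, my, offsets):
    """Heading from calibrated X/Y magnetometer readings (flat sensor)."""
    ox, oy = offsets
    return math.degrees(math.atan2(my - oy, mx - ox)) % 360.0

# Made-up raw readings from slowly spinning the sensor: a circle of
# radius 100 shifted by a hard-iron bias of (+30, -20) – the kind of
# bias a nearby laptop would add.
raw = [(30 + 100 * math.cos(a), -20 + 100 * math.sin(a))
       for a in [i * math.pi / 18 for i in range(36)]]
off = hard_iron_offsets(raw)
print(round(off[0]), round(off[1]))       # 30 -20
print(round(heading_deg(130, -20, off)))  # 0
```

A constant bias like this also explains axes that look non-orthogonal: the offset squashes the response circle off-centre, so east-west and north-south no longer appear 90 degrees apart until the offset is removed.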