Archive for the Sensing Category

Woof woof woof

Posted in Arduino, MQTT, Sensing on June 18, 2017 by asteriondaedalus

A neighbor complained that the dogs were barking all day.

Likely because of all the renovation work going on (late into the night as well).

I looked around for software so I could detect the barking and potentially turn sprinklers on for a minute – to distract them.

Another option is a buzzer, so they associate the buzzer with the sprinkler coming on.  Then, later, the sprinkler needn't come on.

HE is funny.  Not liking to get wet, he has an aversion to the solar-powered, IR-triggered sprinkler I use to at least keep them away from the side of the house.

The psychology is funny though: they stop short of going down the side of the house and bark anyway.

Much of the software out there is really noddy: simple level sensing.  So I looked at microphones and downloaded Python scripts etc.  The software would also send you an email if the dogs barked.

Still, the problem with running it on the PC was obvious.  The problems with setting up a microphone on a C.H.I.P. or Orange Pi were not insurmountable, BUT that was still overkill.

My time in signal processing got me excited about using correlations to discern dogs barking from other loud noises, but there are simpler tricks.  A bark would be a loud noise of roughly fixed duration that repeats.  So, a window with, say, four loud noises could count as barking in an otherwise quiet backyard.
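That window trick is simple enough to sketch in Python (the threshold values and names here are mine, just to illustrate the idea):

```python
from collections import deque
import time

WINDOW_S = 10.0   # how far back to look, in seconds (my guess at a sensible value)
MIN_EVENTS = 4    # loud noises inside the window before we call it barking

_events = deque()  # timestamps of recent loud-noise events

def loud_noise(now=None):
    """Record one loud-noise event; return True once it looks like barking."""
    now = time.time() if now is None else now
    _events.append(now)
    # slide the window: forget events older than WINDOW_S
    while _events and now - _events[0] > WINDOW_S:
        _events.popleft()
    return len(_events) >= MIN_EVENTS
```

Feed it one call per level-detector trip: a lone slammed door never reaches four events, but a run of barks does.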

The thought of the signal processing didn’t daunt me, but the turn around time to get something running did.  So, the plan is a super duper system later, at the moment I just want to monitor their behaviour and sort something out that might curb it.

So I thought I would go with an Arduino-compatible microphone/sound sensor instead.  An ESP-01 would do it, as the sensor has a tuneable level detector and raises a signal if the level is at or over the threshold set.

However, it also puts out an analogue signal from the mic, so a WeMos D1 mini would be better since it has an ADC input.

arduino-compatible-microphone

The sound sensor takes 5V, which one of the pins on the D1 mini supplies.  This way I can pump out noise events from the level detector early on using MQTT.  And later, I can pump analogue samples out over WiFi.

I am already using node-red to send emails from the OPiZ so I can pump out hourly “woof” stats.   This setup is also cheap enough to set a couple up so that different parts of the yard (and thereby irrigation sectors) can play their part.
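The stats side can be a small Python script on the OPiZ sitting next to node-red.  The broker address and topic layout below are made up for the sketch; it assumes the paho-mqtt package and one D1 mini per yard sector, each publishing when its level detector trips:

```python
import time
from collections import defaultdict

class WoofStats:
    """Tally woof events per hour of day, for the hourly email."""
    def __init__(self):
        self.counts = defaultdict(int)

    def record(self, ts=None):
        # bucket by hour (UTC here for simplicity)
        hour = time.gmtime(time.time() if ts is None else ts).tm_hour
        self.counts[hour] += 1
        return self.counts[hour]

stats = WoofStats()

def run(broker="192.168.1.10", topic="yard/+/woof"):
    # assumes the paho-mqtt package; broker address and topic names are
    # placeholders for whatever the D1 minis are actually set up with
    import paho.mqtt.client as mqtt
    client = mqtt.Client()
    client.on_message = lambda c, userdata, msg: stats.record()
    client.connect(broker)
    client.subscribe(topic)
    client.loop_forever()
```

node-red can then poll the counts hourly and mail them out.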

I have a suspicion neighbors are not helping and banging the fence.  So I am looking for a vibration sensor to get stats on that as well.


Stereo slam dunk

Posted in Python RULES!, Sensing, The Downside of software development, Vision on June 11, 2017 by asteriondaedalus

With some pain I got the stereo camera that turned up the other day, from AliExpress, to work (provisionally).

stereo_slam1

This is on my windoze PC using 64bit Stackless Python and OpenCV 3.2.

The trick that stopped me for two days was working out the problem where one or the other camera would work, but both together hung.  I would swap the order and get the same thing.

Turned out to be USB 2.0 choking.  So the fix was to work out how to set the image size small enough for the two camera streams to cooperate on the one USB port.
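A sketch of that fix in Python with OpenCV: walk down a ladder of sizes until both cameras actually deliver frames together.  The camera indices and the ladder itself are my assumptions:

```python
SIZES = [(1280, 720), (640, 480), (352, 288), (320, 240)]  # my trial ladder

def next_size(current):
    """Next smaller size to try, or None when the ladder is exhausted."""
    i = SIZES.index(current)
    return SIZES[i + 1] if i + 1 < len(SIZES) else None

def open_stereo(left_idx=0, right_idx=1):
    # assumes opencv-python (cv2); returns the capture pair once
    # both cameras manage to read a frame at the same size
    import cv2
    size = SIZES[0]
    while size is not None:
        w, h = size
        caps = [cv2.VideoCapture(i) for i in (left_idx, right_idx)]
        for cap in caps:
            cap.set(cv2.CAP_PROP_FRAME_WIDTH, w)
            cap.set(cv2.CAP_PROP_FRAME_HEIGHT, h)
        if all(cap.read()[0] for cap in caps):   # both frames arrived?
            return caps
        for cap in caps:
            cap.release()
        size = next_size(size)                   # USB choked; step down
    raise RuntimeError("no size let both cameras stream together")
```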

Camera is this one:

stereo_slam2

It has specs of: 1280*720 MJPEG 30fps 120-degree dual-lens USB camera module, HD CMOS OV9712.  Which is, as it turns out, a lie in this configuration.  The device is USB 2.0, so it will choke when trying to pump both streams through at the same time.  Some work will be needed to sort out the maximum resolution the cameras can be set to – there is likely some black magic math somewhere (or trial and error).
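The black magic is mostly arithmetic, at least if the driver falls back to uncompressed YUY2 (2 bytes per pixel) rather than honouring MJPEG.  A rough Python check, with ~35 MB/s as my guess at the usable USB 2.0 payload:

```python
USB2_BUDGET = 35_000_000   # rough usable bytes/s on one USB 2.0 port (assumption)

def stream_rate(w, h, fps, bytes_per_px=2):
    """Raw bytes/s for one uncompressed YUY2 stream."""
    return w * h * bytes_per_px * fps

def both_fit(w, h, fps, cams=2):
    """Can this many raw streams share one USB 2.0 port?"""
    return cams * stream_rate(w, h, fps) <= USB2_BUDGET
```

By that estimate, 1280x720 at 30fps needs about 110 MB/s for the pair, and even 640x480 just misses, which lines up with the choking.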

I haven’t used much science in the selection (I waited until prices dropped and grabbed the lowest-priced one at the time).  I opted for the wider field of view because I suspect that creates greater disparity between points to help localisation – however, don’t quote me, as that is not backed up by any reading at the moment.

The hangup, at the moment, is that while the two cameras are working, OpenCV has its various matrix types, and the rotten thing has (as usual) thin or sporadic documentation.

If you find any “help”, it will be using deprecated functions (from previous versions of OpenCV), or be in C++, etc.

Even just a disparity map, that uses the stereo image to show depth planes, needs matrix conversions.
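Those conversions, in Python, amount to a grayscale conversion going in and a fixed-point conversion coming out (StereoBM returns int16 disparities scaled by 16).  The parameter values are my starting guesses:

```python
def disparity(left_bgr, right_bgr):
    # assumes opencv-python (cv2); inputs are colour frames from the two captures
    import cv2
    grayL = cv2.cvtColor(left_bgr, cv2.COLOR_BGR2GRAY)   # StereoBM wants 8-bit gray
    grayR = cv2.cvtColor(right_bgr, cv2.COLOR_BGR2GRAY)
    bm = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    return bm.compute(grayL, grayR)                      # int16, disparity * 16

def to_pixels(row):
    """Undo StereoBM's fixed-point scaling for display or depth math."""
    return [v / 16.0 for v in row]
```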

Still, once these are worked out I can buzz out a design on the PC before migrating to an embedded form factor (C.H.I.P., ODROID-C0 or Orange Pi Zero, perhaps even old Android phone).

I am after something to pump a point cloud out.  Using mono-SLAM is fun, but I am not sure that having to get the camera video processing and the platform pose working together is the happiest medium – especially since people are helping out with stereo cameras like this one.

Quality Control

Posted in Sensing on May 13, 2017 by asteriondaedalus

So, I decided to put a dispute into AliExpress to get half my $5 back on the lens I ordered.

No markings, so you can’t easily tell what focal length it is supposed to be.  Go figure: I pinged the vendor and asked how they knew what lens to send if there are no markings, and they “apologized” as they couldn’t tell me what focal length it was.

On principle, then, I will try to get half my money back, which is likely less than the transaction fees on the VISA card but will make the vendor think about their quality control – I hope.

Go figure, in the end I found a couple of 2.8mm f/l lenses that are marked, and decided I can play with the one I have and get an even tighter f/l later, which (again) increases the maximum velocity over the ground the sensor can be used for, BUT IMPORTANTLY means I can trade that off and get the sensor closer to the ground for use on my rock crawler.  The 3.6mm “imposter” does bring the sensor height down, but it also has a 90-degree field of view, which constricts it somewhat.  Any hint of motion in the field of view goes towards the odometry, doesn’t it!

The options I had were 100-degree and 110-degree fields of view.  I opted for the 2.8mm f/l 110-degree to get more image crammed in.
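The trade-off is back-of-envelope geometry: the ground footprint is 2·h·tan(FOV/2), and the maximum speed the sensor tolerates scales with that footprint.  A quick Python check (my own working, not from any datasheet):

```python
import math

def footprint(height, fov_deg):
    """Width of ground the sensor sees from a given mounting height."""
    return 2 * height * math.tan(math.radians(fov_deg) / 2)

def height_for_same_footprint(h_90, fov_deg):
    """Mounting height a wider lens needs to see the same ground width
    as a 90-degree lens mounted at height h_90."""
    return h_90 * math.tan(math.radians(45.0)) / math.tan(math.radians(fov_deg) / 2)
```

So swapping the 90-degree “imposter” for the 110-degree lens should let the sensor sit at roughly 70% of the height for the same footprint, and hence the same speed margin.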

Parallel versus Concurrent

Posted in Parallel Talk, Sensing on December 26, 2016 by asteriondaedalus

So, I fell upon Chapel, a parallel programming language.  It makes sense for things that Erlang and Elixir are not good for – like image processing.

It isn’t Google’s,  it’s actually Cray’s.

So fun facts:

  1. When working in Canada I came across a coffee mug in a second-hand store that was from Cray.
  2. You can cram a replica of a Cray into an FPGA.

Now I have Erlang up and running on my ODROID-C1 home server.  I will get Elixir and Phoenix running, as the wife will likely want something less geeky than node-red for her interface to the home automation.

In the meantime, for fun, Chapel is at this very moment compiling on the OD-C1.  Why not – quad core.  Apparently, for non-Cray monsters, you simply need a UDP module compiled, plus a lot of fiddling, and you can have two or more nodes (locales) running.

In any event, the parallel hello world found me four cores on my OD-C1.

hello-chapel

Black Magic

There was a configuration script to run, but other than that I did not have to tell it how many cores, so some buried smarts do the job.  Otherwise, the code to run the print on all four cores is as straightforward as:

config const numTasks = here.maxTaskPar;
coforall tid in 0..#numTasks do
   writeln("Hello, world! (from task " + tid + " of " + numTasks + ")");

There is also an image processing tutorial using Chapel.

There are even lecture notes around.

Zippity do dah, zippity ZAPPP!

Posted in Hardware, Sensing, The after market on December 25, 2016 by asteriondaedalus

Looking at the OpenSprinkler circuit, the use of SCRs to control the solenoids appears to require the addition of TVS diodes aplenty.

That would be a design decision.

As OpenSprinklette is using relays to drive the 24VAC solenoids we need not apply protection there.

The nuisance factor, though, is that trying to source non-SMD bidirectional channelled TVS diodes for the rain gauge input is not so much fun.

For the prototype at least, without any SMD pads on the prototype shield, we are happier with DO-15 or similar package types for the protoboard’s pad spacings.

Luckily though, AliExpress came to the rescue.  I have ordered a packet of 100 x 36V 1500W bidirectional TVS diodes for US$11.  Now, the original design calls for 48V, but the markings on those devices suggested unidirectional and the text supporting them did not mention bidirectional.  So, just in case, I have ordered 10 x 39V 1500W for US$2.80.  That’s the problem with AliExpress and working with the penny-market vendors therein.  You take what you can get.

So, rain gauge and TVS diodes are on the way.  I finally decided the shield I am designing will have an input for a rain gauge, for those who want a minimal four-sector setup.  The idea of a rain gauge with an ESP8266 has not gone away though.  Since we are using MQTT and node-red, we can build up as we please – over time.


Lidar for pennies – almost

Posted in Open Source can be professional, Sensing on May 8, 2014 by asteriondaedalus
Looking good!

Sub-$500 laser scanner on the market.  Competition should explode here and bring prices down further.

Looking good for home hobbyist.

Now that’s an idea!

Posted in Android, Arduino, Sensing, Vision on May 2, 2014 by asteriondaedalus
Cyclops!

Feast your eyes!

The beautiful accidents, right?

The GripGo base, once I ripped the suction cup off (it broke off without any real resistance), was a couple of millimetres larger in diameter than a circular “ledge” on the back of the indoor rover, so a liberal smear of epoxy and voila!!  An adjustable mount for my old Samsung SII.

Do notice the omnidirectional lens on top.  That is a “spare” that turned up.  It cost $60 but came with an old knockabout Bloggie, which I will find some other use for.

Unfortunately the second Bloggie fell through; the poor dear at the other end of the interweb was too dull to know how to mail something interstate – oddly, this one comes from twice as far away.  Can’t fathom it, but there are still Luddites around.

Anyway, I have two of these lenses as well as the “periscope” version (for which I am waiting on the USB catheter camera).  So, plenty if you ask me.

I will need to get a student copy of Matlab to use a fancy calibration library, which looks like it lifts any distortion out of the image once it is unwrapped.  Even if that falls through, a simple wall detector will allow this beast to avoid bumping into walls – we hope … mwahahahahaha!

Recall the phone runs S4A and Python, so while the application will be in C++ or Android, it can ping a socket with steering directions – or that is the plan.

Beauty is that the adjustable holder allows other configurations when other algorithms and vision approaches are being played with.

Nope, same Rover.