Yearly Archives: 2015

Re-working Wall-E, Part I

Posted October 25, 2015

About a month ago I posted an article describing ‘the end of the road’ for the spinning-LIDAR implementation on my 2-motor Wall-E wall-following robot (see the post here).  Since then I haven’t had the time (or frankly, the inclination) to start the process of re-working Wall-E.  However, today I decided to start by removing the spinning LIDAR and all the supporting pieces, stripping Wall-E back to bare-bones as shown in the photo below.

Wall-E stripped back to bare-bones. Note removed spinning LIDAR parts on left.

The LIDAR itself (or its older silver-label cousin) will go back on, but in a fixed forward-looking configuration.

I will probably also take the time now to make a couple of other improvements, such as replacing the left-side (right side in the above photo) blue plastic bumper with the improved red plastic one, and replacing the charging jack so the 4WD and the 2WD versions can share the same charging adapter.

Stay tuned!

Frank
Building up the New 4WD Robot – Part 2

Posted 10/13/15

After getting the battery pack and charger module assembled and working properly, it was time to integrate it into the 4WD chassis and add the motor driver and navigation controller subsystems. The battery pack/charger module was constructed in a way that would allow it to be mounted in the motor bay with the motors, giving it some protection and getting it out of the way of the rest of the robot. The DFRobot 4WD kit comes complete with a power plug and SPDT power switch, and these were used for charging and main robot power switching, respectively.

Battery pack being installed in the motor bay. Note power switch and charging plug on left

Battery pack installed in motor bay

Battery pack installed in motor bay

Maintenance access to motor bay and battery pack

Motor bay closed, with motor and power wiring shown

After getting the power pack installed and the motors wired, it was time to move on to the ‘main deck’ components – namely the Arduino Mega and the two dual-motor drivers.  We spent some quality time trying different component layouts, and finally settled on the one shown in the following photo.

Initial component placement on the main deck

The motor drivers and the Arduino were mounted on the main deck plate by putting down a layer of adhesive-backed UHMW (ultra-high molecular weight polyethylene) tape to insulate the components from the metal deck plate, and then a layer of double-sided foam tape to secure the components to the UHMW tape. In the photo above, the Arduino is mounted toward the ‘front’ (arbitrarily designated as the end with the on/off switch and charging plug), and the motor drivers are mounted toward the ‘rear’.

After mounting the motor drivers and the Arduino, we added a terminal strip at the rear for power distribution, ribbon cables from the Arduino to the motor drivers, and cut/connected the motor wires to the motor driver output lugs.  The result is shown in the photos below.

Ribbon cable initial installation

Ribbon cable initial installation

Final ribbon cable routing

To test this setup, we borrowed a nice demo program created by John Boxall of Tronix Labs. This demo was particularly nice, as it used the exact same flavor of L298-based motor driver as the ones installed on our robot, so very little adjustment was required to get the demo code to run our 4WD robot. After fixing the expected motor wiring reversal errors, and getting all the wheels to go in the same direction at the same time, we were able to video the ‘robot jig’ as shown below.

 

So, at this point we have a working robot, but with no navigational capabilities.  Now ‘all’ we have to do is add the XV-11 NEATO LIDAR on the sensor deck, get it to talk to the Arduino, and then get the Arduino smart enough to navigate.

Stay tuned! ;-).

Frank and Danny

Building up the New 4WD Robot – Part 1

Back in May of this year I purchased a 4-wheel drive robot kit from DFRobot as a possible successor to my then-current 3-wheeled Wall-E (two drive motors and a small castering nose wheel). I didn’t have time to do more than just assemble the basic kit (see this post), so it spent the intervening months gathering dust on my shelf. Coincidentally, my wife arranged with our son to kidnap her grand-kids for a week (giving our son and his wife a much-needed break, and giving us some quality grand-kid time), so I decided to brush off the 4WD robot kit as a fun project to do with them, in parallel with re-working Wall-E to remove its spinning LIDAR assembly and replace it with a hybrid LIDAR/sonar setup.

The plan with the 4WD robot is to incorporate another spinning-LIDAR setup – this one utilizing the XV-11 spinning LIDAR system from the NEATO vacuum cleaner. This system rotates at approximately 300 RPM (5 RPS), so there is a decent chance that it will be fast enough for effective wall navigation (the Wall-E LIDAR setup couldn’t manage more than about 200 RPM, and that just wasn’t good enough).

However, before we get to the point of determining whether or not the XV-11 LIDAR system will work, there is a LOT of work to be done. At the moment, I can see that there are four major subsystems to be implemented:

  • Battery Supply and Charger
  • Motor Controller integration
  • XV-11 LIDAR controller
  • Navigation controller

Battery Supply and Charger

In preparation for the project, I purchased two 2000 mAh Li-Ion batteries and a supply of ‘basic Li-Ion charger’ modules from SparkFun. In my previous work with Wall-E, I had devised a pretty decent scheme for charge/run switching, using a small 2-pole, double-throw relay to switch the battery pack from a series connection for running the robot to independent-parallel for charging, so I planned to use the same setup here. After the usual number of screwups, I wound up with a modular battery pack system that could be tucked away in the motor compartment of the 4WD robot.

Battery pack showing charging modules and switching relay. The tan capacitor-looking component is actually a re-settable fuse

Battery pack showing charging modules and switching relay. The tan capacitor-looking component is actually a re-settable fuse

In the above photos, the tan component that looks very much like a non-polarized capacitor is actually a re-settable fuse! I really did not want to tuck this battery pack away in a relatively inaccessible location without some way of preventing a short-circuit from causing a fire or worse. After some quality time with Google, I found a Wikipedia entry for the ‘polymeric positive temperature coefficient device’ (PPTC, commonly known as a resettable fuse, polyfuse, or polyswitch). These devices transition from a low to a high resistance state when they get hot enough – i.e. when the current through them stays above a threshold level for long enough. They aren’t fast (switching time on the order of seconds for 5X current overload conditions), but speed isn’t really a factor for this application. I don’t really care if I lose a controller board or two, as long as I don’t burn the house down. Even better, these devices reset some time after the overload condition disappears, so (assuming the batteries themselves weren’t toasted by the overload), recovery might be as simple as ‘don’t do that!’.
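As a toy model of that trip behavior (the hold current and trip time here are made up for illustration, not taken from any datasheet), the PPTC only opens after the current stays above threshold for a sustained stretch:

```cpp
// Toy model of a PPTC resettable fuse: it 'trips' (goes high-resistance)
// only after the current has stayed above the hold threshold for a
// sustained run of samples. All thresholds are illustrative.
bool pptcTripped(const float* currentA, int samples,
                 float holdA, int tripSamples) {
    int run = 0;  // consecutive samples above the hold threshold
    for (int i = 0; i < samples; ++i) {
        run = (currentA[i] > holdA) ? run + 1 : 0;  // reset on any dip
        if (run >= tripSamples) return true;        // sustained overload
    }
    return false;
}
```

A brief current spike (one sample over threshold) leaves the fuse closed; a sustained overload opens it – which is exactly the ‘slow but safe’ behavior described above.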

Motor Controller Integration

When I got the 4WD robot kit, I also purchased the DFRobot ‘Romeo’ controller module. This module integrates an Arduino Uno-like device with dual H-bridge motor controllers, with outputs for all 4 drive motors. Unfortunately, the Romeo controller has only one hardware serial port, and I need two for this project (one for the PC connection, and one to receive data from the XV-11 LIDAR). So, I plan to use two Solarbotics dual-motor controllers, and an Arduino Mega as the controller.

XV-11 LIDAR Controller

The XV-11 LIDAR from the NEATO vacuum has two interfaces: a 2-wire motor drive that expects a PWM signal from a FET driver, and a high-speed serial output that transfers packets containing RPM, status, and distance data. A company called ‘Get Surreal’ has a really nice Teensy 2.0-based module and demo software that takes all the guesswork out of controlling the XV-11, but its only output is to a serial monitor window on a PC. Somehow I have to get the data from the XV-11, decode it to extract RPM and distance data, and use the information for navigation. The answer I came up with was to use the Get Surreal PCB without the Teensy as sort of a ‘dongle’ to allow me to control the XV-11 from an Arduino Mega. XV-11 serial data is re-routed from the Teensy PCB connector to Rx1 on the Arduino Mega. The Mega, running the same demo code as the Teensy, decodes the serial packets, extracts the RPM data, and provides the required PWM signal back to the driver FET on the Teensy PCB. Not particularly fancy, but it works!
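The speed-hold part of that loop boils down to something like the following – a simplified proportional step standing in for the full PID loop in the demo code, with an illustrative gain and the usual 8-bit PWM clamp:

```cpp
#include <algorithm>

// Hypothetical speed-hold step: nudge the PWM duty cycle toward the
// value that keeps the XV-11 at the target RPM. The gain and clamp
// values are illustrative, not taken from the Get Surreal firmware.
int adjustLidarPwm(int currentPwm, float measuredRpm, float targetRpm) {
    const float kP = 0.1f;                   // proportional gain (assumed)
    float error = targetRpm - measuredRpm;   // positive -> spin faster
    int pwm = currentPwm + static_cast<int>(kP * error);
    return std::max(0, std::min(255, pwm));  // clamp to 8-bit PWM range
}
```

Each time an RPM value is decoded from the serial stream, a call like `adjustLidarPwm(pwm, rpm, 300.0f)` yields the next duty cycle to write to the driver FET.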

XV-11 Controller V1.1 without the Teensy 2. The original connectors were replaced with right-angle versions

Side view of the XV-11. The right-angle connectors lower the height considerably

XV-11 connected to the ‘controller dongle’

The ‘before’ and ‘after’ versions of the XV-11 controller

During testing of the ‘XV-11 dongle’, I discovered an unintended consequence of the change from upright to right-angle connectors.  As it turned out, the right-angle connectors caused the pin assignments to be reversed – yikes!  My initial reaction to this was to simply pull the pins from the XV-11 connectors and re-insert them into the now-proper places.  Unfortunately, this meant that I could no longer go back to controlling the XV-11 from the original controller module – bummer.  So, I wound up constructing two short converter cables to convert the now-reversed XV-11 cables to the ‘normal’ sense for the Get Surreal controller module.  Nothing is ever simple….

Navigation Controller

The navigation controller for the 4WD robot will be an Arduino Mega 2560. This board was chosen because it has LOTS of digital/analog/PWM I/O and multiple hardware serial ports. You can’t have too many I/O ports, and I need at least two hardware serial ports (one for the USB connection to the host PC, and one to interface to the NEATO XV-11 spinning LIDAR). The downside to using the Mega is its size – almost twice the area of the Arduino Uno.

Arduino Mega 2560 for LIDAR I/O and navigation control

End of the Road for Wall-E’s Spinning LIDAR System :-(

30 September, 2015

Sad to say, but I believe I have ‘hit the wall’ on development of an effective spinning-LIDAR navigation system for Wall-E. Even with a souped-up spinning platform (approx 3 rev/sec), I have been unable to reliably navigate a straight hallway, much less recover from stuck conditions. In addition, the combination of drive motor currents and the increased current required to drive the spinning platform at the higher rate drains the battery in just a few minutes. So, I have reluctantly concluded that the idea of completely replacing the acoustic sensors with a spinning-LIDAR system is a dead end; I’m going to have to go back to the original idea of using the acoustic sensors for wall following, and a fixed-orientation LIDAR for stuck detection/recovery.

I started down the spinning-LIDAR road back in April of this year, after concluding a series of tests that proved (at least to me) that the use of multiple acoustic sensors was not going to work due to intractable multi-path and self-interference problems. In a follow-up post at the end of that month, I speculated that I might be able to replace all the acoustic sensors with a spinning-LIDAR system for both wall following and ‘stuck detection’. At the time I was aware of two good candidates for such a system – the NEATO XV-11 robot vacuum’s spinning-LIDAR subsystem, and the Pulsed Light ‘LIDAR-Lite’ component. I chose to pursue the Pulsed Light option because it was much smaller and lighter than the XV-11 module, and I thought it would be easier to integrate onto the existing Wall-E platform. Part of the appeal of this option was the desire to see if I could design and build the necessary spinning platform for the LIDAR-Lite device, based on a 6-wire slip-ring component available through Adafruit.

In the time since that post at the end of April, I successfully integrated the LIDAR into the Wall-E platform, but only recently got to the point of doing field tests.  The first set of tests showed me that the original spin rate of about 120 RPM (about 2 RPS) was way too slow for successful wall following using my original ‘differential min-distance’ technique, and so I spent some time investigating PID control techniques as a possibility to improve navigation.  Unfortunately, this too proved unfruitful, so I then investigated ideas for increasing the LIDAR’s spin rate.  The LIDAR spinning platform is driven by a drive belt (rubber O-ring) attached to a pulley on a separate 120 RPM DC motor.  My original setup used a pulley ratio of approximately 1:1, so the motor and LIDAR both rotated at approximately 120 RPM.  By increasing the diameter of the drive pulley, I was able to increase the LIDAR rotation rate to approximately 180 RPM (about 3 RPS), but unfortunately even that rate was too slow, and the increased drive motor current rapidly drained the batteries – rats!
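The belt-drive math behind that pulley change is simple; a quick sketch (the pulley diameters here are made up for illustration, only the ratio matters):

```cpp
// Belt-drive speed relation: the driven (LIDAR) platform spins at the
// motor RPM scaled by the pulley diameter ratio. Diameters are
// illustrative placeholders, not measurements from the actual robot.
float lidarRpm(float motorRpm, float drivePulleyDia, float drivenPulleyDia) {
    return motorRpm * (drivePulleyDia / drivenPulleyDia);
}
```

With a roughly 1:1 ratio the 120 RPM motor gives 120 RPM at the LIDAR; bumping the drive pulley to 1.5x the driven diameter gives the ~180 RPM described above.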

‘The Last Spinning LIDAR’ version. Note the large gray drive pulley. Gets the spin rate up to around 180 RPM, but at the cost of much higher battery drain.

So, while I had a LOT of fun, and learned a lot in the process, it’s time to say ‘Sayonara’ to the spinning-LIDAR concept (at least for Wall-E – still plan to try the XV-11 module on the 4WD robot) and go back to the idea of using acoustic sensors for wall following and a forward-looking LIDAR for obstacle avoidance and ‘stuck’ detection.

Stay tuned!

The Evolution of an Outside Shot – Part III

23 September 2015

Well, it’s been over a year since my last post on this subject, and while I’d like to say I’ve made great strides, in reality progress has been somewhat spotty. From November of last year until about two months ago I had been practicing my shot almost every day, and was seeing slow but steady improvement. I even got to the point where I was making the occasional 3-point shot ‘in competition’ (at my age, the ‘in competition’ is definitely in quotes!). However, while my shooting prowess was increasing, so was the pain in both my knees, and sometime in July it got to be too much to bear anymore. When a doctor’s visit and a set of X-rays ruled out skeletal damage as the culprit, I tried some massage therapy to see if that would help; it did, but not enough to get me back on the court. At the recommendation of the massage therapist, I started a physical therapy course, and although I haven’t quite finished the PT sessions, my knee pain has gone from ‘can’t-stand-it-anymore’ agony to ‘mild-old-age-annoyance’ stiffness. I’m not sure how much of this is just abstinence from shooting and how much is due to the exercises, but at this point I’m not sure I care! 😉

So, yesterday I went back to some mild practice shooting for the first time in a couple of months.  Instead of just shooting 3’s, I’m limiting my practices to short-range, foul-line, and what I call ‘2-1/2’ range (halfway between the foul line and the 3-point arc).  I was encouraged to find that the break had not erased all my skill gains (such as they were), and I was still able to shoot with a reasonable amount of good form for an ‘almost-septuagenarian’ ;-).  The following short videos show side and front views of my ‘2-1/2’ point range shooting form.   Hopefully I will be able to increase the frequency and duration of my practice sessions without re-injuring my knees.

PID Control Study for Wall-E

22 September, 2015

In my last post I described the results of some ‘field’ (long hallway) testing with Wall-E, with an eye toward validating my idea of using the ‘min distance angle’ as the primary input to my wall-following robot navigation algorithm. While the initial results of static testing were encouraging, the results of a more realistic simulated wall-following run, where I manually pushed the robot along the hallway, were decidedly discouraging. I became convinced that the ‘min distance angle’ just didn’t have sufficient resolution for good steering, and getting better resolution would require a significant re-design of the tach sensor setup (in progress, but…).

So, I started thinking about using a PID (Proportional-Integral-Derivative) controller with just the min distance as the input. PID controllers can be quite effective dealing with complex electromechanical processes, but they can be tricky to ‘tune’ correctly. After reading up on PID controllers in the Arduino world for a while, I ran across a nifty blog site managed by Brett Beauregard (http://brettbeauregard.com/). Brett is apparently a PID god, and he has graciously shared his knowledge with us mere mortals in the form of a number of introductory articles on PID design, a new Arduino PID library, a nifty PID Autotune library, and an active PID Google Group. Thanks Brett! I’d nominate you for sainthood, but I think you went past that a while ago ;-).

Anyway, after reading through a lot of the introductory material and even understanding some of it, I decided to give the PID Autotune library and the associated example sketch a try.  I downloaded the library, and fired up the example sketch using a spare Arduino Uno I had laying around.  After the requisite amount of fumbling around, I started getting some recognizable output on the serial monitor, and after a while I even figured out how to enable the auto-tuning feature.  The following printout and associated Excel plots show the results before and after tweaking the PID tuning constants.

PID Autotune library example. Data and Excel plot of before and after auto-tuning, with simulated input/output.

From the plots it is pretty obvious that the auto-tuned PID parameters do a much better job of acquiring and tracking the setpoint.

This is pretty awesome stuff, and I definitely plan to try a PID controller for Wall-E, my wall-following robot. However, here in the real world there are a few flies in the PID ointment, especially with respect to auto-tuning. Most significantly, the auto-tuning process takes about 9 cycles of input/output swings to come up with suggested tuning constants, and acquiring those 9 cycles without Wall-E wandering off into the middle of the living room, crashing into a wall, or (even worse) being snagged by the dreaded ‘stealth slippers from hell’ (aka the wife’s fuzzy slippers) could be a real challenge. I may just have to suck it up on this one and tune the PID constants manually; we’ll see.
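For reference, the core of a PID controller is small enough to sketch here. This is a generic textbook update step, not the Arduino PID library’s actual API (the names and the idea of feeding it the min distance are mine):

```cpp
// Minimal PID state: the three tuning constants manual tuning would
// iterate on, plus the accumulated and previous error.
struct Pid {
    float kp, ki, kd;   // proportional, integral, derivative gains
    float integral;     // accumulated error (for the I term)
    float prevError;    // error from the previous step (for the D term)
};

// One controller update: 'input' would be the measured min distance,
// 'setpoint' the desired wall offset; returns a steering correction.
float pidStep(Pid& pid, float setpoint, float input, float dt) {
    float error = setpoint - input;
    pid.integral += error * dt;                       // I term grows over time
    float derivative = (error - pid.prevError) / dt;  // D term resists change
    pid.prevError = error;
    return pid.kp * error + pid.ki * pid.integral + pid.kd * derivative;
}
```

Manual tuning then amounts to bumping `kp`, `ki`, and `kd` between runs until the robot tracks the wall without oscillating – slower than auto-tune, but no 9-cycle wander through the living room.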

Stay tuned!

Frank

 

Field-Testing the Improved Spinning LIDAR system

Posted 17 September, 2015

After getting the improved tachometer assembly integrated into Wall-E’s spinning LIDAR setup, I decided to repeat the hallway field testing that I performed back in July of this year (see this post for the details). My ‘theory of navigation’ for wall following is that I should be able to determine Wall-E’s orientation relative to a nearby wall by looking at where the spinning LIDAR’s minimum distance measurement occurs relative to Wall-E’s ‘nose’.

Just as I did back in July, I placed Wall-E a short distance away from a long straight wall, in three different orientations – parallel, 45-deg nose-in, and 45-deg nose-out. For each of these orientations I let the spinning LIDAR ‘look’ at the wall for about 10 revolutions, and then I manually changed the orientation. The distance and angle (actually the interrupt number, but that is the same as the angle) values were recorded in a text log.

The LIDAR Field Test Area. Note the dreaded fuzzy slippers are still lurking in the background

Wall-E oriented at approximately 45 degrees nose-out

Wall-E oriented at approximately 45 degrees nose-in

Wall-E in the ‘parallel’ configuration

LIDAR distance measurements to a nearby long wall

The text log was arranged as shown in the following screenshot.

A small portion of the data log for the wall orientation test. Note the line at the top describing the orientation sequence

Next, I wrote an Excel VBA script to parse the text log file and extract just the half of each revolution where the LIDAR was scanning the nearby wall, skipping over the other half. For each such scan, I searched the distance data for the minimum value, capturing that value and its associated interrupt number (i.e. angle). All the extracted distance values and the min dist/min angle numbers were written to the spreadsheet, and then I plotted the minimum-distance interrupt number (i.e. angle) vs rev number for about 90 LIDAR revolutions.
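The per-revolution extraction the VBA script performs can be sketched in a few lines (the array layout, the half-revolution bounds, and the names are just for illustration):

```cpp
#include <cstddef>

// Given one revolution's distance readings (one per tach interrupt),
// scan only the wall-facing half and return the interrupt number
// (1-based, matching the tach wheel numbering) of the minimum distance.
std::size_t minDistInterrupt(const int* dist, std::size_t count,
                             std::size_t halfStart, std::size_t halfEnd) {
    std::size_t best = halfStart;  // index of the smallest distance so far
    for (std::size_t i = halfStart; i < halfEnd && i < count; ++i) {
        if (dist[i] < dist[best]) best = i;
    }
    return best + 1;  // interrupts are numbered from 1
}
```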

If my theory holds water, then I should be able to see variations in the minimum interrupt number over time that correspond to the orientation changes. As shown in the following plot, that is exactly what happens.

Excel plot of the interrupt number corresponding to minimum distance vs rev number

As can be seen in the above plot, the Min-Dist-Interrupt (MDI) starts out between 4 and 5, and stays there for the first 9-10 revolutions. At about rev 11, it jumps to the 7-8 range, where it stays until about rev 20. Then it drops back to 5-6 for 10 revs, and then drops again to the 2-3 range. This pattern then repeats for the duration of the plot. In my current spinning LIDAR configuration, interrupt 1 starts at Wall-E’s tail, and proceeds along Wall-E’s left side to interrupt 10 at the nose. So, an MDI of 4-5 should correspond to the parallel orientation, while a lower number should correspond to nose-out and a higher one to nose-in. From the test condition description, Wall-E was placed parallel for 10 revs, then nose-in for 10, then nose-out for 10, then back to parallel, repeat. It is clear from the plot that the actual behavior of the MDI matches the predicted behavior quite nicely – EUREKA!! ;-).
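That MDI-to-orientation mapping can be captured in a few lines (the thresholds are my reading of the plot above, not calibrated constants):

```cpp
// Coarse orientation classification from the min-distance interrupt.
// Interrupt 1 is at the tail and 10 at the nose, so an MDI of 4-5 means
// parallel, lower means nose-out, higher means nose-in. Threshold
// values are illustrative, read off the test plot.
const char* orientationFromMdi(int mdi) {
    if (mdi <= 3) return "nose-out";  // min distance toward the tail
    if (mdi >= 6) return "nose-in";   // min distance toward the nose
    return "parallel";                // MDI 4-5
}
```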

Although the above test results are quite encouraging, it is still not entirely clear that this technique can actually be used for effective navigation at practical travel speeds. There is undoubtedly some correlation between the spinning LIDAR rotation rate and the maximum travel speed at which the LIDAR can provide information fast enough for effective steering. For instance, the NEATO XV-11 spinning LIDAR system rotates at about 300 RPM (5 RPS), and it seems to travel no faster than about 1-2 m/sec. This might mean that my 120 RPM (2 RPS) spin rate would only support travel speeds in the 0.5-1 m/sec range. In addition, my current 18-degree resolution may be too coarse for effective steering. Again using the XV-11 as a baseline, it has a resolution of 1 degree, 18 times finer than mine. With the much faster speed of the new V2 ‘Blue Label’ Pulsed Light LIDAR, I could probably double or even triple my current angular resolution, but 18X might be a bit much ;-).

Next up – analyzing the data from a simulated navigation test, where I manually pushed Wall-E along the hallway, simulating as close as possible how I think Wall-E might navigate, assuming the spinning LIDAR data is fast enough and accurate enough.  As I moved Wall-E along, I recorded the same distance and interrupt number data as before, so it will be interesting to see if this data continues to support my current ‘theory of navigation’ – stay tuned!

22 September 2015 Update: The data from the simulated navigation test was a mess – nothing recognizable as a pattern. The one thing it did do was convince me that the ‘angle of least distance’ idea wasn’t going to work, and that something else was going to have to be done. What, I don’t know yet…

Frank

 

Wall-E gets an improved Tachometer

Posted 16 September, 2015

Back in June of this year I posted about my initial efforts to implement a tachometer subsystem as part of a spinning-LIDAR system (see ‘LIDAR-Lite Gets its Own Motor’). This implementation used a ‘plugged gap’ technique for detecting the index position of the LIDAR, as shown below.

Diagram of the tach wheel for Wall-E’s spinning LIDAR system

The idea was that when the time between gap interrupts was more than 2 gap durations (assuming a relatively constant rotation speed), then the ‘plugged gap’ must be between the IR LED and the photo-diode.  This design works OK, but has a couple of nagging drawbacks:

  • It depends on a fixed open-loop timing delay; if the motor RPM varies enough, the fixed delay threshold might be too large or too small.
  • The plugged-gap technique removes two interrupt positions from the wheel, meaning that position information is missing during that 54-degree arc.
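The timing test itself is just a comparison – a sketch of the idea with hypothetical names, using the 2-gap-duration threshold described above:

```cpp
// Timing-based index detection for the original 'plugged gap' wheel:
// if the time since the last gap interrupt exceeds twice the typical
// gap-to-gap period, the plugged gap must have just passed between the
// IR LED and the photo-diode. The 2x factor is the fixed open-loop
// threshold the design depends on.
bool sawPluggedGap(unsigned long nowMs, unsigned long lastGapMs,
                   unsigned long typicalGapMs) {
    return (nowMs - lastGapMs) > 2UL * typicalGapMs;
}
```

The first drawback above shows up directly here: `typicalGapMs` is a fixed constant, so if the motor RPM drifts, the 2x comparison can misfire.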

So, after getting my V2 ‘Blue Label’ LIDAR from Pulsed Light and figuring out how to use it (see this post), I decided to see what I could do about addressing both the above problems. At first I thought I might be able to simply add a second photo-diode on the sensor side of the tach sensor assembly, coupled to the existing single IR LED via a slanted channel in the tach wheel, as shown below. The idea was that when the interrupt was fired at the edge of the index gap, the value of the ‘other’ sensor could be read – if the value was below a certain threshold (meaning more impinging IR energy), then that sensor must be lined up with the index hole. This meant that the index hole needed to be offset from the index gap, so the max energy receive position would coincide with the position at which the interrupt fired, which occurs at the gap edges, not the center.


This turned out to be a miserable failure. The IR LEDs have a very narrow illumination sector, and there wasn’t enough off-axis energy to reliably detect the index hole.

So, some five versions later, along with a complete redesign of the sensor assembly, I have what I think is a nicely working implementation.  The single LED slanted-hole design was scrapped in favor of a two-LED/sensor one, and the index sensing hole was replaced by a circumferential gap, as shown below.

Latest two-sensor/LED design. Note the circumferential gap is centered on one edge of the index gap, so the index sensor voltage will be minimum (max incident energy) when the gap-edge interrupt fires.

 

For the tach sensor assembly, the single IR LED was replaced by two independent IR LEDs, each with its own 240-ohm current-limiting resistor (I tried running them in series with a single resistor, but that didn’t work very well either). The original enclosed sensor slot was replaced by an exposed ‘trough’ with two retainer bars (the trough without the retainer bars didn’t work either). See below for the ‘new improved’ tach sensor assembly.

Tach sensor assembly showing the mounting holes for the two 3mm IR LEDs. The thin gaps visible in the background are the corresponding channels into the photo-diode trough

Tach sensor assembly showing the photo-diode trough. The backs of the photo-diodes are glued to a thin plastic carrier that is captured by the retaining bars

After running some preliminary tests, I mounted the new tach wheel and sensor assemblies and ran some O’scope tests to see how the new design worked. I was very pleased to see that it appears to be working better than I could have hoped. In the following O’scope photo, the top trace is the index sensor, and the bottom trace is the normal tach-gap sensor. Both are 2 volts/cm and exhibit full 5-volt swings.

O’scope photo with the signal of interest highlighted. The top trace is the index sensor, and the bottom one is the tach-gap sensor. The index interrupt will occur at the first rising edge of the bottom trace.

The gap interrupt of interest occurs at the first rising edge of the bottom trace.  As can be seen from the photo, reading the value for the index gap sensor (top trace) at this point will retrieve a stable ‘0’ value, perfect for index gap detection!

The following photos show the new tach wheel and sensor assembly mounted on Wall-E, with the LIDAR assembly removed for clarity.


Stay tuned for more test results from this configuration!

Posted 17 September, 2015:

After re-installing the LIDAR and making all the right connections (not a trivial task, btw), I fired the system up using my ‘DAFAP_Plus’ (that’s ‘Distance As Fast As Possible’ plus modifications for interrupt handling) sketch and took some index sensor measurements. In the screenshot below, the ‘Sensor’ values are from the index sensor. As expected, they are near the top of the 10-bit A/D range for interrupt numbers 1-19 (a reading of 700 implies about 3.5VDC). However, the sensor reading for interrupt 20 is now much better than it was before; before implementing the improved LED driver and new tach wheel layout, the max readings were about the same, but the minimum reading was occasionally over 400 (i.e. about 2VDC) – making it harder to reliably discriminate between the non-index and index gap cases. Now the index gap reading is a reliable ‘0’, providing a 3.5VDC differential – more than double the 1.5VDC differential before – yay!!

Another item of note in the readout below is the ISR Ms value. This is the time required to service the associated interrupt, including the time required for the Pulsed Light ‘Blue Label’ LIDAR to take a distance measurement. Note that all of these times are in the single-digit range – meaning I could probably double the number of tach wheel gaps (i.e. double the system angular resolution) if I wanted to. Note that there is an extra value shown for each ‘interrupt 20’ line; this ‘Idx Ms’ value is the total time between index gap appearances, i.e. the total rotation time. So, the LIDAR is rotating just a tad shy of 120 RPM (2 RPS), which should be fast enough for decent wall-following navigation.

Log from a 17 Sept 2015 test run; note the '0' sensor value at interrupt 20

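The arithmetic behind the ‘Idx Ms’ and angular-resolution remarks above can be sketched as follows (the helper names and the example interval are mine, for illustration only):

```cpp
// The index gap appears once per revolution, so the index-to-index interval
// ('Idx Ms') gives the rotation rate directly.
double rotationRPM(unsigned long idxMs) {
    return 60000.0 / idxMs;  // ms per minute / ms per revolution
}

// 20 gaps per revolution means one interrupt every 18 degrees; doubling the
// gap count would halve this figure.
double degreesPerInterrupt(int gapsPerRev) {
    return 360.0 / gapsPerRev;
}
```

For example, an ‘Idx Ms’ of 500 works out to exactly 120 RPM, so values a bit over 500 msec correspond to "a tad shy of 120 RPM."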

Frank

More work on the NEATO XV-11 LIDAR Module

Posted 08/24/2015

Progress on the Pulsed Light ‘Blue Label’ spinning LIDAR system has been put on hold for a few days pending the arrival of some 3mm IR LEDs  needed for an upgraded tach sensor wheel, so I am passing the time working on the alternative system, the XV-11 LIDAR module from the NEATO vacuum.

Back in May of this year I posted some results where I mated the XV-11 with the Teensy 2.0-based Get Sureal controller (see the post here) and got very good results with a LIDAR ‘image’ of the walls of a cardboard box.  After this effort, I put the XV-11 aside for two reasons: first, I received a ‘Silver Label’ (V1) LIDAR-Lite unit from Pulsed Light and was having too much fun designing and implementing a spinning-LIDAR system, and second, I couldn’t figure out how to get the XV-11 data into my robot controller (an Arduino Uno).  The Teensy 2.0 XV-11 controller parses the data stream from the XV-11, routes it upstream to the USB host, processes commands from the USB host, and maintains the XV-11 rotation speed using a PID controller.  This is all well and good, but the Uno isn’t a USB host, and adding that capability would be a major PITA.  So, I put the whole thing on the back burner, hoping that inspiration would strike at some later date.

Now, almost three months later, I had some ideas I wanted to try in pursuit of the goal of getting the XV-11 streaming data into the Uno robot controller so that it could be used for navigation.

The Problem(s):

The XV-11 LIDAR module streams angle, position, data-quality, and RPM information over a serial connection to the Teensy 2.0 controller, and there is a LOT of it.  The XV-11 rotates at a nominal rate of 300 RPM, i.e. 5 RPS.  During every 200 msec rotation, it sends 360 (at least) data groups, where each data group contains pointing angle (0-360), distance (in cm, I think), a data quality metric, and RPM.  Each data group must be parsed to extract the needed information.  The Teensy firmware also provides a PWM waveform to control the XV-11’s rotation speed, based on the RPM values being reported over the serial port.
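For concreteness, here is a hedged sketch of what parsing one of those data groups looks like. The post doesn’t give the byte layout; the details below (22-byte packets with an 0xFA start byte, an index byte covering 4 degrees, a speed word equal to RPM × 64, and distances reported in mm rather than cm) come from the community reverse-engineering of the XV-11, and the packet checksum is not verified here:

```cpp
#include <cstdint>

// Hedged sketch of decoding one XV-11 serial packet, per the community-
// documented format (assumed, not from this post): 22 bytes = 0xFA start byte,
// index byte 0xA0-0xF9, 16-bit speed word (RPM * 64), four 4-byte distance
// readings, and a 16-bit checksum (ignored in this sketch).
struct XV11Reading {
    int  angleDeg;    // 0-359
    int  distanceMm;  // 14-bit distance value
    bool invalid;     // 'invalid data' flag bit
};

// Decodes the four readings in a packet; returns false on a malformed header.
// 'pkt' must point at a complete 22-byte packet.
bool decodePacket(const uint8_t* pkt, XV11Reading out[4], double* rpm) {
    if (pkt[0] != 0xFA || pkt[1] < 0xA0 || pkt[1] > 0xF9) return false;
    int baseAngle = (pkt[1] - 0xA0) * 4;     // each packet covers 4 degrees
    *rpm = (pkt[2] | (pkt[3] << 8)) / 64.0;  // speed field is RPM * 64
    for (int i = 0; i < 4; ++i) {
        const uint8_t* d = pkt + 4 + 4 * i;
        out[i].angleDeg   = baseAngle + i;
        out[i].distanceMm = d[0] | ((d[1] & 0x3F) << 8);
        out[i].invalid    = (d[1] & 0x80) != 0;
    }
    return true;
}
```

At 4 readings per packet, 90 such packets have to be decoded every revolution, five times a second.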

The Teensy 2.0 boasts two hardware serial ports, but one is used for the connection to the upstream USB host,  and the other one is used to receive the data from the XV-11.  So, no easy way to get the needed XV-11 data from the Teensy to the Uno – bummer :-(.  And, even if the Teensy had a third hardware serial port, the Uno only has one – and it is used to connect to its upstream USB host – double bummer :-(.

And, even if I could figure out a way to get the data over to the Uno, how was I going to keep the data stream from swamping the Uno’s (or the Teensy’s) very finite resources?  With the spinning LIDAR system, I only capture 18 data groups/revolution, and even this amount threatens to blow out the top of working memory.  There is no way it can handle the 360 data groups from the XV-11.  Moreover, those data groups are arriving at about twice the rate of the groups from the Pulsed Light spinning LIDAR system.

The (partial) Solution – Software Serial to the Rescue!:

In between times when I was actively working on the Pulsed Light spinning LIDAR project, I kept returning to the problem of how to get XV-11 data into the Uno, and in my wanderings through the online Arduino world I ran across references to ‘Software Serial’, where virtual serial ports can be implemented using two of the Arduino GPIO pins.  This sounded intriguing, but all the available libraries come with baggage of one sort or another; one can’t send and receive simultaneously, another can do that, but is sensitive to other interrupts…  Then, just the other day, I ran across ‘SimpleSoftSerial’, a version written by ‘Robin2’ just for the Uno (see the post here).  The whole thing is just a ‘.ino’ file, not a library at all, and it does exactly what I want – YAY!  Having the ability to add another serial port to the Uno solves part of the problem, so I decided to see if I could get just this part going, and maybe figure out the data management issue at some future time.

Robin2 also kindly provided a pair of Arduino sketches to demo the ‘Simple SoftSerial’ capability. One part runs on a Uno (of course, as that is the target processor for this little hack) and the other ‘Partner’ program runs on (in his case) a Mega as it requires multiple hardware  serial ports.  I didn’t have a Mega handy, but I  did have the Teensy 2.0 that came with the Get Sureal XV-11 controller, and it has two hardware serial ports.  So, I disconnected the XV-11 LIDAR module from the Teensy, and temporarily re-purposed it for this test.

I loaded ‘DemoPartner.ino’ onto the Teensy, and ‘DemoSimpleSoftSerial.ino’ onto a spare Uno, connected the two using a pair of jumpers, and voila – it ‘worked’ but only in one direction.  I was able to see characters transmitted on the Uno  showing up on the serial monitor port on the Teensy, but not the other way around.  I had previously used my trusty O’scope to probe the pins of the connector from the Teensy to the XV-11 module, and  thought that I had determined which pin was Tx and which was Rx, but clearly something wasn’t adding up.  At first I thought I just had a loose connection with my jumper leads, but after some more fiddling, I became convinced that wasn’t the problem.  With my ‘scope,  I probed the actual Teensy board Tx pin (D3), and discovered that the Teensy  serial data was  there, but it wasn’t making it to the XV-11 connector pin!  Initally this seemed unlikely, as The Teensy/XV-11 combination was working fine – until  the realization hit me that the XV-11 is a transmit-only device – there is no serial traffic going from Teensy to XV-11, and therefore there is no need to have the Teensy’s Tx pin connected to the XV-11 connector!  After confirming this theory using a continuity checker, I bypassed the XV-11 connector entirely by soldering a 2-pin header onto the (fortunately side-by-side) Tx & Rx (D3 & D2 respectively) pins of the Teensy module.  Once I made this change, I started seeing bi-directional data transmissions as advertised.

The following photo shows the experimental setup, with the temporarily disconnected XV-11 module in the background.  I have also included a screenshot of the serial port monitors for both the Teensy module (running the ‘Partner’ sketch) and the Uno (running the ‘Demo’ sketch), on two separate instances of Visual Studio 2013/Visual Micro.

Experimental setup. Uno in foreground, Teensy and (temporarily disconnected) XV-11 LIDAR module in background

Screenshot showing serial port monitors from Uno (bottom) and Teensy (top).

Now that I have demonstrated the basic Uno virtual serial port capability, I plan to try and use this capability to get XV-11 serial data into my Uno motor controller by piggy-backing on the Teensy 2.0’s serial connection to the XV-11.

My plan is to return  the Teensy  module back to its original configuration, connected to the XV-11 via its second hardware serial port and running the Get Sureal processing sketch.  Then I’ll put that same sketch  on the Uno, but modify it to use the virtual serial port set up via the ‘Simple SoftSerial’ capability.  If I do it correctly, I should be able to see the XV-11 data on both the Teensy 2.0 and Uno USB host serial monitors.

Stay tuned!

Frank

8/25/2015 Late addendum.  Tried that trick and it didn’t work :-(.  Turns out the virtual serial port isn’t anywhere near fast enough.  Advertised speed is 9600 bps, with some speculation that it will work at 14200.  Unfortunately, the XV-11 runs at 115200.  So, I’ll either have to abandon the virtual port idea (and the Uno processor!) or figure out a way of slowing the XV-11 output down, or something else entirely.  Bummer

Under-Counter Heat Gun Holster

Posted 08/18/2015

I have an old Black & Decker paint stripping gun that I have found to be perfect for shrinking heat-shrink tubing on my electronics projects.  It’s old and beat up, but it does that one thing very well.  However, it is big and bulky, and its metal tip stays HOT for a long time after use (note the burned rubber/plastic on the tip), so finding a place to put the bloody thing after use has always been a bit of a PITA.

Oldie-but-goody - my trusty B & D paint stripper works perfectly for heat-shrink tubing


I was recently re-arranging my office/lab work areas for better efficiency, and once again ran into the problem of “where do I put this bloody heat gun!”, when I had an epiphany: I have not one, but TWO 3D printers, so I should be able to design and print some sort of under-counter holster for this thing!

When our Hobbit-house (earth sheltered house) was built way back when, I had my office/lab outfitted with a built-in wrap-around work surface with NO LEGS!  The work surface is supported with 5/16″ steel L-brackets built into the wall structure, so I can roll my work chair from one end to the other with nary a chance of banging my knees – yay!

Anyway, that meant that I had these steel beams in several places along the span of my work surface, and it turned out that one of them was in just the right place to be a convenient under-counter mounting location for my new holster brainstorm.  And as an added bonus, when the heat gun is in the holster, the HOT metal tip would rest against the steel support, not against anything remotely flammable – double yay!

So, now all I had to do was design and print something that would capture the forward body of the gun but still allow for easy insertion & removal.  After a few minutes with my trusty digital calipers and another few minutes on TinkerCad, I had a preliminary design that I thought might work.  I decided not to add a mounting structure in the initial design, as I just wanted to see if I had the dimensions right, and then adjust from there.  With TinkerCad and a 3D printer, you don’t have to be right the first time; if it doesn’t work or isn’t complete, it’s just a few minutes’ work to adjust the design, and another few minutes (or, in the case of the holster, another few hours) to print another iteration; the modern version of ‘cut and try’ is ‘print and try’ ;-).

Version 1 of the heat gun under-counter holster.  I didn't include any mounting provisions on this first version - and it turned out I didn't need any


The holster was printed in white ABS on my MicroCenter PowerSpec Pro 3D printer.  I used white ABS because that was what was on the printer at the moment   (and I didn’t want to use the brilliant neon magenta ABS stuff I had on the other extruder head).  I assumed I was going to go through several versions (and I still may) so I didn’t really care what I used for the first version.  In any case, the print took about 3 hours (most of which was spent while I was away playing bridge – hee hee).  When I tried the result on my heat gun, it actually fit pretty well.  The holster was just a teeny bit undersized, but the walls were thin enough (and ABS is flexible enough) so that the gun fit easily but tightly; I hadn’t really planned it this way, but I’d rather be lucky than good!

The whole thing worked so well in the initial tests that I decided to try mounting it under the counter on the steel L-bracket as planned.  Since the design didn’t include any mounting surface, I drilled some 6-32 holes in the front and rear upper inside corners; the rounded corners of the gun leave a bit of space there, enough so a protruding screw head won’t cause a problem.  The biggest problem was drilling the mounting holes in the steel bracket – drilling through a 5/16″ thick piece of steel is not a trivial job!

After getting the first hole in, I noticed that I didn’t really need the second one – the heat gun stays pretty level, and can’t go anywhere, even with just one mounting screw.  The following photos show the installation.  The holster is mounted far enough under the counter that I can roll my chair past this spot without my knees hitting the gun, but still close enough so I can easily remove/insert the gun in the holster.

[Photos: HeatGunHolster1, HeatGunHolster2, HeatGunHolster3]