Monthly Archives: May 2015

LIDAR-Lite Rotates!

Posted 5/29/15

In previous posts I described a couple of LIDAR alternatives to my original ultrasonic ping sensor system for Wall-E’s navigation capabilities, and the challenges I faced in implementing them.  The LIDAR-Lite module is easily interfaced to an Arduino controller via the I2C interface, and there is plenty of example code for doing this.  However, in order to use it as the primary navigation sensor, it needs to spin at a controlled 1-5 RPS (60-300 RPM), and there has to be a way to determine the rotational angle associated with each distance measurement.  In a previous post (http://gfpbridge.com/2015/05/dc-motor-speed-control-using-optical-tachometer/) I described my experiments with one of Wall-E’s wheel motors to implement speed control using an IR LED/photodiode/toothed-wheel tachometer.
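
To make the angle-recovery idea concrete, here is a minimal sketch (plain C++, not Wall-E’s actual Arduino code) of how the rotational angle for each distance reading could be estimated from tachometer transition counts, interpolating within the current transition interval.  The function name, parameters, and interpolation scheme are illustrative assumptions:

```cpp
#include <cmath>

// Illustrative sketch (not the actual Wall-E code): estimate the turret
// angle for a LIDAR reading from the number of tachometer transitions
// counted since a known reference position, interpolating within the
// current transition interval for sub-slot resolution.
double turretAngleDeg(int transitionsSinceRef, int transitionsPerRev,
                      double usSinceLastTransition, double usPerTransition) {
    double degPerTransition = 360.0 / transitionsPerRev;
    double frac = (usPerTransition > 0.0)
                      ? usSinceLastTransition / usPerTransition
                      : 0.0;
    if (frac > 1.0) frac = 1.0;  // clamp if the wheel slowed unexpectedly
    return std::fmod((transitionsSinceRef + frac) * degPerTransition, 360.0);
}
```

At 1-5 RPS the interpolation matters: with, say, 20 transitions per revolution, counting alone only resolves 18-degree steps.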

After successfully implementing speed control on Wall-E using the right wheel motor, I next turned my attention to implementing a drive train to connect the wheel motor to the LIDAR-Lite unit.  I couldn’t just connect the LIDAR module to the motor shaft, as the LIDAR wiring would simply wrap itself to death as soon as the motor started turning.  I had previously acquired the slip ring module (described in http://gfpbridge.com/2015/04/robot-of-the-future-lidar-and-4wd/) shown below.

Adafruit Slip Ring with 6 contacts

So I needed a way to connect the rotating part of the slip ring to the LIDAR module, and the non-rotating part to the robot chassis and (via a drive belt) to the motor shaft.

In TinkerCad, I designed a grooved pulley with a rectangular cross-section axial hole to fit over the motor shaft, an adapter from the rectangular LIDAR mounting plate to the cylindrical rotating side of the slip ring, and a chassis mounting bracket that allows the non-rotating side of the slip ring to be adjusted toward and away from the motor shaft pulley to properly tension the drive belt.  The drive belt is a standard rubber O-ring from McMaster-Carr.

Now that I have motor speed control working and the LIDAR spinning, I need to connect the LIDAR electrically to the Arduino Uno controller and see if I can actually collect angle-specific LIDAR distance data (distance from the LIDAR, angle from the wheel speed tachometer control software).  Stay Tuned!

Frank

Spinning LIDAR drive assembly and LIDAR unit. Note O-ring drive belt.

DFRobots ‘Pirate’ 4WD Robot Chassis

Posted 5/28/15

A while back I posted that I had purchased a new 4WD robot platform from DFRobot (http://www.dfrobot.com/), and it came in while I was away at a bridge tournament. So, yesterday I decided to put it together and see how it compared to my existing ‘Wall-E’ wall-following platform.

The chassis came in a nice cardboard box with everything arranged neatly, and LOTS of assembly hardware.  Fortunately, it also came with a decent instruction manual, although truthfully it wasn’t entirely necessary – there aren’t that many ways all the parts could be assembled ;-).  I had also purchased the companion ‘Romeo’ motor controller/system controller from DFRobot, and I’m glad I did.  Not only does the Romeo combine the features of an Arduino Leonardo with a motor controller capable of 4-wheel motor control, but the Pirate chassis came with pre-drilled holes for the Romeo and a set of 4 mounting stand-offs – Nice!

So, at this point I have the chassis assembled, but I haven’t quite figured out my next steps.  In order to use either the XV-11 or PulsedLight LIDAR units, I need to do some additional groundwork.  For the XV-11, I have to figure out how to communicate between the Teensy 2.0 processor and whatever upstream processor I’m using (Arduino Uno on Wall-E, or Arduino Leonardo/Romeo on the Pirate).  For the LIDAR-Lite unit, I have to complete the development of a speed-controlled motor drive for rotating the LIDAR.  Stay tuned!

Frank

Parts, parts, and more parts!

Motors installed in side plates

Side plates and front/back rails assembled

Bottom plate added

Getting ready to add the second deck

Assembled ‘Pirate’ chassis

Side-by-side comparison of Wall-E platform with Pirate 4WD chassis

Over-and-under comparison of Wall-E platform with Pirate 4WD chassis

Optional ‘Romeo’ motor controller board. Holes for this were pre-drilled in the Pirate chassis, and mounting stand-offs were provided – Nice!

Fun with the NEATO XV-11 LIDAR module

Posted 05/23/15

In my last post (DC motor speed control using optical tachometer) I described my effort to implement a speed controller for one of Wall-E’s wheel motors so I could use it as the drive for a spinning LIDAR system using the LIDAR-Lite unit from PulsedLight (http://pulsedlight3d.com/products/lidar-lite).  Although I got the speed controller working, delivery of the 4WD robot chassis I had ordered to carry the LIDAR-Lite system was delayed, so I couldn’t fully implement the system.  In the meantime, I decided to play with the NEATO LIDAR unit I had acquired from eBay.

The NEATO XV-11 unit is a very cool self-contained spinning LIDAR unit intended for use in the NEATO robot vacuum cleaner.  I find it interesting and amusing that such a technically elegant and useful module was developed for what is essentially a luxury toy, and that it is available to hobbyists for a reasonable price!  The bad news is that the XV-11 emits a binary data stream that requires a fair bit of processing to make useful.  Fortunately for us mere mortals, Get Surreal (http://www.getsurreal.com/) produces a Teensy-based XV-11 controller (http://www.getsurreal.com/product/xv-lidar-controller-v1-2) that does most of the work; it has connectors to mate with the XV-11 on one end, and a USB connector on the other for data retrieval/control.  All that is required is an upstream USB device with a serial monitor of some sort.  In my case, I used my laptop and the RealTerm serial monitor (http://sourceforge.net/projects/realterm/) to do the job.

As an experiment, I placed the XV-11 in a 14 x 14 inch cardboard shipping box, and connected it up.  After capturing some processed data using my laptop and RealTerm, I sucked the processed data into Excel 2013 for analysis.  The data transmitted by the Get Surreal Teensy controller is formatted as colon and space delimited (angle, distance, SNR) triplets.  Angle is in integer degrees, distance is in mm, and SNR is an integer in parentheses, with 0 representing missing data and large numbers indicating high SNR.  At the end of each 360-degree set of data, the XV-11 reports the elapsed time (in integer milliseconds) for the previous data set.  The screenshot below shows the last few lines of a complete 360 degree dataset, along with the elapsed time value.

Last few items in a complete dataset, along with the elapsed time report
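
For anyone who wants to script this instead of (or in addition to) using Excel, a minimal parser for one triplet line might look like the sketch below.  The exact layout assumed here (e.g. `359: 1234 (56)`) is my reading of the format description above, not a verified dump of the controller output:

```cpp
#include <cstdio>
#include <string>

// One LIDAR reading as reported by the Get Surreal Teensy controller.
struct LidarPoint {
    int angleDeg;    // integer degrees, 0-359
    int distanceMm;  // range in millimeters
    int snr;         // 0 means missing data; larger is better
};

// Parse one "angle: distance (snr)" line; returns false if the line
// doesn't match the assumed layout (e.g. the elapsed-time report line).
bool parseLidarLine(const std::string& line, LidarPoint* out) {
    return std::sscanf(line.c_str(), "%d: %d (%d)",
                       &out->angleDeg, &out->distanceMm, &out->snr) == 3;
}
```

The `false` return doubles as a cheap way to skip the end-of-revolution elapsed-time line.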

Excel – at least the 2013 version – is a very cool data analysis program.  Not quite as cool as MATLAB (which I used a lot as a research scientist at Ohio State), but still pretty good, and nowhere near as expensive (it was free for me at the university, but it is very expensive for civilians).  Excel’s graphing routines are amazingly powerful and easy to use – much better IMHO than MATLAB’s.  In any case, I used Excel to graph some of the XV-11 data I had just captured.  I started with Excel’s stock polar graph (it’s called a Radar plot in Excel), and got the following plot with absolutely no effort on my part (other than selecting the Radar plot type).

Stock Excel Radar Plot for 360 degree data set.

This appears to be an excellent representation of the actual box, with a few data points missing (the missing points had SNR values of zero, so I could easily have gotten Excel to disregard them).  Although I had physically placed the XV-11 in the box in such a way as to be parallel with the sides of the box, the data shows a tilt.  This is due to the way the XV-11 reports data – 0/360 degrees is not the physical front of the device; it is offset internally by about 11 degrees (no idea why).

As my original Wall-E robot was conceived as a wall-following device, I was interested in using the LIDAR data to do the same thing – follow the walls.  So, I converted the polar (angle/radius) data into X/Y coordinates to see if I could condense the data down to something I could use for wall guidance.  The next plot is the same dataset as in the above plot, but converted to X/Y coordinates.
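
The conversion itself is just basic trigonometry.  Here is a small sketch, with the roughly 11-degree internal offset mentioned above folded in as a correction; the exact offset value and its sign are assumptions to be calibrated against real data:

```cpp
#include <cmath>

constexpr double kPi = 3.14159265358979323846;
constexpr double kXv11OffsetDeg = 11.0;  // assumed internal offset; calibrate!

struct XY { double x; double y; };

// Convert one (angle in degrees, distance in mm) reading to Cartesian
// coordinates, removing the assumed internal angle offset first.
XY polarToXY(double angleDeg, double distanceMm) {
    double theta = (angleDeg - kXv11OffsetDeg) * kPi / 180.0;
    return { distanceMm * std::cos(theta), distanceMm * std::sin(theta) };
}
```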

XV-11 dataset converted to X/Y coordinate system

This, although perfectly understandable given the box environment, wasn’t really helpful as a possible wall-following algorithm, so I decided to look at the line slope instead of just the raw X/Y coordinates.  This gave me the next Excel plot, shown below.

Calculated line slope m = dY/dX for XV-11 dataset

This plot was interesting in that it definitely showed that there were only two slopes, and they were the negative reciprocals of each other, as would be expected from a box with two sets of parallel sides perpendicular to each other.   Also, having only two values to deal with vastly simplifies the task of making left/right steering decisions, so I thought maybe I was on to something for LIDAR Wall-E.
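
The slope calculation behind that plot is simply the rise over run between consecutive X/Y points.  A sketch (the near-vertical-segment guard is my addition; for the box data, the resulting values should cluster around two slopes m1 and m2 with m1 * m2 ≈ -1):

```cpp
#include <cmath>
#include <utility>
#include <vector>

// Compute the slope m = dY/dX between each pair of consecutive points.
// Segments with negligible dX (near-vertical walls) are skipped to avoid
// dividing by zero.
std::vector<double> segmentSlopes(
        const std::vector<std::pair<double, double>>& pts) {
    std::vector<double> slopes;
    for (size_t i = 1; i < pts.size(); ++i) {
        double dx = pts[i].first - pts[i - 1].first;
        double dy = pts[i].second - pts[i - 1].second;
        if (std::fabs(dx) > 1e-9)
            slopes.push_back(dy / dx);
    }
    return slopes;
}
```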

As I drifted off to sleep that night, I was reviewing my results so far when it occurred to me that I was thinking of the problem the wrong way – or maybe I was trying to solve the wrong problem.  I was trying to use LIDAR data to follow walls a la Wall-E, but what I really wanted to do was have the robot navigate typical indoor layouts without getting stuck anywhere.  I had chosen a wall-following algorithm because that’s what I could do with ultrasonic ping sensors.  Another possible way to solve the problem is to have the robot move in the direction that offers the least restriction, i.e. in the ‘most open’ direction.  This would be very difficult to accomplish with ping sensors due to their limited range and inherent multipath and fratricide problems.  However, with a LIDAR that can scan the entire 360 degrees in 200 msec, this becomes not only possible, but easy/trivial.

So, the new plan is to mount the XV-11 LIDAR on Wall-E, and implement the ‘most open in the forward direction’ algorithm.  This should result in something very like wall-following in a long hallway, where the most open forward direction would be along the length of the hall.  When the robot gets near the end of the hall, either the left or right perpendicular distance will become the ‘most open’ direction, which should cause Wall-E to make a hard right or left turn, followed by another same-direction turn when it gets close to the other wall.  After the two right-angle turns, Wall-E should be heading back down the hall in the opposite direction.
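
To show how simple the ‘most open’ decision could be, here is an illustrative sketch operating on one 360-entry scan of distances indexed by degree.  The forward-window width and the treatment of zero (missing) returns are my assumptions, not a settled design:

```cpp
#include <array>

// Given one full scan of distances in mm (index = bearing in degrees,
// 0 = straight ahead), return the bearing with the greatest distance
// inside a forward-facing window of +/- halfWindowDeg degrees.
int mostOpenBearing(const std::array<int, 360>& distMm,
                    int halfWindowDeg = 60) {
    int bestDeg = 0, bestDist = -1;
    for (int offset = -halfWindowDeg; offset <= halfWindowDeg; ++offset) {
        int deg = (offset + 360) % 360;  // wrap negative offsets around
        int d = distMm[deg];
        if (d > 0 && d > bestDist) {     // ignore missing (zero) returns
            bestDist = d;
            bestDeg = deg;
        }
    }
    return bestDeg;  // degrees, in [0, 360)
}
```

With a 200 msec scan time, this decision could be refreshed five times a second, which is in the ballpark of the update rate discussed for wall-following below.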

Stay tuned…

Frank

DC motor speed control using optical tachometer

Posted 05/13/15

In my last post I described my plans for upgrading Wall-E (my Wall-following Robot) with a LIDAR package of some type, and my thought that I might be able to use such a package to not only replace the existing front-facing ping sensors, but (with a bit of rotating magic) the side sensors as well.

In order to replace *all* the sensors, the LIDAR package would have to rotate fast enough to produce front, left, and right-side distance readings in a timely enough fashion to actually implement wall-following.  I’m not sure exactly what the requirements for wall-following are, but I think it’s safe to say that at least the measurements to the followed wall must arrive several times per second, or Wall-E could run into the wall before it figures out it is getting too close.  The other side and the front could be sampled at a more relaxed pace if necessary, but the wall being tracked has to be measured quickly and reliably.

In order to rotate a unit such as the LIDAR-Lite from PulsedLight, I would need a speed-controlled motor of some kind.  I considered both stepper motors and direct-drive DC motors.  Since I already had two DC motors (the left and right wheel motors on Wall-E) and they came with ‘tachometer sensors’ (plastic disks with slots for optical wheel motion sensing), I thought I’d give this a try.  Earlier in my robot startup phase, I had obtained some IR LED/photodiode pairs, so I had at least the basic building blocks for a tachometer system.  I was already speed-controlling Wall-E’s wheel motors for steering using PWM from the Arduino Uno, so that part was already in place.  ‘All’ I had to do was couple the input from a tachometer into the already-existing PWM speed control facility, and I would have a closed-loop speed-controlled rotating base for my LIDAR system – cool!
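
The closed-loop idea can be sketched in a few lines: nudge the PWM duty cycle by an amount proportional to the RPM error.  This is an illustrative proportional controller, not the code actually running on Wall-E; the gain and limits are assumptions:

```cpp
#include <algorithm>

// Illustrative proportional speed-control step: adjust the current PWM
// duty cycle (0-255, the Arduino analogWrite() range) toward the target
// RPM using the tachometer-measured RPM. Gain kp is an assumed value.
int updatePwm(int currentPwm, double targetRpm, double measuredRpm,
              double kp = 0.5) {
    double error = targetRpm - measuredRpm;
    int pwm = currentPwm + static_cast<int>(kp * error);
    return std::clamp(pwm, 0, 255);  // keep within the valid duty-cycle range
}
```

A pure proportional loop like this will hunt a bit around the setpoint; adding integral action is the usual next step if the steady-state error matters.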

OK, so now I have all the parts for a speed-controlled motor system – I just have to assemble them.  First up was a way of mounting the IR LED and IR detector in such a way that the slots in the tachometer wheel would alternately make and break the light path between them.  In the past when I had to do something like this, I would carve or glue something out of wood, or bend up some small pieces of aluminum.  However, now I have a very nice 3D printer and the TinkerCad design package, so I could afford to do this a different way.  The confluence of hobby robotics and 3D printing allows so much more design/development freedom that it almost takes my breath away.  Instead of dicking around for a while and winding up with something half-assed that is used anyway because it is way too much trouble to make another – better – one, a 3D printer based ‘rapid iteration’ approach allows a design to be evolved very quickly, with each iteration so cheap as to be literally throw-away.  To illustrate the approach, the image below shows the evolution of my IR LED/IR detector bracket, along with the original 20-slot tachometer wheel that came with the motors and a 10-slot version I printed up as a replacement (the tach signal-to-noise ratio was too low with the 20-slot original).

Evolution of an IR tach sensor bracket, along with the original and a custom-printed tach wheel

The evolution proceeded from left to right in the image.  I started with just a rectangular piece with a horizontal hole to accommodate the IR LED, and a threaded hole in the bottom to affix it to the robot chassis.  Then the design evolved a ‘foot’ to take advantage of a convenient slot in the robot chassis, for physical stability/registration purposes.  Then I added a second side with a slot in it to accommodate the IR detector, with the tach wheel passing between the two sides.  This basic two-sided design persisted throughout the rest of the evolution, with additional material added on the IR LED side to accommodate the entire length of the IR LED.  Not shown in the photo are some internal evolutionary changes, most notably the width of the slot that allows IR energy from the LED to fall on the detector – it turns out that the detector opening should be about 1/2 the width of a tooth slot for best signal.  Each step in the above evolution cost me about 30 minutes of design time in TinkerCad, and a few pennies worth of filament.  Moreover, once I have the final design, printing more is essentially free.  Is that cool, or what?

Wall-E’s right motor being used as my tachometer test bed.  Note the piece of scotch tape on the wheel, used for manually timing RPM.

Tachometer sensor bracket, showing IR LED and tach wheel

Tachometer sensor bracket, showing slot for the IR detector

Since I was already controlling the speed of Wall-E’s motors with an Arduino Uno (albeit for steering), I simply modified the wall-following program to act as a test driver for the tach feedback system.  The output of the IR detector was connected to an analog input, and the analog readings were captured and imported into an Excel spreadsheet for analysis.

The first test showed that I wasn’t getting enough signal swing between the slot and non-slot (plug) states of the tach wheel (less than 100 out of a possible 1024 levels), and this led me to start experimenting with different IR detector apertures.  As shown in the second plot below, constricting the aperture provided a marked improvement in SNR (about 3 times the peak-peak variation).

First test of the tach sensor system.  Note the not-impressive variation between wheel slot and plug readings

Paper barrier with a small slot placed in front of detector aperture

The above results led directly to the final round of evolutionary changes to the tach sensor bracket, where the detector aperture was changed from a large circle (the same diameter as the IR LED) to a small slit.  To further improve the SNR, the tach wheel itself was redesigned from 20 slots to 10, with the slots and plugs of equal area.  In addition, one slot was removed to create an absolute wheel position ‘index mark’.  After these changes, the tach sensor test was redone, resulting in the following plot.

IR Detector response with a narrow slit aperture and a 10-tooth wheel.

Now the signal varies from 0 to 800, allowing easy and reliable ‘off’ to ‘on’ state detection, and index mark detection.
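
Detecting the index mark in software could be as simple as watching for the double-width gap left by the removed slot: at a steady motor speed the interval between transitions is roughly uniform, so an interval well above the running average flags the index position.  A sketch, where the 1.5x threshold and the 0.1 smoothing factor are assumptions:

```cpp
// Illustrative index-mark detector: track a running average of the
// interval between tach transitions, and flag any interval that looks
// like the double-width gap where the slot was removed.
struct IndexDetector {
    double avgUs = 0.0;  // exponential running average of interval length

    // Feed each measured transition interval (microseconds); returns true
    // when the interval looks like the index gap.
    bool update(double intervalUs) {
        if (avgUs == 0.0) {        // first sample just seeds the average
            avgUs = intervalUs;
            return false;
        }
        bool isIndex = intervalUs > 1.5 * avgUs;
        if (!isIndex)              // don't let the index gap skew the average
            avgUs += 0.1 * (intervalUs - avgUs);
        return isIndex;
    }
};
```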

After incorporating the physical changes noted above, an Arduino program was developed to test whether or not the motor could be accurately speed controlled.  Rather than trying to manually threshold-detect the above waveform, I simply used Mike Schwager’s very cool EnableInterrupt library (see https://github.com/GreyGnome/EnableInterrupt) and set the tach signal analog input to trigger an interrupt on each signal change.  This resulted in two interrupts per slot position, but this was easily handled in the software.

After getting the program working, I found that I could control the motor such that, when set to 60 rpm, 20 wheel revolutions (as measured by counting the scotch tape passes on the wheel) took exactly 20 seconds.

Well, I’m not quite sure where I’m going from here.  Now I have demonstrated that I can control a typical hobbyist/robot motor for use as a LIDAR turret.  However, I’m not entirely convinced that a spinning LIDAR can produce wall distance measurements fast enough for successful wall following, and it will be a major PITA to modify Wall-E sufficiently to find out.  For one thing, I can’t really use one of Wall-E’s drive wheels as the LIDAR turret motor without giving Wall-E an unacceptable ‘limp’ (if I did that, I guess I would have to change his name from ‘Wall-E’ to ‘Quasimodo’ ;-)).  For another, mounting the LIDAR and turret on the current Wall-E chassis would be a major project by itself, as Wall-E’s real estate is already heavily populated with ‘stuff’.

So, I think I’m going to wait until my new 4WD robot chassis arrives (it was unfortunately delayed for a week or so), and then build it up from scratch as a LIDAR-only platform.  I can then use one of the motors from Wall-E as the turret motor for the PulsedLight LIDAR-Lite system.  In the meantime, I think I’ll try to mount the NEATO XV-11 spinning LIDAR on Wall-E, as it doesn’t require any additional motors (it has its own motor built in), and see if I can successfully follow a wall using only LIDAR.

Stay Tuned…

Frank