Author Archives: paynterf

Python Script for Challenging Invalid Voter Registrations

Posted 31 May 2024

The folks at TrueTheVote.org (the organization that used cellphone geotracking to expose widespread voter fraud during the 2020 election) put together a database to expose huge numbers of invalid voter registrations across the country. Most of these invalid registrations are due to the voter having moved out of their original voting district/county without being removed from the rolls by the responsible election board. While this seems pretty innocuous (and was, in earlier, less troubled times), it now represents a huge opportunity for fraud in the upcoming 2024 election.

Although the TTV folks have the data, they can’t do much about it without the help of concerned citizens who actually vote in those regions because local laws require that any voter challenge be raised by a voting citizen in that particular region.

So, TTV generated a website called ‘IV3’ which allows concerned citizens from anywhere in the U.S. to create an account and query the IV3 database for problematic voter registration records for their voting district/county. For instance, I live and vote in Franklin county, Ohio and my page on the IV3 site looks like this:

If I click on ‘View Active’, I get a page displaying the first record that matches the criteria, i.e. a voter still registered in Franklin county but who has since moved to an address outside of the County, as shown below:

If I want to challenge this voter’s registration in Franklin county, I would click on ‘Challenge this record’, which would display ‘Cancel’ and ‘Submit’ buttons as shown below:

Clicking on the ‘Submit’ button would remove the record from the ‘Active’ list and place it on the ‘Challenged’ list, which could then be exported in .CSV format for submission to the Franklin county board of elections.

It sometimes takes more than 100 seconds for the site to display a new record after each challenge submission, so this gets old pretty fast. After several days of plugging along while working on other things, I had managed to challenge about 600 records from the more than 42,000, a mere ‘drop in the bucket’. So, I started to wonder if I might be able to automate this a bit with a Python script; a web-bot of sorts.

After some research, I discovered a web-page automation API called ‘Selenium’ that could be called from a Python script, so I started learning how to use Selenium to do what I wanted. After the usual number of mistakes and appeals to StackOverflow for guidance, I got a working Python script together, as shown below:

Note that in order to use this script, you must have Python 3 and the Selenium package installed on your computer.

Even though I used ‘FranklinCountyOhioChallenges’ as the name of the main function, this script should be usable for any other location (or you can simply change the name, as long as both occurrences in the script match).
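For readers who can’t see the embedded script, here is a minimal Python/Selenium sketch of the same general approach. The button locators and URL below are placeholders (you would need to inspect the actual IV3 pages to get the real ones), and the long timeout reflects the site sometimes taking more than 100 seconds per record:

```python
def make_clickers(driver, timeout=120):
    """Wrap the two IV3 buttons as zero-argument callables.
    The XPATH locators are placeholders - inspect the real page to find them."""
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support.ui import WebDriverWait
    from selenium.webdriver.support import expected_conditions as EC
    wait = WebDriverWait(driver, timeout)  # the site can take >100 sec per record

    def click(label):
        wait.until(EC.element_to_be_clickable(
            (By.XPATH, f"//button[contains(., '{label}')]"))).click()

    return (lambda: click("Challenge this record"), lambda: click("Submit"))


def challenge_batch(click_challenge, click_submit, num_challenges):
    """Core loop: run num_challenges challenge/submit cycles."""
    done = 0
    while done < num_challenges:
        click_challenge()  # opens the Cancel/Submit confirmation
        click_submit()     # moves the record to the 'Challenged' list
        done += 1
    return done


if __name__ == "__main__":
    import sys
    from selenium import webdriver
    driver = webdriver.Chrome()
    driver.get("https://iv3.example.org")  # placeholder URL - log in manually
    input("Press Enter once logged in and viewing records...")
    ch, sub = make_clickers(driver)
    print(challenge_batch(ch, sub, int(sys.argv[1])))
```

Keeping the click-wrapping separate from the loop means only `make_clickers` needs to change when the site changes (as it did with the ‘View Moved and Registered’ update below).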

After getting the script working, I can now run the script to challenge any number of voters with a very simple command, as shown below:

On my Windows system (and I’m pretty sure this holds for *nix systems as well) all I have to do to run another batch of the same size is to press the ‘up-arrow’ key once and then hit ‘Return’. If a different batch size is desired, it’s ‘up-arrow’, edit the batch size, then ‘Return’.

I have found that doing a batch size of 100 takes about 90 minutes, so I can do several of these during the day while working on other things, and then I generally do a batch of 500 overnight. This allows me to do at least 1000 or so each day, so it will still take me around 42 days to challenge all 42K or so registered voters who have moved out of the county. Your mileage may vary, of course :).

Each time I get a thousand or so challenges done, I click on the ‘View My Challenges’ button on the main page, and then on the ‘Export’ button as shown below, to download the challenges into a .CSV file that is directly readable in Excel (or any other modern spreadsheet program). I then use Excel to print out the entire batch (using Portrait mode and scaling to ‘fit all columns on one page’). Then I fill out and sign the cover form required by the Franklin County Board of Elections, attach the printed challenge records, and physically submit the form and data to the BOE. As a courtesy I also email the .CSV file to the responsible officer there, and so far they seem to appreciate the effort.

22 June 2024 Update:

My script started failing on me a few days ago, and I couldn’t see why. After using the issue as my ‘going to sleep puzzle’, I realized I could go back to my old manual process and see if it worked. If it did, then something in my script was bad. If it failed, then something had changed on the IV3 website.

As it turned out, IV3 had added a new ‘View Moved and Registered’ button, and moved all the qualifying records (which, it turned out, was all of them) into the new database. So, when I clicked on my normal ‘View Active’ button, I got ‘No Records Found’, which of course also killed my script :(.

So, the fix was to direct my script to the new button instead, and then all was well. I have updated the above script to the new version.

Stay tuned,

Frank

Improved Pill/Caplet Dispenser

Almost three years ago I designed and fabricated some pill/caplet dispensers for the half-dozen or so prescription meds I have managed to accumulate over the last decade or so. A while ago, one of my prescriptions changed its tablet to a much smaller size, so I decided to update my design while fabricating a replacement dispenser.

Between the last project and this one I’ve been playing with OnShape, a web-based 3D CAD package, so I thought I would use it to see if I could do better than last time. I really like OnShape because it uses a 2D ‘sketch’ based design philosophy, which makes tweaks and/or modifications much easier – change a few 2D sketches, and the entire design changes along with it.

The previous design implemented a smooth collar that was a press-fit for the pill bottle cap, which turned out to be kind of clunky. This time I thought I might try implementing internal threads on the collar so that, instead of a press fit, it would simply screw on like the original cap, and I discovered that a ‘ThreadCreator’ extension exists for OnShape – neat!

So, I worked my way through the process, and came up with the following design, available to anyone with a free OnShape account

This design has internal threads for a 37mm cap with the standard 4.7mm thread pitch, so it will screw directly onto the pill bottle, ‘eliminating the middleman’. Here are some photos of the finished product:

And here is a short video showing the dispenser in action:

10 June 2024 Update:

Last night I attempted to run two more prints of this model, as I have two additional pill bottles of the same diameter with older pill dispensers, but the prints failed catastrophically – bummer! I rounded up the usual suspects (bed temp, model arrangement, Z-axis tuning, etc.), and finally managed to get another print going, at least through the raft and first few layers. After bitching and moaning about this for a while, it occurred to me that if I had documented the layout and settings more aggressively from the first print, I wouldn’t have wasted all those hours last night and today. So, once I’m sure I have a consistent print configuration, I will document it here.

I got a good print with the following settings:

  • Flashforge Creator PRO II ID
  • Left Extruder – Red PETG, 240C, 80C Bed
  • Right Extruder – AquaSys120, 240C, 80C Bed
  • 2-layer raft using support filament (AquaSys120)

See the following images for the full setup:

After a 3-hour side-trip into the guts of the Flashforge to clear an extruder jam, I was able to get the second print underway. As I write this it is about 6% finished, but all the way through all the support material parts (so it should finish OK).

17 June 2024 Update:

Not so fast! I realized that the threaded portion of the dispenser cap, while functional, was very poorly printed due to the lack of supports (and, as I found out later, also due to the resolution setting). In addition, the side walls of the V7 slide box were too thin and broke apart easily. After modifying the design, I attempted another print using the above settings, but the PVA dissolvable filament simply refused to stick to the print bed – arrrrrgggggghhhhh!

After going through the whole extruder & bed temperature search routine again yesterday, including replacing the heated bed PEI layer and even putting down blue painter’s tape, all with no success, I was perusing google-space for clues and kept running across reports where dehumidifying the PVA filament worked. I didn’t see how that would help me, as we control the relative humidity in our house to about 50% +/-, but hey, what did I have to lose at this point?

So, before going to bed I dug out my filament dehumidifier rig and left my filament in it overnight and until about noon the next day – a little over 12 hours. Then I tried some prints, and although not successful at first, the results were encouraging. I finally got two really good prints with the following setup:

  • Slicer resolution: ‘0.15mm OPTIMAL’ setting in Prusa Slicer
  • Right (PVA) extruder: 220C
  • Left (PETG) extruder: 240C (this was constant throughout)
  • Bed: 40C
  • Layer of blue painter’s tape on top of the PEI substrate

So, I think the big takeaway from this episode is: PVA must be explicitly dehumidified BEFORE each print session. Otherwise the PVA will not stick to the print bed, no matter what you do.

Stay tuned,

Frank

07 July 2024 Update:

After getting the threaded pill bottle dispenser cap working, I decided to try my luck with my two 57mm twist-lock pill bottles. The twist-lock cap geometry was considerably harder to design. Rather than trying to design and print everything as one piece, I decided to separate the dispenser piece from the cap mating piece, as shown below:

Bottle Cap Mating Ring
Pill Dispenser body and slide
All three pieces together. Note that the cap mating ring fits into the dispenser body

Fix for Inadvertent Crimson Trace Laser Activation with M&P Bodyguard 380 in ‘Sticky’ Holster

Posted 05 May 2024

My daily carry pistol is the M&P Bodyguard 380 in a ‘Sticky’ brand Holster, as shown below:

I carry this in my jeans front pocket, and it works great. I regularly practice smoothly drawing the pistol, activating the Crimson Trace laser, and getting the gun on target. Unfortunately after a year or so of use I started seeing occurrences where the laser wouldn’t activate, and investigation showed that the laser battery was dead. The first time this happened I just wrote it off to the normal battery life, but the second and third times were definitely too close together to be a battery life issue. I finally figured out that the laser was being inadvertently activated in the holster – clearly not a good solution. The good news is, it made me more determined than ever to not count on the laser. Now I practice with and without the laser (although I much prefer the ‘with’ scenario).

Thinking about the problem, I realized that the ‘Sticky’ holster is comparatively stiff when new, but becomes more pliable after hundreds of cycles of inserting and removing it from my jeans pocket, and of course hundreds of dry-fire draw and shoot repetitions. Eventually the holster gets pliable enough that the normal inward pressure from my jeans pocket is enough to activate the laser at some point (and once is enough, as once activated it will probably stay ON until battery exhaustion). As you might imagine, replacing the battery in the laser module is a major PITA, as the pistol itself must be disassembled, and then the laser module removed to access the battery. Then the procedure must be run in reverse to re-assemble everything, and now the laser alignment must be checked and adjusted as necessary (another major PITA).

Thinking about solutions, I contemplated 3D-printing a holster insert that would replace the original holster stiffness (and I might still do this). However, I was struck by the idea that the real solution to the holster material pressing in on the laser activation button is to remove the holster material around the button; then the holster material thickness becomes an additional guard around the button. Instead of being the culprit, it now becomes the solution – cool! Here’s another photo showing the ‘Sticky’ holster with an (unfortunately crude) hole around the pistol’s laser button:

‘Sticky’ holster with material over laser activation button removed

After – once again – replacing the laser battery, I plan to run with this setup for a while and see how long the laser batteries last this time.

17 July 2024 Update:

From May of this year until now I had no problem with laser battery life, but yesterday I found the batteries low/dead again. This indicated I was still getting inadvertent laser activation even with the laser button cutouts shown above. So, I decided to see if I could improve on the design a bit.

I went into OnShape, my 3D design tool of choice, and designed a hollow ‘holster bump’ as shown in the screenshot below:

The idea, as shown below, is to protect the laser activation button on each side of the gun, so it can’t be activated when in the holster.

I then used hot glue to affix them temporarily to the gun to confirm their positions coincided with the holes I had previously made in the holster. After this, I glued the bumps into the holes with superglue. We’ll see how this works.

Stay tuned,

Frank

Printing NY Times Crossword Puzzles Using Across Lite & AutoIt Script

Posted 05 May 2024

My wife and I are crossword puzzle addicts. To feed our habits, I signed up for the NY Times crossword puzzle archives and downloaded the Friday, Saturday and Sunday puzzles (the Mon-Thurs puzzles were too easy) for each week between January 2015 and December 2022.

Originally I would print out a puzzle as required by opening/printing the puzzle file using Across Lite, but this got old in a hurry. So, I decided I would create a program to automagically print an entire folder’s worth of puzzles in ‘batch mode’ utilizing two-sided printing – yay! Looking around for the best/easiest way to accomplish this, I ran across an application called ‘AutoIt’, specifically created as a shell-script generator to run Windows (or Mac) applications and system functions.

It took a while to work my way through the command reference and examples, but eventually I was able to create an AutoIt shell script to do what I wanted. The script prompts the user for a directory containing Across Lite *.puz files, and offers to print them all in equal N-puzzle batches, thereby allowing the user to print a batch and then move the printed puzzles from the output tray to the input tray for a double-sided result.
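The script itself is AutoIt, but the batching arithmetic is simple enough to sketch in Python (the folder path and the pause-between-batches step here are just illustrative; the real script drives Across Lite’s print dialog via AutoIt keystrokes):

```python
from pathlib import Path


def make_batches(puz_files, batch_size):
    """Split a collection of puzzle files into batch_size-file batches,
    in sorted order (the last batch holds any remainder)."""
    files = sorted(str(f) for f in puz_files)
    return [files[i:i + batch_size] for i in range(0, len(files), batch_size)]


if __name__ == "__main__":
    # Hypothetical driver loop: print a batch, then wait while the user
    # moves the printed pages back to the input tray for side two.
    folder = Path(r"C:\Users\Frank\Documents\Personal\Crosswords\Print Pending\2015")
    batches = make_batches(folder.glob("*.puz"), 39)
    for i, batch in enumerate(batches, 1):
        print(f"Batch {i}: {len(batch)} puzzles")
        input("Move printed pages to the input tray, then press Enter...")
```

With the 156-file 2015 folder and a batch size of 39, this yields the four equal batches mentioned below.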

This worked great, with the only downside being that the user’s PC cannot be used for anything else while the script is running, as it grabs the mouse cursor to launch Across Lite and print the current .puz file.

Here’s the script, as it stands now in May of 2024 (saved on my system as C:\Users\Frank\Documents\Personal\Crosswords\Print Pending\240504 PUZ Print Script.au3):

And here is the output log for printing the contents of the ‘C:\Users\Frank\Documents\Personal\Crosswords\Print Pending\2015’ folder containing 156 puzzle files, printed in four batches of 39 files each:

Return of the Robot – sort of

After some time away from my autonomous wall-following robot, I have started spending time with it again. The first thing that happened was I tried a long-term run in my home, only to find that the ‘mirrored-surface’ feature I added some time ago caused the robot to enter an infinite loop, even when encountering a non-mirrored surface – oops! This eventuality was such a bummer that I stopped working with the robot for several months.

When I worked up the courage to address the problem, the first thing I did was to back out the ‘mirrored-surface’ code, reverting back to the state of affairs represented by the ‘WallE3_QuickSort_V5.ino’ Arduino project. This required quite a bit more work than I had anticipated; I had thought my process of incremental builds would shield me from that – NOT! Eventually I was able to use some ‘diff’ tools to work my way through the process.

After getting the code squared away without the ‘mirrored surface’ code, I decided to take my robot out for a walk – well, actually it was the robot taking me for a walk around the house. Here’s a link to the (not-so-short) video showing the action. I have included this as a link to the video file on my Google Drive site, because it’s too long to fit on my WordPress blog page.

Hearing Aid AGC Testing

Posted 11 March 2023

I have been wearing hearing aids for quite some time, compliments of a lifetime around airplanes, long before hearing protection became a thing. I recently got a set of Jabra ‘Enhance 200’ aids via direct order, and I like them very much, EXCEPT I have noticed that my perceived hearing acuity seems to vary quite distinctly over a period of a minute or two. I first noticed this when I would turn on a water tap while washing dishes or preparing to take a shower. I would turn on the tap, and then 20-30 sec later, the perceived sound of the water coming out of the tap would increase significantly – even though the water flow rate had not changed. Later, in a social setting (bridge club), I would experience significantly lower and higher perceived speech volumes – very frustrating! I hypothesized that the aids employ an AGC (Automatic Gain Control) of some sort that gets triggered by an initial loud noise, which reduces the gain (and my perceived noise/speech level), and then 10-50 sec later the gain goes back up again.

Just recently though, another thought popped into my head – what if the perceived volume changes aren’t due to some property of the hearing aids, but instead are a physiological feature of my current hearing/understanding processes? Hmm, I know I have some issues with my Eustachian tubes blocking and unblocking, and I also know that on occasion the volume changes are correlated with a ‘blocked Eustachian tube’ feeling, so this isn’t a completely crazy hypothesis.

So, how to distinguish ‘meat space’ audio response from hearing aid responses? I decided I could set up an experiment where I could expose one or both of my hearing aids to a volume ‘step function’ and monitor the output for AGC-like responses (an initial rapid drop in output, followed by an eventual return to normal). Something like the following block diagram:

The idea is that the Teensy would produce an audio output in the human audible range that can be controlled for amplitude and duration. The audio would be presented to the hearing aid, and the amplified output of the hearing aid would then be captured with an external microphone connected back to the Teensy. A plot of captured amplitude vs time, with a ‘step function’ input should indicate if the hearing aid is employing an AGC-like response function.
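To make the expected signature concrete, here is a toy Python simulation of what a fast-attack/slow-release AGC would do to the planned burst-plus-low-tone test signal. This is purely illustrative – I have no knowledge of Jabra’s actual algorithm, and the time constants and gain limit are made-up numbers:

```python
def simulate_agc(input_amps, fs=20000.0, attack=0.05, release=10.0,
                 target=1.0, max_gain=10.0):
    """Toy AGC model: a one-pole envelope follower drives the gain
    toward target/envelope, with a fast attack and slow release.
    input_amps is the per-sample amplitude (envelope) of the test tone."""
    env = 0.0
    out = []
    for x in input_amps:
        tau = attack if x > env else release  # fast attack, slow release
        env += (x - env) / (tau * fs)         # one-pole envelope follower
        gain = min(max_gain, target / max(env, 1e-6))
        out.append(gain * x)
    return out


# A 1-sec HIGH burst followed by a long LOW tone: if an AGC is present,
# the LOW section of the output should start suppressed and then slowly
# creep back up as the gain recovers.
```

If the measured hearing-aid output shows this slow-recovery shape after the burst, that would support the AGC hypothesis; a flat LOW section would point at the ‘meat space’ explanation instead.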

After some fumbling around and searching through the posts on the Teensy forum, I ran across this post describing how to create a simple 440Hz sinewave output from the DAC pin. The original poster was having problems, but after Paul Stoffregen added the ‘magic sauce’ (the ‘AudioMemory(10);’ line), everything worked fine. When I copy/pasted the poster’s code and added that line, I got the nice 440Hz waveform shown below – yay!!

The next step is to hook the DAC output to a small speaker, so I can drive the hearing aid. When I tried hooking a speaker directly to the DAC output, it clipped badly – oops! Fortunately, I remembered that long ago I had purchased a ‘prop shield’ for the Teensy LC/3.1/3.2 controllers, and this contains a 2W audio amp whose input connects directly to the DAC output – nice! So I dug around and found the part, soldered on headers, and plugged it on top of the Teensy.

15 March 2024 Update:

So, yesterday SpaceX launched IFT3 (Integrated Flight Test #3), consisting of ‘Super-Heavy Booster 10’ and ‘Ship 28’. The combination was the largest, most powerful rocket ever launched, by a fair margin. The upper stage (Starship 28) flew from Boca Chica Texas to the Indian Ocean near Australia in about 49 minutes – wow! At that point, the upper stage was the largest object ever launched into space in all history – Wow Again!

OK, back to reality. I managed to get the prop shield working – after the usual number of false starts and errors. One ‘gotcha’ was that I hadn’t realized the 2W audio amp was a Class-D type, which means it switches on and off at a very high frequency (100KHz or so) – way above the human hearing range – and the audio amplitude-modulates that switching signal. When it is connected to a speaker or other audio transducer, the transducer acts like a low-pass filter and all that comes out is the audio; this is a really neat trick, but it means that the audio amp output signal is basically impossible to look at directly with a scope – oops. So, I got a pair of MEMS microphone breakout boards from Sparkfun and used the microphone to turn the speaker audio back into an electrical signal that I could view with the scope. Here’s the setup:

Teensy ‘propshield’ mounted on Teensy 3.2. Note Sparkfun MEMS microphone suspended over speaker

This worked great, and I was able to verify that the speaker audio output was a reasonable replica of what the T3.2 DAC was putting out.

The next step was to feed the MEMS mic output back into a Teensy 3.2 analog input so I could measure the signal amplitude (A4 in the above block diagram). Then I would modify the Teensy program to deliver a 1-2 sec ‘pulse’ of high amplitude audio to the hearing aid, followed by a constant low-amplitude signal. The measured amplitude of the hearing aid output (as received by the MEMS mic) would be monitored to see if the hearing aid exhibited AGC-like behavior.

However, as I started setting this up, I realized I would have to solder yet another flying lead to the top of the prop shield, as the Teensy 3.2 pins were no longer accessible directly. So I decided to fix this problem by adding female headers to the top side of the prop shield to allow access to all Teensy pins. The result is shown in the ‘before/after’ photos below:

19 March 2024 Update:

I soon discovered that my plan for routing the MEMS mic back to a Teensy analog input and just averaging the results over time wasn’t going to work, as a glance at the MEMS output to the Teensy (shown below) would make quite obvious:

MEMS output with 1000Hz sinewave input to speaker. Average value is 3.3V/2

The average value for this signal is just the DC offset, which will always be the same. The only thing that varies is the amplitude – not the average value – oops!

OK, so the obvious work-around to this problem would be to put a half-wave or full-wave rectifier circuit between the MEMS output and the Teensy so the analog input could measure the half- or full-wave amplitude instead of the average. But, I really didn’t want to add any more circuitry, and besides I have this entire 72MHz computer at my beck and call – surely I can get it to emulate a half- or full-wave rectifier?

So, after the usual number of screwups, I got this working reasonably well – at least enough for a ‘proof-of-concept’. The basic idea is to take analog input readings as fast as possible, use the resulting values to compute the average value (in A/D units – not voltage), and then take the absolute value of the difference between each measurement and the average value – this essentially implements a full-wave rectifier circuit in software.

The following data and Excel plot shows the results for the waveform shown above:

The above data was collected by sampling the input at about 20KHz (50 µsec period). As can be seen from the above, the average value is a constant 511.64 (out of a zero to 1023 scale), and the actual measured values varied from about 264 to about 755. Here are Excel plots of both the measured input and the calculated amplitude:

So it looks like this idea will work. For the intended application (determining if my hearing aids exhibit AGC-like behavior), I can perform a running average of the full-wave rectified signal using something like a 0.1 sec interval (2000 samples). That should accurately capture the onset and release of the 1-sec HIGH tone, and have plenty of resolution to capture any AGC-like sensitivity increase over a longer time – say 30 sec or so.
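The rectify-then-average algorithm is language-independent; before looking at the actual Teensy C++ code, here is the same idea sketched in Python (sample values are in raw A/D units, as in the Teensy version):

```python
from collections import deque


def rectify(samples):
    """Software full-wave rectifier: subtract the DC offset (the average
    A/D value) from each sample and take the absolute value."""
    avg = sum(samples) / len(samples)
    return avg, [abs(s - avg) for s in samples]


def running_average(values, window=2000):
    """Boxcar running average over `window` samples - 2000 samples is
    about 0.1 sec at the ~20KHz sampling rate used here."""
    buf, acc, out = deque(), 0.0, []
    for v in values:
        buf.append(v)
        acc += v
        if len(buf) > window:
            acc -= buf.popleft()
        out.append(acc / len(buf))
    return out
```

For a sinewave riding on a DC offset, the rectified values average out to about 0.64 (2/π) of the sine amplitude, so the smoothed output tracks amplitude changes rather than the (constant) DC offset.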

Here’s the code that produced the above outputs:

I made another run with the A/D resolution set to 12 bits to see if it made any appreciable difference. As can be seen in the following Excel plot – it didn’t:

Here’s another plot showing the microphone output, but this time with the DAC sinewave output amplitude reduced to the point where the microphone output isn’t large enough to clip.

Microphone Input to A/D Converter

In the reduced sinewave amplitude plot above, the ‘Meas’ plot is still centered about the halfway mark in the 12-bit range of values, while the average value of the ‘Amp’ plot has been reduced from about 1800 to about 200.

So, now that I know that the DAC-Speaker-Microphone-ADC loop works, I need to extend it to record amplitude values over an extended period – at least 30 sec, and more like a minute or more.

I modified my control program to create a 1-second ‘burst’ of a HIGH amplitude sinewave, followed by an infinitely long period of a LOW amplitude sinewave. The HIGH amplitude was chosen to fully clip the microphone output, and the LOW amplitude was chosen to be well above the noise floor, but still very small compared to the HIGH amplitude signal. Here are O’scope photos of both the HIGH & LOW signals:

Here is the output from the program (the LOW amplitude output was manually terminated after a few seconds):

31 March 2024 Update:

After getting all of the above working, I then installed one of my Jabra ‘Enhance 200’ aids in between the speaker and the microphone, as shown in the photos below:

With the aid installed, I got the following microphone output using my ‘burst + long-term low level audio’ setup.

Even though the Jabra aid did NOT exhibit anything like the AGC behavior I expected, there *was* a sort of cyclical response with the Jabra aid that wasn’t there without the aid in the middle. This cyclical behavior repeats about once every five seconds and *could* be some sort of AGC-like behavior – just not the one I was expecting.

Stay tuned,

Frank

Convert Condor Task Briefing Custom Waypoint Description Blocks to XCSoar-compatible .CUP Format ‘Additional Waypoints’ file

Posted 23 March 2024

After a multi-year hiatus, I recently started flying contests again in the Condor Soaring Simulator. As sort of a side project, I have also been working with the XCSoar glider navigation program, to see if I could use XCSoar to help navigate AAT/TAT tasks in Condor (Condor doesn’t support AAT/TAT tasks natively with the in-sim PDA).

After using XCSoar for a while, I became frustrated with XCSoar’s inability to define ‘custom’ turnpoints based on LAT/LON coordinates, which are used quite frequently in Condor contest tasks. After a long-fought and ultimately unsuccessful battle with XCSoar’s source code to see if I could modify the program to facilitate this, I admitted defeat and decided to try another way to skin this cat. XCSoar will accept an ‘Additional Waypoints’ file, so I decided to see if I could create a program to convert the ‘new TP’ blocks in the Condor ‘Task Briefing’ description to XCSoar-compatible .CUP file waypoint lines, which could then be loaded into XCSoar for selection as task waypoints.

Here is the .CUP file format definition page from the SeeYou (Naviter) program website:

The above description is NOT very easy to read. It is full of errors, so some imagination is required to make sense of it. The ‘hardpoints’ in the description are as follows:

  • Latitude strings are exactly 9 characters long. Longitude strings are exactly 10 characters long.
  • In latitude strings, the decimal point is exactly the 5th character (Char4). In longitude strings, the decimal point is exactly the 6th character (Char5).
  • Both latitude and longitude strings apparently must be zero-padded as necessary to make the string character counts work out. For instance, in the longitude example the ‘degree’ value of ‘014’ must be exactly three characters.
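These rules are easy to get wrong by hand. As an illustration (not the actual code from my CondorTPX_to_CupWP.py script), here is how the fixed-width degrees-to-.CUP conversion might look in Python. Rounding that pushes the minutes up to 60.000 is not handled in this sketch:

```python
def to_cup_lat(deg):
    """Convert signed decimal degrees to the 9-character .CUP latitude
    string ddmm.mmmH, where the decimal point is always the 5th character."""
    hemi = "S" if deg < 0 else "N"
    deg = abs(deg)
    d = int(deg)
    m = (deg - d) * 60.0  # decimal minutes, zero-padded to mm.mmm
    return f"{d:02d}{m:06.3f}{hemi}"


def to_cup_lon(deg):
    """Convert signed decimal degrees to the 10-character .CUP longitude
    string dddmm.mmmH, where the decimal point is always the 6th character."""
    hemi = "W" if deg < 0 else "E"
    deg = abs(deg)
    d = int(deg)
    m = (deg - d) * 60.0
    return f"{d:03d}{m:06.3f}{hemi}"  # three-digit degrees, e.g. '014'
```

Note how the zero-padding in the format strings guarantees the 9- and 10-character lengths and the fixed decimal-point positions described above.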

Example Run:

Here’s a recent task briefing from Condor-Club:

As can be seen in the above screengrab, the task turnpoints are all ‘custom’ turnpoints defined only by Lat/Lon coordinates. Manually adding these to a .CUP formatted file for use in XCSoar would be essentially impossible, given that the turnpoint coordinates only become visible 15 minutes before server start.

To start the process, the turnpoint blocks from the above briefing were copy/pasted one at a time into a text document (I use NotePad++), as follows (note that I manually changed the name of the last turnpoint from ‘Finish’ to TP 6, as my Python script currently only looks for ‘Start’ and ‘TP’ starting strings):

My CondorTPX_to_CupWP.py Python script opens a ‘FileOpen’ dialog where the input file (in this case ‘NewTPs_IN.txt’) can be selected by the user, and a ‘FileSave’ dialog where the output file (in this case ‘NewTPS_OUT.CUP’) can be selected, and then parses through the blocks in the input file, converting them to equivalent .CUP-formatted lines compatible with XCSoar. Here is the console printout from the ‘verbose’ (-v) version of the script:

At the very end of the above printout, the newly-written contents of the output file are read back out again as verification that the conversion was successful. Here are the actual contents of the ‘NewTPS_OUT.CUP’ file:

This file now has to be transferred to the directory used by XCSoar for waypoints, and then selected in XCSoar (Config->System->Site Files) to load as the ‘More Waypoints’ selection. After this, all the above turnpoints will be available for task construction. Here’s a photo of my Android tablet with the above task turnpoints loaded:

XCSoar task map, using converted task briefing turnpoint blocks
Same task as above, from Condor-Club briefing

It is clear from the above images that the Condor-Club ‘custom’ task turnpoints have been converted properly from text blocks to SeeYou .CUP format waypoint strings, so now I can use XCSoar to navigate Condor tasks with ‘custom’ turnpoints – Yay!

Here’s the Python script I created to do the conversion:

Enjoy!

Frank

Bridgemate II Keypad Membrane Replacement

Posted 22 February 2024

COBA, the Central Ohio Bridge Association, owns a number of Bridgemate II table pads, and lately we have been hearing a number of complaints about having to press buttons harder than normal to get the expected response. After some research, one of our members discovered that we could purchase replacement membranes from Bridgemate for $15 each, so we decided to undertake a membrane replacement program.

Opening up a Bridgemate II

The keyboard cover/upper-half can be removed by disengaging a number of flexible plastic tabs, as shown in the following page from the Bridgemate site:

Opening the case on a Bridgemate II

While this wasn’t quite as trivial as the above description implies, it wasn’t really all that hard. Most of the difficulty was in trying NOT to break things during the opening process on the first unit. Once that was accomplished (thankfully without breaking anything), I’m sure the rest will go much easier. Here are some photos showing the disassembled Bridgemate II:

Visual inspection of this first unit showed everything to be very clean – basically indistinguishable from a brand-new unit. In particular, the key membrane seemed to be completely intact, and could be easily manipulated between the ‘pressed’ and the ‘unpressed’ states with just fingertip pressure.

Stay tuned,

Frank

Using VS Code to Debug Linux Makefile Projects

Posted 20 February 2024

I recently got re-interested in soaring (glider racing) after a number of years away. As part of this new journey, I became interested in contributing to the development of the XCSoar glider racing navigation software. I had contributed to this program some years ago in a very minor way, and thought maybe now that I’m retired and have plenty of time to waste (NOT!!), I could contribute in a more meaningful way.

In any case, XCSoar development is done in Linux (specifically in Debian Linux, but that’s another story), so I started thinking about creating a development environment on a Debian Linux box. I had an old Dell Precision M6700 ‘desktop replacement’ laptop that I hadn’t turned on for years, so I dug it out, installed Linux (Ubuntu first, then Debian), and with the help of Ronald Niederhagen (see this post), I was able to clone the XCSoar repo from Github and build both the Linux and Android versions.

However, I still needed a way to ‘break into’ the code, and to do that I needed to run the program under debug control. I have done this sort of thing for decades in the Windows world, but not so much in Linux, so I was more or less starting from scratch. After a LOT of research, I found that Microsoft offers a free, cross-platform IDE called VS Code – sort of a lightweight version of Visual Studio that runs on Linux and macOS as well as Windows.

So, I installed VS Code on my Linux box and started the ‘let’s learn yet another coding IDE’ dance, again starting with a lot of Google searches, tutorials, etc., etc. After creating some basic ‘hello world’ programs using VS Code, I started thinking about how to use it on the XCSoar project, which comprises thousands of source files and a Makefile almost three hundred lines long. I knew just enough about Makefiles to type ‘make’ or ‘make all’, so there was a steep learning curve ahead before I could use VSCode for debugging.

After yet another round of Google searches and forum posts, I found a relevant tutorial on the ‘HackerNoon’ website. The HackerNoon tutorial seems to be aimed at Windows users (Linux doesn’t use extensions to denote executable files), but I was able to work my way through and translate as appropriate. I suggest you open the HackerNoon tutorial in one window, and this ‘translation’ in another.

Original:

Translation for Linux:

Download & Install VS Code: See this link. Everything else should already be available as part of any modern Linux distro (I’m using Debian 12 – ‘bookworm’).

main.cpp: (same as original – no translation required)

Makefile (Original):

Makefile (Translation for Linux):
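The translated Makefile boils down to something like the following minimal sketch (my assumptions: the executable is plain ‘mymain’ with no ‘.exe’ extension, and it is compiled with debug symbols so gdb can step through the source):

```
# Build 'mymain' with debug symbols (-g) so gdb/VSCode can step through the source
mymain: mymain.cpp
	g++ -g -Wall -o mymain mymain.cpp

# Remove the build product
clean:
	rm -f mymain
```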

Run the ‘main.exe’ make target, then run the executable to make sure everything works (Original):

Translation for Linux:

I placed ‘mymain.cpp’ and ‘Makefile’ in my home directory tree as shown below:

So, to test the Makefile and the executable, I did the following:

This test confirms that the source code compiles correctly, that the Makefile is correct, and the output from the compiled program is as expected – yay!

Setting up VSCode debugger:

At this point the HackerNoon tutorial and my experience parted ways. I could get the ‘Run and Debug’ side panel open, and I could open ‘mymain.cpp’ in the VSCode editor window, but I couldn’t find anything that offered to “create a launch.json file”. I tried to send a comment to the author, but comments weren’t enabled for some reason (and yes, I did register and log in). I suspect the disparity between the HackerNoon example and my VSCode reality is due to changes in VSCode after the tutorial was created, which is why I absolutely hate posts like this that aren’t dated in any way, shape, or form: without a date, it is impossible to tell whether discrepancies like the ones I encountered are mistakes on the author’s part, or just changes in the underlying software being demonstrated.

So, I was left to randomly click on things until, by some miracle, the ‘launch.json’ file was created and appeared in the VSCode edit window. Later I went back and tried to re-create the miracle and I sort of figured it out; with the ‘mymain.cpp’ file open in the editor, click on the ‘gear’ icon (upper right as shown in the screenshot below):

And select ‘(gdb) Launch’ from the menu items presented. This will create a ‘launch.json’ file and open it in the editor, as shown below:

launch.json file created by VSCode

At this point we can reconnect to the ‘HackerNoon’ tutorial and copy/paste the launch.json contents from the tutorial to the newly-created launch.json file in VSCode, as shown below:

launch.json after copy/pasting example code from HackerNoon site

Here’s the ‘launch.json’ file from HackerNoon:

And here’s the same file, after translating it for Linux instead of Windows:
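As a sketch (assuming the executable is named ‘mymain’ and a build task labeled ‘build’ exists in tasks.json), the Linux version follows the standard ‘cppdbg’/gdb pattern:

```
{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "(gdb) Launch",
            "type": "cppdbg",
            "request": "launch",
            "program": "${workspaceFolder}/mymain",
            "args": [],
            "stopAtEntry": false,
            "cwd": "${workspaceFolder}",
            "MIMode": "gdb",
            "preLaunchTask": "build"
        }
    ]
}
```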

The next step is to create a ‘tasks.json’ file using the ‘New File’ command (ALT-Ctrl-N) in VSCode. This brings up a dialog forcing the user to specify the folder into which the new file will be placed, which seemed a bit strange to me, but what do I know. Put the new file in the same (hidden) .vscode folder as the companion ‘launch.json’. In my setup this was laid out as shown below:
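For what it’s worth, the resulting tree looks something like this (the project folder name is hypothetical):

```
~/Projects/mymain/
├── Makefile
├── mymain.cpp
└── .vscode/
    ├── launch.json
    └── tasks.json
```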

Here’s the original ‘tasks.json’:

And here’s the same file after translation for Linux:
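As a sketch (assuming the ‘label’ matches the ‘preLaunchTask’ entry in launch.json, and that a bare ‘make’ builds the default target), the Linux tasks.json is just:

```
{
    "version": "2.0.0",
    "tasks": [
        {
            "label": "build",
            "type": "shell",
            "command": "make",
            "group": {
                "kind": "build",
                "isDefault": true
            }
        }
    ]
}
```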

The above three files (mymain.cpp, launch.json and tasks.json) form a complete set (a ‘configuration’?) that enables VSCode to compile and then run – under debug control – a .cpp file. Here’s a short video showing two complete compile/run cycles:

Video showing VSCode compiling/running/debugging using a Linux Makefile

At this point it is clear that VSCode, in conjunction with appropriately constructed ‘Makefile’, ‘launch.json’ and ‘tasks.json’ files, is capable of debugging a Linux-based C++ program. This means that – at least in theory – I should be able to aim VSCode at the humongous Makefile associated with the equally humongous XCSoar program and step through the source code under debug control – yeeehah!

23 February 2024 Update:

Well, the theory (about being able to run XCSoar under debug control) has now been validated. I was able to run XCSoar using the following ‘launch.json’ and ‘tasks.json’ file contents:

Launch.json:

Tasks.json:

And here is a short video showing the action:

So now I’m set; I can run XCSoar under debug control, and I should be able to poke around and hopefully make some changes (or at least understand some of the magic tricks). One thing I’ve already found is that (depending on where the breakpoints are) XCSoar can grab the mouse focus (not the keyboard, thankfully), so even though the mouse cursor can move around the screen outside the XCSoar window, all mouse clicks go to XCSoar. Fortunately, keyboard inputs still go to VSCode, so I can still (with some difficulty) get around OK.

Trying my Hand at XCSoar Program Development

Long ago and far away, back when I was doing a lot of real-life soaring, I was heavily involved in the development of the ClearNav flight computer/navigator and I’d like to think I made it better than when I first started working with it. Among other things, I helped develop the thermalling assistant in ClearNav, and also helped port that functionality to XCSoar.

Now, many years later, I am no longer doing real-life soaring, but I’m once again starting to fly in virtual soaring races using the Condor2 soaring simulator. As part of that, I have been attempting to use the XCSoar flight computer/navigation assistant as an external adjunct to the in-sim Condor flight computer. In particular, Condor2’s flight computer has no capability to support AAT/TAT tasks, so using an external PNA (Personal Navigation Assistant) that does support AAT/TAT tasks makes sense. To facilitate this effort, Condor2 can output GPS NMEA sentences to an external device.
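For the curious, NMEA output is just plain-text ‘sentences’; a typical RMC (recommended minimum) sentence looks like this (a standard textbook example, not actual Condor2 output):

```
$GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W*6A
```

The comma-separated fields are UTC time, status, latitude/longitude, speed over ground in knots, track, date, magnetic variation, and a checksum after the ‘*’.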

I have a nice dual-24″ monitor setup at home, so my initial thought was to run Condor2 on one monitor, and the PC version of XCSoar on the other. This actually works, but with a big ‘gotcha’ – when Condor is running, it captures the mouse and keyboard input, and won’t let them go unless you use ‘ALT-TAB’ to switch to another running program. This means that if you wish to make an adjustment in XCSoar, you have to ALT-TAB out of Condor2 and into XCSoar, make whatever adjustments, and then ALT-TAB back to Condor2 – more than a little bit messy. In addition, while you’re ‘away’ from Condor2, it continues to fly along without pilot input – maybe OK for a flatland task, but definitely bad for one’s (simulated) health in gnarly terrain.

So, my next idea was to buy a cheap Android device and run XCSoar on it, connected to Condor2 on my PC via Bluetooth. I found a really nice PRITOM M10 10 inch tablet with 2 GB RAM, 32 GB internal storage on Amazon for about $50USD, and figured out how to get NMEA data to it using Bluetooth – nice!

Then I constructed a small AAT/TAT task in Condor2’s default Slovenia scenery, loaded the same task into the M10, and flew it multiple times to see if I could figure out how to best use XCSoar to optimize navigation through both defined areas. This worked really well, but I ran into some problems with the XCSoar software. At least in the Android version, the ‘AAT Time’ and ‘AAT Delta Time’ readouts blanked out several times during the tests, rendering XCSoar pretty much useless for AAT/TAT tasks (see this post for all the gory details).

The above experience led me to think about looking through the source code to see if I could find out why these AAT-related values were going missing. Way back in the day I had done this once before when I ported the ClearNav thermalling assistant algorithm to XCSoar, but that time I didn’t know enough about Github repos and pull requests to do it the right way – I had to throw myself on the mercy of the other developers to make it happen. Since I’m now retired and not flying in real-life anymore, I have a bit more time, so I decided to give it a go.

I started by reading through the XCS Developer Manual, where I found that the primary development platform for XCSoar is Linux, and I hadn’t played in Linux-land since my days as IT manager at The Ohio State University ElectroScience Lab a couple of decades ago. Nevertheless, I had an old Dell Precision M6700 15″ ‘desktop replacement’ laptop hanging around doing nothing, so I dug it out and installed Ubuntu 22.04 LTS on it. I chose Ubuntu because it is reportedly easier for beginners, and known to install OK on my laptop model. Installation was pretty easy, and using the information in the developer’s manual I was able to clone the Github repo, install the required packages, and actually get the default UNIX version of XCSoar compiled and running on my Ubuntu laptop – yay!

With XCSoar running on my laptop, I decided to try the same AAT/TAT test task, with Condor2 GPS NMEA sentences delivered to XCSoar on my Linux laptop by using VSPE on my Windows box to connect Condor2 to a TCP port, and ‘socat’ on the Linux box to create a bridge between the TCP and UDP ports. Then in XCSoar I set ‘Device A’ to point to the UDP port created by socat. Amazingly, this all worked! (See this post for the details.) With this setup I ran the test AAT, but the ‘AAT Time’ and ‘AAT Delta Time’ values stayed rock-solid through the entire task – yikes!
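The Linux end of that bridge is a one-liner of this general shape (the IP address and port numbers are placeholders, not my actual settings):

```
socat TCP:192.168.1.100:4353 UDP-SENDTO:127.0.0.1:4353
```

socat connects to the TCP port that VSPE exposes on the Windows box and forwards the NMEA stream as UDP datagrams to the local port that XCSoar’s ‘Device A’ is listening on.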

This result led me to believe that maybe I could just abandon the Android M10 and use XCSoar on my Linux box to fly AAT tasks. However, when I actually tried this in a Condor2 race, the Linux version of XCSoar kept dropping GPS inputs – I don’t know why.

So, I decided to see if I could compile XCSoar for Android on the Linux box and ‘side-load’ it onto my M10 Android tablet. If this worked, then I could instrument the code on my Linux box, run it on my M10, and maybe figure out why the AAT Time/AAT Delta Time data was getting corrupted. After all, how hard could it be? I had already cloned the source repo and gotten the UNIX target to compile properly. As it turned out: “Pretty Darned Hard!”

I was stumbling around on the XCSoar developer’s forum, trying to figure out why compiling for Android wasn’t working, when Ronald Niederhagen took pity on me and sent me a direct email with some suggestions. The foremost of those was ‘Install Debian and your life will be easier’. Although I had noticed some references to Debian in some of the development steps, I hadn’t paid much attention – after all, all Linux installations are the same, right? Wrong.

So, I went off into a side project for replacing Ubuntu with Debian on my Dell laptop. Once I got that part accomplished, Ronald sent me the following instructions:

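Ronald’s exact wording isn’t reproduced here, but the gist (sketched from the standard XCSoar build targets; the tools-script path may differ in your checkout) was:

```
# Native (UNIX) build, then launch the result:
make -j4
./output/UNIX/bin/xcsoar

# Android build (the Android SDK/NDK tools must be installed first):
./ide/provisioning/install-android-tools.sh
make -j4 TARGET=ANDROID
```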
The first ‘make’ command above ran successfully, and I was able to launch XCSoar on my Linux Debian laptop with no problem. However, the second ‘make’ command to compile XCSoar for Android failed miserably – ouch!

When I reported this result to Ronald, his reply was:

Ronald’s first sentence, “You are not far from success”, was pretty heart-warming. I have done a LOT of programming over a half-century of engineering, and I was well aware of how easily projects of this nature can spiral out of control, so ‘hearing’ such encouragement from such an obviously competent source was a life-saver.

Anyway, I ran the command to install the Java JRE, confirmed success with the ‘which’ command, and then re-ran the ‘install-android-tools.sh’ script. This time it completed OK, so then I ran the ‘make -j4 TARGET=ANDROID’ command. This time it seemed to complete OK, but no corresponding *.apk file was generated.

When I reported this to Ronald, he asked me to re-run the compile, but this time redirect both the normal (i.e. stdout) and error (i.e. stderr) outputs to files and send them to him. OK, so I resurrected (with the help of Google) my 20-year-old memories of how to redirect output to files in Linux, got the job done, and sent the files off to Ronald. In no time at all, Ronald sent back the following:
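The redirection incantation itself is worth recording for posterity (the file names are whatever you like):

```
make -j4 TARGET=ANDROID > make_stdout.txt 2> make_stderr.txt
```

Here ‘>’ captures stdout and ‘2>’ captures stderr; appending ‘2>&1’ after a single ‘>’ would instead merge both streams into one file.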

I had no idea what ‘javac’ was (other than a suspicion it was java-related). When I ran the ‘which’ command on my box it came up blank, so obviously Ronald was on the right track. After a couple more back-and-forths, Ronald said:

I ran the install, and FINALLY I was able to get the XCSoar Android version installed:
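For anyone following along on Debian, the check-and-fix pattern (my reconstruction, not Ronald’s exact commands) is:

```
which javac                    # no output means javac isn't on the PATH
sudo apt install default-jdk   # Debian's default JDK metapackage, which provides javac
which javac                    # should now report something like /usr/bin/javac
```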

With more help from Google, I was finally able to get XCSoar ‘side-loaded’ to the M10. As it turned out, the trick was to copy the *.apk file to my Google Drive site, and then use ‘Drive’ on the M10 to ‘install’ the app. Basically the process is:

  • Copy the *.apk file from my Linux laptop to my Google Drive site
  • Uninstall XCSoar from the M10 by dragging the icon to the trashcan
  • Use ‘Files->Drive’ to access my Google Drive account from the M10
  • Double-click on the *.apk file in Google Drive
  • Select ‘install’ on the resulting dialog.

Here is a short video showing the process:

Installing a fresh Android version of XCSoar on my Android M10 tablet

So now, thanks to Ronald Niederhagen, I’m all set for XCSoar development. In the meantime, however, the original reason I wanted to get into XCSoar development (disappearing AAT infobox data) seems to have… disappeared. Not to worry though, as I have lately gotten some other ideas for XCSoar ‘improvements’ 😉

A last thought on this subject from a septuagenarian engineer/programmer: I appreciate how much time and effort Ronald put into helping a noob along. Ronald didn’t know me from Adam, and yet he took the time to help. He didn’t know that I’m a 75-year-old broke-down engineer, ex-pilot (ex-everything, for that matter!), and I don’t know anything about him, either. Heck, Ronald could be a 12-year-old kid with acne, programming in his parents’ basement wearing pajamas, but in this case he was clearly the teacher and I was clearly the student. It’s pretty cool when the internet and free discourse facilitate this kind of international (and maybe intergenerational) collaboration.

Stay Tuned,

Frank