Autobaud detection

Hello,
I’ve written an Analyzer in C++ with the Analyzer SDK and want to add true autobaud detection.
The available helper functions TrackMinimumPulseWidth() and GetMinimumPulseWidthSoFar() aren’t really usable here, because glitches in the signal can produce an impossibly short minimum pulse and therefore a wrong baud rate. Sadly there is no API function that lets you iterate through, say, the 100 shortest pulses. It would also be nice to get the bit value of each pulse; the requested bit level could be passed as a parameter, for example.
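
For example, a hypothetical helper (not in the SDK today, just what I have in mind) could look like this:

```cpp
// Hypothetical addition to AnalyzerChannelData -- NOT part of the SDK.
// Would return the `count` shortest pulse widths (in samples) seen so
// far, restricted to pulses at the given bit level (BIT_LOW / BIT_HIGH).
std::vector<U64> GetShortestPulsesSoFar( U32 count, BitState level );
```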

I successfully added true autobaud detection by creating a histogram of dominant and recessive pulse widths within a reasonable baud-rate range. I then add the shortest dominant and the shortest recessive pulse together for better accuracy, and calculate the baud rate from the sum of these pulse durations on the “first” run of WorkerThread(). Since you cannot move backwards in the data, the analyzer then needs to be re-run; I do this by returning true from NeedsRerun() if autobaud detection was requested in the settings. Once my autobaud detection has run, I automatically deactivate it in the settings.
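
For illustration, the first pass boils down to something like this (simplified; MyAnalyzer, mSerial, the baud-rate range, and the glitch threshold of 3 hits per bin are placeholder assumptions, the channel-data calls are from the Analyzer SDK):

```cpp
#include <map>
#include "AnalyzerChannelData.h"

// Sketch of the histogram pass on the first run of WorkerThread().
// Returns the detected baud rate, or 0 if no clean pulses were found.
U32 MyAnalyzer::DetectBaudRate()
{
    U64 sampleRate = GetSampleRate();

    std::map<U64, U32> dominantHist;   // pulse width (samples) -> count, BIT_LOW
    std::map<U64, U32> recessiveHist;  // same, for BIT_HIGH pulses

    // Only count pulses inside a plausible baud-rate range.
    const U64 minWidth = sampleRate / 1000000; // fastest rate considered
    const U64 maxWidth = sampleRate / 1200;    // slowest rate considered

    U64 lastEdge = mSerial->GetSampleNumber();
    while( mSerial->DoMoreTransitionsExistInCurrentData() )
    {
        BitState level = mSerial->GetBitState(); // level of the current pulse
        mSerial->AdvanceToNextEdge();
        U64 width = mSerial->GetSampleNumber() - lastEdge;
        lastEdge = mSerial->GetSampleNumber();

        if( width < minWidth || width > maxWidth )
            continue; // glitch or idle period -> ignore

        if( level == BIT_LOW )
            dominantHist[ width ]++;
        else
            recessiveHist[ width ]++;
    }

    // Shortest bin that occurred often enough to rule out glitches
    // (the threshold of 3 hits is an arbitrary assumption).
    auto shortestBin = []( const std::map<U64, U32>& hist ) -> U64 {
        for( const auto& bin : hist )
            if( bin.second >= 3 )
                return bin.first;
        return 0;
    };

    U64 dom = shortestBin( dominantHist );
    U64 rec = shortestBin( recessiveHist );
    if( dom == 0 || rec == 0 )
        return 0; // not enough clean pulses found

    // Shortest dominant + shortest recessive pulse is roughly two bit times.
    return static_cast<U32>( ( 2 * sampleRate ) / ( dom + rec ) );
}
```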

This approach works in Logic 1.x, although with one small caveat: it only works on the initial run of the analyzer. If, for example, the analyzer is already present without autobaud detection but with a wrong baud rate, and you then change the settings to activate autobaud detection, it doesn’t work. The reason is that the internal variables I use aren’t set properly. When the user changes the settings of an existing analyzer, is there some callback being called where I can reset my internal variables?
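
To illustrate what I mean, my state handling looks roughly like this (simplified, names are placeholders):

```cpp
// Simplified illustration of the problem (names are placeholders).
MyAnalyzer::MyAnalyzer()
    : Analyzer2(),
      mAutobaudDone( false ) // only set when the analyzer object is created
{
}

void MyAnalyzer::WorkerThread()
{
    // Moving the reset here would run it on every (re)run, including
    // after a settings change -- but is there a dedicated callback?
    // mAutobaudDone = false;
    // ... analysis loop ...
}
```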

In Logic 2.x my autobaud detection does not work at all. I think this is because there is no break condition for WorkerThread(): I first go through all samples to analyze the data, until either CheckIfThreadShouldExit() triggers or DoMoreTransitionsExistInCurrentData() returns false, and after that pass I would need a way to force a re-run. Can this be accomplished somehow?

Nevermind, I was thinking too generically for my case.
In my protocol there is a 0x55 sync field as the first byte. I check for this pattern, test whether the individual bit durations are roughly in the same range, and then set the bit rate based on the overall pattern length. This is basically how a device would implement autobaud detection when a sync field is involved.
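
Roughly like this (simplified; MyAnalyzer, mSerial, and the ±25% tolerance are placeholder assumptions):

```cpp
#include <vector>
#include "AnalyzerChannelData.h"

// Sketch of the 0x55 sync-field detection. Returns the detected baud
// rate, or 0 on failure. Assumes an idle-high line with a low start bit.
U32 MyAnalyzer::DetectBaudFromSyncField()
{
    U64 sampleRate = GetSampleRate();

    while( mSerial->DoMoreTransitionsExistInCurrentData() )
    {
        mSerial->AdvanceToNextEdge();
        if( mSerial->GetBitState() != BIT_LOW )
            continue; // wait for a falling edge (start bit)

        // 0x55 sent LSB-first after a start bit gives 9 alternating
        // pulses of one bit time each, ending at the stop bit.
        std::vector<U64> widths;
        U64 prev = mSerial->GetSampleNumber();
        while( widths.size() < 9 && mSerial->DoMoreTransitionsExistInCurrentData() )
        {
            mSerial->AdvanceToNextEdge();
            widths.push_back( mSerial->GetSampleNumber() - prev );
            prev = mSerial->GetSampleNumber();
        }
        if( widths.size() < 9 )
            return 0; // ran out of data

        // All pulses must be one bit time: require +/-25% of the first.
        U64 ref = widths[ 0 ];
        U64 total = 0;
        bool consistent = true;
        for( U64 w : widths )
        {
            if( w < ref - ref / 4 || w > ref + ref / 4 )
                consistent = false;
            total += w;
        }
        if( consistent )
            return static_cast<U32>( ( sampleRate * widths.size() ) / total );
        // otherwise keep scanning from the current position
    }
    return 0;
}
```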

Otherwise, a built-in histogram would be quite useful, I think.

@cg1 Thanks for posting about this. You’re absolutely right about the caveat with using TrackMinimumPulseWidth() and GetMinimumPulseWidthSoFar() when glitches may be present. This is actually the method we used for autobaud in Logic v1 in the past. It was a rudimentary approach, quite simple to implement but prone to producing incorrect results.
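
For context, that rudimentary approach boils down to something like this (sketch only, with mSerial standing in for the channel data):

```cpp
// The old minimum-pulse-width idea, sketched. mSerial is assumed to be
// the AnalyzerChannelData* for the channel being analyzed.
mSerial->TrackMinimumPulseWidth();                   // start tracking
// ... advance through the capture ...
U64 minWidth = mSerial->GetMinimumPulseWidthSoFar(); // width in samples
U32 naiveBaud = GetSampleRate() / static_cast<U32>( minWidth );
// A single glitch shorter than a real bit makes naiveBaud far too high.
```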

We haven’t looked into this problem since then. Right now we’re focused on getting Logic v2 ready for official release, so that’s our priority at the moment.

When we get back to working on analyzers, we may put together a more comprehensive autobaud solution, perhaps with an approach similar to the one you have shared.

I created an idea post a while back to get this added to the Async Serial analyzer as a starting point, and to start tracking interest in this for prioritization purposes. Feel free to add your votes/comments to it below in the meantime.