I’m using Logic 2.3.7 on Linux with a simple parallel decoder set to falling edge and a one-bit bus, which drives a custom HLA. The HLA was occasionally getting out of sync with the bits, and on further investigation the simple parallel decoder appears to occasionally emit the same bit twice. I couldn’t find any other reports of this behaviour or any matching issues on the GitHub repository, but apologies if it’s already been reported.
There’s no double edge or other glitch on the clock signal: the problem persists even with a very generous glitch filter, and occurs across a range of sample rates and clock speeds (e.g. 10 kHz to 50 MHz clock, 1 MS/s to 500 MS/s sample rate, glitch filter either off or up to half the clock period). A digital measurement reports the correct number of rising and falling edges, and zooming all the way in to the edge doesn’t reveal anything either. Stranger still, if I save the capture and re-open it, it analyses perfectly — no duplicate bits, and my HLA stops going wrong. The duplication seems to happen within a few hundred clock edges, though I haven’t noticed any particular pattern as to exactly when it occurs.
Another symptom of the issue is that the arrowheads on the clock falling edges that indicate sampling moments disappear whenever a double-counted edge is in view, so you can localise the problem quite quickly by scrolling around until the arrowheads appear/disappear. I guess this is because it’s trying to draw two of them on top of each other. Both the decode table and frame.start_time in the HLA show exactly the same timestamp for the duplicate bit.
With arrowheads, duplicate edge slightly off-screen to the right:
Scrolling a few pixels right, duplicate edge comes into view, arrowheads vanish:
I would upload a sample capture, but when I open it there’s no problem, this only occurs on live data.
For now I’m able to work around this by returning early from my HLA’s decode() when frame.start_time is the same as the previous frame’s, but the analysis in Logic and the data exported from the decoder both still contain the duplicate, so it seemed worth reporting.
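For reference, the workaround guard looks roughly like the sketch below. In the real extension it lives in the HLA’s decode() and receives a saleae.analyzers AnalyzerFrame; here a minimal stand-in Frame class (an assumption, just for runnability) is used so only the duplicate-detection logic is shown.

```python
class DedupDecoder:
    """Drops any input frame whose start_time repeats the previous frame's."""

    def __init__(self):
        self._last_start = None

    def decode(self, frame):
        # Work around the duplicated-bit bug: the bogus frame carries
        # exactly the same start_time as the frame before it.
        if frame.start_time == self._last_start:
            return None  # duplicate -> emit nothing
        self._last_start = frame.start_time
        return frame  # the real HLA would build an output AnalyzerFrame here


class Frame:
    """Stand-in for AnalyzerFrame; only start_time matters for the guard."""

    def __init__(self, start_time):
        self.start_time = start_time


decoder = DedupDecoder()
times = [0.0, 1.0, 1.0, 2.0]  # the second 1.0 plays the duplicated bit
kept = [t for t in times
        if decoder.decode(Frame(t)) is not None]
print(kept)  # -> [0.0, 1.0, 2.0]
```

Comparing start_time rather than the bit value is deliberate: legitimate consecutive bits can repeat the same value, but two frames genuinely starting at the same instant shouldn’t occur.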