I have tried to use a digital trigger to align an analog measurement and noticed a significant discrepancy: somehow the start time and the relative offset in the raw export do not match between the digital and the analog channels, but only in the raw export. If I zero-reset both recordings, i.e. assume that analog and digital start at the same time and ignore the start-time metadata, the records seem to match better, but it is hard to say whether that is correct. Is anything known about this?
analog_0.bin:0 tstart=5.452571
analog_13.bin:13 tstart=5.452571
analog_2.bin:2 tstart=5.452571
analog_3.bin:3 tstart=5.452571
digital_7.bin:8 tstart=5.452649
digital_9.bin:10 tstart=5.452649
This is systematic, so I guess it must have something to do with the first sample time, but it still makes no sense to me. I could not find anything about it in the manual; maybe I am just blind …
I am using version 2.4.29 of Logic with a Logic Pro 16.
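For reference, this is roughly how I read the tstart values out of the exported .bin files. It is only a minimal sketch; the header layout (8-byte "<SALEAE>" identifier, int32 version, int32 type, and an extra uint32 initial_state before the float64 begin_time for digital channels) is my assumption based on the Logic 2 binary export description, not official Saleae code:

```python
import struct

def read_tstart(path):
    # Assumed header layout: "<SALEAE>" (8 bytes), version (int32),
    # type (int32, 0 = digital, 1 = analog); digital files carry a
    # uint32 initial_state before the float64 begin_time.
    with open(path, "rb") as f:
        if f.read(8) != b"<SALEAE>":
            raise ValueError(f"{path}: not a Logic 2 binary export")
        version, dtype = struct.unpack("<ii", f.read(8))
        if dtype == 0:          # digital channel
            f.read(4)           # skip initial_state
        (begin_time,) = struct.unpack("<d", f.read(8))
        return begin_time

for name in ("analog_0.bin", "digital_7.bin"):
    print(name, "tstart =", read_tstart(name))
```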
See another post about analog filtering details:
… specifically, about the group-delay feature. You can look at your device-specific calibration file to see how your setup is calibrated. Ultimately, it is trying to 're-time' the analog (filtered) data against the digital data streams, because of the extra latency introduced by the various filters (hardware & software) and other delays (ADC sampling/conversion, etc.).
Basically, a digital read is 'faster' and more 'real-time' than an analog reading. There are more signal-processing hardware stages and post-processing software steps applied to the analog samples, which therefore lag behind the timeline of the digital sample captures.
Thus, a simple step function (i.e., a change from 0 V to 5 V 'instantly') needs to be 'synchronized' between the analog and digital views so they still appear to transition at the same 'time' in the waveform display (see the mGroupDelay parameter in the calibration file).
I assume the 'raw' captured data is just the samples, without the group-delay timing offset taken into account. Therefore, I suspect the metadata / start timestamps are needed to compare the channel timelines against each other.
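To make that concrete: outside of Logic you would have to rebuild the analog time axis yourself from the export metadata and overlay the digital transitions on the same absolute timeline. Here is a minimal sketch of what I mean; the begin_time / sample_rate / downsample / num_samples field layout is my assumption from the binary export description, not official Saleae code:

```python
import struct
import numpy as np

def load_analog(path):
    # Assumed analog layout: "<SALEAE>", version (int32), type (int32),
    # begin_time (float64), sample_rate (uint64), downsample (uint64),
    # num_samples (uint64), then float32 samples.
    with open(path, "rb") as f:
        f.read(8 + 4 + 4)                       # identifier, version, type
        begin_time, rate, down, n = struct.unpack("<dQQQ", f.read(32))
        samples = np.frombuffer(f.read(4 * n), dtype=np.float32)
    t = begin_time + np.arange(n) * (down / rate)   # absolute sample times
    return t, samples

def load_digital(path):
    # Assumed digital layout: "<SALEAE>", version (int32), type (int32),
    # initial_state (uint32), begin_time (float64), end_time (float64),
    # num_transitions (uint64), then float64 transition times.
    with open(path, "rb") as f:
        f.read(8 + 4 + 4)
        (initial_state,) = struct.unpack("<I", f.read(4))
        begin_time, end_time, n = struct.unpack("<ddQ", f.read(24))
        edges = np.frombuffer(f.read(8 * n), dtype=np.float64)
    return begin_time, end_time, initial_state, edges
```

With both channels on one absolute timeline, a digital transition at time T can be compared directly against the analog samples around T.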
The problem, though, is that if I re-plot the readout respecting the metadata start times, the digital and analog data appear misaligned compared to reality. To check this I recorded a step function with Saleae (analog and digital at the same time) and with an oscilloscope. If I then ignore the start time, thereby shifting the analog channel backwards in time, the alignment is much closer to what is expected… and the picture plotted from the raw data then also looks much more like the one in the Logic tool, so something must be off here, or misunderstood by me.
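To show what I compared: a minimal sketch of the step-edge check, reusing the load_analog / load_digital helpers sketched above; the 2.5 V threshold is just an illustrative half-scale value for the 0 V to 5 V step:

```python
import numpy as np

t_analog, v_analog = load_analog("analog_0.bin")
d_begin, _, _, edges = load_digital("digital_7.bin")

# Step edge: first analog sample above half scale, first digital transition
analog_edge = t_analog[np.argmax(v_analog > 2.5)]
digital_edge = edges[0]

# Variant 1: respect the start-time metadata (absolute timeline)
print("edge offset, tstart respected:", analog_edge - digital_edge)

# Variant 2: zero-reset both channels (ignore the tstart metadata)
print("edge offset, zero-reset:",
      (analog_edge - t_analog[0]) - (digital_edge - d_begin))
```

With the oscilloscope recording as the reference, the zero-reset variant is the one that comes out closer to what I expect.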
Your comments about the calibration file are interesting. I had a look, but I do not understand it yet; I need some more time to get into it.
Regardless, thanks for the explanation and for pointing out all the analog sampling magic that has to be taken into account.
Even though I am not 100% sure, I still think I am onto something that either needs fixing or a better explanation.