Start Time Mismatch - Analog vs. Digital channel in Raw export?!

See another post about analog filtering details:

… specifically, the group delay feature. You can inspect your device-specific calibration file to see how your setup is calibrated. Ultimately, the software is trying to ‘re-time’ the analog (filtered) data against the digital data streams to compensate for the extra latency introduced by the various filters (hardware and software) and other delays (ADC sampling/conversion, etc.).

Basically, a digital read is ‘faster’ and more ‘real-time’ than an analog reading. There are more signal-processing hardware stages and post-processing software steps applied to the analog samples, so they lag behind the timeline of the digital sample captures.

Thus, a simple step function (i.e., an ‘instant’ change from 0V to 5V) needs to be ‘synchronized’ between the analog and digital displays so both appear to transition at the same ‘time’ in the waveform display (see the mGroupDelay parameter in the calibration file).
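To make the re-timing concrete, here is a minimal sketch of how a group-delay correction could be applied: shift the analog timeline earlier by the calibrated delay so a step edge lines up with the same edge on the digital channel. This is only an illustration of the idea; the function name is mine, and the assumption that mGroupDelay is expressed in seconds is just that, an assumption.

```python
def analog_sample_times(start_time, sample_rate, n_samples, group_delay):
    """Return group-delay-corrected timestamps for analog samples.

    start_time   -- capture start of the analog stream (seconds)
    sample_rate  -- analog sample rate (Hz)
    group_delay  -- mGroupDelay from the calibration file (seconds, assumed)
    """
    # Subtracting the group delay moves the analog samples earlier on
    # the timeline, compensating for the filter/ADC latency.
    return [start_time + i / sample_rate - group_delay
            for i in range(n_samples)]

# Example: 4 samples at 1 MS/s with a 2.5 us group delay.
times = analog_sample_times(0.0, 1_000_000, 4, 2.5e-6)
```

With the correction applied, a transition that was captured 2.5 µs "late" by the analog front end is displayed at the moment it actually occurred on the wire.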

I assume the ‘raw’ captured data is just the samples, without the group delay timing offset applied. Therefore, I suspect the metadata / start timestamps are needed to compare the channel timelines against each other.
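If that assumption holds, aligning a raw analog export with a raw digital export would come down to comparing their per-channel start timestamps. A rough sketch, assuming each export carries a start time and sample rate in its metadata (the field names here are illustrative, not from any specific export format):

```python
def index_offset(analog_meta, digital_meta):
    """Number of analog samples by which the analog stream starts
    later (+) or earlier (-) than the digital stream.

    Both metadata dicts are assumed to contain "start_time" (seconds)
    and the analog dict a "sample_rate" (Hz) -- hypothetical fields.
    """
    dt = analog_meta["start_time"] - digital_meta["start_time"]
    # Convert the start-time difference into a whole number of
    # analog sample periods.
    return round(dt * analog_meta["sample_rate"])

# Example: the analog channel's metadata says it started 10 us
# after the digital channel, sampled at 1 MS/s.
analog = {"start_time": 1.000010, "sample_rate": 1_000_000}
digital = {"start_time": 1.000000}
offset = index_offset(analog, digital)  # 10-sample offset
```

In other words, you would trim (or pad) one stream by that offset before overlaying the two timelines, rather than assuming sample 0 of each file occurred at the same instant.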