I am using the CTAG (Beast) version for a project where I need to know the exact round-trip latency of the system, so I was measuring it as follows:
- Loopback from output to input using a mini jack to mini jack cable
- 48 kHz sample rate and 64 samples blocksize
- I load a 5-second exponential sine sweep from a wav file in the init function, then play it in the audio thread while recording the audio inputs. I store the recorded input signal in a wav file as part of the cleanup function.
I then compute the IR in MATLAB by spectral division (recorded input divided by played output), compensate for the 2*BLOCKSIZE software delay (I actually want to know the HW delay), and read the delay off the position of the Dirac delta.
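For reference, the analysis step can be sketched in Python/NumPy (this is only an illustration of the spectral-division deconvolution described above, not my actual MATLAB code; the function name and regularization constant are made up):

```python
import numpy as np

def measure_delay(played, recorded, blocksize=64):
    """Estimate the hardware round-trip delay in samples.

    `played` is the sweep sent to the output, `recorded` is the
    loopback capture; both are 1-D float arrays."""
    # Zero-pad so the circular deconvolution does not wrap around.
    n = len(played) + len(recorded)
    P = np.fft.rfft(played, n)
    R = np.fft.rfft(recorded, n)
    # Regularized spectral division: R * conj(P) / (|P|^2 + eps).
    # eps keeps near-zero bins (outside the sweep band) from blowing up.
    eps = 1e-6 * np.max(np.abs(P) ** 2)
    ir = np.fft.irfft(R * np.conj(P) / (np.abs(P) ** 2 + eps), n)
    # The Dirac peak of the IR sits at the total round-trip delay.
    total = int(np.argmax(np.abs(ir)))
    # Subtract the 2 * BLOCKSIZE software buffering delay.
    return total - 2 * blocksize
```

With a clean loopback this should return the hardware delay directly; in my case it usually gives 58 but sometimes 58+32 or 58+64, as described below.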
The problem is that most of the time I get consistent results (58 samples of delay), but sometimes I get 58+32 or 58+64 samples, i.e. an extra half or full blocksize of delay. This happens randomly, and the probability of getting the unexpected delay seems to be higher when the Scope is enabled.
I believe the code is correct; I have even tried using MLS sequences instead of the sweep, and the problem remains.
So my question is: is this somehow expected, and is there an explanation for it, or might I be doing something wrong?