Hi, I'm getting some strange results while measuring round-trip (RTT) latency between audio output and input.
Maybe I'm missing something, but I can't see what.
I have physically connected the audio input and output with a loopback cable, and I'm logging the sent and received samples into two text files with the logging function.
Those logs are then processed in MATLAB with cross-correlation (the xcorr function) to measure the delay.
The test signal is white noise band-pass filtered between 100 and 1000 Hz, about 10000 samples long at 44100 Hz.
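For reference, a stimulus like that can be sketched in Python/SciPy (the 4th-order Butterworth filter is my assumption; the post doesn't say which filter was used):

```python
import numpy as np
from scipy.signal import butter, lfilter

fs = 44100   # sample rate (Hz), as in the post
n = 10000    # stimulus length in samples, as in the post

# Band-pass 100-1000 Hz; a 4th-order Butterworth is assumed here
b, a = butter(4, [100, 1000], btype="band", fs=fs)

rng = np.random.default_rng(0)
stimulus = lfilter(b, a, rng.standard_normal(n))
stimulus /= np.max(np.abs(stimulus))   # normalise to full scale
```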
The strange results are:
- The input and output samples seem to be polarity-inverted in the log files, i.e. if the samples sent to the output are 0.5, 0, -0.1, then I'm receiving -0.5, 0, 0.1 (at least in the log files).
- The measured delay is constant at 73 samples, whatever block size I choose.
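A quick way to confirm the polarity inversion, independent of reading the log files, is to look at the sign of the cross-correlation peak: if the captured signal is an inverted (and delayed) copy of the stimulus, the correlation value at the best lag is negative. A minimal sketch with synthetic data standing in for the logs (the 73-sample delay below is just the figure reported above, used as a known truth):

```python
import numpy as np

rng = np.random.default_rng(1)
sent = rng.standard_normal(1000)

delay = 73  # the constant offset reported above, injected here on purpose
received = -np.concatenate([np.zeros(delay), sent])  # inverted + delayed copy

acor = np.correlate(received, sent, mode="full")
lags = np.arange(-len(sent) + 1, len(received))
i = np.argmax(np.abs(acor))

print(lags[i])      # → 73: received lags sent by 73 samples
print(acor[i] < 0)  # → True: the peak is negative, so polarity is inverted
```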
I could send you the project zipped if needed.
Any idea what's happening?
Here is the cross-correlation between the output and input signals with a block size of 2 samples at 44100 Hz:
Cross-correlation code:
inputsamples;  % loads myvarin, a vector with the samples logged from the audio input
outputsamples; % loads myvarout, a vector with the samples logged from the audio output
s1 = myvarout; % sent
s2 = myvarin;  % received
[acor,lag] = xcorr(s2,s1);
[~,I] = max(abs(acor));
timeDiff = lag(I) % positive lag: s2 (input) lags s1 (output) by timeDiff samples
figure;
subplot(311); plot(s1); title('s1');
subplot(312); plot(s2); title('s2');
subplot(313); plot(lag,acor);
title('Cross-correlation between s1 and s2')
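As a sanity check of the method itself, the same pipeline can be reproduced outside MATLAB with a known, injected delay; scipy.signal.correlation_lags plays the role of xcorr's lag vector, and the estimator should return exactly the injected value (the filter order is again my assumption):

```python
import numpy as np
from scipy.signal import butter, lfilter, correlate, correlation_lags

fs = 44100
rng = np.random.default_rng(2)

# Band-passed white noise, 100-1000 Hz, as described above (4th order assumed)
b, a = butter(4, [100, 1000], btype="band", fs=fs)
s1 = lfilter(b, a, rng.standard_normal(10000))    # "sent" signal

true_delay = 73
s2 = -np.concatenate([np.zeros(true_delay), s1])  # inverted + delayed copy

acor = correlate(s2, s1, mode="full")
lags = correlation_lags(len(s2), len(s1), mode="full")
est = lags[np.argmax(np.abs(acor))]
print(est)  # → 73: a positive lag means s2 lags s1, as in the MATLAB code
```

If this returns the injected delay, the analysis method is sound, which would point at the capture path itself (not the MATLAB step) as the source of the constant 73 samples.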