FatJak the one viewable on the bela_scope (which seems to be higher than 5V (??))
The Bela scope will show whatever you send to it, so its relation to real-world voltages (if any) is determined by what signals you connect to it.
One thing to keep in mind is that these are the nominal I/O voltage ranges for Pepper Rev1 and Rev2 when configured in the default way, which expects you to run it from ±12V, use the 5V regulator (J14 set to "12V with regulator"), use the on-board 10V voltage reference (i.e. J16 set to "10V to pots from rack"), and use the NE5532 for the audio outputs (i.e. J21 and J22 set to "Amplified audio w/ rack 12V"):
- 2 audio in (AC-coupled, 10V pk-to-pk, 44.1kHz, 16bit)
- 2 audio out (DC-coupled, -5 to +5V, 44.1kHz, 16bit)
- 8x CV in (DC-coupled, 0-10V with pots attenuating, 22.05kHz, 16bit, 12V safe). Optionally, up to 4x of the CV in sockets can be swapped for trigger ins, leaving the pots connected directly to Bela's analog inputs.
- 8x CV out (DC-coupled, 0-5V, 22.05kHz, 16bit)
This means that if you write a full-scale (0 to 1) output to a CV out (0V to 5V) and loop it back to a CV in, this will correspond to a half-scale reading on the analog input (0 to 0.5), because 5V is half of the 10V input range. On top of this you need to add the considerations about tolerance mentioned above. That's expected behaviour, although it may appear confusing. The rationale behind it is that 0V to 5V is the maximum voltage available at the output without additional amplification, and that it was easy enough to get a 0V to 10V input range with just passives. Btw, a different selection of resistors can give different (unipolar) input ranges, see here. Also consider that the 5V rail tends to be quite noisy, so the useful (less noisy) CV out range is often constrained to something like 0V to 4.8V.
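To make the arithmetic concrete, here's a minimal Pd sketch of that loopback test (it assumes a patch cable from CV out 0 to CV in 0, and something like a [metro] banging the [snapshot~]):
[sig~ 1]
|
[dac~ 3] // full scale: nominally 5V at CV out 0

[adc~ 3] // CV in 0, patched from CV out 0
|
[snapshot~] // bang it (e.g. from [metro 250]) to sample the signal
|
[print cvin] // prints about 0.5, not 1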
For calibration, I'd read the value with the pot at minimum (again, I'd be surprised if this is more than 0.01 away from 0), then read it again with the pot at maximum; then this should work:
[adc~ 3]
|
[-~ $1]
|
[*~ $2]
|
[clip~ 0 1]
where $1 = minReading
and $2 = 1 / (maxReading - minReading)
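For example, with hypothetical readings of minReading = 0.005 and maxReading = 0.98, the patch would become:
[adc~ 3]
|
[-~ 0.005] // $1 = minReading
|
[*~ 1.026] // $2 = 1 / (0.98 - 0.005) ≈ 1.026
|
[clip~ 0 1]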
FatJak The non-zero minimum is on all the channels, as measured with a voltmeter at CV OUT.
I am not sure I understand this. To test if there is any offset at CV OUT, you should have a patch like this:
[sig~ 0]
|
[dac~ 3]
and read the output at CV out 0. I expect that to be very close to 0V: according to the AD5668 datasheet https://www.analog.com/media/en/technical-documentation/data-sheets/AD5628_5648_5668.pdf, figure 28, it should be under 2mV. Then try again with [sig~ 0.96], which should give a nominal output of 0.96 × 5V = 4.8V.
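If you want to check the offset on all eight channels at once, a sketch along these lines should do (connect the [sig~ 0] outlet to every inlet of the [dac~]):
[sig~ 0]
|\ \ \ \ \ \ \
[dac~ 3 4 5 6 7 8 9 10] // CV out 0 to 7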
FatJak Potx: real measured min-max voltage on CV OUT / approx. displayed voltage with bela_scope / approx. ratio
It seems that you are conflating a few things here: are you testing with something like this?
[adc~ 3]
| \
| [dac~ 27] // scope
|
[dac~ 3] // out
As you can see there:
- the scope does not display a voltage, but rather a value between 0 and 1, so what is your "approx. displayed volt."?
- any errors in the ADC range will translate into an offset and scale error on the DAC, so the ratio you measure at CV OUT includes both the input and the output errors.
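One way to disentangle the two is to put the calibration from above between the [adc~] and the [dac~], so that the scope and the output both see the corrected 0 to 1 value. A sketch, again with hypothetical calibration values:
[adc~ 3]
|
[-~ 0.005] // minReading (hypothetical)
|
[*~ 1.026] // 1 / (maxReading - minReading) (hypothetical)
|
[clip~ 0 1]
| \
| [dac~ 27] // scope: calibrated 0 to 1 value
|
[dac~ 3] // out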