Hi, do you have any suggestions on how it would be possible to stream audio between many Bela users? How could I create multi-peer communication like this https://github.com/kolson25/WebRTC-Multi-Peer-Video-Audio ? Is there any way to connect Bela to the node server...? Thanks in advance for any comments.
Peer-to-peer audio streaming
gulesz to stream audio between many bela users
are these users on the same network?
this seems to be browser-based so it's a no-go.
What are you trying to achieve?
hi, thanks for the reply. I am trying to make a kind of shared listening experience in a small group of people. People should be in the same place, up to 20 metres apart. The aim is to build a system on Bela which could stream the audio from each person in the group. Optimally, to create a kind of Sonobus app on Bela (https://www.sonobus.net/sonobus_userguide.html). It is not necessary to be online. I was thinking whether it would be possible to use the node js server on Bela, create an app in JS and store it on Bela. Then, to establish one Bela as the main server to which the other Belas (Minis) can connect through their wifi dongles. Would that be possible?
It looks like sonobus uses jack as an audio backend. That would be:
Sonobus Client -> Jack -> Bela audio environment -> DAC
and the latency is probably going to be high enough that all the benefits of using the Bela audio environment are going to be lost. I am wondering if it would be a better option to simply use the Bela cape as an ALSA device. That is probably going to make things easier. Another alternative, which is bad from an engineering point of view, but may just work if you have good enough wireless dongles, would be to use Bela's UdpClient and UdpServer libraries and use them to send blocks of audio from one device to the others, perhaps sending to the broadcast address, i.e.: whatever.your.subnet.255, e.g.: 192.168.1.255. The receivers would have to buffer a reasonable amount of data before starting to play (again: latency!), but it may work well enough on a local network.
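To make the UDP idea a bit more concrete, here is a minimal sketch of the sending side using plain POSIX sockets (Bela's UdpClient and UdpServer wrap similar calls underneath); the port, block size and broadcast address are just placeholders:

```cpp
// Minimal sender sketch: broadcast one block of float samples over UDP.
// Port, block size and broadcast address are placeholders, not Bela defaults.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <vector>

int main()
{
	const char* kBroadcastAddr = "192.168.1.255"; // whatever.your.subnet.255
	const int kPort = 9999;
	const size_t kBlockFrames = 256; // one UDP packet per block of samples

	int sock = socket(AF_INET, SOCK_DGRAM, 0);
	if(sock < 0)
		return 1;
	// broadcasting has to be enabled explicitly on the socket
	int yes = 1;
	setsockopt(sock, SOL_SOCKET, SO_BROADCAST, &yes, sizeof(yes));

	sockaddr_in dest = {};
	dest.sin_family = AF_INET;
	dest.sin_port = htons(kPort);
	inet_pton(AF_INET, kBroadcastAddr, &dest.sin_addr);

	std::vector<float> block(kBlockFrames, 0.0f); // in a real program, filled by the audio callback
	// here a single packet is sent; in practice this would run in a loop fed by the audio thread
	sendto(sock, block.data(), block.size() * sizeof(float), 0,
	       (sockaddr*)&dest, sizeof(dest));

	close(sock);
	return 0;
}
```

The receiving side would bind a UDP socket to the same port, recvfrom() into a ring buffer, and only start playback once a few blocks have accumulated.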
enough wireless dongles would be to use Bela's UdpClient and UdpServer libraries and use them to send blocks of audio from one device to the others, perhaps sending to the broadcast address, i.e.: whatever.your.subnet.255, e.g.: 192.168.1.255.
I think this would be a fine solution. Could you please explain or elaborate on how to do it?
I am wondering if it would be a better option to simply use the Bela cape as an ALSA device..... so, to create a kind of ALSA mixer for the remote peers?
gulesz ..... so, to create a kind of ALSA mixer for the remote peers?
well you can load /lib/firmware/BB-BONE-AUDI-02-00A0.dtbo instead of /lib/firmware/BB-BELA-00A1.dtbo and the Bela cape will show up as an ALSA device. From there on it should be easier to run it with Jack following the sonobus instructions.
thanks, but could you outline the steps for how I can run Jack and sonobus on Bela, please? Consider me a total beginner on Linux. thanks
I think the only part where I can be of help here is by telling you how to have the Bela cape show up as an ALSA device, but for the rest of the setup it is for you to follow the instructions made available by sonobus.
Now, the "Bela as ALSA" issue. I rarely do that, so it periodically breaks. After about one hour of work on this, I got it to work for Bela, but it is still not working for BelaMini. Before going through this in more detail, which one do you have at the moment? If you have two big Belas around, it may be easier for me to tell you how to do it there; then you can try to get sonobus to work on those two and evaluate whether that works for your purposes, at which point I'd look into fixing this for BelaMini, too.
I have Bela and Ctag...
OK, so for Bela the most straightforward way is to flash a BeagleBoard image from here, then boot from it, ssh into it with ssh debian@beaglebone.local (password temppwd), then modify the file /boot/uEnv.txt to add this line:
uboot_overlay_addr5=/lib/firmware/BB-BONE-AUDI-02-00A0.dtbo
Save and reboot. aplay -L should show the device!
thanks, I am going to try that. But before that, I am curious whether the UdpServer and UdpClient approach you mentioned would be an easier and lower-latency solution (in the case it runs on the local server). Then, I imagine that each member of the group has his/her own Bela (Mini) with its own wifi dongle and everybody is connected to a router (TL-WR802N).
gulesz easier and lower latency solution with
easier to try it out probably, not necessarily to make it fully working/reliable. Lower latency possibly (but again, possibly unreliable), but expect 20 ms or more for passable reliability.
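To put some rough numbers on that (the block size here is just an example): at 44.1 kHz a 128-frame block lasts about 2.9 ms, so buffering 6-8 blocks on the receiving side to ride out wifi jitter already adds roughly 17-23 ms, before counting the sender's block and the network itself.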
the priority is to have latency < 20 ms. Otherwise, I can connect like this: mics > recorder > phone > sonobus. I thought it might be possible to use the node server on Bela, upload the JavaScript code for multi-peer communication, and then use the Bela as a local server. Simply, to get rid of sonobus and any remote servers.
the priority is to have latency <20 ms.
Honestly I think you won't be able to achieve that reliably. There are dedicated wireless audio modules that struggle to achieve 10 ms or 15 ms, and these devices do literally just that!
In principle one could connect the TI CC8530 to Bela via I2S and use that for communication across devices (though I am not sure whether it can broadcast to several devices), but that would require spinning a PCB design just for it...
If the latency requirement is relaxed (e.g.: > 50ms), then I think you can start thinking about using the Linux wifi stack for it.
Hi @giuliomoro and @gulesz
This is what I need too! Slightly different use case but essentially the same.
Is it possible to access the raw audio data in any way from within the underlying linux?
Am I correct in thinking that if I go down the BeagleBoard/ALSA route then I won't have access to the Bela IDE, or can I use the IDE but ALL audio will be high latency? Also, which BeagleBoard image should I flash it with, and how do I revert back to the Bela image if it all goes wrong?
I've been looking around the UdpServer/UdpClient route but can't find any example anywhere to pipe audio with it.
Could you expand the setup a bit there and how I can send audio data out?
Is the audio stream available on the i2c bus so that I can tap it off externally?
I also thought maybe I could write to some shared memory somewhere and read it externally?
I'm loving the Bela so far, it's exactly what I needed!
simplecut Is it possible to access the raw audio data in any way from within the underlying linux?
not really.
simplecut Am I correct in thinking that if I go down the BeagleBoard/ALSA route then I wont have access to the Bela IDE
You could have access to it, but none of the programs provided by it will run. Still, if you want to use it for its editor capabilities, that'll be fine, but probably the BeagleBoard-provided IDE is a better choice as it comes with a different set of examples which will actually work (though I am not sure they have any audio ones).
simplecut ALL audio will be high latency?
yes
simplecut Also, which beagleboard image should I flash it with
the most recent will do
simplecut how do I revert back to the bela image if it all goes wrong?
the easiest route is to keep the BeagleBoard image on the SD card to try things out. If you don't like it, remove the SD card and it will go back to the Bela image.
simplecut I've been looking around the UdpServer/UdpClient route but can't find any example anywhere to pipe audio with it.
is this to send audio data elsewhere on the board or to a different device on a local network? In the former case, this approach is probably overkill and a Pipe object to a virtual ALSA soundcard is probably a better approach.
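For the on-board case, the pattern would look roughly like the sketch below: the audio callback pushes each block into a Pipe and a non-realtime auxiliary task drains it. Treat this as a sketch only: the Pipe method names (setup(), writeRt(), readNonRt()) are from memory and should be checked against libraries/Pipe/Pipe.h, and whatever the non-realtime side does with the data (write to an ALSA device, a file, a socket) is up to you.

```cpp
// Rough sketch only: check libraries/Pipe/Pipe.h for the exact Pipe API.
#include <Bela.h>
#include <libraries/Pipe/Pipe.h>
#include <vector>

Pipe gPipe;                     // carries audio blocks out of the audio thread
AuxiliaryTask gDrainTask;       // non-realtime consumer
std::vector<float> gRtBlock;    // filled in render()
std::vector<float> gNonRtBlock; // drained in the auxiliary task

void drain(void*)
{
	// non-realtime side: pull one block (the task is scheduled once per render() call)
	// and hand it on, e.g. to an ALSA device, a file or a socket
	if(gPipe.readNonRt(gNonRtBlock.data(), gNonRtBlock.size()) > 0)
	{
		// do something with gNonRtBlock here
	}
}

bool setup(BelaContext* context, void* userData)
{
	gRtBlock.resize(context->audioFrames);
	gNonRtBlock.resize(context->audioFrames);
	gPipe.setup("audio-tap"); // name is arbitrary
	gDrainTask = Bela_createAuxiliaryTask(drain, 50, "audio-tap-drain");
	return true;
}

void render(BelaContext* context, void* userData)
{
	// realtime side: copy one channel of input and push it into the pipe
	for(unsigned int n = 0; n < context->audioFrames; ++n)
		gRtBlock[n] = audioRead(context, n, 0);
	gPipe.writeRt(gRtBlock.data(), gRtBlock.size());
	Bela_scheduleAuxiliaryTask(gDrainTask);
}

void cleanup(BelaContext* context, void* userData) {}
```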
simplecut can't find any example anywhere to pipe audio with it.
There's not one. This one streams uncompressed audio via TCP. It uses big buffers (i.e.: high latency) to avoid overruns, but it was designed for working on a 16-channel board; buffer sizes can probably be decreased for your application. Also remove the fileOut object as that also writes to disk.
simplecut Is the audio stream available on the i2c bus so that I can tap it off externally?
There is an I2S (not I2C) stream that you could tap on.
simplecut I also thought maybe I could write to some shared memory somewhere and read it externally?
externally from where? Yes you could in principle use shared memory to communicate with a different process, but you'll have to do it with the very same tight timing requirements that the Bela code uses, or you won't read the memory fast enough and it may be partially overwritten by the time you are done reading.
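As a very rough sketch of that pattern (the segment name, block size and sequence-counter scheme are all illustrative, and this is the writer side only): the writer bumps a counter before and after copying each block, so a reader that sees an odd counter, or a counter that changed while it was copying, knows the block was being overwritten and discards it.

```cpp
// Writer-side sketch: publish audio blocks into POSIX shared memory.
// Segment name, block size and the sequence-counter layout are illustrative.
#include <atomic>
#include <cstdint>
#include <cstring>
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

constexpr size_t kBlockFrames = 256;

struct SharedBlock {
	std::atomic<uint64_t> sequence; // odd while the writer is mid-update
	float samples[kBlockFrames];
};

int main()
{
	int fd = shm_open("/bela_audio_tap", O_CREAT | O_RDWR, 0666);
	if(fd < 0)
		return 1;
	ftruncate(fd, sizeof(SharedBlock));
	auto* shared = static_cast<SharedBlock*>(
	    mmap(nullptr, sizeof(SharedBlock), PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0));
	if(shared == MAP_FAILED)
		return 1;

	float block[kBlockFrames] = {}; // in practice, filled by the audio callback
	// publish one block: mark it "in progress", copy, then mark it complete
	shared->sequence.fetch_add(1, std::memory_order_release); // now odd
	std::memcpy(shared->samples, block, sizeof(block));
	shared->sequence.fetch_add(1, std::memory_order_release); // now even

	munmap(shared, sizeof(SharedBlock));
	close(fd);
	return 0;
}
```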
What is your application exactly?