Well, in principle only this was needed:
However, installing torch fails as there is no build available for armv7, so I tried to install it from the
Bullseye repos with apt-get:
```
apt-get install libtorch-dev python3-torch libprotobuf-dev
```
and that worked (though it installed 500MB of software ...).
Then the configuration stage would fail because it can't find the silly path:
```
CMake Error at /usr/lib/arm-linux-gnueabihf/cmake/Caffe2/Caffe2Targets.cmake:116 (message):
  The imported target "c10" references the file
  but this file does not exist. Possible reasons include:
  * The file was deleted, renamed, or moved to another location.
  * An install or uninstall procedure did not complete successfully.
  * The installation package was faulty and contained
  but not all the files it references.
Call Stack (most recent call first):
```
This seems to fix it, but don't ask me why it is needed:
```
ln -s /usr/lib/arm-linux-gnueabihf/ /usr/lib/lib
```
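My guess (an assumption, not verified against the Debian package) is that `Caffe2Targets.cmake` references libraries relative to an import prefix as `<prefix>/lib/libc10.so`, while Debian's multiarch layout actually installs them under `/usr/lib/arm-linux-gnueabihf/`; the symlink simply makes the expected path resolve. A minimal Python sketch of that path resolution, using a throwaway directory in place of `/usr/lib`:

```python
import os
import tempfile

# Mimic Debian's multiarch install: the library lives under a
# triplet-named directory, not under <prefix>/lib.
root = tempfile.mkdtemp()
multiarch = os.path.join(root, "arm-linux-gnueabihf")
os.makedirs(multiarch)
open(os.path.join(multiarch, "libc10.so"), "w").close()

# The path CMake's imported target (presumably) looks for:
expected = os.path.join(root, "lib", "libc10.so")
print(os.path.exists(expected))  # False: <root>/lib does not exist yet

# The equivalent of: ln -s /usr/lib/arm-linux-gnueabihf/ /usr/lib/lib
os.symlink(multiarch, os.path.join(root, "lib"))
print(os.path.exists(expected))  # True: resolves through the symlink
```

So the symlink doesn't move anything; it just aliases the multiarch directory under the name the generated CMake file expects.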
Anyhow, now that it's more-or-less properly configured, it fails when building:
```
[ 20%] Building CXX object backend/CMakeFiles/backend.dir/backend.cpp.o
/root/nn_tilde/src/backend/backend.cpp: In member function ‘void Backend::perform(std::vector<float*>, std::vector<float*>, int, std::string, int)’:
/root/nn_tilde/src/backend/backend.cpp:22:8: error: ‘InferenceMode’ is not a member of ‘c10’
   22 |   c10::InferenceMode guard;
```
As c10 is part of PyTorch, this may mean we are building against an incompatible version of PyTorch. If so, you'd probably need to find the appropriate version of PyTorch and build it from source.
I'll stop the debugging here; feel free to go on. But ultimately I am not sure how much performance you can get out of this when running on Bela, given that you won't be able to use the GPU for PyTorch and everything will have to run on the CPU. So I am not sure this is worth anyone's time.
If you are not focused specifically on [nn~] but more generally on deep learning for Bela, check out https://github.com/rodrigodzf/DeepLearningForBela