I have procured a Raspberry Pi 5 and, in pursuit of a superior form factor/packaging for my project, have purchased a PCIe to M.2 board (the Pineboards Hat AI drive) plus a Google Coral M.2 TPU.
I need to get the Coral M.2 TPU working, and it requires some kernel modules (Google's gasket and apex modules).
My question/request is whether those kernel modules could be baked into the Raspberry Pi 5 balenaOS image, or whether they already are and there's just some flag I need to switch on - I imagine a number of others will also be hoping to use this combination of hardware, so it would be great if it were more plug and play.
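(For context on what "baked in" would replace: on a stock Raspberry Pi OS install the Coral docs pull these drivers from Google's apt repo and build them via DKMS - roughly the steps below, paraphrased from memory, so treat them as a sketch rather than the canonical instructions. That DKMS step is exactly the part that doesn't translate to balenaOS's read-only host.)

```
# Roughly what the Coral M.2 getting-started docs do on Raspberry Pi OS
# (paraphrased from memory - a sketch, not the canonical steps):
echo "deb https://packages.cloud.google.com/apt coral-edgetpu-stable main" \
  | sudo tee /etc/apt/sources.list.d/coral-edgetpu.list
curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -
sudo apt-get update
sudo apt-get install gasket-dkms libedgetpu1-std   # builds and loads gasket + apex
```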
Certainly, I'm developing a computer-vision-at-the-edge project. It needs to be reasonably small, low-power and inexpensive, and it needs to be at the edge because streaming >1080p video over a SIM is not going to work.
Up to now it has used the Coral USB accelerator, which has its fair share of challenges around up-to-date libraries, but mercifully the actual hardware side is quite plug and play, needing only some udev rules.
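For reference, the udev rules in question are just the standard Coral USB ones - I'm writing these from memory, so double-check them against the rules shipped with the official Edge TPU runtime package. In a balenalib base image they get picked up when the container runs with UDEV=1:

```
# Sketch (from memory - verify against the rules in the official libedgetpu
# package). 1a6e is the unflashed device's USB vendor ID, 18d1 is Google's.
cat > /etc/udev/rules.d/99-edgetpu-accelerator.rules <<'EOF'
SUBSYSTEM=="usb", ATTRS{idVendor}=="1a6e", GROUP="plugdev"
SUBSYSTEM=="usb", ATTRS{idVendor}=="18d1", GROUP="plugdev"
EOF
udevadm control --reload-rules && udevadm trigger
```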
The M.2 option is cheaper, more performant, and offers better packaging for the device.
Just to confuse matters:
I am also likely to try the Raspberry Pi AI Kit once it's in stock, and this might also become quite popular: https://www.raspberrypi.com/documentation/accessories/ai-kit.html - I haven't looked into it in depth, but I suspect it also involves some kernel modules, which become non-trivial to install from Docker alone.
I've so far adapted the nvidia example in balena-os/kernel-module-build; the gist is here:
I'm baking this into the application image itself: the modules are loaded in an entrypoint script before the Python application starts, and the script exits if a module still needs to be loaded, so that the container effectively reboots - roughly along the lines of the sketch below.
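This is a simplified sketch rather than the exact script from the gist; the module paths and the main.py filename are placeholders:

```
#!/bin/sh
# Entrypoint sketch: load the out-of-tree gasket/apex modules (built as per
# balena-os/kernel-module-build) before starting the app. Paths are placeholders.
set -e

if [ ! -e /dev/apex_0 ]; then
    # /dev/apex_0 only appears once the apex driver has claimed the PCIe TPU
    insmod /opt/modules/gasket.ko || true
    insmod /opt/modules/apex.ko || true
fi

if [ ! -e /dev/apex_0 ]; then
    echo "Edge TPU device node still missing - exiting so the container restarts"
    exit 1
fi

exec python3 main.py
```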
In the Python application the delegate is then loaded like so:

# Attach the Edge TPU delegate to the TFLite interpreter
from tflite_runtime.interpreter import Interpreter, load_delegate

delegate = load_delegate('libedgetpu.so.1.0')
self.interpreter = Interpreter(model_path=self.model_path, experimental_delegates=[delegate])

…which fails with the following error:
  File "/usr/local/lib/python3.10/site-packages/tflite_runtime/interpreter.py", line 168, in load_delegate
    raise ValueError('Failed to load delegate from {}\n{}'.format(
ValueError: Failed to load delegate from libedgetpu.so.1.0
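For anyone hitting the same ValueError, these are roughly the checks I'd run from a privileged container to tell a missing runtime library apart from a missing device/driver - standard commands, but treat the whole thing as a sketch (lspci needs pciutils installed in the image):

```
# Is the TPU visible on the PCIe bus at all?
lspci | grep -i -e coral -e unichip

# Did the gasket/apex modules load and create the device node?
lsmod | grep -e gasket -e apex
ls -l /dev/apex_0

# Is the Edge TPU runtime installed where the dynamic loader can find it?
ldconfig -p | grep edgetpu
find / -name 'libedgetpu.so*' 2>/dev/null
```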
Just circling back to this: I've parked it as a slight distraction from the main task of developing the application, and I can continue on USB for now, but I'll summarise my assessment (though I don't have much hardware/kernel knowledge, so I can't claim to really know what's going on under the hood). Quoting the relevant step from the Pineboards setup tutorial:
Step 2: Update the Kernel
Kernel version 6.6.30 or higher is needed for the Pineboards Hat AI overlay. This version is only available via rpi-update right now; please use this tutorial to install the kernel. To check your kernel version, use this command:
uname -a
Expected output:
Linux coralpi 6.6.30-v8+ #1761 SMP PREEMPT Thu May 2 16:54:52 BST 2024 aarch64 GNU/Linux
And that’s the only step that I don’t think is in place.
So it looks like support for the Coral Edge TPU M.2 on this board requires dtoverlay=pineboards-hat-ai, and that overlay is only included in the Raspberry Pi kernel from 6.6.30 onwards.
As at the time of writing, balenaOS 5.3.22 reports:
Linux ***** 6.6.22-v8 #1 SMP PREEMPT Tue Mar 19 17:41:59 UTC 2024 aarch64 aarch64 aarch64 GNU/Linux
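For completeness: once a balenaOS release does ship a kernel at or above 6.6.30, my understanding (an assumption, not something I've tested) is that the overlay itself would be enabled through device configuration rather than by editing config.txt on the host, something like:

```
# Dashboard: Device Configuration -> "Define DT overlays", or via the CLI.
# The overlay list below is an assumption - keep whatever overlays the device
# already uses and append the Pineboards one; <device-uuid> is a placeholder.
balena env add BALENA_HOST_CONFIG_dtoverlay '"vc4-kms-v3d","pineboards-hat-ai"' --device <device-uuid>

# ...which should end up as the equivalent of this config.txt line:
#   dtoverlay=pineboards-hat-ai
```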
Meanwhile I have the Hailo-8L variant of the same bundle on order (AI Bundle (Hailo 8L) – Pineboards), and that one looks not to rely on that overlay - though the docs DO assume you're on Raspberry Pi OS and have done a full upgrade, which might hide some other special sauce yet to be discovered.
That being said, I also have an RPi AI Kit with the Hailo-8L on order and hope to find some personal time to ensure it gets correctly integrated into balenaOS for the Pi 5, so hopefully in the next month or so we will have this support. Let's see!