I am currently working on a Dockerfile that can serve as a base image for balena-based Python projects running on the Jetson Nano board. The image supports TensorRT so that we can do hardware-accelerated AI inference at the edge. The latest Dockerfile can be found here: https://gitlab.com/p.d.boef/tensorrt-balena/blob/master/Dockerfile.tensorrt. Instructions for building the image can be found in the root of the repository.
I am sharing this Dockerfile because I think it may be useful for other users with a similar use case, and to get advice and feedback on my work, especially on reducing the final image size, which currently sits at around 3.5 GB.
Note that building the image takes a long time (around an hour), as all the necessary Python packages have to be built from source on an ARM platform such as the Jetson Nano.
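To give a rough idea of the approach without opening the repository, a base image along these lines might look like the sketch below. This is illustrative only and is not the actual Dockerfile from the link above: the base image tag and the TensorRT package names are assumptions and depend on the JetPack/L4T release you target.

```dockerfile
# Illustrative sketch only -- not the Dockerfile from the repository.
# Assumes a balenalib Jetson Nano Ubuntu base image and that NVIDIA's
# L4T apt repository is configured for the matching JetPack release.
FROM balenalib/jetson-nano-ubuntu:bionic

# Install Python plus TensorRT and its Python bindings from the L4T
# repository (exact package names vary per JetPack/L4T release), then
# clean the apt cache to keep the layer small.
RUN apt-get update && apt-get install -y --no-install-recommends \
        python3 python3-pip \
        tensorrt python3-libnvinfer \
    && rm -rf /var/lib/apt/lists/*

# Start the application; path is a placeholder for your own entry point.
CMD ["python3", "main.py"]
```

Cleaning the apt lists in the same RUN layer as the install is one of the simplest ways to keep layers, and therefore the final image, smaller.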
I’ll continue updating the Dockerfile; any suggestions and comments are welcome!