Jetson Nano image with TensorRT for Python projects

I am currently working on a Dockerfile that I can use as a base image for a balena-based Python project on the Jetson Nano board. The image should support TensorRT so that we can do hardware-accelerated AI inference at the edge. The latest Dockerfile can be found here: https://gitlab.com/p.d.boef/tensorrt-balena/blob/master/Dockerfile.tensorrt. Instructions for building the image are in the root of the repository.

I am sharing this Dockerfile because I think it may be useful for other users with a similar use case, and to potentially get some advice and feedback on my work, especially with respect to reducing the final image size, which is currently around 3.5 GB.

Note that building the image will take a long time (around an hour), as all the necessary Python packages have to be built from scratch on an ARM platform such as the Jetson Nano.

I’ll continue updating the Dockerfile; any suggestions and comments are welcome!

I am by no means an expert at Dockerfiles, but I had some thoughts.

Adding
&& rm -rf /var/lib/apt/lists/* && apt-get clean
at the end of the RUN commands where you use apt might save you some space. I don’t use Ubuntu, though, so I am not sure it applies.
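To make that concrete, here is a minimal sketch (the package names are just placeholders, not what your Dockerfile actually installs) of an apt install step with the cleanup chained into the same RUN layer, so the cache never ends up in the image:

```dockerfile
# Hypothetical example: packages are placeholders; the pattern is what matters,
# i.e. install and clean up inside the same RUN layer.
RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential \
        python3-dev \
    && rm -rf /var/lib/apt/lists/* \
    && apt-get clean
```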

To keep the build time and layer count down, I would combine the separate apt-get commands unless things really need to be installed in a particular order. In addition, the block where you remove things should be added to the RUN command it is most relevant to, i.e. where those unneeded files are created. If the files come with the base image, then put that cleanup at the top.
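Roughly what I mean, as a sketch with made-up package names:

```dockerfile
# Before: each command gets its own layer.
#   RUN apt-get update
#   RUN apt-get install -y curl
#   RUN apt-get install -y git
# After: a single layer, with the apt lists removed in the same RUN
# that creates them.
RUN apt-get update \
    && apt-get install -y --no-install-recommends curl git \
    && rm -rf /var/lib/apt/lists/* \
    && apt-get clean
```

For files that already ship with the base image, the same kind of rm step would instead go in its own RUN near the top of the Dockerfile.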

Your build is farrrrr more complicated than mine, so I could be completely wrong. I really enjoy seeing how far you can take the balena build engine, though; it is a great example. Thanks for sharing.

-Thomas

@tacLog thanks for your suggestions! I have updated the Dockerfile by grouping some of the RUN commands and clearing the apt cache, and the image size has been reduced to 3.1 GB. It can be found on the Docker Hub as well: https://cloud.docker.com/repository/registry-1.docker.io/pdboef/nano-tensorrt/tags.
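For anyone who wants to try it, using it as a base looks roughly like this (a sketch only: the tag, file names, and entry point below are placeholders, so check the Docker Hub page above for the actual tags):

```dockerfile
# Hypothetical sketch of a downstream balena Python project that builds on
# the published image. Tag, files, and entry point are placeholders.
FROM pdboef/nano-tensorrt:latest

WORKDIR /app
COPY requirements.txt .
RUN pip3 install --no-cache-dir -r requirements.txt

COPY . .
CMD ["python3", "main.py"]
```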

Furthermore, for the largest Python packages, such as numpy, pre-built ARM wheels are downloaded to save about 40 minutes of compile time on the device itself.
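As a sketch of what that looks like in the Dockerfile (the wheel URL below is a placeholder, not the actual location I pull from):

```dockerfile
# Hypothetical example: install numpy from a pre-built aarch64 wheel instead
# of compiling it from source on the Nano. The URL and version are placeholders.
RUN pip3 install --no-cache-dir \
        https://example.com/wheels/numpy-1.16.4-cp36-cp36m-linux_aarch64.whl
```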