Slow and failing updates on a device with a large Docker container (> 10 GB)


I want to deploy a Docker container above 10 GB, which I’ve noticed is impossible to do. I use a lot of computer vision models, which take up most of the space. Most of the time the update just keeps hanging. One time it got above 80% and then gave an auth error.

The SD card is brand new (128 GB). I reflashed it and also tried a different card, so that is not the issue.

Hi Martijn, let’s see what we can do to get you up and running!

First and foremost, yes, containers that large are more susceptible to download timeouts, broken pipes, SD card write issues, and more. We are currently exploring some ways to improve the download streaming of the container to the SD card, so that may get better in the future. In the meantime, what type of device is this? A Jetson of some sort?

Next, how is the device connected? Is it WiFi? If so, any chance you can try ethernet?

Another option is to try to reduce the container size by using a multi-stage Dockerfile with separate “build” and “run” stages, as described here: Optimize your builds - Balena Documentation
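As a rough sketch of the idea (the base image, package names, and paths below are placeholders, not from this thread — assuming a Python-based vision app), the Dockerfile might be split like this so that compilers and build caches never end up in the final image:

```dockerfile
# Build stage: install dependencies into a separate directory.
# Anything created here but not copied below stays out of the final image.
FROM python:3.10-slim AS build
RUN pip install --no-cache-dir --target=/deps opencv-python-headless numpy

# Run stage: start from a clean base and copy only what is needed at runtime.
FROM python:3.10-slim
COPY --from=build /deps /usr/local/lib/python3.10/site-packages
COPY app/ /app/
CMD ["python", "/app/main.py"]
```

Only the layers of the final stage are shipped to the device, so build-time tooling and pip caches from the first stage add nothing to the download.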

Finally, although the 128 GB capacity is great, make sure the read/write speed of the card is adequate. The faster, the better. Faster cards can be pricier, but they can make a huge difference. Hope that helps!
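If you want to sanity-check the card, a rough write-speed test with `dd` looks something like this (a sketch; `TARGET` is a placeholder — run it on the device with `TARGET` pointing at a directory on the SD card):

```shell
# Rough sequential write-speed check. conv=fsync forces the data to be
# flushed to the card so the reported rate isn't just the RAM cache.
TARGET="${TARGET:-/tmp}"
dd if=/dev/zero of="$TARGET/speedtest.bin" bs=1M count=64 conv=fsync
rm -f "$TARGET/speedtest.bin"
```

`dd` prints the achieved throughput when it finishes; if it is in the low single-digit MB/s range, the card itself is likely the bottleneck for large image writes.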

@neonlink Please do share with us the model of hardware you’re running; our engineers have been working on some improvements and bug fixes relating specifically to large image updates. Some are tied to meta-balena and some to the Jetson products, so there may be some GH Issues for me to share with you that could be of help. :slight_smile:

Also, can you tell us what process is actually hanging? Is it the download of your image? An update you’re doing? If an update, is it to the application or the hostOS? Any other details you have would be quite helpful. Thanks!