First of all, as a new user, let me say I am very grateful for the work the team has done. The offering perfectly meets my requirements and will enable my company to deploy awesome devices to our customers.
I have an architecture question regarding multi-container setup on our devices (raspberry pi 4).
Our devices need to run multiple services at the same time. One of them is a data acquisition pipeline that communicates with other machines, pulls data out of them, and uploads everything into our database. No problem here: I was able to set up each device with environment and service variables so that they all run the same version of the app while each one communicates with a different machine.
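To give an idea, the per-device configuration is nothing more than variables read at startup. A minimal sketch, assuming made-up names (`TARGET_HOST`, `DB_URL` and the pipeline binary are illustrative, not our real ones):

```sh
#!/bin/sh
# Illustrative only: TARGET_HOST is a per-device variable and DB_URL a
# service variable set in the balenaCloud dashboard; the names are made up.
: "${TARGET_HOST:?set TARGET_HOST as a device variable}"
: "${DB_URL:?set DB_URL as a service variable}"

echo "Acquiring data from ${TARGET_HOST}, uploading to ${DB_URL}"
exec /usr/src/app/run-pipeline --source "${TARGET_HOST}" --sink "${DB_URL}"
```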
Now I have a second service that needs to run, and it’s for inference of a predictive machine learning model on the edge. The tricky part is that each device needs to run a different model. The inference pipeline container is the same for all devices, but each needs to use different config files (yaml) and artifacts (the compiled models) because each customer has a different model trained for them.
The model and config files will also need to be updated from time to time, ideally with minimal manual work.
How would I go about doing this with balenaCloud? Would I need a separate Docker registry to hold customer-specific images and then configure the docker-compose file to pull an image based on an environment variable? Is it possible to do everything inside balenaCloud and still get automatic builds for aarch64, etc.?
I could imagine having a separate project for each customer, with no devices added, just to build images containing the data files; maybe I could then pull these dynamically from the main project that has all the devices… not sure if that’s possible.
Hi there,
Unfortunately we don’t have any process that allows you to have automatic builds; you will always need to do a balena push. Would it instead be possible to set a device ENV that points to the model via a link? I’m thinking of something like a script that fetches the right model depending on the device ENV (which could be a GitHub repo link or similar), as in the sketch below. Could this help you?
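A very rough sketch of what I mean, assuming hypothetical device ENVs `MODEL_URL` and `CONFIG_URL` and a `/data` volume for the artifacts:

```sh
#!/bin/sh
# Sketch only: MODEL_URL and CONFIG_URL are hypothetical per-device ENVs
# pointing at the customer-specific artifacts; /data is a persistent volume.
set -e

: "${MODEL_URL:?MODEL_URL not set for this device}"
: "${CONFIG_URL:?CONFIG_URL not set for this device}"

mkdir -p /data/model
curl -fsSL "${MODEL_URL}"  -o /data/model/model.bin
curl -fsSL "${CONFIG_URL}" -o /data/model/config.yaml

# Hand over to the shared inference service once the artifacts are in place
exec /usr/src/app/start-inference.sh --config /data/model/config.yaml
```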
Not a problem to do a balena push, that would be acceptable.
Regarding your suggestion: yes, a separate script that downloads the files could be a solution, but I would need to arrange a separate private repo and a custom auto-update system (to tell the service a new model is available, for instance). This would involve generating access tokens for each device to access the repo, potentially installing git to fetch new releases, and ideally having a repo per customer (see the sketch below). That’s a lot of work, I think!
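Just to show the moving parts I’d have to build and maintain myself, a sketch of the update check against a private GitHub repo. `MODEL_REPO` (one repo per customer), `GITHUB_TOKEN` (a per-device access token) and the version file are all hypothetical:

```sh
#!/bin/sh
# Hypothetical auto-update check: compare the latest GitHub release tag of the
# customer repo against the version we already have on disk.
set -e

latest=$(curl -fsSL \
  -H "Authorization: token ${GITHUB_TOKEN}" \
  "https://api.github.com/repos/${MODEL_REPO}/releases/latest")

# Crude JSON parsing, good enough for a sketch
tag=$(echo "$latest" | grep -m1 '"tag_name"' | cut -d '"' -f 4)
current=$(cat /data/model/VERSION 2>/dev/null || true)

if [ "$tag" != "$current" ]; then
  echo "New model release ${tag} available, downloading…"
  # ...fetch the release assets with the same token, then restart inference
  echo "$tag" > /data/model/VERSION
fi
```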
Maybe a separate private Docker Hub repo would be an option, with something like Watchtower to serve and update containers holding the device-specific data. Authentication could perhaps be handled with --registry-secrets? Although that only works at build time, so Watchtower would fail when pulling on the device.
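For reference, this is the build-time flow I mean; as far as I understand the docs, the credentials only reach the balena builders, never the device. The Application name and credentials below are placeholders, and the secrets file format should be double-checked against the documentation:

```sh
# Build-time only: the builder authenticates against the private registry,
# but the device itself never receives these credentials.
# "MyApp" and the Docker Hub credentials are placeholders.
cat > registry-secrets.yml <<'EOF'
'https://idx.docker.io/v1/':
  username: my-dockerhub-user
  password: my-dockerhub-password
EOF

balena push MyApp --registry-secrets registry-secrets.yml
```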
Is there a way to pull images from the balena Docker registry manually? I.e. I would have a URL pointing to the correct image in the balena registry (registry2.balena-cloud.com/v2/...) set as a device ENV, and use that variable inside the docker-compose file (as the image property). The image could ideally be produced by a separate Application. I’m not sure how I would identify the latest image for a given Application, maybe with some sort of tagging.
Basically, if I boil it down, what I would love is the ability to run multiple Applications on the same device: one common Application that runs on all devices, plus a custom Application for any given particular device. That would let me leverage all the power of cross-building and automatic updates that balena provides.
The ability to run multiple applications on a device is on our road map but it’s not something you can do right now.
As a workaround for the time being, I would suggest either creating a release per model and pinning each device to the corresponding release, or creating an application per model.
Thanks for the hint. Release pinning would also complicate the update process, since each device would have its own release and would need to be pinned to the correct one. In the end I think I will go with the earlier suggestion of a download script: a separate git repo with one branch per device, pulled by a script according to an ENV variable (roughly as sketched below). I can manage credentials with build-time secrets and will probably use the same access token for all devices as a trade-off to limit complexity.
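In case it helps anyone else, roughly what the entrypoint of the inference container will do. The repo URL, `MODEL_BRANCH` and `MODELS_TOKEN` are placeholders for my setup, not anything balena-specific:

```sh
#!/bin/sh
# Rough plan: MODEL_BRANCH is a per-device ENV naming the customer branch,
# MODELS_TOKEN is the shared access token (injected as a build-time secret or
# ENV), and the repo URL is a placeholder for our private models repository.
set -e

REPO_URL="https://${MODELS_TOKEN}@github.com/my-org/edge-models.git"

if [ -d /data/models/.git ]; then
  # Already cloned: just move to the latest model for this device's branch
  git -C /data/models fetch --depth 1 origin "${MODEL_BRANCH}"
  git -C /data/models checkout -f FETCH_HEAD
else
  git clone --depth 1 --branch "${MODEL_BRANCH}" "${REPO_URL}" /data/models
fi

exec /usr/src/app/start-inference.sh --config /data/models/config.yaml
```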