When pushing a repository to balena, it starts building from docker-compose.yml. Can I run any package installers before it does that, or am I supposed to run those locally and commit the installed libraries?
Perhaps there is equivalent functionality to .travis.yml or GitHub Actions?
Hi there, I’m not sure I understand your question. Any libraries that you want to add to your image should be installed as part of the Dockerfile. For example, for JavaScript, you can see how the dependencies are copied over and installed here. What language do you use?
Well, what I’m getting at is that you usually do not store library packages inside the git repository. For example, node_modules or vendor (for PHP) directories are in .gitignore. So to be able to use COPY inside a Dockerfile, one would have to run npm install or composer install first.
However, I think git push balena master is a deprecated deployment method anyway, and balena push app-name should be used instead. That seems to simply grab the whole local directory and package it before sending it to the cloud… including any files installed via package managers…
If you look at the example I sent you, what you would do is COPY only the package.json and run npm install inside the Dockerfile. That installs all your libraries as part of the Docker image, for the right architecture, instead of copying the node_modules folder, which you shouldn’t be doing anyway. It doesn’t matter whether you use git push balena master or balena push app-name: the behavior and the way to implement it are the same. Does that make sense?
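To illustrate the pattern (the base image, filenames, and start command below are just an example, not from the linked project):

```dockerfile
# Copy only the dependency manifest first, so npm install runs inside
# the image and resolves packages for the image's architecture.
# node_modules stays in .gitignore / .dockerignore.
FROM node:18-alpine
WORKDIR /usr/src/app

COPY package.json package-lock.json ./
RUN npm install

# Now copy the rest of the application source.
COPY . ./
CMD ["node", "index.js"]
```

Because the manifest is copied before the source, Docker can also cache the install layer and skip it when only application code changes.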
I need to add a GitHub token to be able to run this successfully. What is the best approach to that? A service or device ENV variable does not seem appropriate because I only need this token at docker build time…
I could put it in my Dockerfile, but I am wondering how secure that is and how I would avoid hardcoding the token value…
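One common plain-Docker approach (a sketch, not balena-specific advice; the registry URL and variable name are illustrative) is to pass the token as a build argument instead of hardcoding it:

```dockerfile
FROM node:18-alpine

# GITHUB_TOKEN is supplied at build time, e.g.
#   docker build --build-arg GITHUB_TOKEN=<token> .
# Caveat: ARG values used in RUN steps can still be recovered from the
# image history, so this avoids hardcoding but is not fully secret.
ARG GITHUB_TOKEN

WORKDIR /usr/src/app
COPY package.json package-lock.json ./

# Use the token only for the install step and remove it afterwards,
# so it is never set as an ENV in the final container.
RUN npm config set //npm.pkg.github.com/:_authToken "${GITHUB_TOKEN}" \
 && npm install \
 && npm config delete //npm.pkg.github.com/:_authToken
```

If your builder supports BuildKit, its dedicated build secrets mechanism (mounting a secret file into a single RUN step) avoids the image-history caveat entirely; whether that applies here depends on how balena's builders handle your project.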