Hi @jason10 we have a couple of questions regarding your workflow.
Are you doing git push ... to send your code, or using the CLI such as balena push ...?
By "private repository", do you mean your code is stored somewhere else as a private repository, such as a GitHub or GitLab private repo?
git submodule update --init --recursive is indeed the right way to update the submodules, but when the docker build runs, the .git information is not available: it isn't included in the docker build context, as you found out.
Next question: in the docker build, are you pulling the same repository that you are pushing to? In general we do not recommend pulling from the git repo that you push to balena-cloud; I don't think that's really what you want to do.
For private repos in general, you would need some form of authentication as part of the docker build steps, for example adding an SSH key that git can use to pull, or supplying the HTTPS credentials (username/password) for the repo, if that's applicable. What's best depends on where the private repo you are pulling the submodules from is hosted, which is what we're asking about above.
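To make the HTTPS-credentials idea concrete, here is a minimal Dockerfile sketch. This is just one of several possible approaches, and the token name, base image, and repo URL are all placeholders, not values from your project:

```Dockerfile
FROM debian:bullseye

RUN apt-get update && apt-get install -y git

# Pass a personal access token in as a build argument.
# Note: build args can end up in the image history, so a
# token with read-only scope limits the damage if it leaks.
ARG GIT_TOKEN
RUN git clone https://${GIT_TOKEN}@github.com/your-org/your-private-lib.git /opt/your-private-lib
```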
Also, in general, submodules are not supported in the project that you are pushing, but of course you can handle them in your build steps. See one of our earlier answers about submodules in general:
For more advice, it would help if you could tell us more about your project structure. Can you give more info on that?
So far, from what I've seen, our advice would be:
- use a separate repo that holds the submodules, separate from your balena deployment
- have the balena project, in its build steps, clone that other repo and update the submodules with credentials added
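Concretely, the build step that clones the other repo could look something like this Dockerfile fragment. The repo URL and token are placeholders, and this sketch assumes HTTPS access; the git insteadOf rewrite makes submodules that reference github.com over HTTPS also use the token:

```Dockerfile
ARG GIT_TOKEN
RUN git config --global url."https://${GIT_TOKEN}@github.com/".insteadOf "https://github.com/" \
 && git clone https://github.com/your-org/app-sources.git /usr/src/app \
 && cd /usr/src/app \
 && git submodule update --init --recursive
```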
Alternatively, you can do local builds, and then your balena deployment can use submodules (see balena build and balena deploy in the CLI). In that case the whole build happens on your computer, with everything checked out (so you would initialize the submodules before running the build), and only the final containers are pushed. Not sure if that's a viable alternative for you.
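As a sketch, that local workflow would look roughly like the commands below. The application name is a placeholder, and the exact flags depend on your CLI version, so check balena help for your install:

```
# run on your computer, with the full checkout available
git submodule update --init --recursive

# build the containers locally, then push only the built images
balena build --application myApp
balena deploy myApp
```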
I know this is a lot of information; let us know if anything is unclear. I'm also checking with the team in case I've missed any options for you, especially once you can elaborate on the questions above. Thanks!
The project structure is similar to the “getting started with multicontainer” example you provide.
Instead of the directories data, haproxy, and frontend belonging to the repository, they are git submodules.
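For that layout, the .gitmodules file in the top-level repo would look roughly like this (the URLs are placeholders for wherever the forks actually live):

```
[submodule "data"]
    path = data
    url = https://github.com/your-fork/data.git
[submodule "haproxy"]
    path = haproxy
    url = https://github.com/your-fork/haproxy.git
[submodule "frontend"]
    path = frontend
    url = https://github.com/your-fork/frontend.git
```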
Mostly I do this with third party libraries that I have forked and patched. If the library is up to date and public then I can clone it.
Given that, at this point, the repositories in question are public on github (both original and forks) I can use git clone. However there will be submodules that come from private repos.
@jason10 could you tell us a bit more about why the CLI won't work in the field for you? Keep in mind Paulo was talking about using the balena push <applicationName> command, which can trigger a build in our builders exactly like a git push (but with additional options). This is different from the local push, which works on development devices when you specify a device.
@pcarranzav I was confused. I haven't used the CLI very much, only for local development, so I was thinking only of its local development uses. I will take a closer look at balena push <applicationName>.
I haven’t worked on that project or been with the company that had that problem in more than four years.
I think, based on what I am reading here, that the CLI push approach would be worth investigating.
What I remember being able to do, under the time pressure I was under, was to copy sources from the other repos into a master production repo and build from there. A hand-rolled git-submodule sort of approach.
My selection of balena was abandoned by the next two CTOs, who also made various hardware decisions that led to more manual docker containers. Startup life.