Cloning git submodules when pushing to resin.io


I have created a repository for pushing to resin.io which contains a number of sub-containers. These are run in a single container using docker-in-docker, thanks to the brilliant forum post and blog article by @justin8.

I would like to be able to develop the sub-containers in a separate repository and then use git submodules to include them in the parent. I know it is possible to deploy an application to Heroku using git submodules, and I think this would be an amazing feature of resin.io; however, when I push, the submodules are not detected.

Any answers or suggestions to the following would be much appreciated:

  • Can I create a git hook that is executed on the server when code is pushed, but before it is built?
  • Should I consider a different approach to the problem or avoid docker-in-docker entirely?
  • Would it be best to host the sub containers on Docker Hub and install from there?
  • Might this be a feature that is included in resin.io in the future?


These are very good questions! On the docker-in-docker side, I know there’s work to bring multi-container support to the platform, and that should be something to look out for (but there’s no hard timeline for that yet) :tools:

I personally prefer startup scripts; with those, all of this would be doable: install the required projects in separate RUN lines in the Dockerfile, and have the start script or systemd kick them off on application start. But I know that it’s not suitable all the time, and not everyone finds it palatable. :slight_smile:
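
A hedged sketch of that approach, with every image, URL, and path invented for illustration: each project is installed in its own RUN line, and a start script launches them when the container boots.

```dockerfile
FROM debian:bookworm-slim

# Tooling needed to fetch the projects at build time.
RUN apt-get update && \
    apt-get install -y --no-install-recommends git ca-certificates && \
    rm -rf /var/lib/apt/lists/*

# One project per RUN line, so each gets its own cached layer.
RUN git clone https://example.com/you/project-a.git /opt/project-a
RUN git clone https://example.com/you/project-b.git /opt/project-b

# start.sh would launch both projects (e.g. in the background, then wait),
# or you could point CMD at systemd instead.
COPY start.sh /usr/local/bin/start.sh
CMD ["/usr/local/bin/start.sh"]
```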

For git submodules in Docker, there are some examples that people have used (“Git submodules inside Dockerfile repository” and the link from there). If I read the example correctly, it separates the deployment Dockerfile from the application code being deployed: the submodules live in the application code, and the Dockerfile runs git clone ... followed by git submodule update --init --recursive.

An example: let’s say you have Main_Project, and submodules Subproject_A and Subproject_B. Add the subprojects into your Main_Project as git submodules:

Main_Project
├── Subproject_A
└── Subproject_B
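
For concreteness, here is a runnable sketch of producing that layout; the repositories are throwaway local stand-ins, and in practice you would pass your real repository URLs to git submodule add:

```shell
set -e
# Throwaway identity so the demo commits work anywhere.
export GIT_AUTHOR_NAME=demo GIT_AUTHOR_EMAIL=demo@example.com \
       GIT_COMMITTER_NAME=demo GIT_COMMITTER_EMAIL=demo@example.com
cd "$(mktemp -d)"

# Stand-in repos for the sub-container projects.
for sub in Subproject_A Subproject_B; do
    git init -q "$sub"
    git -C "$sub" commit -q --allow-empty -m "initial"
done

git init -q Main_Project
cd Main_Project
git commit -q --allow-empty -m "initial"

# Register each subproject as a submodule of Main_Project.
for sub in Subproject_A Subproject_B; do
    git -c protocol.file.allow=always submodule add "../$sub" "$sub"
done
git commit -q -m "Add sub-container submodules"
git submodule status
```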

Then in your Dockerfile have something like:

RUN git clone <wherever>/Main_Project.git && \
    cd Main_Project && \
    git checkout ${MAIN_PROJECT_VERSION} && \
    git submodule update --init --recursive

From the example above, you can even include Main_Project as a submodule in the application’s git repo (the one that contains the Dockerfile in the first place), so everything is still coupled together, yet decoupled enough to allow this tiered deployment. I think that’s pretty much what the example linked above does (direct github link).
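
In that tiered setup, the deployment repo’s .gitmodules might look something like this (the name and URL are invented for illustration):

```
[submodule "Main_Project"]
	path = Main_Project
	url = https://example.com/you/Main_Project.git
```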

Is this of any use? :sunrise:

For your specific questions:

  • no deploy hooks are available at this time
  • see the comments above; it’s probably better to avoid DinD, but it should be possible to make it work
  • sub-containers on Docker Hub might work depending on your setup, though I can’t think of anyone who has tried it yet. It might involve quite a bit of extra management
  • multi-container support is an upcoming feature, though I’m not sure anything is planned around submodules at the moment


Thank you so much for such a complete response, both to the questions I asked and even to the questions that I did not ask. The link that you shared is really useful, so thanks again. I am going to work with a single-container setup with git submodules to keep the components of the application decoupled.

I can understand the recommendation not to use docker-in-docker, as in tests I have also found that a number of the advantages Resin has (Dockerfile.template and building containers in the cloud) get lost when building multiple containers at runtime on the device.



Great to hear! :sparkles: Please let us know whatever you figure out along the way; we are all learning together :smiley:



@imrehg I started testing git submodules with multi-container; do you maybe have an example project using that logic?

The idea is to have a git submodule as one of the containers



Hi @eblex, that is unfortunately not supported, so the project that you are pushing cannot currently have submodules. It can only use submodules as part of the docker build (i.e. the Dockerfiles of the services checking out the other project and its submodules in a RUN step).

If you use the CLI, however, you can build locally on your development device and push the containers to balena cloud; then you could use submodules (and the build would be from the currently checked-out state), using balena deploy --build .... Check the help for that: balena deploy --help.

Will take a note, nonetheless, see if we can provide some better way of doing things.



My workaround has been to use a repository just for the “balena build”. I run a script that rsyncs changes from the repositories that contain the sub-containers, commits any changes to the “balena build” repository, and then pushes to the “balena build” remote.



Hey there! I think our new balena push should do what you need. It completely bypasses git, so from its point of view, there is no such thing as git submodules; it will see sub-directories, and use rsync to sync all the files.



Which version of balena cli supports this?

The documentation describes it as a drop in replacement for git push. Does this include using the .gitignore files?



@jason10, certainly any 2019 CLI release supports this, which would mean version 9.8 and above. Older versions probably support it too. If you have a specific version you need to use, let us know. Otherwise, the latest version is always the default recommendation, and at the time of this writing it is 9.15.1.

Yes, balena push will take .gitignore into account, and also .dockerignore. This has pros and cons; check the following GitHub issue for certain behaviours that are good to be aware of: