How to update custom kernel image, dtb and modules in Balena OS


We have added camera support for the Jetson AGX Xavier on L4T 32.4.3.
We now need to add that camera support to balenaOS for the Jetson AGX Xavier. How do we update the kernel Image, DTB and modules in balenaOS?

Is there any specific documentation available for applying custom L4T changes to balenaOS?



Regarding how to specify a custom DTB file, you can toggle the configuration that reads “Define the file name of the DTB to be used. Only supported by supervisor versions >= v11.14.2.” through the dashboard on a device or fleet-wide level. You may then upload your own DTB which will be applied to the device or fleet after the Supervisor reboots the device. As for the other questions, I’ll reach out to engineers who have more knowledge about this aspect and we’ll get back to you.


Hi, the current L4T version in BalenaOS for the Xavier AGX is 32.4.4. Please open a PR with your changes in the device repository GitHub - balena-os/balena-jetson and we’ll follow up from there.

Hi All,

Thanks for the replies.

I am trying to add the changes to the balena-jetson repository (GitHub - balena-os/balena-jetson) for L4T 32.4.4.

For that, I checked out the tag “v2.67.3+rev5” and created a sample branch on that tag named “econ_dev_v2.67.3_rev5_tag”.

In the balena-jetson source, I can see pre-built device tree blobs and patch files at “layers/meta-balena-jetson/recipes-kernel/linux/linux-tegra/”.

I have attached my kernel, DTB and module patches, along with the pre-built DTB file “tegra194-p2888-0001-p2822-0000-nilecam25_two_lane.dtb”:

NileCAM25_CUXVR_JETSON_AGX_XAVIER_L4T32.4.4_dtb.txt (28.3 KB)
NileCAM25_CUXVR_JETSON_AGX_XAVIER_L4T32.4.4_kernel.txt (37.5 KB)
NileCAM25_CUXVR_JETSON_AGX_XAVIER_L4T32.4.4_module.txt (451.3 KB)
tegra194-p2888-0001-p2822-0000-nilecam25_two_lane_dtb.txt (285.4 KB)

Is it enough to add these files at the location “layers/meta-balena-jetson/recipes-kernel/linux/linux-tegra/”?

How will these changes be applied to the Jetson kernel source and built? Please clarify.

Also, kindly share the steps to create a PR in balena-jetson.


Hi, the latest tag in balena-jetson is v2.80.5+rev4, please always use the latest HEAD version for your PRs, and not older versions. You can have a look at this PR which adds support for two different Xavier boards, the NRU120S and the CTI Rogue Xavier. You’ll need to specify the custom DTB for your Xavier board, the custom pinmux file if any, custom ODMDATA if any, and integrate your custom camera driver (here’s an example on how to do this: linux-tegra: Backport gasket driver · balena-os/balena-jetson@20a6333 · GitHub)

After you have integrated the changes, you can do a local build with balena-yocto-scripts/build/barys -b build_my_xavier -m <your_custom_yocto_xavier_device_type>
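For reference, adding a board like this usually means adding a new Yocto machine configuration next to the existing ones. A purely hypothetical sketch follows; the file name, the included machine file, and the variable usage are modeled on the names in this thread and on common Yocto conventions, not taken from the actual repository layout:

```
# conf/machine/nilecam25-jetson-xavier.conf -- hypothetical machine file,
# modeled on the stock jetson-xavier machine; check the repo for real paths.
require jetson-xavier.conf

# Build and deploy the vendor device tree from this thread instead of the stock one
KERNEL_DEVICETREE = "tegra194-p2888-0001-p2822-0000-nilecam25_two_lane.dtb"
```

The existing PRs for the NRU120S and CTI Rogue Xavier mentioned above are the authoritative examples to copy from.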

Hi @acostach

Thanks for the quick reply.

As you mentioned, I added a new machine configuration, “nilecam25-jetson-xavier”, to include the kernel and patches in balena-jetson.

When I try to build the source using the command below, I hit a timeout:
balena-yocto-scripts/build/barys -b nilecam25_xavier_balena_os -m nilecam25-jetson-xavier

[000000008][LOG]BalenaOS build initialized in directory: nilecam25_xavier_balena_os.
[000000008][LOG]Run build for nilecam25-jetson-xavier: MACHINE=nilecam25-jetson-xavier bitbake balena-image
[000000008][LOG]This might take a while …
Timeout while waiting for a reply from the bitbake server (60s)
[000000069][LOG]Build for nilecam25-jetson-xavier failed. Check failed log in nilecam25_xavier_balena_os/tmp/log/cooker/nilecam25-jetson-xavier .
[000000069][LOG]If build for nilecam25-jetson-xavier succeeded, final image should have been generated here:
[000000069][LOG] nilecam25_xavier_balena_os/tmp/deploy/images/nilecam25-jetson-xavier/balena-image-nilecam25-jetson-xavier.balenaos-img

What could be the reason for this issue?


Sounds like a generic Yocto error, can you check if there are any bitbake.lock / bitbake.sock files present in the build directory, and if so remove them and try again? Also worth ensuring that docker is installed and running, and /var/run/docker.sock has rw permissions for owner and group.
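As a sketch, the stale-lock cleanup could look like this (the build directory name is the one from the barys invocation above; the helper function name is mine, just for illustration):

```shell
#!/bin/sh
# Clear stale bitbake server state left behind by an interrupted build,
# then retry barys. The function name is illustrative, not part of barys.

clean_bitbake_locks() {
    # $1: path to the Yocto build directory
    rm -f "$1/bitbake.lock" "$1/bitbake.sock"
}

clean_bitbake_locks nilecam25_xavier_balena_os

# Separately, confirm docker is up and its socket is accessible:
#   systemctl status docker
#   ls -l /var/run/docker.sock   # owner and group should have rw
```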

Hi @acostach

I built the balena os image “balena-image-nilecam25-jetson-xavier.balenaos-img” successfully.

But I am facing a dependency issue while trying to flash the image using “GitHub - balena-os/jetson-flash: This tool allows users to flash BalenaOS on Jetson supported devices”.

./bin/cmd.js -f tmp/balena-jetson/nilecam25_xavier_balena_os/tmp/deploy/images/nilecam25-jetson-xavier/balena-image-nilecam25-jetson-xavier.balenaos-img -m jetson-xavier


throw err;
^

Error: Cannot find module 'unbzip2-stream'
    at Function.Module._resolveFilename (module.js:547:15)
    at Function.Module._load (module.js:474:25)
    at Module.require (module.js:596:17)
    at require (internal/module.js:11:18)
    at Object.<anonymous> (/media/koil/H_Drive/Projects/Jetson/Xavier/Customer_Works/Refraction_AI/jetson-flash/lib/utils.js:20:13)
    at Module._compile (module.js:652:30)
    at Object.Module._extensions..js (module.js:663:10)
    at Module.load (module.js:565:32)
    at tryModuleLoad (module.js:505:12)
    at Function.Module._load (module.js:497:3)
    at Module.require (module.js:596:17)
    at require (internal/module.js:11:18)
    at Object.<anonymous> (/media/koil/H_Drive/Projects/Jetson/Xavier/Customer_Works/Refraction_AI/jetson-flash/bin/cmd.js:27:15)
    at Module._compile (module.js:652:30)
    at Object.Module._extensions..js (module.js:663:10)
    at Module.load (module.js:565:32)
I am using an Ubuntu 18.04 host PC and couldn’t find an “unbzip2-stream” package for Ubuntu 18.04. Is there a way to meet the dependencies needed for flashing the image on an Ubuntu 18.04 PC?


Hi, glad to hear your local build succeeded. For flashing you will need to use Node.js v10 or v12, and install the list of dependencies here using npm, not Ubuntu’s apt-get. Example: npm i unbzip2-stream etc.
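In other words, jetson-flash resolves its dependencies (unbzip2-stream included) from its package.json via npm, not from Ubuntu’s archives. A small sketch for checking the Node.js major version first; the helper function is my own, just to keep the check reproducible:

```shell
#!/bin/sh
# Sketch: verify the active Node.js major version before installing the
# jetson-flash dependencies. v10 or v12 is expected per the advice above.

node_major() {
    # $1: a version string such as "v12.22.1"; prints the major number
    printf '%s\n' "$1" | sed 's/^v\{0,1\}\([0-9][0-9]*\).*/\1/'
}

# Usage, inside the jetson-flash checkout:
#   node_major "$(node --version)"   # expect 10 or 12
#   npm install                      # pulls unbzip2-stream and the rest
```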

Since your device type is named nilecam25-jetson-xavier and this slug isn’t available in the API, the device won’t register or appear in the dashboard. To work around this, you can configure the image against an existing “Jetson AGX Xavier” dashboard application using balena-cli on your host:
balena os configure balena-image-nilecam25-jetson-xavier.balenaos-img -a <your_jetson_agx_xavier_dashboard_application_name> --version <BalenaOS_version, i.e. v2.80.5+rev4>. Note: doing so will register the device as an AGX Xavier; the device type will need to be PRed and merged in the jetson repository to be added to the API and become available for download in the dashboard.

Hi @acostach

Thanks for the hints, they were really helpful in flashing the custom image to the Xavier kit.
However, the device is still not visible in the dashboard. Please refer to the attached image.

I have configured the “balena-image-nilecam25-jetson-xavier.balenaos-img” using balena-cli as follows,

$ balena os configure balena-jetson/nilecam25_xavier_balena_os/tmp/deploy/images/nilecam25-jetson-xavier/balena-image-nilecam25-jetson-xavier.balenaos-img -a nvidia-jetson-xavier-nilecam25 --version=v2.80.5+rev4
? Network Connection ethernet
Configuring operating system image

And I flashed the balena os image using jetson-flash tool as follows,

$ ./bin/cmd.js -f balena-jetson/nilecam25_xavier_balena_os/tmp/deploy/images/nilecam25-jetson-xavier/balena-image-nilecam25-jetson-xavier.balenaos-img -m jetson-xavier

Attached the boot log “nilecam25_balena_os_image_dmesg.txt” from jetson xavier kit for your reference.
nilecam25_balena_os_image_dmesg.txt (29.8 KB)

We cannot access the board using minicom either (i.e. sudo minicom -D /dev/ttyUSB0). Kindly let us know how to execute basic commands to check the camera functionality in balenaOS.


@KoilArulRaj.S what if you rename all occurrences of “nilecam25-jetson-xavier” to “jetson-xavier” in both config.json and device-type.json from the resin-boot partition of the image, before flashing the device? You can mount the image as a loop device with losetup -fP <image>
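A sketch of that rename step follows. The loop-device and mount commands need root and are shown as comments (resin-boot is assumed to be the first partition of the image, and the mount point is arbitrary); the function wrapper is mine:

```shell
#!/bin/sh
# Sketch: rename the custom device-type slug to the stock one inside the
# resin-boot partition of the image, before flashing.
#
#   sudo losetup -fP balena-image-nilecam25-jetson-xavier.balenaos-img
#   sudo mount /dev/loop0p1 /mnt   # adjust loop0 to whatever losetup picked

rename_device_type() {
    # $1: mount point of the resin-boot partition
    sed -i 's/nilecam25-jetson-xavier/jetson-xavier/g' \
        "$1/config.json" "$1/device-type.json"
}

# Usage:
#   rename_device_type /mnt
#   sudo umount /mnt && sudo losetup -d /dev/loop0
```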

Hi @acostach ,

I can now see the Xavier kit in the dashboard after making the changes to the config.json and device-type.json files in the resin-boot partition.

I am accessing the device terminal using the following command, and we confirmed that the camera is probed correctly.
balena ssh "device-uuid"

Is there any way to run Linux commands such as the following in the balenaOS image?
v4l2-ctl --stream-mmap --stream-count=100 -d /dev/video0

gst-launch-1.0 v4l2src device=/dev/video0 ! "video/x-raw,width=640,height=480,format=UYVY" ! xvimagesink -v


Hi, glad to hear it’s now showing up in the dashboard. Like I mentioned before, for the image to be available in the API and in the dashboard it will be necessary for you to PR the changes and for them to get merged and released as a new device type, since for the Xavier AGX Nvidia does not offer u-boot for custom device tree loading at startup.

To execute those commands you will need to create a Dockerfile in which you install gstreamer as well as various nvidia gst plugins, v4l-utils, etc. Here is an example Dockerfile for Xavier AGX that runs L4T 32.4.4 in the HostOS that will help you get started.
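As a rough, hedged sketch of what such a Dockerfile might contain for an L4T r32.4 host: the base image tag, the NVIDIA repository lines, and the package list below are my assumptions, so check them against the actual example linked above before relying on them.

```dockerfile
# Hypothetical container sketch for GStreamer/v4l2 tests on an AGX Xavier
# whose host OS runs L4T 32.4.4. The apt repo release (r32.4) must match
# the L4T version in the host OS.
FROM balenalib/jetson-xavier-ubuntu:bionic

RUN echo "deb https://repo.download.nvidia.com/jetson/common r32.4 main" \
        > /etc/apt/sources.list.d/nvidia.list \
    && echo "deb https://repo.download.nvidia.com/jetson/t194 r32.4 main" \
        >> /etc/apt/sources.list.d/nvidia.list \
    && apt-key adv --fetch-keys https://repo.download.nvidia.com/jetson/jetson-ota-public.asc \
    && apt-get update \
    && apt-get install -y --no-install-recommends \
        nvidia-l4t-gstreamer \
        nvidia-l4t-multimedia \
        v4l-utils \
        gstreamer1.0-tools \
    && rm -rf /var/lib/apt/lists/*

# Keep the container alive so commands can be run via the dashboard terminal
CMD ["sleep", "infinity"]
```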

Hey @KoilArulRaj.S ,

Have you had the chance to try out the example that my colleague shared? Let us know if you still have any issues or questions.



Yes, I was able to see streaming with the Dockerfile example.

Now I am trying to stream through the nvarguscamerasrc plugin with another camera module (e-cam80_cunx).

I was able to verify streaming using v4l2-ctl as follows.

But when I try to launch streaming through the GStreamer nvarguscamerasrc command, it throws the following failure.

Do I need to install anything extra to support nvarguscamerasrc streaming? I have attached the Dockerfile corresponding to this test: Dockerfile.txt (2.1 KB)

Can you please help me resolve this?


Hi, I don’t have this setup or type of camera to test. However, from these logs I notice that the nvarguscamerasrc GStreamer plugin can’t connect to nvargus-daemon; can you try starting the daemon first and keeping it running in another terminal? You might also need to export DISPLAY=:0 in the gst-launch terminal if you haven’t already, as I recall it was needed by nvoverlaysink.
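A sketch of that sequence, to keep the steps together; the function wrapper, the sleep, and the 720p caps are my own choices, not from your setup:

```shell
#!/bin/sh
# Sketch: bring up nvargus-daemon before launching the argus pipeline.
# nvoverlaysink renders to the local display, hence the DISPLAY export.

start_argus_pipeline() {
    nvargus-daemon &            # keep the daemon running in the background
    sleep 2                     # give it a moment to create its socket
    export DISPLAY=:0
    gst-launch-1.0 nvarguscamerasrc ! \
        'video/x-raw(memory:NVMM),width=1280,height=720,framerate=30/1' ! \
        nvoverlaysink -v
}

# Run on the device, inside the container that has the nvidia gst plugins:
#   start_argus_pipeline
```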

Hi @acostach

Thanks for the reply. I just now got the chance to check your suggestions, but the streaming still fails to launch. It seems that to run the nvarguscamerasrc GStreamer pipeline we need to install a few more components in the Dockerfile.

Is there any working configuration file available for accessing an NVIDIA ISP camera module through the argus application (nvarguscamerasrc)?



Any updates for the above question?


Hi, unfortunately I don’t have the configuration files for this camera, perhaps there were some .isp files delivered with the camera module? You can have a look at these two threads for packages you may need for the nvargus daemon: