We need x86_64 support. Video analysis services (not IoT).

FYI, the corresponding openBalena instance is running on AWS.

Oh! In that case, openBalena doesn’t support OS downloads. You will have to download the image from


Unfortunately, I cannot find genericx86-64-ext at

If I run openBalena on my local machine, can I then download OS images?


Do you have a balenaCloud account? One thing you can do is configure your CLI to use your balenaCloud credentials while downloading the image, and then switch back to openBalena once you have the image.

Yes, I created a balenaCloud account yesterday. Please let me try it out!

BTW, I figured out that I can download the generic-x86-64 OS image from the link below. Can it also be used with openBalena? For example:

balena os configure out.img --app myApp

Check out the openBalena getting started guide here -
You can configure the image that you have downloaded using the balena os configure command.
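A minimal sketch of that workflow, assuming the balena CLI is logged in to balenaCloud for the download step (the fleet name myApp is a placeholder, and switching the CLI back to an openBalena instance is typically done via the BALENARC_BALENA_URL environment variable and balena login):

```shell
# Download a base OS image for the generic x86_64 device type
# (requires a balenaCloud login for the download step).
balena os download genericx86-64-ext -o out.img

# Point the CLI back at the openBalena instance before configuring,
# e.g. export BALENARC_BALENA_URL=mydomain.com && balena login

# Inject the openBalena app's configuration into the downloaded image.
# myApp is a placeholder application name.
balena os configure out.img --app myApp
```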


I successfully flashed balenaOS on an x86_64 machine and can access it from the balena CLI. Now I am moving on to deploying services.

Which device type does genericx86-64-ext correspond to?

Thanks, LimPro

Hey @limpro

genericx86-64-ext is the machine name for the Generic x86_64 device type. It is similar to the Intel NUC image but comes with more built-in drivers.


Hi Rahul,

Thanks for your comment. So which device type should I choose?

Thanks, LimPro

Also, thanks for the feedback. Generic x86_64 is a new device type and we will be updating the docs and download links here soon.

hey @limpro,

I don’t quite follow.

So which device type should I choose?

For your use case, you should use the genericx86-64-ext device type which you mentioned you were able to download above. However, it is not (yet) listed here:

Hello Balena Team,

I’m looking for help with two things~ :hugs:

First one: is there any guide to installing CUDA on balenaOS x86_64?
I referred to
but it failed on genericx86-64-ext devices.
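For reference, installing the CUDA userspace inside a container usually looks something like the sketch below. This is an assumption-laden sketch, not an official balena recipe: the balenalib base image name and the CUDA package choice are my assumptions, and on balenaOS the host does not ship the NVIDIA kernel driver, so a matching driver must also be built and loaded for tools like nvidia-smi to work.

```dockerfile
# Hedged sketch: CUDA 10.1 userspace libraries in a container image.
# Base image and package names are assumptions; the NVIDIA kernel
# driver on the balenaOS host must be provided separately.
FROM balenalib/genericx86-64-ext-ubuntu:bionic

RUN apt-get update && apt-get install -y --no-install-recommends \
        gnupg2 curl ca-certificates && \
    # Add NVIDIA's CUDA apt repository for Ubuntu 18.04 x86_64
    curl -fsSL https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | apt-key add - && \
    echo "deb https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 /" \
        > /etc/apt/sources.list.d/cuda.list && \
    apt-get update && \
    # Minimal CUDA runtime package; real workloads may need more
    apt-get install -y --no-install-recommends cuda-cudart-10-1 && \
    rm -rf /var/lib/apt/lists/*

CMD ["sleep", "infinity"]
```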

Second one:
Regarding docker-compose on balena, do you support the --shm-size and --gpus options?
I found that balena seems to have added support for these in the past few months.
However, these options are not working for me.

Hi @limpro, some questions to understand better:

  • At what stage of the provisioning process does the error happen?
  • What’s the error that you see when it fails?

For your second question, I’m not sure which issue you’re referring to, but you should be able to use the shm_size property as documented here, along with the other listed properties. If an option isn’t listed, it’s most likely not available yet.
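For illustration, shm_size in a balena docker-compose.yml would look roughly like this (the service name is a placeholder; note there is no supported compose-level equivalent of the --gpus flag here, per the answer above):

```yaml
version: "2.1"
services:
  inference:                      # placeholder service name
    image: nvidia/cuda:10.1-base
    shm_size: "256mb"             # shm_size is a supported compose property
```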

Hi @thundron,

To verify whether balena can support a GPU-enabled container (nvidia/cuda:10.1-base), I ran the command below on the device (OS version: balenaOS 2.48.0+rev5 / device type: genericx86-64-ext):

balena-engine run --gpus all,capabilities=utility nvidia/cuda:10.1-base nvidia-smi

Please refer to the attached screenshot.

Hello, I don’t see the attachment in our system. Please could you try attaching again?

Hello @srlowe,

Can you see my screenshot?

Yes, I have it now thank you. Looking into this.

Hi Balena Team,

My project needs to deploy AI services on GPU machines (x86_64, NVIDIA GPU). I am considering using balena for the project (with the GPU machines as balena devices), so it is critical that CUDA-enabled Docker containers can run on a balena device.

Could you help me figure out whether GPU inference (using NVIDIA Docker) can run on a balena device (x86_64), please?

Just to clarify: your device is a computer with a GPU? If so, can you send us the GPU that you’ll use?

=> YES. We need x86_64 support. Video analysis services (not IoT).

Hi @mbalamat

If so can you send us the GPU that you’ll use?

=> Are you asking me to send the GPU machine to you? Sorry, that’s not possible.

Balena team, could you help me find answers to the simple questions below, please? Then my project can move forward.

  • Is there any guide to installing CUDA on balenaOS x86_64?
  • Could you help me figure out whether GPU inference (using NVIDIA Docker) can run on a balena device (x86_64), please?

Thank you! LimPro