Support x86_64. Video analysis services (not IoT)

Hi,
Check out the openBalena getting started guide here - https://www.balena.io/open/docs/getting-started/
You can configure the image that you have downloaded using the balena os configure command.
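
For example, a minimal sketch of that flow, assuming the downloaded image is at ~/Downloads/balena.img and the application is called myApp (both placeholders):

    # log in to your openBalena / balenaCloud instance first
    balena login
    # write the application's configuration into the downloaded image
    balena os configure ~/Downloads/balena.img --app myApp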

Hi,

I successfully flashed balenaOS on x86_64 machines and can access them from the balena CLI. Now I'm moving on to deploying services.

Which device type corresponds to genericx86-64-ext in
https://www.balena.io/docs/reference/base-images/devicetypes?

Thanks, LimPro

Hey @limpro

genericx86-64-ext is the machine name for the Generic x86_64 device type. It is similar to the Intel NUC image but comes with more built-in drivers.
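
If it helps, one way to confirm the machine name and pull the matching image straight from the balena CLI (a sketch, assuming a recent CLI version; the exact listing output may differ):

    # list supported device types and look for the generic x86_64 entry
    balena devices supported | grep -i genericx86-64
    # download the latest balenaOS image for that device type
    balena os download genericx86-64-ext -o ./genericx86-64-ext.img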

Thanks

Hi Rahul,

Thanks for your comment. So which device can I choose from Machine names and architectures - Balena Documentation?

Thanks, LimPro

Also, thanks for the feedback. Generic x86_64 is a new device type and we will be updating the docs and download links here (https://www.balena.io/os/#download) soon.

hey @limpro,

I don’t quite follow:

So which device can I choose from Machine names and architectures - Balena Documentation?

For your use case, you should use the genericx86-64-ext device type, which you mentioned you were able to download above. However, it is not (yet) listed here: Machine names and architectures - Balena Documentation

Hello Balena Team,

Looking for help with two things~ :hugs:

First one: is there any guide to installing CUDA on balenaOS x86_64?
I referred to https://www.balena.io/blog/getting-started-with-the-nvidia-jetson-nano-using-balena/
but it failed on genericx86-64-ext devices.

Second one:
Regarding docker-compose on Balena, do you support the --shm-size and --gpus options?
I found that Balena seems to have incorporated https://github.com/docker/compose/issues/6691 in the last few months.
However, these options are not working.

Hi @limpro, some questions to understand better:

  • At what stage of the provisioning process does the error happen?
  • What’s the error that you see when it fails?

For your second question, I’m not sure what you’re referring to with that issue, but you should be able to use the shm_size property as documented here, along with the other listed properties. If they’re not listed, they’re most likely not available yet.
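
As a quick sanity check of the shared-memory side on the device itself, the compose-level shm_size property corresponds to the engine-level --shm-size flag, so something like the following should show the enlarged /dev/shm (a sketch; alpine is just a placeholder image):

    # run a throwaway container with a 256 MB /dev/shm and print its size
    balena-engine run --rm --shm-size=256m alpine df -h /dev/shm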

Hi @thundron,

To verify whether balena can support a GPU-enabled container (nvidia/cuda:10.1-base), I ran the command below on the device (OS version: balenaOS 2.48.0+rev5 / device type: genericx86-64-ext):

balena-engine run --gpus all,capabilities=utility nvidia/cuda:10.1-base nvidia-smi

Please refer to the attached screenshot.

Hello, I don’t see the attachment in our system. Please could you try attaching again?

Hello @srlowe,

Can you see my screenshot?

Yes, I have it now, thank you. Looking into this.

Hi Balena Team,

My project needs to deploy AI services on GPU machines (x86_64, NVIDIA GPU). I am considering using Balena for the project (the GPU machines as balena devices), so it is critical that CUDA-enabled Docker containers can run on a balena device.

Could you help me figure out whether GPU inference (using NVIDIA Docker) can run on a balena device (x86_64), please?

Just to clarify: your device is a computer with a GPU? If so, can you send us the GPU that you’ll use?

=> YES. Support x86_64. Video analysis services (not IoT) - #6 by limpro

Hi @mbalamat

If so, can you send us the GPU that you’ll use?

=> Are you asking me to send the GPU machine to you? Sorry, that is not possible.

Balena team, could you help me find answers to the simple questions below, please? Then my project can go forward.

  • Is there any guide to installing CUDA on balenaOS x86_64?
  • Could you help me figure out whether GPU inference (using NVIDIA Docker) can run on a balena device (x86_64), please?

Thank you! LimPro

Excuse me, sorry, I meant for you to send us the name of the GPU you’ll use, e.g. GTX 1050, not the physical GPU. I have routed your questions to the devices and balena-engine teams, and we’ll get back to you as soon as possible. In the meantime, if it’s OK with you, it would be great if you could tell us more about the hardware setup (GPU model) you are planning to use.

Oh haha, sorry. Currently, my development machine has two GTX 1080 Ti cards, but I will also use a GTX 2080 in the near future.
The number of GPUs can vary (it depends on each project's situation); some machines have 3 GPUs, others just one.

Is the image you mentioned above, nvidia/cuda:10.1-base, able to handle multiple GPUs?

Is the image you mentioned above, nvidia/cuda:10.1-base, able to handle multiple GPUs?

Yes, it can access a single GPU or all of them.
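
For illustration, this is how GPU selection looks with the --gpus flag and the NVIDIA container toolkit on a plain Docker host; whether the same works with balena-engine on balenaOS is exactly what I'm trying to confirm here, so treat it as a sketch of the flag syntax only:

    # expose all GPUs to the container
    docker run --rm --gpus all nvidia/cuda:10.1-base nvidia-smi
    # expose only the first GPU (device index 0)
    docker run --rm --gpus device=0 nvidia/cuda:10.1-base nvidia-smi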

Thanks for the feedback.
We will get back to you once we have more details regarding using --gpus with balena-engine.