Here we go!
FYI, the corresponding openBalena instance is running on AWS.
Oh! In that case: openBalena doesn’t support OS downloads. You will have to download the image from balena.io/os
Unfortunately, I cannot find genericx86-64-ext at https://www.balena.io/os/
If I run openBalena on my local machine, can I download OS images then?
Do you have a balenaCloud account? One thing you can do is configure your CLI to use your balenaCloud credentials while downloading the image, and then switch back to using openBalena once you have the image.
Yes, I created a balenaCloud account yesterday. Let me try it out!
BTW, I figured out that I can download the generic-x86-64 OS image from the link below. Can it be used with openBalena as well? For example:
balena os configure out.img --app myApp
Check out the OpenBalena getting started guide here - https://www.balena.io/open/docs/getting-started/
You can configure the image that you have downloaded using the balena os configure command
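As a sketch of the workflow described above, assuming the CLI is temporarily logged in to balenaCloud for the download and then pointed back at the openBalena instance (out.img, myApp, and mydomain.com are placeholders):

```shell
# Authenticate against balenaCloud and download the OS image
balena login
balena os download genericx86-64-ext -o out.img

# Point the CLI back at your openBalena instance before configuring
export BALENARC_BALENA_URL=mydomain.com
balena login

# Inject your openBalena application's config into the downloaded image
balena os configure out.img --app myApp
```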
I successfully flashed balenaOS on my x86_64 machine and can access it from the balena CLI. Now I’m moving on to deploying services.
Which device type is genericx86-64-ext?
genericx86-64-ext is the machine name for the Generic x86_64 device type. It is similar to the Intel NUC image but comes with more built-in drivers.
Thanks for your comment. So which device type can I choose from https://www.balena.io/docs/reference/base-images/devicetypes/ ?
Also, thanks for the feedback. Generic x86_64 is a new device type and we will be updating the docs and download links here (https://www.balena.io/os/#download) soon.
I don’t quite follow. So which device type can I choose from https://www.balena.io/docs/reference/base-images/devicetypes/ ?
For your use case, you should use the genericx86-64-ext device type, which you mentioned you were able to download above. However, it is not (yet) listed here: https://www.balena.io/docs/reference/base-images/devicetypes/
Hello Balena Team,
I’m looking for help with two things:
First, is there a guide for installing CUDA on balenaOS x86_64?
I referred to https://www.balena.io/blog/getting-started-with-the-nvidia-jetson-nano-using-balena/
But it failed with
Second, regarding docker-compose on balena, do you support
I found that balena seems to have picked up https://github.com/docker/compose/issues/6691 in the past few months.
However, these options are not working.
Hi @limpro, some questions to understand better:
- At what stage of the provisioning process does the error happen?
- What’s the error that you see when it fails?
For your second question, I’m not sure what you’re referring to with that issue, but you should be able to use the shm_size property as documented here, along with the other listed properties. If they’re not listed, they’re most likely not available yet.
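As a sketch, a compose service using shm_size might look like the fragment below (the service name and size are placeholders; check balena’s docs for the full list of supported fields):

```yaml
version: "2.1"
services:
  inference:                 # placeholder service name
    image: nvidia/cuda:10.1-base
    shm_size: "256mb"        # request a larger shared-memory segment for the container
```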
To verify whether balena can support a GPU-enabled container (nvidia/cuda:10.1-base), I ran the command below on the device (OS version: balenaOS 2.48.0+rev5 / device type: genericx86-64-ext):
balena-engine run --gpus all,capabilities=utility nvidia/cuda:10.1-base nvidia-smi
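Independent of the engine flag, a quick sanity check is whether the NVIDIA kernel driver is present on the host at all; the commands below are a generic Linux check (assuming shell access to the host OS), not a balena-specific procedure:

```shell
# If neither command produces output, the NVIDIA driver is not loaded on the host
lsmod | grep nvidia    # lists loaded nvidia kernel modules
ls /dev/nvidia*        # device nodes the container runtime needs to expose
```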
Please refer to the attached screenshot.
Hello, I don’t see the attachment in our system. Please could you try attaching again?
Can you see my screenshot?
Yes, I have it now, thank you. Looking into this.
Hi Balena Team,
My project needs to deploy AI services on GPU machines (x86_64, NVIDIA GPU). I am considering using balena for the project (with the GPU machines as balena devices), so it is critical that CUDA-enabled Docker containers can run on a balena device.
Could you help me figure out whether GPU inference (using NVIDIA Docker) can run on a balena device (x86_64), please?
Just to clarify: your device is a computer with a GPU? If so, can you tell us which GPU you’ll be using?