yocto-build-scripts not documented clearly

The instructions for using balena-yocto-scripts/build/balena-build.sh aren’t very clear.

jkridner@slotcar:~/workspace/blue-nano/balena/yocto$ ./build.sh 
+++ dirname ./build.sh
++ cd -- .
++ pwd -P
+ SCRIPTPATH=/home/jkridner/workspace/XXX/balena/yocto
+ cd /home/jkridner/workspace/XXX/balena/yocto/balena-beaglebone
+ ./balena-yocto-scripts/build/balena-build.sh -d beaglebone -i balena-image-flasher -s /home/jkridner/workspace/blue-nano/balena/yocto/build
[balena_lib_environment]: Defaulting to balena-cloud.com
~/workspace/XXX/balena/yocto/balena-beaglebone ~/workspace/XXX/balena/yocto/balena-beaglebone
Submodule details:

 99807501efffc8c5034c88361049650a02511a78 balena-yocto-scripts (v1.19.12)
 39a79c43f1b8ab4426d7a9c1cdeb9a9514101061 contracts (v2.0.12)
 49046a8af48a37c0fdc75e085e86c52fad474199 layers/meta-arm (yocto-3.1.1-18-g49046a8)
 094cc1766365844e9e4dcf46f4f247cad0231715 layers/meta-balena (v2.101.11)
 f2d02cb71eaff8eb285a1997b30be52486c160ae layers/meta-openembedded (f2d02cb71)
 f56fd4ec6835d68c4e9f8a9630963acd34d172e5 layers/meta-rust (remotes/origin/common-rust-native-20-gf56fd4e)
 9dce84ef28a1b5a782193f9715435f0f5c93a2ae layers/meta-ti (07.01.00.006-3-g9dce84ef)
 012ad10a89a889c21e67c27dc37d22520212548f layers/poky (dunfell-23.0.3)
~/workspace/XXX/balena/yocto/balena-beaglebone
9980750: Pulling from balena/yocto-build-env
Digest: sha256:8c339c19b67f15e8f9c11041f7144c1d35eb0c95b1c6ec88147895dc3fa94250
Status: Image is up to date for balena/yocto-build-env:9980750
docker.io/balena/yocto-build-env:9980750
[INFO] Creating and setting builder user 1000:1000.
unix:///var/run/docker.sock /var/run/docker.pid
fatal: not a git repository: /work/../../../.git/modules/balena/yocto/balena-beaglebone

It’s unclear to me what the script is looking for and how I should provide it.

Per

work_dir="$( cd "${script_dir}/../.." && pwd )"

I guess the script expects balena-beaglebone to be the work dir, but it also seems to want that directory to contain a .git directory. In my case, balena-beaglebone is a submodule, so .git is a file rather than a directory.
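
For reference, this is easy to check: in a submodule checkout, .git is a plain file containing a gitdir: pointer rather than a directory. The output below is illustrative and just mirrors the path from the fatal error above:

$ cat balena-beaglebone/.git
gitdir: ../../../.git/modules/balena/yocto/balena-beaglebone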

Wow, that was a headache. I couldn’t find any git command to move the metadata under balena-beaglebone/.git instead of leaving .git as a file with a gitdir: pointer. I moved the metadata manually, then massaged all of the .git files to point at the new git-dir location and all of the config files to point at the new paths for the work directories. Ugh.
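
For anyone hitting the same thing, here is roughly what the manual conversion looked like — a sketch only, the paths match my layout above, and you should back up the tree before trying it:

# from the superproject root (blue-nano in my layout)
rm balena/yocto/balena-beaglebone/.git            # drop the gitdir: pointer file
mv .git/modules/balena/yocto/balena-beaglebone balena/yocto/balena-beaglebone/.git   # move the real metadata in place
git config --file balena/yocto/balena-beaglebone/.git/config --unset core.worktree   # not needed once .git lives inside the work tree
# the nested submodules (balena-yocto-scripts, contracts, layers/*) still have .git
# files whose gitdir: paths point at the old location under .git/modules/, and their
# configs still carry the old core.worktree paths -- those need the same kind of fix-up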

Anyway, everything seems to be building now. Because only balena-beaglebone is exposed to the build container as the work dir and the git metadata is read from there, I don’t see a way around having real git metadata in balena-beaglebone/.git. It would be nice if someone could work out how to get it there when balena-beaglebone is itself a submodule of another project.

ERROR: mkfs-hostapp-native-1.0-r0 do_compile: Execution of '/work/build/tmp/work/x86_64-linux/mkfs-hostapp-native/1.0-r0/temp/run.do_compile.790731' failed with exit code 1:
cgroups: cgroup mountpoint does not exist: unknown
Error response from daemon: No such image: e665ae2b1d2d
WARNING: exit code 1 from a shell command.

ERROR: Logfile of failure stored in: /work/build/tmp/work/x86_64-linux/mkfs-hostapp-native/1.0-r0/temp/log.do_compile.790731
Log data follows:
| DEBUG: Executing shell function do_compile
| cgroups: cgroup mountpoint does not exist: unknown
| Error response from daemon: No such image: e665ae2b1d2d
| WARNING: exit code 1 from a shell command.
| ERROR: Execution of '/work/build/tmp/work/x86_64-linux/mkfs-hostapp-native/1.0-r0/temp/run.do_compile.790731' failed with exit code 1:
| cgroups: cgroup mountpoint does not exist: unknown
| Error response from daemon: No such image: e665ae2b1d2d
| WARNING: exit code 1 from a shell command.
| 
NOTE: recipe mkfs-hostapp-native-1.0-r0: task do_compile: Failed
ERROR: Task (/work/build/../layers/meta-balena/meta-balena-common/recipes-containers/mkfs-hostapp-native/mkfs-hostapp-native.bb:do_compile) failed with exit code '1'
$ cat balena-beaglebone/build/tmp/work/x86_64-linux/mkfs-hostapp-native/1.0-r0/temp/log.do_compile.790731
DEBUG: Executing shell function do_compile
cgroups: cgroup mountpoint does not exist: unknown
Error response from daemon: No such image: e665ae2b1d2d
WARNING: exit code 1 from a shell command.
ERROR: Execution of '/work/build/tmp/work/x86_64-linux/mkfs-hostapp-native/1.0-r0/temp/run.do_compile.790731' failed with exit code 1:
cgroups: cgroup mountpoint does not exist: unknown
Error response from daemon: No such image: e665ae2b1d2d
WARNING: exit code 1 from a shell command.

Hi Jason, I am afraid the scripts in balena-yocto-scripts assume that balena-yocto-scripts is a submodule of a git project. So the device project must be cloned with:

git clone --recursive <URL>
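
In practice that means something like the following (the URL is only an example of a balena device repository; use whichever one you are building). For a checkout that already exists without its submodules, a recursive submodule init does the equivalent:

git clone --recursive https://github.com/balena-os/balena-beaglebone.git
# or, in an existing checkout:
cd balena-beaglebone && git submodule update --init --recursive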

About the build errors, is your Linux distribution using cgroups v2? If so, could you try reverting to v1 with the following kernel command line arguments and see if the issue persists:

systemd.unified_cgroup_hierarchy=0
systemd.legacy_systemd_cgroup_controller

Where do I apply those settings? To start, I’m trying to rebuild balena-beaglebone (beaglebone.coffee) as-is, before making any modifications.

To revert to cgroups v1, you need to add kernel command line parameters on the host system (the workstation you are using to build balenaOS). You can find instructions, for example, at https://linuxconfig.org/how-to-set-kernel-boot-parameters-on-linux.
I suggest you initially edit the GRUB boot entry to append those settings so they apply only to the next boot. As a reminder, the arguments to add on a systemd-enabled Linux distribution are systemd.unified_cgroup_hierarchy=0 systemd.legacy_systemd_cgroup_controller.
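
For the one-boot test you can press 'e' at the GRUB menu and append the two parameters to the line starting with linux. To make the change persistent on a GRUB-based host, the usual pattern looks roughly like this (the existing GRUB_CMDLINE_LINUX_DEFAULT value is only an example; keep whatever your system already has):

# /etc/default/grub -- append the two parameters to the existing value:
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash systemd.unified_cgroup_hierarchy=0 systemd.legacy_systemd_cgroup_controller"

# then regenerate the bootloader configuration and reboot:
sudo update-grub
sudo reboot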


I have

jkridner@slotcar:~/workspace/blue-nano/balena/beagleplay-gateway$ cat /proc/cmdline 
BOOT_IMAGE=/boot/vmlinuz-6.8.0-71-generic root=UUID=6419efef-87d5-4c98-8cf6-784c73f8a23d ro amdgpu.dc=1 radeon.audio=1 systemd.unified_cgroup_hierarchy=0 systemd.legacy_systemd_cgroup_controller

and yet…

jkridner@slotcar:~/workspace/blue-nano/balena/yocto$ ./build-beagleplay-gateway.sh 
+++ dirname ./build-beagleplay-gateway.sh
++ cd -- .
++ pwd -P
+ SCRIPTPATH=/home/jkridner/workspace/blue-nano/balena/yocto
+ cd /home/jkridner/workspace/blue-nano/balena/yocto/balena-beaglebone
+ ./balena-yocto-scripts/build/balena-build.sh -d beagleplay -i balena-image-flasher -s /home/jkridner/workspace/blue-nano/balena/yocto/build
[balena_lib_environment]: Defaulting to balena-cloud.com
~/workspace/blue-nano/balena/yocto/balena-beaglebone ~/workspace/blue-nano/balena/yocto/balena-beaglebone
Submodule details:

 a78462c0fcb2f580b80734d9a0900cc72fa180f2 balena-yocto-scripts (v1.33.3-183-ga78462c)
 1d499fe7d33791e1ac6dccfe031e178b7d520bb4 contracts (v2.0.108)
 b187fb9232ca0a6b5f8f90b4715958546fc41d73 layers/meta-arm (yocto-4.0.3)
 0dd9f8e4c8a32a503723375bc4a87c557d57f356 layers/meta-balena (v6.5.48)
 0f37766e9cdcf4919d5be85abc54a7219aaf36cb layers/meta-cyclonedx (remotes/origin/kirkstone)
 fda737ec0cc1d2a5217548a560074a8e4d5ec580 layers/meta-openembedded (remotes/origin/stable/kirkstone-nut-65-gfda737ec0c)
 9818c93673e9e184f97e658edcd71db74df12a84 layers/meta-ti (09.02.00.006)
 445c60a484f33d200e2890b4a174b436cd2f969e layers/poky (yocto-4.0.18-34-g445c60a484)
~/workspace/blue-nano/balena/yocto/balena-beaglebone
1.36.10-yocto-build-env: Pulling from balena-os/balena-yocto-scripts
3e4a36dff2ad: Pull complete 
cec5122c3b93: Pull complete 
581e46f3e488: Pull complete 
9769269bb3c9: Pull complete 
33ff6990f1e0: Pull complete 
c772292d5875: Pull complete 
0bc9c729e6aa: Pull complete 
27d3e0453a08: Pull complete 
80b5b816422d: Pull complete 
a59e48c12f44: Pull complete 
597c05700679: Pull complete 
aca3cfdf3602: Pull complete 
dcf3387f34e1: Pull complete 
a822e4c31625: Pull complete 
Digest: sha256:6990f96e8d4b0f51fb5641f40a535e6a5abb77d16b384d621ea5b69608633ce6
Status: Downloaded newer image for ghcr.io/balena-os/balena-yocto-scripts:1.36.10-yocto-build-env
ghcr.io/balena-os/balena-yocto-scripts:1.36.10-yocto-build-env
[INFO] Creating and setting builder user 1000:1000.
unix:///var/run/docker.sock /var/run/docker.pid
[INFO] The configured git credentials for user builder are:
Resin Builder
buildy@builder.com
[INFO] Running build as builder user...
Building JSON manifest...

up to date, audited 4 packages in 495ms

found 0 vulnerabilities
...Done
You had no conf/local.conf file. This configuration file has therefore been
created for you from /work/balena-yocto-scripts/build/../../layers/meta-balena-beaglebone/conf/samples/local.conf.sample
You may wish to edit it to, for example, select a different MACHINE (target
hardware). See conf/local.conf for more information as common configuration
options are commented.

You had no conf/bblayers.conf file. This configuration file has therefore been
created for you from /work/balena-yocto-scripts/build/../../layers/meta-balena-beaglebone/conf/samples/bblayers.conf.sample
To add additional metadata layers into your configuration please add entries
to conf/bblayers.conf.

The Yocto Project has extensive documentation about OE including a reference
manual which can be found at:
    https://docs.yoctoproject.org

For more information about OpenEmbedded see the website:
    https://www.openembedded.org/


  _           _                   ___  ____
 | |__   __ _| | ___ _ __   __ _ / _ \/ ___|
 | '_ \ / _` | |/ _ \ '_ \ / _` | | | \___ \
 | |_) | (_| | |  __/ | | | (_| | |_| |___) |
 |_.__/ \__,_|_|\___|_| |_|\__,_|\___/|____/

 -------------------------------------------- 

Resin specific images available:
	balena-image
	balena-image-flasher

BeagleBone AI-64                         : $ MACHINE=beaglebone-ai64 bitbake balena-image-flasher
BeagleBone Green Gateway                 : $ MACHINE=beaglebone-green-gateway bitbake balena-image-flasher
BeagleBone Green                         : $ MACHINE=beaglebone-green bitbake balena-image-flasher
BeagleBone Green Wireless                : $ MACHINE=beaglebone-green-wifi bitbake balena-image-flasher
BeagleBone Black                         : $ MACHINE=beaglebone bitbake balena-image-flasher
PocketBeagle                             : $ MACHINE=beaglebone-pocket bitbake balena-image
BeaglePlay                               : $ MACHINE=beagleplay bitbake balena-image-flasher

[000000004][LOG]BalenaOS build initialized in directory: build.
[000000004][LOG]Run build for beagleplay: MACHINE=beagleplay bitbake balena-image-flasher 
[000000004][LOG]This might take a while ...
Loading cache...done.
Loaded 0 entries from dependency cache.
Parsing recipes...WARNING: /work/build/../layers/poky/meta/recipes-devtools/rust/rust-cross_1.59.0.bb: CVE_CHECK_IGNORE is deprecated in favor of CVE_STATUS
WARNING: /work/build/../layers/poky/meta/recipes-devtools/cmake/cmake_3.22.3.bb: CVE_CHECK_IGNORE is deprecated in favor of CVE_STATUS

...

WARNING: /work/build/../layers/poky/meta/recipes-devtools/gcc/gcc-source_11.4.bb: CVE_CHECK_IGNORE is deprecated in favor of CVE_STATUS
done.
Parsing of 5574 .bb files complete (0 cached, 5574 parsed). 8656 targets, 926 skipped, 0 masked, 0 errors.
NOTE: Resolving any missing task queue dependencies
NOTE: Resolving any missing task queue dependencies
NOTE: Resolving any missing task queue dependencies
NOTE: Resolving any missing task queue dependencies
NOTE: Resolving any missing task queue dependencies
NOTE: Resolving any missing task queue dependencies

Build Configuration (mc:default):
BB_VERSION           = "2.0.0"
BUILD_SYS            = "x86_64-linux"
NATIVELSBSTRING      = "ubuntu-22.04"
TARGET_SYS           = "aarch64-poky-linux"
MACHINE              = "beagleplay"
DISTRO               = "balena-os"
DISTRO_VERSION       = "6.5.48"
TUNE_FEATURES        = "aarch64"
TARGET_FPU           = ""
meta-balena-rust     
meta-balena-common   
meta-balena-kirkstone = "HEAD:0dd9f8e4c8a32a503723375bc4a87c557d57f356"
meta-balena-beaglebone = "v6.5.48-bn:c7206d8fa4677e8cbd512fd501dcb86bd9b94232"
meta                 
meta-poky            = "HEAD:445c60a484f33d200e2890b4a174b436cd2f969e"
meta-oe              
meta-filesystems     
meta-networking      
meta-python          
meta-perl            = "HEAD:fda737ec0cc1d2a5217548a560074a8e4d5ec580"
meta-ti-bsp          
meta-ti-extras       = "HEAD:9818c93673e9e184f97e658edcd71db74df12a84"
meta-arm             
meta-arm-toolchain   = "HEAD:b187fb9232ca0a6b5f8f90b4715958546fc41d73"
meta-cyclonedx       = "HEAD:0f37766e9cdcf4919d5be85abc54a7219aaf36cb"

NOTE: Cleaning cyclonedx work folder /work/build/tmp/cyclonedx
NOTE: Fetching uninative binary shim http://downloads.yoctoproject.org/releases/uninative/4.4/x86_64-nativesdk-libc-4.4.tar.xz;sha256sum=d81c54284be2bb886931fc87281d58177a2cd381cf99d1981f8923039a72a302 (will check PREMIRRORS first)

Build Configuration:
BB_VERSION           = "2.0.0"
BUILD_SYS            = "x86_64-linux"
NATIVELSBSTRING      = "ubuntu-22.04"
TARGET_SYS           = "arm-poky-eabi"
MACHINE              = "beagleplay-k3r5"
DISTRO               = "balena-os"
DISTRO_VERSION       = "6.5.48"
TUNE_FEATURES        = "arm armv7a vfp thumb callconvention-hard"
TARGET_FPU           = "hard"
meta-balena-rust     
meta-balena-common   
meta-balena-kirkstone = "HEAD:0dd9f8e4c8a32a503723375bc4a87c557d57f356"
meta-balena-beaglebone = "v6.5.48-bn:c7206d8fa4677e8cbd512fd501dcb86bd9b94232"
meta                 
meta-poky            = "HEAD:445c60a484f33d200e2890b4a174b436cd2f969e"
meta-oe              
meta-filesystems     
meta-networking      
meta-python          
meta-perl            = "HEAD:fda737ec0cc1d2a5217548a560074a8e4d5ec580"
meta-ti-bsp          
meta-ti-extras       = "HEAD:9818c93673e9e184f97e658edcd71db74df12a84"
meta-arm             
meta-arm-toolchain   = "HEAD:b187fb9232ca0a6b5f8f90b4715958546fc41d73"
meta-cyclonedx       = "HEAD:0f37766e9cdcf4919d5be85abc54a7219aaf36cb"

NOTE: Cleaning cyclonedx work folder /work/build/tmp/cyclonedx
Initialising tasks...done.
Sstate summary: Wanted 2998 Local 0 Mirrors 0 Missed 2998 Current 0 (0% match, 0% complete)
NOTE: Executing Tasks
NOTE: Running task 1 of 7692 (mc:k3r5:/work/build/../layers/poky/meta/recipes-devtools/gcc/gcc-source_11.4.bb:do_rm_work)
NOTE: Running task 2 of 7692 (/work/build/../layers/poky/meta/recipes-devtools/gcc/gcc-source_11.4.bb:do_rm_work)
NOTE: Running task 3 of 7692 (/work/build/../layers/meta-ti/meta-ti-bsp/recipes-kernel/linux/linux-bb.org_git.bb:do_rm_work)
NOTE: Running task 4 of 7692 (/work/build/../layers/poky/meta/recipes-support/libgpg-error/libgpg-error_1.44.bb:do_cyclonedx_package_collect)
NOTE: Running task 5 of 7692 (/work/build/../layers/poky/meta/recipes-support/libmd/libmd_1.0.4.bb:do_cyclonedx_package_collect)
NOTE: Running task 6 of 7692 (virtual:native:/work/build/../layers/poky/meta/recipes-devtools/docbook-xml/docbook-xml-dtd4_4.5.bb:do_cyclonedx_package_collect)
NOTE: Running task 7 of 7692 (virtual:native:/work/build/../layers/poky/meta/recipes-devtools/docbook-xml/docbook-xsl-stylesheets_1.79.1.bb:do_cyclonedx_package_collect)
NOTE: Running task 8 of 7692 (/work/build/../layers/meta-openembedded/meta-oe/recipes-devtools/android-tools/android-tools-conf_1.0.bb:do_cyclonedx_package_collect)
NOTE: Running task 9 of 7692 (/work/build/../layers/poky/meta/recipes-extended/findutils/findutils_4.9.0.bb:do_cyclonedx_package_collect)
NOTE: Running task 10 of 7692 (/work/build/../layers/meta-balena/meta-balena-common/recipes-core/fatrw/fatrw_0.2.21.bb:do_cyclonedx_package_collect)
NOTE: Running task 11 of 7692 (virtual:native:/work/build/../layers/poky/meta/recipes-devtools/xmlto/xmlto_0.0.28.bb:do_cyclonedx_package_collect)
NOTE: Running task 12 of 7692 (virtual:native:/work/build/../layers/poky/meta/recipes-support/itstool/itstool_2.0.7.bb:do_cyclonedx_package_collect)
WARNING: CVE_CHECK_IGNORE is deprecated in favor of CVE_STATUS
ERROR: PermissionError: [Errno 1] Operation not permitted

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/work/layers/poky/bitbake/bin/bitbake-worker", line 268, in child
    bb.utils.disable_network(uid, gid)
  File "/work/layers/poky/bitbake/lib/bb/utils.py", line 1653, in disable_network
    with open("/proc/self/uid_map", "w") as f:
PermissionError: [Errno 1] Operation not permitted

...

I did echo 0 | sudo tee /proc/sys/kernel/apparmor_restrict_unprivileged_userns and hope that makes it go away. The Ubuntu apparmor bug #2056555 (“Allow bitbake to create user namespace”) has some thoughts on configuring AppArmor.

Man, this build environment is restrictive. My best machines are Arm and there is no Arm build method. I’ve tried everything I can do to avoid touching BalenaOS itself. This is not fun.

Hey @jkridner, could you try:

sudo sysctl -w kernel.apparmor_restrict_unprivileged_userns=0
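
If that helps, one way to keep the setting across reboots is a sysctl drop-in (the file name below is arbitrary):

echo 'kernel.apparmor_restrict_unprivileged_userns = 0' | sudo tee /etc/sysctl.d/60-bitbake-userns.conf
sudo sysctl --system   # reload sysctl settings without a reboot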

Usability definitely needs some attention. It does not help that most balena users do not have a need to build the OS themselves.

Well, if you added support for extlinux.conf-based device tree updates for Beagles, I wouldn’t need to either.