Two instances of supervisor trying to start

One of our devices has gotten into a strange state after updating the supervisor from 12.x to 14.x.

There are two supervisor services running. This causes an error in one of them, resulting in a service restart loop. I can stop that service (175e0217fe3a), but it automatically restarts whenever the device is rebooted. The other service (7e29310a1ed3) restarts immediately if I stop it.

Any ideas how to fix this? Was there a change between v12 and v14 of the supervisor to make it run as a service? Previously, I did not see the supervisor listed as a Service on the device dashboard.

This device is running balenaOS 2.94.4.

Output of balena ps:

175e0217fe3a   9f09bf7eb394                                                     "/usr/src/app/entry.…"   5 days ago   Restarting (1) 34 seconds ago                                               balena-supervisor_5325320_2272141_a3d0d0936183278f66c21db317ea4d93
7e29310a1ed3   registry2.balena-cloud.com/v2/0a55aa88126d881d74d65f52dd09cc5d   "/usr/src/app/entry.…"   5 days ago   Up 2 hours (healthy)                                                        balena_supervisor

Hi there, thank you for reporting this. There are changes in newer supervisors related to running the supervisor as a service, but your device should not have picked them up. Is your device running against balena-cloud? How did you update your supervisor?

Could you also run the following commands on your device and paste the output?

# Get the target state
curl -H "Authorization: Bearer $(cat /mnt/boot/config.json | jq -r .deviceApiKey)" $(cat /mnt/boot/config.json | jq -r .apiEndpoint)/device/v3/$(cat /mnt/boot/config.json | jq -r .uuid)/state -s | jq .
# Configured supervisor version
cat /etc/balena-supervisor/supervisor.conf
# Get engine logs
journalctl -u balena --no-pager -a -n 100
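
If the state output turns out to be very long, a filter like the one below should narrow it to just the supervisor app entry. This is only a sketch based on the v3 state format (a top-level object keyed by the device uuid, with a class field on each app), so adjust the jq path if your output looks different.

# Show only apps of class "app" (the supervisor release targeted by the cloud)
curl -H "Authorization: Bearer $(cat /mnt/boot/config.json | jq -r .deviceApiKey)" $(cat /mnt/boot/config.json | jq -r .apiEndpoint)/device/v3/$(cat /mnt/boot/config.json | jq -r .uuid)/state -s | jq '.[].apps | with_entries(select(.value.class == "app"))'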

Thank you

Yes, the device is running against balena-cloud. I updated the supervisor via the balena cloud dashboard.

{
  "d8aefd6ff79d8e61762e01539cd9a780": {
    "name": "empty-dream",
    "apps": {
      "2e66a95795c149959c69472a8c2f92b8": {
        "id": 1667445,
        "name": "armv7hf-supervisor",
        "is_host": false,
        "class": "app",
        "releases": {
          "a3d0d0936183278f66c21db317ea4d93": {
            "id": 2272141,
            "services": {
              "balena-supervisor": {
                "id": 1242822,
                "image_id": 5325320,
                "image": "registry2.balena-cloud.com/v2/0a55aa88126d881d74d65f52dd09cc5d@sha256:0f7e969a30741974d2f21a38df0b382050bde679c9973e80c2f9a4205d19062f",
                "environment": {

                },
                "labels": {
                  "io.balena.features.balena-api": "1",
                  "io.balena.features.balena-socket": "1",
                  "io.balena.features.dbus": "1"
                },
                "composition": {
                  "privileged": true,
                  "tty": true,
                  "restart": "always",
                  "network_mode": "host",
                  "labels": {
                    "io.balena.features.balena-api": "1",
                    "io.balena.features.dbus": "1",
                    "io.balena.features.balena-socket": "1"
                  }
                }
              }
            }
          }
        }
      },
      "b13b83a46837490a930271151bd4ec69": {
        "id": 1133435,
        "name": "SSWM20180620_testing",
        "is_host": false,
        "class": "fleet",
        "releases": {
          "a76ebead9aa79510a6ebcca6dba920d1": {
            "id": 2210783,
            "services": {
              "main": {
                "id": 61305,
                "image_id": 5074151,
                "image": "registry2.balena-cloud.com/v2/90adc7e92780a8761a9fd2cb9576f531@sha256:80450df61e421229d8d8cd03c6549c7831076e8555ef4cb5167212790a0fcdd0",
                "environment": {

                },
                "labels": {
                  "io.balena.features.dbus": "1",
                  "io.balena.features.firmware": "1",
                  "io.balena.features.kernel-modules": "1"
                },
                "composition": {
                  "depends_on": [
                    "redis",
                    "aws"
                  ],
                  "restart": "always",
                  "privileged": true,
                  "labels": {
                    "io.balena.features.kernel-modules": "1",
                    "io.balena.features.firmware": "1",
                    "io.balena.features.dbus": "1"
                  }
                }
              },
              "redis": {
                "id": 422163,
                "image_id": 5074152,
                "image": "registry2.balena-cloud.com/v2/81cd99042d1238d00b9ec0a73b378cab@sha256:787b6c8488b554bfc2bc0092a933a713bd3f40bb449ef5715e998934acf837d4",
                "environment": {

                },
                "labels": {},
                "composition": {
                  "image": "arm32v7/redis:5.0.10",
                  "restart": "always",
                  "command": "redis-server --save 600 1 --maxmemory 512000000 --maxmemory-policy allkeys-lru",
                  "volumes": [
                    "resin-data:/data"
                  ],
                  "privileged": true,
                  "ports": [
                    "6379:6379"
                  ]
                }
              },
              "aws": {
                "id": 422161,
                "image_id": 5074153,
                "image": "registry2.balena-cloud.com/v2/8d3f943d43ec19406bf611a45c96e7f1@sha256:f2731a183f0180d0a94ffc68d14c5c83c9e1a1395f238ac57844141559272b5b",
                "environment": {

                },
                "labels": {},
                "composition": {
                  "restart": "always",
                  "privileged": true,
                  "depends_on": [
                    "redis"
                  ],
                  "volumes": [
                    "log-volume:/logs"
                  ]
                }
              },
              "monitor": {
                "id": 495259,
                "image_id": 5074154,
                "image": "registry2.balena-cloud.com/v2/460cc928fdc4ae4ed807db44ab834f13@sha256:ddabdd8493e52f852f046547250e16a31a4b88dea3e660781c13d9373655df86",
                "environment": {

                },
                "labels": {
                  "io.balena.features.dbus": "1",
                  "io.balena.features.supervisor-api": "1"
                },
                "composition": {
                  "restart": "always",
                  "privileged": true,
                  "depends_on": [
                    "redis"
                  ],
                  "labels": {
                    "io.balena.features.supervisor-api": "1",
                    "io.balena.features.dbus": "1"
                  },
                  "volumes": [
                    "log-volume:/logs"
                  ]
                }
              }
            },
            "volumes": {
              "resin-data": {},
              "log-volume": {}
            }
          }
        }
      }
    },
    "config": {
      "BALENA_SUPERVISOR_HARDWARE_METRICS": "false",
      "RESIN_HOST_CONFIG_dtoverlay": "w1-gpio",
      "RESIN_HOST_CONFIG_gpio": "20=op,dh",
      "RESIN_SUPERVISOR_DELTA": "1",
      "RESIN_SUPERVISOR_NATIVE_LOGGER": "false",
      "RESIN_HOST_CONFIG_avoid_warnings": "1",
      "RESIN_HOST_CONFIG_disable_overscan": "1",
      "RESIN_HOST_CONFIG_disable_splash": "1",
      "RESIN_HOST_CONFIG_dtparam": "\"i2c_arm=on\",\"spi=on\",\"audio=on\"",
      "RESIN_HOST_CONFIG_gpu_mem": "16",
      "RESIN_HOST_FIREWALL_MODE": "",
      "RESIN_SUPERVISOR_DELTA_VERSION": "3",
      "RESIN_SUPERVISOR_PERSISTENT_LOGGING": "true",
      "RESIN_SUPERVISOR_VPN_CONTROL": "true",
      "RESIN_SUPERVISOR_POLL_INTERVAL": "900000",
      "RESIN_SUPERVISOR_DELTA_REQUEST_TIMEOUT": "59000"
    }
  }
}

cat /etc/balena-supervisor/supervisor.conf
# This file represents the last known version of the supervisor
SUPERVISOR_IMAGE=registry2.balena-cloud.com/v2/0a55aa88126d881d74d65f52dd09cc5d
SUPERVISOR_VERSION=latest
LED_FILE=/sys/class/leds/led0/brightness

The journal consists mostly of application-level logging, but the lines relevant to the supervisor (while both instances are trying to run) are:

08.09.22 09:02:04 (-0400) <balena-supervisor> ln: /var/lib/rce: File exists
08.09.22 09:02:04 (-0400) <balena-supervisor> ln: /var/lib/rce: File exists
08.09.22 09:02:34 (-0400) <balena-supervisor> ln: /var/lib/rce: File exists
08.09.22 10:08:42 (-0400) <balena-supervisor> ln: /var/lib/rce: File exists
08.09.22 10:08:50 (-0400) <balena-supervisor> ln: /var/lib/rce: File exists
08.09.22 10:08:59 (-0400) <balena-supervisor> ln: /var/lib/rce: File exists
08.09.22 10:09:01 (-0400) Service exited 'balena-supervisor sha256:9f09bf7eb3944fa0ce6c94f1da72f1996d4abcbe9c316850558ba644aa43f13d'
08.09.22 10:09:10 (-0400) Restarting service 'balena-supervisor sha256:9f09bf7eb3944fa0ce6c94f1da72f1996d4abcbe9c316850558ba644aa43f13d'
08.09.22 10:09:10 (-0400) <balena-supervisor> ln: /var/lib/rce: File exists

08.09.22 10:09:12 (-0400) Service exited 'balena-supervisor sha256:9f09bf7eb3944fa0ce6c94f1da72f1996d4abcbe9c316850558ba644aa43f13d'
08.09.22 10:09:28 (-0400) Restarting service 'balena-supervisor sha256:9f09bf7eb3944fa0ce6c94f1da72f1996d4abcbe9c316850558ba644aa43f13d'
08.09.22 10:09:28 (-0400) <balena-supervisor> ln: /var/lib/rce: File exists

Hi again, thank you for that information. I suspect something went wrong with the supervisor update process and for some reason an extra container got created. Just stopping the container probably is not enough, as the engine would restart it on reboot. Could you try the following and let me know if the problem recurs?

systemctl stop balena supervisor
# Stop and remove all running supervisor containers shown on `balena ps -a`
balena stop 175e0217fe3a 7e29310a1ed3  | xargs balena rm
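# If the engine recreates a container before it can be removed, force-removing
# every container with "supervisor" in its name should also work. This is only
# a sketch; it assumes the stray containers keep the naming shown in your
# `balena ps` output above.
balena ps -aq --filter "name=supervisor" | xargs balena rm -f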
# Restart the supervisor service
systemctl start balena-supervisor && journalctl -u balena-supervisor -a --follow

The journalctl command at the end will also stream a set of supervisor logs. Could you share those too? If the problem recurs, the logs will give me more context to figure out what is going on.

Thanks again

Doesn't seem to be working. Both supervisors seem to restart.

systemctl stop balena supervisor
Failed to stop supervisor.service: Unit supervisor.service not loaded.
Warning: Stopping balena.service, but it can still be activated by:
  balena-engine.socket

balena stop 175e0217fe3a 701b64187e2d  | xargs balena rm
175e0217fe3a
701b64187e2d

systemctl start balena-supervisor && journalctl -u balena-supervisor -a --follow
Sep 09 17:27:40 d8aefd6 balena-supervisor[3685]: [error]   Error on docker event: Error: connect ECONNREFUSED /var/run/balena-engine.sock
Sep 09 17:27:40 d8aefd6 balena-supervisor[3685]: [error]         at PipeConnectWrap.afterConnect [as oncomplete] (net.js:1144:16) Error: connect ECONNREFUSED /var/run/balena-engine.sock
Sep 09 17:27:40 d8aefd6 balena-supervisor[3685]: [error]       at PipeConnectWrap.afterConnect [as oncomplete] (net.js:1144:16)
Sep 09 17:27:48 d8aefd6 systemd[1]: balena-supervisor.service: Main process exited, code=exited, status=137/n/a
Sep 09 17:27:48 d8aefd6 systemd[1]: balena-supervisor.service: Failed with result 'exit-code'.
Sep 09 17:28:16 d8aefd6 balena-supervisor[13182]: Error response from daemon: No such container: resin_supervisor
Sep 09 17:28:38 d8aefd6 balena-supervisor[13253]: balena_supervisor
Sep 09 17:28:38 d8aefd6 balena-supervisor[14182]: active
Sep 09 17:28:42 d8aefd6 balena-supervisor[14312]: Template parsing error: template: :1:2: executing "" at <.Image>: map has no entry for key "Image"
Sep 09 17:28:44 d8aefd6 balena-supervisor[14470]: WARNING: The requested image's platform (linux/arm) does not match the detected host platform (linux/arm64/v8) and no specific platform was requested
Sep 09 17:28:55 d8aefd6 balena-supervisor[14470]: [info]    Supervisor v14.0.15 starting up...
Sep 09 17:28:56 d8aefd6 balena-supervisor[14470]: [info]    Setting host to discoverable
Sep 09 17:28:56 d8aefd6 balena-supervisor[14470]: [warn]    Invalid firewall mode: . Reverting to state: off
Sep 09 17:28:56 d8aefd6 balena-supervisor[14470]: [info]    Applying firewall mode: off
Sep 09 17:28:56 d8aefd6 balena-supervisor[14470]: [debug]   Starting systemd unit: avahi-daemon.service
Sep 09 17:28:56 d8aefd6 balena-supervisor[14470]: [debug]   Starting systemd unit: avahi-daemon.socket
Sep 09 17:28:56 d8aefd6 balena-supervisor[14470]: [debug]   Starting logging infrastructure
Sep 09 17:28:56 d8aefd6 balena-supervisor[14470]: [info]    Starting firewall
Sep 09 17:28:56 d8aefd6 balena-supervisor[14470]: [debug]   Performing database cleanup for container log timestamps
Sep 09 17:28:56 d8aefd6 balena-supervisor[14470]: [success] Firewall mode applied
Sep 09 17:28:56 d8aefd6 balena-supervisor[14470]: [debug]   Starting api binder
Sep 09 17:28:56 d8aefd6 balena-supervisor[14470]: [info]    Previous engine snapshot was not stored. Skipping cleanup.
Sep 09 17:28:56 d8aefd6 balena-supervisor[14470]: [debug]   Handling of local mode switch is completed
Sep 09 17:28:57 d8aefd6 balena-supervisor[14470]: (node:1) [DEP0005] DeprecationWarning: Buffer() is deprecated due to security and usability issues. Please use the Buffer.alloc(), Buffer.allocUnsafe(), or Buffer.from() methods instead.
Sep 09 17:28:57 d8aefd6 balena-supervisor[14470]: [debug]   Spawning journald with: chroot  /mnt/root journalctl -a -S 2022-09-09 17:21:04 -o json CONTAINER_ID_FULL=4c2c6e2491bfa7ec69a1d48b46abb874fde0b0def2c14d2827d325eb774361fa
Sep 09 17:28:57 d8aefd6 balena-supervisor[14470]: [info]    API Binder bound to: https://api.balena-cloud.com/v6/
Sep 09 17:28:57 d8aefd6 balena-supervisor[14470]: [event]   Event: Supervisor start {}
Sep 09 17:28:58 d8aefd6 balena-supervisor[14470]: [debug]   Spawning journald with: chroot  /mnt/root journalctl -a -S 2022-09-09 17:16:19 -o json CONTAINER_ID_FULL=d8fd35a8b24c1c1a44884e582400ed278f8a5958fcccad5de0a63e457888bb86
Sep 09 17:28:59 d8aefd6 balena-supervisor[14470]: [debug]   Spawning journald with: chroot  /mnt/root journalctl -a -S 2022-09-09 17:20:59 -o json CONTAINER_ID_FULL=35b618ad4e0b7616ca1df6d366964cf80320ab1b428f0e83571b109cd1a0ea7d
Sep 09 17:28:59 d8aefd6 balena-supervisor[14470]: [debug]   Spawning journald with: chroot  /mnt/root journalctl -a -S 2022-09-09 17:16:21 -o json CONTAINER_ID_FULL=c6a27fb1919bea78e948ff7e9179fab96cf95b63874cb722e728376ea2565511
Sep 09 17:28:59 d8aefd6 balena-supervisor[14470]: [debug]   Connectivity check enabled: true
Sep 09 17:28:59 d8aefd6 balena-supervisor[14470]: [debug]   Starting periodic check for IP addresses
Sep 09 17:28:59 d8aefd6 balena-supervisor[14470]: [info]    Reporting initial state, supervisor version and API info
Sep 09 17:28:59 d8aefd6 balena-supervisor[14470]: [debug]   Skipping preloading
Sep 09 17:28:59 d8aefd6 balena-supervisor[14470]: [info]    Starting API server
Sep 09 17:28:59 d8aefd6 balena-supervisor[14470]: [info]    Supervisor API successfully started on port 48484
Sep 09 17:28:59 d8aefd6 balena-supervisor[14470]: [debug]   VPN status path exists.
Sep 09 17:28:59 d8aefd6 balena-supervisor[14470]: [info]    Applying target state
Sep 09 17:28:59 d8aefd6 balena-supervisor[14470]: [debug]   Ensuring device is provisioned
Sep 09 17:28:59 d8aefd6 balena-supervisor[14470]: [info]    VPN connection is active.
Sep 09 17:29:00 d8aefd6 balena-supervisor[14470]: [info]    Waiting for connectivity...
Sep 09 17:29:00 d8aefd6 balena-supervisor[14470]: [debug]   Found unmanaged or anonymous Volume: 2ff9bd50a1ce3e26c7ff82d24b94625dd2e252b026ee1c778d674a2194a9fdc2
Sep 09 17:29:00 d8aefd6 balena-supervisor[14470]: [debug]   Found unmanaged or anonymous Volume: 978f57378f327e2885650d8c10a3a8b83da0af05fe20c84cc08c91c8dd0468c1
Sep 09 17:29:00 d8aefd6 balena-supervisor[14470]: [debug]   Starting current state report
Sep 09 17:29:00 d8aefd6 balena-supervisor[14470]: [debug]   Starting target state poll
Sep 09 17:29:00 d8aefd6 balena-supervisor[14470]: [debug]   Spawning journald with: chroot  /mnt/root journalctl -a --follow -o json _SYSTEMD_UNIT=balena.service
Sep 09 17:29:01 d8aefd6 balena-supervisor[14470]: [event]   Event: Service install {"service":{"appId":1667445,"serviceId":1242822,"serviceName":"balena-supervisor","commit":"a3d0d0936183278f66c21db317ea4d93","releaseId":2272141}}
Sep 09 17:29:01 d8aefd6 balena-supervisor[14470]: [info]    Reported current state to the cloud
Sep 09 17:29:02 d8aefd6 balena-supervisor[14470]: [event]   Event: Service installed {"service":{"appId":1667445,"serviceId":1242822,"serviceName":"balena-supervisor","commit":"a3d0d0936183278f66c21db317ea4d93","releaseId":2272141}}
Sep 09 17:29:02 d8aefd6 balena-supervisor[14470]: [event]   Event: Service start {"service":{"appId":1667445,"serviceId":1242822,"serviceName":"balena-supervisor","commit":"a3d0d0936183278f66c21db317ea4d93","releaseId":2272141}}
Sep 09 17:29:04 d8aefd6 balena-supervisor[14470]: [event]   Event: Service started {"service":{"appId":1667445,"serviceId":1242822,"serviceName":"balena-supervisor","commit":"a3d0d0936183278f66c21db317ea4d93","releaseId":2272141}}
Sep 09 17:29:04 d8aefd6 balena-supervisor[14470]: [debug]   Spawning journald with: chroot  /mnt/root journalctl -a -S 2022-09-09 17:29:04 -o json CONTAINER_ID_FULL=29e10864e2eee228c882047888e06ea2df2d47a33b4b434d1fe5ac9e7742cfeb
Sep 09 17:29:04 d8aefd6 balena-supervisor[14470]: [debug]   Found unmanaged or anonymous Volume: ac337648f5a1f4e8e779a8cfc87148c197841060ec139ef7ab49e2101da357d8
Sep 09 17:29:04 d8aefd6 balena-supervisor[14470]: [debug]   Found unmanaged or anonymous Volume: 2ff9bd50a1ce3e26c7ff82d24b94625dd2e252b026ee1c778d674a2194a9fdc2
Sep 09 17:29:04 d8aefd6 balena-supervisor[14470]: [debug]   Found unmanaged or anonymous Volume: 978f57378f327e2885650d8c10a3a8b83da0af05fe20c84cc08c91c8dd0468c1
Sep 09 17:29:05 d8aefd6 balena-supervisor[14470]: [debug]   Finished applying target state
Sep 09 17:29:05 d8aefd6 balena-supervisor[14470]: [success] Device state apply success
Sep 09 17:29:05 d8aefd6 balena-supervisor[14470]: [info]    Applying target state
Sep 09 17:29:05 d8aefd6 balena-supervisor[14470]: [debug]   Found unmanaged or anonymous Volume: 2ff9bd50a1ce3e26c7ff82d24b94625dd2e252b026ee1c778d674a2194a9fdc2
Sep 09 17:29:05 d8aefd6 balena-supervisor[14470]: [debug]   Found unmanaged or anonymous Volume: 978f57378f327e2885650d8c10a3a8b83da0af05fe20c84cc08c91c8dd0468c1
Sep 09 17:29:05 d8aefd6 balena-supervisor[14470]: [debug]   Found unmanaged or anonymous Volume: ac337648f5a1f4e8e779a8cfc87148c197841060ec139ef7ab49e2101da357d8
Sep 09 17:29:05 d8aefd6 balena-supervisor[14470]: [debug]   Finished applying target state
Sep 09 17:29:05 d8aefd6 balena-supervisor[14470]: [success] Device state apply success
Sep 09 17:29:10 d8aefd6 balena-supervisor[14470]: [info]    Internet Connectivity: OK
Sep 09 17:29:10 d8aefd6 balena-supervisor[14470]: [event]   Event: Service exit {"service":{"appId":1667445,"serviceId":1242822,"serviceName":"balena-supervisor","commit":"a3d0d0936183278f66c21db317ea4d93","releaseId":2272141}}
Sep 09 17:29:12 d8aefd6 balena-supervisor[14470]: [event]   Event: Service restart {"service":{"appId":1667445,"serviceId":1242822,"serviceName":"balena-supervisor","commit":"a3d0d0936183278f66c21db317ea4d93","releaseId":2272141}}
Sep 09 17:29:13 d8aefd6 balena-supervisor[14470]: [info]    Reported current state to the cloud
Sep 09 17:29:13 d8aefd6 balena-supervisor[14470]: [event]   Event: Service exit {"service":{"appId":1667445,"serviceId":1242822,"serviceName":"balena-supervisor","commit":"a3d0d0936183278f66c21db317ea4d93","releaseId":2272141}}
Sep 09 17:29:15 d8aefd6 balena-supervisor[14470]: [event]   Event: Service restart {"service":{"appId":1667445,"serviceId":1242822,"serviceName":"balena-supervisor","commit":"a3d0d0936183278f66c21db317ea4d93","releaseId":2272141}}
Sep 09 17:29:16 d8aefd6 balena-supervisor[14470]: [event]   Event: Service exit {"service":{"appId":1667445,"serviceId":1242822,"serviceName":"balena-supervisor","commit":"a3d0d0936183278f66c21db317ea4d93","releaseId":2272141}}
Sep 09 17:29:18 d8aefd6 balena-supervisor[14470]: [event]   Event: Service restart {"service":{"appId":1667445,"serviceId":1242822,"serviceName":"balena-supervisor","commit":"a3d0d0936183278f66c21db317ea4d93","releaseId":2272141}}
Sep 09 17:29:19 d8aefd6 balena-supervisor[14470]: [event]   Event: Service exit {"service":{"appId":1667445,"serviceId":1242822,"serviceName":"balena-supervisor","commit":"a3d0d0936183278f66c21db317ea4d93","releaseId":2272141}}
Sep 09 17:29:22 d8aefd6 balena-supervisor[14470]: [event]   Event: Service restart {"service":{"appId":1667445,"serviceId":1242822,"serviceName":"balena-supervisor","commit":"a3d0d0936183278f66c21db317ea4d93","releaseId":2272141}}
Sep 09 17:29:24 d8aefd6 balena-supervisor[14470]: [event]   Event: Service exit {"service":{"appId":1667445,"serviceId":1242822,"serviceName":"balena-supervisor","commit":"a3d0d0936183278f66c21db317ea4d93","releaseId":2272141}}

Yeah, I was looking at the code and I think I know what it is. Can you share the contents of the file /mnt/boot/device-type.json?

root@d8aefd6:~# cat /mnt/boot/device-type.json
{
	"slug": "raspberrypi3-64",
	"version": 1,
	"aliases": [
		"raspberrypi3-64"
	],
	"name": "Raspberry Pi 3 (using 64bit OS)",
	"arch": "aarch64",
	"state": "RELEASED",
	"imageDownloadAlerts": [
		{
			"type": "warning",
			"message": "The Raspberry Pi 3 is not capable of connecting to 5GHz WiFi networks unless you use an external WiFi adapter that supports it."
		}
	],
	"instructions": [
		"Write the OS file you downloaded to your SD card. We recommend using <a href=\"http://www.etcher.io/\">Etcher</a>.",
		"Insert the freshly burnt SD card into the Raspberry Pi 3 (using 64bit OS).",
		"Connect your Raspberry Pi 3 (using 64bit OS) to the internet, then power it up."
	],
	"gettingStartedLink": {
		"windows": "https://www.balena.io/docs/learn/getting-started/raspberrypi3/nodejs/",
		"osx": "https://www.balena.io/docs/learn/getting-started/raspberrypi3/nodejs/",
		"linux": "https://www.balena.io/docs/learn/getting-started/raspberrypi3/nodejs/"
	},
	"options": [
		{
			"isGroup": true,
			"name": "network",
			"message": "Network",
			"options": [
				{
					"message": "Network Connection",
					"name": "network",
					"type": "list",
					"choices": [
						"ethernet",
						"wifi"
					]
				},
				{
					"message": "Wifi SSID",
					"name": "wifiSsid",
					"type": "text",
					"when": {
						"network": "wifi"
					}
				},
				{
					"message": "Wifi Passphrase",
					"name": "wifiKey",
					"type": "password",
					"when": {
						"network": "wifi"
					}
				}
			]
		},
		{
			"isGroup": true,
			"isCollapsible": true,
			"collapsed": true,
			"name": "advanced",
			"message": "Advanced",
			"options": [
				{
					"message": "Check for updates every X minutes",
					"name": "appUpdatePollInterval",
					"type": "number",
					"min": 10,
					"default": 10
				}
			]
		}
	],
	"yocto": {
		"machine": "raspberrypi3-64",
		"image": "balena-image",
		"fstype": "balenaos-img",
		"version": "yocto-honister",
		"deployArtifact": "balena-image-raspberrypi3-64.balenaos-img",
		"compressed": true
	},
	"configuration": {
		"config": {
			"partition": {
				"primary": 1
			},
			"path": "/config.json"
		}
	},
	"initialization": {
		"options": [
			{
				"message": "Select a drive",
				"type": "drive",
				"name": "drive"
			}
		],
		"operations": [
			{
				"command": "burn"
			}
		]
	}
}

Ok, I think I know what the problem is. The target state for the supervisor given by the API is for the armv7hf architecture, but your device-type.json says the device architecture is aarch64. I assume your fleet is of type armv7hf and you provisioned a Raspberry Pi 4 into it (which I forgot is an option). I'll create a fix for this; here is the issue so you can track progress: Provisioning an aarch64 device into an armv7 architecture causes the target state supervisor to be installed · Issue #2006 · balena-os/balena-supervisor · GitHub
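
For reference, the mismatch should be visible directly on the device by comparing the OS-reported architecture with the supervisor app the API is targeting. This is just a sketch reusing the commands from earlier; the jq paths follow the state you pasted, so tweak them if the format differs.

# Architecture reported by the OS
cat /mnt/boot/device-type.json | jq -r .arch
# Name of the supervisor app in the target state (class "app"), e.g. "armv7hf-supervisor"
curl -H "Authorization: Bearer $(cat /mnt/boot/config.json | jq -r .deviceApiKey)" $(cat /mnt/boot/config.json | jq -r .apiEndpoint)/device/v3/$(cat /mnt/boot/config.json | jq -r .uuid)/state -s | jq -r '.[].apps[] | select(.class == "app") | .name'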

Interesting. The device is actually a Pi 3, but we are running the 64-bit OS on it. There is no option to select a device type of Raspberry Pi 3 (64-bit OS) when changing the device type in the dashboard, only when creating a new device.

We have done this before (running the 64-bit OS on the regular Pi 3 device type) and hadn't run into issues. In the future, should I provision a brand new device with the correct device type?

Thanks for your help!

raspberrypi3-64 will also report an aarch64 architecture. I don't think you really need to do anything; aarch64 devices can be provisioned into armv7 fleets because aarch64 is backwards compatible.

The extra supervisor is annoying but it should be harmless. I’ll create a supervisor fix in the next few days and when you upgrade to it, the problem should go away.

Hi again, I am working on a fix for this, but I have not been able to replicate the issue in order to test it. No matter what I do, even with a raspberrypi3-64 device in a 32-bit fleet, the API always gives me the configuration for the right architecture.

Can you give me more details about your provisioning process? How did you get the image? How soon after the device appeared on the dashboard did you update to the new supervisor? Had you done a host OS update on this device before performing the supervisor update?

Anything you can think of that could help me replicate the issue will be useful.

Thank you!

Hi,
Based on our history records, I can see that the device was provisioned on 2021-06-04 as a raspberrypi3, which should also be how it shows up in the dashboard, rather than the raspberrypi3-64 that its current device-type.json indicates.
Can you provide any further information about the provisioning process that you followed or any information in general that might be related with this?

Kind regards,
Thodoris

Hi again, I just wanted to let you know that we have released a fix for the issue you reported in Supervisor v14.0.20. Please update to that version and let us know if the fix works for you.
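
Once the update has gone through, grepping the supervisor unit logs for the startup banner (the same "Supervisor vX.Y.Z starting up..." line visible in the logs you shared earlier) should confirm the running version:

journalctl -u balena-supervisor -a --no-pager | grep "Supervisor v" | tail -n 1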

It would also be very beneficial for understanding the root cause of the issue if you could reply to the questions my colleague asked. Thank you again.