LogBackend: server responded with status code: 504

Hi,
I am currently trying to set up an internal openBalena instance for testing purposes.
The server setup went pretty smoothly, but I am now stuck at the deploy step.
I followed the Getting started guide; the only change is that I deployed balenaDash instead, using balena deploy myApp --logs --source . --emulated. However, nothing happens on my development RPi3.

The device appears when running balena devices locally:

$ balena devices
ID UUID    DEVICE NAME  DEVICE TYPE  APPLICATION NAME STATUS IS ONLINE SUPERVISOR VERSION OS VERSION           DASHBOARD URL
1  04e43a0 hidden-dream raspberrypi3 myApp            Idle   true      9.15.7             balenaOS 2.38.0+rev1 https://dashboard.openbalena.<redacted>/devices/04e43a0f842d3a0914f6d39f5fef042b/summary

The device shows as Idle from the server POV:

$ balena device 04e43a0
== HIDDEN DREAM
ID:                 1
DEVICE TYPE:        raspberrypi3
STATUS:             idle
IS ONLINE:          true
IP ADDRESS:         <redacted>
APPLICATION NAME:   myApp
UUID:               04e43a0f842d3a0914f6d39f5fef042b
COMMIT:             1a1a33c8eed1c99b56e5ae801b163b68
SUPERVISOR VERSION: 9.15.7
OS VERSION:         balenaOS 2.38.0+rev1
DASHBOARD URL:      https://dashboard.openbalena.<redacted>/devices/04e43a0f842d3a0914f6d39f5fef042b/summary

I successfully ssh’d into my development Rpi3 using ssh root@<ip> -p 22222.
Here are the logs from the supervisor, using journalctl -u resin-supervisor -fn100:

root@04e43a0:~# journalctl -u resin-supervisor -fn100
-- Logs begin at Wed 2019-06-12 18:01:15 UTC. --
Jul 25 16:17:44 04e43a0 systemd[1]: Starting Resin supervisor...
Jul 25 16:17:45 04e43a0 resin-supervisor[1615]: resin_supervisor
Jul 25 16:17:45 04e43a0 resin-supervisor[1655]: active
Jul 25 16:17:45 04e43a0 systemd[1]: Started Resin supervisor.
Jul 25 16:17:46 04e43a0 resin-supervisor[1666]: Container config has not changed
Jul 25 16:17:50 04e43a0 resin-supervisor[1666]: Starting system message bus: dbus.
Jul 25 16:17:50 04e43a0 resin-supervisor[1666]:  * Starting Avahi mDNS/DNS-SD Daemon: avahi-daemon
Jul 25 16:17:50 04e43a0 resin-supervisor[1666]:    ...done.
Jul 25 16:18:00 04e43a0 resin-supervisor[1666]: (node:1) [DEP0005] DeprecationWarning: Buffer() is deprecated due to security and usability issues. Please use the Buffer.alloc(), Buffer.allocUnsafe(), or Buffer.from() methods instead.
Jul 25 16:18:00 04e43a0 resin-supervisor[1666]: Starting event tracker
Jul 25 16:18:00 04e43a0 resin-supervisor[1666]: Starting up api binder
Jul 25 16:18:00 04e43a0 resin-supervisor[1666]: Starting logging infrastructure
Jul 25 16:18:00 04e43a0 resin-supervisor[1666]: Event: Supervisor start {}
Jul 25 16:18:00 04e43a0 resin-supervisor[1666]: Performing database cleanup for container log timestamps
Jul 25 16:18:01 04e43a0 resin-supervisor[1666]: Connectivity check enabled: true
Jul 25 16:18:01 04e43a0 resin-supervisor[1666]: Starting periodic check for IP addresses
Jul 25 16:18:01 04e43a0 resin-supervisor[1666]: Reporting initial state, supervisor version and API info
Jul 25 16:18:01 04e43a0 resin-supervisor[1666]: VPN status path exists.
Jul 25 16:18:01 04e43a0 resin-supervisor[1666]: Waiting for connectivity...
Jul 25 16:18:02 04e43a0 resin-supervisor[1666]: Skipping preloading
Jul 25 16:18:02 04e43a0 resin-supervisor[1666]: Starting API server
Jul 25 16:18:02 04e43a0 resin-supervisor[1666]: Applying target state
Jul 25 16:18:02 04e43a0 resin-supervisor[1666]: Ensuring device is provisioned
Jul 25 16:18:02 04e43a0 resin-supervisor[1666]: Starting current state report
Jul 25 16:18:02 04e43a0 resin-supervisor[1666]: Starting target state poll
Jul 25 16:18:03 04e43a0 resin-supervisor[1666]: Supervisor API listening on allowed interfaces only
Jul 25 16:18:03 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:18:08 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:18:08 04e43a0 resin-supervisor[1666]: Finished applying target state
Jul 25 16:18:08 04e43a0 resin-supervisor[1666]: Apply success!
Jul 25 16:18:08 04e43a0 resin-supervisor[1666]: Applying target state
Jul 25 16:18:08 04e43a0 resin-supervisor[1666]: Finished applying target state
Jul 25 16:18:08 04e43a0 resin-supervisor[1666]: Apply success!
Jul 25 16:18:10 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:18:11 04e43a0 resin-supervisor[1666]: Internet Connectivity: OK
Jul 25 16:18:14 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:18:17 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:18:21 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:18:24 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:18:30 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:18:33 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:18:42 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:18:44 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:18:50 04e43a0 resin-supervisor[1666]: LogBackend: server responded with status code: 504
Jul 25 16:19:00 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:19:02 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:19:31 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:19:33 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:19:50 04e43a0 resin-supervisor[1666]: LogBackend: server responded with status code: 504
Jul 25 16:20:27 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:20:29 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:21:17 04e43a0 resin-supervisor[1666]: LogBackend: server responded with status code: 504
Jul 25 16:21:33 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:21:35 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:22:22 04e43a0 resin-supervisor[1666]: LogBackend: server responded with status code: 504
Jul 25 16:22:38 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:22:40 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:22:50 04e43a0 resin-supervisor[1666]: Supervisor API: GET /v1/healthy 200 - 31.215 ms
Jul 25 16:23:27 04e43a0 resin-supervisor[1666]: LogBackend: server responded with status code: 504
Jul 25 16:23:43 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:23:45 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:24:32 04e43a0 resin-supervisor[1666]: LogBackend: server responded with status code: 504
Jul 25 16:24:48 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:24:50 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:25:32 04e43a0 resin-supervisor[1666]: LogBackend: server responded with status code: 504
Jul 25 16:25:54 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:25:56 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:26:32 04e43a0 resin-supervisor[1666]: LogBackend: server responded with status code: 504
Jul 25 16:26:59 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:27:02 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:27:33 04e43a0 resin-supervisor[1666]: LogBackend: server responded with status code: 504
Jul 25 16:27:51 04e43a0 resin-supervisor[1666]: Supervisor API: GET /v1/healthy 200 - 9.975 ms
Jul 25 16:27:59 04e43a0 resin-supervisor[1666]: Attempting container log timestamp flush...
Jul 25 16:28:00 04e43a0 resin-supervisor[1666]: Container log timestamp flush complete
Jul 25 16:28:04 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:28:07 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:28:33 04e43a0 resin-supervisor[1666]: LogBackend: server responded with status code: 504
Jul 25 16:29:09 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:29:11 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:29:33 04e43a0 resin-supervisor[1666]: LogBackend: server responded with status code: 504
Jul 25 16:30:14 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:30:17 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:30:33 04e43a0 resin-supervisor[1666]: LogBackend: server responded with status code: 504
Jul 25 16:31:19 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:31:21 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:31:33 04e43a0 resin-supervisor[1666]: LogBackend: server responded with status code: 504
Jul 25 16:32:24 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:32:27 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:32:33 04e43a0 resin-supervisor[1666]: LogBackend: server responded with status code: 504
Jul 25 16:32:52 04e43a0 resin-supervisor[1666]: Supervisor API: GET /v1/healthy 200 - 9.555 ms
Jul 25 16:33:30 04e43a0 resin-supervisor[1666]: Event: Service restart {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:33:32 04e43a0 resin-supervisor[1666]: Event: Service exit {"service":{"appId":1,"serviceId":1,"serviceName":"wpe","releaseId":1}}
Jul 25 16:33:34 04e43a0 resin-supervisor[1666]: LogBackend: server responded with status code: 504

I tried looking for this 504 error in the openBalena server logs but can’t seem to find anything.
Any help is greatly appreciated.
Thanks,
Erwan

Hey there! The logs show that the supervisor is restarting the service over and over. Can you try something simpler to start with, like our simple Node.js server (https://github.com/balena-io-projects/simple-server-node) and see if that works fine?

It works! The container runs smoothly, so the 504 must be unrelated.
I will now try deploying balenaDash again and check the docker logs.

Glad to hear that. Let us know how it goes!

Ok, so the wpe container seems to be crashing 2 seconds after starting.

Should I open another thread / post on the balenaDash issue tracker instead?

root@04e43a0:~# balena ps
CONTAINER ID        IMAGE                               COMMAND                  CREATED             STATUS                            PORTS                    NAMES
4c37d0c2ec82        7f3c31c8e200                        "/usr/bin/entry.sh n…"   14 minutes ago      Up 5 minutes                      0.0.0.0:8888->8888/tcp   photos_13_7
be199c322107        f49e9bc5de33                        "/wpe-init"              14 minutes ago      Restarting (0) 7 seconds ago                               wpe_11_7
007858779e03        05335daf5b45                        "/usr/bin/entry.sh /…"   14 minutes ago      Up 5 minutes                                               scheduler_10_7
0ce95d8ee287        balena/armv7hf-supervisor:v9.15.7   "./entry.sh"             4 weeks ago         Up 4 minutes (health: starting)                            resin_supervisor
root@04e43a0:~# balena logs wpe_11_7
2019/07/25 18:58:29 Command is 'cog'
2019/07/25 18:58:29 Starting listening on port 8080
2019/07/25 18:58:36 Command is 'cog'
2019/07/25 18:58:36 Starting listening on port 8080
2019/07/25 18:58:43 Command is 'cog'
2019/07/25 18:58:43 Starting listening on port 8080
2019/07/25 18:58:49 Command is 'cog'
2019/07/25 18:58:49 Starting listening on port 8080
2019/07/25 18:58:58 Command is 'cog'
2019/07/25 18:58:58 Starting listening on port 8080
2019/07/25 18:59:09 Command is 'cog'
2019/07/25 18:59:09 Starting listening on port 8080
2019/07/25 18:59:27 Command is 'cog'
2019/07/25 18:59:27 Starting listening on port 8080
2019/07/25 18:59:58 Command is 'cog'
2019/07/25 18:59:58 Starting listening on port 8080
2019/07/25 19:00:54 Command is 'cog'
2019/07/25 19:00:54 Starting listening on port 8080

Hi @edorgeville, in most cases where the WPE container is continually crashing, it's because the gpu_mem split is not correct. How much RAM have you given the GPU on the device?

According to the CPU info:
a02082 → 1GB

I used the following commands to set the GPU RAM according to the configuration section:

balena env add RESIN_HOST_CONFIG_gpu_mem_256 128 --application myApp
balena env add RESIN_HOST_CONFIG_gpu_mem_512 196 --application myApp
balena env add RESIN_HOST_CONFIG_gpu_mem_1024 396 --application myApp

Strangely enough, no env seems to be set:

$ balena envs --application myApp    
No environment variables found

If you need help, don't hesitate in contacting our support forums at
https://forums.balena.io

For bug reports or feature requests, have a look at the GitHub issues or
create a new one at: https://github.com/balena-io/balena-cli/issues/

Interesting. Do you also define just the standard RESIN_HOST_CONFIG_gpu_mem? I think it is by default set to 16 and might override the others (although I don't think so). The other thing to do is to check the config.txt on the device. You should be able to find it at /mnt/boot/config.txt, and it should confirm whether the envs are actually set.

They indeed seem to be set.

root@04e43a0:~# cat /mnt/boot/config.txt
gpu_mem=16
gpu_mem_1024=396
gpu_mem_256=128
gpu_mem_512=196
disable_splash=1
dtparam=i2c_arm=on
dtparam=spi=on
dtparam=audio=on
enable_uart=1
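(For what it's worth, my understanding — an assumption on my part, not something from the balena docs — is that the Pi firmware prefers a size-specific gpu_mem_<total RAM> entry over the plain gpu_mem one, so gpu_mem=16 next to gpu_mem_1024=396 should still give a 1GB board the 396MB split. A toy sketch of that selection logic:)

```shell
# Toy sketch (my assumption of the firmware's selection logic, not an
# official reference): a size-specific gpu_mem_<total>=N entry wins over
# the plain gpu_mem=N entry for a board with that much RAM.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
gpu_mem=16
gpu_mem_1024=396
gpu_mem_256=128
gpu_mem_512=196
EOF
total_mb=1024  # Pi 3 revision a02082 has 1GB of RAM
specific=$(sed -n "s/^gpu_mem_${total_mb}=//p" "$cfg")
generic=$(sed -n 's/^gpu_mem=//p' "$cfg")
echo "effective gpu_mem: ${specific:-${generic:-64}}MB"  # prints 396MB here
rm -f "$cfg"
```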

The following didn’t change anything after 2 minutes:

$ balena env add RESIN_HOST_CONFIG_gpu_mem 396 --application myApp

This one did change it instantly:

$ balena env add RESIN_HOST_CONFIG_gpu_mem 396 --device 04e43a0
root@04e43a0:~# cat /mnt/boot/config.txt
gpu_mem=396
gpu_mem_1024=396
gpu_mem_256=128
gpu_mem_512=196
disable_splash=1
dtparam=i2c_arm=on
dtparam=spi=on
dtparam=audio=on
enable_uart=1

However, the container keeps crashing, even after a full power cycle.

root@04e43a0:~# balena logs -f wpe_11_7
2019/07/25 19:30:03 Command is 'cog'
2019/07/25 19:30:03 Starting listening on port 8080
2019/07/25 19:30:10 Command is 'cog'
2019/07/25 19:30:10 Starting listening on port 8080
2019/07/25 19:30:16 Command is 'cog'
2019/07/25 19:30:16 Starting listening on port 8080
2019/07/25 19:30:23 Command is 'cog'
2019/07/25 19:30:23 Starting listening on port 8080
2019/07/25 19:30:29 Command is 'cog'
2019/07/25 19:30:29 Starting listening on port 8080
2019/07/25 19:30:38 Command is 'cog'
2019/07/25 19:30:38 Starting listening on port 8080

Ah, I would guess that balena env add RESIN_HOST_CONFIG_gpu_mem 396 --application myApp is setting the application-wide variable, but since the device already has RESIN_HOST_CONFIG_gpu_mem set at the per-device level, it doesn't get updated, because per-device env vars override fleet-level env vars.
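That masking behaviour can be mimicked in plain shell (a toy illustration of the precedence, not the actual supervisor code): while a per-device value exists, it shadows the fleet-level one.

```shell
# Toy illustration (not balena code): a per-device value, while set,
# masks the fleet-level value when computing the effective config.
fleet_gpu_mem=396        # set via: balena env add ... --application myApp
device_gpu_mem=16        # hypothetical stale per-device override
effective=${device_gpu_mem:-$fleet_gpu_mem}
echo "effective: $effective"   # device-level wins -> 16

unset device_gpu_mem           # analogous to removing the per-device var
effective=${device_gpu_mem:-$fleet_gpu_mem}
echo "effective: $effective"   # fleet-level applies -> 396
```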

I think it might be worth adding an infinite loop to the wpe-init script in the WPE service and then trying to launch the process manually from within the container. Launch just wpe without tohora, to see if we can get logs from the wpe process.

I added the following to line 25 of the script:

while :
do
	echo "Infinite loop"
	sleep 1
done

Then attached to the running container:

root@04e43a0:~# balena exec -i -t wpe_14_8 /bin/bash
bash-4.4# ./tohora 8080 cog
2019/07/25 19:52:31 Command is 'cog'
2019/07/25 19:52:31 Starting listening on port 8080
2019/07/25 19:52:34 Responded with home page

I can then browse to http://04e43a0.local:8080/ and set the URL; everything works!

Interesting, that is weird. I wonder if somehow there is no PID 1 process staying alive long enough, since the tohora process is started as a background process. Could you try putting the infinite loop right at the end of the script?
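If that theory is right, one fix sketch (my assumption, not the actual wpe-init script) is to wait on the backgrounded child instead of letting the init script fall off the end and exit:

```shell
# Sketch: keep the container's init script (PID 1) alive by waiting on
# the backgrounded child instead of exiting. `sleep 1` stands in for
# `./tohora 8080 cog &` here.
sleep 1 &
child=$!
wait "$child"
status=$?
echo "child $child exited with status $status"
```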

This worked.

I also realized that I hadn’t set the WPE_URL env variable… with it unset, if [[ ! -z ${WPE_URL+x} ]] evaluates to false, the last block is skipped, and the script simply ends.
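For anyone hitting this later, the ${WPE_URL+x} expansion distinguishes "unset" from "set, even if empty" — a quick demo:

```shell
# Demo of the ${VAR+x} "is this variable set?" idiom used in wpe-init:
# it expands to "x" when the variable is set (even to an empty string)
# and to nothing when it is unset.
unset WPE_URL
if [ -n "${WPE_URL+x}" ]; then echo "WPE_URL is set"; else echo "WPE_URL is unset"; fi
# -> WPE_URL is unset

WPE_URL=""
if [ -n "${WPE_URL+x}" ]; then echo "WPE_URL is set"; else echo "WPE_URL is unset"; fi
# -> WPE_URL is set
```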

I reverted all my changes and added the variable. Everything works once again! I guess this is an oversight from when tohora was added?

$ balena env add WPE_URL "https://www.youtube.com/embed/hHW1oY26kxQ?autoplay=1" --device 04e43a0

Anyway, case pretty much closed. Thanks a lot for your help ! :pray:

Ah!! Awesome, glad we got to the bottom of it. Yeah, I think this must have been an oversight. Would you mind opening an issue on the repo so the team can get that fixed? :slight_smile:

Done!

Thanks so much and thanks for working through it with us.

Thank you @edorgeville for letting us know about the issue.

The issue was fixed and closed here.

One last thing!

I needed to add the --config flag, as described in the CLI docs:

$ balena envs --application myApp --config

I noticed this later while enabling local mode :slight_smile: