Webserver basics - RPI3

Hello,

I plan to set up a few Raspberry Pis with pikrellcam (GitHub - billw2/pikrellcam: Raspberry Pi motion vector detection program with OSD web interface). I have managed to set up a Pi with Debian and install/run pikrellcam using balenaOS and balenaHub. I can navigate the Pi from a terminal, so I know it is working.

Pikrellcam usually creates an nginx webserver on localhost on port 80. However, while using the balenaOS container I cannot access this locally hosted webpage.

My Dockerfile is set up like this:
FROM balenalib/%%BALENA_MACHINE_NAME%%-debian-node:10-buster

COPY /pikrellcam /home/pi/pikrellcam

WORKDIR /home/pi/pikrellcam

RUN bash install-pikrellcam.sh

CMD ["./pikrellcam"]

Is there anything obvious I need to do to access the webpage? I have a feeling it is something to do with port forwarding, but I am not an expert.

Thanks for any support, it’s appreciated!
Sam

Hey there @tennisparty and welcome to the forums!

Just a few clarification questions first, as you’ve said “I have managed to set up a pi with Debian”: did you do a bare-metal install of Debian and build on top of that? Or did you install balenaOS as a download from balenaCloud (and now have the device showing online in the balenaCloud dashboard)?

Next, are you running this container as part of a multicontainer application (using a docker-compose.yml file), or as a single standalone container? If you’re just running one container on its own you shouldn’t have to do anything to expose port 80 externally on the device. You can check here for a basic example of a webserver running on port 80, and as you can see no special consideration is needed to get it to work.

However if the container is part of a multicontainer application, you’ll need to add port 80 to the list of ports for that service within the compose file. See here for an example of that.
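For reference, a minimal sketch of what that might look like (the service name, build path, and privileged flag here are assumptions for illustration, not taken from your setup):

version: "2.1"
services:
  pikrellcam:
    build: ./pikrellcam
    privileged: true   # often needed for camera/GPU access on the Pi
    ports:
      - "80:80"        # publish the nginx web UI on the device's port 80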

Aside from all this, the project looks great and if you can get it up and running we’d love to see it listed on balenaHub :slight_smile:

One further idea: you could also run something like netstat -tlp to check that the app is running and actually listening on port 80 :slight_smile:
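For example, from a shell inside the running container (a rough sketch; the exact output will differ):

netstat -tlp
# a healthy nginx would show a line roughly like:
# tcp   0   0   0.0.0.0:80   0.0.0.0:*   LISTEN   <pid>/nginx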

Hello

Just wondering if the suggestions my colleague made were helpful? Were you able to get this working?

Let us know, thanks :slight_smile:

Hi Rahul,

I never got it working as a single-container application, even when exposing port 80. I managed to run a multicontainer application which seemed to expose the port (although I haven’t tested extensively so can’t be certain). I then got an error regarding the Raspberry Pi camera, so I need to work on that; I found some forums online with similar errors, so I should be able to sort it. I’ve been moving house but will get back to you when I get a chance to look at it. I appreciate your help,

Thanks
Sam

Thanks for the update

I then got an error regarding the Raspberry Pi camera, so I need to work on that; I found some forums online with similar errors, so I should be able to sort it

When working with the camera, make sure the proper settings are in place (for the Raspberry Pi camera that typically means enabling the camera interface and allocating GPU memory). You can find out more here.
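For example, the usual Raspberry Pi camera settings are applied as device (or fleet) configuration variables in the balenaCloud dashboard, along these lines (the values shown are the commonly documented ones; adjust for your device):

# Device configuration in the balenaCloud dashboard
BALENA_HOST_CONFIG_start_x=1     # enable the camera interface in config.txt
BALENA_HOST_CONFIG_gpu_mem=128   # reserve enough GPU memory for the camera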

Keep us posted

Thanks

Thanks, I just checked my notes; it was the camera permissions which I had problems with. I am going to try the instructions here: How to Access the Raspberry Pi Camera in Docker.

@tennisparty I just wanted to reach out and see if the link helped you resolve the issue? #pendinguserresponse

Hi all,

Thanks for your help. I have managed to get a version of Pikrellcam working in a container with the variables you listed. I run the Pikrellcam install script from the Dockerfile.template and then start Pikrellcam in the container with CMD. However, the Pikrellcam install script installs nginx and PHP 7.3 to serve the web application, and to get the web interface working I have to enter the container terminal and start both of these services manually.

I understand it is best practice to split such elements out into different containers. I have tried separating the PHP installation into another container, but I am getting a bit stuck on how to go about this. I have set up another container using the php:7.3-fpm image, but I don’t know how to connect it to the Pikrellcam container I have set up. I have tried googling how to do this but am quite lost.

Any help would be much appreciated!

Thanks,
Sam

Hello,
You’re right that this is best practice, but unfortunately with software that hasn’t been developed with containers in mind it’s often not worth the effort.
As the main problem you have now is that you need to start multiple applications inside your container, I would suggest using something like this:

You set this as your ENTRYPOINT in your Dockerfile and have a small config that defines what programs to keep running.
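As a rough sketch, assuming a process supervisor such as supervisord (one common way to do this; the program names and paths below are illustrative, not taken from your install):

# supervisord.conf (illustrative)
[supervisord]
nodaemon=true

[program:nginx]
# run nginx in the foreground so supervisord can manage it
command=nginx -g "daemon off;"

[program:php-fpm]
command=php-fpm7.3 --nodaemonize

[program:pikrellcam]
command=/home/pi/pikrellcam/pikrellcam
user=pi

and in the Dockerfile something like:

ENTRYPOINT ["supervisord", "-c", "/etc/supervisor/supervisord.conf"]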
Hope this helps!

Thanks so much, really helpful direction. I’ve got everything working now using a bash script which runs on CMD (I thought that would be more lightweight than supervisord). I used the advice from the Docker website here: multiservicecontainer-docker.
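For anyone following along, a minimal sketch of that kind of start script (the service names and paths are assumptions, not the exact script used here):

#!/bin/bash
# start.sh - start the services the web UI needs, then run pikrellcam

service php7.3-fpm start
service nginx start

# run pikrellcam in the foreground so the container stays alive
# (and balenaEngine can restart it if it exits)
cd /home/pi/pikrellcam
exec ./pikrellcam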

I have one last question. Due to permission restrictions I need to run the ./pikrellcam script as user pi, and when I do I have issues with /dev/vchiq permissions. After reading online I tried adding pi and www-data to the video group, but this doesn’t work for me. I have successfully got the ./pikrellcam script running by granting everyone permission to the vchiq device, by running “sudo chmod 777 /dev/vchiq” in my bash script. This works, but I have read that granting all users this permission is bad practice (though I don’t fully understand why).

I have seen on another thread that I might be able to give the pi user /dev/vchiq permissions by creating a udev rule in config.json - Balena udev rule. Could you give me a bit more info on how this works? The balena page on this doesn’t give much detail. Is there an easier way to elevate permissions in the docker-compose.yml or Dockerfile.template?

Thanks again for your help,
The cloud service is brilliant,
Sam

Thanks for getting back, and great to see that you were able to get a bash script working for the multi-service piece.

As for your question on udev, it’s more of a subsystem for triggering scripts when external devices are connected, so more dynamic in a way (ref: An introduction to Udev: The Linux subsystem for managing device events | Opensource.com). You can either follow the instructions from the thread you pointed to (/dev/vchiq is too restricted on Raspberry Pi 4 production image - #9) to add a udev rule that changes the permissions to a more palatable 660 (i.e. owner and group can read and write, which is what you want in this case), or just use chmod with 660 instead of 777.

More on the permission numbers here: https://linuxize.com/post/what-does-chmod-777-mean/#permission-number
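A rough sketch of the two options (the rules file name is just an example):

# Option A - udev rule, e.g. in /etc/udev/rules.d/99-vchiq.rules:
# SUBSYSTEM=="vchiq", GROUP="video", MODE="0660"

# Option B - done at container start in your bash script:
chgrp video /dev/vchiq
chmod 660 /dev/vchiq   # owner and the video group get read/write, everyone else gets nothing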

Let us know how it goes.

Thanks, that works: I add my user to the video group, then in my bash script I change the group ownership of /dev/vchiq to the video group.

I can’t change this group ownership within the Dockerfile itself, as I get an error saying the /dev/vchiq file doesn’t exist.

Hey Sam,

We do something very similar in our browser block:

If you do want to do it at runtime, though, you’ll need to put your commands in a bash script and run that as part of the entrypoint to the container. The Dockerfile builds the container image (hence why /dev does not exist at that point); a bash script run once the container has started will be able to access /dev.
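A short sketch of that split (assuming the pi user already exists in your image; the file names are illustrative):

# Dockerfile: build time - /dev/vchiq does not exist yet,
# so only do user/group setup here
RUN usermod -aG video pi
COPY start.sh /usr/src/start.sh
RUN chmod +x /usr/src/start.sh

# runtime - /dev is available once the container starts
ENTRYPOINT ["/usr/src/start.sh"]

where start.sh does the chgrp/chmod on /dev/vchiq (as in the earlier sketch) before launching pikrellcam.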

Hope that helps.
Phil