Pikrellcam usually creates an nginx web server on localhost on port 80. However, while using the balenaOS container I cannot access this locally hosted web page.
My Dockerfile is set up like this:
FROM balenalib/%%BALENA_MACHINE_NAME%%-debian-node:10-buster
COPY /pikrellcam /home/pi/pikrellcam
WORKDIR /home/pi/pikrellcam
RUN bash install-pikrellcam.sh
CMD ["./pikrellcam"]
Is there anything obvious I need to do to access the web page? I have a feeling it is something to do with port forwarding, but I am not an expert.
Just a few clarifying questions first, as you’ve said “I have managed to set up a pi with Debian”: did you do a bare-metal install of Debian and build on top of that? Or did you install balenaOS as a download from balenaCloud (and now have the device showing online in the balenaCloud dashboard)?
Next, are you running this container as part of a multicontainer application (using a docker-compose.yml file), or as a single standalone container? If you’re just running one container on its own, you shouldn’t have to do anything to expose port 80 externally to the device. You can check here for a basic example of a webserver running on port 80, and as you can see no special consideration is needed to get it to work.
However, if the container is part of a multicontainer application, you’ll need to add port 80 to the list of ports for that service within the compose file. See here for an example of that, and a rough sketch below.
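As a rough illustration (the service name and build path here are placeholders, not taken from your project), the relevant part of the docker-compose.yml would look something like this:

version: '2.1'
services:
  pikrellcam:
    build: ./pikrellcam
    ports:
      - "80:80"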
Aside from all this, the project looks great, and if you can get it up and running we’d love to see it listed on balenaHub.
I never got it working as a single-container application, even when exposing port 80. I managed to run a multicontainer application which seemed to expose the port (although I haven’t tested extensively, so I can’t be certain). I then got an error regarding the Raspberry Pi camera, so I need to work on that; I found some forums online with similar errors, so I should be able to sort it. I’ve been moving house but will get back to you when I get a chance to look at it. I appreciate your help.
I then got an error regarding the Raspberry Pi camera, so I need to work on that; I found some forums online with similar errors, so I should be able to sort it
When working with the camera, make sure to set the proper settings. You can find out more here.
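For reference, and as a rough sketch only (variable names are shown with the newer BALENA_ prefix, older releases use RESIN_, and the values are typical rather than prescribed), the Raspberry Pi camera is normally enabled through device configuration variables in the balenaCloud dashboard, which end up in the device’s config.txt:

BALENA_HOST_CONFIG_start_x = 1
BALENA_HOST_CONFIG_gpu_mem = 128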
Thanks for your help. I have managed to get a version of Pikrellcam working in a container with the variables you listed, and I can run the Pikrellcam install script in the dockerfile.template and then start Pikrellcam in the container with CMD. However… the Pikrellcam install script installs nginx and PHP 7.3 to serve the web application, and to get the Pikrellcam web server running in the container I have to enter the container terminal and start both of these services manually.
I understand it is best practice to split such elements out into different containers. As such I have tried separating the PHP installation into another container, but I am getting a bit stuck on how to go about this. I have set up another container with the php image “php:7.3-fpm”, but I don’t know how to connect this to the Pikrellcam container I have set up. I have tried googling how to do this but am very lost.
Hello,
You’re right that this is best practice, but unfortunately, with software that hasn’t been developed with containers in mind, it’s often not worth the effort.
As the main problem you seem to have now is that you need to start multiple applications inside your container, I would suggest using something like this:
You set this as your ENTRYPOINT in your Dockerfile and have a small config that defines what programs to keep running.
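The link isn’t reproduced in this thread, but it appears to be to Supervisord; as a minimal sketch only (program names, paths and the php-fpm binary name are assumptions, not taken from the Pikrellcam install), the config could look something like:

# /etc/supervisor/supervisord.conf (sketch)
[supervisord]
nodaemon=true

[program:nginx]
command=nginx -g "daemon off;"

[program:php-fpm]
command=php-fpm7.3 --nodaemonize

[program:pikrellcam]
command=/home/pi/pikrellcam/pikrellcam
user=pi

with the Dockerfile then ending in something like ENTRYPOINT ["supervisord", "-c", "/etc/supervisor/supervisord.conf"].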
Hope this helps!
Thanks so much, really helpful direction. I’ve got everything working now using a bash script which runs on CMD (I thought that would be more lightweight than Supervisord). I used the advice on the Docker website here: multiservicecontainer-docker
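For anyone following this thread, the Docker docs pattern being referred to is roughly a wrapper script started by CMD; here is a sketch (service names and paths are assumptions based on this thread, not the actual script used):

#!/bin/bash
# start.sh - sketch of a multi-service wrapper (assumed names and paths)
set -e

# Start the services the Pikrellcam install script depends on
service php7.3-fpm start
service nginx start

# Run pikrellcam in the foreground as user pi so the container stays up
exec su pi -c /home/pi/pikrellcam/pikrellcam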
I have one last question. Due to permission restrictions I need to run the ./pikrellcam script as user pi, and when I run as user pi I have issues with /dev/vchiq permissions. After reading online I have tried adding pi and www-data to the video group, but this doesn’t work for me. I have successfully got the ./pikrellcam script running by granting everyone permission to the vchiq device, by running “sudo chmod 777 /dev/vchiq” in my bash script. This works, but I have read that granting all users this permission is bad practice (though I don’t understand this fully).
I have seen on another thread that I might be able to give the pi user /dev/vchiq permissions by creating a udev rule in config.json - Balena udev rule. Could you give me a bit more info on how this works? The Balena page on this doesn’t give too much detail. Is there an easier way to elevate permissions in the docker-compose.yml or dockerfile.template?
Thanks again for your help,
The cloud service is brilliant,
Sam
We do something very similar in our browser block:
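(The snippet from the browser block isn’t reproduced in this thread; the build-time approach there is roughly to ship a udev rule that hands /dev/vchiq to the video group. The exact rule below is an assumption, not copied from the block:)

# Dockerfile sketch: enable balena's in-container udev and add a rule for vchiq
ENV UDEV=1
RUN echo 'SUBSYSTEM=="vchiq",GROUP="video",MODE="0660"' > /etc/udev/rules.d/99-vchiq.rules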
If you do want to do it at runtime, though, you’ll need to put your commands in a bash script and run that as part of the entrypoint to the container. The Dockerfile builds the container image (hence why /dev does not exist for it); a bash script run inside the container once it’s running would be able to access /dev.
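For instance (a sketch only, reusing the assumed start.sh idea from earlier in the thread; the group and paths are assumptions), the entrypoint script could tighten the device permissions instead of chmod 777:

#!/bin/bash
# entrypoint sketch: give the video group access to /dev/vchiq (assumed approach)
set -e

# Hand the camera device to the video group and allow group read/write only
chgrp video /dev/vchiq
chmod 660 /dev/vchiq

# pi has been added to the video group, so it can now open the device
exec su pi -c /home/pi/pikrellcam/pikrellcam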