Access nvarguscamerasrc CSI camera with OpenCV

Hello,

We are currently migrating from the Raspberry Pi CM3+ to the Jetson Nano 4GB, and I’m the poor guy who is responsible for the integration. I’ve gotten a bit familiar with the Jetson Nano, L4T and GStreamer over the past few days, but I’m still WAY in over my head.

I’m trying to build a development Docker container that can access one CSI camera (IMX219) and stream the images into OpenCV via GStreamer. To make things harder, we are using Node and have been relying on the OpenCV4nodejs library, which is a bit of a pain to install.
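
For context, the pipeline I ultimately want to hand to OpenCV looks roughly like the one below. This is only a sketch of the goal, not something that works yet: the 1280x720@30 caps are just the IMX219 mode I intend to use, and the final fakesink would be replaced by an appsink when the string is passed to cv::VideoCapture / opencv4nodejs.

gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1280,height=720,framerate=30/1' ! nvvidconv ! 'video/x-raw,format=BGRx' ! videoconvert ! 'video/x-raw,format=BGR' ! fakesink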

I am using the following Dockerfile (a bit of a Frankenstein file, but primarily based on this example).

My host L4T version is 32.6.1, so I have selected the BSP binaries to match that version.
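
(For reference, I double-checked the release on the host itself; the command below should print something like "# R32 (release), REVISION: 6.1, ...")

head -n 1 /etc/nv_tegra_release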

FROM balenalib/jetson-nano-ubuntu-node:12-bionic as buildstep

WORKDIR /usr/src/app

# Don't prompt with any configuration questions
ENV DEBIAN_FRONTEND noninteractive

# Install CUDA, CUDA compiler and some utilities
RUN \
    apt-get update && apt-get install -y cuda-toolkit-10-2 cuda-compiler-10-2 \
    lbzip2 xorg-dev \
    cmake wget unzip \
    libgtk2.0-dev \
    libavcodec-dev \
    libgstreamer1.0-dev \
    libgstreamer-plugins-base1.0-dev \
    libjpeg-dev \
    libpng-dev \
    libtiff-dev \
    libdc1394-22-dev --no-install-recommends && \
    echo "/usr/lib/aarch64-linux-gnu/tegra" > /etc/ld.so.conf.d/nvidia-tegra.conf && \
    ldconfig && \
    wget https://github.com/opencv/opencv/archive/4.0.1.zip && \
    unzip 4.0.1.zip && rm 4.0.1.zip

RUN \
    wget https://github.com/opencv/opencv_contrib/archive/4.0.1.zip -O opencv_modules.4.0.1.zip && \
    unzip opencv_modules.4.0.1.zip && rm opencv_modules.4.0.1.zip && \
    export CUDA_HOME=/usr/local/cuda-10.2/ && \
    export LD_LIBRARY_PATH=${CUDA_HOME}/lib64 && \
    PATH=${CUDA_HOME}/bin:${PATH} && export PATH && \
    mkdir -p opencv-4.0.1/build && cd opencv-4.0.1/build && \
    cmake -D CMAKE_BUILD_TYPE=RELEASE \
          -D CMAKE_INSTALL_PREFIX=/usr/local \
          -D WITH_CUDA=ON -D CUDA_ARCH_BIN="5.3" -D CUDA_ARCH_PTX="" \
          -D WITH_GSTREAMER=ON -D WITH_LIBV4L=ON -D WITH_GTK=ON \
          -D BUILD_LIST=cudev,highgui,videoio,cudaimgproc,ximgproc \
          -D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib-4.0.1/modules \
          -D BUILD_TESTS=ON -D BUILD_PERF_TESTS=ON -D BUILD_SAMPLES=ON -D BUILD_EXAMPLES=ON \
          -D BUILD_DOCS=OFF .. && \
    make -j32 && make install && \
    cp /usr/src/app/opencv-4.0.1/build/bin/opencv_version /usr/src/app/ && \
    cp /usr/src/app/opencv-4.0.1/build/bin/example_ximgproc_paillou_demo /usr/src/app/ && \
    cp /usr/src/app/opencv-4.0.1/build/bin/example_ximgproc_fourier_descriptors_demo /usr/src/app/ && \
    cd /usr/src/app/ && rm -rf /usr/src/app/opencv-4.0.1 && \
    mv opencv_contrib-4.0.1/samples/data/corridor.jpg /usr/src/app/ && \
    rm -rf /usr/src/app/opencv_contrib-4.0.1


# Download and install BSP binaries for L4T 32.6.1
RUN apt-get update && apt-get install -y wget tar lbzip2 python3 libegl1 && \
    wget https://developer.nvidia.com/embedded/l4t/r32_release_v6.1/t210/jetson-210_linux_r32.6.1_aarch64.tbz2 && \
    tar xf jetson-210_linux_r32.6.1_aarch64.tbz2 && \
    cd Linux_for_Tegra && \
    sed -i 's/config.tbz2\"/config.tbz2\" --exclude=etc\/hosts --exclude=etc\/hostname/g' apply_binaries.sh && \
    sed -i 's/install --owner=root --group=root \"${QEMU_BIN}\" \"${L4T_ROOTFS_DIR}\/usr\/bin\/\"/#install --owner=root --group=root \"${QEMU_BIN}\" \"${L4T_ROOTFS_DIR}\/usr\/bin\/\"/g' nv_tegra/nv-apply-debs.sh && \
    sed -i 's/LC_ALL=C chroot . mount -t proc none \/proc/ /g' nv_tegra/nv-apply-debs.sh && \
    sed -i 's/umount ${L4T_ROOTFS_DIR}\/proc/ /g' nv_tegra/nv-apply-debs.sh && \
    sed -i 's/chroot . \//  /g' nv_tegra/nv-apply-debs.sh && \
    ./apply_binaries.sh -r / --target-overlay && cd .. && \
    rm -rf jetson-210_linux_r32.6.1_aarch64.tbz2 && \
    rm -rf Linux_for_Tegra && \
    echo "/usr/lib/aarch64-linux-gnu/tegra" > /etc/ld.so.conf.d/nvidia-tegra.conf && ldconfig

ENV UDEV=1
ENV LD_LIBRARY_PATH=/usr/local/lib

# Copy over and install my dependencies (include openCV4Nodejs - not important in this example)
COPY ./package.json ./package.json
RUN npm install

RUN apt-get install -y lbzip2 xorg  network-manager 

RUN apt-get install -y gstreamer1.0-tools gstreamer1.0-alsa gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly gstreamer1.0-libav \
    libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev libgstreamer-plugins-good1.0-dev libgstreamer-plugins-bad1.0-dev

RUN apt-get install -y --no-install-recommends \
    xserver-xorg-input-evdev \
    xinit \
    xfce4 \
    xfce4-terminal \
    x11-xserver-utils \
    dbus-x11 \
    xterm


CMD ["npm", "run", "start"]

I have been starting the container with this command (over an SSH connection, so $DISPLAY is empty):

docker run --rm -it --privileged --net=host --runtime nvidia --ipc=host \
    -v /tmp/.X11-unix/:/tmp/.X11-unix/ -v /tmp/argus_socket:/tmp/argus_socket \
    --cap-add SYS_PTRACE -e DISPLAY=$DISPLAY myContainer bash
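
For completeness, a few sanity checks can be run inside the container to make sure the pieces from the docker run line are actually visible: the argus socket is created by nvargus-daemon on the host, and the IMX219 should show up as a /dev/video* node thanks to --privileged.

ls -l /tmp/argus_socket     # bind-mounted in from the host
ls /dev/video*              # the CSI camera node(s)
echo "DISPLAY=$DISPLAY"     # empty when I come in over SSH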

Inside the container, I’m trying to test out the nvarguscamerasrc setup, but I’m getting the following errors. This is what I get when I run it over SSH:

root@mercury:/# gst-launch-1.0 nvarguscamerasrc ! fakesink
nvbuf_utils: Could not get EGL display connection
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
GST_ARGUS: Creating output stream
(Argus) Error NotSupported: Failed to initialize EGLDisplay (in src/eglutils/EGLUtils.cpp, function getDefaultDisplay(), line 77)
(Argus) Error BadParameter:  (propagating from src/eglstream/FrameConsumerImpl.cpp, function initialize(), line 89)
(Argus) Error BadParameter:  (propagating from src/eglstream/FrameConsumerImpl.cpp, function create(), line 44)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadInitialize:320 Failed to create FrameConsumer
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadFunction:241 (propagating)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, waitRunning:203 Invalid thread state 3
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:783 (propagating)
Got EOS from element "pipeline0".
Execution ended after 0:00:00.448047741
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
root@mercury:/#

If I run the command directly in a terminal on the machine, I get a different error:

root@mercury:/# gst-launch-1.0 nvarguscamerasrc ! fakesink
No protocol specified
No protocol specified
No protocol specified
No protocol specified
nvbuf_utils: Could not get EGL display connection
No protocol specified
No protocol specified
No protocol specified
No protocol specified
nvbuf_utils: Could not get EGL display connection
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
(Argus) Error EndOfFile: Unexpected error in reading socket (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadCore(), line 266)
(Argus) Error EndOfFile: Receive worker failure, notifying 1 waiting threads (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadCore(), line 340)
(Argus) Error InvalidState: Argus client is exiting with 1 outstanding client threads (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadCore(), line 357)
(Argus) Error EndOfFile: Receiving thread terminated with error (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadWrapper(), line 368)
(Argus) Error EndOfFile: Client thread received an error from socket (in src/rpc/socket/client/ClientSocketManager.cpp, function send(), line 145)
(Argus) Error EndOfFile:  (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function dispatch(), line 87)
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:557 No cameras available
Got EOS from element "pipeline0".
Execution ended after 0:00:00.930946733
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
(Argus) Error InvalidState: Receive thread is not running cannot send. (in src/rpc/socket/client/ClientSocketManager.cpp, function send(), line 96)
(Argus) Error InvalidState:  (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function dispatch(), line 87)
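
If I am reading the errors right, the "No protocol specified" lines are the X server rejecting the container's root user, and over SSH nvarguscamerasrc simply finds no EGL display at all. The obvious things to try seem to be along these lines (a sketch only, and it assumes an X server is actually running as :0 on the Nano, which I am not sure about):

# on the Nano's local desktop session: allow local root clients to talk to X
xhost +local:
# inside the container: point at that display and retry
export DISPLAY=:0
gst-launch-1.0 nvarguscamerasrc ! fakesink
# if Argus still reports "No cameras available", restart the camera daemon on the host
sudo systemctl restart nvargus-daemon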

Beyond that, I have no idea how to move forward. Does anybody have any insights? I feel like I’m missing something in my libraries.

Regards,

Hello @chuyzoz, welcome back to the balena forums.

Did you take a look at our opendatacam project for NVIDIA Jetson? Maybe it can help you.

If it doesn’t work, I will try to reproduce the issue!

Let’s stay connected