Install scipy, numpy, and sklearn

Hey! I’m working to build a thought recognition headset that I believe will radically change the way we all use computers.

Problem

Error when adding numpy and scipy to simple-server-python https://github.com/resin-io-projects/simple-server-python

You can reproduce it by adding numpy, scipy, and scikit-learn on new lines after the first line of requirements.txt (see the example below), then committing and pushing, or building with resin.io.
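For reference, the edited requirements.txt then looks roughly like this (the Flask line stands in for whatever the repo already pins on its first line):

Flask
numpy
scipy
scikit-learn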

numpy will install

But then the scipy build will crash with a traceback full of "no lapack/blas resources found" messages:

[Build]        Running from scipy source directory.
[Build]        Traceback (most recent call last):
[Build]          File "<string>", line 1, in <module>
[Build]          File "/tmp/pip-build-HBMnOj/scipy/setup.py", line 418, in <module>
[Build]            setup_package()
[Build]          File "/tmp/pip-build-HBMnOj/scipy/setup.py", line 414, in setup_package
[Build]            setup(**metadata)
[Build]          File "/usr/local/lib/python2.7/site-packages/numpy/distutils/core.py", line 135, in setup
[Build]            config = configuration()
[Build]          File "/tmp/pip-build-HBMnOj/scipy/setup.py", line 336, in configuration
[Build]            config.add_subpackage('scipy')
[Build]          File "/usr/local/lib/python2.7/site-packages/numpy/distutils/misc_util.py", line 1029, in add_subpackage
[Build]            caller_level = 2)
[Build]          File "/usr/local/lib/python2.7/site-packages/numpy/distutils/misc_util.py", line 998, in get_subpackage
[Build]            caller_level = caller_level + 1)
[Build]          File "/usr/local/lib/python2.7/site-packages/numpy/distutils/misc_util.py", line 935, in _get_configuration_from_setup_py
[Build]            config = setup_module.configuration(*args)
[Build]          File "scipy/setup.py", line 15, in configuration
[Build]            config.add_subpackage('linalg')
[Build]          File "/usr/local/lib/python2.7/site-packages/numpy/distutils/misc_util.py", line 1029, in add_subpackage
[Build]            caller_level = 2)
[Build]          File "/usr/local/lib/python2.7/site-packages/numpy/distutils/misc_util.py", line 998, in get_subpackage
[Build]            caller_level = caller_level + 1)
[Build]          File "/usr/local/lib/python2.7/site-packages/numpy/distutils/misc_util.py", line 935, in _get_configuration_from_setup_py
[Build]            config = setup_module.configuration(*args)
[Build]          File "scipy/linalg/setup.py", line 19, in configuration
[Build]            raise NotFoundError('no lapack/blas resources found')
[Build]        numpy.distutils.system_info.NotFoundError: no lapack/blas resources found

I then did some more googling and stumbled upon a Stack Overflow post suggesting the missing BLAS/LAPACK development packages.

So I uncommented some lines in Dockerfile.template for our Python simple server to make them look like this:

RUN apt-get update && apt-get install -yq \
    gfortran libopenblas-dev liblapack-dev && \
    apt-get clean && rm -rf /var/lib/apt/lists/*

But it still fails, now with:

[Build]    Building wheels for collected packages: Flask, numpy, scipy, scikit-learn, itsdangerous, MarkupSafe
[Build]      Running setup.py bdist_wheel for Flask: started
[Build]      Running setup.py bdist_wheel for Flask: finished with status 'done'
[Build]      Stored in directory: /root/.cache/pip/wheels/b6/09/65/5fcf16f74f334a215447c26769e291c41883862fe0dc7c1430
[Build]      Running setup.py bdist_wheel for numpy: started
[Info]     Still working...
[Info]     Still working...
[Build]      Running setup.py bdist_wheel for numpy: still running...
[Info]     Still working...
[Info]     Still working...
[Build]      Running setup.py bdist_wheel for numpy: still running...
[Info]     Still working...

[Info]     Still working...
[Build]      Running setup.py bdist_wheel for numpy: still running...
[Build]      Running setup.py bdist_wheel for numpy: finished with status 'done'
[Build]      Stored in directory: /root/.cache/pip/wheels/37/68/92/25b4aa6b2dbeb1da0829a26db2d64d883df18dbc3456903975
[Build]      Running setup.py bdist_wheel for scipy: started
[Build]      Running setup.py bdist_wheel for scipy: finished with status 'error'
[Build]      Complete output from command /usr/local/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-9RzxNp/scipy/set
up.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdis
t_wheel -d /tmp/tmpTQvJ39pip-wheel- --python-tag cp27:
[Build]      Traceback (most recent call last):
[Build]        File "<string>", line 1, in <module>
[Build]        File "/tmp/pip-build-9RzxNp/scipy/setup.py", line 418, in <module>
[Build]          setup_package()
[Build]        File "/tmp/pip-build-9RzxNp/scipy/setup.py", line 398, in setup_package
[Build]          from numpy.distutils.core import setup
[Build]      ImportError: No module named numpy.distutils.core

Any ideas here? Miniconda is proving really hard to install on the Raspberry Pi too.

I have some ideas that might help. Our Python base images install Python alongside the system's own Python (as we provide more versions of Python than the base Debian does), and there can be some weird interactions: if you're not careful, it's easy to end up unsure whether the system Python or the resin-provided one is being used.
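One quick way to check which interpreter and pip the build actually picks up is to add a temporary diagnostic step to the Dockerfile. A minimal sketch (not from the original thread):

RUN which python && python --version && \
    which pip && pip --version && \
    python -c "import sys; print(sys.prefix)"

If pip installs into /usr/local/lib/python2.7/site-packages while an apt-installed numpy lives under /usr/lib/python2.7/dist-packages, the two installs won't necessarily see each other, which could explain the ImportError above.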

It would help to share the whole Dockerfile, so we can try to reproduce.

In the meantime, there's one possible workaround: just use the Debian base image and the Python it provides. It comes with somewhat older versions of the packages, but should still work. For example, this managed to install scikit-learn for me:

# Use a Debian base image, and install system python later
FROM resin/%%RESIN_MACHINE_NAME%%-debian:stretch

# Set working directory
WORKDIR /usr/src/app

# Install dependencies
RUN    apt-get update \
    && apt-get install -yq \
        g++ \
        python3-dev \
        python3-pip \
        python3-setuptools \
        python3-wheel \
        python3-numpy \
        python3-scipy

# Copy requirements.txt first for better cache on later pushes
COPY ./requirements.txt /requirements.txt

# pip install python deps from requirements.txt on the resin.io build server
RUN pip3 install -r /requirements.txt

# This will copy all files in our root to the working  directory in the container
COPY . ./

# main.py will run when container starts up on the device
CMD ["python","src/main.py"]

and requirements.txt is just:

scikit-learn==0.19.1
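To sanity-check that everything imports, you could add a throwaway step at the end of the Dockerfile (purely illustrative, not part of the original example):

RUN python3 -c "import numpy, scipy, sklearn; print(numpy.__version__, scipy.__version__, sklearn.__version__)"

If that step succeeds during the resin.io build, the scientific stack is usable from python3 in the image.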

I'm not sure what else you'll need for your project afterwards, but this should hopefully make some progress.
Otherwise, share your own Dockerfile or full project and we can try to replicate what you get and tweak it.

@imrehg
Hey thanks so much for the reply!

Here is my Dockerfile:

# base-image for node on any machine using a template variable,
# see more about dockerfile templates here: http://docs.resin.io/deployment/docker-templates/
# and about resin base images here: http://docs.resin.io/runtime/resin-base-images/
FROM resin/raspberrypi3-buildpack-deps:jessie

# Basic requirements
RUN apt-get update --fix-missing && apt-get install -y --no-install-recommends \
	wget curl \
	bzip2 tar unzip \
	ca-certificates \
    libglib2.0-0 libxext6 libsm6 libxrender1

# Compilation
RUN apt-get install -y --no-install-recommends build-essential \
	    make patch cmake \
	    gcc \
	    g++ \
    && rm -rf /var/lib/apt/lists/*

ENV NODE_VERSION 6.11.5

RUN curl -SLO "http://resin-packages.s3.amazonaws.com/node/v$NODE_VERSION/node-v$NODE_VERSION-linux-armv7hf.tar.gz" \
	&& echo "4115b02a8b78a816ed4d4b74b33b834909aae77e18dd2b3212b5ae57aa57ff68  node-v6.11.5-linux-armv7hf.tar.gz" | sha256sum -c - \
	&& tar -xzf "node-v$NODE_VERSION-linux-armv7hf.tar.gz" -C /usr/local --strip-components=1 \
	&& rm "node-v$NODE_VERSION-linux-armv7hf.tar.gz" \
	&& npm config set unsafe-perm true -g --unsafe-perm \
	&& rm -rf /tmp/*

CMD ["echo","'No CMD command was set in Dockerfile! Details about CMD command could be found in Dockerfile Guide section in our Docs. Here's the link: http://docs.resin.io/deployment/dockerfile"]
ENV PYTHONPATH /usr/lib/python2.7/dist-packages:/usr/lib/python2.7/site-packages:$PYTHONPATH

# use apt-get if you need to install dependencies,
# for instance if you need the BLAS/LAPACK libraries and the Python scientific packages, just uncomment the lines below.
#RUN apt-get update && apt-get install -yq \
#    libblas-dev libblas3 \
#    libopenblas-dev gfortran \
#    python-numpy python-scipy python-scikits-learn && \
#    apt-get clean && rm -rf /var/lib/apt/lists/*

# Defines our working directory in container
WORKDIR /usr/src/app

# Copy requirements.txt first for better cache on later pushes
COPY ./requirements.txt /requirements.txt

# Copies the package.json first for better cache on later pushes
COPY package.json package.json

# pip install python deps from requirements.txt on the resin.io build server
RUN pip install -r /requirements.txt

# This install npm dependencies on the resin.io build server,
# making sure to clean up the artifacts it creates in order to reduce the image size.
RUN JOBS=MAX npm install --production --unsafe-perm && npm cache clean && rm -rf /tmp/*

# This will copy all files in our root to the working  directory in the container
COPY . ./

# Enable systemd init system in container
ENV INITSYSTEM on

# server.js will run when container starts up on the device
CMD ["npm", "start"]

All my code is in Python 2, so I'm interested to see if your method could work for what I need.

Since I need Node too, I was thinking of using the Node base image and then installing Python afterwards the way you did. I will try this and report back!

I think the method should work there as well; you would just need to install python and the python-* packages in general instead of the python3 ones, and use pip instead of pip3, roughly along the lines of the sketch below.
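A rough sketch of how that could look, assuming a resin Node base image is available for the device (the image name/tag and exact package list here are guesses, not something confirmed in this thread):

# Node base image for the Raspberry Pi 3 (name/tag assumed; adjust to your device)
FROM resin/raspberrypi3-node:6

WORKDIR /usr/src/app

# Python 2 plus the apt-packaged scientific stack, so pip never has to compile numpy/scipy
RUN apt-get update && apt-get install -yq \
        python python-dev python-pip python-setuptools python-wheel \
        python-numpy python-scipy \
    && apt-get clean && rm -rf /var/lib/apt/lists/*

# Remaining Python deps (e.g. scikit-learn) from requirements.txt
COPY ./requirements.txt /requirements.txt
RUN pip install -r /requirements.txt

# npm deps
COPY package.json package.json
RUN JOBS=MAX npm install --production --unsafe-perm && npm cache clean && rm -rf /tmp/*

COPY . ./
CMD ["npm", "start"]

The key point is the same as in the python3 example above: let apt provide numpy and scipy so the build server doesn't have to compile them from source.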

It works!!! I got the Dockerfile working, with Node.js building at least. I also got scipy running and all the other packages I needed. My mind is so blown right now.
