├── Dockerfile
├── README.md
└── datascience.sh

/Dockerfile:
--------------------------------------------------------------------------------
# This file creates a container that runs a Jupyter notebook server on a Raspberry Pi
#
# Author: Max Jiang
# Date: 13/02/2017

FROM resin/rpi-raspbian:jessie
MAINTAINER Max Jiang

# Set environment variables
ENV DEBIAN_FRONTEND noninteractive
ENV PYTHON_VERSION 3.6.0

WORKDIR /root

# Install the packages needed to compile Python
RUN apt-get update && apt-get upgrade -y && apt-get install -y \
    build-essential \
    libncursesw5-dev \
    libgdbm-dev \
    libc6-dev \
    zlib1g-dev \
    libsqlite3-dev \
    tk-dev \
    libssl-dev \
    openssl \
    libbz2-dev

# Download and compile Python
RUN apt-get install -y ca-certificates
ADD "https://www.python.org/ftp/python/${PYTHON_VERSION}/Python-${PYTHON_VERSION}.tgz" /root/Python-${PYTHON_VERSION}.tgz
RUN tar zxvf "Python-${PYTHON_VERSION}.tgz" \
    && cd Python-${PYTHON_VERSION} \
    && ./configure \
    && make \
    && make install \
    && cd .. \
    && rm -rf "./Python-${PYTHON_VERSION}" \
    && rm "./Python-${PYTHON_VERSION}.tgz"

# Update pip and install jupyter
RUN apt-get install -y libncurses5-dev
RUN pip3 install --upgrade pip
RUN pip3 install readline jupyter

# Configure jupyter
RUN jupyter notebook --generate-config
RUN mkdir notebooks
RUN sed -i "/c.NotebookApp.open_browser/c c.NotebookApp.open_browser = False" /root/.jupyter/jupyter_notebook_config.py \
    && sed -i "/c.NotebookApp.ip/c c.NotebookApp.ip = '*'" /root/.jupyter/jupyter_notebook_config.py \
    && sed -i "/c.NotebookApp.notebook_dir/c c.NotebookApp.notebook_dir = '/root/notebooks'" /root/.jupyter/jupyter_notebook_config.py

VOLUME /root/notebooks

# Add Tini. Tini operates as a process subreaper for jupyter.
# This prevents kernel crashes.
ENV TINI_VERSION 0.14.0
ENV CFLAGS="-DPR_SET_CHILD_SUBREAPER=36 -DPR_GET_CHILD_SUBREAPER=37"

ADD https://github.com/krallin/tini/archive/v${TINI_VERSION}.tar.gz /root/v${TINI_VERSION}.tar.gz
RUN apt-get install -y cmake
RUN tar zxvf v${TINI_VERSION}.tar.gz \
    && cd tini-${TINI_VERSION} \
    && cmake . \
    && make \
    && cp tini /usr/bin/. \
    && cd .. \
    && rm -rf "./tini-${TINI_VERSION}" \
    && rm "./v${TINI_VERSION}.tar.gz"

ENTRYPOINT ["/usr/bin/tini", "--"]

EXPOSE 8888

CMD ["jupyter", "notebook"]
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# rpi-jupyter
Jupyter Notebook Server on Raspberry Pi

Run your own Jupyter Notebook server 24/7 on a Raspberry Pi. With the Pi connected to your router, and port forwarding and DDNS set up, you can carry out your data science tasks on the move.

Much as we adore the Raspberry Pi, and powerful as it is becoming, it is not intended for large CPU-intensive tasks. It will be slow, and the best model offers only 1 GB of RAM. For larger datasets, you either need incremental machine learning algorithms, or a cluster running Spark. I am currently working on the latter, which should be interesting.

----------
This is a Dockerfile for building rpi-jupyter. The image is built on a Raspberry Pi 3 running [Hypriot OS](http://blog.hypriot.com/). It is a minimal notebook server with [resin/rpi-raspbian:jessie](https://hub.docker.com/r/resin/rpi-raspbian/) as the base image and no additional packages.

Because the popular Python library scikit-learn requires the bzip2 library, which must be installed before Python is compiled, I updated the image to 1.1.
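A quick way to check that a compiled Python actually picked up bzip2 support (a sketch — run it inside the container, wherever `python3` ended up) is a round trip through the stdlib `bz2` module:

```shell
# Round-trip some bytes through bz2; this raises
# ModuleNotFoundError if Python was compiled without libbz2-dev.
python3 - <<'EOF'
import bz2
payload = b"rpi-jupyter"
assert bz2.decompress(bz2.compress(payload)) == payload
print("bz2 OK")
EOF
```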

I have also built maxjiang/rpi-jupyter:datascience (based on 1.1), so that most of the data science packages come pre-installed without you having to install and compile them yourself.

### Installing
Go to [Hypriot OS](http://blog.hypriot.com/) and follow the steps to get Docker running on your Raspberry Pi. Then pull the image, picking a tag from the table below:

Tags | Description
--- | ---
datascience | numpy scipy scikit-learn pandas seaborn matplotlib
1.1/latest | Python 3.6, Tini 0.14.0, jessie-20170315
1.0 | Python 3.5.1, Tini 0.9.0, jessie-20160525

    docker pull maxjiang/rpi-jupyter:<tag>

### Running in detached mode

    docker run -dp 8888:8888 maxjiang/rpi-jupyter

Now you can access your notebook at `http://<pi-ip>:8888`

### Configuration
If you would like to change the configuration, create your own jupyter_notebook_config.py on the Docker host and run the following:

    docker run -itp <host-port>:8888 -v <path-to-config>:/root/.jupyter/jupyter_notebook_config.py maxjiang/rpi-jupyter

This maps your local config file into the container.

The following command gives you a bash session in the running container, so you can do more:

    docker exec -it <container-id> /bin/bash

### For Data Scientists
Use the above command to open a bash session in your container and run the following:

    sh datascience.sh

This installs almost all the Python modules you need for most data science tasks.
--------------------------------------------------------------------------------
/datascience.sh:
--------------------------------------------------------------------------------
#!/bin/sh

# After running this script, you should be able to carry out most of your data science tasks.
# I deliberately separated this from the Dockerfile to keep the image as clean as possible.

# Some useful tools
apt-get update && apt-get install -y vim wget git

# Required for pandas HDF5 support and SciPy
apt-get install -y libhdf5-dev liblapack-dev gfortran

# Python packages for data science
pip3 install numpy scipy pandas scikit-learn nltk seaborn tables matplotlib
--------------------------------------------------------------------------------
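Once datascience.sh has finished, a small smoke test (a sketch — note that scikit-learn imports as `sklearn` and pytables as `tables`) reports per-package status instead of stopping at the first failure:

```shell
# Try importing each installed package and report OK or MISSING.
for pkg in numpy scipy pandas sklearn nltk seaborn tables matplotlib; do
    if python3 -c "import $pkg" >/dev/null 2>&1; then
        echo "$pkg OK"
    else
        echo "$pkg MISSING"
    fi
done
```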