├── images ├── .gitignore ├── Screenshot01.png ├── Screenshot02.png ├── Screenshot03.png ├── Screenshot05.png ├── Screenshot06.png ├── Screenshot07.png ├── Screenshot08.png ├── raspberrypi.jpeg ├── raspberrypi.jpg ├── screenshot1.png ├── screenshot2.png └── screenshot3.png ├── docker-compose-ollama-webui.yml ├── docker-compose.yaml └── README.md /images/.gitignore: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /images/Screenshot01.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/adijayainc/LLM-ollama-webui-Raspberry-Pi5/HEAD/images/Screenshot01.png -------------------------------------------------------------------------------- /images/Screenshot02.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/adijayainc/LLM-ollama-webui-Raspberry-Pi5/HEAD/images/Screenshot02.png -------------------------------------------------------------------------------- /images/Screenshot03.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/adijayainc/LLM-ollama-webui-Raspberry-Pi5/HEAD/images/Screenshot03.png -------------------------------------------------------------------------------- /images/Screenshot05.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/adijayainc/LLM-ollama-webui-Raspberry-Pi5/HEAD/images/Screenshot05.png -------------------------------------------------------------------------------- /images/Screenshot06.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/adijayainc/LLM-ollama-webui-Raspberry-Pi5/HEAD/images/Screenshot06.png 
-------------------------------------------------------------------------------- /images/Screenshot07.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/adijayainc/LLM-ollama-webui-Raspberry-Pi5/HEAD/images/Screenshot07.png -------------------------------------------------------------------------------- /images/Screenshot08.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/adijayainc/LLM-ollama-webui-Raspberry-Pi5/HEAD/images/Screenshot08.png -------------------------------------------------------------------------------- /images/raspberrypi.jpeg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/adijayainc/LLM-ollama-webui-Raspberry-Pi5/HEAD/images/raspberrypi.jpeg -------------------------------------------------------------------------------- /images/raspberrypi.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/adijayainc/LLM-ollama-webui-Raspberry-Pi5/HEAD/images/raspberrypi.jpg -------------------------------------------------------------------------------- /images/screenshot1.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/adijayainc/LLM-ollama-webui-Raspberry-Pi5/HEAD/images/screenshot1.png -------------------------------------------------------------------------------- /images/screenshot2.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/adijayainc/LLM-ollama-webui-Raspberry-Pi5/HEAD/images/screenshot2.png -------------------------------------------------------------------------------- /images/screenshot3.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/adijayainc/LLM-ollama-webui-Raspberry-Pi5/HEAD/images/screenshot3.png -------------------------------------------------------------------------------- /docker-compose-ollama-webui.yml: -------------------------------------------------------------------------------- 1 | version: "3.9" 2 | 3 | services: 4 | ollama: 5 | container_name: ollama 6 | image: ollama/ollama:latest 7 | restart: always 8 | volumes: 9 | - /home/adijaya/ollama:/root/.ollama 10 | 11 | ollama-webui: 12 | build: 13 | context: ./webui/ 14 | args: 15 | OLLAMA_API_BASE_URL: '/ollama/api' 16 | dockerfile: Dockerfile 17 | image: ghcr.io/ollama/ollama-webui:main 18 | container_name: ollama-webui 19 | volumes: 20 | - ollama-webui:/app/backend/data 21 | depends_on: 22 | - ollama 23 | ports: 24 | - ${OLLAMA_WEBUI_PORT-3000}:8080 25 | environment: 26 | - 'OLLAMA_API_BASE_URL=http://ollama:11434/api' 27 | extra_hosts: 28 | - host.docker.internal:host-gateway 29 | restart: unless-stopped 30 | 31 | volumes: 32 | ollama-webui: {} 33 | ollama: {} 34 | 35 | -------------------------------------------------------------------------------- /docker-compose.yaml: -------------------------------------------------------------------------------- 1 | services: 2 | ollama: 3 | volumes: 4 | - /home/adijaya/ollama:/root/.ollama 5 | - /home/adijaya/project:/root/project 6 | container_name: ollama 7 | pull_policy: always 8 | tty: true 9 | restart: unless-stopped 10 | image: ollama/ollama:${OLLAMA_DOCKER_TAG-latest} 11 | 12 | open-webui: 13 | build: 14 | context: . 
15 | args: 16 | OLLAMA_BASE_URL: '/ollama' 17 | dockerfile: Dockerfile 18 | image: ghcr.io/open-webui/open-webui:${WEBUI_DOCKER_TAG-main} 19 | container_name: open-webui 20 | volumes: 21 | - open-webui:/app/backend/data 22 | depends_on: 23 | - ollama 24 | ports: 25 | - ${OPEN_WEBUI_PORT-3000}:8080 26 | environment: 27 | - 'OLLAMA_BASE_URL=http://ollama:11434' 28 | - 'WEBUI_SECRET_KEY=' 29 | extra_hosts: 30 | - host.docker.internal:host-gateway 31 | restart: unless-stopped 32 | 33 | volumes: 34 | ollama: {} 35 | open-webui: {} 36 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 |
2 | 3 | # How to run TinyLlama Using Ollama-WebUI on a Raspberry Pi 5 4 | 5 | How to run an LLM (TinyLlama or Mistral 7B) on a Raspberry Pi 5 6 | 7 | A step-by-step guide to running the TinyLlama LLM on a Raspberry Pi 5 using Docker + Ollama + WebUI 8 | 9 | 
10 | 11 | ### Table of Contents: 12 | 13 | 1. [Prerequisites](#prerequisites) 14 | 2. [Setup Raspberry Pi](#setup-raspberry-pi-headless-setup) 15 | 3. [Setup Docker Compose](#setup-docker-compose) 16 | 4. [Setup Ollama WebUI: Step-by-Step Guide](#setup-ollama-webui-step-by-step-guide) 17 | 5. [Download Model](#download-model-on-ollama-web-ui) 18 | 6. [Extra Resources](#extra-resources) 19 | 7. [Access Ollama API](#access-api) 20 | --- 21 | 22 | ### Prerequisites 23 | 24 | - [Raspberry Pi 5, 8GB RAM](https://www.raspberrypi.com/products/raspberry-pi-5/) 25 | - 32GB SD card 26 | 27 | ### Setup Raspberry Pi (Headless Setup) 28 | 29 | You can also follow along with this [YouTube video](https://www.youtube.com/watch?v=9fEnvDgxwbI) instead. 30 | 31 | 1. Connect the SD card to your laptop 32 | 2. Download the Raspberry Pi Imager: [https://www.raspberrypi.com/software/](https://www.raspberrypi.com/software/) 33 | 3. Run it, and you should see: 34 | ![Screenshot01.png](./images/Screenshot01.png) 35 | - "Choose Device" - choose Raspberry Pi 5 36 | - "Choose OS" - choose the latest (64-bit is recommended) 37 | - "Choose Storage" - choose the inserted SD card 38 | 4. Click next; when it asks if you want to edit the settings, click "Edit settings" 39 | ![Screenshot02.png](./images/Screenshot02.png) 40 | 5. Configure: 41 | ![Screenshot03.png](./images/Screenshot03.png) 42 | - Enable the hostname option and set it to `raspberrypi.local` 43 | - Set a username and password you will remember; we will use them shortly 44 | - Enable "Configure Wireless LAN" and add your Wi-Fi name and password 45 | - Click save and continue. It will take a few minutes to write everything to the SD card 46 | 6. Insert the SD card into your Raspberry Pi and connect it to power 47 | 7. SSH into the Raspberry Pi, using the username you configured: 48 | 49 | ```bash 50 | ssh <username>@raspberrypi.local 51 | ``` 52 | ### Setup Docker Compose 53 | 54 | 1. 
Install Docker: 55 | 56 | ```bash 57 | curl -fsSL https://get.docker.com -o get-docker.sh 58 | sudo sh get-docker.sh 59 | ``` 60 | 2. Add your user to the `docker` group so you can run Docker without `sudo`: 61 | 62 | ```bash 63 | sudo usermod -aG docker ${USER} 64 | ``` 65 | 66 | 3. Check the Docker installation: 67 | 68 | ```bash 69 | sudo su - ${USER} 70 | docker version 71 | docker run hello-world 72 | ``` 73 | 4. Install Docker Compose: 74 | 75 | ```bash 76 | sudo apt-get install libffi-dev libssl-dev 77 | sudo apt install python3-dev 78 | sudo apt-get install -y python3 python3-pip 79 | sudo pip3 install docker-compose 80 | ``` 81 | 82 | ### Setup Ollama-WebUI: Step-by-Step Guide (since replaced by Open WebUI) 83 | 84 | 1. Download the latest snapshot of ollama-webui: 85 | 86 | ```bash 87 | git clone https://github.com/ollama-webui/ollama-webui webui 88 | ``` 89 | 90 | 2. Create a Docker Compose file (change `/home/pi` to match your username): 91 | 92 | ```yaml 93 | version: "3.9" 94 | services: 95 | ollama: 96 | container_name: ollama 97 | image: ollama/ollama:latest 98 | restart: always 99 | volumes: 100 | - /home/pi/ollama:/root/.ollama 101 | ollama-webui: 102 | build: 103 | context: ./webui/ 104 | args: 105 | OLLAMA_API_BASE_URL: '/ollama/api' 106 | dockerfile: Dockerfile 107 | image: ghcr.io/ollama/ollama-webui:main 108 | container_name: ollama-webui 109 | volumes: 110 | - ollama-webui:/app/backend/data 111 | depends_on: 112 | - ollama 113 | ports: 114 | - ${OLLAMA_WEBUI_PORT-3000}:8080 115 | environment: 116 | - 'OLLAMA_API_BASE_URL=http://ollama:11434/api' 117 | extra_hosts: 118 | - host.docker.internal:host-gateway 119 | restart: unless-stopped 120 | volumes: 121 | ollama-webui: {} 122 | ollama: {} 123 | 124 | ``` 125 | 3. Bring the containers up: 126 | 127 | ```bash 128 | docker-compose up -d 129 | ``` 130 | 5. Access the WebUI at http://localhost:3000 (from another machine on your network, use http://raspberrypi.local:3000) 131 | 6. Create a free account on first login 132 | 7. 
Download the model you want to use (see below) by clicking the little cog icon and selecting the model 133 | ![Screenshot05.png](./images/Screenshot05.png) 134 | 8. For a list of available models, see the [model library](https://github.com/ollama/ollama#Model-Library). 135 | 136 | That is it! 137 | 138 | ### Running the TinyLlama Model on Ollama Web UI 139 | 140 | 1. Access the Web UI and log in with the username you already created 141 | 142 | ![Screenshot06.png](./images/Screenshot06.png) 143 | 144 | 2. Pull a model from Ollama.com; select tinyllama or mistral:7b 145 | 146 | ![Screenshot07.png](./images/Screenshot07.png) 147 | 148 | 3. Test it by asking the WebUI, "Who are you?" 149 | 150 | ![Screenshot08.png](./images/Screenshot08.png) 151 | 152 | That is it! 153 | 154 | ### Extra Resources: 155 | 156 | - [YouTube video on how to set up a Raspberry Pi headlessly](https://www.youtube.com/watch?v=9fEnvDgxwbI) 157 | - [Adijayainc Twitter](https://twitter.com/adijayainc) 158 | 159 | ### Update Docker Images 160 | 161 | ```bash 162 | docker-compose stop 163 | docker container rm [container-name] 164 | docker rmi [images] 165 | docker-compose pull 166 | docker-compose up --force-recreate --build -d 167 | docker image prune -f 168 | ``` 169 | 170 | 171 | ### Access API 172 | 173 | The Ollama container listens on port 11434 inside the Docker network, but the compose files above do not publish that port to the host. To reach the API from outside the containers, add a `ports:` mapping such as `- 11434:11434` to the `ollama` service and re-run `docker-compose up -d`. --------------------------------------------------------------------------------
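The empty "Access API" section at the end of the README can be illustrated with a short example. This is a minimal sketch assuming the Ollama container's port 11434 has been published to the host (neither compose file in this repo maps it by default) and that the `tinyllama` model has already been pulled via the WebUI:

```shell
# Query Ollama's REST API for a one-shot completion.
# Assumes port 11434 is published from the ollama container to the host
# and that the tinyllama model has already been pulled.
curl http://localhost:11434/api/generate -d '{
  "model": "tinyllama",
  "prompt": "Who are you?",
  "stream": false
}'
```

With `"stream": false` the API returns a single JSON object whose `response` field holds the full completion, instead of a stream of partial-token JSON lines.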