├── README.md
├── esp32cam
│   ├── .gitignore
│   ├── Makefile
│   ├── include
│   │   └── README
│   ├── lib
│   │   └── README
│   ├── platformio.ini
│   ├── src
│   │   ├── .gitignore
│   │   ├── cam.ino
│   │   └── wifi_pass.h
│   └── test
│       └── README
├── server
│   ├── Dockerfile
│   ├── Dockerfile.dlib
│   ├── Dockerfile.rpi4
│   ├── build_push.sh
│   ├── cam-server.yaml
│   ├── faces.py
│   ├── faces
│   │   ├── Miguel Angel Ajo
│   │   │   └── ajo1.jpg
│   │   └── Ricardo Noriega
│   │       └── ricky1.jpg
│   ├── mjpeg_streamer.py
│   ├── run.sh
│   └── server.py
└── wifi-ap
    ├── Dockerfile
    ├── Dockerfile.rpi4
    ├── build_push.sh
    ├── build_run.sh
    ├── cam-ap.yaml
    ├── hostapd.conf
    ├── hostapd.rpi4.conf
    ├── idea-hostap-container.txt
    ├── microshift-wlan0-dnsmasq-conf
    └── run.sh

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # AI at the Edge with MicroShift
2 | This repository contains the code developed for the talk "AI at the Edge with MicroShift", given by Miguel Angel Ajo and Ricardo Noriega.
3 | 
4 | You can find the video of our presentation [here](https://www.youtube.com/watch?v=kR9eSxM9qgg).
5 | 
6 | The end goal of this demo is to run a face detection and face recognition AI model in a cloud-native fashion using MicroShift in an edge computing scenario. To do this, we used the [NVIDIA Jetson](https://www.nvidia.com/en-us/autonomous-machines/embedded-systems/) family of boards (tested on the Jetson TX2 and Jetson Xavier NX).
7 | 
8 | This demo repository is structured into three different folders:
9 | 
10 | * esp32cam: software used to program the ESP32 cameras.
11 | * wifi-ap: script running inside a Kubernetes pod that creates a Wi-Fi SSID.
12 | * server: Flask server that receives video streams from the cameras and performs face detection and recognition.
13 | 
14 | 
15 | ## ESP32 Cameras Software
16 | 
17 | The code has been built using [PlatformIO](https://platformio.org/) with the Arduino framework. The VSCode IDE has a very convenient plugin that lets you choose the target platform, compile the code, and flash it to the device from a single place.
18 | 
19 | There are tons of tutorials on how to use an ESP32 camera with PlatformIO, but one of the most relevant files to look at is `platformio.ini`, located in the esp32cam directory:
20 | 
21 | ```
22 | [env:esp32cam]
23 | #upload_speed = 460800
24 | upload_speed = 921600
25 | platform = espressif32
26 | board = esp32cam
27 | framework = arduino
28 | lib_deps = yoursunny/esp32cam@0.0.20210226
29 | monitor_port=/dev/ttyUSB0
30 | monitor_speed=115200
31 | monitor_rst=1
32 | monitor_dtr=0
33 | ```
34 | 
35 | Once you connect the camera to your laptop, a new `/dev/ttyUSB0` device will appear; this file tells PlatformIO which platform the code must be compiled for and which serial port to use for uploading and monitoring.
36 | 
37 | If you want the cameras to connect to a different SSID, please modify the file `esp32cam/src/wifi_pass.h`:
38 | 
39 | ```
40 | const char* WIFI_SSID = "camwifi";
41 | const char* WIFI_PASS = "thisisacamwifi";
42 | ```
43 | 
44 | ## Running MicroShift (Jetson L4T)
45 | 
46 | At this point, we have programmed our ESP32 cameras. We assume that you have installed the standard L4T operating system specific to your Jetson board, and that it is ready to install some packages (as root).
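A quick way to double-check which L4T release the board is actually running before installing anything (a convenience check, assuming a standard JetPack/L4T image, where the release string normally lives in `/etc/nv_tegra_release`):

```
cat /etc/nv_tegra_release
```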
47 | 
48 | ```
49 | apt install -y curl jq runc iptables conntrack nvidia-container-runtime nvidia-container-toolkit
50 | ```
51 | 
52 | Disable firewalld:
53 | 
54 | ```
55 | systemctl disable --now firewalld
56 | ```
57 | 
58 | Install CRI-O as our container runtime:
59 | 
60 | ```
61 | curl https://raw.githubusercontent.com/cri-o/cri-o/main/scripts/get | bash
62 | 
63 | ```
64 | 
65 | Configure CRI-O to use the NVIDIA Container Runtime:
66 | 
67 | 
68 | ```
69 | rm /etc/crio/crio.conf.d/*
70 | 
71 | cat << EOF > /etc/crio/crio.conf.d/10-nvidia-runtime.conf
72 | [crio.runtime]
73 | default_runtime = "nvidia"
74 | 
75 | [crio.runtime.runtimes.nvidia]
76 | runtime_path = "/usr/bin/nvidia-container-runtime"
77 | EOF
78 | 
79 | cat << EOF > /etc/crio/crio.conf.d/01-crio-runc.conf
80 | [crio.runtime.runtimes.runc]
81 | runtime_path = "/usr/sbin/runc"
82 | runtime_type = "oci"
83 | runtime_root = "/run/runc"
84 | EOF
85 | 
86 | rm -rf /etc/cni/net.d/10-crio-bridge.conf
87 | ```
88 | 
89 | Download the MicroShift binary:
90 | 
91 | ```
92 | export ARCH=arm64
93 | export VERSION=$(curl -s https://api.github.com/repos/redhat-et/microshift/releases | grep tag_name | head -n 1 | cut -d '"' -f 4)
94 | curl -LO https://github.com/redhat-et/microshift/releases/download/$VERSION/microshift-linux-${ARCH}
95 | mv microshift-linux-${ARCH} /usr/bin/microshift; chmod 755 /usr/bin/microshift
96 | ```
97 | Create the MicroShift systemd service:
98 | 
99 | ```
100 | cat << EOF > /usr/lib/systemd/system/microshift.service
101 | [Unit]
102 | Description=MicroShift
103 | After=crio.service
104 | 
105 | [Service]
106 | WorkingDirectory=/usr/bin/
107 | ExecStart=/usr/bin/microshift run
108 | Restart=always
109 | User=root
110 | 
111 | [Install]
112 | WantedBy=multi-user.target
113 | EOF
114 | ```
115 | 
116 | Enable and start the CRI-O and MicroShift services:
117 | ```
118 | systemctl enable crio --now
119 | systemctl enable microshift.service --now
120 | ```
121 | Download and install the oc client:
122 | 
123 | ```
124 | curl -LO https://mirror.openshift.com/pub/openshift-v4/arm64/clients/ocp/stable/openshift-client-linux.tar.gz
125 | tar xvf openshift-client-linux.tar.gz
126 | chmod +x oc
127 | mv oc /usr/local/bin
128 | ```
129 | 
130 | Set the KUBECONFIG environment variable:
131 | 
132 | ```
133 | export KUBECONFIG=/var/lib/microshift/resources/kubeadmin/kubeconfig
134 | ```
135 | 
136 | If MicroShift is up and running, after a couple of minutes you should see the following pods:
137 | 
138 | ```
139 | root@jetson-nx:~# oc get pod -A
140 | NAMESPACE                       NAME                                  READY   STATUS    RESTARTS   AGE
141 | kube-system                     kube-flannel-ds-7rz4d                 1/1     Running   0          17h
142 | kubevirt-hostpath-provisioner   kubevirt-hostpath-provisioner-9m9mc   1/1     Running   0          17h
143 | openshift-dns                   dns-default-6pbkt                     2/2     Running   0          17h
144 | openshift-dns                   node-resolver-g4d8g                   1/1     Running   0          17h
145 | openshift-ingress               router-default-85bcfdd948-tsk29       1/1     Running   0          17h
146 | openshift-service-ca            service-ca-7764c85869-dvdtm           1/1     Running   0          17h
147 | 
148 | ```
149 | 
150 | Now we have our cloud-native platform ready to run workloads. Think about this: we have an edge-computing-optimized Kubernetes distribution ready to run an AI workload and make use of the integrated GPU of the NVIDIA Jetson board. It's awesome!
151 | 
152 | ## Wi-Fi Access Point
153 | 
154 | We don't want to depend on any available wireless network that we don't control and that might not be secure. Furthermore, to allow users to test this demo, it has to be self-contained. This is why we have created a pod that brings up a Wi-Fi access point with the credentials mentioned in the section above.
155 | 
156 | ```
157 | SSID: camwifi
158 | Password: thisisacamwifi
159 | ```
160 | 
161 | If you want to expose a different SSID and use a different password, please change the file `wifi-ap/hostapd.conf`.
162 | 
163 | Now, let's deploy this pod on MicroShift:
164 | 
165 | ```
166 | oc apply -f wifi-ap/cam-ap.yaml
167 | ```
168 | 
169 | After a few seconds, we should be able to see it running:
170 | 
171 | ```
172 | oc get pods
173 | 
174 | NAME                         READY   STATUS    RESTARTS   AGE
175 | cameras-ap-b6b6c9c96-krm45   1/1     Running   0          3s
176 | ```
177 | 
178 | Looking at the logs of the pod, you will see the process inside handing out IP addresses to the cameras and to any other device connected to the wireless network.
179 | 
180 | 
181 | ## AI models
182 | 
183 | The final step is to deploy the AI models that will perform face detection and face recognition. This pod is basically a Flask server that receives the streams of the cameras once they are connected and runs the models on a discrete number of frames.
184 | 
185 | Let's deploy the AI models on MicroShift:
186 | 
187 | ```
188 | oc apply -f server/cam-server.yaml
189 | ```
190 | 
191 | After a few seconds:
192 | 
193 | ```
194 | oc get pods
195 | 
196 | NAME                         READY   STATUS    RESTARTS   AGE
197 | cameras-ap-b6b6c9c96-krm45   1/1     Running   0          4m37s
198 | camserver-cc996fd86-pkm45    1/1     Running   0          42s
199 | ```
200 | 
201 | We also need to create a service and expose a route for this pod:
202 | 
203 | ```
204 | oc expose deployment camserver
205 | oc expose service camserver --hostname microshift-cam-reg.local
206 | ```
207 | 
208 | MicroShift has built-in mDNS capabilities, and this route will be announced automatically, so the cameras can register with this service and start streaming video.
209 | 
210 | Looking at the camserver logs, we can see this registration process:
211 | 
212 | ```
213 | oc logs camserver-cc996fd86-pkm45
214 | 
215 |  * Serving Flask app 'server' (lazy loading)
216 |  * Environment: production
217 |    WARNING: This is a development server. Do not use it in a production deployment.
218 |    Use a production WSGI server instead.
219 |  * Debug mode: off
220 |  * Running on all addresses.
221 |    WARNING: This is a development server. Do not use it in a production deployment.
222 |  * Running on http://10.85.0.36:5000/ (Press CTRL+C to quit)
223 | [2022-01-21 11:18:46,203] INFO in server: camera @192.168.66.89 registered with token a53ca190
224 | [2022-01-21 11:18:46,208] INFO in server: starting streamer thread
225 | [2022-01-21 11:19:34,674] INFO in server: starting streamer thread
226 | 10.85.0.1 - - [21/Jan/2022 11:19:34] "GET /register?ip=192.168.66.89&token=a53ca190 HTTP/1.1" 200 -
227 | ```
228 | 
229 | Finally, open a browser with the following URL:
230 | 
231 | ```
232 | http://microshift-cam-reg.local/video_feed
233 | ```
234 | 
235 | This page shows the feeds of all the cameras that have registered, and you will be able to see how faces are detected.
236 | 
237 | 
238 | ## Conclusion
239 | 
240 | This demo is just a simple example of what an edge computing scenario can look like: running AI/ML models on top of an embedded system like the NVIDIA Jetson family while leveraging cloud-native capabilities with MicroShift.
241 | 
242 | We hope you enjoy it!
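Once everything is deployed, the same endpoints that the cameras and the browser use can also be exercised with `curl` from any machine on the camera network. This is only a quick sanity-check sketch: it assumes the `microshift-cam-reg.local` route created above, mDNS resolution of `.local` names on the client, and it reuses the example IP and token from the log output (if no camera is actually listening at that IP, the server will simply fail to pull frames from it).

```
# simulate a camera registration; the server answers "ACK" and spawns a streamer thread
curl "http://microshift-cam-reg.local/register?ip=192.168.66.89&token=a53ca190"

# fetch a few seconds of the aggregated MJPEG feed to confirm the route and the stream work
curl -v --max-time 5 "http://microshift-cam-reg.local/video_feed" --output /dev/null
```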
243 | 244 | 245 | # Installing on Fedora 35 / aarch64 rpi4 246 | ```bash 247 | sudo dnf module enable -y cri-o:1.21 248 | sudo dnf copr enable -y @redhat-et/microshift 249 | sudo dnf install -y cri-o cri-tools nss-mdns avahi microshift 250 | 251 | hostnamectl set-hostname microshift-rpi64.local 252 | systemctl enable --now avahi-daemon.service 253 | 254 | 255 | firewall-cmd --zone=trusted --add-source=10.42.0.0/16 --permanent 256 | firewall-cmd --zone=public --add-port=80/tcp --permanent 257 | firewall-cmd --zone=public --add-port=443/tcp --permanent 258 | # enable mdns 259 | firewall-cmd --zone=public --add-port=5353/udp --permanent 260 | # this one is used by the dhcp server on the acccess point 261 | firewall-cmd --zone=public --add-service=dhcp --permanent 262 | # this rule allows connections from the cameras into this host 263 | firewall-cmd --zone=trusted --add-source=192.168.66.0/24 --permanent 264 | firewall-cmd --reload 265 | 266 | cat >/etc/NetworkManager/conf.d/99-unmanaged-devices.conf < kustomization.yaml < access-point.yaml < cam-server.yaml < mdns-route.yaml < service.yaml < THIS FILE 25 | | 26 | |- platformio.ini 27 | |--src 28 | |- main.c 29 | 30 | and a contents of `src/main.c`: 31 | ``` 32 | #include 33 | #include 34 | 35 | int main (void) 36 | { 37 | ... 38 | } 39 | 40 | ``` 41 | 42 | PlatformIO Library Dependency Finder will find automatically dependent 43 | libraries scanning project source files. 44 | 45 | More information about PlatformIO Library Dependency Finder 46 | - https://docs.platformio.org/page/librarymanager/ldf.html 47 | -------------------------------------------------------------------------------- /esp32cam/platformio.ini: -------------------------------------------------------------------------------- 1 | ; PlatformIO Project Configuration File 2 | ; 3 | ; Build options: build flags, source filter 4 | ; Upload options: custom upload port, speed and extra flags 5 | ; Library options: dependencies, extra library storages 6 | ; Advanced options: extra scripting 7 | ; 8 | ; Please visit documentation for the other options and examples 9 | ; https://docs.platformio.org/page/projectconf.html 10 | 11 | [env:esp32cam] 12 | #upload_speed = 460800 13 | upload_speed = 921600 14 | platform = espressif32 15 | board = esp32cam 16 | framework = arduino 17 | lib_deps = yoursunny/esp32cam@0.0.20210226 18 | monitor_port=/dev/ttyUSB0 19 | monitor_speed=115200 20 | monitor_rst=1 21 | monitor_dtr=0 22 | -------------------------------------------------------------------------------- /esp32cam/src/.gitignore: -------------------------------------------------------------------------------- 1 | wifi_pass.h 2 | -------------------------------------------------------------------------------- /esp32cam/src/cam.ino: -------------------------------------------------------------------------------- 1 | #include 2 | #include 3 | #include 4 | #include 5 | #include 6 | #include 7 | #include 8 | 9 | WebServer server(80); 10 | IPAddress microShiftAddress; 11 | 12 | static auto hiRes = esp32cam::Resolution::find(640, 480); 13 | unsigned long lastReconnectMillis; 14 | bool microShiftResolved = false; 15 | 16 | void 17 | handleJpgHi() 18 | { 19 | pinMode(4,OUTPUT); 20 | digitalWrite(4,HIGH); 21 | 22 | delay(150); 23 | 24 | if (!esp32cam::Camera.changeResolution(hiRes)) { 25 | Serial.println("SET-HI-RES FAIL"); 26 | } 27 | 28 | auto frame = esp32cam::capture(); 29 | pinMode(4,OUTPUT); 30 | digitalWrite(4,LOW); 31 | 32 | if (frame == nullptr) { 33 | Serial.println("CAPTURE FAIL"); 34 | 
server.send(503, "", ""); 35 | return; 36 | } 37 | Serial.printf("CAPTURE OK %dx%d %db\n", frame->getWidth(), frame->getHeight(), 38 | static_cast(frame->size())); 39 | 40 | server.setContentLength(frame->size()); 41 | server.send(200, "image/jpeg"); 42 | WiFiClient client = server.client(); 43 | frame->writeTo(client); 44 | 45 | } 46 | 47 | 48 | bool anyoneStreaming; 49 | 50 | void 51 | handleMjpeg() 52 | { 53 | anyoneStreaming = true; 54 | 55 | if (!esp32cam::Camera.changeResolution(hiRes)) { 56 | Serial.println("SET-HI-RES FAIL"); 57 | } 58 | 59 | Serial.println("STREAM BEGIN"); 60 | WiFiClient client = server.client(); 61 | auto startTime = millis(); 62 | int res = esp32cam::Camera.streamMjpeg(client); 63 | 64 | anyoneStreaming = false; 65 | //Trigger reconnection to microshift camserver 66 | microShiftAddress = IPAddress((uint32_t)0); 67 | 68 | if (res <= 0) { 69 | Serial.printf("STREAM ERROR %d\n", res); 70 | return; 71 | } 72 | auto duration = millis() - startTime; 73 | Serial.printf("STREAM END %dfrm %0.2ffps\n", res, 1000.0 * res / duration); 74 | } 75 | 76 | char token[32]; 77 | 78 | char *getCamID() { 79 | 80 | static char cam_id[17]; 81 | uint32_t chipId = 0; 82 | for(int i=0; i<17; i=i+8) { 83 | chipId |= ((ESP.getEfuseMac() >> (40 - i)) & 0xff) << i; 84 | } 85 | sprintf(cam_id,"CAM_%08x", chipId); 86 | return cam_id; 87 | } 88 | 89 | void 90 | setup() 91 | { 92 | 93 | Serial.begin(115200); 94 | Serial.println(); 95 | 96 | sprintf(token, "%08x", esp_random()); 97 | 98 | char *cam_id = getCamID(); 99 | 100 | Serial.print("Started mDNS server as "); 101 | Serial.println(cam_id); 102 | 103 | 104 | { 105 | using namespace esp32cam; 106 | Config cfg; 107 | cfg.setPins(pins::AiThinker); 108 | cfg.setResolution(hiRes); 109 | cfg.setBufferCount(2); 110 | cfg.setJpeg(90); 111 | 112 | bool ok = Camera.begin(cfg); 113 | Serial.println(ok ? 
"CAMERA OK" : "CAMERA FAIL"); 114 | 115 | sensor_t * s = esp_camera_sensor_get(); 116 | s->set_brightness(s, 0); // -2 to 2 117 | s->set_contrast(s, 0); // -2 to 2 118 | s->set_saturation(s, 0); // -2 to 2 119 | s->set_special_effect(s, 0); // 0 to 6 (0 - No Effect, 1 - Negative, 2 - Grayscale, 3 - Red Tint, 4 - Green Tint, 5 - Blue Tint, 6 - Sepia) 120 | s->set_whitebal(s, 1); // 0 = disable , 1 = enable 121 | s->set_awb_gain(s, 1); // 0 = disable , 1 = enable 122 | s->set_wb_mode(s, 0); // 0 to 4 - if awb_gain enabled (0 - Auto, 1 - Sunny, 2 - Cloudy, 3 - Office, 4 - Home) 123 | s->set_exposure_ctrl(s, 1); // 0 = disable , 1 = enable 124 | s->set_aec2(s, 0); // 0 = disable , 1 = enable 125 | s->set_ae_level(s, 0); // -2 to 2 126 | s->set_aec_value(s, 300); // 0 to 1200 127 | s->set_gain_ctrl(s, 1); // 0 = disable , 1 = enable 128 | s->set_agc_gain(s, 0); // 0 to 30 129 | s->set_gainceiling(s, (gainceiling_t)0); // 0 to 6 130 | s->set_bpc(s, 0); // 0 = disable , 1 = enable 131 | s->set_wpc(s, 1); // 0 = disable , 1 = enable 132 | s->set_raw_gma(s, 1); // 0 = disable , 1 = enable 133 | s->set_lenc(s, 1); // 0 = disable , 1 = enable 134 | s->set_hmirror(s, 0); // 0 = disable , 1 = enable 135 | s->set_vflip(s, 0); // 0 = disable , 1 = enable 136 | s->set_dcw(s, 1); // 0 = disable , 1 = enable 137 | s->set_colorbar(s, 0); // 0 = disable , 1 = enable 138 | 139 | } 140 | 141 | WiFi.persistent(false); 142 | WiFi.mode(WIFI_STA); 143 | WiFi.begin(WIFI_SSID, WIFI_PASS); 144 | while (WiFi.status() != WL_CONNECTED) { 145 | delay(500); 146 | } 147 | 148 | if(!MDNS.begin(cam_id)) { 149 | Serial.println("Error starting mDNS"); 150 | return; 151 | } 152 | 153 | lastReconnectMillis = millis(); 154 | 155 | Serial.print("http://"); Serial.println(WiFi.localIP()); Serial.println(" /cam-hi.jpg"); 156 | Serial.print("http://"); Serial.println(WiFi.localIP()); Serial.println(" /stream"); 157 | 158 | server.on("/cam-hi.jpg", handleJpgHi); 159 | server.on("/stream", handleMjpeg); 160 | 161 | server.begin(); 162 | } 163 | 164 | 165 | #define HOSTNAME "microshift-cam-reg" 166 | #define HOSTNAME_FULL HOSTNAME ".local" 167 | 168 | void reconnectWiFiOnDisconnect() { 169 | unsigned long currentMillis = millis(); 170 | // if WiFi is down, try reconnecting 171 | if ((WiFi.status() != WL_CONNECTED) && (currentMillis - lastReconnectMillis >= 20000)) { 172 | Serial.println("Reconnecting to WiFi..."); 173 | WiFi.disconnect(); 174 | WiFi.reconnect(); 175 | lastReconnectMillis = currentMillis; 176 | } 177 | } 178 | 179 | 180 | unsigned long lastRegistrationMillis = 0; 181 | 182 | #define RE_REGISTRATION_MS 15000 183 | 184 | void 185 | loop() 186 | { 187 | reconnectWiFiOnDisconnect(); 188 | 189 | if (WiFi.status() != WL_CONNECTED) { 190 | delay(100); 191 | return; 192 | } 193 | 194 | server.handleClient(); 195 | 196 | if ((!microShiftAddress) || (!anyoneStreaming && (millis()-lastRegistrationMillis)>RE_REGISTRATION_MS)) { 197 | microShiftAddress = MDNS.queryHost(HOSTNAME); 198 | if (microShiftAddress) { 199 | Serial.print("microshift " HOSTNAME_FULL " resolved to: "); 200 | Serial.println(microShiftAddress.toString()); 201 | Serial.println("Registering to cam server"); 202 | if (!registerInCamServer()) { 203 | microShiftAddress = IPAddress((uint32_t)0); 204 | } else { 205 | lastRegistrationMillis = millis(); 206 | } 207 | 208 | } else { 209 | Serial.println("Could not resolve microshift address"); 210 | delay(1000); 211 | } 212 | } 213 | } 214 | 215 | bool registerInCamServer() { 216 | 217 | // If you wonder why using a bare 
TCP client instead of the HTTPClient: 218 | // HTTPClient does not let you provide a hostname when contacting an 219 | // IP Address, and without that openshift router (or any other type 220 | // of http load balancer, won't know where to send the request) 221 | 222 | String requestPath = "/register?ip="; 223 | requestPath += WiFi.localIP().toString(); 224 | requestPath += "&token="; 225 | requestPath += token; 226 | 227 | WiFiClient client; 228 | 229 | if (!client.connect(microShiftAddress, 80)) { 230 | Serial.println("Connection failed"); 231 | } 232 | 233 | client.println("GET " + requestPath + " HTTP/1.1"); 234 | client.println("Host: " HOSTNAME_FULL); 235 | client.println("Connection: close"); 236 | client.println(); 237 | uint8_t buf[1024]; 238 | delay(1000); 239 | int l = client.read(buf, 1023); 240 | buf[l] = '\0'; 241 | Serial.println((char*)buf); 242 | client.stop(); 243 | return true; 244 | } 245 | -------------------------------------------------------------------------------- /esp32cam/src/wifi_pass.h: -------------------------------------------------------------------------------- 1 | const char* WIFI_SSID = "camwifi"; 2 | const char* WIFI_PASS = "thisisacamwifi"; 3 | 4 | -------------------------------------------------------------------------------- /esp32cam/test/README: -------------------------------------------------------------------------------- 1 | 2 | This directory is intended for PlatformIO Unit Testing and project tests. 3 | 4 | Unit Testing is a software testing method by which individual units of 5 | source code, sets of one or more MCU program modules together with associated 6 | control data, usage procedures, and operating procedures, are tested to 7 | determine whether they are fit for use. Unit testing finds problems early 8 | in the development cycle. 9 | 10 | More information about PlatformIO Unit Testing: 11 | - https://docs.platformio.org/page/plus/unit-testing.html 12 | -------------------------------------------------------------------------------- /server/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM docker.io/mangelajo/jetson-tx2-dlib:latest 2 | RUN DEBIAN_FRONTEND=noninteractive TZ=Etc/UTC \ 3 | apt-get install -y python3-markupsafe nvidia-opencv python3-opencv\ 4 | python3-requests 5 | RUN pip3 install flask 6 | WORKDIR /app 7 | 8 | COPY *.py /app 9 | RUN mkdir /app/faces 10 | COPY faces /app/faces 11 | ENV FLASK_APP=server 12 | ENV LC_ALL=C.UTF-8 13 | ENV LANG=C.UTF-8 14 | CMD python3 -m flask run --host 0.0.0.0 15 | -------------------------------------------------------------------------------- /server/Dockerfile.dlib: -------------------------------------------------------------------------------- 1 | # This Dockerfile should not be run directly. 
Instead, run ./build-aio-dev.sh 2 | FROM nvcr.io/nvidia/l4t-base:r32.6.1 3 | 4 | ARG ARCH=arm64 5 | 6 | #xavier NX 7 | #ARG SOC=t194 8 | 9 | #tx2 10 | ARG SOC=t186 11 | ARG DLIB_VERSION=19.22 12 | USER root 13 | 14 | # add l4t repo 15 | ADD https://repo.download.nvidia.com/jetson/jetson-ota-public.asc /etc/apt/trusted.gpg.d/jetson-ota-public.asc 16 | RUN chmod 644 /etc/apt/trusted.gpg.d/jetson-ota-public.asc && \ 17 | apt-get update && apt-get install -y ca-certificates && \ 18 | echo "deb https://repo.download.nvidia.com/jetson/common r32.6 main" > /etc/apt/sources.list.d/nvidia-container-runtime.list && \ 19 | echo "deb https://repo.download.nvidia.com/jetson/${SOC} r32.6 main" >> /etc/apt/sources.list.d/nvidia-container-runtime.list && \ 20 | apt-get update 21 | 22 | RUN apt-get install -y libcublas-dev libcusolver-dev-10-2 cuda-nvcc-10-2 cuda-cudart-dev-10-2 cuda-compiler-10-2 cmake git libcudnn8-dev \ 23 | libopenblas-dev liblapack-dev libcurand-dev-10-2 24 | 25 | WORKDIR /root 26 | RUN wget http://dlib.net/files/dlib-${DLIB_VERSION}.tar.bz2 && tar xvf dlib* && rm *.tar.bz2 27 | RUN cd dlib-${DLIB_VERSION} && \ 28 | sed -i 's,forward_algo = forward_best_algo;,//forward_algo = forward_best_algo;,g' dlib/cuda/cudnn_dlibapi.cpp && \ 29 | mkdir build && \ 30 | cd build && \ 31 | cmake .. -DDLIB_USE_CUDA=1 && \ 32 | cmake --build . -- -j 5 33 | 34 | RUN apt-get install -y python3-all-dev python3-setuptools && \ 35 | cd dlib-${DLIB_VERSION} && \ 36 | python3 setup.py install 37 | 38 | RUN apt-get install -y python3-pil python3-pip python3-numpy && pip3 install face_recognition 39 | 40 | -------------------------------------------------------------------------------- /server/Dockerfile.rpi4: -------------------------------------------------------------------------------- 1 | FROM registry.fedoraproject.org/fedora-minimal:36-aarch64 2 | RUN microdnf install -y --setopt=install_weak_deps=0 python3 python3-flask dlib python3-dlib python3-opencv \ 3 | python3-pillow python3-pip python3-numpy python3-setuptools python3-requests 4 | RUN pip3 install face_recognition 5 | WORKDIR /app 6 | 7 | COPY *.py /app 8 | RUN mkdir /app/faces 9 | COPY faces /app/faces 10 | ENV FLASK_APP=server 11 | ENV LC_ALL=C.UTF-8 12 | ENV LANG=C.UTF-8 13 | CMD python3 -m flask run --host 0.0.0.0 14 | -------------------------------------------------------------------------------- /server/build_push.sh: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | 3 | podman build . 
-f Dockerfile.rpi4 -t docker.io/mangelajo/cam-server:latest-nogpu 4 | podman push docker.io/mangelajo/cam-server:latest-nogpu 5 | oc delete pod -l app=camserver --grace-period=1 --wait=false 6 | while true; do 7 | sleep 5 8 | oc logs -f -l app=camserver 9 | done 10 | -------------------------------------------------------------------------------- /server/cam-server.yaml: -------------------------------------------------------------------------------- 1 | apiVersion: apps/v1 2 | kind: Deployment 3 | metadata: 4 | name: camserver 5 | labels: 6 | app: camserver 7 | spec: 8 | replicas: 1 9 | selector: 10 | matchLabels: 11 | app: camserver 12 | template: 13 | metadata: 14 | labels: 15 | app: camserver 16 | spec: 17 | containers: 18 | - name: camserver 19 | image: docker.io/mangelajo/cam-server:latest #quay.io/mangelajo/cam-server:latest 20 | imagePullPolicy: Always 21 | ports: 22 | - containerPort: 5000 23 | 24 | -------------------------------------------------------------------------------- /server/faces.py: -------------------------------------------------------------------------------- 1 | import cv2 2 | import face_recognition 3 | import numpy as np 4 | 5 | 6 | # TODO, traverse directories and generate automatically 7 | ricky1_face_encoding = face_recognition.face_encodings( 8 | face_recognition.load_image_file("faces/Ricardo Noriega/ricky1.jpg"))[0] 9 | 10 | ajo1_face_encoding = face_recognition.face_encodings( 11 | face_recognition.load_image_file("faces/Miguel Angel Ajo/ajo1.jpg"))[0] 12 | 13 | known_face_encodings = [ 14 | ricky1_face_encoding, 15 | ajo1_face_encoding, 16 | ] 17 | 18 | known_face_names = [ 19 | "Ricardo Noriega", 20 | "Miguel Angel Ajo" 21 | ] 22 | 23 | RATIO = 0.25 24 | 25 | 26 | def find_and_mark_faces(frame, logger, cam_ip): 27 | small_frame = cv2.resize(frame, (0, 0), fx=RATIO, fy=RATIO) 28 | face_locations = face_recognition.face_locations(small_frame, 1, "hog") 29 | names = [] 30 | face_encodings = face_recognition.face_encodings(small_frame, face_locations) 31 | for face_encoding in face_encodings: 32 | matches = face_recognition.compare_faces(known_face_encodings, face_encoding) 33 | name = "Unknown" 34 | face_distances = face_recognition.face_distance(known_face_encodings, face_encoding) 35 | best_match_index = np.argmin(face_distances) 36 | if matches[best_match_index]: 37 | name = known_face_names[best_match_index] 38 | names.append(name) 39 | 40 | logger.info("[cam %s] face_locations = %s, names = %s", cam_ip, face_locations, names) 41 | 42 | for (top, right, bottom, left), name in zip(face_locations, names): 43 | add_name_box(frame, left, top, bottom, right, name) 44 | 45 | return frame 46 | 47 | 48 | def add_name_box(frame, left, top, bottom, right, name): 49 | inv_ratio = 1.0 / RATIO 50 | top = int(top * inv_ratio) 51 | right = int(right * inv_ratio) 52 | bottom = int(bottom * inv_ratio) 53 | left = int(left * inv_ratio) 54 | cv2.rectangle(frame, (left, top), (right, bottom), (0, 0, 255), 2) 55 | cv2.rectangle(frame, (left, bottom - 35), (right, bottom), (0, 0, 255), cv2.FILLED) 56 | font = cv2.FONT_HERSHEY_DUPLEX 57 | cv2.putText(frame, name, (left + 6, bottom - 6), font, 1.0, (255, 255, 255), 1) 58 | -------------------------------------------------------------------------------- /server/faces/Miguel Angel Ajo/ajo1.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/redhat-et/AI-for-edge-microshift-demo/39decaa08f4628f95b00c0a8f4d94649bbbe5f23/server/faces/Miguel Angel Ajo/ajo1.jpg 
-------------------------------------------------------------------------------- /server/faces/Ricardo Noriega/ricky1.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/redhat-et/AI-for-edge-microshift-demo/39decaa08f4628f95b00c0a8f4d94649bbbe5f23/server/faces/Ricardo Noriega/ricky1.jpg -------------------------------------------------------------------------------- /server/mjpeg_streamer.py: -------------------------------------------------------------------------------- 1 | import requests 2 | import io 3 | 4 | # from @codeskyblue : https://stackoverflow.com/a/62424471/3278496 5 | 6 | class MjpegReader: 7 | def __init__(self, url: str): 8 | self._url = url 9 | 10 | def iter_content(self): 11 | """ 12 | Raises: 13 | RuntimeError 14 | """ 15 | r = requests.get(self._url, stream=True) 16 | 17 | # parse boundary 18 | content_type = r.headers['content-type'] 19 | index = content_type.rfind("boundary=") 20 | assert index != 1 21 | boundary = content_type[index+len("boundary="):] + "\r\n" 22 | boundary = boundary.encode('utf-8') 23 | 24 | rd = io.BufferedReader(r.raw) 25 | while True: 26 | length = self._parse_length(rd) 27 | yield rd.read(length) 28 | self._skip_to_boundary(rd, boundary) 29 | 30 | @staticmethod 31 | def _parse_length(rd) -> int: 32 | length = 0 33 | while True: 34 | line = rd.readline() 35 | if line == b'\r\n': 36 | return length 37 | if line.startswith(b"Content-Length"): 38 | length = int(line.decode('utf-8').split(": ")[1]) 39 | assert length > 0 40 | 41 | @staticmethod 42 | def _skip_to_boundary(rd, boundary: bytes): 43 | for _ in range(1000): 44 | line = rd.readline() 45 | if boundary in line: 46 | break 47 | else: 48 | raise RuntimeError("Boundary not detected:", boundary) -------------------------------------------------------------------------------- /server/run.sh: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | 3 | FLASK_APP=server python3 -m flask run --host 0.0.0.0 4 | 5 | -------------------------------------------------------------------------------- /server/server.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/python3 2 | from flask import Flask 3 | from flask import request 4 | from flask import Response 5 | import logging 6 | import threading 7 | import time 8 | 9 | from faces import * 10 | from mjpeg_streamer import MjpegReader 11 | 12 | # frame to be shared via mjpeg server out 13 | outputFrames = {} 14 | lock = threading.Lock() 15 | 16 | app = Flask(__name__) 17 | app.logger.setLevel(logging.INFO) 18 | 19 | 20 | @app.route("/") 21 | def hello_world(): 22 | return "

Hello, World!
" 23 | 24 | 25 | @app.route("/register") 26 | def register(): 27 | if "token" not in request.args or "ip" not in request.args: 28 | return "Invalid registration, you need both token and ip", 400 29 | 30 | cam_ip = request.args["ip"] 31 | cam_token = request.args["token"] 32 | 33 | app.logger.info("camera @%s registered with token %s", cam_ip, cam_token) 34 | threading.Thread(target=streamer_thread, name=None, args=[cam_ip, cam_token]).start() 35 | 36 | return "ACK", 200 37 | 38 | 39 | def streamer_thread(cam_ip, cam_token): 40 | 41 | app.logger.info("starting streamer thread for cam %s", cam_ip) 42 | 43 | mr = MjpegReader("http://" + cam_ip + "/stream?token=" + cam_token) 44 | for content in mr.iter_content(): 45 | frame = cv2.imdecode(np.frombuffer(content, dtype=np.uint8), cv2.IMREAD_COLOR) 46 | process_streamer_frame(cam_ip, frame) 47 | 48 | 49 | def process_streamer_frame(cam_ip, frame): 50 | global outputFrames, lock 51 | frame_out = find_and_mark_faces(frame, app.logger, cam_ip) 52 | with lock: 53 | outputFrames[cam_ip] = frame_out.copy() 54 | 55 | 56 | @app.route("/video_feed") 57 | def video_feed(): 58 | # return the response generated along with the specific media 59 | # type (mime type) 60 | return Response(generate_video_feed(), 61 | mimetype="multipart/x-mixed-replace; boundary=frame") 62 | 63 | 64 | def generate_video_feed(): 65 | # grab global references to the output frame and lock variables 66 | global outputFrames, lock 67 | # loop over frames from the output stream 68 | while True: 69 | # wait until the lock is acquired 70 | with lock: 71 | # check if the output frame is available, otherwise skip 72 | # the iteration of the loop 73 | if not outputFrames: 74 | time.sleep(0.01) 75 | continue 76 | 77 | # encode the frame in JPEG format 78 | 79 | frames = [] 80 | for camIP, frame in sorted(outputFrames.items()): 81 | frames.append(frame) 82 | 83 | all_cams = cv2.hconcat(frames) 84 | 85 | (flag, encodedImage) = cv2.imencode(".jpg", all_cams) 86 | 87 | # ensure the frame was successfully encoded 88 | if not flag: 89 | continue 90 | # yield the output frame in the byte format 91 | yield b'--frame\r\n' b'Content-Type: image/jpeg\r\n\r\n' + bytearray(encodedImage) + b'\r\n' 92 | 93 | time.sleep(0.1) 94 | -------------------------------------------------------------------------------- /wifi-ap/Dockerfile: -------------------------------------------------------------------------------- 1 | FROM registry.fedoraproject.org/fedora-minimal:35-aarch64 2 | RUN microdnf install -y --setopt=install_weak_deps=0 hostapd dnsmasq iproute 3 | COPY hostapd.conf /etc/hostapd/hostapd.conf 4 | COPY microshift-wlan0-dnsmasq-conf /etc/dnsmasq.d/ 5 | 6 | COPY run.sh /run.sh 7 | 8 | CMD ["/run.sh"] 9 | -------------------------------------------------------------------------------- /wifi-ap/Dockerfile.rpi4: -------------------------------------------------------------------------------- 1 | FROM registry.fedoraproject.org/fedora-minimal:35-aarch64 2 | RUN microdnf install -y --setopt=install_weak_deps=0 hostapd dnsmasq iproute 3 | COPY hostapd.rpi4.conf /etc/hostapd/hostapd.conf 4 | COPY microshift-wlan0-dnsmasq-conf /etc/dnsmasq.d/ 5 | 6 | COPY run.sh /run.sh 7 | 8 | CMD ["/run.sh"] 9 | -------------------------------------------------------------------------------- /wifi-ap/build_push.sh: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | podman build . 
-f Dockerfile.rpi4 -t quay.io/mangelajo/microshift-ap-cams:latest-rpi4 3 | podman push quay.io/mangelajo/microshift-ap-cams:latest-rpi4 4 | 5 | -------------------------------------------------------------------------------- /wifi-ap/build_run.sh: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | docker build . -t quay.io/mangelajo/microshift-ap-cams:latest 3 | docker run -ti --rm --privileged --net=host --name wifiap quay.io/mangelajo/microshift-ap-cams:latest 4 | 5 | -------------------------------------------------------------------------------- /wifi-ap/cam-ap.yaml: -------------------------------------------------------------------------------- 1 | apiVersion: apps/v1 2 | kind: Deployment 3 | metadata: 4 | name: cameras-ap 5 | labels: 6 | infra: ap 7 | spec: 8 | replicas: 1 9 | selector: 10 | matchLabels: 11 | infra: ap 12 | template: 13 | metadata: 14 | labels: 15 | infra: ap 16 | spec: 17 | hostNetwork: true 18 | containers: 19 | - name: camwifi-ap 20 | image: quay.io/mangelajo/microshift-ap-cams:latest-rpi4 21 | imagePullPolicy: Always 22 | securityContext: 23 | privileged: true # otherwise hostapd can't change channels for some reason 24 | capabilities: 25 | add: 26 | - NET_ADMIN 27 | - NET_RAW 28 | 29 | -------------------------------------------------------------------------------- /wifi-ap/hostapd.conf: -------------------------------------------------------------------------------- 1 | interface=wlan0 2 | driver=nl80211 3 | #driver=hostap 4 | ssid=camwifi 5 | wmm_enabled=1 6 | hw_mode=g 7 | ieee80211n=1 8 | ieee80211d=1 9 | channel=6 10 | beacon_int=100 11 | dtim_period=2 12 | auth_algs=1 13 | ignore_broadcast_ssid=0 14 | wpa=2 15 | country_code=ES 16 | logger_stdout=2 17 | logger_stdout_level=0 18 | ctrl_interface=/var/run/hostapd 19 | wpa_key_mgmt=WPA-PSK 20 | rsn_pairwise=CCMP 21 | wpa_passphrase=thisisacamwifi 22 | -------------------------------------------------------------------------------- /wifi-ap/hostapd.rpi4.conf: -------------------------------------------------------------------------------- 1 | interface=wlan0 2 | ssid=camwifi 3 | wpa_passphrase=thisisacamwifi 4 | hw_mode=g 5 | channel=6 6 | ieee80211n=1 7 | wmm_enabled=1 8 | ignore_broadcast_ssid=0 9 | wpa=2 10 | wpa_key_mgmt=WPA-PSK 11 | rsn_pairwise=CCMP 12 | 13 | #ieee80211d=1 14 | 15 | #beacon_int=100 16 | #dtim_period=2 17 | #auth_algs=1 18 | 19 | #country_code=ES 20 | logger_stdout=2 21 | logger_stdout_level=0 22 | ctrl_interface=/var/run/hostapd 23 | 24 | -------------------------------------------------------------------------------- /wifi-ap/idea-hostap-container.txt: -------------------------------------------------------------------------------- 1 | https://fwhibbit.es/punto-de-acceso-automatizado-con-docker-y-raspberry-pi-zero-w 2 | 3 | 4 | -------------------------------------------------------------------------------- /wifi-ap/microshift-wlan0-dnsmasq-conf: -------------------------------------------------------------------------------- 1 | interface=wlan0 2 | bind-interfaces 3 | dhcp-option=3,192.168.66.1 4 | dhcp-option=6,192.168.66.1 5 | expand-hosts 6 | domain=cams 7 | local=/cams/ 8 | dhcp-authoritative 9 | dhcp-range=192.168.66.50,192.168.66.150,12h 10 | dhcp-leasefile=/var/lib/misc/dnsmasq.leases 11 | log-queries 12 | log-dhcp 13 | 14 | 15 | -------------------------------------------------------------------------------- /wifi-ap/run.sh: -------------------------------------------------------------------------------- 1 | #!/bin/sh 2 | set -e 
3 | set -x 4 | 5 | WIFI=wlan0 6 | 7 | ip a add 192.168.66.1/24 dev "${WIFI}" || true 8 | ip l set dev "${WIFI}" up 9 | 10 | hostapd /etc/hostapd/hostapd.conf & 11 | dnsmasq -d & 12 | wait -n 13 | kill 0 14 | --------------------------------------------------------------------------------