├── .gitignore ├── 3d_files ├── AQI Box - cover.stl ├── AQI Box - main housing.stl └── AQI Box - vented cover.stl ├── README.md ├── What's that Smell - Codemash 2024.pdf ├── api ├── main.py └── requirements.txt ├── consumer ├── alert.py ├── consumer_json.py ├── consumer_ts.py ├── consumer_ts_madd.py └── requirements.txt ├── pico_w ├── .picowgo ├── PMS5003.py ├── README.md ├── example.secrets.py ├── main.py ├── picoredis.py └── utility.py └── requirements.txt /.gitignore: -------------------------------------------------------------------------------- 1 | secrets.py 2 | __pycache__ 3 | .DS_Store 4 | .vscode 5 | venv -------------------------------------------------------------------------------- /3d_files/AQI Box - cover.stl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/redis-developer/redis-aqi-monitor/5cab3b74a75191aa103a13baa0ad128f568ab522/3d_files/AQI Box - cover.stl -------------------------------------------------------------------------------- /3d_files/AQI Box - main housing.stl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/redis-developer/redis-aqi-monitor/5cab3b74a75191aa103a13baa0ad128f568ab522/3d_files/AQI Box - main housing.stl -------------------------------------------------------------------------------- /3d_files/AQI Box - vented cover.stl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/redis-developer/redis-aqi-monitor/5cab3b74a75191aa103a13baa0ad128f568ab522/3d_files/AQI Box - vented cover.stl -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Redis AQI Monitor 2 | 3 | This repository is a suite of scripts intended to demonstrate the capabilities of Redis Stack with IoT. 
Sensor readings are stored in a Redis Stream data structure and fan out to TimeSeries and JSON documents. 4 | 5 | ## Filesystem 6 | 7 | There are three top-level folders, each covering a different aspect of gathering and interpreting sensor data: 8 | - :open_file_folder: `pico_w`: houses the code installed on the Raspberry Pi Pico W unit 9 | - :open_file_folder: `consumer`: contains services that consume data from the main stream of sensor readings and create other data structures, such as TimeSeries or JSON 10 | - :open_file_folder: `api`: a FastAPI server that serves relevant sensor data to Grafana. It also has sample query endpoints to demonstrate the JSON query capabilities 11 | 12 | 13 | ## Hardware Overview 14 | A Raspberry Pi Pico W compute unit, essentially a microcontroller capable of running a trimmed-down version of Python (MicroPython), is wired to a readily available air particulate sensor. The sensor returns an array of particulate matter densities in micrograms per cubic meter. For this application, particulate matter of 2.5 microns or smaller was chosen as the metric to track; this size range is the largest threat to human health during wildfires and heavy air pollution. 15 | 16 | The Pico W unit has onboard wireless capabilities and requires approximately 4.5 volts to operate, which means it can run on batteries anywhere within range of a wireless access point. 17 | 18 | ## Software Overview 19 | 20 | ### The portable sensor unit 21 | The Pico W unit, running the software in this project, records a particulate reading every five seconds. The value is also converted to the standard AQI value most commonly seen on online air quality maps, such as PurpleAir.com. The Pico W also has an onboard temperature sensor, so a temperature reading is recorded as well for additional data tracking. 22 | 23 | These three values are sent to a stream data structure housed in a cloud instance of Redis.
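The PM2.5-to-AQI conversion mentioned above is handled on the Pico W (by `utility.convert`, whose source is not shown here). As a rough illustration of how such a conversion typically works, here is a minimal sketch using the pre-2024 EPA PM2.5 breakpoint table with linear interpolation; `pm25_to_aqi` and the breakpoint constants are illustrative names, not this project's code:

```python
# Pre-2024 EPA PM2.5 breakpoints: (conc_low, conc_high, aqi_low, aqi_high)
BREAKPOINTS = [
    (0.0, 12.0, 0, 50),
    (12.1, 35.4, 51, 100),
    (35.5, 55.4, 101, 150),
    (55.5, 150.4, 151, 200),
    (150.5, 250.4, 201, 300),
    (250.5, 350.4, 301, 400),
    (350.5, 500.4, 401, 500),
]

def pm25_to_aqi(concentration: float) -> int:
    """Map a PM2.5 concentration (µg/m³) to an AQI value by linear
    interpolation within the matching breakpoint band."""
    c = min(max(concentration, 0.0), 500.4)  # clamp to the table's range
    for c_lo, c_hi, aqi_lo, aqi_hi in BREAKPOINTS:
        if c <= c_hi:
            return round((aqi_hi - aqi_lo) / (c_hi - c_lo) * (c - c_lo) + aqi_lo)
    return 500
```

For example, a reading of 35.4 µg/m³ lands exactly at the top of the "moderate" band and maps to an AQI of 100.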
A cloud infrastructure was chosen to ensure high availability to any sensor connected to the internet; there is no need for a local machine serving Redis. 24 | 25 | ### Consumers of the stream data structure 26 | Sources that send data to a stream are called *producers*. A producer can be a sensor, another Redis data structure, or even another stream. Services that process data from the stream are called *consumers*. 27 | 28 | In this project, two consumers run continuously. 29 | 30 | The `consumer_ts.py` script reads each entry from the stream (`sensor:raw`) and adds the values to their respective TimeSeries data structures: one each for `temp`, `pm2_5`, and `aqi`. 31 | 32 | ```python 33 | result = redis.xread( 34 | streams={STREAM_KEY: last_stream_entry_id}, 35 | count=1, 36 | block=50000) 37 | 38 | payload = result[0][1][0] # payload for stream entry 1678071037305-0 39 | # extract values from the payload 40 | timestamp = payload[0][:13] # stream ID without the sequence segment: 1678071037305 41 | target = payload[1]['target'] 42 | pm2_5 = payload[1]['PM2.5'] 43 | aqi = payload[1]['AQI'] 44 | temp = payload[1]['temp'] 45 | 46 | # establish timeseries key prefix for target location 47 | ts_key_prefix = f'ts:{target}' 48 | 49 | try: 50 | # create three separate timeseries entries from each stream entry 51 | ts_entry_aqi = redis.ts().add(f'{ts_key_prefix}:aqi', timestamp[:10], aqi, duplicate_policy='first') 52 | ts_entry_pm25 = redis.ts().add(f'{ts_key_prefix}:pm', timestamp[:10], pm2_5, duplicate_policy='first') 53 | ts_entry_temp = redis.ts().add(f'{ts_key_prefix}:temp', timestamp[:10], temp, duplicate_policy='first') 54 | ``` 55 | 56 | The `consumer_json.py` file also reads each entry from the stream and updates a JSON document dedicated to each sensor location. A text message alert system is also included in this file to send third-party notifications when the AQI value has remained above a defined threshold for one minute.
57 | 58 | ```python 59 | result = r.json().set(json_key, '.', 60 | { 'timestamp': timestamp, 61 | 'current_pm2_5': pm2_5, 62 | 'current_temp': temp, 63 | 'current_aqi': aqi, 64 | 'last_12': last_12}) 65 | ``` 66 | 67 | There is a rolling list property within each JSON document called `last_12`: an array of the last 12 AQI readings, a one-minute snapshot. If the sum of the readings in this array is above a defined threshold, the text message alert is sent, on the assumption that AQI has been consistently higher than normal and the user should be alerted. Once a text message is sent, a temporary key `user_notified` is created that acts as a boolean guard against sending repeated messages; a sensible default of one hour between notifications is enforced by letting the key expire. 68 | 69 | ```python 70 | location_json = r.json().get(json_key) 71 | 72 | # if we need to create this JSON document, populate the array 73 | if location_json is None: 74 | last_12 = [0,0,0,0,0,0,0,0,0,0,0,0] 75 | else: 76 | last_12 = location_json['last_12'] 77 | 78 | last_12.append(aqi) 79 | last_12.pop(0) 80 | sum_last_12 = sum(last_12) 81 | 82 | # alert if threshold is crossed: 83 | if sum_last_12 >= AQI_THRESHOLD: 84 | aqi_average = floor(sum_last_12/12) 85 | has_been_notified = r.get('user_notified') 86 | if not has_been_notified: 87 | alert(aqi_average, target) 88 | r.set('user_notified', 1, ex=3600) 89 | ``` 90 | 91 | 92 | ### Serving the sensor data 93 | 94 | This project contains a FastAPI web server under the folder `api`. There are several routes used by Grafana for listing all available timeseries and retrieving the actual data points for plotting. 
Here are the endpoints: 95 | 96 | | method | endpoint | purpose | 97 | |--------|-----------|---------| 98 | | `GET` | `/` | returns a JSON object of all sensors and a sample of the last ten entries in the stream of sensor readings | 99 | | `GET` | `/json/query/{sensor}/aqi?min=int&max=int` | an example of the query capabilities Redis offers for JSON documents; returns a specified sensor's readings where the AQI **value** is between `min` and `max` | 100 | | `POST` | `/query` | receives a JSON request from Grafana's SimpleJSON plugin and returns an array of one or more sensor readings for a given time frame | 101 | | `POST` | `/search` | part of an autocomplete feature within Grafana that returns an array of available sensors | 102 | 103 | The SimpleJSON plugin for Grafana provides a quick way to integrate the timeseries data. After receiving a request from Grafana specifying a time window, one or more locations, and the interval of the data points, the `TS.RANGE` command is executed to retrieve the values and timestamps: 104 | 105 | ```python 106 | for target_request in targets: 107 | target = target_request['target'] 108 | from_time = body['range']['from'] 109 | to_time = body['range']['to'] 110 | interval = body['intervalMs']/100 111 | 112 | ts_key = f'ts:{target}:aqi' 113 | from_time = (parse(from_time) - timedelta(hours=8)).strftime('%s') 114 | to_time = (parse(to_time) - timedelta(hours=8)).strftime('%s') 115 | 116 | # request a specified range on timeseries 117 | results = redis.ts().range(ts_key, 118 | from_time, 119 | to_time, 120 | aggregation_type='avg', 121 | bucket_size_msec=int(interval)) 122 | ``` 123 | 124 | -------------------------------------------------------------------------------- /What's that Smell - Codemash 2024.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/redis-developer/redis-aqi-monitor/5cab3b74a75191aa103a13baa0ad128f568ab522/What's that 
Smell - Codemash 2024.pdf -------------------------------------------------------------------------------- /api/main.py: -------------------------------------------------------------------------------- 1 | # cSpell:disable 2 | from fastapi import FastAPI, Request 3 | import os 4 | from dateutil.parser import * 5 | from datetime import * 6 | import redis 7 | 8 | TIMEFORMAT='%Y-%m-%d %H:%M:%S' 9 | TIMEZONE_DIFF=5 10 | redis = redis.Redis( 11 | host=os.getenv("AQI_HOST"), 12 | port=os.getenv("AQI_PORT"), 13 | password=os.getenv("AQI_PASS"), 14 | decode_responses=True 15 | ) 16 | 17 | app = FastAPI() 18 | 19 | system_profile = { 20 | 'active_sensors': {}, 21 | 'sensor_names': [], 22 | 'active_since': datetime.now() 23 | } 24 | 25 | # returns information on the sensors connected to Redis 26 | @app.get("/") 27 | async def root(): 28 | # fetch all active sensor keys 29 | active_sensors = redis.keys('ttl:*') 30 | # remove the tll prefix 31 | system_profile['sensor_names'] = [] 32 | for sensor in active_sensors: 33 | system_profile['sensor_names'].append(sensor[4:]) 34 | 35 | # get total count of active sensors 36 | system_profile['active_sensors'] = len(active_sensors) 37 | # get a sample of the last ten sensor readings 38 | system_profile['latest_10'] = redis.xrevrange('sensor:raw', '+','-' , 10) 39 | return {"response": system_profile} 40 | 41 | # search endpoint for Grafana 42 | @app.post("/search") 43 | async def search(request: Request): 44 | body = await request.body() 45 | active_sensors = redis.keys('ttl:*') 46 | formatted_sensors = [] 47 | 48 | for sensor in active_sensors: 49 | formatted_sensors.append(sensor[4:]) 50 | # return a list of all active sensors to choose from 51 | return formatted_sensors 52 | 53 | # returns an array of timestamps and values based on json request from Grafana 54 | @app.post("/query") 55 | async def query(request: Request): 56 | body = await request.json() 57 | targets = body['targets'] 58 | response=[] 59 | # set up iterator to query 
for one or multiple TS and return in results_array 60 | for target_request in targets: 61 | target = target_request['target'] 62 | from_time = body['range']['from'] 63 | to_time = body['range']['to'] 64 | interval = body['intervalMs']/100 65 | print(target) 66 | ts_key = f'ts:{target}:aqi' 67 | from_time = (parse(from_time) - timedelta(hours=TIMEZONE_DIFF)).strftime('%s') 68 | print(f'from_time: {from_time}') 69 | 70 | to_time = (parse(to_time) - timedelta(hours=TIMEZONE_DIFF)).strftime('%s') 71 | print(f'to_time: {to_time}') 72 | 73 | # request a specified range on timeseries 74 | results = redis.ts().range(ts_key, from_time, to_time, 75 | aggregation_type='avg', 76 | bucket_size_msec=int(interval)) 77 | print('HIT RANGE') 78 | print(ts_key) 79 | print(results) 80 | # iterate through results, and prepare response payload 81 | results_list = [] 82 | for index, tuple in enumerate(results): 83 | graf_data = tuple[1] 84 | graf_stamp = int(tuple[0])*1000 # datetime.fromtimestamp(tuple[0]).strftime(TIMEFORMAT) 85 | results_list.append([graf_data, graf_stamp]) 86 | response.append({'target' : target, 'datapoints' : results_list}) 87 | 88 | print(response) 89 | return response 90 | -------------------------------------------------------------------------------- /api/requirements.txt: -------------------------------------------------------------------------------- 1 | anyio==3.6.2 2 | async-timeout==4.0.2 3 | fastapi==0.92.0 4 | idna==3.4 5 | importlib-metadata==6.0.0 6 | pydantic==1.10.5 7 | redis==4.5.1 8 | sniffio==1.3.0 9 | starlette==0.25.0 10 | typing-extensions==4.5.0 11 | uvicorn==0.21.0 12 | zipp==3.13.0 13 | -------------------------------------------------------------------------------- /consumer/alert.py: -------------------------------------------------------------------------------- 1 | from twilio.rest import Client 2 | import os 3 | 4 | account_sid = os.getenv('TWILIO_SID') 5 | auth_token = os.getenv('TWILIO_AUTH_TOKEN') 6 | phone_number = os.getenv('PTN') 
7 | client = Client(account_sid, auth_token) 8 | 9 | def alert(value, location): 10 | message = client.messages.create( 11 | from_='+12766001085', 12 | body=f'Hello, the current AQI is {value} at the {location} location.', 13 | to=phone_number) 14 | return message 15 | 16 | -------------------------------------------------------------------------------- /consumer/consumer_json.py: -------------------------------------------------------------------------------- 1 | import os 2 | from alert import alert 3 | from redis import Redis 4 | from math import floor 5 | 6 | STREAM_KEY = 'sensor:raw' 7 | AQI_THRESHOLD = 100*12 8 | 9 | r = Redis(host=os.getenv('REDIS_HOST'), 10 | port=os.getenv('REDIS_PORT'), 11 | password=os.getenv('REDIS_PASS'), 12 | decode_responses=True) 13 | 14 | # placeholder if there is a pause in consumer service 15 | json_stream_entry_id = r.get('json_stream_entry_id') 16 | 17 | while(True): 18 | # read first result from stream that receives raw sensor data 19 | result = r.xread(streams={STREAM_KEY: json_stream_entry_id}, 20 | count=1, 21 | block=50000) 22 | 23 | # extract values from result 24 | entry_stream_id = result[0][1][0][0] 25 | timestamp = int(result[0][1][0][0][:13]) 26 | sensor_readings = result[0][1][0][1] 27 | 28 | target = sensor_readings["target"] 29 | json_key = f'json:{target}' 30 | 31 | pm2_5 = int(sensor_readings["PM2.5"]) 32 | temp = float(sensor_readings["temp"]) 33 | aqi = int(sensor_readings["AQI"]) 34 | 35 | # check for 12 * 5 second threshold readings in a row (1 minute) 36 | location_json = r.json().get(json_key) 37 | # if a json object has not yet been made, create an array 38 | if location_json is None: 39 | last_12 = [0,0,0,0,0,0,0,0,0,0,0,0] 40 | else: 41 | last_12 = location_json['last_12'] 42 | 43 | # "slide" out the oldest reading, add the newest 44 | last_12.append(aqi) 45 | last_12.pop(0) 46 | 47 | # alert if threshold is crossed: 48 | if sum(last_12) >= AQI_THRESHOLD: 49 | aqi_average = floor(sum(last_12)/12) 50 | 
has_been_notified = r.get('user_notified') 51 | if not has_been_notified: 52 | alert(aqi_average, target) 53 | r.set('user_notified', 1, 3600) 54 | 55 | # create a new JSON document or update an existing one 56 | try: 57 | result = r.json().set(json_key, '.', 58 | { 'timestamp': timestamp, 59 | 'current_pm2_5': pm2_5, 60 | 'current_temp': temp, 61 | 'current_aqi': aqi, 62 | 'last_12': last_12 63 | }) 64 | 65 | except: 66 | print(f'Error:\nkey: {json_key}') 67 | finally: 68 | # update entry_stream_id so we know where to pull the next entry 69 | last_entry = int(entry_stream_id[14:])+1 70 | new_stream_id = f'{entry_stream_id[:14]}{last_entry}' 71 | r.set('json_stream_entry_id', new_stream_id) 72 | json_stream_entry_id = new_stream_id 73 | -------------------------------------------------------------------------------- /consumer/consumer_ts.py: -------------------------------------------------------------------------------- 1 | # cSpell:disable 2 | import os 3 | from redis import Redis 4 | 5 | STREAM_KEY = "sensor:raw" 6 | RETENTION = 1000*60*60*24*3 # 3 days of milliseconds 7 | 8 | redis = Redis(host=os.getenv("AQI_HOST"), 9 | port=os.getenv("AQI_PORT"), 10 | password=os.getenv("AQI_PASS"), 11 | decode_responses=True) 12 | 13 | stream_entry_id = '$' 14 | 15 | while(True): 16 | 17 | # read newest result from stream 18 | result = redis.xread(block=50000, streams={STREAM_KEY: stream_entry_id}) 19 | 20 | payload = result[0][1][0] # payload for stream entry 1678071037305-0 21 | # extract values form payload 22 | timestamp = payload[0][:10] # stream id without the segment: 1678071037305 23 | print(timestamp) 24 | ts_key_prefix = f'ts:{payload[1]["target"]}' 25 | sensor_values = payload[1] 26 | 27 | try: 28 | # create three separate timeseries entries from each stream entry 29 | ts_entry_aqi = redis.ts().add(f'{ts_key_prefix}:aqi', 30 | timestamp, 31 | sensor_values["AQI"], 32 | retention_msecs=RETENTION, 33 | duplicate_policy='first') 34 | 35 | ts_entry_pm25 = 
redis.ts().add(f'{ts_key_prefix}:pm', 36 | timestamp, 37 | sensor_values["PM2.5"], 38 | retention_msecs=RETENTION, 39 | duplicate_policy='first') 40 | 41 | ts_entry_temp = redis.ts().add(f'{ts_key_prefix}:temp', 42 | timestamp, 43 | sensor_values["temp"], 44 | retention_msecs=RETENTION, 45 | duplicate_policy='first') 46 | 47 | except Exception as err: 48 | # report any errors in adding to timeseries 49 | print(f'Unexpected {err}, {type(err)}') 50 | 51 | finally: 52 | # update last stream entry id for next iteration 53 | stream_entry_id = result[0][1][0][0] -------------------------------------------------------------------------------- /consumer/consumer_ts_madd.py: -------------------------------------------------------------------------------- 1 | # cSpell:disable 2 | import os 3 | from redis import Redis 4 | 5 | STREAM_KEY = "sensor:raw" 6 | 7 | redis = Redis(host=os.getenv("REDIS_HOST"), 8 | port=os.getenv("REDIS_PORT"), 9 | password=os.getenv("REDIS_PASS"), 10 | decode_responses=True) 11 | 12 | stream_entry_id = redis.get("ts_stream_entry_id") or 0 13 | 14 | # read first result from stream 15 | result = redis.xread(streams={STREAM_KEY: stream_entry_id}, block=50000) 16 | 17 | stream_results = result[0][1] 18 | 19 | ts_madd_array = [] 20 | 21 | redis.ts().create('ts:unit_1:temp', duplicate_policy= 'first') 22 | redis.ts().create('ts:unit_1:aqi', duplicate_policy= 'first') 23 | redis.ts().create('ts:unit_1:pm', duplicate_policy= 'first') 24 | 25 | # redis.ts().create('ts:livingroom:temp', DUPLICATE_POLICY= 'first') 26 | # redis.ts().create('ts:livingroom:aqi', DUPLICATE_POLICY= 'first') 27 | # redis.ts().create('ts:livingroom:pm', DUPLICATE_POLICY= 'first') 28 | 29 | for entry in stream_results: 30 | target = f'ts:{entry[1]["target"]}' 31 | print(target) 32 | my_tuple = (f'{target}:aqi', entry[0][:10], entry[1]['AQI']) 33 | ts_madd_array.append(my_tuple) 34 | 35 | my_tuple = (f'{target}:temp', entry[0][:10], entry[1]['temp']) 36 | ts_madd_array.append(my_tuple) 
37 | 38 | my_tuple = (f'{target}:pm', entry[0][:10], entry[1]['PM2.5']) 39 | ts_madd_array.append(my_tuple) 40 | 41 | 42 | print(ts_madd_array) 43 | results = redis.ts().madd(ts_madd_array) 44 | print(results) 45 | -------------------------------------------------------------------------------- /consumer/requirements.txt: -------------------------------------------------------------------------------- 1 | anyio==3.6.2 2 | async-timeout==4.0.2 3 | certifi==2022.12.7 4 | charset-normalizer==3.1.0 5 | fastapi==0.91.0 6 | idna==3.4 7 | importlib-metadata==6.0.0 8 | PyJWT==2.6.0 9 | pytz==2022.7.1 10 | redis==4.5.1 11 | requests==2.28.2 12 | six==1.16.0 13 | sniffio==1.3.0 14 | starlette==0.24.0 15 | twilio==7.16.5 16 | typing_extensions==4.5.0 17 | urllib3==1.26.15 18 | zipp==3.15.0 19 | -------------------------------------------------------------------------------- /pico_w/.picowgo: -------------------------------------------------------------------------------- 1 | {'info': 'This file is just used to identify a project folder.'} -------------------------------------------------------------------------------- /pico_w/PMS5003.py: -------------------------------------------------------------------------------- 1 | import ustruct as struct 2 | import time 3 | 4 | import machine 5 | 6 | 7 | __version__ = '0.0.7' 8 | 9 | 10 | PMS5003_SOF = bytearray(b'\x42\x4d') 11 | PMS5003_CMD_MODE_PASSIVE = b'\xe1\x00\x00' 12 | PMS5003_CMD_MODE_ACTIVE = b'\xe1\x00\x01' 13 | PMS5003_CMD_READ = b'\xe2\x00\x00' 14 | PMS5003_CMD_SLEEP = b'\xe4\x00\x00' 15 | PMS5003_CMD_WAKEUP = b'\xe4\x00\x01' 16 | 17 | PMSA003I_I2C_ADDR = 0x12 18 | 19 | 20 | class ChecksumMismatchError(RuntimeError): 21 | pass 22 | 23 | 24 | class FrameLengthError(RuntimeError): 25 | pass 26 | 27 | 28 | class ReadTimeoutError(RuntimeError): 29 | pass 30 | 31 | 32 | class SerialTimeoutError(RuntimeError): 33 | pass 34 | 35 | 36 | class PMS5003Response: 37 | FRAME_LEN = None 38 | DATA_LEN = None 39 | DATA_FMT = None 40 | 
CHECKSUM_IDX = None 41 | 42 | @classmethod 43 | def check_data_len(cls, raw_data_len, desc="Data"): 44 | if raw_data_len != cls.DATA_LEN: 45 | raise FrameLengthError("{} too {} {:d} bytes".format( 46 | desc, 47 | "short" if raw_data_len < cls.DATA_LEN else "long", 48 | raw_data_len 49 | )) 50 | 51 | def __init__(self, raw_data, *, frame_length_bytes): 52 | raw_data_len = len(raw_data) 53 | self.check_data_len(raw_data_len) 54 | self.raw_data = raw_data 55 | self.data = struct.unpack(self.DATA_FMT, raw_data) 56 | self.checksum = self.data[self.CHECKSUM_IDX] 57 | 58 | # Don't include the checksum bytes in the checksum calculation 59 | checksum = sum(PMS5003_SOF) + sum(raw_data[:-2]) 60 | if frame_length_bytes is None: 61 | checksum += (raw_data_len >> 256) + (raw_data_len & 0xff) 62 | else: 63 | checksum += sum(frame_length_bytes) 64 | if checksum != self.checksum: 65 | raise ChecksumMismatchError("PMS5003 Checksum Mismatch {} != {}".format(checksum, 66 | self.checksum)) 67 | 68 | 69 | class PMS5003CmdResponse(PMS5003Response): 70 | FRAME_LEN = 8 71 | DATA_LEN = FRAME_LEN - 4 # includes checksum 72 | DATA_FMT = ">BBH" 73 | CHECKSUM_IDX = 2 74 | 75 | def __init__(self, raw_data, *, frame_length_bytes=None): 76 | super().__init__(raw_data, frame_length_bytes=frame_length_bytes) 77 | 78 | 79 | class PMS5003Data(PMS5003Response): 80 | FRAME_LEN = 32 81 | DATA_LEN = FRAME_LEN - 4 # includes checksum 82 | DATA_FMT = ">HHHHHHHHHHHHHH" 83 | CHECKSUM_IDX = 13 84 | 85 | def __init__(self, raw_data, *, frame_length_bytes=None): 86 | super().__init__(raw_data, frame_length_bytes=frame_length_bytes) 87 | 88 | def pm_ug_per_m3(self, size, atmospheric_environment=False): 89 | if atmospheric_environment: 90 | if size == 1.0: 91 | return self.data[3] 92 | if size == 2.5: 93 | return self.data[4] 94 | if size is None: 95 | return self.data[5] 96 | 97 | else: 98 | if size == 1.0: 99 | return self.data[0] 100 | if size == 2.5: 101 | return self.data[1] 102 | if size == 10: 103 | return 
self.data[2] 104 | 105 | raise ValueError("Particle size {} measurement not available.".format(size)) 106 | 107 | def pm_per_1l_air(self, size): 108 | if size == 0.3: 109 | return self.data[6] 110 | if size == 0.5: 111 | return self.data[7] 112 | if size == 1.0: 113 | return self.data[8] 114 | if size == 2.5: 115 | return self.data[9] 116 | if size == 5: 117 | return self.data[10] 118 | if size == 10: 119 | return self.data[11] 120 | 121 | raise ValueError("Particle size {} measurement not available.".format(size)) 122 | 123 | def __repr__(self): 124 | return """{}""".format(*self.data[1:2], checksum=self.checksum) 125 | 126 | def __str__(self): 127 | return self.__repr__() 128 | 129 | 130 | class PMS5003(): 131 | MAX_RESET_TIME = 20_000 # 9.2 seconds seen in testing 132 | MAX_RESP_TIME = 5_000 133 | MIN_CMD_INTERVAL = 0.1 # mode changes with interval < 50ms break a PMS5003 134 | 135 | @staticmethod 136 | def _build_cmd_frame(cmd_bytes): 137 | """ 138 | Builds a valid command frame byte array with checksum for given command bytes 139 | """ 140 | if len(cmd_bytes) != 3: 141 | raise RuntimeError("Malformed command frame") 142 | cmd_frame = bytearray() 143 | cmd_frame.extend(PMS5003_SOF) 144 | cmd_frame.extend(cmd_bytes) 145 | cmd_frame.extend(sum(cmd_frame).to_bytes(2, "big")) 146 | return cmd_frame 147 | 148 | def __init__(self, 149 | uart, 150 | pin_reset, 151 | pin_enable, 152 | mode='active', 153 | retries=5 154 | ): 155 | self._port = uart 156 | self._serial = type(uart) is machine.UART 157 | self._mode = 'active' # device starts up in active mode 158 | 159 | self._pin_enable = pin_enable 160 | self._pin_reset = pin_reset 161 | self._attempts = retries + 1 if retries else 1 162 | 163 | if mode not in ('active', 'passive'): 164 | raise ValueError("Invalid mode") 165 | 166 | if self._pin_enable: 167 | self._pin_enable.init(machine.Pin.OPEN_DRAIN) 168 | self._pin_enable.value(1) 169 | 170 | if self._pin_reset: 171 | self._pin_reset.init(machine.Pin.OUT) 172 | 
self._pin_reset.value(1) 173 | 174 | self.reset() 175 | 176 | if mode == 'passive': 177 | self.cmd_mode_passive() 178 | 179 | def cmd_mode_passive(self): 180 | """ 181 | Sends command to device to enable 'passive' mode. 182 | In passive mode data frames are only sent in response to 183 | a read command. 184 | """ 185 | if not self._serial: 186 | return 187 | 188 | self._mode = 'passive' 189 | 190 | time.sleep(self.MIN_CMD_INTERVAL) 191 | self._reset_input_buffer() 192 | self._port.write(self._build_cmd_frame(PMS5003_CMD_MODE_PASSIVE)) 193 | # In rare cases a single data frame sneaks in giving FrameLengthError 194 | try: 195 | resp = self._read_data(PMS5003CmdResponse) 196 | except FrameLengthError: 197 | resp = self._read_data(PMS5003CmdResponse) 198 | time.sleep(self.MIN_CMD_INTERVAL) 199 | return resp 200 | 201 | def cmd_mode_active(self): 202 | """ 203 | Sends command to device to enable 'active' mode. 204 | In active mode data frames are streamed continuously at intervals 205 | ranging from 200ms to 2.3 seconds. 206 | """ 207 | if not self._serial: 208 | return 209 | 210 | self._mode = 'active' 211 | # mode changes with interval < 50ms break on a PMS5003 212 | time.sleep(self.MIN_CMD_INTERVAL) 213 | self._reset_input_buffer() 214 | self._port.write(self._build_cmd_frame(PMS5003_CMD_MODE_ACTIVE)) 215 | # In rare cases a single data frame sneaks in giving FrameLengthError 216 | try: 217 | resp = self._read_data(PMS5003CmdResponse) 218 | except FrameLengthError: 219 | resp = self._read_data(PMS5003CmdResponse) 220 | time.sleep(self.MIN_CMD_INTERVAL) 221 | return resp 222 | 223 | def _reset_input_buffer(self): 224 | if not self._serial: 225 | return 226 | 227 | while self._port.read() is not None: 228 | pass 229 | 230 | def reset(self): 231 | """This resets the device via a pin if one is defined. 
232 | It restores passive mode as necessary.""" 233 | if self._pin_reset is None: 234 | return False 235 | 236 | time.sleep(0.1) 237 | self._pin_reset.value(0) 238 | self._reset_input_buffer() 239 | time.sleep(0.1) 240 | self._pin_reset.value(1) 241 | 242 | # Wait for first data frame from the device 243 | # CircuitPython 6.0.0 on nRF52840 sometimes picks up 2 bogus bytes here 244 | start = time.ticks_ms() 245 | while True: 246 | if self.data_available(): 247 | break 248 | elapsed = time.ticks_ms() - start 249 | if elapsed > self.MAX_RESET_TIME: 250 | raise ReadTimeoutError("PMS5003 Read Timeout: No response after reset") 251 | 252 | # After a reset device will be in active mode, restore passive mode 253 | if self._mode == "passive": 254 | self._reset_input_buffer() 255 | self.cmd_mode_passive() 256 | 257 | return True 258 | 259 | def data_available(self): 260 | """Returns boolean indicating if one or more data frames are waiting. 261 | Only for use in active mode.""" 262 | if not self._serial: 263 | try: 264 | data = self._port.readfrom_mem(PMSA003I_I2C_ADDR, 0x00, 2) 265 | return data == PMS5003_SOF 266 | except OSError: 267 | return False 268 | 269 | return self._port.any() >= PMS5003Data.FRAME_LEN 270 | 271 | def read(self): 272 | """Read a data frame. In passive mode this will transmit a request for one. 
273 | This will make additional attempts based on retries value in constructor 274 | if there are exceptions and only raise the first exception if all fail.""" 275 | read_ex = None 276 | for _ in range(self._attempts): 277 | if self._mode == 'passive': 278 | self._cmd_passive_read() 279 | try: 280 | return self._read_data() 281 | except RuntimeError as ex: 282 | if read_ex is None: 283 | read_ex = ex 284 | raise read_ex if read_ex else RuntimeError("read failed - internal error") 285 | 286 | def _wait_for_bytes(self, num_bytes, timeout=MAX_RESP_TIME): 287 | start = time.ticks_ms() 288 | while self._port.any() < num_bytes: 289 | elapsed = time.ticks_ms() - start 290 | if elapsed > timeout: 291 | raise ReadTimeoutError("PMS5003 Read Timeout: Waiting for {} bytes!".format(num_bytes)) 292 | 293 | def _read_data(self, response_class=PMS5003Data): 294 | if self._serial: 295 | sof_index = 0 296 | 297 | while True: 298 | self._wait_for_bytes(1) 299 | 300 | one_byte = self._port.read(1) 301 | if one_byte is None or len(one_byte) == 0: 302 | continue 303 | 304 | if ord(one_byte) == PMS5003_SOF[sof_index]: 305 | if sof_index == 0: 306 | sof_index = 1 307 | elif sof_index == 1: 308 | break 309 | else: 310 | sof_index = 0 311 | 312 | self._wait_for_bytes(2) 313 | 314 | len_data = self._port.read(2) # Get frame length packet 315 | frame_length = struct.unpack(">H", len_data)[0] 316 | response_class.check_data_len(frame_length, desc="Length field") 317 | 318 | self._wait_for_bytes(frame_length) 319 | 320 | raw_data = self._port.read(frame_length) 321 | return response_class(raw_data, frame_length_bytes=len_data) 322 | else: 323 | try: 324 | raw_data = self._port.readfrom_mem(PMSA003I_I2C_ADDR, 0x00, 32) 325 | except OSError: 326 | raise RuntimeError("Error reading from I2C") 327 | return response_class(raw_data[4:], frame_length_bytes=raw_data[2:4]) 328 | 329 | def _cmd_passive_read(self): 330 | """ 331 | Sends command to request a data frame while in 'passive' 332 | mode and 
immediately reads in frame. 333 | """ 334 | if not self._serial: 335 | return 336 | self._reset_input_buffer() 337 | self._port.write(self._build_cmd_frame(PMS5003_CMD_READ)) 338 | -------------------------------------------------------------------------------- /pico_w/README.md: -------------------------------------------------------------------------------- 1 | # Raspberry Pi Pico W instructions -------------------------------------------------------------------------------- /pico_w/example.secrets.py: -------------------------------------------------------------------------------- 1 | REDIS_HOST="" 2 | REDIS_PORT="" 3 | REDIS_PASS="" 4 | WIFI_SSD="" 5 | WIFI_PASS="" -------------------------------------------------------------------------------- /pico_w/main.py: -------------------------------------------------------------------------------- 1 | # cSpell:disable 2 | import machine 3 | import network 4 | import time 5 | import secrets 6 | import picoredis as client 7 | import PMS5003 8 | import utility 9 | 10 | SENSOR_INTERVAL = 5 # seconds between each air sensor reading 11 | TTL_TIMER = 60 * 5 # 5 minutes between each ping for liveness 12 | SENSOR_LOCATION = 'unit:3' 13 | STREAM_KEY = 'sensor:raw' 14 | 15 | # connect to WIFI 16 | wlan = network.WLAN(network.STA_IF) 17 | wlan.active(True) 18 | wlan.connect(secrets.WIFI_SSD, secrets.WIFI_PASS) 19 | 20 | max_wait = 10 21 | while max_wait > 0: 22 | if wlan.status() < 0 or wlan.status() >= 3: 23 | break 24 | max_wait -= 1 25 | print('Connecting to WIFI...') 26 | time.sleep(1) 27 | 28 | if wlan.status() != 3: 29 | raise RuntimeError('Network connection failed') 30 | else: 31 | connection_info = wlan.ifconfig() 32 | print(f'Connected with IP: {connection_info[0]}') 33 | 34 | # Connect to RedisCloud database 35 | redis = client.Redis( 36 | host = secrets.REDIS_HOST, 37 | port = secrets.REDIS_PORT) 38 | redis.auth(secrets.REDIS_PASS) 39 | 40 | # Initial announcement that we exist 41 | redis.set(f'ttl:{SENSOR_LOCATION}', 
'active', 'EX', TTL_TIMER) 42 | 43 | # Select UART pins for connecting to the sensor 44 | UART_connection = machine.UART( 45 | 1, 46 | tx=machine.Pin(8), 47 | rx=machine.Pin(9), 48 | baudrate=9600) 49 | 50 | # Connect to sensor 51 | sensor = PMS5003.PMS5003( 52 | uart=UART_connection, 53 | pin_enable=machine.Pin(3), 54 | pin_reset=machine.Pin(2), 55 | mode="active") 56 | 57 | count_down_timer = TTL_TIMER 58 | 59 | # loop that runs for as long as the Pico W has power 60 | while True: 61 | # announce our existence one interval before the liveness key expires 62 | if count_down_timer <= SENSOR_INTERVAL: 63 | redis.set(f'ttl:{SENSOR_LOCATION}', 'active', 'EX', TTL_TIMER) 64 | count_down_timer = TTL_TIMER 65 | try: 66 | # read the PM2.5 concentration (ug/m3) from the sensor 67 | raw_reading = sensor.read() 68 | pm2_5 = raw_reading.pm_ug_per_m3(2.5, False) 69 | aqi = utility.convert(pm2_5) 70 | temperature_reading = utility.read_onboard_temp() 71 | 72 | # send readings to Redis via a stream add command; note that the 'PM2.5' field carries the numeric concentration, not the whole reading object 73 | results = redis.XADD(STREAM_KEY, 74 | '*', 75 | 'target', SENSOR_LOCATION, 76 | 'PM2.5', pm2_5, 77 | 'AQI', aqi, 78 | 'temp', temperature_reading) 79 | 80 | print(f'Stream Entry ID: {SENSOR_LOCATION} - {results} AQI: {aqi}') 81 | 82 | except Exception as err: 83 | # report any errors in reading the sensor or adding to the stream 84 | print(f'Unexpected {err}: {type(err)}') 85 | 86 | finally: 87 | print(f'{count_down_timer} seconds TTL') 88 | # reduce the countdown timer 89 | count_down_timer = count_down_timer - SENSOR_INTERVAL 90 | # sleep until it is time to read again 91 | time.sleep(SENSOR_INTERVAL) -------------------------------------------------------------------------------- /pico_w/picoredis.py: -------------------------------------------------------------------------------- 1 | """A very minimal Redis client library (not only) for MicroPython.""" 2 | 3 | try: 4 | import usocket as socket 5 | except ImportError: 6 | import socket 7 | 8 | try: 9 | import uselect as select 10 | except ImportError: 11 | import select 12 | 13 | 14 | 
CRLF = "\r\n" 15 | 16 | 17 | class RedisError(Exception): 18 | """RESP error returned by the Redis server.""" 19 | pass 20 | 21 | 22 | class RedisTimeout(Exception): 23 | """Reply from the Redis server cannot be read within timeout.""" 24 | 25 | 26 | class ParseError(Exception): 27 | """Invalid input while parsing RESP data.""" 28 | pass 29 | 30 | 31 | def encode_request(*args): 32 | """Pack a series of arguments into a RESP array of bulk strings.""" 33 | result = ["*" + str(len(args)) + CRLF] 34 | 35 | for arg in args: 36 | if arg is None: 37 | result.append('$-1' + CRLF) 38 | else: 39 | s = str(arg) 40 | result.append('$' + str(len(s)) + CRLF + s + CRLF) 41 | 42 | return "".join(result) 43 | 44 | 45 | class Redis: 46 | """A very minimal Redis client.""" 47 | 48 | def __init__(self, host='127.0.0.1', port=6379, timeout=3000, debug=False): 49 | self.debug = debug 50 | self._sock = None 51 | self._timeout = timeout 52 | self.connect(host, port) 53 | 54 | def connect(self, host=None, port=None): 55 | if host is not None: 56 | self._host = host 57 | 58 | if port is not None: 59 | self._port = port 60 | 61 | if not self._sock: 62 | self._sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM) 63 | self._sock.connect(socket.getaddrinfo(self._host, self._port)[0][-1]) 64 | self._sock_fd = self._sock.makefile('rb') 65 | try: 66 | self._sock_fd = self._sock_fd.fileno() 67 | except AttributeError: 68 | pass 69 | self._poller = select.poll() 70 | self._poller.register(self._sock_fd, select.POLLIN) 71 | 72 | def close(self): 73 | if self._sock: 74 | self._poller.unregister(self._sock_fd) 75 | self._sock.close() 76 | self._sock = None 77 | 78 | def do_cmd(self, cmd, *args): 79 | if not self._sock: 80 | raise RedisError("Not connected: use 'connect()' to connect to Redis server.") 81 | 82 | request = encode_request(cmd, *args) 83 | 84 | if self.debug: 85 | print("SEND: {!r}".format(request)) 86 | 87 | self._sock.send(request.encode('utf-8')) 88 | return self._read_response() 
89 | 90 | __call__ = do_cmd 91 | 92 | def __getattr__(self, name): 93 | if name.isalpha(): 94 | return lambda *args: self.do_cmd(name, *args) 95 | 96 | raise AttributeError 97 | 98 | def _read_response(self): 99 | line = self._readuntil(lambda l, pos: l[-2:] == b'\r\n') 100 | rtype = line[:1].decode('utf-8') 101 | 102 | if rtype == '+': 103 | return line[1:-2] 104 | elif rtype == '-': 105 | raise RedisError(*line[1:-2].decode('utf-8').split(None, 1)) 106 | elif rtype == ':': 107 | return int(line[1:-2]) 108 | elif rtype == '$': 109 | length = int(line[1:-2]) 110 | 111 | if length == -1: 112 | return None 113 | 114 | return self._readuntil(lambda l, pos: pos == length + 2)[:-2] 115 | elif rtype == '*': 116 | length = int(line[1:-2]) 117 | 118 | if length == -1: 119 | return None 120 | 121 | return [self._read_response() for item in range(length)] 122 | else: 123 | raise ParseError("Invalid response header byte.") 124 | 125 | def _readuntil(self, predicate): 126 | buf = b'' 127 | pos = 0 128 | while not predicate(buf, pos): 129 | readylist = self._poller.poll(self._timeout) 130 | if not readylist: 131 | raise RedisTimeout("Error reading response from Redis server within timeout.") 132 | 133 | for entry in readylist: 134 | if (entry[0] is self._sock_fd and entry[1] & select.POLLIN and not 135 | entry[1] & (select.POLLHUP | select.POLLERR)): 136 | buf += self._sock.recv(1) 137 | pos += 1 138 | break 139 | else: 140 | self.close() 141 | raise OSError("Error reading from socket.") 142 | 143 | if self.debug: 144 | print("RECV: {!r}".format(buf)) 145 | 146 | return buf -------------------------------------------------------------------------------- /pico_w/utility.py: -------------------------------------------------------------------------------- 1 | import machine 2 | import PMS5003 3 | 4 | AQI_MATRIX = [ 5 | {'C_LO': 0, 'C_HI': 12.0, 'I_LO': 0, 'I_HI': 50}, 6 | {'C_LO': 12.1, 'C_HI': 35.4, 'I_LO': 51, 'I_HI': 100}, 7 | {'C_LO': 35.5, 'C_HI': 55.4, 'I_LO': 101, 'I_HI': 
150}, 8 | {'C_LO': 55.5, 'C_HI': 150.4, 'I_LO': 151, 'I_HI': 200}, 9 | {'C_LO': 150.5, 'C_HI': 250.4, 'I_LO': 201, 'I_HI': 300}, 10 | {'C_LO': 250.5, 'C_HI': 350.4, 'I_LO': 301, 'I_HI': 400}, 11 | {'C_LO': 350.5, 'C_HI': 500.4, 'I_LO': 401, 'I_HI': 500}, 12 | ] 13 | 14 | def convert_with_data(pm2_5, row): 15 | thresh = AQI_MATRIX[row] 16 | # slope of the AQI segment covering this concentration range 17 | pt_1 = (thresh['I_HI'] - thresh['I_LO']) / (thresh['C_HI'] - thresh['C_LO']) 18 | pt_2 = pm2_5 - thresh['C_LO'] 19 | aqi = int(pt_1 * pt_2 + thresh['I_LO']) 20 | return aqi 21 | 22 | def convert(pm2_5): 23 | # find the breakpoint row whose concentration range contains this reading; 24 | # comparing with <= avoids the gaps that exclusive comparisons would leave 25 | # at range boundaries (e.g. a reading of exactly 12.0) 26 | for row in range(len(AQI_MATRIX)): 27 | if pm2_5 <= AQI_MATRIX[row]['C_HI']: 28 | return convert_with_data(pm2_5, row) 29 | # readings above the top of the scale are clamped to the maximum AQI 30 | return 500 31 | 32 | def read_onboard_temp(): 33 | # the RP2040's internal temperature sensor is wired to ADC channel 4 34 | sensor_temp = machine.ADC(4) 35 | conversion_factor = 3.3 / 65535 # 16-bit ADC reading to volts 36 | reading = sensor_temp.read_u16() * conversion_factor 37 | # RP2040 datasheet formula for converting the sensor voltage to Celsius 38 | temperature = (reading - 0.706) / 0.001721 39 | # convert to Fahrenheit; the +53.5 offset (rather than the standard +32) 40 | # appears to include a per-unit calibration adjustment 41 | temperature = (temperature * 1.8) + 53.5 42 | temperature = round(temperature, 2) 43 | return temperature -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | anyio==3.6.2 2 | async-timeout==4.0.2 3 | certifi==2022.12.7 4 | charset-normalizer==3.1.0 5 | fastapi==0.91.0 6 | idna==3.4 7 | importlib-metadata==6.0.0 8 | PyJWT==2.6.0 9 | pytz==2022.7.1 10 | redis==4.5.1 11 | requests==2.28.2 12 | six==1.16.0 13 | sniffio==1.3.0 14 | starlette==0.24.0 15 | twilio==7.16.5 16 | 
typing_extensions==4.5.0 17 | urllib3==1.26.15 18 | zipp==3.15.0 19 | --------------------------------------------------------------------------------
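For reference, the PM2.5-to-AQI conversion in `pico_w/utility.py` is the EPA's piecewise-linear breakpoint interpolation, and it can be exercised off-device with plain CPython. The sketch below mirrors the same breakpoint table; the name `pm25_to_aqi` is illustrative and not part of the repository.

```python
# Standalone sketch of the EPA piecewise-linear AQI interpolation that
# pico_w/utility.py applies to PM2.5 readings. The breakpoints are the
# standard PM2.5 (24-hour) table as (C_LO, C_HI, I_LO, I_HI) tuples.
PM25_BREAKPOINTS = [
    (0.0, 12.0, 0, 50),
    (12.1, 35.4, 51, 100),
    (35.5, 55.4, 101, 150),
    (55.5, 150.4, 151, 200),
    (150.5, 250.4, 201, 300),
    (250.5, 350.4, 301, 400),
    (350.5, 500.4, 401, 500),
]

def pm25_to_aqi(concentration: float) -> int:
    """Linearly interpolate a PM2.5 concentration (ug/m3) to an AQI value."""
    for c_lo, c_hi, i_lo, i_hi in PM25_BREAKPOINTS:
        if concentration <= c_hi:
            # interpolate along the AQI segment for this concentration range
            slope = (i_hi - i_lo) / (c_hi - c_lo)
            return int(slope * (concentration - c_lo) + i_lo)
    return 500  # clamp readings above the top of the scale

# e.g. pm25_to_aqi(9.0) -> 37 ("Good"); pm25_to_aqi(55.0) -> 149
```

This is the same math the on-device `convert_with_data` performs, which makes it handy for sanity-checking stream entries from the consumer side.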