├── .gitignore ├── LICENSE ├── README.md ├── c-version ├── Makefile ├── README.md ├── mjson.c ├── mjson.h ├── snr-cli.c ├── snr.c ├── stats.c ├── stats.h ├── tree.c ├── tree.h └── xaa-output.prn ├── rtl_433_stats ├── tools ├── rtl_json_csv └── rtl_xtract_json ├── xaa-output.prn └── xaa.json /.gitignore: -------------------------------------------------------------------------------- 1 | # Prerequisites 2 | *.d 3 | 4 | # Object files 5 | *.o 6 | *.ko 7 | *.obj 8 | *.elf 9 | 10 | # Linker output 11 | *.ilk 12 | *.map 13 | *.exp 14 | 15 | # Precompiled Headers 16 | *.gch 17 | *.pch 18 | 19 | # Libraries 20 | *.lib 21 | *.a 22 | *.la 23 | *.lo 24 | 25 | # Shared objects (inc. Windows DLLs) 26 | *.dll 27 | *.so 28 | *.so.* 29 | *.dylib 30 | 31 | # Executables 32 | *.exe 33 | *.out 34 | *.app 35 | *.i*86 36 | *.x86_64 37 | *.hex 38 | 39 | # Debug files 40 | *.dSYM/ 41 | *.su 42 | *.idb 43 | *.pdb 44 | 45 | # Kernel Module Compile Results 46 | *.mod* 47 | *.cmd 48 | .tmp_versions/ 49 | modules.order 50 | Module.symvers 51 | Mkfile.old 52 | dkms.conf 53 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | BSD 2-Clause License 2 | 3 | Copyright (c) 2022, David Todd 4 | All rights reserved. 5 | 6 | Redistribution and use in source and binary forms, with or without 7 | modification, are permitted provided that the following conditions are met: 8 | 9 | 1. Redistributions of source code must retain the above copyright notice, this 10 | list of conditions and the following disclaimer. 11 | 12 | 2. Redistributions in binary form must reproduce the above copyright notice, 13 | this list of conditions and the following disclaimer in the documentation 14 | and/or other materials provided with the distribution. 15 | 16 | THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" 17 | AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE 18 | IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE 19 | DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE 20 | FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL 21 | DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR 22 | SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER 23 | CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, 24 | OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE 25 | OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 26 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # rtl\_433\_stats v2.2.0 2 | Catalog and analyze transmissions from devices recorded in rtl_433 JSON logs 3 | 4 | ## Function 5 | `rtl_433_stats` catalogs and characterizes ISM-band devices in your neighborhood using data from the JSON log file generated by `rtl_433`. It processes rtl\_433 JSON log files to: 6 | 7 | * read the recorded packet information from the log file(s), 8 | * catalog all devices recorded in the log(s), 9 | * count the packets and consolidate redundant packets into individual transmissions, 10 | * summarize the statistics about packet signal-to-noise ratios (SNR) and radio frequencies (Freq), the gap times between transmissions (ITGT), and the packets per transmission (PPT). 
11 | 12 | Sample output looks like this: 13 | 14 | ``` 15 | Processing ISM 433MHz messages recorded by rtl_433 16 | Including SNR Stats 17 | Including ITGT Stats 18 | Including Freq Stats 19 | Excluding TPMS devices 20 | Processing file xaa.json 21 | 22 | Processed 20000 Packets as 6952 De-Duplicated Transmissions in 0.22sec 23 | Packets dated from Thu 2022-06-09 07:08:27 to Thu 2022-06-09 19:46:16 24 | 25 | Signal-to-Noise Inter-Transmission Gap Time Frequency (MHz) Packets per Transmit 26 | Device model/channel/id _________________________________ _______________________________________ ____________________________________________ ____________________________________ 27 | #Pkts Mean ± 𝜎 Min Max #Gaps Mean ± 𝜎 Min Max #Pkts Mean ± 𝜎 Min Max #Xmits Mean ± 𝜎 Min Max 28 | Acurite-01185M/0/0 4 9.6 ± 4.9 6.4 16.9 3 3678.7s ± 2812.7 434.0 5425.0 4 433.911 ± 0.017 433.902 433.936 3 1.0 ± 0.0 1 1 29 | Acurite-606TX//134 858 8.4 ± 2.1 5.5 20.0 857 53.0s ± 139.2 30.0 2573.0 858 433.901 ± 0.009 433.863 433.962 857 1.0 ± 0.0 1 1 30 | Acurite-609TXC//194 8006 19.3 ± 0.5 12.3 21.2 1356 33.5s ± 0.7 33.0 49.0 8006 433.931 ± 0.002 433.922 433.950 1356 5.9 ± 0.4 2 6 31 | Acurite-Tower/A/11524 8203 19.2 ± 0.5 13.2 20.8 2752 16.5s ± 2.5 15.0 33.0 8203 433.950 ± 0.002 433.926 433.955 2752 3.0 ± 0.2 1 3 32 | LaCrosse-TX141Bv3/1/253 597 8.4 ± 1.3 5.7 19.2 347 109.9s ± 338.9 31.0 4216.0 597 433.904 ± 0.003 433.863 433.945 347 1.7 ± 0.5 1 2 33 | LaCrosse-TX141THBv2/0/168 1536 9.6 ± 1.1 6.0 19.2 837 54.2s ± 14.7 49.0 150.0 1536 433.961 ± 0.004 433.862 433.966 837 1.8 ± 0.4 1 2 34 | Markisol/0/0 39 19.1 ± 1.2 12.3 20.2 38 1053.5s ± 1532.1 33.0 6633.0 39 433.932 ± 0.002 433.928 433.936 38 1.0 ± 0.0 1 1 35 | Markisol/0/256 20 19.3 ± 0.4 18.5 20.2 19 2108.7s ± 3625.7 33.0 14070.0 20 433.931 ± 0.002 433.927 433.936 19 1.0 ± 0.0 1 1 36 | Markisol/1/0 36 19.2 ± 0.5 17.6 20.0 35 1009.8s ± 1550.0 67.0 6801.0 36 433.931 ± 0.002 433.927 433.934 35 1.0 ± 0.0 1 1 37 | Prologue-TH/2/203 699 11.6 ± 1.3 7.2 19.5 698 64.9s ± 32.7 52.0 477.0 699 433.864 ± 0.008 433.859 433.943 698 1.0 ± 0.0 1 1 38 | ``` 39 | 40 | ## Use 41 | 42 | Issue the command `rtl_433_stats -i ` to generate the report. Note that the input file specification may include wildcards and/or compressed files (.gz or .bz2). Use `-i` with no file specification for stdin. 43 | 44 | Issue the command `rtl_433_stats -h` to see command-line options: 45 | 46 | ``` 47 | usage: rtl_433_stats [-h] [-i [FILE ...]] [-o {SNR,ITGT,Freq,PPT} [{SNR,ITGT,Freq,PPT} ...]] [-x NOISE] [-w WINDOW] [-T] [-v] 48 | 49 | Analyze rtl_433 JSON logs to catalog the devices seen and to characterize 50 | statistically their signal-to-noise ratio (SNR), times between 51 | transmissions (ITGT),tradio frequency (Freq), and packets per transmission (PPT). 52 | 53 | options: 54 | -h, --help show this help message and exit 55 | -i [FILE ...], --input [FILE ...] 56 | Path to JSON log files to read in; can be .gz; can be wildcard; blank if 57 | -o {SNR,ITGT,Freq,PPT} [{SNR,ITGT,Freq,PPT} ...], --omit {SNR,ITGT,Freq,PPT} [{SNR,ITGT,Freq,PPT} ...] 
58 | -x NOISE, --exclude_noise NOISE 59 | Exclude device records with fewer than 'NOISE' packets seen 60 | -w WINDOW, --xmt_window WINDOW 61 | Max time in sec for a packet group to be considered as one transmission (default: None) 62 | -T, --include_TPMS include tire pressure monitors in catalog (default: False) 63 | -v, --version show program's version number and exit 64 | 65 | ``` 66 | In practice, log files recorded over long periods may contain records for devices seen only sporadically: tire pressure monitoring systems, security systems, automobile remotes, etc. These can make the report long and difficult to read. Some options help customize the reports: 67 | 68 | * By default, tire pressure monitoring systems (TPMS) are excluded from reports: use `-T` to _include_ them. 69 | * All other devices recorded in the log file(s) are included in the report by default. Use the `-x n` option to exclude from the report any device with fewer than `n` packets in the logs (values of n from 10 to 100 are typically most useful). 70 | * The default report with all four characteristics is fairly wide. It can be narrowed by omitting one or more of the characteristics with the `-o` option followed by `SNR`, `ITGT`, `Freq`, and/or `PPT` to specify which reports to omit. 71 | * By default, packets broadcast by a single device within a 2-second window are considered to be one transmission. The `-w n` option, n in seconds, can be used to change that window, affecting the ITGT and PPT reports. 72 | 73 | If you've configured `logrotate` to compress and archive your `rtl_433` log files, you can pass the .gz log files directly to `rtl_433_stats` in the file specification, and you can select just a set of files with wildcards. For example, specify `-i /var/log/rtl_433/rtl_433.json-202303*.gz` to generate a report for just March 2023. 74 | 75 | ## Operational Details 76 | 77 | `rtl_433_stats` reads the JSON log file created by rtl\_433 (it is recommended that you stop rtl_433 first so that the JSON log file is closed for processing). The observed devices, as recorded in the JSON file in temporal order, are cataloged in alphabetical order in a summary table along with a count of the number of packets and de-duplicated transmissions seen for that device and with basic statistics for that device: 78 | 79 | * count of samples, 80 | * mean, 81 | * std deviation, 82 | * min value seen, and 83 | * max value seen 84 | 85 | for these device characteristics: 86 | 87 | * signal-to-noise ratio over all packets (SNR), 88 | * inter-transmission gap times (ITGT): the time between successive transmissions by that device, 89 | * radio frequency of transmissions (Freq) over all packets from that device, 90 | * the number of packets per transmission (PPT). 91 | 92 | A command-line option allows the de-selection of any or all of these statistics reports (the default is to report all four). 93 | 94 | A device "transmission" represents one observation but may contain 1 to 6 or more "packets", and transmissions are frequently initiated by remote sensor devices at approximately 15-second, 30-second, or 60-second intervals. These are simplex communication devices -- the remote device sends data and receives no acknowledgement from the receiver that it has received the data. In high-traffic neighborhoods, the signals from the various devices may interfere with each other. Sending redundant packets increases the probability that a receiving device will successfully receive at least one packet in the transmission.
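To make the packet-grouping rule described above (and elaborated in the next paragraph) concrete, here is a minimal C sketch of the per-device test; the helper name, signature, and key handling are illustrative only, not the project's actual code, and the window value corresponds to the `-w` option:

```
#include <stdbool.h>
#include <string.h>
#include <time.h>

/* Two packets belong to the same transmission when they carry the same
   device key ("model/channel/id") and the second arrives within 'window'
   seconds of the previous packet from that device (default: 2 seconds). */
bool same_transmission(const char *prev_key, time_t prev_time,
                       const char *cur_key, time_t cur_time,
                       double window)
{
    return strcmp(prev_key, cur_key) == 0 &&
           difftime(cur_time, prev_time) <= window;
}
```

Packets that fail this test start a new transmission; the resulting transmission boundaries are what feed the ITGT and PPT statistics.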
95 | 96 | `rtl_433_stats` assumes that packets from the same device within a default setting of 2 seconds of each other were repeated for reliability and represent one transmission. Override that 2-second window with the `-w n` option. 97 | 98 | SNR and frequency data are averaged over all packets; transmission gap times and packets per transmission are averaged over transmissions (grouped packets). 99 | 100 | The key string for cataloging a device and labeling the device in the report lines is the concatenation of the JSON 'model'/'channel'/'id' fields from the received data record. 101 | 102 | `rtl_433_stats` tracks packet times _per device_ so that data for transmissions from different devices are separated and transmission statistics for individual devices are more reliable in high-traffic areas. 103 | 104 | While processing the JSON log files, `rtl_433_stats` monitors the "battery_ok" and "status" flags for each device (if present in the packets) and prints an alert when there has been a change in condition in the remote device. Generally a 1-->0 change in "battery_ok" indicates a low voltage on the battery and suggests that the battery should be changed. The values of "status" vary by device and are generally not well documented, but the program notes them for you in case further investigation is justified. 105 | 106 | ## Installation 107 | 1. Use git to clone the distribution from GitHub, [https://github.com/hdtodd/rtl\_433\_stats](https://github.com/hdtodd/rtl_433_stats) 108 | 2. Change to the download directory 109 | 3. Type `python3 rtl_433_stats -i xaa.json` and compare its output with the file `xaa-output.prn` to ensure that it is functioning correctly. 110 | 111 | After verifying that `rtl_433_stats` is functioning correctly with test data, you may want to configure your `rtl_433` config file to record data from your own RTL_SDR dongle. For example, add something like the following to your `rtl_433.conf` file: 112 | ``` 113 | output json:/var/log/rtl_433/rtl_433.json 114 | ``` 115 | (and create and assign ownership to /var/log/rtl\_433/ if necessary) and then restart rtl\_433. 116 | 117 | ## Known Issues 118 | 119 | The first packet from a device during a transmission interval (whether sent individually or as the first packet of a multi-packet transmission) may have a distorted SNR because of a high auto-gain setting on the receiving RTL\_SDR dongle. Some devices issue just one packet per transmission and others issue 3-6. No fix is anticipated. 120 | 121 | ## Other Tools 122 | 123 | The `tools` directory contains two Python scripts that may be useful for extracting records from the JSON log files for more detailed analysis: 124 | 125 | * `rtl_xtract_json` extracts from a JSON log file all records for one or more specific devices into a separate file. Devices are identified by the "model/channel/id" keyword identifier in the `rtl_433_stats` report. 126 | * `rtl_json_csv` extracts the values of fields specified on the command line from a JSON log file into CSV format. The output is labeled with the "model/channel/id" identifier. A header line identifying the extracted fields prefaces the data to allow easy importing into spreadsheet programs. 127 | 128 | A reduced-functionality version of `rtl_433_stats` is available as a C-language version in the directory `c-version` as the program `snr`. That program analyzes only signal-to-noise ratios and does not have the options for selecting records to be processed, but it may be useful in some circumstances (and is much faster in execution).
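All of these tools identify a device by the same "model/channel/id" key described in the Operational Details section. As a rough illustration only (this helper is hypothetical, not code from this repository), such a key might be formed as follows; a missing field simply leaves its slot empty, which is why a device with no channel appears as, e.g., `Acurite-606TX//134` in the report:

```
#include <stdio.h>

/* Build the "model/channel/id" catalog key from decoded JSON fields.
   A missing field leaves its slot empty, e.g. "Acurite-606TX//134". */
void make_device_key(char *buf, size_t len,
                     const char *model, const char *channel, const char *id)
{
    snprintf(buf, len, "%s/%s/%s",
             model   ? model   : "",
             channel ? channel : "",
             id      ? id      : "");
}
```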
129 | 130 | ## Author 131 | David Todd, hdtodd@gmail.com, 2022.05; v2.1 2023.04; v2.2 2024.08 adds support for both ISO and Unix Epoch time stamps in the JSON log file. 132 | 133 | -------------------------------------------------------------------------------- /c-version/Makefile: -------------------------------------------------------------------------------- 1 | # Makefile for snr, a program to report statistics on 2 | # signal-to-noise ratio on packets received by rtl_433 3 | # and logged in JSON format. 4 | # 5 | #2022.05.16 Inital version 6 | #2022.03.21 Updated to omit Python code 7 | #Author: HDTodd@gmail.com, Williston VT 8 | # 9 | 10 | CC = gcc 11 | PROJ = snr 12 | 13 | BIN = ~/bin/ 14 | #CFLAGS += -D DEBUG_ENABLE 15 | LDFLAGS = -lm 16 | OBJS = snr.o snr-cli.o stats.o tree.o mjson.o 17 | 18 | 19 | all: ${PROJ} 20 | 21 | .SUFFIXES: .c 22 | 23 | .c.o: 24 | $(CC) $(CFLAGS) -c $< 25 | 26 | ${PROJ}: ${OBJS} 27 | $(CC) -o $@ ${OBJS} $(LDFLAGS) 28 | 29 | clean: 30 | /bin/rm -f *~ *.o ${PROJ} 31 | 32 | install: 33 | mkdir -p ${BIN} 34 | mv snr ${BIN} 35 | 36 | uninstall: 37 | rm ${BIN}snr 38 | 39 | 40 | 41 | -------------------------------------------------------------------------------- /c-version/README.md: -------------------------------------------------------------------------------- 1 | # snr 2 | Catalog and analyze transmissions from devices recorded in rtl_433 JSON logs 3 | 4 | 5 | 6 | ## Function 7 | `snr` catalogs and characterizes ISM-band devices in your neighborhood using data from the JSON log file generated by `rtl_433`. It processes rtl\_433 JSON log files to: 8 | 9 | * read the packet information as recorded by rtl\_433 in a JSON log file, 10 | * catalog all devices recorded in that log, 11 | * count the packets and consolidate redundant packets into an individual transmission, 12 | * summarize the statistics about packet signal-to-noise ratios (SNR) in the packets observed. 13 | 14 | Sample output looks like this: 15 | 16 | ``` 17 | snr: Analyze rtl_433 json log files 18 | Processing ISM 433MHz messages from file ../xaa.json 19 | 20 | Processed 7045 de-duplicated records 21 | Dated from Thu 2022-06-09 07:08:27 to Thu 2022-06-09 19:46:16 22 | 23 | Device #Recs Mean SNR ± 𝜎 Min Max 24 | Acurite-01185M 0 4 9.6 ± 4.9 6.4 16.9 25 | Acurite-606TX 134 858 8.4 ± 2.1 5.5 20.0 26 | Acurite-609TXC 194 1446 19.2 ± 0.5 12.4 21.2 27 | Acurite-Tower 11524 2753 19.2 ± 0.5 13.2 20.8 28 | Hyundai-VDO 60b87768 1 11.0 ± 0.0 11.0 11.0 29 | Hyundai-VDO aeba4a98 1 7.2 ± 0.0 7.2 7.2 30 | LaCrosse-TX141Bv3 253 348 8.2 ± 1.1 5.7 11.5 31 | LaCrosse-TX141THBv2 168 840 9.6 ± 1.1 6.0 19.2 32 | Markisol 0 75 19.2 ± 0.9 12.3 20.2 33 | Markisol 256 20 19.3 ± 0.4 18.5 20.2 34 | Prologue-TH 203 699 11.6 ± 1.3 7.2 19.5 35 | ``` 36 | ## Use 37 | 38 | Issue the command `snr -f ` to generate the report; `snr -h` shows the command-line options. 39 | 40 | ## Details 41 | 42 | `rtl_433_stats` reads the JSON log file created by rtl\_433 (recommend to stop rtl_433 so that the JSON log file is closed for processing). The observed devices, as recorded in the JSON file in temporal order, are cataloged in alphabetical order in a summary table. The summary includes a count of the number of packets and de-duplicated transmissions seen for that device and basic statistics for the signal-to-noise ratios: 43 | 44 | * count of samples, 45 | * mean, 46 | * std deviation, 47 | * min value seen, and 48 | * max value seen. 
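These summary statistics are accumulated incrementally by the `stats.c`/`stats.h` module included later in this distribution. A minimal usage sketch (the sample values are made up; compile it together with `stats.c` and link with `-lm`, as the Makefile above does):

```
#include "stats.h"

int main(void)
{
    /* Accumulate a handful of SNR readings for one device. */
    double snr_samples[] = {9.6, 8.4, 19.3, 19.2, 11.6};
    bstats *snr = stats_new();

    for (int i = 0; i < (int)(sizeof(snr_samples)/sizeof(snr_samples[0])); i++)
        stats_append(snr_samples[i], snr);

    /* stats_get() converts the internally stored variance into a standard
       deviation, so call it once, only after all samples have been added. */
    stats_get(snr);
    stats_print(snr);   /* prints count, mean ± std dev, min, max */
    return 0;
}
```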
49 | 50 | JSON log times are expected to be in the format "HH:MM:SS", to the nearest second with no fractional part. 51 | 52 | `snr` summarizes information only for the first packet in each transmission and ignores "duplicated" packets. A packet is considered a duplicate of its predecessor if the concatenated device identifier string is repeated within 2 seconds of that predecessor. Other recorded data (snr, etc.) are *not* compared, and for some devices that may not be desirable. The algorithm only considers the immediate predecessor record in the JSON file, not the immediate predecessor record *for that device*, so interleaved data packets from differing devices would result in imperfect de-duplication in high-traffic regions. 53 | 54 | ## Installation 55 | 56 | 1. Connect to the `c-version` directory and `make` and then `make install`. Note that this installs the *snr* executable into `~/bin`; edit `Makefile`'s definition of `BIN` if you want the code installed elsewhere, or simply execute the programs from the download directory rather than install. 57 | 2. Assuming that `~/bin/` is in your path or that you execute from the download directory, you may then process JSON log files. For example, to process the `xaa.json` file that is distributed with the package, `snr -f ../xaa.json` and compare with the sample `xaa-output.prn` file distributed with the package to verify correct operation. 58 | 59 | ## Dependencies 60 | This code uses Eric Raymond's mjson.c library to parse the rtl_433 JSON file and would not have been possible without it: that code is included in this distribution. One slight modification to Raymond's distributed code was needed to accommodate model values that were sometimes numeric and sometimes quoted strings; that modification is noted in the mjson.c file included in this distribution. 61 | 62 | ## Author 63 | David Todd, hdtodd@gmail.com, 2022.05. Updated 2023.04. 64 | 65 | -------------------------------------------------------------------------------- /c-version/mjson.c: -------------------------------------------------------------------------------- 1 | /**************************************************************************** 2 | 3 | NAME 4 | mjson.c - parse JSON into fixed-extent data structures 5 | 6 | DESCRIPTION 7 | This module parses a large subset of JSON (JavaScript Object 8 | Notation). Unlike more general JSON parsers, it doesn't use malloc(3) 9 | and doesn't support polymorphism; you need to give it a set of 10 | template structures describing the expected shape of the incoming 11 | JSON, and it will error out if that shape is not matched. When the 12 | parse succeeds, attribute values will be extracted into static 13 | locations specified in the template structures. 14 | 15 | The "shape" of a JSON object in the type signature of its 16 | attributes (and attribute values, and so on recursively down through 17 | all nestings of objects and arrays). This parser is indifferent to 18 | the order of attributes at any level, but you have to tell it in 19 | advance what the type of each attribute value will be and where the 20 | parsed value will be stored. The template structures may supply 21 | default values to be used when an expected attribute is omitted. 22 | 23 | The preceding paragraph told one fib. A single attribute may 24 | actually have a span of multiple specifications with different 25 | syntactically distinguishable types (e.g. string vs. real vs. integer 26 | vs. boolean, but not signed integer vs. unsigned integer). 
The parser 27 | will match the right spec against the actual data. 28 | 29 | The dialect this parses has some limitations. First, it cannot 30 | recognize the JSON "null" value. Second, all elements of an array must 31 | be of the same type. Third, characters may not be array elements (this 32 | restriction could be lifted) 33 | 34 | There are separate entry points for beginning a parse of either 35 | JSON object or a JSON array. JSON "float" quantities are actually 36 | stored as doubles. 37 | 38 | This parser processes object arrays in one of two different ways, 39 | defending on whether the array subtype is declared as object or 40 | structobject. 41 | 42 | Object arrays take one base address per object subfield, and are 43 | mapped into parallel C arrays (one per subfield). Strings are not 44 | supported in this kind of array, as they don't have a "natural" size 45 | to use as an offset multiplier. 46 | 47 | Structobjects arrays are a way to parse a list of objects to a set 48 | of modifications to a corresponding array of C structs. The trick is 49 | that the array object initialization has to specify both the C struct 50 | array's base address and the stride length (the size of the C struct). 51 | If you initialize the offset fields with the correct offsetof calls, 52 | everything will work. Strings are supported but all string storage 53 | has to be inline in the struct. 54 | 55 | PERMISSIONS 56 | This file is Copyright (c) 2014 by Eric S. Raymond 57 | SPDX-License-Identifier: BSD-2-Clause 58 | 59 | MODIFICATION 60 | Modified 2022 by hdtodd@gmail.com to accommodate quoted numeric 61 | values for numeric fields; find "[hdt]" to locate the mod. 62 | ***************************************************************************/ 63 | /* The strptime prototype is not provided unless explicitly requested. 64 | * We also need to set the value high enough to signal inclusion of 65 | * newer features (like clock_gettime). See the POSIX spec for more info: 66 | * http://pubs.opengroup.org/onlinepubs/9699919799/functions/V2_chap02.html#tag_15_02_01_02 */ 67 | #define _XOPEN_SOURCE 600 68 | 69 | #include 70 | #include 71 | #include 72 | #include 73 | #include 74 | #include 75 | #include 76 | #include 77 | #include /* for HUGE_VAL */ 78 | 79 | #include "mjson.h" 80 | 81 | #define str_starts_with(s, p) (strncmp(s, p, strlen(p)) == 0) 82 | 83 | #ifdef DEBUG 84 | static int debuglevel = 0; 85 | static FILE *debugfp; 86 | 87 | void json_enable_debug(int level, FILE * fp) 88 | /* control the level and destination of debug trace messages */ 89 | { 90 | debuglevel = level; 91 | debugfp = fp; 92 | } 93 | 94 | static void json_trace(int errlevel, const char *fmt, ...) 
95 | /* assemble command in printf(3) style */ 96 | { 97 | if (errlevel <= debuglevel) { 98 | char buf[BUFSIZ]; 99 | va_list ap; 100 | 101 | (void)strncpy(buf, "json: ", BUFSIZ-1); 102 | buf[BUFSIZ-1] = '\0'; 103 | va_start(ap, fmt); 104 | (void)vsnprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), fmt, 105 | ap); 106 | va_end(ap); 107 | 108 | (void)fputs(buf, debugfp); 109 | } 110 | } 111 | 112 | # define json_debug_trace(args) (void) json_trace args 113 | #else 114 | # define json_debug_trace(args) do { } while (0) 115 | #endif /* DEBUG */ 116 | 117 | static char *json_target_address(const struct json_attr_t *cursor, 118 | const struct json_array_t 119 | *parent, int offset) 120 | { 121 | char *targetaddr = NULL; 122 | if (parent == NULL || parent->element_type != t_structobject) { 123 | /* ordinary case - use the address in the cursor structure */ 124 | switch (cursor->type) { 125 | case t_ignore: 126 | targetaddr = NULL; 127 | break; 128 | case t_integer: 129 | targetaddr = (char *)&cursor->addr.integer[offset]; 130 | break; 131 | case t_uinteger: 132 | targetaddr = (char *)&cursor->addr.uinteger[offset]; 133 | break; 134 | case t_short: 135 | targetaddr = (char *)&cursor->addr.shortint[offset]; 136 | break; 137 | case t_ushort: 138 | targetaddr = (char *)&cursor->addr.ushortint[offset]; 139 | break; 140 | case t_time: 141 | case t_real: 142 | targetaddr = (char *)&cursor->addr.real[offset]; 143 | break; 144 | case t_string: 145 | targetaddr = cursor->addr.string; 146 | break; 147 | case t_boolean: 148 | targetaddr = (char *)&cursor->addr.boolean[offset]; 149 | break; 150 | case t_character: 151 | targetaddr = (char *)&cursor->addr.character[offset]; 152 | break; 153 | default: 154 | targetaddr = NULL; 155 | break; 156 | } 157 | } else 158 | /* tricky case - hacking a member in an array of structures */ 159 | targetaddr = 160 | parent->arr.objects.base + (offset * parent->arr.objects.stride) + 161 | cursor->addr.offset; 162 | json_debug_trace((1, "Target address for %s (offset %d) is %p\n", 163 | cursor->attribute, offset, targetaddr)); 164 | return targetaddr; 165 | } 166 | 167 | #ifdef TIME_ENABLE 168 | static double iso8601_to_unix(char *isotime) 169 | /* ISO8601 UTC to Unix UTC */ 170 | { 171 | double usec; 172 | struct tm tm; 173 | 174 | char *dp = strptime(isotime, "%Y-%m-%dT%H:%M:%S", &tm); 175 | if (dp == NULL) 176 | return (double)HUGE_VAL; 177 | if (*dp == '.') 178 | usec = strtod(dp, NULL); 179 | else 180 | usec = 0; 181 | return (double)timegm(&tm) + usec; 182 | } 183 | #endif /* TIME_ENABLE */ 184 | 185 | static int json_internal_read_object(const char *cp, 186 | const struct json_attr_t *attrs, 187 | const struct json_array_t *parent, 188 | int offset, 189 | const char **end) 190 | { 191 | enum 192 | { init, await_attr, in_attr, await_value, in_val_string, 193 | in_escape, in_val_token, post_val, post_element 194 | } state = 0; 195 | #ifdef DEBUG 196 | char *statenames[] = { 197 | "init", "await_attr", "in_attr", "await_value", "in_val_string", 198 | "in_escape", "in_val_token", "post_val", "post_element", 199 | }; 200 | #endif /* DEBUG */ 201 | char attrbuf[JSON_ATTR_MAX + 1], *pattr = NULL; 202 | char valbuf[JSON_VAL_MAX + 1], *pval = NULL; 203 | bool value_quoted = false; 204 | char uescape[5]; /* enough space for 4 hex digits and a NUL */ 205 | const struct json_attr_t *cursor; 206 | int substatus, n, maxlen = 0; 207 | unsigned int u; 208 | const struct json_enum_t *mp; 209 | char *lptr; 210 | 211 | if (end != NULL) 212 | *end = NULL; /* give it a well-defined value 
on parse failure */ 213 | 214 | /* stuff fields with defaults in case they're omitted in the JSON input */ 215 | for (cursor = attrs; cursor->attribute != NULL; cursor++) 216 | if (!cursor->nodefault) { 217 | lptr = json_target_address(cursor, parent, offset); 218 | if (lptr != NULL) 219 | switch (cursor->type) { 220 | case t_integer: 221 | memcpy(lptr, &cursor->dflt.integer, sizeof(int)); 222 | break; 223 | case t_uinteger: 224 | memcpy(lptr, &cursor->dflt.uinteger, sizeof(unsigned int)); 225 | break; 226 | case t_short: 227 | memcpy(lptr, &cursor->dflt.shortint, sizeof(short)); 228 | break; 229 | case t_ushort: 230 | memcpy(lptr, &cursor->dflt.ushortint, 231 | sizeof(unsigned short)); 232 | break; 233 | case t_time: 234 | case t_real: 235 | memcpy(lptr, &cursor->dflt.real, sizeof(double)); 236 | break; 237 | case t_string: 238 | if (parent != NULL 239 | && parent->element_type != t_structobject 240 | && offset > 0) 241 | return JSON_ERR_NOPARSTR; 242 | lptr[0] = '\0'; 243 | break; 244 | case t_boolean: 245 | memcpy(lptr, &cursor->dflt.boolean, sizeof(bool)); 246 | break; 247 | case t_character: 248 | lptr[0] = cursor->dflt.character; 249 | break; 250 | case t_object: /* silences a compiler warning */ 251 | case t_structobject: 252 | case t_array: 253 | case t_check: 254 | case t_ignore: 255 | break; 256 | } 257 | } 258 | 259 | json_debug_trace((1, "JSON parse of '%s' begins.\n", cp)); 260 | 261 | /* parse input JSON */ 262 | for (; *cp != '\0'; cp++) { 263 | json_debug_trace((2, "State %-14s, looking at '%c' (%p)\n", 264 | statenames[state], *cp, cp)); 265 | switch (state) { 266 | case init: 267 | if (isspace((unsigned char) *cp)) 268 | continue; 269 | else if (*cp == '{') 270 | state = await_attr; 271 | else { 272 | json_debug_trace((1, 273 | "Non-WS when expecting object start.\n")); 274 | if (end != NULL) 275 | *end = cp; 276 | return JSON_ERR_OBSTART; 277 | } 278 | break; 279 | case await_attr: 280 | if (isspace((unsigned char) *cp)) 281 | continue; 282 | else if (*cp == '"') { 283 | state = in_attr; 284 | pattr = attrbuf; 285 | if (end != NULL) 286 | *end = cp; 287 | } else if (*cp == '}') 288 | break; 289 | else { 290 | json_debug_trace((1, "Non-WS when expecting attribute.\n")); 291 | if (end != NULL) 292 | *end = cp; 293 | return JSON_ERR_ATTRSTART; 294 | } 295 | break; 296 | case in_attr: 297 | if (pattr == NULL) 298 | /* don't update end here, leave at attribute start */ 299 | return JSON_ERR_NULLPTR; 300 | if (*cp == '"') { 301 | *pattr++ = '\0'; 302 | json_debug_trace((1, "Collected attribute name %s\n", 303 | attrbuf)); 304 | for (cursor = attrs; cursor->attribute != NULL; cursor++) { 305 | json_debug_trace((2, "Checking against %s\n", 306 | cursor->attribute)); 307 | if (strcmp(cursor->attribute, attrbuf) == 0) 308 | break; 309 | if (strcmp(cursor->attribute, "") == 0 && 310 | cursor->type == t_ignore) { 311 | break; 312 | } 313 | } 314 | if (cursor->attribute == NULL) { 315 | json_debug_trace((1, 316 | "Unknown attribute name '%s'" 317 | " (attributes begin with '%s').\n", 318 | attrbuf, attrs->attribute)); 319 | // don't update end here, leave at attribute start 320 | return JSON_ERR_BADATTR; 321 | } 322 | state = await_value; 323 | if (cursor->type == t_string) 324 | maxlen = (int)cursor->len - 1; 325 | else if (cursor->type == t_check) 326 | maxlen = (int)strlen(cursor->dflt.check); 327 | else if (cursor->type == t_time || cursor->type == t_ignore) 328 | maxlen = JSON_VAL_MAX; 329 | else if (cursor->map != NULL) 330 | maxlen = (int)sizeof(valbuf) - 1; 331 | pval = 
valbuf; 332 | } else if (pattr >= attrbuf + JSON_ATTR_MAX - 1) { 333 | json_debug_trace((1, "Attribute name too long.\n")); 334 | /* don't update end here, leave at attribute start */ 335 | return JSON_ERR_ATTRLEN; 336 | } else 337 | *pattr++ = *cp; 338 | break; 339 | case await_value: 340 | if (isspace((unsigned char) *cp) || *cp == ':') 341 | continue; 342 | else if (*cp == '[') { 343 | if (cursor->type != t_array) { 344 | json_debug_trace((1, 345 | "Saw [ when not expecting array.\n")); 346 | if (end != NULL) 347 | *end = cp; 348 | return JSON_ERR_NOARRAY; 349 | } 350 | substatus = json_read_array(cp, &cursor->addr.array, &cp); 351 | if (substatus != 0) 352 | return substatus; 353 | state = post_element; 354 | } else if (cursor->type == t_array) { 355 | json_debug_trace((1, 356 | "Array element was specified, but no [.\n")); 357 | if (end != NULL) 358 | *end = cp; 359 | return JSON_ERR_NOBRAK; 360 | } else if (*cp == '{') { 361 | if (cursor->type != t_object) { 362 | json_debug_trace((1, 363 | "Saw { when not expecting object.\n")); 364 | if (end != NULL) 365 | *end = cp; 366 | return JSON_ERR_NOARRAY; 367 | } 368 | substatus = json_read_object(cp, cursor->addr.attrs, &cp); 369 | if (substatus != 0) 370 | return substatus; 371 | --cp; // last } will be re-consumed by cp++ at end of loop 372 | state = post_element; 373 | } else if (cursor->type == t_object) { 374 | json_debug_trace((1, 375 | "Object element was specified, but no {.\n")); 376 | if (end != NULL) 377 | *end = cp; 378 | return JSON_ERR_NOCURLY; 379 | } else if (*cp == '"') { 380 | value_quoted = true; 381 | state = in_val_string; 382 | pval = valbuf; 383 | } else { 384 | value_quoted = false; 385 | state = in_val_token; 386 | pval = valbuf; 387 | *pval++ = *cp; 388 | } 389 | break; 390 | case in_val_string: 391 | if (pval == NULL) 392 | /* don't update end here, leave at value start */ 393 | return JSON_ERR_NULLPTR; 394 | if (*cp == '\\') 395 | state = in_escape; 396 | else if (*cp == '"') { 397 | *pval++ = '\0'; 398 | json_debug_trace((1, "Collected string value %s\n", valbuf)); 399 | state = post_val; 400 | } else if (pval > valbuf + JSON_VAL_MAX - 1 401 | || pval > valbuf + maxlen) { 402 | json_debug_trace((1, "String value too long.\n")); 403 | /* don't update end here, leave at value start */ 404 | return JSON_ERR_STRLONG; /* */ 405 | } else 406 | *pval++ = *cp; 407 | break; 408 | case in_escape: 409 | if (pval == NULL) 410 | /* don't update end here, leave at value start */ 411 | return JSON_ERR_NULLPTR; 412 | else if (pval > valbuf + JSON_VAL_MAX - 1 413 | || pval > valbuf + maxlen) { 414 | json_debug_trace((1, "String value too long.\n")); 415 | /* don't update end here, leave at value start */ 416 | return JSON_ERR_STRLONG; /* */ 417 | } 418 | switch (*cp) { 419 | case 'b': 420 | *pval++ = '\b'; 421 | break; 422 | case 'f': 423 | *pval++ = '\f'; 424 | break; 425 | case 'n': 426 | *pval++ = '\n'; 427 | break; 428 | case 'r': 429 | *pval++ = '\r'; 430 | break; 431 | case 't': 432 | *pval++ = '\t'; 433 | break; 434 | case 'u': 435 | cp++; /* skip the 'u' */ 436 | for (n = 0; n < 4 && isxdigit(*cp); n++) 437 | uescape[n] = *cp++; 438 | uescape[n] = '\0'; /* terminate */ 439 | --cp; 440 | /* ECMA-404 says JSON \u must have 4 hex digits */ 441 | if ((4 != n) || (1 != sscanf(uescape, "%4x", &u))) { 442 | return JSON_ERR_BADSTRING; 443 | } 444 | *pval++ = (unsigned char)u; /* truncate values above 0xff */ 445 | break; 446 | default: /* handles double quote and solidus */ 447 | *pval++ = *cp; 448 | break; 449 | } 450 | state 
= in_val_string; 451 | break; 452 | case in_val_token: 453 | if (pval == NULL) 454 | /* don't update end here, leave at value start */ 455 | return JSON_ERR_NULLPTR; 456 | if (isspace((unsigned char) *cp) || *cp == ',' || *cp == '}') { 457 | *pval = '\0'; 458 | json_debug_trace((1, "Collected token value %s.\n", valbuf)); 459 | state = post_val; 460 | if (*cp == '}' || *cp == ',') 461 | --cp; 462 | } else if (pval > valbuf + JSON_VAL_MAX - 1) { 463 | json_debug_trace((1, "Token value too long.\n")); 464 | /* don't update end here, leave at value start */ 465 | return JSON_ERR_TOKLONG; 466 | } else 467 | *pval++ = *cp; 468 | break; 469 | case post_val: 470 | // Ignore whitespace after either string or token values. 471 | if (isspace(*cp)) { 472 | while (*cp != '\0' && isspace((unsigned char) *cp)) { 473 | ++cp; 474 | } 475 | json_debug_trace((1, "Skipped trailing whitespace: value \"%s\"\n", valbuf)); 476 | } 477 | /* 478 | * We know that cursor points at the first spec matching 479 | * the current attribute. We don't know that it's *the* 480 | * correct spec; our dialect allows there to be any number 481 | * of adjacent ones with the same attrname but different 482 | * types. Here's where we try to seek forward for a 483 | * matching type/attr pair if we're not looking at one. 484 | */ 485 | for (;;) { 486 | int seeking = cursor->type; 487 | if (value_quoted && (cursor->type == t_string 488 | || cursor->type == t_time)) 489 | break; 490 | if ((strcmp(valbuf, "true")==0 || strcmp(valbuf, "false")==0 491 | || isdigit((unsigned char) valbuf[0])) 492 | && seeking == t_boolean) 493 | break; 494 | if (isdigit((unsigned char) valbuf[0])) { 495 | bool decimal = strchr(valbuf, '.') != NULL; 496 | //[hdt] 497 | if (seeking == t_string) { 498 | value_quoted = true; 499 | break;}; 500 | //[\hdt] 501 | if (decimal && seeking == t_real) 502 | break; 503 | if (!decimal && (seeking == t_integer 504 | || seeking == t_uinteger)) 505 | break; 506 | } 507 | if (cursor[1].attribute==NULL) /* out of possiblities */ 508 | break; 509 | if (strcmp(cursor[1].attribute, attrbuf)!=0) 510 | break; 511 | ++cursor; 512 | } 513 | if (value_quoted 514 | && (cursor->type != t_string && cursor->type != t_character 515 | && cursor->type != t_check && cursor->type != t_time 516 | && cursor->type != t_ignore && cursor->map == 0)) { 517 | json_debug_trace((1, "Saw quoted value when expecting" 518 | " non-string.\n")); 519 | return JSON_ERR_QNONSTRING; 520 | } 521 | if (!value_quoted 522 | && (cursor->type == t_string || cursor->type == t_check 523 | || cursor->type == t_time || cursor->map != 0)) { 524 | json_debug_trace((1, "Didn't see quoted value when expecting" 525 | " string.\n")); 526 | return JSON_ERR_NONQSTRING; 527 | } 528 | if (cursor->map != 0) { 529 | for (mp = cursor->map; mp->name != NULL; mp++) 530 | if (strcmp(mp->name, valbuf) == 0) { 531 | goto foundit; 532 | } 533 | json_debug_trace((1, "Invalid enumerated value string \"%s\".\n", 534 | valbuf)); 535 | return JSON_ERR_BADENUM; 536 | foundit: 537 | (void)snprintf(valbuf, sizeof(valbuf), "%d", mp->value); 538 | } 539 | if (cursor->type == t_check) { 540 | lptr = cursor->dflt.check; 541 | } else { 542 | lptr = json_target_address(cursor, parent, offset); 543 | } 544 | if (lptr != NULL) 545 | switch (cursor->type) { 546 | case t_integer: 547 | { 548 | int tmp = atoi(valbuf); 549 | memcpy(lptr, &tmp, sizeof(int)); 550 | } 551 | break; 552 | case t_uinteger: 553 | { 554 | unsigned int tmp = (unsigned int)atoi(valbuf); 555 | memcpy(lptr, &tmp, sizeof(unsigned int)); 
556 | } 557 | break; 558 | case t_short: 559 | { 560 | short tmp = atoi(valbuf); 561 | memcpy(lptr, &tmp, sizeof(short)); 562 | } 563 | break; 564 | case t_ushort: 565 | { 566 | unsigned short tmp = (unsigned int)atoi(valbuf); 567 | memcpy(lptr, &tmp, sizeof(unsigned short)); 568 | } 569 | break; 570 | case t_time: 571 | #ifdef TIME_ENABLE 572 | { 573 | double tmp = iso8601_to_unix(valbuf); 574 | memcpy(lptr, &tmp, sizeof(double)); 575 | } 576 | #endif /* TIME_ENABLE */ 577 | break; 578 | case t_real: 579 | { 580 | double tmp = atof(valbuf); 581 | memcpy(lptr, &tmp, sizeof(double)); 582 | } 583 | break; 584 | case t_string: 585 | if (parent != NULL 586 | && parent->element_type != t_structobject 587 | && offset > 0) 588 | return JSON_ERR_NOPARSTR; 589 | else { 590 | size_t vl = strlen(valbuf), cl = cursor->len-1; 591 | memset(lptr, '\0', cl); 592 | memcpy(lptr, valbuf, vl < cl ? vl : cl); 593 | } 594 | break; 595 | case t_boolean: 596 | { 597 | bool tmp = (strcmp(valbuf, "true") == 0 || strtol(valbuf, NULL, 0)); 598 | memcpy(lptr, &tmp, sizeof(bool)); 599 | } 600 | break; 601 | case t_character: 602 | if (strlen(valbuf) > 1) 603 | /* don't update end here, leave at value start */ 604 | return JSON_ERR_STRLONG; 605 | else 606 | lptr[0] = valbuf[0]; 607 | break; 608 | case t_ignore: /* silences a compiler warning */ 609 | case t_object: /* silences a compiler warning */ 610 | case t_structobject: 611 | case t_array: 612 | break; 613 | case t_check: 614 | if (strcmp(cursor->dflt.check, valbuf) != 0) { 615 | json_debug_trace((1, "Required attribute value %s" 616 | " not present.\n", 617 | cursor->dflt.check)); 618 | /* don't update end here, leave at start of attribute */ 619 | return JSON_ERR_CHECKFAIL; 620 | } 621 | break; 622 | } 623 | __attribute__ ((fallthrough)); 624 | case post_element: 625 | if (isspace((unsigned char) *cp)) 626 | continue; 627 | else if (*cp == ',') 628 | state = await_attr; 629 | else if (*cp == '}') { 630 | ++cp; 631 | goto good_parse; 632 | } else { 633 | json_debug_trace((1, "Garbage while expecting comma or }\n")); 634 | if (end != NULL) 635 | *end = cp; 636 | return JSON_ERR_BADTRAIL; 637 | } 638 | break; 639 | } 640 | } 641 | if (state == init){ 642 | json_debug_trace((1, "Input was empty or white-space only\n")); 643 | return JSON_ERR_EMPTY; 644 | } 645 | 646 | good_parse: 647 | /* in case there's another object following, consume trailing WS */ 648 | while (*cp != '\0' && isspace((unsigned char) *cp)) 649 | ++cp; 650 | if (end != NULL) 651 | *end = cp; 652 | json_debug_trace((1, "JSON parse ends.\n")); 653 | return 0; 654 | } 655 | 656 | int json_read_array(const char *cp, const struct json_array_t *arr, 657 | const char **end) 658 | { 659 | int substatus, offset, arrcount; 660 | char *tp; 661 | 662 | if (end != NULL) 663 | *end = NULL; /* give it a well-defined value on parse failure */ 664 | 665 | json_debug_trace((1, "Entered json_read_array()\n")); 666 | 667 | while (*cp != '\0' && isspace((unsigned char) *cp)) 668 | cp++; 669 | if (*cp != '[') { 670 | json_debug_trace((1, "Didn't find expected array start\n")); 671 | return JSON_ERR_ARRAYSTART; 672 | } else 673 | cp++; 674 | 675 | tp = arr->arr.strings.store; 676 | arrcount = 0; 677 | 678 | /* Check for empty array */ 679 | while (*cp != '\0' && isspace((unsigned char) *cp)) 680 | cp++; 681 | if (*cp == ']') 682 | goto breakout; 683 | 684 | for (offset = 0; offset < arr->maxlen; offset++) { 685 | char *ep = NULL; 686 | json_debug_trace((1, "Looking at %s\n", cp)); 687 | switch (arr->element_type) { 688 | 
case t_string: 689 | while (*cp != '\0' && isspace((unsigned char) *cp)) 690 | cp++; 691 | if (*cp != '"') 692 | return JSON_ERR_BADSTRING; 693 | else 694 | ++cp; 695 | arr->arr.strings.ptrs[offset] = tp; 696 | for (; tp - arr->arr.strings.store < arr->arr.strings.storelen; 697 | tp++) 698 | if (*cp == '"') { 699 | ++cp; 700 | *tp++ = '\0'; 701 | goto stringend; 702 | } else if (*cp == '\0') { 703 | json_debug_trace((1, 704 | "Bad string syntax in string list.\n")); 705 | return JSON_ERR_BADSTRING; 706 | } else { 707 | *tp = *cp++; 708 | } 709 | json_debug_trace((1, "Bad string syntax in string list.\n")); 710 | return JSON_ERR_BADSTRING; 711 | stringend: 712 | break; 713 | case t_object: 714 | case t_structobject: 715 | substatus = 716 | json_internal_read_object(cp, arr->arr.objects.subtype, arr, 717 | offset, &cp); 718 | if (substatus != 0) { 719 | if (end != NULL) 720 | end = &cp; 721 | return substatus; 722 | } 723 | break; 724 | case t_integer: 725 | arr->arr.integers.store[offset] = (int)strtol(cp, &ep, 0); 726 | if (ep == cp) 727 | return JSON_ERR_BADNUM; 728 | else 729 | cp = ep; 730 | break; 731 | case t_uinteger: 732 | arr->arr.uintegers.store[offset] = (unsigned int)strtoul(cp, 733 | &ep, 0); 734 | if (ep == cp) 735 | return JSON_ERR_BADNUM; 736 | else 737 | cp = ep; 738 | break; 739 | case t_short: 740 | arr->arr.shorts.store[offset] = (short)strtol(cp, &ep, 0); 741 | if (ep == cp) 742 | return JSON_ERR_BADNUM; 743 | else 744 | cp = ep; 745 | break; 746 | case t_ushort: 747 | arr->arr.ushorts.store[offset] = (unsigned short)strtol(cp, &ep, 0); 748 | if (ep == cp) 749 | return JSON_ERR_BADNUM; 750 | else 751 | cp = ep; 752 | break; 753 | #ifdef TIME_ENABLE 754 | case t_time: 755 | if (*cp != '"') 756 | return JSON_ERR_BADSTRING; 757 | else 758 | ++cp; 759 | arr->arr.reals.store[offset] = iso8601_to_unix((char *)cp); 760 | if (arr->arr.reals.store[offset] >= HUGE_VAL) 761 | return JSON_ERR_BADNUM; 762 | while (*cp && *cp != '"') 763 | cp++; 764 | if (*cp != '"') 765 | return JSON_ERR_BADSTRING; 766 | else 767 | ++cp; 768 | break; 769 | #else 770 | case t_time: 771 | break; 772 | #endif /* TIME_ENABLE */ 773 | case t_real: 774 | arr->arr.reals.store[offset] = strtod(cp, &ep); 775 | if (ep == cp) 776 | return JSON_ERR_BADNUM; 777 | else 778 | cp = ep; 779 | break; 780 | case t_boolean: 781 | if (str_starts_with(cp, "true")) { 782 | arr->arr.booleans.store[offset] = true; 783 | cp += 4; 784 | } 785 | else if (str_starts_with(cp, "false")) { 786 | arr->arr.booleans.store[offset] = false; 787 | cp += 5; 788 | } else { 789 | int val = strtol(cp, &ep, 0); 790 | if (ep == cp) 791 | return JSON_ERR_BADNUM; 792 | else { 793 | arr->arr.booleans.store[offset] = (bool) val; 794 | cp = ep; 795 | } 796 | } 797 | break; 798 | case t_character: 799 | case t_array: 800 | case t_check: 801 | case t_ignore: 802 | json_debug_trace((1, "Invalid array subtype.\n")); 803 | return JSON_ERR_SUBTYPE; 804 | } 805 | arrcount++; 806 | while (*cp != '\0' && isspace((unsigned char) *cp)) 807 | cp++; 808 | if (*cp == ']') { 809 | json_debug_trace((1, "End of array found.\n")); 810 | goto breakout; 811 | } else if (*cp == ',') 812 | cp++; 813 | else { 814 | json_debug_trace((1, "Bad trailing syntax on array.\n")); 815 | return JSON_ERR_BADSUBTRAIL; 816 | } 817 | } 818 | json_debug_trace((1, "Too many elements in array.\n")); 819 | if (end != NULL) 820 | *end = cp; 821 | return JSON_ERR_SUBTOOLONG; 822 | breakout: 823 | if (arr->count != NULL) 824 | *(arr->count) = arrcount; 825 | if (end != NULL) 826 | *end = 
cp; 827 | json_debug_trace((1, "leaving json_read_array() with %d elements\n", 828 | arrcount)); 829 | return 0; 830 | } 831 | 832 | int json_read_object(const char *cp, const struct json_attr_t *attrs, 833 | const char **end) 834 | { 835 | int st; 836 | 837 | json_debug_trace((1, "json_read_object() sees '%s'\n", cp)); 838 | st = json_internal_read_object(cp, attrs, NULL, 0, end); 839 | return st; 840 | } 841 | 842 | const char *json_error_string(int err) 843 | { 844 | const char *errors[] = { 845 | "unknown error while parsing JSON", 846 | "non-whitespace when expecting object start", 847 | "non-whitespace when expecting attribute start", 848 | "unknown attribute name", 849 | "attribute name too long", 850 | "saw [ when not expecting array", 851 | "array element specified, but no [", 852 | "string value too long", 853 | "token value too long", 854 | "garbage while expecting comma or } or ]", 855 | "didn't find expected array start", 856 | "error while parsing object array", 857 | "too many array elements", 858 | "garbage while expecting array comma", 859 | "unsupported array element type", 860 | "error while string parsing", 861 | "check attribute not matched", 862 | "can't support strings in parallel arrays", 863 | "invalid enumerated value", 864 | "saw quoted value when expecting nonstring", 865 | "didn't see quoted value when expecting string", 866 | "other data conversion error", 867 | "error while parsing a numerical argument", 868 | "unexpected null value or attribute pointer", 869 | "object element specified, but no {", 870 | "input was empty or white-space only", 871 | }; 872 | 873 | if (err <= 0 || err >= (int)(sizeof(errors) / sizeof(errors[0]))) 874 | return errors[0]; 875 | else 876 | return errors[err]; 877 | } 878 | 879 | /* end */ 880 | 881 | -------------------------------------------------------------------------------- /c-version/mjson.h: -------------------------------------------------------------------------------- 1 | /* Structures for JSON parsing using only fixed-extent memory 2 | * 3 | * This file is Copyright (c) 2010 by the GPSD project 4 | * SPDX-License-Identifier: BSD-2-clause 5 | */ 6 | 7 | #include 8 | #include 9 | #include 10 | #ifdef TIME_ENABLE 11 | #include 12 | #endif /* TIME_ENABLE */ 13 | 14 | typedef enum {t_integer, t_uinteger, t_real, 15 | t_string, t_boolean, t_character, 16 | t_time, 17 | t_object, t_structobject, t_array, 18 | t_check, t_ignore, 19 | t_short, t_ushort} 20 | json_type; 21 | 22 | struct json_enum_t { 23 | char *name; 24 | int value; 25 | }; 26 | 27 | struct json_array_t { 28 | json_type element_type; 29 | union { 30 | struct { 31 | const struct json_attr_t *subtype; 32 | char *base; 33 | size_t stride; 34 | } objects; 35 | struct { 36 | char **ptrs; 37 | char *store; 38 | int storelen; 39 | } strings; 40 | struct { 41 | int *store; 42 | } integers; 43 | struct { 44 | unsigned int *store; 45 | } uintegers; 46 | struct { 47 | short *store; 48 | } shorts; 49 | struct { 50 | unsigned short *store; 51 | } ushorts; 52 | struct { 53 | double *store; 54 | } reals; 55 | struct { 56 | bool *store; 57 | } booleans; 58 | } arr; 59 | int *count, maxlen; 60 | }; 61 | 62 | struct json_attr_t { 63 | char *attribute; 64 | json_type type; 65 | union { 66 | int *integer; 67 | unsigned int *uinteger; 68 | short *shortint; 69 | unsigned short *ushortint; 70 | double *real; 71 | char *string; 72 | bool *boolean; 73 | char *character; 74 | const struct json_attr_t *attrs; 75 | const struct json_array_t array; 76 | size_t offset; 77 | } addr; 78 | 
union { 79 | int integer; 80 | unsigned int uinteger; 81 | short shortint; 82 | unsigned short ushortint; 83 | double real; 84 | bool boolean; 85 | char character; 86 | char *check; 87 | } dflt; 88 | size_t len; 89 | const struct json_enum_t *map; 90 | bool nodefault; 91 | }; 92 | 93 | #define JSON_ATTR_MAX 31 /* max chars in JSON attribute name */ 94 | #define JSON_VAL_MAX 512 /* max chars in JSON value part */ 95 | 96 | #ifdef __cplusplus 97 | extern "C" { 98 | #endif 99 | int json_read_object(const char *, const struct json_attr_t *, 100 | const char **); 101 | int json_read_array(const char *, const struct json_array_t *, 102 | const char **); 103 | const char *json_error_string(int); 104 | 105 | #ifdef TIME_ENABLE 106 | extern time_t timegm(struct tm *tm); 107 | #endif /* TIME_ENABLE */ 108 | 109 | void json_enable_debug(int, FILE *); 110 | #ifdef __cplusplus 111 | } 112 | #endif 113 | 114 | #define JSON_ERR_OBSTART 1 /* non-WS when expecting object start */ 115 | #define JSON_ERR_ATTRSTART 2 /* non-WS when expecting attrib start */ 116 | #define JSON_ERR_BADATTR 3 /* unknown attribute name */ 117 | #define JSON_ERR_ATTRLEN 4 /* attribute name too long */ 118 | #define JSON_ERR_NOARRAY 5 /* saw [ when not expecting array */ 119 | #define JSON_ERR_NOBRAK 6 /* array element specified, but no [ */ 120 | #define JSON_ERR_STRLONG 7 /* string value too long */ 121 | #define JSON_ERR_TOKLONG 8 /* token value too long */ 122 | #define JSON_ERR_BADTRAIL 9 /* garbage while expecting comma or } or ] */ 123 | #define JSON_ERR_ARRAYSTART 10 /* didn't find expected array start */ 124 | #define JSON_ERR_OBJARR 11 /* error while parsing object array */ 125 | #define JSON_ERR_SUBTOOLONG 12 /* too many array elements */ 126 | #define JSON_ERR_BADSUBTRAIL 13 /* garbage while expecting array comma */ 127 | #define JSON_ERR_SUBTYPE 14 /* unsupported array element type */ 128 | #define JSON_ERR_BADSTRING 15 /* error while string parsing */ 129 | #define JSON_ERR_CHECKFAIL 16 /* check attribute not matched */ 130 | #define JSON_ERR_NOPARSTR 17 /* can't support strings in parallel arrays */ 131 | #define JSON_ERR_BADENUM 18 /* invalid enumerated value */ 132 | #define JSON_ERR_QNONSTRING 19 /* saw quoted value when expecting nonstring */ 133 | #define JSON_ERR_NONQSTRING 20 /* didn't see quoted value when expecting string */ 134 | #define JSON_ERR_MISC 21 /* other data conversion error */ 135 | #define JSON_ERR_BADNUM 22 /* error while parsing a numerical argument */ 136 | #define JSON_ERR_NULLPTR 23 /* unexpected null value or attribute pointer */ 137 | #define JSON_ERR_NOCURLY 24 /* object element specified, but no { */ 138 | #define JSON_ERR_EMPTY 25 /* input was empty or white-space only */ 139 | 140 | /* 141 | * Use the following macros to declare template initializers for structobject 142 | * arrays. Writing the equivalents out by hand is error-prone. 143 | * 144 | * STRUCTOBJECT takes a structure name s, and a fieldname f in s. 145 | * 146 | * STRUCTARRAY takes the name of a structure array, a pointer to a an 147 | * initializer defining the subobject type, and the address of an integer to 148 | * store the length in. 
149 | */ 150 | #define STRUCTOBJECT(s, f) .addr.offset = offsetof(s, f) 151 | #define STRUCTARRAY(a, e, n) \ 152 | .addr.array.element_type = t_structobject, \ 153 | .addr.array.arr.objects.subtype = e, \ 154 | .addr.array.arr.objects.base = (char*)a, \ 155 | .addr.array.arr.objects.stride = sizeof(a[0]), \ 156 | .addr.array.count = n, \ 157 | .addr.array.maxlen = (int)(sizeof(a)/sizeof(a[0])) 158 | 159 | /* json.h ends here */ 160 | -------------------------------------------------------------------------------- /c-version/snr-cli.c: -------------------------------------------------------------------------------- 1 | /* snr-cli.c 2 | Command-line processor for SNR -- signal-to-noise ratio 3 | analysis program for RTL_433 packets. 4 | 5 | hdtodd@gmail.com, 2022.05.17 6 | */ 7 | 8 | #define _XOPEN_SOURCE 9 | #include 10 | #include 11 | #include 12 | #include 13 | #include 14 | #include 15 | #include 16 | #include 17 | 18 | extern time_t dFirst, dLast; 19 | extern char inFileName[60]; 20 | 21 | int processCmdLine(int argc, char* argv[]) { 22 | int c, i, j; 23 | const char* short_opt = "h?:f:s:e:"; 24 | struct tm tm; 25 | 26 | struct myOption { 27 | char* name; 28 | int has_arg; 29 | int* flag; 30 | int val; 31 | char* desc; 32 | }; 33 | 34 | struct myOption long_opt[] = 35 | { 36 | {"help", optional_argument, NULL, 'h', "This help message"}, 37 | {"file", required_argument, NULL, 'f', "-f : source data file to process"}, 38 | {"start", required_argument, NULL, 's', "-s : date-time of first record to process, in form YYYY-MM-DD HH:MM:SS"}, 39 | {"end", required_argument, NULL, 'e', "-e : date-time of last record to process, in form YYYY-MM-DD HH:MM:SS"}, 40 | {NULL, 0, NULL, 0, NULL } 41 | }; 42 | 43 | // validate arguments and/or provide help 44 | 45 | // Set default values for options, then check for command-line values 46 | dFirst = 0; // Process all dates 47 | dLast = 0x7FFFFFFF; // Process all dates; watch for neg 64-bit time 48 | 49 | // If no options provided, simulate "-h" 50 | c = getopt_long(argc, argv, short_opt, (const struct option *)long_opt, NULL); 51 | if (c == -1) c = 'h'; 52 | do { 53 | switch(c) { 54 | case -1: /* no more arguments */ 55 | case 0: /* long options toggles */ 56 | break; 57 | 58 | case 'f': 59 | if (strlen(optarg)<(sizeof(inFileName)-1)) { 60 | strcpy(inFileName, optarg); 61 | if (access(inFileName,R_OK) !=0) { 62 | printf("Specified input file '%s' not found or not readable\n", inFileName); 63 | return(-1); 64 | } 65 | } else { 66 | printf("Input data file name '%s' exceeds allocated storage\n", inFileName); 67 | return(-1); 68 | }; 69 | break; 70 | 71 | case 's': 72 | memset(&tm, 0, sizeof(struct tm)); 73 | strptime(optarg, "%Y-%m-%d %H:%M:%S", &tm); 74 | tm.tm_isdst = 1; 75 | dFirst = mktime(&tm); 76 | break; 77 | 78 | case 'e': 79 | memset(&tm, 0, sizeof(struct tm)); 80 | strptime(optarg, "%Y-%m-%d %H:%M:%S", &tm); 81 | tm.tm_isdst = 1; 82 | dLast = mktime(&tm); 83 | break; 84 | 85 | case '?': 86 | case 'h': 87 | printf("SNR: Program to analyze rtl_433 packet logs for SNR performance\n"); 88 | printf(" Usage: %s [OPTIONS]\n", argv[0]); 89 | printf(" [OPTIONS] are any combination of\n\tLong form Short\tOption invoked\n"); 90 | for (i=0; long_opt[i].name!=NULL; i++) { 91 | printf("\t--%-14s-%c\t%s\n", long_opt[i].name, long_opt[i].val, long_opt[i].desc); 92 | }; 93 | return(-1); 94 | 95 | default: 96 | fprintf(stderr, "%s: invalid option -- %c\n", argv[0], c); 97 | fprintf(stderr, "Try `%s --help' for more information.\n", argv[0]); 98 | return(-2); 99 | }; 
100 | } 101 | while ((c = getopt_long(argc, argv, short_opt, (const struct option *)long_opt, NULL)) != -1);; 102 | 103 | return(0); 104 | }; 105 | 106 | -------------------------------------------------------------------------------- /c-version/snr.c: -------------------------------------------------------------------------------- 1 | /* snr.c -- collect basic statistics on signal-to-noise ratio 2 | data from rtl_433 JSON logs. 3 | 4 | Written by HDTodd@gmail.com, 2022.05.16 5 | */ 6 | 7 | #define _XOPEN_SOURCE 8 | #include 9 | #include 10 | #include 11 | #include 12 | #include 13 | #include 14 | #include 15 | #include 16 | #include 17 | #include 18 | #include 19 | 20 | #include "mjson.h" 21 | #include "stats.h" 22 | #include "tree.h" 23 | 24 | char model[201]; 25 | char timestring[40]; 26 | char id[20]; 27 | double snr; 28 | 29 | const struct json_attr_t json_rtl[] = { 30 | {"time", t_string, .addr.string = timestring, .len = sizeof(timestring)}, 31 | {"model", t_string, .addr.string = model, .len = sizeof(model)}, 32 | {"id", t_string , .addr.string = id, .len = sizeof(id)}, 33 | {"snr", t_real, .addr.real = &snr}, 34 | {"", t_ignore}, 35 | {NULL}, 36 | };; 37 | 38 | // Filled in by cli 39 | time_t dFirst, dLast; 40 | char inFileName[60]; 41 | int fnLen = 39; 42 | 43 | void node_print(NPTR p) { 44 | stats_get( (bstats *)p->attr); 45 | printf("%-27s", p->key); 46 | stats_print( (bstats *)p->attr); 47 | }; 48 | 49 | 50 | // Local internal vars 51 | extern int processCmdLine(int argc, char* argv[]); 52 | FILE *fp; 53 | char *path = "test.json"; 54 | char lbuf[501]; 55 | time_t timestamp, lasttime = 0; 56 | char lastmodel[201] = ""; 57 | time_t earliestDTS = (time_t) 0x7FFFFFFF; // watch out for 64-bit time_t, negative times 58 | time_t latestDTS = (time_t) 0x00000000; 59 | char ft[80], lt[80]; 60 | struct tm ts; 61 | 62 | int main(int argc, char *argv[]) 63 | { 64 | int option; 65 | unsigned int lc = 0, rc=0; 66 | int status = 0; 67 | int errCode = 0; 68 | struct tm tm; 69 | NPTR root=NULL, base, node; 70 | APTR attr; 71 | bstats *snrstats; 72 | 73 | printf("\nsnr: Analyze rtl_433 json log files\n"); 74 | // process command line to retrieve options selected, leave in global vars 75 | if ( (errCode=processCmdLine(argc, argv)) != 0 ) exit(errCode); 76 | fp = fopen(inFileName, "r"); 77 | if (!fp) { 78 | perror(inFileName); 79 | exit(EXIT_FAILURE); 80 | }; 81 | 82 | // Ready to process input file 83 | printf("Processing ISM 433MHz messages from file %s\n", inFileName); 84 | while (fgets(lbuf, sizeof(lbuf), fp)) { 85 | lc++; 86 | status = json_read_object(lbuf, json_rtl, NULL); 87 | strptime(timestring, "%Y-%m-%d %H:%M:%S", &tm); 88 | tm.tm_isdst = 1; 89 | timestamp = mktime(&tm); 90 | if (timestampdLast) continue; // ignore recs not in date-time range 91 | // Statement below makes 'model'+'id' the key for cataloging and summarizing 92 | // Change the following statement to experiment with other keys 93 | strcat(model, " "); strcat(model, id); // 'model'+'id' is the key for lookups 94 | if ( (strcmp(model, lastmodel) != 0) || 95 | (timestamp > lasttime+2) ) { 96 | #ifdef DEBUG 97 | printf("Line %d: timestamp=%s=%lu, model=%s, snr=%lf\n", lc, timestring, timestamp, model, snr); 98 | #endif 99 | rc++; 100 | node = node_find(root,model); 101 | if (root == NULL) root = node; 102 | if (node != NULL) 103 | stats_append(snr, (bstats *)node->attr); 104 | else { 105 | fprintf(stderr, "NULL node for %s at line %d\n", model, lc); 106 | exit(EXIT_FAILURE); 107 | }; 108 | 109 | if (timestamplatestDTS) 
-------------------------------------------------------------------------------- /c-version/stats.c: -------------------------------------------------------------------------------- 1 | /* stats.c -- functions to instantiate a basic statistics data node 2 | (count of records, mean, standard deviation, min value, max value) 3 | and to accumulate basic statistics from a stream of data through 4 | a series of calls to 'append'. Uses recursive algorithm to 5 | compute mean and std deviation. 6 | 7 | WARNING: std2, stored internally, is the SQUARE of the std dev. 8 | Need to perform the 'get' to convert to std dev when all 9 | stream data have been processed. 10 | 11 | hdtodd@gmail.com, 2022.05.22 12 | */ 13 | 14 | #include <stdio.h> 15 | #include <stdlib.h> 16 | #include <math.h> 17 | 18 | #include "stats.h" 19 | 20 | // creates a new data node to hold the basic statistics data 21 | bstats *stats_new(void) { 22 | bstats *data; 23 | 24 | data = (bstats *) malloc(sizeof(bstats)); 25 | if (data == NULL) { 26 | fprintf(stderr, "Unable to allocate space for a stats block!"); 27 | exit(EXIT_FAILURE); 28 | }; 29 | 30 | data->count = 0; 31 | data->mean = (double)0.0e0; 32 | data->std2 = (double)0.0e0; 33 | data->min = (double)+INFINITY; 34 | data->max = (double)-INFINITY; 35 | return(data); 36 | }; 37 | 38 | void stats_print(bstats *data) { 39 | // printf("Count = %d, avg = %lf, std2 dev = %lf, min = %lf, max = %lf\n", 40 | printf("%6d %6.1lf ± %4.1lf %6.1lf %6.1lf\n", 41 | data->count, data->mean, data->std2, data->min, data->max); 42 | return; 43 | }; 44 | 45 | bstats *stats_get(bstats* data) { 46 | data->std2 = sqrt(data->std2); 47 | return(data); 48 | }; 49 | 50 | void stats_append(double x, bstats *self) { 51 | self->count += 1; 52 | self->mean = ( (double)(self->count - 1) * self->mean + x)/(double)self->count; 53 | self->std2 = self->count<2 ? 0.0 : 54 | ( (self->count-2)*self->std2 + 55 | (double)(self->count)*(self->mean-x)*(self->mean-x)/(double)(self->count - 1) )/(double)(self->count-1); 56 | self->min = x<self->min ? x : self->min; 57 | self->max = x>self->max ? x : self->max; 58 | 59 | return; 60 | }; 61 |
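As the header comment warns, `std2` holds the running *variance* until `stats_get()` replaces it with the standard deviation, so `stats_get()` should be called exactly once per node, just before printing. A small self-contained check of the recurrence follows (the `main()` driver is illustrative, not part of the repository): appending the values 1, 2, 3 should leave count = 3, mean = 2.0, σ = 1.0, min = 1.0, max = 3.0.

```
/* Illustrative check of the running mean/std-deviation recurrence in stats.c */
#include "stats.h"

int main(void) {
    bstats *s = stats_new();
    double xs[] = {1.0, 2.0, 3.0};
    for (int i = 0; i < 3; i++)
        stats_append(xs[i], s);
    stats_get(s);     /* converts the stored variance (std2) into a std deviation -- call once */
    stats_print(s);   /* prints count, mean ± sigma, min, max -> 3, 2.0 ± 1.0, 1.0, 3.0 */
    return 0;
}
```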
-------------------------------------------------------------------------------- /c-version/stats.h: -------------------------------------------------------------------------------- 1 | //stats.h 2 | // Data structure to contain basic statistics for 3 | // a stream of data values: 4 | // count of # of values, mean, std dev, min, max 5 | // hdtodd@gmail.com, 2022.05.22 6 | 7 | typedef struct { 8 | int count; 9 | double mean; 10 | double std2; 11 | double min; 12 | double max; 13 | } bstats; 14 | 15 | bstats *stats_new(void); 16 | void stats_print(bstats *data); 17 | bstats *stats_get(bstats *data); 18 | void stats_append(double x, bstats *data); 19 | 20 | -------------------------------------------------------------------------------- /c-version/tree.c: -------------------------------------------------------------------------------- 1 | // tree.c -- simple binary tree search/insert/update code 2 | // modeled after Kernighan & Ritchie, "C Programming Language", pg 130 3 | // Accumulates basic statistics (mean, std dev, min, max) of a stream of 4 | // values, associated with a labeled node in a binary tree 5 | // Data values are stored in a data structure associated with each 6 | // keyed node. 7 | // hdtodd@gmail.com, 2022.05.22 8 | 9 | #include <stdio.h> 10 | #include <stdlib.h> 11 | #include <string.h> 12 | #include <math.h> 13 | 14 | #include "stats.h" 15 | #include "tree.h" 16 | 17 | char MODULE[] = "tree -- simple binary tree model"; 18 | int node_number = 0; 19 | 20 | // create a new attribute data node 21 | APTR attr_new(void) { 22 | APTR p; 23 | #ifdef DEBUG 24 | printf("entering attr_new\n"); 25 | #endif 26 | p = (APTR) malloc(sizeof(ATTR)); 27 | if (p == NULL) { 28 | fprintf(stderr, "Out of memory while allocating space for a new attribute node in 'tree'\n"); 29 | exit(EXIT_FAILURE); 30 | }; 31 | p->count = 0; 32 | p->mean = p->std2 = (double)0.0e0; 33 | p->min = (double)+INFINITY; 34 | p->max = (double)-INFINITY; 35 | return p; 36 | }; 37 | 38 | // create a new binary tree node and instantiate its associated attribute node 39 | NPTR node_new(char *key) { 40 | NPTR p; 41 | node_number++; 42 | #ifdef DEBUG 43 | printf("entering node_new with key '%s' to create node # %d\n", key, node_number); 44 | #endif 45 | p = (NPTR) malloc(sizeof(NODE)); 46 | if (p == NULL) { 47 | fprintf(stderr, "Out of memory allocating space for a new binary tree node in 'tree'\n"); 48 | exit(EXIT_FAILURE); 49 | }; 50 | p->key = (char *) malloc(strlen(key)+1); 51 | strcpy(p->key,key); 52 | p->attr = (APTR) attr_new(); 53 | p->num = node_number; 54 | p->bh = EH; 55 | p->lptr = p->rptr = NULL; 56 | #ifdef DEBUG 57 | printf("Node[%2d]: key=%s, balance=%d, lptr=0x%x, rptr=0x%x\n", p->num, p->key, p->bh, p->lptr, p->rptr); 58 | printf("\tattr address=%x, attr_count = %d\n", p->attr, (p->attr)->count); 59 | #endif 60 | return p; 61 | }; 62 | 63 | /* 64 | void node_print(NPTR p) { 65 | if (p == NULL) printf("NULL pointer!\n"); 66 | else 67 | printf("\tNode [%3d] @ 0x%x:%20s, cnt = %d, lptr = 0x%x, rptr = 0x%x\n", 68 | p->num, p, p->key, (p->attr)->count, p->lptr, p->rptr); 69 | }; 70 | */ 71 | 72 | // find the tree node associated with 'key' or create a new node 73 | // at the appropriate place in the binary tree 74 | NPTR node_find(NPTR root, char *key) { 75 | int cond; 76 | NPTR p = root; 77 | if (p == NULL) return(node_new(key)); 78 | while (p != NULL) { 79 | if ( (cond=strcmp(key, p->key)) == 0) 80 | return(p); 81 | else if (cond<0) 82 | if (p->lptr == NULL) return (p->lptr = node_new(key)); 83 | else p =
p->lptr; 84 | else 85 | if (p->rptr == NULL) return (p->rptr = node_new(key)); 86 | else p = p->rptr; 87 | }; 88 | return(NULL); 89 | }; 90 | 91 | // Not implemented as node_find does this 92 | NPTR node_insert(char *key) { 93 | printf("Spurious call to 'node_insert: should not have occurred\n"); 94 | return NULL; 95 | }; 96 | 97 | // In-order printing of the tree with the 'stats_print' function 98 | void tree_print(NPTR p) { 99 | if (p != NULL) { 100 | tree_print(p->lptr); 101 | #ifdef DEBUG 102 | printf("In tree_print, Node[%3d]: %-12s %2d\n", p->num, p->key, (p->attr)->count); 103 | #endif 104 | stats_get( (bstats *)p->attr); 105 | printf("%-30s ", p->key); 106 | stats_print( (bstats *)p->attr); 107 | tree_print(p->rptr); 108 | }; 109 | }; 110 | 111 | // In-order processing of the tree nodes using an external 112 | // (callback) routine for the actual node processing 113 | void tree_process(NPTR p, void print_node(NPTR p)) { 114 | if (p != NULL) { 115 | tree_process(p->lptr, *print_node); 116 | print_node(p); 117 | tree_process(p->rptr, *print_node); 118 | }; 119 | }; 120 | -------------------------------------------------------------------------------- /c-version/tree.h: -------------------------------------------------------------------------------- 1 | // tree.h 2 | // Include file with definitions used by binary tree code 3 | 4 | // Define balance factor for subtree: 5 | // LH==>left high; EH==>equal height; RH==>right high 6 | typedef enum {LH=-1, EH=0, RH =+1} BALANCEFACTOR; 7 | 8 | typedef struct attr { 9 | int count; 10 | double mean; 11 | double std2; 12 | double min; 13 | double max; 14 | } ATTR, *APTR; 15 | 16 | typedef struct node { 17 | char *key; 18 | APTR attr; 19 | int num; 20 | BALANCEFACTOR bh; 21 | struct node *lptr; 22 | struct node *rptr; 23 | } NODE, *NPTR; 24 | 25 | NPTR node_find(NPTR p, char *key); 26 | void node_dump(void); 27 | void tree_print(NPTR p); 28 | //void node_print(NPTR p); 29 | void tree_process(NPTR p, void node_print()); 30 | -------------------------------------------------------------------------------- /c-version/xaa-output.prn: -------------------------------------------------------------------------------- 1 | 2 | snr: Analyze rtl_433 json log files 3 | Processing ISM 433MHz messages from file ../xaa.json 4 | 5 | Processed 7045 de-duplicated records 6 | Dated from Thu 2022-06-09 07:08:27 to Thu 2022-06-09 19:46:16 7 | 8 | Device #Recs Mean SNR ± 𝜎 Min Max 9 | Acurite-01185M 0 4 9.6 ± 4.9 6.4 16.9 10 | Acurite-606TX 134 858 8.4 ± 2.1 5.5 20.0 11 | Acurite-609TXC 194 1446 19.2 ± 0.5 12.4 21.2 12 | Acurite-Tower 11524 2753 19.2 ± 0.5 13.2 20.8 13 | Hyundai-VDO 60b87768 1 11.0 ± 0.0 11.0 11.0 14 | Hyundai-VDO aeba4a98 1 7.2 ± 0.0 7.2 7.2 15 | LaCrosse-TX141Bv3 253 348 8.2 ± 1.1 5.7 11.5 16 | LaCrosse-TX141THBv2 168 840 9.6 ± 1.1 6.0 19.2 17 | Markisol 0 75 19.2 ± 0.9 12.3 20.2 18 | Markisol 256 20 19.3 ± 0.4 18.5 20.2 19 | Prologue-TH 203 699 11.6 ± 1.3 7.2 19.5 20 | -------------------------------------------------------------------------------- /rtl_433_stats: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | # rtl_433_stats 3 | VERSION="2.2.0" 4 | 5 | # Program to catalog devices seen by rtl_433 and to analyze and summarize 6 | # device performance data. 7 | # This version analyzes signal-to-noise ratios (SNR), 8 | # inter-transmission gap times (ITGT), frequency variations (Freq), 9 | # and packets per transmission (PPT). 
10 | # A "transmission" is a group of one or more packets broadcast to provide information. 11 | # A "packet" is a group of bits sent to communicate that information. 12 | # More than one packet may be sent per transmission in order to increase reliability. 13 | # SNR and Freq statistics are summarized over all packets from each device 14 | # ITGT and PPT summarize statistics on transmissions 15 | # (starting with the first packet of each transmission) 16 | 17 | import sys 18 | import argparse 19 | import fileinput 20 | import json 21 | from json.decoder import JSONDecodeError 22 | import time 23 | import datetime 24 | import math 25 | 26 | # Set gap time for packets to be considered to be duplicated 27 | dup_thresh = 2.0 28 | 29 | AP_DESCRIPTION=""" 30 | \tAnalyze rtl_433 JSON logs to catalog the devices seen and to characterize 31 | \tstatistically their signal-to-noise ratio (SNR), times between 32 | \ttransmissions (ITGT), radio frequency (Freq), and packets per transmission (PPT). 33 | """ 34 | 35 | AP_EPILOG=""" 36 | Running: 37 | python3 rtl_433_stats [options] [-i or defaults to stdin] 38 | 39 | """ 40 | 41 | ########################################################################################## 42 | # Define a class to accumulate basic statistics over a stream of data 43 | class stats(): 44 | 45 | def __init__(self, x): 46 | self.count = 1 47 | self.mean = x 48 | self.std2 = 0.0 49 | self.min = x 50 | self.max = x 51 | 52 | def append(self,x): 53 | self.count += 1 54 | self.mean = ((self.count-1)*self.mean + x)/self.count 55 | self.std2 = 0 if self.count<2 else ( (self.count-2)*self.std2 + 56 | (self.count*(self.mean-x)**2)/(self.count-1) )/(self.count-1) 57 | if self.count > 1: 58 | self.min = x if x<self.min else self.min 59 | self.max = x if x>self.max else self.max 60 | return 61 | 62 | def get(self): 63 | return (self.count, self.mean, math.sqrt(self.std2), self.min, self.max) 64 | 65 | 66 | ########################################################################################## 67 | # Define a class to hold the data for a device and procedures to create, update, access 68 | class Data: 69 | def __init__(self, snr, eTime, freq, battery, status): 70 | self.pktcount = 1 71 | self.xmtcount = 0 72 | self.pkt_xmt = 1 73 | self.last_pkt_time = eTime 74 | self.last_xmt_time = eTime 75 | self.battery = battery 76 | self.status = status 77 | self.SNR = None if omitSNR else stats(snr) 78 | self.ITGT = None 79 | self.Freq = None if omitFreq else stats(freq) 80 | self.PPT = None 81 | return 82 | 83 | # Update information about this device and return a 84 | # flag to indicate if it was a duplicate record for this xmit 85 | def update(self, snr, eTime, freq, battery, status): 86 | self.pktcount += 1 87 | self.last_pkt_time = eTime 88 | dup = eTime < self.last_xmt_time + dup_thresh 89 | if not omitSNR: 90 | self.SNR.append(snr) 91 | if not omitITGT and not dup: 92 | if self.ITGT is None: 93 | self.ITGT = stats(eTime-self.last_xmt_time) 94 | else: 95 | self.ITGT.append(eTime-self.last_xmt_time) 96 | self.last_xmt_time = eTime 97 | self.xmtcount += 1 98 | self.last_xmt_time = eTime 99 | if not omitFreq: 100 | self.Freq.append(freq) 101 | if not omitPPT: 102 | if not dup: 103 | if self.PPT is None: 104 | self.PPT = stats(self.pkt_xmt) 105 | else: 106 | self.PPT.append(self.pkt_xmt) 107 | self.pkt_xmt = 0 108 | self.pkt_xmt += 1 109 | if battery != self.battery: 110 | batt_change = True 111 | self.battery = battery 112 | else: 113 | batt_change = False 114 | if status != self.status: 115 | stat_change = True 116 | self.status = status
117 | else: 118 | stat_change = False 119 | 120 | return (dup, batt_change, stat_change) 121 | 122 | def get(self): 123 | return (self.pktcount, 124 | self.xmtcount, 125 | self.SNR, 126 | self.ITGT, 127 | self.Freq, 128 | self.PPT) 129 | 130 | ########################################################################################## 131 | # Routine to reate the command parser, parse cmd line, and set defaults 132 | def make_parser(): 133 | 134 | parser = argparse.ArgumentParser(formatter_class=argparse.RawDescriptionHelpFormatter, 135 | description=AP_DESCRIPTION, 136 | epilog=AP_EPILOG) 137 | 138 | parser.add_argument("-i", "--input", metavar="FILE", nargs="*", 139 | help="Path to JSON log files to read in; can be .gz; can be wildcard; blank if ") 140 | parser.add_argument("-o", "--omit", choices=["SNR", "ITGT", "Freq", "PPT"], 141 | nargs="+") 142 | parser.add_argument("-x", "--exclude_noise", type=int, dest="noise", 143 | help="Exclude device records with fewer than 'NOISE' packets seen") 144 | parser.add_argument("-w", "--xmt_window", dest="window", type=float, 145 | help="Max time in sec for a packet group to be considered as one transmission (default: %(default)s)") 146 | parser.add_argument("-T", "--include_TPMS", action='store_true', 147 | dest="include_TPMS", default=False, 148 | help="include tire pressure monitors in catalog (default: %(default)s)") 149 | parser.add_argument("-v", "--version", action="version", version=VERSION) 150 | 151 | args = parser.parse_args() 152 | return args 153 | 154 | ############################################################################### 155 | # Convert time string (ts) from ISO format to epoch time 156 | # Or, if ts is in epoch time, convert to timestamp format. 157 | # Return both formats for use in processing and displaying 158 | def CnvTime(ts): 159 | if ts.find("-") > 0: 160 | try: 161 | eTime = datetime.datetime.fromisoformat(ts).timestamp() 162 | timestamp = ts 163 | except ValueError as e: 164 | err={} 165 | print("datetime error in input line converting time string: ", ts) 166 | print("datetime msg:", err.get("error", str(e))) 167 | sys.exit(1) 168 | else: 169 | try: 170 | eTime = float(ts) 171 | timestamp = datetime.datetime.fromtimestamp(eTime) 172 | except ValueError as e: 173 | err = {} 174 | print("Datetime conversion failed on line with datetime string", ts) 175 | print("float() error msg:", err.get("error", str(e))) 176 | sys.exit(1) 177 | 178 | return(timestamp, eTime) 179 | # End CnvTime() 180 | 181 | ########################################################################################## 182 | # main program 183 | start_time = time.process_time() 184 | 185 | print("rtl_433_stats:", AP_DESCRIPTION) 186 | 187 | args = make_parser() 188 | omitSNR = args.omit is not None and "SNR" in args.omit 189 | omitITGT = args.omit is not None and "ITGT" in args.omit 190 | omitFreq = args.omit is not None and "Freq" in args.omit 191 | omitPPT = args.omit is not None and "PPT" in args.omit 192 | 193 | firstTime = float('inf') 194 | lastTime = 0.0 195 | DDTC = 0 196 | devices = {} 197 | 198 | print("Processing ISM 433MHz messages recorded by rtl_433") 199 | print("Including" if not omitSNR else "Excluding", "SNR Stats") 200 | print("Including" if not omitITGT else "Excluding", "ITGT Stats") 201 | print("Including" if not omitFreq else "Excluding", "Freq Stats") 202 | print("Including" if args.include_TPMS else "Excluding", "TPMS devices") 203 | if args.noise is not None: 204 | print("Excluding devices with fewer than", args.noise, "packets 
seen") 205 | noise = args.noise 206 | else: 207 | noise = 0 208 | if args.window is not None: 209 | print("Using transmission window of", args.window, "sec rather than deafult", dup_thresh, "sec") 210 | dup_thresh = args.window 211 | 212 | lc = 0 # line count, for error reporting 213 | with fileinput.FileInput(files=args.input, openhook=fileinput.hook_compressed) as log: 214 | #open(args.fn,"rt") as log: 215 | for line in log: 216 | if log.isfirstline(): 217 | print("Processing file", log.filename()) 218 | lc += 1 219 | # strip any NUL chars from the line 220 | line = str.replace(line, "\x00", "", -1) 221 | # and unpack the rtl_433 JSON log record 222 | try: 223 | y = json.loads(line) 224 | except JSONDecodeError as e: 225 | print("JSON decode error at file line ", fileinput.filelineno()) 226 | print("Line contents:\n", line) 227 | err = {} 228 | print("JSON error msg:", err.get("error", str(e))) 229 | print("\nOr are there null characters in your input file?") 230 | print("On Linux, try sed \"s/\\x0//g' oldfile.json > newfile.json\" to remove them") 231 | print("On OSX/Posix, try \"tr -d '\\000' < oldfile.json > newfile.json\"") 232 | sys.exit(1) 233 | except TypeError as e: 234 | print("JSON type error in file line", fileinput.filelineno()) 235 | print("Line contents:\n", line) 236 | err = {} 237 | print("JSON error msg:", err.get("error", str(e))) 238 | sys.exit(1) 239 | 240 | # Ignore packets with no "model" field 241 | if "model" not in y: 242 | continue 243 | 244 | # Ignore TPMS packets if told to do so 245 | if not args.include_TPMS: 246 | if "type" in y and y["type"]=="TPMS": 247 | continue 248 | 249 | # Statements below make 'model'/'channel'/'id' the key for cataloging and summarizing 250 | dev = y["model"] + "/" 251 | if "channel" in y: 252 | dev += str(y["channel"]) 253 | dev += "/" 254 | if "id" in y: 255 | dev += str(y["id"]) 256 | 257 | # Convert data values to standard form 258 | (ts,eTime) = CnvTime(y["time"]) 259 | snr = 0.0 if "snr" not in y else float(y["snr"]) 260 | freq = 0.0 if "freq" not in y else float(y["freq"]) 261 | battery = None if "battery_ok" not in y else y["battery_ok"] 262 | status = None if "status" not in y else y["status"] 263 | 264 | # Mark earliest and latest records for reporting 265 | firstTime = min(eTime, firstTime) 266 | lastTime = max(eTime, lastTime) 267 | 268 | # We keep the following counters as the file is processed: 269 | # lc = the total number of packets seen, of all types 270 | # DDTC = the number of de-duplicated transmissions seen 271 | # Where a transmission may include 1 or up to 6 or more replicated packets 272 | # Replicated packets may NOT have the same SNR or frequency and so are 273 | # included in the SNR and Freq statistics. Data fields not compared. 274 | # ITGT measures the time between the first packet of one transmission 275 | # and the first packet of the prior transmission. 
276 | 277 | if dev in devices: 278 | (dup, batt_change, stat_change) = devices[dev].update(snr, eTime, freq, battery, status) 279 | else: 280 | devices[dev] = Data(snr, eTime, freq, battery, status) 281 | (dup, batt_change, stat_change) = (False, False, False) 282 | if not dup: 283 | DDTC += 1 284 | if batt_change: 285 | print("{:<17} to {:>3} for {:<30} at {:<25}".format("battery_ok change", battery, dev, y["time"])) 286 | if stat_change: 287 | print("{:<17} to {:>3} for {:<30} at {:<25}".format("status change", status, dev, y["time"])) 288 | 289 | # Finished processing log files; write summary report 290 | print("\nProcessed", lc, "Packets as", DDTC, "De-Duplicated Transmissions", 291 | "in {:>5.2f}sec".format(time.process_time()-start_time), 292 | "\nPackets dated from",time.strftime("%a %Y-%m-%d %H:%M:%S",time.localtime(firstTime)), 293 | "to", time.strftime("%a %Y-%m-%d %H:%M:%S",time.localtime(lastTime))) 294 | print() 295 | 296 | # First, the header, dependent upon cmdline options 297 | print("{:<30} ".format(""), end="") 298 | if not omitSNR: 299 | print("{:^36} ".format("Signal-to-Noise"), end="") 300 | if not omitITGT: 301 | print("{:^36}".format("Inter-Transmission Gap Time"), end="") 302 | if not omitFreq: 303 | print(" {:^40}".format("Frequency (MHz)"), end="") 304 | if not omitPPT: 305 | print(" {:^36} ".format("Packets per Transmit"), end="") 306 | print() 307 | print("{:<30} ".format("Device model/channel/id"), end="") 308 | 309 | if not omitSNR: 310 | print("{:>33} ".format("_"*33), end="") 311 | if not omitITGT: 312 | print(" {:>40} ".format("_"*39), end="") 313 | if not omitFreq: 314 | print(" {:>44}".format("_"*44), end="") 315 | if not omitPPT: 316 | print(" {:>36}".format("_"*36), end="") 317 | print() 318 | 319 | print("{:<30} ".format(""), end="") 320 | if not omitSNR: 321 | print("{:>6} {:>8} {:>5} {:>5}".format("#Pkts", " Mean ± 𝜎", "Min", "Max"), end="") 322 | if not omitITGT: 323 | print(" {:>7} {:>13} {:>5} {:>5}".format(" #Gaps", "Mean ± 𝜎", "Min", "Max"), end="") 324 | if not omitFreq: 325 | print(" {:>7} {:>5} {:>5} {:>5}".format(" #Pkts", "Mean ± 𝜎", "Min", "Max"), end="") 326 | if not omitPPT: 327 | print(" {:>7} {:>5} {:>5} {:>5}".format("#Xmits", "Mean ± 𝜎", "Min", "Max"), end="") 328 | print() 329 | 330 | # And now the data values from the table 331 | for d in sorted(devices): 332 | (pkt, xmt, SNR, ITGT, Freq, PPT) = devices[d].get() 333 | if pkt < noise: 334 | continue 335 | print("{:<30}".format(d), end="") 336 | if not omitSNR: 337 | (n,avg,std,min,max) = SNR.get() 338 | print("{:>7} {:>6.1f} ±{:>5.1f} {:>5.1f} {:>5.1f}".format(n,avg,std,min,max), end="") 339 | if not omitITGT: 340 | if ITGT is not None: 341 | (n,avg,std,min,max) = ITGT.get() 342 | else: 343 | (n,avg,std,min,max) = (0,0.,0.,0.,0.) 344 | print(" {:>7} {:>7.1f}s ± {:>6.1f} {:>5.1f} {:>7.1f}".format(n,avg,std,min,max), end="") 345 | if not omitFreq: 346 | (n,avg,std,min,max) = Freq.get() 347 | print(" {:>7} {:>8.3f} ± {:>6.3f} {:>8.3f} {:>8.3f}".format(n,avg,std,min,max), end="") 348 | if not omitPPT: 349 | if PPT is not None: 350 | (n,avg,std,min,max) = PPT.get() 351 | else: 352 | (n,avg,std,min,max) = (0,0.,0.,0.,0.) 
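# Devices that never produced a second (non-duplicate) transmission have no ITGT or PPT samples,
# so the zero-filled tuples above stand in for them and keep the report columns aligned.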
353 | print(" {:>7} {:>6.1f} ± {:>4.1f} {:>5.0f} {:>5.0f}".format(n,avg,std,min,max), end="") 354 | print() 355 | 356 | sys.exit(0) 357 | -------------------------------------------------------------------------------- /tools/rtl_json_csv: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | # rtl_json_csvn 3 | # Program to extract fields from an rtl_433 JSON log file 4 | # into a CSV file, with model+channel+id as device name keyword 5 | # Written by David Todd, hdtodd@gmail.com, 2023.03 6 | 7 | PROGNAME="rtl_json_csv" 8 | VERSION="2.0.0" 9 | 10 | import sys 11 | import argparse 12 | import fileinput 13 | import json 14 | from json.decoder import JSONDecodeError 15 | 16 | AP_DESCRIPTION=""" 17 | \tExtract specified fields and their associated values from an rtl_433 JSON log 18 | \tfile into a CSV-format file. 19 | \n\tThe first field of the output file is a device identifier constructed in the 20 | \tformat "model/channel/id", where those fields are the standard rtl_433 21 | \tdevice identifiers. The other fields follow, in CSV format. 22 | \n\tThe first line is a header line that identifies the fields by name. 23 | \n\tIf no output file is specified, output is directed to the controlling terminal. 24 | """ 25 | 26 | AP_EPILOG=""" 27 | Input file specifications may be wildcard or stdin (-). Files may be text, .gz, or .bz2. 28 | 29 | """ 30 | 31 | ########################################################################################## 32 | # Routine to reate the command parser, parse cmd line, and set defaults 33 | def make_parser(): 34 | 35 | parser = argparse.ArgumentParser(formatter_class=argparse.RawDescriptionHelpFormatter, 36 | description=AP_DESCRIPTION, 37 | epilog=AP_EPILOG) 38 | 39 | parser.add_argument("-i", "--input", metavar="FILE", nargs="*", 40 | help="Path to JSON log files to read in; can be .gz; can be wildcard; blank if ") 41 | parser.add_argument("-o", "--output", dest="out_file", 42 | help="Path to and name of CSV file to be created") 43 | parser.add_argument("-f", "--field", nargs="+", required=True, 44 | help="List of JSON field names and associated values to be extracted") 45 | parser.add_argument("-T", "--include_TPMS", action='store_true', 46 | dest="include_TPMS", default=False, 47 | help="include tire pressure monitors in output (default: %(default)s)") 48 | parser.add_argument("-d", "--debug", dest="debug", action="store_true") 49 | 50 | args = parser.parse_args() 51 | return args 52 | 53 | 54 | ########################################################################################## 55 | # main program 56 | 57 | args = make_parser() 58 | 59 | print(PROGNAME, VERSION, ":", AP_DESCRIPTION) 60 | print("Extracting records from JSON into CSV format into file ", end="") 61 | if args.out_file: 62 | csv = open(args.out_file, "w") 63 | print(args.out_file) 64 | else: 65 | csv = sys.stdout 66 | print("stdout") 67 | print("include_TPMS =", args.include_TPMS) 68 | 69 | lc = 0 # line count, for error reporting 70 | print("{:<40}".format("0Device_key,"), end="", file=csv) 71 | for f in args.field: 72 | print("{:>10}".format(f), end=",", file=csv) 73 | print(file=csv) 74 | with fileinput.FileInput(files=args.input, openhook=fileinput.hook_compressed, encoding="utf-8") as log: 75 | for line in log: 76 | lc += 1 77 | line = str.replace(line, "\x00", "", -1) 78 | if log.isfirstline(): 79 | print("Processing file", log.filename()) 80 | # unpack the record in JSON 81 | try: 82 | y = json.loads(line) 83 | except 
JSONDecodeError as e: 84 | print("JSON decode error at file line ", lc) 85 | print("Line contents:\n", line) 86 | err = {} 87 | print("JSON error msg:", err.get("error", str(e))) 88 | print("Or are there null characters in your input file?") 89 | print("Try sed 's/\\x0//g' oldfile > newfile to remove them") 90 | quit() 91 | except TypeError as e: 92 | print("JSON type error in file line", lc) 93 | print("Line contents:\n", line) 94 | err = {} 95 | print("JSON error msg:", err.get("error", str(e))) 96 | quit() 97 | 98 | # Ignore TPMS packets if told to do so 99 | if not args.include_TPMS and "type" in y and y["type"]=="TPMS": 100 | continue 101 | 102 | # If a status record, no device "model"; ignore 103 | if not "model" in y: 104 | continue 105 | else: 106 | # Does this record have a value for a field we want? 107 | have_a_value = False 108 | for f in args.field: 109 | have_a_value = have_a_value or f in y 110 | if not have_a_value: # no 111 | continue 112 | else: 113 | key = y["model"] + "/" 114 | if "channel" in y: 115 | key += str(y["channel"]) 116 | key += "/" 117 | if "id" in y: 118 | key += str(y["id"]) 119 | key += "," 120 | print("{:<40}".format(key), end="", file=csv) 121 | for f in args.field: 122 | print("{:>10}".format(" " if not f in y else y[f]), end=",", file=csv) 123 | print(file=csv) 124 | 125 | print("\n\n", lc, "lines processed", file=sys.stdout) 126 | quit() 127 | -------------------------------------------------------------------------------- /tools/rtl_xtract_json: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | # rtl_xtract_json 3 | # Program to extract records for a specific device from an 4 | # rtl_433 JSON log file into a separate file. 5 | # Records are identified by "model/channel/id" as device 6 | # name keyword 7 | # Written by David Todd, hdtodd@gmail.com, 2023.03 8 | 9 | PROGNAME="rtl_xtract_json" 10 | VERSION="1.0.0" 11 | 12 | import sys 13 | import argparse 14 | import fileinput 15 | import json 16 | from json.decoder import JSONDecodeError 17 | 18 | AP_DESCRIPTION=""" 19 | \tCopy records identified by a device identifier from an rtl_433 JSON log 20 | \tfile into a separate JSON file. 21 | \n\tThe device identifier is constructed as "model/channel/id", where those 22 | \tfields are the standard rtl_433 device identifiers. 23 | \n\tThe complete JSON record of lines that match that identifier are copied 24 | \tto the specified output file. 25 | 26 | """ 27 | 28 | AP_EPILOG=""" 29 | 30 | Input file specifications may be wildcard or stdin (-). Files may be text, .gz, or .bz2. 
31 | """ 32 | 33 | ########################################################################################## 34 | # Routine to reate the command parser, parse cmd line, and set defaults 35 | def make_parser(): 36 | 37 | parser = argparse.ArgumentParser(formatter_class=argparse.RawDescriptionHelpFormatter, 38 | description=AP_DESCRIPTION, 39 | epilog=AP_EPILOG) 40 | 41 | parser.add_argument("-d", "--device", dest="device", type=str, required=True, nargs="*", 42 | help="Identifier for the device to be copied in the format model/channel/id") 43 | parser.add_argument("-i", "--input", metavar="FILE", nargs="*", 44 | help="Path to JSON log files to read in; can be .gz; can be wildcard; blank if ") 45 | parser.add_argument("-o", "--output", dest="out_file", 46 | help="Path to and name of CSV file to be created") 47 | 48 | args = parser.parse_args() 49 | return args 50 | 51 | 52 | ########################################################################################## 53 | # main program 54 | 55 | args = make_parser() 56 | 57 | print(PROGNAME, VERSION, ":", AP_DESCRIPTION) 58 | print("Copying records for", args.device, "to file ", end="") 59 | 60 | if args.out_file: 61 | out = open(args.out_file, "w") 62 | print(args.out_file) 63 | else: 64 | out = sys.stdout 65 | print("stdout") 66 | 67 | lc = 0 # line count, for error reporting 68 | with fileinput.FileInput(files=args.input, openhook=fileinput.hook_compressed, encoding="utf-8") as log: 69 | for line in log: 70 | if log.isfirstline(): 71 | print("Processing file", log.filename()) 72 | lc += 1 73 | # strip any NUL chars from the line 74 | line = str.replace(line, "\x00", "", -1) 75 | # and unpack the rtl_433 JSON log record 76 | try: 77 | y = json.loads(line) 78 | except JSONDecodeError as e: 79 | print("JSON decode error at file line ", lc) 80 | print("Line contents:\n", line) 81 | err = {} 82 | print("JSON error msg:", err.get("error", str(e))) 83 | print("\nOr are there null characters in your input file?") 84 | print("On Linux, try sed \"s/\\x0//g' oldfile.json > newfile.json\" to remove them") 85 | print("On OSX/Posix, try \"tr -d '\\000' < oldfile.json > newfile.json\"") 86 | quit() 87 | except TypeError as e: 88 | print("JSON type error in file line", lc) 89 | print("Line contents:\n", line) 90 | err = {} 91 | print("JSON error msg:", err.get("error", str(e))) 92 | quit() 93 | 94 | # Ignore packets with no "model" field 95 | if not "model" in y: 96 | continue 97 | 98 | # Statements below make 'model'/'channel'/'id' the key for cataloging and summarizing 99 | # Change the following statements to experiment with other keys 100 | dev = y["model"] + "/" 101 | if "channel" in y: 102 | dev += str(y["channel"]) 103 | dev += "/" 104 | if "id" in y: 105 | dev += str(y["id"]) #use 'model'+'id' as unique key 106 | 107 | if dev in args.device: 108 | print(line, end="", file=out) 109 | 110 | print("\n\n", lc, "lines processed", file=sys.stdout) 111 | quit() 112 | -------------------------------------------------------------------------------- /xaa-output.prn: -------------------------------------------------------------------------------- 1 | rtl_433_stats: 2 | Analyze rtl_433 JSON logs to catalog the devices seen and to characterize 3 | statistically their signal-to-noise ratio (SNR), times between 4 | transmissions (ITGT),tradio frequency (Freq), and packets per transmission (PPT). 
5 | 6 | Processing ISM 433MHz messages recorded by rtl_433 7 | Including SNR Stats 8 | Including ITGT Stats 9 | Including Freq Stats 10 | Excluding TPMS devices 11 | Processing file xaa.json 12 | 13 | Processed 20000 Packets as 6952 De-Duplicated Transmissions in 0.22sec 14 | Packets dated from Thu 2022-06-09 07:08:27 to Thu 2022-06-09 19:46:16 15 | 16 | Signal-to-Noise Inter-Transmission Gap Time Frequency (MHz) Packets per Transmit 17 | Device model/channel/id _________________________________ _______________________________________ ____________________________________________ ____________________________________ 18 | #Pkts Mean ± 𝜎 Min Max #Gaps Mean ± 𝜎 Min Max #Pkts Mean ± 𝜎 Min Max #Xmits Mean ± 𝜎 Min Max 19 | Acurite-01185M/0/0 4 9.6 ± 4.9 6.4 16.9 3 3678.7s ± 2812.7 434.0 5425.0 4 433.911 ± 0.017 433.902 433.936 3 1.0 ± 0.0 1 1 20 | Acurite-606TX//134 858 8.4 ± 2.1 5.5 20.0 857 53.0s ± 139.2 30.0 2573.0 858 433.901 ± 0.009 433.863 433.962 857 1.0 ± 0.0 1 1 21 | Acurite-609TXC//194 8006 19.3 ± 0.5 12.3 21.2 1356 33.5s ± 0.7 33.0 49.0 8006 433.931 ± 0.002 433.922 433.950 1356 5.9 ± 0.4 2 6 22 | Acurite-Tower/A/11524 8203 19.2 ± 0.5 13.2 20.8 2752 16.5s ± 2.5 15.0 33.0 8203 433.950 ± 0.002 433.926 433.955 2752 3.0 ± 0.2 1 3 23 | LaCrosse-TX141Bv3/1/253 597 8.4 ± 1.3 5.7 19.2 347 109.9s ± 338.9 31.0 4216.0 597 433.904 ± 0.003 433.863 433.945 347 1.7 ± 0.5 1 2 24 | LaCrosse-TX141THBv2/0/168 1536 9.6 ± 1.1 6.0 19.2 837 54.2s ± 14.7 49.0 150.0 1536 433.961 ± 0.004 433.862 433.966 837 1.8 ± 0.4 1 2 25 | Markisol/0/0 39 19.1 ± 1.2 12.3 20.2 38 1053.5s ± 1532.1 33.0 6633.0 39 433.932 ± 0.002 433.928 433.936 38 1.0 ± 0.0 1 1 26 | Markisol/0/256 20 19.3 ± 0.4 18.5 20.2 19 2108.7s ± 3625.7 33.0 14070.0 20 433.931 ± 0.002 433.927 433.936 19 1.0 ± 0.0 1 1 27 | Markisol/1/0 36 19.2 ± 0.5 17.6 20.0 35 1009.8s ± 1550.0 67.0 6801.0 36 433.931 ± 0.002 433.927 433.934 35 1.0 ± 0.0 1 1 28 | Prologue-TH/2/203 699 11.6 ± 1.3 7.2 19.5 698 64.9s ± 32.7 52.0 477.0 699 433.864 ± 0.008 433.859 433.943 698 1.0 ± 0.0 1 1 29 | --------------------------------------------------------------------------------