├── .gitignore ├── EXAMPLE_APPS.md ├── LICENSE ├── README.md ├── RaspberryPi.md ├── apps ├── IF201812 │ ├── IF201812-Preprocess.ipynb │ ├── IF201812-Train-Eval-All.ipynb │ └── config.py ├── cnn-laser-machine-listener │ ├── CNN-LML-Another-Attempt-AlexNetBased.ipynb │ ├── CNN-LML-Preprocess-Data.ipynb │ ├── CNN-LML-TF-Model-Conversion.ipynb │ ├── CNN-LML-Train.ipynb │ ├── CNN-LML-Visualize.ipynb │ ├── README.md │ ├── cnn-alexbased-laser-machine-listener.pb │ ├── cnn-model-laser-machine-listener.pb │ ├── config.py │ ├── download.sh │ └── model_based_on_fsd.h5 ├── fsdkaggle2018 │ ├── FSDKaggle2018-TF-Model-Coversion.ipynb │ ├── FSDKaggle2018-Visualize.ipynb │ ├── config.py │ ├── train_alexbased.py │ └── train_mobilenetv2.py ├── fsdkaggle2018small │ ├── config.py │ ├── train_alexbased.py │ └── train_mobilenetv2.py └── urban-sound │ ├── UrbanSound8K-Overview&Preprocess.ipynb │ ├── UrbanSound8K-Train-AlexNet.ipynb │ ├── UrbanSound8K-Train-MobileNetV2.ipynb │ ├── config.py │ └── train.py ├── common.py ├── config.py ├── convert_keras_to_tf.py ├── deskwork_detector.py ├── ext └── download.sh ├── lib_train.py ├── model ├── alexbased_small_fsd2018_41cls.h5 ├── alexbased_small_fsd2018_41cls.pb ├── mobilenetv2_fsd2018_41cls.h5 ├── mobilenetv2_fsd2018_41cls.pb ├── mobilenetv2_fsd2018_41cls.txt └── mobilenetv2_small_fsd2018_41cls.h5 ├── premitive_file_predictor.py ├── realtime_predictor.py ├── rpi └── config.py ├── sample ├── desktop_sounds.wav └── fireworks.wav ├── sound_models.py ├── template_realtime_mels_viewer.py ├── test └── test_lib_train.py ├── title.jpg └── visualize.py /.gitignore: -------------------------------------------------------------------------------- 1 | # Byte-compiled / optimized / DLL files 2 | __pycache__/ 3 | *.py[cod] 4 | *$py.class 5 | [Xy]_*.npy 6 | 7 | # C extensions 8 | *.so 9 | 10 | # Distribution / packaging 11 | .Python 12 | build/ 13 | develop-eggs/ 14 | dist/ 15 | downloads/ 16 | eggs/ 17 | .eggs/ 18 | lib/ 19 | lib64/ 20 | parts/ 21 | sdist/ 22 | var/ 23 | wheels/ 24 | *.egg-info/ 25 | .installed.cfg 26 | *.egg 27 | MANIFEST 28 | 29 | # PyInstaller 30 | # Usually these files are written by a python script from a template 31 | # before PyInstaller builds the exe, so as to inject date/other infos into it. 
32 | *.manifest
33 | *.spec
34 | 
35 | # Installer logs
36 | pip-log.txt
37 | pip-delete-this-directory.txt
38 | 
39 | # Unit test / coverage reports
40 | htmlcov/
41 | .tox/
42 | .coverage
43 | .coverage.*
44 | .cache
45 | nosetests.xml
46 | coverage.xml
47 | *.cover
48 | .hypothesis/
49 | .pytest_cache/
50 | 
51 | # Translations
52 | *.mo
53 | *.pot
54 | 
55 | # Django stuff:
56 | *.log
57 | local_settings.py
58 | db.sqlite3
59 | 
60 | # Flask stuff:
61 | instance/
62 | .webassets-cache
63 | 
64 | # Scrapy stuff:
65 | .scrapy
66 | 
67 | # Sphinx documentation
68 | docs/_build/
69 | 
70 | # PyBuilder
71 | target/
72 | 
73 | # Jupyter Notebook
74 | .ipynb_checkpoints
75 | 
76 | # pyenv
77 | .python-version
78 | 
79 | # celery beat schedule file
80 | celerybeat-schedule
81 | 
82 | # SageMath parsed files
83 | *.sage.py
84 | 
85 | # Environments
86 | .env
87 | .venv
88 | env/
89 | venv/
90 | ENV/
91 | env.bak/
92 | venv.bak/
93 | 
94 | # Spyder project settings
95 | .spyderproject
96 | .spyproject
97 | 
98 | # Rope project settings
99 | .ropeproject
100 | 
101 | # mkdocs documentation
102 | /site
103 | 
104 | # mypy
105 | .mypy_cache/
106 | 
--------------------------------------------------------------------------------
/EXAMPLE_APPS.md:
--------------------------------------------------------------------------------
1 | # Example Applications and Training
2 | 
3 | The `apps` folder has example applications that explain the entire process of working with your own dataset.
4 | 
5 | ## Install Before Start Training
6 | 
7 | Run the following to download the Python modules needed for training.
8 | 
9 | ```sh
10 | cd ext
11 | ./download.sh
12 | ```
13 | 
14 | ## A. FSDKaggle2018
15 | 
16 | This trains a model for the Freesound Dataset Kaggle 2018 dataset.
17 | 
18 | - `mobilenetv2_fsd2018_41cls.h5` is a model trained for 500 epochs, almost competitive even in the Kaggle competition.
19 | - It has learned a useful representation for use as a pretrained model for other tasks.
20 | - 44.1 kHz, 1 second duration, 128 n_mels and 128 time hops.
21 | 
22 | The following will train the MobileNetV2 model:
23 | 
24 | ```sh
25 | cd apps/fsdkaggle2018
26 | python train_mobilenetv2.py
27 | ```
28 | 
29 | The following will train the AlexNet-based model:
30 | 
31 | ```sh
32 | cd apps/fsdkaggle2018
33 | python train_alexbased.py
34 | ```
35 | 
36 | Running the notebook below will convert the trained model to a TensorFlow .pb file; a minimal sketch of the underlying recipe follows.
37 | 
38 | - `FSDKaggle2018-TF-Model-Coversion.ipynb`
39 | 
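For reference, this is a minimal sketch of the standard Keras-to-TensorFlow "freeze graph" recipe that such a conversion is based on, assuming a Keras 2.2 / TensorFlow 1.x environment. The repository ships its own conversion code (`convert_keras_to_tf.py` and the conversion notebooks), so treat this only as an illustration; the model file name and the output `.pb` name are placeholders, and loading the repository's `.h5` files may need extra custom objects.

```python
# Hedged sketch: freeze a Keras .h5 model into a TensorFlow .pb graph (TF 1.x API).
import tensorflow as tf
from keras import backend as K
from keras.models import load_model
from tensorflow.python.framework import graph_util

K.set_learning_phase(0)                             # inference mode before building the graph
model = load_model('mobilenetv2_fsd2018_41cls.h5')  # placeholder: any trained .h5 model
sess = K.get_session()

# Bake the variables into constants so the graph file is self-contained.
frozen = graph_util.convert_variables_to_constants(
    sess, sess.graph.as_graph_def(), [out.op.name for out in model.outputs])
tf.train.write_graph(frozen, '.', 'converted_model.pb', as_text=False)

# Note the node names; they are needed when the .pb file is loaded for prediction.
print('inputs :', [t.op.name for t in model.inputs])
print('outputs:', [t.op.name for t in model.outputs])
```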
40 | ## B. FSDKaggle2018small
41 | 
42 | This trains a model for handling smaller audio data. The model itself is not smaller, but the audio is; the sampling rate is 16 kHz, for example. This example is less computationally expensive for audio processing.
43 | 
44 | - `mobilenetv2_small_fsd2018_41cls.h5` is the model created by this example.
45 | - 16 kHz, 1 second duration, 64 n_mels and 64 time hops.
46 | 
47 | The following will train the MobileNetV2 model:
48 | 
49 | ```sh
50 | cd apps/fsdkaggle2018small
51 | python train_mobilenetv2.py
52 | ```
53 | 
54 | The following will train the AlexNet-based model:
55 | 
56 | ```sh
57 | cd apps/fsdkaggle2018small
58 | python train_alexbased.py
59 | ```
60 | 
61 | ## C. CNN Laser Machine Listener
62 | 
63 | This is an experimental application example for [github/Laser Machine Listener](https://github.com/kotobuki/laser-machine-listener).
64 | It is a classification problem for the sounds in a hardware laboratory.
65 | 
66 | Originally a simple NN was applied successfully to this classification problem, so what if we apply a CNN?
67 | 
68 | The simple answer is that it is too much for the provided dataset as is.
69 | It might be effective if:
70 | 
71 | - We need a single model that works well in many different FabLabs. Then the model needs to generalize well.
72 | - And we _have enough data_ from a variety of FabLabs and machines.
73 | 
74 | This example has three notebooks that make it work.
75 | 
76 | 1. Run the following.
77 | ```sh
78 | cd apps/cnn-laser-machine-listener
79 | ./download.sh
80 | ```
81 | 2. Preprocess data with `CNN-LML-Preprocess-Data.ipynb`.
82 | 3. Train the model with `CNN-LML-Train.ipynb`.
83 | 4. Convert the model to .pb with `CNN-LML-TF-Model-Conversion.ipynb`.
84 | 5. Then you can predict in realtime as follows.
85 | ```sh
86 | python ../../realtime_predictor.py
87 | ```
88 | 
89 | `cnn-model-laser-machine-listener.pb` is ready in this repository for your quick try.
90 | 
91 | ### C.1 Another attempt for CNN Laser Machine Listener
92 | 
93 | An AlexNet-based model is also applied to this problem. It not only shows a better result, but also runs much faster.
94 | 
95 | - `CNN-LML-Another-Attempt-AlexNetBased.ipynb` is the notebook that shows how to train it and visualize the results.
96 | - `cnn-alexbased-laser-machine-listener.pb` is also ready for your quick try (a loading sketch follows below).
97 | 
98 | 
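To show what using such a frozen graph looks like outside of the provided scripts, here is a minimal, hedged sketch of loading a `.pb` file and running one prediction with the TensorFlow 1.x API. The tensor names and the all-zeros input are assumptions for illustration only; check the real node names of your converted model (for example by printing `[n.name for n in graph_def.node]`), and prefer `realtime_predictor.py` / `premitive_file_predictor.py` for actual use.

```python
# Hedged sketch: load a frozen .pb graph and run a single prediction (TF 1.x API).
import numpy as np
import tensorflow as tf

graph_def = tf.GraphDef()
with tf.gfile.GFile('cnn-alexbased-laser-machine-listener.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

graph = tf.Graph()
with graph.as_default():
    tf.import_graph_def(graph_def, name='')

# Assumed tensor names -- replace them with the names printed during conversion.
x = graph.get_tensor_by_name('input_1:0')
y = graph.get_tensor_by_name('output0:0')

with tf.Session(graph=graph) as sess:
    # One dummy sample shaped like this app's input: 64 n_mels x 64 time hops x 1 channel.
    mels = np.zeros((1, 64, 64, 1), dtype=np.float32)
    probs = sess.run(y, feed_dict={x: mels})
    print(probs.shape)  # (1, number of classes)
```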
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 | 
3 | Copyright (c) 2018 Daisuke Niizumi
4 | 
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 | 
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 | 
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 | 
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Machine Learning Sound Classifier for Live Audio
2 | 
3 | ![title](title.jpg)
4 | (Picture on the left: [Rembrandt - Portrait of an Evangelist Writing, cited from Wikipedia](https://commons.wikimedia.org/wiki/File:Rembrandt_-_Portrait_of_an_Evangelist_Writing.jpg))
5 | 
6 | This is a simple, fast, customizable machine learning sound classifier for live audio in realtime.
7 | 
8 | - MobileNetV2, a lightweight CNN model designed for mobile, is used.
9 | - (new) An AlexNet-based, lighter and faster CNN model is also provided.
10 | - A TensorFlow graph is used at runtime for faster prediction and for portability.
11 | - Freesound Dataset Kaggle 2018 pre-built models are ready.
12 | - (new) Raspberry Pi realtime inference is supported.
13 | - A sequence of predictions is ensembled (moving average) for stable results.
14 | - Samplewise normalization for robustness to level drift.
15 | - Easy-to-customize examples.
16 | 
17 | This project is a spin-off from my solution for the Kaggle competition ["Freesound General-Purpose Audio Tagging Challenge"](https://www.kaggle.com/c/freesound-audio-tagging), made so it can be tried in a real environment.
18 | 
19 | The competition itself provided a dataset as usual, so all the work there was done with the dataset's statically recorded samples.
20 | Unlike the competition, this project is developed as a working code example that records live audio and classifies it in realtime.
21 | 
22 | (new) An AlexNet-based lighter model has also been added, which makes it possible to run on a Raspberry Pi.
23 | 
24 | ### Application Pull Requests/Reports are Welcome!
25 | 
26 | If you have implemented this in your mobile app, or in any embedded app such as detecting a door opening/closing and sending a notification with a Raspberry Pi, I would really appreciate it if you could let me know. You can also submit a pull request with your application or post it as an issue here. You can also reach me via [twitter](https://twitter.com/nizumical).
27 | 
28 | ## 1. Running Classifier
29 | 
30 | ### Requirements
31 | 
32 | Install the following modules.
33 | 
34 | - Python 3.6.
35 | - Tensorflow, tested with 1.9.
36 | - Keras, tested with 2.2.0.
37 | - LibROSA, tested with 0.6.0.
38 | - PyAudio, tested with 0.2.11.
39 | - EasyDict, tested with 1.4.
40 | - Pandas, tested with 0.23.0.
41 | - Matplotlib, tested with 2.2.2.
42 | - imbalanced-learn (for training only), tested with 0.3.3.
43 | - (some others, to be updated)
44 | 
45 | ### Quick try
46 | 
47 | Simply running `realtime_predictor.py` will start listening to the default audio input and predicting. The following result is an example of typing on the computer keyboard of the laptop that was running it.
48 | You can see 'Bus' after many 'Computer_keyboard' predictions. 'Bus' shows up often with the usual home environmental noise, such as the sound of cars running on a nearby road, so it is expected.
49 | 
50 | ```
51 | $ python realtime_predictor.py
52 | Using TensorFlow backend.
53 | Fireworks 31 0.112626694
54 | Computer_keyboard 8 0.13604443
55 | Computer_keyboard 8 0.1880264
56 | Computer_keyboard 8 0.20889345
57 | Computer_keyboard 8 0.23447378
58 | Computer_keyboard 8 0.27483797
59 | Computer_keyboard 8 0.30053857
60 | Computer_keyboard 8 0.31214702
61 | Computer_keyboard 8 0.31657898
62 | Computer_keyboard 8 0.30582297
63 | Computer_keyboard 8 0.31986332
64 | Computer_keyboard 8 0.289629
65 | Computer_keyboard 8 0.22962372
66 | Computer_keyboard 8 0.20274992
67 | Computer_keyboard 8 0.2013374
68 | Computer_keyboard 8 0.1868646
69 | Computer_keyboard 8 0.17860062
70 | Computer_keyboard 8 0.17743647
71 | Computer_keyboard 8 0.18610674
72 | Bus 22 0.13272804
73 | Bus 22 0.14847495
74 | Bus 22 0.15111914
75 | Bus 22 0.16028993
76 | Bus 22 0.18327181
77 | Bus 22 0.1998493
78 | Bus 22 0.2240002
79 | Bus 22 0.2510223
80 | Bus 22 0.28142408
81 | Bus 22 0.32763514
82 | Bus 22 0.35531467
83 | :
84 | ```
85 | 
86 | ### Three example scripts for running the classifier
87 | 
88 | - realtime_predictor.py - This is the basic example; it predicts the 41 classes provided by the Freesound dataset.
89 | - deskwork_detector.py - This is a customization example that predicts 3 deskwork sounds only and uses emoji to express prediction probabilities.
90 | - premitive_file_predictor.py - This is a very simple example. No prediction ensemble is used; it predicts from a file only.
91 | 
92 | ## 2. Applications as training examples
93 | 
94 | [Example Applications](EXAMPLE_APPS.md) are provided for some datasets. They explain how to train models.
95 | 
96 | And `cnn-laser-machine-listener` is an end-to-end example that showcases training, converting the model to .pb, and using it for prediction in realtime.
97 | 
98 | Check [Example Applications](EXAMPLE_APPS.md) when you try to train a model on your own datasets.
99 | 
100 | ## 3. Pretrained models
101 | 
102 | The following pretrained models are provided. The 44.1kHz model is for high-quality sound classification, and the 16kHz models are for general-purpose sounds.
103 | 
104 | | File name | Dataset | Base Model | Audio input |
105 | |------------------------------------|---------------|-------------|-----|
106 | | mobilenetv2_fsd2018_41cls.h5 | FSDKaggle2018 | MobileNetV2 | 44.1kHz, 128 n_mels, 128 time hop |
107 | | mobilenetv2_small_fsd2018_41cls.h5 | FSDKaggle2018 | MobileNetV2 | 16kHz, 64 n_mels, 64 time hop |
108 | | alexbased_small_fsd2018_41cls.h5 | FSDKaggle2018 | AlexNet | 16kHz, 64 n_mels, 64 time hop |
109 | 
110 | About 9,000 samples from FSDKaggle2018 are used for training. 9,000 samples is not much compared with ImageNet, so we cannot expect these models to have a rich enough representation, but it is still better than starting from scratch on small problems.
111 | 
112 | ## 4. Raspberry Pi support
113 | 
114 | See [RaspberryPi.md](RaspberryPi.md) for details.
115 | 
116 | ## 5. Tuning example behaviors
117 | 
118 | The following are 3 important parameters with which you can fine-tune the program behavior for your environment.
119 | 
120 | ```
121 | conf.rt_process_count = 1
122 | conf.rt_oversamples = 10
123 | conf.pred_ensembles = 10
124 | ```
125 | 
126 | ### `conf.rt_oversamples`
127 | 
128 | This sets how many times per second raw audio capture happens.
129 | If this is 10, capture happens 10 times per second.
130 | Response will be quicker with a larger value, as long as your environment is powerful enough.
131 | 
132 | ### `conf.rt_process_count`
133 | 
134 | This sets how often audio conversion and the following prediction happen, counted in audio captures. 1 means audio conversion and prediction happen with every audio capture; if it is 2, prediction happens once per two audio captures.
135 | Response will also be quicker if you set this smaller, like 1, but prediction takes time, so you might need to set this larger if your machine is not so powerful.
136 | 
137 | On my MacBook Pro with a 4-core CPU, 1 is fine.
138 | 
139 | ### `conf.pred_ensembles`
140 | 
141 | This sets the number of predictions to ensemble.
142 | Past-to-present predictions are held in a FIFO, and this number is used as the size of that FIFO.
143 | The ensemble is done by calculating the geometric mean of all the predictions in the FIFO.
144 | 
145 | For example, if `conf.pred_ensembles = 5`, the past 4 predictions and the present prediction are used to calculate the present ensemble prediction result.
146 | 
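As a reference, the FIFO plus geometric-mean ensembling described above can be written in a few lines of NumPy. This is only an illustrative sketch, not the repository's exact code; the function name, the fixed 41-class example shape, and the small epsilon guard are assumptions made for the example.

```python
# Hedged sketch: geometric-mean ensemble over the most recent predictions.
from collections import deque
import numpy as np

pred_ensembles = 10                      # corresponds to conf.pred_ensembles
fifo = deque(maxlen=pred_ensembles)      # keeps only the latest N raw predictions

def ensemble(new_probs, eps=1e-12):
    """Push one prediction vector and return the geometric mean over the FIFO."""
    fifo.append(np.asarray(new_probs, dtype=np.float64))
    stacked = np.stack(fifo)                           # (n_kept, num_classes)
    return np.exp(np.log(stacked + eps).mean(axis=0))  # geometric mean per class

# Example with dummy 41-class prediction vectors:
for _ in range(3):
    smoothed = ensemble(np.random.dirichlet(np.ones(41)))
print(smoothed.argmax(), smoothed.max())
```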
147 | ## 6. Possible future applications
148 | 
149 | The following examples introduce the possible impact of applying this kind of classifier in your applications.
150 | 
151 | ### App #1 iPhone recording sound example: Fireworks detection
152 | 
153 | The sample `fireworks.wav` was recorded as a video on an iPhone during my trip to a local festival. The audio was cut out of the video with QuickTime.
154 | 
155 | So basically this is iPhone-recorded audio, the kind that should be available in your iPhone app. If you embed the prediction model, your app will understand that 'fireworks are going on'.
156 | 
157 | ```
158 | $ CUDA_VISIBLE_DEVICES= python premitive_file_predictor.py sample/fireworks.wav
159 | Using TensorFlow backend.
160 | Fireworks 0.7465853 161 | Fireworks 0.7307739 162 | Fireworks 0.9102228 163 | Fireworks 0.83495927 164 | Fireworks 0.8383171 165 | Fireworks 0.87180334 166 | Fireworks 0.67869043 167 | Fireworks 0.8273291 168 | Fireworks 0.9263128 169 | Fireworks 0.6631776 170 | Fireworks 0.2644468 171 | Laughter 0.2599133 172 | ``` 173 | 174 | ### App #2 PC recording sound example: Deskwork deteection 175 | 176 | Simple dedicated example is ready, `deskwork_detector.py` will detect 3 deskwork sounds; Writing, Using scissors and Computer keyboard typing. 177 | 178 | And a sample `desktop_sounds.wav` is recorded on my PC that captured sound of 3 deskwork actions in sequence. 179 | 180 | So this is the similar impact as iPhone example, we are capable of detecting specific action. 181 | 182 | ``` 183 | $ python deskwork_detector.py -f sample/desktop_sounds.wav 184 | Using TensorFlow backend. 185 | 📝 📝 📝 📝 📝 Writing 0.40303284 186 | 📝 📝 📝 📝 📝 📝 Writing 0.53238016 187 | 📝 📝 📝 📝 📝 📝 Writing 0.592431 188 | 📝 📝 📝 📝 📝 📝 Writing 0.5818318 189 | 📝 📝 📝 📝 📝 📝 📝 Writing 0.60172915 190 | 📝 📝 📝 📝 📝 📝 📝 Writing 0.6218055 191 | 📝 📝 📝 📝 📝 📝 📝 Writing 0.6176261 192 | 📝 📝 📝 📝 📝 📝 📝 Writing 0.6429986 193 | 📝 📝 📝 📝 📝 📝 📝 Writing 0.64290494 194 | 📝 📝 📝 📝 📝 📝 📝 Writing 0.6534221 195 | 📝 📝 📝 📝 📝 📝 📝 📝 Writing 0.70015717 196 | 📝 📝 📝 📝 📝 📝 📝 📝 Writing 0.7130502 197 | 📝 📝 📝 📝 📝 📝 📝 📝 Writing 0.72495073 198 | 📝 📝 📝 📝 📝 📝 📝 📝 Writing 0.75500214 199 | 📝 📝 📝 📝 📝 📝 📝 📝 Writing 0.7705893 200 | 📝 📝 📝 📝 📝 📝 📝 📝 Writing 0.7760113 201 | 📝 📝 📝 📝 📝 📝 📝 📝 Writing 0.79651505 202 | 📝 📝 📝 📝 📝 📝 📝 📝 Writing 0.7876685 203 | 📝 📝 📝 📝 📝 📝 📝 📝 📝 Writing 0.8026823 204 | 📝 📝 📝 📝 📝 📝 📝 📝 📝 Writing 0.80895096 205 | 📝 📝 📝 📝 📝 📝 📝 📝 📝 Writing 0.8053692 206 | 📝 📝 📝 📝 📝 📝 📝 📝 Writing 0.7975255 207 | 📝 📝 📝 📝 📝 📝 📝 📝 Writing 0.77262956 208 | 📝 📝 📝 📝 📝 📝 📝 📝 Writing 0.76053137 209 | 📝 📝 📝 📝 📝 📝 📝 📝 Writing 0.7428124 210 | 📝 📝 📝 📝 📝 📝 📝 📝 Writing 0.70416236 211 | 📝 📝 📝 📝 📝 📝 📝 Writing 0.6924045 212 | 📝 📝 📝 📝 📝 📝 📝 Writing 0.66129446 213 | 📝 📝 📝 📝 📝 📝 📝 Writing 0.6085751 214 | 📝 📝 📝 📝 📝 📝 Writing 0.5530443 215 | 📝 📝 📝 📝 📝 📝 Writing 0.50439316 216 | 📝 📝 📝 📝 Writing 0.36155683 217 | 📝 📝 📝 Writing 0.25736108 218 | 📝 📝 Writing 0.19337422 219 | ⌨ ⌨ Computer_keyboard 0.17075704 220 | ⌨ ⌨ ⌨ Computer_keyboard 0.2984399 221 | ⌨ ⌨ ⌨ ⌨ ⌨ Computer_keyboard 0.42496625 222 | ⌨ ⌨ ⌨ ⌨ ⌨ ⌨ Computer_keyboard 0.5532899 223 | ⌨ ⌨ ⌨ ⌨ ⌨ ⌨ ⌨ Computer_keyboard 0.64340276 224 | ⌨ ⌨ ⌨ ⌨ ⌨ ⌨ ⌨ Computer_keyboard 0.6802248 225 | ⌨ ⌨ ⌨ ⌨ ⌨ ⌨ ⌨ Computer_keyboard 0.66601527 226 | ⌨ ⌨ ⌨ ⌨ ⌨ ⌨ ⌨ Computer_keyboard 0.62935054 227 | ⌨ ⌨ ⌨ ⌨ ⌨ ⌨ Computer_keyboard 0.58397347 228 | ⌨ ⌨ ⌨ ⌨ ⌨ ⌨ Computer_keyboard 0.526138 229 | ⌨ ⌨ ⌨ ⌨ ⌨ Computer_keyboard 0.4632419 230 | ⌨ ⌨ ⌨ ⌨ ⌨ Computer_keyboard 0.42490804 231 | ⌨ ⌨ ⌨ ⌨ ⌨ Computer_keyboard 0.40419492 232 | ⌨ ⌨ ⌨ ⌨ Computer_keyboard 0.38645235 233 | ⌨ ⌨ ⌨ ⌨ Computer_keyboard 0.36492997 234 | ⌨ ⌨ ⌨ ⌨ Computer_keyboard 0.33910704 235 | ⌨ ⌨ ⌨ ⌨ Computer_keyboard 0.3251212 236 | ⌨ ⌨ ⌨ ⌨ Computer_keyboard 0.30858216 237 | ⌨ ⌨ ⌨ Computer_keyboard 0.27181208 238 | ⌨ ⌨ ⌨ Computer_keyboard 0.263743 239 | ⌨ ⌨ ⌨ Computer_keyboard 0.23821306 240 | ⌨ ⌨ ⌨ Computer_keyboard 0.20494641 241 | ⌨ ⌨ Computer_keyboard 0.12053269 242 | ⌨ ⌨ Computer_keyboard 0.15198663 243 | ✁ ✁ Scissors 0.16703679 244 | ✁ ✁ ✁ Scissors 0.21830729 245 | ✁ ✁ ✁ Scissors 0.28535327 246 | ✁ ✁ ✁ ✁ Scissors 0.3840115 247 | ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.52598876 248 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.6164208 249 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.6412893 250 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.7195315 251 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ ✁ 
Scissors 0.7480882
252 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.76995486
253 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.7952656
254 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.79389054
255 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.8043309
256 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.79848546
257 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.801064
258 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.79365206
259 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.7864271
260 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.71144533
261 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.70292735
262 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.6741176
263 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.6504746
264 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.6561931
265 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.6303668
266 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.6297095
267 | ✁ ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.6140897
268 | ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.57456887
269 | ✁ ✁ ✁ ✁ ✁ ✁ Scissors 0.5549676
270 | ```
271 | 
272 | ## Future works
273 | 
274 | - Documenting in more detail how to train on your own dataset
--------------------------------------------------------------------------------
/RaspberryPi.md:
--------------------------------------------------------------------------------
1 | # Running on Raspberry Pi
2 | 
3 | This implementation makes full use of the CPU and is not specialized to use the GPU. You will need a Raspberry Pi 3.
4 | 
5 | ## Install modules
6 | 
7 | The following is my installation log; you should be able to install the same way on a fresh Raspbian image.
8 | 
9 | ```sh
10 | sudo apt-get install python3-numpy
11 | sudo apt-get install python3-scipy
12 | sudo apt-get install python3-pandas
13 | sudo apt-get install python3-h5py
14 | sudo apt install libatlas-base-dev
15 | sudo apt-get install python3-pyaudio
16 | sudo pip3 install resampy==0.1.5 librosa==0.5.1
17 | pip3 install tensorflow
18 | pip3 install keras
19 | pip3 install easydict
20 | git clone https://github.com/daisukelab/ml-sound-classifier.git
21 | cd ml-sound-classifier/ext
22 | ./download.sh
23 | ```
24 | 
25 | Tensorflow 1.9 will be installed as of this writing.
26 | 
27 | LibROSA has to be the version above; thanks to [the web article (in Japanese) "Installing librosa on Raspbian Stretch with Desktop @Qiita"](https://qiita.com/mayfair/items/92874e69ba63378f6280).
28 | 
29 | You can confirm the installation by running off-line inference as follows:
30 | 
31 | ```sh
32 | cd ml-sound-classifier
33 | python3 premitive_file_predictor.py sample/fireworks.wav
34 | ```
35 | 
36 | ** This uses the MobileNetV2 model, and it is not fast enough for realtime use.
37 | 
38 | ## Run realtime inference
39 | 
40 | ```sh
41 | cd ml-sound-classifier/rpi
42 | python3 ../realtime_predictor.py -i 2
43 | ```
44 | 
45 | The `rpi` folder has an appropriate `config.py` for using the `alexbased_small_fsd2018_41cls.h5` pre-trained model.
46 | 
47 | - Results are output at a 0.5 s interval; `conf.rt_process_count / conf.rt_oversamples = 5 / 10 = 0.5`
48 | - It still calculates an ensemble of 10 predictions.
49 | 
50 | 
--------------------------------------------------------------------------------
/apps/IF201812/IF201812-Preprocess.ipynb:
--------------------------------------------------------------------------------
1 | {
2 |  "cells": [
3 |   {
4 |    "cell_type": "markdown",
5 |    "metadata": {},
6 |    "source": [
7 |     "# Interface 2018/12 Snack sound classification\n",
8 |     "\n",
9 |     "Classification task of 6 snack bag shaking sounds.\n",
10 |     "\n",
11 |     "https://shop.cqpub.co.jp/hanbai/books/MIF/MIF201812.html\n",
12 |     "\n",
13 |     "This notebook downloads, converts, and preprocesses the dataset."
14 | ] 15 | }, 16 | { 17 | "cell_type": "code", 18 | "execution_count": 1, 19 | "metadata": {}, 20 | "outputs": [ 21 | { 22 | "name": "stdout", 23 | "output_type": "stream", 24 | "text": [ 25 | "{'sampling_rate': 44100, 'duration': 2, 'hop_length': 694, 'fmin': 20, 'fmax': 22050, 'n_mels': 128, 'n_fft': 2560, 'model': 'mobilenetv2', 'labels': ['babystar', 'bbq', 'corn', 'kappaebi', 'potechi', 'vegetable'], 'folder': PosixPath('.'), 'n_fold': 1, 'valid_limit': None, 'random_state': 42, 'test_size': 0.2, 'samples_per_file': 1, 'batch_size': 32, 'learning_rate': 0.0001, 'metric_save_ckpt': 'val_acc', 'epochs': 100, 'verbose': 2, 'best_weight_file': 'best_model_weight.h5', 'rt_process_count': 1, 'rt_oversamples': 10, 'pred_ensembles': 10, 'runtime_model_file': None, 'label2int': {'babystar': 0, 'bbq': 1, 'corn': 2, 'kappaebi': 3, 'potechi': 4, 'vegetable': 5}, 'num_classes': 6, 'samples': 88200, 'rt_chunk_samples': 4410, 'mels_onestep_samples': 4410, 'mels_convert_samples': 92610, 'dims': [128, 128, 1], 'metric_save_mode': 'auto', 'logdir': 'logs', 'data_balancing': 'over_sampling', 'X_train': 'X_train.npy', 'y_train': 'y_train.npy', 'X_test': 'X_test.npy', 'y_test': 'y_test.npy', 'steps_per_epoch_limit': None, 'aug_mixup_alpha': 1.0, 'eval_ensemble': True, 'what_is_sample': 'log mel-spectrogram', 'use_audio_training_model': True}\n" 26 | ] 27 | }, 28 | { 29 | "name": "stderr", 30 | "output_type": "stream", 31 | "text": [ 32 | "Using TensorFlow backend.\n" 33 | ] 34 | } 35 | ], 36 | "source": [ 37 | "import sys\n", 38 | "sys.path.append('../..')\n", 39 | "from lib_train import *\n", 40 | "%matplotlib inline\n", 41 | "\n", 42 | "DATAROOT = Path('Snack/data')" 43 | ] 44 | }, 45 | { 46 | "cell_type": "markdown", 47 | "metadata": {}, 48 | "source": [ 49 | "## Prepare dataset" 50 | ] 51 | }, 52 | { 53 | "cell_type": "code", 54 | "execution_count": 7, 55 | "metadata": {}, 56 | "outputs": [ 57 | { 58 | "name": "stdout", 59 | "output_type": "stream", 60 | "text": [ 61 | "--2018-10-30 14:44:06-- http://www.cqpub.co.jp/interface/download/2018/12/Snack-data.zip\n", 62 | "Resolving www.cqpub.co.jp (www.cqpub.co.jp)... 219.101.148.16\n", 63 | "Connecting to www.cqpub.co.jp (www.cqpub.co.jp)|219.101.148.16|:80... connected.\n", 64 | "HTTP request sent, awaiting response... 200 OK\n", 65 | "Length: 1008569782 (962M) [application/x-zip-compressed]\n", 66 | "Saving to: ‘Snack-data.zip’\n", 67 | "\n", 68 | "Snack-data.zip 100%[===================>] 961.85M 9.36MB/s in 1m 46s \n", 69 | "\n", 70 | "2018-10-30 14:45:52 (9.08 MB/s) - ‘Snack-data.zip’ saved [1008569782/1008569782]\n", 71 | "\n" 72 | ] 73 | } 74 | ], 75 | "source": [ 76 | "! wget http://www.cqpub.co.jp/interface/download/2018/12/Snack-data.zip" 77 | ] 78 | }, 79 | { 80 | "cell_type": "code", 81 | "execution_count": 8, 82 | "metadata": {}, 83 | "outputs": [ 84 | { 85 | "name": "stdout", 86 | "output_type": "stream", 87 | "text": [ 88 | "Listing sub folders:\n", 89 | "jack laptop smartphone usb voicerecorder\n", 90 | "Listing some of files:\n", 91 | "babystar_B_1_0000.wav\n", 92 | "babystar_B_1_0001.wav\n", 93 | "babystar_B_1_0002.wav\n", 94 | "babystar_B_1_0003.wav\n", 95 | "babystar_B_1_0004.wav\n", 96 | "babystar_B_1_0005.wav\n", 97 | "babystar_B_1_0006.wav\n", 98 | "babystar_B_1_0007.wav\n", 99 | "babystar_B_1_0008.wav\n", 100 | "babystar_B_1_0009.wav\n", 101 | "ls: write error: Broken pipe\n" 102 | ] 103 | } 104 | ], 105 | "source": [ 106 | "! unzip -q Snack-data.zip\n", 107 | "! rm Snack-data.zip\n", 108 | "! 
echo Listing sub folders:\n", 109 | "! ls Snack/data\n", 110 | "! echo Listing some of files:\n", 111 | "! ls Snack/data/jack | head" 112 | ] 113 | }, 114 | { 115 | "cell_type": "code", 116 | "execution_count": 2, 117 | "metadata": {}, 118 | "outputs": [ 119 | { 120 | "name": "stdout", 121 | "output_type": "stream", 122 | "text": [ 123 | "Classes ['babystar', 'bbq', 'corn', 'kappaebi', 'potechi', 'vegetable']\n" 124 | ] 125 | } 126 | ], 127 | "source": [ 128 | "files = list(DATAROOT.glob('*/*[0-9].wav'))\n", 129 | "X_files = [str(f) for f in files]\n", 130 | "y_labels = [str(f.name).split('_')[0] for f in files]\n", 131 | "print('Classes', sorted(list(set(y_labels))))\n", 132 | "y = [conf.label2int[y] for y in y_labels]" 133 | ] 134 | }, 135 | { 136 | "cell_type": "code", 137 | "execution_count": 3, 138 | "metadata": {}, 139 | "outputs": [ 140 | { 141 | "data": { 142 | "text/plain": [ 143 | "((4799, 128, 128, 1), (4799,))" 144 | ] 145 | }, 146 | "execution_count": 3, 147 | "metadata": {}, 148 | "output_type": "execute_result" 149 | } 150 | ], 151 | "source": [ 152 | "# Train\n", 153 | "train_files = sorted([f for f in files if f.stem[-6] in '1'])\n", 154 | "y_labels = [str(f.name).split('_')[0] for f in train_files]\n", 155 | "y_train = [conf.label2int[y] for y in y_labels]\n", 156 | "XX = mels_build_multiplexed_X(conf, train_files)\n", 157 | "X_train, y_train = mels_demux_XX_y(XX, y_train)\n", 158 | "np.save('X_train.npy', X_train)\n", 159 | "np.save('y_train.npy', y_train)\n", 160 | "X_train.shape, y_train.shape" 161 | ] 162 | }, 163 | { 164 | "cell_type": "code", 165 | "execution_count": 4, 166 | "metadata": {}, 167 | "outputs": [ 168 | { 169 | "data": { 170 | "text/plain": [ 171 | "((1206, 128, 128, 1), (1206,))" 172 | ] 173 | }, 174 | "execution_count": 4, 175 | "metadata": {}, 176 | "output_type": "execute_result" 177 | } 178 | ], 179 | "source": [ 180 | "# Valid\n", 181 | "valid_files = sorted([f for f in files if f.stem[-6] in '2'])\n", 182 | "y_labels = [str(f.name).split('_')[0] for f in valid_files]\n", 183 | "y_valid = [conf.label2int[y] for y in y_labels]\n", 184 | "\n", 185 | "XX = mels_build_multiplexed_X(conf, valid_files)\n", 186 | "X_valid, y_valid = mels_demux_XX_y(XX, y_valid)\n", 187 | "np.save('X_valid.npy', X_valid)\n", 188 | "np.save('y_valid.npy', y_valid)\n", 189 | "X_valid.shape, y_valid.shape" 190 | ] 191 | }, 192 | { 193 | "cell_type": "code", 194 | "execution_count": 5, 195 | "metadata": {}, 196 | "outputs": [ 197 | { 198 | "data": { 199 | "text/plain": [ 200 | "['jack', 'jack', 'jack', 'jack', 'jack']" 201 | ] 202 | }, 203 | "execution_count": 5, 204 | "metadata": {}, 205 | "output_type": "execute_result" 206 | } 207 | ], 208 | "source": [ 209 | "# Valid - list of mic for each samples\n", 210 | "mic_valid = [str(f.parent.name) for f in valid_files]\n", 211 | "np.save('mic_valid.npy', mic_valid)\n", 212 | "mic_valid[:5]" 213 | ] 214 | }, 215 | { 216 | "cell_type": "code", 217 | "execution_count": null, 218 | "metadata": { 219 | "collapsed": true 220 | }, 221 | "outputs": [], 222 | "source": [] 223 | } 224 | ], 225 | "metadata": { 226 | "kernelspec": { 227 | "display_name": "Python 3", 228 | "language": "python", 229 | "name": "python3" 230 | }, 231 | "language_info": { 232 | "codemirror_mode": { 233 | "name": "ipython", 234 | "version": 3 235 | }, 236 | "file_extension": ".py", 237 | "mimetype": "text/x-python", 238 | "name": "python", 239 | "nbconvert_exporter": "python", 240 | "pygments_lexer": "ipython3", 241 | "version": "3.6.2" 242 | } 243 | }, 244 | 
"nbformat": 4, 245 | "nbformat_minor": 2 246 | } 247 | -------------------------------------------------------------------------------- /apps/IF201812/config.py: -------------------------------------------------------------------------------- 1 | # Interface 2018/12 sound classification example for shaking snack 2 | # Application configurations 3 | 4 | from easydict import EasyDict 5 | 6 | conf = EasyDict() 7 | 8 | # Basic configurations 9 | conf.sampling_rate = 44100 10 | conf.duration = 2 11 | conf.hop_length = 347*2 # to make time steps 128 12 | conf.fmin = 20 13 | conf.fmax = conf.sampling_rate // 2 14 | conf.n_mels = 128 15 | conf.n_fft = conf.n_mels * 20 16 | conf.model = 'mobilenetv2' # 'alexnet' 17 | 18 | # Labels 19 | conf.labels = ['babystar', 'bbq', 'corn', 'kappaebi', 'potechi', 'vegetable'] 20 | 21 | # Training configurations 22 | conf.folder = '.' 23 | conf.n_fold = 1 24 | conf.valid_limit = None 25 | conf.random_state = 42 26 | conf.test_size = 0.2 27 | conf.samples_per_file = 1 28 | conf.batch_size = 32 29 | conf.learning_rate = 0.0001 30 | conf.metric_save_ckpt = 'val_acc' 31 | conf.epochs = 100 32 | conf.verbose = 2 33 | conf.best_weight_file = 'best_model_weight.h5' 34 | 35 | # Runtime conficurations 36 | conf.rt_process_count = 1 37 | conf.rt_oversamples = 10 38 | conf.pred_ensembles = 10 39 | conf.runtime_model_file = None # 'model/mobilenetv2_fsd2018_41cls.pb' 40 | -------------------------------------------------------------------------------- /apps/cnn-laser-machine-listener/CNN-LML-Preprocess-Data.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# CNN version Laser Machine Listener: Preprocess\n", 8 | "\n", 9 | "Run `download.sh` before starting this, to download dataset." 
10 | ] 11 | }, 12 | { 13 | "cell_type": "code", 14 | "execution_count": 1, 15 | "metadata": {}, 16 | "outputs": [ 17 | { 18 | "name": "stdout", 19 | "output_type": "stream", 20 | "text": [ 21 | "{'sampling_rate': 16000, 'duration': 1, 'hop_length': 253, 'fmin': 20, 'fmax': 8000, 'n_mels': 64, 'n_fft': 1280, 'labels': ['background', 'cutting_in_focus', 'cutting_not_in_focus', 'marking', 'sleeping', 'waiting'], 'folder': PosixPath('.'), 'n_fold': 1, 'normalize': 'samplewise', 'valid_limit': None, 'random_state': 42, 'samples_per_file': 30, 'test_size': 0.2, 'batch_size': 32, 'learning_rate': 0.0001, 'epochs': 50, 'verbose': 2, 'best_weight_file': 'best_model_weight.h5', 'eval_ensemble': False, 'rt_process_count': 1, 'rt_oversamples': 10, 'pred_ensembles': 10, 'runtime_model_file': 'cnn-model-laser-machine-listener.pb', 'label2int': {'background': 0, 'cutting_in_focus': 1, 'cutting_not_in_focus': 2, 'marking': 3, 'sleeping': 4, 'waiting': 5}, 'num_classes': 6, 'samples': 16000, 'rt_chunk_samples': 1600, 'mels_onestep_samples': 1600, 'mels_convert_samples': 17600, 'dims': [64, 64, 1], 'model': 'mobilenetv2', 'metric_save_ckpt': 'val_loss', 'metric_save_mode': 'auto', 'logdir': 'logs', 'data_balancing': 'over_sampling', 'X_train': 'X_train.npy', 'y_train': 'y_train.npy', 'X_test': 'X_test.npy', 'y_test': 'y_test.npy', 'steps_per_epoch_limit': None, 'aug_mixup_alpha': 1.0, 'what_is_sample': 'log mel-spectrogram', 'use_audio_training_model': True}\n" 22 | ] 23 | }, 24 | { 25 | "name": "stderr", 26 | "output_type": "stream", 27 | "text": [ 28 | "Using TensorFlow backend.\n" 29 | ] 30 | } 31 | ], 32 | "source": [ 33 | "import sys\n", 34 | "sys.path.append('../..')\n", 35 | "from lib_train import *\n", 36 | "%matplotlib inline" 37 | ] 38 | }, 39 | { 40 | "cell_type": "markdown", 41 | "metadata": {}, 42 | "source": [ 43 | "## Make a list of training samples\n", 44 | "\n", 45 | "As listed below, this dataset is quite small:\n", 46 | "\n", 47 | "- 6 classes\n", 48 | "- 12 audio samples\n", 49 | "\n", 50 | "And this dataset is following basic structure of Keras classification project; `yourdataset/classname/filename.wav`." 
51 | ] 52 | }, 53 | { 54 | "cell_type": "code", 55 | "execution_count": 3, 56 | "metadata": {}, 57 | "outputs": [ 58 | { 59 | "data": { 60 | "text/plain": [ 61 | "[PosixPath('laser-machine-listener/data/sleeping/sleeping.wav'),\n", 62 | " PosixPath('laser-machine-listener/data/background/background.wav'),\n", 63 | " PosixPath('laser-machine-listener/data/marking/acrylic_marking.wav'),\n", 64 | " PosixPath('laser-machine-listener/data/marking/paper_marking.wav'),\n", 65 | " PosixPath('laser-machine-listener/data/marking/mdf_marking.wav'),\n", 66 | " PosixPath('laser-machine-listener/data/waiting/waiting.wav'),\n", 67 | " PosixPath('laser-machine-listener/data/cutting_not_in_focus/paper_cutting_not_in_focus.wav'),\n", 68 | " PosixPath('laser-machine-listener/data/cutting_not_in_focus/mdf_cutting_not_in_focus.wav'),\n", 69 | " PosixPath('laser-machine-listener/data/cutting_not_in_focus/acrylic_cutting_not_in_focus.wav'),\n", 70 | " PosixPath('laser-machine-listener/data/cutting_in_focus/acrylic_cutting_in_focus.wav'),\n", 71 | " PosixPath('laser-machine-listener/data/cutting_in_focus/mdf_cutting_in_focus.wav'),\n", 72 | " PosixPath('laser-machine-listener/data/cutting_in_focus/paper_cutting_in_focus.wav')]" 73 | ] 74 | }, 75 | "execution_count": 3, 76 | "metadata": {}, 77 | "output_type": "execute_result" 78 | } 79 | ], 80 | "source": [ 81 | "DATAROOT = Path('laser-machine-listener/data')\n", 82 | "data_files = list(DATAROOT.glob('*/*.wav'))\n", 83 | "data_files" 84 | ] 85 | }, 86 | { 87 | "cell_type": "markdown", 88 | "metadata": {}, 89 | "source": [ 90 | "## Preprocess\n", 91 | "\n", 92 | "Data preprocessing follows these steps:\n", 93 | "\n", 94 | "1. `mels_build_multiplexed_X()` converts `classname/filename.wav` style raw audio as log mel-spectrogram, generate training data samples. Output is multiplexed training set denoted by XX.\n", 95 | "2. The multiplexed set is once flattened to normal training set (usually denoted by X), then divided into train/test split.\n", 96 | "3. Save as files.\n", 97 | "\n", 98 | "In this case, there are only 12 original audio samples. Then 288 training & 72 validation samples are created.\n", 99 | "\n", 100 | "- One sample duration is 1 second which is defined by config.py as conf.duration.\n", 101 | "- One sample is a three dimentional, [64 n_mels, 64 time hops, 1 channel] vector.\n", 102 | "\n", 103 | "### This preprocessing intentionally leaks data\n", 104 | "\n", 105 | "Usually train/test split shall be very clean, test set is supposed be 'unseen'.\n", 106 | "\n", 107 | "But this data set is a special case, all classes has only one sample from sound variations.\n", 108 | "Even though some class have 3 samples but these are ones from all different variations.\n", 109 | "\n", 110 | "Then there's no way collecting train/test samples from different audio files. The step 3 above shuffles data." 111 | ] 112 | }, 113 | { 114 | "cell_type": "code", 115 | "execution_count": 4, 116 | "metadata": {}, 117 | "outputs": [ 118 | { 119 | "name": "stdout", 120 | "output_type": "stream", 121 | "text": [ 122 | "Set labels to config.py; conf.labels = ['background', 'cutting_in_focus', 'cutting_not_in_focus', 'marking', 'sleeping', 'waiting']\n", 123 | "Training set has 288 samples. X shape is (288, 64, 64, 1)\n", 124 | "Test set has 72 samples. X shape is (72, 64, 64, 1)\n" 125 | ] 126 | } 127 | ], 128 | "source": [ 129 | "# 1. 
generate XX and y\n", 130 | "XX = mels_build_multiplexed_X(conf, data_files)\n", 131 | "y, _, _ = generate_y_from_pathname(data_files)\n", 132 | "\n", 133 | "# 2. flatten (= unseen data is not available) & shuffle by train_test_split\n", 134 | "X, y = mels_demux_XX_y(XX, y) # demultiplex = flatten\n", 135 | "X_train, X_test, y_train, y_test = \\\n", 136 | " train_test_split(X, y, test_size=conf.test_size, random_state=conf.random_state)\n", 137 | "\n", 138 | "# 3. write all sets as files\n", 139 | "np.save('X_train.npy', X_train)\n", 140 | "np.save('y_train.npy', y_train)\n", 141 | "np.save('X_test.npy', X_test)\n", 142 | "np.save('y_test.npy', y_test)\n", 143 | "\n", 144 | "print('Training set has {} samples. X shape is {}'.format(len(X_train), X_train.shape))\n", 145 | "print('Test set has {} samples. X shape is {}'.format(len(X_test), X_test.shape))" 146 | ] 147 | }, 148 | { 149 | "cell_type": "markdown", 150 | "metadata": {}, 151 | "source": [ 152 | "### Look inside" 153 | ] 154 | }, 155 | { 156 | "cell_type": "code", 157 | "execution_count": 5, 158 | "metadata": {}, 159 | "outputs": [ 160 | { 161 | "data": { 162 | "text/plain": [ 163 | "[(12, 30, 64, 64, 1), (360,), (288, 64, 64, 1), (72, 64, 64, 1), (288,), (72,)]" 164 | ] 165 | }, 166 | "execution_count": 5, 167 | "metadata": {}, 168 | "output_type": "execute_result" 169 | } 170 | ], 171 | "source": [ 172 | "[a.shape for a in [XX, y, X_train, X_test, y_train, y_test]]" 173 | ] 174 | }, 175 | { 176 | "cell_type": "code", 177 | "execution_count": 6, 178 | "metadata": {}, 179 | "outputs": [ 180 | { 181 | "data": { 182 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAZMAAAEaCAYAAADUo7pxAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAAIABJREFUeJzsnXmUVOWZ/7+39q27uqsbaDZBFGUT\nSWz3jIKShNEkQ9RxDOqcGDxq3MnozyZxjVFJIqKJe0IkxyUT44moJ27jgkrQgBF0ZHEHN6Ch9+7a\nq+7vD4amn+97qeqmuhvsfj7ncA636tZ73/vet/qt9/tslm3bNhRFURSlBFx7uwOKoijKVx9dTBRF\nUZSS0cVEURRFKRldTBRFUZSS0cVEURRFKRldTBRFUZSS0cVE2eexLAsPPfRQv19348aNsCwLy5cv\n7/drK8pXDc/e7oCi7KuMHj0amzdvRlVV1d7uiqLs8+hioii7we12o6amZm93Q1G+EqjMpewTLF++\nHMceeyzKyspQVlaGQw89FM8995zjue3t7bjsssswcuRIhEIhfO1rX8Nf//pXcc7WrVvxwx/+EEOG\nDEFZWRmOPfZYvPrqq53vL1u2DJZl4amnnsIRRxyBQCCAKVOm4KWXXuo8h2WuncePPvoovvOd7yAU\nCmHcuHFYsmSJuPYnn3yCb33rWwgEAhg9ejTuuusuTJ8+Heeee24vjZai7HvoYqLsdbLZLL73ve/h\nyCOPxFtvvYW33noL119/PUKhkHGubdv47ne/i7fffht//vOf8e677+LHP/4xzjjjDLz44osAgEQi\ngRkzZqCtrQ3PPPMMVq9ejZNOOgnf/OY3sX79etHeT37yE1x77bVYvXo1jjzySHz3u9/F5s2bC/a3\nrq4O//mf/4l33nkHZ5xxBs4991y8//77nf37/ve/j5aWFrz66qt46qmn8Le//Q2rV6/updFSlH0U\nW1H2Mo2NjTYA++WXX3Z8H4D94IMP2rZt2y+//LLt9/vt5uZmcc4555xj/9u//Ztt27b9wAMP2CNH\njrQzmYw4Z8aMGfZll13W2Q4A+/e//33n+5lMxt5vv/3sq6++2rZt2/7kk09sAPZrr70mjhcuXNj5\nmWw2a0ciEfvee++1bdu2n3/+eRuA/cEHH3Se09DQYAeDQXvu3Lk9HhtF+aqgNhNlr1NZWYlzzz0X\n3/72t3HCCSfg+OOPx/e//30cfPDBxrmrVq1COp3GyJEjxevpdBrjx4/vPGfLli2oqKgQ56RSKQSD\nQfHa0Ucf3fl/j8eDI444AmvXri3Y32nTpnX+3+12Y+jQodi6dSsAYN26daiursaBBx7YeU4sFnO8\nF0UZSOhiouwT/O53v8Nll12G559/Hv/zP/+Da665BnfeeSfOP/98cV4+n0c0GsWqVauMNnw+X+c5\nEydOxOOPP26c4ySd9ZSd19mJZVnI5/PiWFEGG2ozUfYZpkyZgp/85Cd45plnMHfuXNx///3GObW1\ntWhubkYymcSBBx4o/u23336d53z88ccoLy83zhkxYoRo74033uj8fzabxcqVKzFp0qQ9vodJkyZh\n27Zt+Oijjzpfa2pq6rSpKMpARRcTZa/z4Ycf4qqrrsLy5cuxadMmvP7663jttdcc/6ifcMIJmDlz\nJk455RQsXboUH3/8Mf75z3/it7/9LX73u98BAM4880zsv//+OPnkk/H8889j48aN+Mc//oFbbrkF\nS5cuFe0tWLAATz/9NNavX48f//jH2LZtGy688MI9vpeZM2fi0EMPxdlnn41Vq1bh7bffxtlnnw2P\nx6M7FmVAo4uJstcJh8P44IMPcMYZZ+Cggw7CqaeeimOOOQZ33nmnca5lWXjyySdxyimn
[... base64-encoded PNG plot data and the remainder of the truncated notebook JSON omitted ...]
hAvqJaHOcoKZ1TAkoBafyAaSPZXnmgOI6QDSSQMIMSuxJq+dJ4\nzbCRsH2IvkRWkpJJ+kNGm8bXiW0mNBZ5j18c50YdZLZZM1Zel+wE3jZK9NhCtjMO7IuY8yQdkePt\npmSH7qh8xp7R8nkYNhbATIjolzYmTkKaKZPX8Dd8Lj/vUGQKUZkw0SgURkGk7g4Z8Bl0KDLFySPB\nBbQ46SjDBdC6YcHmpK0dVTL4suzL9fIDQXPuORVW6w36qNl9mkGxmCiKovQnajNRFEVRSkZtJoqi\nKErJ6M5kgNJy+Mmd/480y+JOnq2f8ulGkkDWrcExHpTMkBPpWSmZRBAwk0dyUSl34xa6prSRZKpG\nGm2mye9+2OdvymuwXYDuK1cmbSruxq3GNVgLz3PMjRF3Ir9VPDYAkKXiWK4yaYPK+aUtwtNGMQvt\nlMQRAJI05vwM2SZFcT/5qLyvRNQsCNUalPFHgYy0C0TjG8Sx9alM9JjPmPFHLopnsdtoPCn+wj9E\nvt+x3xRx7K4mWwYAf5NMiGi1SvsRx7JYTWTb4YSiAEDxLpkhMkGlh+avtY1igzieplLGvgBAKkJ2\nRpecS4EEzW+XtOfZDnavnENCyd7A7vbWRA3wiqIoym7QdCqKoihKyeQHodFEFxNFUZReRm0mA5QW\nbxetlWRTJ8XUUy/tKuDiS6QpZzlGhGIarJCZ1ykRkXq7Oy91aB/Fb+R95Idvm/vo0LZP5Ckck0Cx\nFx3lI+Q1MzImIdTukMuIrpuj2AqbtHG2/bi2Sb0eAFApx4JjJ/KU3wsRadvxOnxzLXb052OOYyB9\nne0CwUaK3wDgjcjxaovIfGpxinsIkp3McrA92GQTsVLSlsAKe57GP/z5OnmCUz4qijNh+5CRS65S\nPg+OJQLMHGxcNM2wG5KNhYuocf41APAm5L0EEvRM6DuTqpTzO+OTY+VEb1lQdDFRFEVRSibfB6vJ\nqlWr8Oc//xmWZcHtduOHP/whJkyYAABYtmwZ/vrXvwIATjnlFEyfPr1gW2vXrsVTTz2Furo6LFu2\nDA8++CBisRhyuRxGjhyJiy++GH6/v2AbjC4miqIovYyDcFAyhxxyCGpra2FZFjZt2oRFixbh9ttv\nR3t7Ox577DEsWLAAAFBXV4fa2lpEIt3fZx1zzDGYO3cuAOCOO+7AihUrMGPGjB71TxcTRVGUXiaX\n6/2dSSCwSyZNpVKw/s9Nf82aNZg6dWrn4jF16lSsWbMG3/jGN8Tn16xZgyVLlsDv9+Pggw/eTb9z\nSKVSCIeLS4LMoFhMRtX/c9cB17Fw2I7mScN3pc04ka7kfDLnD2v8vg6zZkogLl/LU40IKy+1c09C\nxhPYnPsIpm7N9gu2oQSpD2m/1K1bRk81rlHWuFH2kzT+VFjq70HWzh3ymLkor5M7I8c7HpKaPVdE\niddMMNr0V8rx8rdSfRMaG8555cpSfQ62mwHwUM6qqs/ekycUiZ1IVctYDACwIOcjzz2eF8b8ZVuQ\nQ7/ZHpQaJmvpcG2SYn1wgm1WeYolypP9yE3xSS6H2KxMhbRNtlbsJ47DHdtkG7bsgzdt5imz7L5J\notX9RI87dhI7mTlzJmbOnLnbc1euXIlHHnkELS0tmD9/PgCgsbERVVW7vnexWAyNjTJ2KJ1O4777\n7sO1116LmpoaLFq0SLy/YsUKbNiwAc3NzRg+fDhqa2u73f+dDIrFRFEUpT/piWfwTnmqOxxxxBE4\n4ogjsG7dOvz5z3/GNddc063Pffnllxg6dCiGD98RfHvcccfhhRde6Hx/p8xl2zYWL16MJ598ErNn\nz+7+TWAfWEwuuugiBAIBuFwuuN1uLFiwAK+//jr+8pe/4IsvvsDNN9+MAw44AADwzjvv4OGHH0Y2\nm4XH48HZZ5+NKVOmFLmCoihK/9L9CPjd8+yzz+LFF18EAMyfPx+x2C4vvEmTJuHuu+9Ga2srYrEY\n1q3b5cXX2NiISZMm7dE1LcvCYYcdhmefffart5gAwHXXXYfy8l2ugqNHj8YVV1yB+++/X5xXVlaG\nq666CrFYDJ9++iluuukm3Hffff3dXUVRlIL0hjPXrFmzMGvWrM7jLVu2YNiwYbAsCx9//DEymQzK\nysowbdo0/OlPf0J7+450Pm+//TbmzJkj2hoxYgTq6+uxZcsW1NTUYPny5bu97oYNGzBs2LDdvr87\n9onFhBk1apTj6/vvv3/n/0ePHo10Oo1MJgOv16zt0RXWY7viZHtgf/UcxYlwTWy2kaQ90obizpr5\nkXwfvi37kSEdeoTU0w1t3CFGgWMOOKcV24t8rTLnkjsoNf5tQ8xfN601UksN5qheCY1d+yhp76ho\nM+M1ONcWE4rLfnqShe0hAGCxzaPNIX9XF9xFcnflKQ7C6Rr5ZqlT20lZ19xNv1Z9Wx1ibrg+DHvk\ncPxRkxy7XEr2yV0hbRU7XpT35m/ZWvB9nmtGjjfAyB3HcSNGDiyaz1mKHWJbHAB4UnKulXFs1naa\nW/TMbYqXAYD40AOM13qDvoiAf+ONN/Dqq6/C7XbD5/Nh3rx5sCwLkUgEp556aqcN5bTTTjM8uXw+\nH84//3wsWLAAfr8fEyZMQDK56+/STpuJbduoqqrChRde2OP+7ROLyU033QQA+OY3v1nQ+NSVf/zj\nHxg3bpzjQvLCCy/giSeeQDwex+LFi3u1r4qiKMXI94E31+zZs3crPZ1wwgk44YQTCn5+2rRpuP32\n243Xp0+fXjQupTvs9cXkxhtvRCwWQ0tLC37xi19gxIgRRfW+zz77DA8//DB+9rOfOb5fzCNCURSl\nL+mLoMV9HVfxU/qWnUalaDSKww8/HB9++GHB8xsaGnDrrbfioosuQk1NTcFzFUVR9ga2bXfr30Bi\nr+5MkskkbNtGMBhEMpnEO++8g9NOO22353d0dGDBggWYM2dOZxqB7tA6fNdOJ7JNLlauZoc63GRH\ncZOvvk11FeJ+6f9e1iE1fN/mj41L5Dso/sInUxdYCfKJT0n9Pe+Qc8lVJvVxN+eCongOzpfkoWsO\n47reDriapW+/YXkMSe2W4zsAM/aEa554KFeUKy3HgnOh7XiNNHzKP+Wi+BeuTc/5qVwNVF8GAAJU\nU3zMePmZDnpGVNveyB8GmLVy+Bo0F10V0tbA5KvMOixsjzCeIecpc6hBY1wnLO2KyZi0e3rJzsV1\n57m+T7JMxnoBgE12rGCLrIli89hxTjGH8TZymU05xjhnT9Cswf1MS0sLbr31VgA7Ii+/8Y1vYNq0\naVi5ciX+8Ic/oLW1FQsWLMDYsWPxs5/9DM8++yy2bNmCxx57DI899hgA4Oqrr0Y06mBkVBRF2UsM\nsE1Ht9iri8mwYcPw61//2nh9Z2AOc+qpp+LUU0/tj64piqLsMblBWB1rrxvgFUVRBhq9EbT4VWNQ\nLCbl76/YdUDaLtdVAGD41ee9Us/1pmSt76pW8tOnz6eHy9xHAJAfNVG2GZcxCpybyCK7gCv
skBGU\nrmute8s8pyshikOh+uNGXQuYdpb8dmkfyqcpvoNiFtxVZm3vxAHTxHGO8pQFOsiuZcs+sP3DEbYx\ncd0QW2r4RqxFhbS5AABa5Gcsm2JZ2NbAqWR9DraIDhlLkdv8pTwmWxv/0fJWy1olLgcbFbhWCNnK\nbH6GQ6SjS3rUQUaTHHMT+lLmKWM7GMd3cGyWPyO/YwDgYjtWmxz//FY5VtkGqm3PtiAAvpEjjdd6\nA11MFEVRlJIZhGuJLiaKoii9je5MFEVRlJIZaDEk3WFQLCa5bbt0fatB6u+uSlML5xw+nFfIiEHo\noNxfnMsoYBaayVDtkMA20sbrZVwD69juKBWzB2ANl779LsrPk09QjQjS7G2yJ3H9E8CM38CBk2W/\nsjIGxDGWoghch4JtIhbnhso45CkbJnOb2RE5Xu56yuPE9oxiuaUAgHK2cU13Vyv1k+wXmaGyHseO\nNuRz9vg3yjbbqK4NzQsX2b2ccrjZzWTrqZR2lly1tCPwM8+7TDuMn2Kr7G1bjXO6Eo5Lm4hv2Bh5\ngmXaNziHm71Nfkcsson4xu4vjp3GAsGQ+VovoN5ciqIoSsmozKUoiqKUjC4miqIoSskMxkSPupgo\niqL0MrozGaC4h3YJumJDq4PxlgPz2IBuUbI9TnLHxbLcCTMAK9xEBkoK6nIK7hNwwCEApCgxY1Ce\n4+LCSu0yQI5NnlaAgtsAw4icjVI/KSCOnRXcDok1g1soUzQl9AMlgmTjuBFgCMDFBdHY+MqOAZQk\nkBMPupvMAlz5rZuN1wQjpYGdx8qdkOMPAK40OUnQeFvDpHHc4uJuHJiXlAXPADPJaHaLnIt27l1x\n7KUCW95Kh+SS1M88jzf106K56v1SJkO1y8x8e+kKmbTSv10a4HPkWGAUPHMKUC5SNG1PUW8uRVEU\npWRyWfXmUhRFUUpEdyaKoihKydh53ZkMSFrfWLnrgDRlt98MzPOWU0GnrNTX2zbKAMNUq9Sly/eT\nhX3KJpuFvPJxqVtnGqXe2/zBZ/IabVJjDlWb+m9klFlQqCvZDkrSSPaiwBAZvOYd6tAe2ZysTz8R\nx64w2ZeiUl9PvPOO0WSqUdo3QiPldX3DZaLBXLNM2mh5HGwmVDQqt13aavJJOZ6eKhm8ynYxO24m\nk0x8Lm0miW1Sf49sk0F2gYNlgsQ8afwAkKV+2SlpL4p/KW03Lq/8CkcmyWvkmsxrNG/YKI476uV4\neoPS/hEdSxVNN5sBibmMTJyZaSe7Iz0jd6BwwS231/zTxN/LbIBsIi5plzESVnaYNqrUZ1+I44Bx\nxp6hxbEURVGUklGZS1EURSmZvBrgFUVRlFLJcwjCIGBQLCZNH+/SeC3SVcNDi9ePj2+R2nfHNqm9\nun1yGHMpaYvItZtaLSdq9AVlfIa/Xmrd2SQlEXSwE7BtJ0+xFNmETMLoDUuF2PJSAj8aKwCwwrLf\nVk6OjcUJ+qioV3AsJfQDAGwSR3myE7B9I9sm43ZSTVT4CoC/Ur5m01i0bZL2DjsvbT/l+48Qx74a\n037kH0pJQsnoyjap7BZ5TXfYjBVK1cvxrH9b9ottZ5VjZR/81dvEsXeoGa9UMVE+o0DlNuOcrvgq\nKc6EEygCRjwLPzMu7pZtoKSNJAvZDvFfjIvsnbl2adfK5aUtzjNsmNEGz7XeQoMWFUVRlJLRxURR\nFEUpGTXAK4qiKCWT1ziTgUnF2F16d6Baav7emJlniLVuf7WMv6gK+MWxq0K+b/i7t5u5uTg3lEW5\nucqnyHiB6FSzWJDRZKvUiLPtUsfmmATGxcWySPcGgEy91NczrfLecgnKuRSRdgH/cAfbwzCp63MR\nL44X8NH57pCZQ8xdRoXB6F4q6Bk6xap0JUMxI4A5T9i24KJ+uWNkYyk3516YclyN4Hsjwy7P32yL\nnAOJjzca1/DS2ITGyHxfnKeM52p+u5mnzCKbn2uEzEuW2yjzr+UpLsVDfcpmzaJq2bicFzmyAfL3\nLjhO2kgyo+V3CgCC3sLxLnsK2ysHA4NiMVEURelP1GaiKIqilIwuJoqiKErJaJzJAGX7hl35d5o/\nXSfea3nPzLlkZ+SviuBI0te9UpvNtkp9NDxa6sfhalPT518uwcqQOO7YTj7zKakxVx9k2h6SLZR7\ni3Tnli9kvEtyu9Schx1aLY5HfWOycY3G9TImpPETaUsIV5ONpEzGssRXvme06YvIc3Jpea8cW8Fj\nZXEND5hj4QtLbTxSI21nrZ83imMeu3iDWRckl5F/MIYfKmNTEk3yM+kOafvh+wCAQFTOleZPZb94\nbKoPknmzgtXSbpNsMmOc3n9qtThu/0KOlbdM2o8CFXL+Dzm4SK0dAJ4iubcaP5a2t1CVnDflo+Rc\nBIBcmuJ2yGbCUefNH3wujr2hVQX7BACjvndR0XO6g+5MFEVRlJLpy6zBH374Ia6++mpcfvnlOOqo\nowAAy5Ytw1//+lcAwCmnnILp06cXbGPt2rV46qmnUFdXh2XLluHBBx9ELBZDLpfDyJEjcfHFF8Pv\n9xdsg9HFRFEUpZfpK2+ufD6Phx9+GIceemjna+3t7XjsscewYMECAEBdXR1qa2sRiUR214zBMccc\ng7lz5wIA7rjjDqxYsQIzZszoUd90MVEURell+ioF/TPPPIMjjzwSH330Uedra9aswdSpUzsXj6lT\np2LNmjX4xje+IT67Zs0aLFmyBH6/HwcffLBj+7lcDqlUCmGHVD/FGBSLSbaLvSFQIfV5/+Gmtpvu\nIM2+RWrdZSOk1u3Zn3JzkZbupI2zzs/5vtq3SpuJyy3tNKzpA6buzHaDitGyBkqmmnKIUb8/X77W\nuAbbXfIZ2Y9EI8UCpGWbZSPMX0vlI2WsROsXVNtlk4yd2LxC6u2+GOUUA+CLymfi8ko7QPOnsvZI\n01qqGU9YXtMuY9N4JRplHi2eJ1wn5POVspYGAKQb5dwLj5JSA9svEk1ynsQbTBsJU3WgtEdU7FfY\nRpVoksds+wGATELOpZbPZM2fZAPNtYQcu4qDqG5Lg2nLjDeacU+iDZrf5aNkXI9TPruWT7cbr/UG\nPZG56urqOv8/c+ZMzJw50/G8xsZGrFy5Etdddx3uuece8XpVl3o8sVgMjY3S1pZOp3Hffffh2muv\nRU1NDRYtWiTeX7FiBTZs2IDm5mYMHz4ctbW13e7/TgbFYqIoitKf9MQAv1OeKsaSJUtw5plnwuXg\ncFKML7/8EkOHDsXw4cMBAMcddxxeeOGFzvd3yly2bWPx4sV48sknMXv27B5dQxcTRVGUXsbuBdfg\nZ599Fi+++CIAYP78+fjoo49wxx13AABaW1uxevVquFwuxGIxrFu3y0u1sbERkyZN2qNrWpaFww47\nDM8++6wuJoqiKHsbJxm6p8yaNQuzZs3qPL7rrrvE/w877DAcccQRaG9vx5/+9Ce0/1/aprfffhtz\n5swRbY0YMQL19fXYsmULampqsHz58t1ed8OGDR
jmkK6/GINiMTn02Vf3dhcURdkHGVH8lD1i+VPH\n91HLJpFIBKeeeirmz58PADjttNMMTy6fz4fzzz8fCxYsgN/vx4QJE5Dskq9up83Etm1UVVXhwgsv\n7HlH7AHOVVddtbe7UBDtX2nsy/3bl/tm29o/pXfpuSVHURRFUQhdTBRFUZSScV9//fXX7+1O9DXj\nxo3b210oiPavNPbl/u3LfQO0f0rvYdn2IKwvqSiKovQqKnMpiqIoJaOLiaIoilIyAzrOZM2aNXjg\ngQeQz+dx4okn9jiis6+vv27dOvzxj3/Epk2bRDppAPiP//gP7Lffjjra1dXVuOqqq/ZqX59//nk8\n99xzcLlcCAQCOP/88zFq1Ki91p+dvPHGG7jttttwyy234IADDkB9fT3mzZuHESN2RBCMHz8e5513\nXp/1s7t9XbFiBf7yl7/AsiyMGTMGl1122V7rz5IlS7B27Y68a+l0Gi0tLViyZAmAfW/ebdu2Dffc\ncw9aW1sRiURwySWXiDxUyj7E3vZN7ityuZx98cUX21u2bLEzmYx9xRVX2J999tk+df2tW7faGzdu\ntH/729/ar7/+unjvrP/f3t2FNNXHcQD/uk2DkUinl3lRCa1CsxJCpCwvuoguy+hG0IuKjJhGrxKJ\njaJuTCWDiUQvJBQ4SIMghOjlIhVcSSEswmWC6ck5RWazQ8b+z4WP59FnC6en7az5/Vxtuu185/m7\nH+e/c37/4uK4yhoIBNTbLpdLXLt2Tdc8QggxOTkpLl++LC5duiQ8Ho8QYvpvevbs2ahlW0zWoaEh\nceHCBTExMSGEEGJ8fFzXPLM9e/ZMOBwO9X68jbva2lrx6tUrIYQQPT094tatWzHLRwuTsNNcHo8H\n6enpsFgsMJlMyM/Ph8s1/0prsdz+mjVrkJGRgaSkpN+8SmxEktVs/q8DrqIoUc0c6b5rbm7GgQMH\nkJwc2jU4ViLJ+uLFC+zfv1+9KjktLS3cS8Usz2zt7e0hrcpjJZKsX79+xdatWwEA2dnZePv2rR5R\nKQIJW0z+35Z55cqVIW2Z43n7U1NTuHjxIiorK9HV1RWNiKpIs7a1taG8vBwPHz7EkSNHdM3T19cH\nn8+HHTt2hDzf6/WioqICdrsdHz9+jFrOSLMODQ1BlmVUVVWhsrIS79+/1zXPjJGREXi9XvXDGoi/\ncZeRkaHm6Orqwo8fPzAxMX+bfYq9hP7O5G/W0NAASZIwPDyMq1evYv369UhPT5//iVE003juzZs3\nePz4McrKynTJEQwG0dTUFLZ/0IoVK9DQ0IDU1FT09fXhxo0bqK2tnXNkFWvBYBCyLMNut2NsbAx2\nux01NTWLWoDoT2pvb8fOnTvntDSPt3FXUlKCe/fu4fXr18jKyoIkSYtqwU7Rl7B7RZIkjI6OqvdH\nR0chSdJfs/2Zx1osFmzZsgX9/f1/OuKcbS0ka7SnDOfLoygKBgYGcOXKFdhsNvT29qK6uhqfP39G\ncnIyUlNTAUxf8GaxWCDLsm5ZZx6Tm5sLk8mkrikRrUwL2ZcdHR3YvXt3yPOB+Bl3kiTh/PnzqK6u\nRlFREQDoXoQpvIQtJlarFbIsw+v14tevX+jo6FjU6mF6bP/79++Ymppemc7v9+PTp09RPXMqkqyz\nP/y6u7vVRXb0yGM2m3H37l04HA44HA5s2rQJFRUVsFqt8Pv9CP67yt3w8DBkWV5UO+0/lRUA8vLy\n1LOn/H5/VDNFOu4GBwcRCASwefNm9WfxOO5m78/W1tYFr0tOsZOw01xGoxFHjx7F9evXEQwGsXfv\nXqxbt0737Tc3N8NqtSI3Nxcejwc1NTUIBAJ49+4dnE4n6urqMDg4iNu3b8NgMCAYDOLgwYNR/aeO\nJGtbWxt6enpgNBqxfPly2Gw2XfP8jtvthtPphNFohMFgwPHjx0Paccc6a05ODj58+IAzZ87AYDCg\nuLhYPXrSIw8wPcWVn58/50SKeBx3brcbjx49QlJSErKysnDs2LGo5SFt2E6FiIg0S9hpLiIiih0W\nEyIi0ozFhIiINGMxISIizVhMiIhIMxYTSlgtLS1obGzUOwbRksBTg+mvVVJSot7++fMnTCaT2mqj\ntLQUBQUFekUjWnJYTCgh2Gw2nDhxAtu3b9c7CtGSlLBXwBM5nU58+/YNp06dgtfrRVlZGU6ePAmn\n0wlFUVBUVIQNGzagsbERPp8PBQUFc66wfvnyJZ4+fYrx8XFs3LgRpaWlWL16tY7viCh+8TsTWlJ6\ne3tRX1+P06dP48GDB2hpaUFVVRXq6urQ2dkJt9sNAHC5XGhtbcW5c+dw584dZGZmor6+Xuf0RPGL\nxYSWlMOHDyMlJQU5OTlYtmwZ9uzZg7S0NEiShMzMTHz58gUA8Pz5cxQWFmLt2rUwGo0oLCxEf38/\nRkZGdH4HRPGJ01y0pMxe5TAlJSXkvqIoAKYXjrp//z6amprU3wshMDY2xqkuojBYTIjCWLVqFQ4d\nOsQzwogixGkuojD27duHJ0+eYGBgAAAwOTmJzs5OnVMRxS8emRCFkZeXB0VRcPPmTfh8PpjNZmzb\ntg27du3SOxpRXOJ1JkREpBmnuYiISDMWEyIi0ozFhIiINGMxISIizVhMiIhIMxYTIiLSjMWEiIg0\nYzEhIiLN/gHQEZfAVB/ONAAAAABJRU5ErkJggg==\n", 183 | "text/plain": [ 184 | "
" 185 | ] 186 | }, 187 | "metadata": {}, 188 | "output_type": "display_data" 189 | }, 190 | { 191 | "data": { 192 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAZMAAAEaCAYAAADUo7pxAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAAIABJREFUeJzsvXmUVdWZ/v+cO09Vt+rWhAwyOjCo\nxHmIYiIhJiYuuzWJUXFpdNmJU0uUrzikg0EJioq6gonpxpCoWYmxE+x0jLFBUeOIYmIEQSWCMlPj\nncdzf3/ws6j32YcaqEuhVe9nLdZi33vOPvvss0/tu5/9Dla5XC5DURRFUfqB60A3QFEURfnso5OJ\noiiK0m90MlEURVH6jU4miqIoSr/RyURRFEXpNzqZKIqiKP1GJxPlgLN06VJ4PJ4Bv+7pp5+Oyy+/\nvKJ1rly5ElOmTIHX68Xpp59e0boV5dOMpX4myv5g+vTpGDlyJJYuXdr52ebNmzFq1Cg899xz4g9t\nJpNBPB5HU1PTgLaxtbUVHo8H1dXVFatz4sSJOO6443DHHXcgHA4jFotVrG5F+TSjKxPlgBMMBgd8\nIgGAWCxW0YkEAN5//3186UtfwqhRo3QiUYYUOpkoe2Xx4sWYNGkS/H4/Ghsbce655wIAxowZg9tv\nv10ce/nll3euNi655BKsWLECv/zlL2FZFizLwsqVKzFq1CgAwBe+8AVYloUxY8YAMGWuT8ovvfQS\njj76aIRCIRxzzDFYtWqVuOaKFStwxBFHIBAI4Mgjj8Tzzz8Py7Lw6KOP9ur+WOb6pDxv3jwMGzYM\nsVgMF198MZLJZI91rVy5EpZloVQq4eKLL4ZlWZ2rsldffRWnnXYagsEgamtrccEFF2Dnzp3i/OXL\nl+PUU09FKBRCNBrFtGnTsGHDhs7+nD59ujj+0UcfhWVZneXNmzfj3HPPRX19PQKBAMaNG4eFCxf2\nqh8UpRLoZKI48sMf/hA33ngjrrzySvzjH//A008/jaOPPrpX595///049dRT8c1vfhPbtm3Dtm3b\ncPLJJ2P16tUAgP/+7//Gtm3bjMmhK7Zt46abbsL999+P1atXo7GxEd/85jdRLBYBAFu2bMHXv/51\nnHDCCVi9ejUWLVqEWbNm9fu+n3jiCbS2tmLlypX4zW9+g//93//FnXfe2eN5J598MrZt2wYA+MlP\nfoJt27bhW9/6FrZv344ZM2Zg5MiReP311/HHP/4R77zzDs4777zOc5cvX44vf/nLOOaYY/DKK6/g\ntddew8UXX4xCodDrdl955ZXo6OjA8uXLsW7dOixZsgQjR47sewcoyj4y8LueyqeeVCqFu+66C/Pm\nzcPVV1/d+XlvJ5NoNAqfz4dgMIhhw4Z1ft7Q0ABgt7zU9XMnyuUy7rvvvs5rzp07FyeeeCI2bNiA\nww47DA8++CAaGxvx0EMPwe12Y9KkSZg/fz6+8pWv9PV2BaNHj8aiRYsAAIcffji+9a1vYfny5Zg3\nb1635/l8vs57ikajnf+fP38+qqursXTpUvh8PgDAI488gqlTp+KFF17Aaaedhttuuw1f+cpXcN99\n93XWd/jhh/ep3Zs2bcK//Mu/YOrUqQDQuepTlIFCVyaKwZo1a5DNZjFjxowD1gbLsnDUUUd1locP\nHw4A2LFjBwBg7dq1OO644+B2uzuPOemkk/p93a7X/OS6n1xzX1izZg1OPPHEzonkk2tEo1GsWbMG\nAPDmm2/2u6+vu+46zJ8/HyeccAJuvPFGvPDCC/2qT1H6ik4mSp9xuVxgI8C+SDK9vUbXieKT/QHb\nto3PKknXP/qfXKPrNQ8EvenvSy+9FJs2bcJ3v/tdbNu2DV/5yldw0UUXDWQzlSGOTiaKwaRJkxAI\nBPDMM884ft/Y2IitW7eKz9566y1R9vl8KJVKxmcAjM/3tY2rVq0Sdb366qv9rrfSTJ48Ga+++iry\n+XznZ3//+9/R0dGBKVOmAACOOeaYvfY14Nzfn+w/deWggw7CpZdeil/96ldYsmQJHnvsMcTj8Qrd\niaJ0j04mikEkEsH111+PuXPnYvHixXjvvffw97//HT/+8Y8B7PYh+e1vf4tnnnkG69evx6xZs7Bp\n0yZRx9ixY/Hmm29iw4YNaG5uRqFQQH19PSKRCJ555hls374dbW1t+9zGK6+8Ejt27MD3vvc9vPvu\nu3juuedwyy23ANg/K5Z95eqrr0Y8Hscll1yCd955B3/9618xc+ZMnHrqqTj11FMBAD/4wQ/w5z//\nGddddx3efvttrF+/HkuXLsX69esB7O7vdevWYfHixdiwYQP+8z//E48//rhxnaeeegobNmzAmjVr\n8Pvf/x6jRo1CVVXVgN+zMjTRyURxZN68ebjjjjvwwAMPYMqUKZgxY0bnr+Ebb7wRZ511Fr71rW/h\n1FNPRTQaxTe+8Q1x/vXXX4/6+nocddRRaGhowEsvvQSXy4XFixfj8ccfx8iRI/G5z31un9s3YsQI\n/M///A9efvllTJ06Ff/+7//euUkeCAT2/cYrTFNTE5555hls3rwZxx13HL72ta9hypQpeOKJJzqP\nmTFjBp566im89tprOOGEE3D88cfjl7/8JbxeL4Ddk8ntt9+O+fPn46ijjsKzzz6L//iP/xDXKZfL\nuO666zBlyhScdtppSKVS+POf//ypmliVwY16wCuDhhdeeAHTpk3D22+/jSOOOOJAN0dRhhQ6mSif\nWX7605/iqKOOwvDhw7F27VrMmjULtbW1n8q9E0UZ7KjMpXxm2bRpE84//3wcdthh+N73vodTTz0V\nf/rTnwDs9u+IRCJ7/bcvdFff/PnzK3lrivKZQ1cmyqCktbUVra2te/1+woQJfa7zgw8+2Ot3sVhM\nY3EpQxqdTBRFUZR+ozKXoihKBUlv3Hygm3BAGBIrk69+5x+d/zcc6QJ+43iPV4Ysy6WzVM6IsmXJ\nObmmUcoddQfVGNdgk81MKifK6USGyilqo9eoMxCWJrGBkLy3YkHeey4tr+n2utETqY5Ut98HI0FR\nrmmQId5dLtNUdefmFmpnkc6R/ZtJpkW57OChXsjlRTkYCYtygNrZ1dseAPzUd/6A2d8uD7dL9ufO\nj7aLMj+zmkZzXDCZpBx7Xr+so2zL1zdLY9UXlB79gDme8xnZbn9I9k0gJMdVIW9GO0jHZWRlrsOm\n946v6fLI/g9HzX0tfi/5XeZnmGxLiHKixfRr8gblvT33+AnGMfvCn7yH9eq4swrrK3K9TwMa6FFR\nFKXCWN6h59+jk4miKEqFcQd7XuUPNobEZHLCF/eE825ulvKRy8FD2O+X3UIqFqJRKR34/fK
AbFbK\nLsViz4ECPR4pB9mkPvq88hqlkqlO+ujXUDBIMkxGtiOXl3VwndmsGUOLuysUkn0VCctrkvIAUp8A\nAMNHypAfqaSUuXI5Wea+CYdNKSccke3qqU4Wez0kYdXVmXJoMND9lmP4jBGizCHJCkXzGaYz8rOd\nO1lSlQ8gEJT3mUzIDmZpEwCScZK5slK28pGk5yH5syYWMuqsqpLnhEPynFRatqO1RbYhGJLnBx3+\nGPO9ezyyzM+wtVVeI94uywAQrQ0an1UCl0dXJoqiKEo/UZlLURRF6Te6MhmkdJUT2PrF8ppShcst\nB4LfL5fcJaojHi92+72DAROKJHHk81KC8vlku4qWPD6XM+WLYlGewzJWhmQrbhe3qVgw5TmW/LhO\nlsG4TpYmAKAmKoch33siIct8DcuhgzMkq7DU6CXrIQ+NAy+VOzpMC6ZUWh7DUmQqLdvFbXAKwhgO\ny3bV10tro0RCjjUv9WcsJo93lCqpv9p2yWP4HfFQXyU6TLkoGZfWWdVR2Y4SWdzlsvI+8nnZhmzW\n/NNUR33hpvc0l+u+f8NVplTp8+2fvQ1dmSiKoij9RlcmiqIoSr9x+4aeP/iQmEymHLrHUsRPS/ai\nbf6CKNFnHrdc9rtJcupIy25kHzqXw7gK+GQd9RFphVPtk4552RJZJ+VNC6YqPzkhWuSkWDId77qS\nL1HflMzhEfZLuSfkke30uuT3RVvWkS6a7WYKNskqWdnueFp+HwmaclxjhKygQJJfUdYZ8EjZhckW\nzb4gNQhuV/d1pPOyjkTGlFh4rNSE5DOs8sn+5b5qTkkpZ1eb2e50Vj6DMWOkNV1jTI7/mrC8Lx7/\nAFC0ZcOTWVnm+/J5pBWVx0XSsMu8hscln7NND6Cd3sOSLa3OwgFznNQGc8ZnlcBJeh3sDInJRFEU\nZSCx3LoyURRFUfoJG/EMBYbEZHJM00ed/y9DPuTWQtQ4vjkt4zgNj3SIcrU7LsrZmFyyJ4sU28ht\neupVu2QdHlse4y5LaSHrk23KO8UUgzwnactzLJInfCTL+EiiCllmHK5gQcY7KlOs0LRLSia2S8ow\nDW4pPzmRtmRcpizfK4W0csOUL2x6zgVbylrVPnlvQUvKdS6qswoyfhgA+DMy1pPtlu1sjwwX5XhA\nNtyKmlJOwJKWUuGiHHtZj3ymzYV6Ua4jf8Kwz3zFU3n5TPIF2VeN1WSZ5ZNtCnl68QyL3TsDVnll\nLK+wze+DaT2X88iby5TpZqXfr/Gup0tmm1wWj53KODGqzKUoiqL0G92AVxRFUfqN5WR1M8gZEpPJ\njtweKYCtcgol86FX+aXklCxIZ6mdaSnlsCVK0COX6EnbjGXUUpJ1BNxScgp5pbRQLlIcIpjL6CxZ\nShXL8t5sslJrp2V/lU/ed9ZtSmlZe5goc//VuaQMxjJCa9GUFfNkNcZ9UettF+VgSV6D5Q8AyNns\n4CbbkSjIcxKQZZYA2zy15jUonhSfA9NfkI43pZyOktRqdtkynYGdk/3toWv6ySrNyQqtQGOpKigb\nypZtLI86SVg+knLrfJRWoCz7qiUvJb+US/Z/yEEO9aF7a8W0LduVpve2VDbfmf2lRqnMpSiKovQb\n3YBXFEVR+o2uTAYp7dk9cs2uDnnLTnkmayLS2iXgIec/ioEV9MnveTnNjmUAkClQNseCPCaekxIT\n1+nkOMbH5Ivd67ZV5IDIslg6a8pH3O4sWQKxrMKOZrlSz7GQWopSntjuktZdLENaDmHt2QGTHQxZ\nnkvl5PFuki4DXtNiLFuQdfBY8rodBlcXaoJmw0v0DOLksFkk6czmsUYSVtoMo4VsjmKwRfmZye+T\nFrXBNsdVwCOfWYdbWp21pSnrJ0muNUGShl2yPsCU29gRNZmTMm9HhlJJOPx9rwmZUmMl0D0TRVEU\npd+4HQLIDnZ0MlEURakwKnMNUnxdLHkOqpXSQtnBwoPlCl4eez0U54mcwFhCYTkJAA6KxI3PutKe\nkxJTvkhh8B3azdZa9WFpERPyyHvPU9ysHElUXrcp7QTJyowtakpl2U6/W947x8QCgOYUOXmSrMhx\nmlpIMuFYaAAQ8ZeoLO/dF5AWS7Fg9/3rZD3XEJb35nVRX5C8yTKiEzweI9WyvznWWWtaykGtNBZr\nq8zOCdVzjKvuMxb6PPL4iM8czzzm+T7G1UjrLr9LWmbxuCnBIdMiyVp2mS3bZDujAdkGfj6A+Q4A\nPceO6w0DJXO98sor+N3vfoctW7Zg/vz5GD9+fOd3f/jDH/Dss8/C5XLh0ksvxdSpU7uta+XKldiw\nYQMuu+wyPP7441ixYgWqq6tRKBQwefJkXHbZZXB1c19Dby2mKIqyn7FcVq/+9ZY1a9Zg8eLFxuej\nRo3CDTfcgIkTJ4rPN2/ejJdffhn33nsvbrnlFixZsgS206+ubjjrrLOwcOFC3Hvvvfjoo4+wdu3a\nbo8fEisTRVGUgWSgZK6RI0c6fr5q1SqcfPLJ8Hq9aGxsxLBhw/DBBx/g0EMPFcc999xzWLZsGUKh\nEEaPHg2v11QOisUiCoUCIpGI8V1XhsRk0hjaE98o5JLST6Fsdl6iKCWmZF5KCQ1+Gccp6JZSRM6W\nS2VejgOmPMRUeaUM4PL17VcF4OwU1xW2Mqv2y74JukxTIJ8l22WTHBEvSmfMiEfGwIp5TQumGp8c\npBxXy++S52RCpqUPU+WWsZ+KNNRT9IxtSiEZInku4DZDlXP8Lhscdp0s2co9jwuWcvjeWW5jGbEh\nLOtk+QgwJSlDxiU5yE1lp1hzHrJo9FqyXexwWIR871jW8lhmOH8vqC/ombm9lDGS+or7FgAsh3hd\nlcDl6X0Gxzlz5nT+f/r06Zg+fXq/r9/a2opDDjmksxyLxdDa2iqOaWtrw+OPP44777wToVAIt912\nG8aMGdP5/Z/+9Ce8+OKLaG5uxtSpU8V3TgyJyURRFGUg6YvT4oIFC/b63c0334xCoYBsNotkMonZ\ns2cDAC688MIe90B64v3338fkyZNRXb076sJJJ52Ebdu2dX5/1lln4eyzz0axWMS9996Ll156Caec\ncspe69PJRFEUpcJUSuaaP38+gN17JitXrsRVV13Vq/NisRhaWvYYPbS2tiIWi3Vzxt7xeDyYOnUq\n3n33XZ1MuoZ7ZynC4+DJFKEQ2yxGhCm7YAAUR4hOYEkFANLkmMeSU9iTozJlNIQpNbDkxMt6F0lr\nLLXlSfLjvnK6RrEshxDH4mIpyG2b8gWfk6XYTx5qJ8uKfA3AlDhY7uE6wp7uZUQnSSpVks+V+9O8\nLylzOWW+5HMylrRc81HcMr4PD8XVytumdVLK6n7sGX1H39sOFj02jbWOoowx5iU5zksyVoHGUcAy\nZUWWxrK27BtOM8DSpJPMxTJipTjQTovHHnssHnjgAXzta19DW1sbtm3bhgkTJohjDjnkECxduhSJ\nRALBYBCvvvoqRo8ebdRVLpexbt06jB07tttrDo
nJRFEUZSAZqA34119/HQ8//DDi8TgWLFiAMWPG\n4JZbbsGoUaNw0kkn4fvf/z5cLpejWW9tbS2+8Y1v4NZbb0UoFDL2RD7ZMymVSjj44IPx5S9/udu2\nWOWyU0CRwcU/N2zo/D//2i45zKe8Kc9RUjmxD69M0mUZSsJpZcI+BwdiZZKlKKu8MuFfjwDgo9gl\nvDLhX4sRt9yA95XNTf0kZTXi/uZkTMYKwGFlwveep81vs2/6vjLhZEs9rUwyRdk3vVmZcNicHlcm\nVi9WJj2sivnvIPdV0GOuGnj1mC3Je+3zysRlXoPb0deVidM44RXrkYc0GsfsC5uuOKdXx43++bKK\nXO/TwJBYmXyYHtH5f3452RrGiTTJE/9slX/8wuQgx7GNmhPmH40C/Z2uCsmBHvTJa5bL0krKyaHQ\n7y51ewy/jPxHxIgp5hCen//QlMjhjeOWFUoydHsmZ9bJ8abYECYSpOyN9PPH7yBR8b0nshwLTbY7\nGpKNCPnkA+LYaQCQJUdSvo9cga2mZMPDfrPddWE5OfAzacnIHyYeF/0goP7n+FS72yHLHM+Lf14G\nqZ0Br1PcLFnuKW4Zp0jnZxoNmO+lcQ3qf6PdXp7sHX507KcFRF+suQYLQ2IyURRFGUgO9J7JgUAn\nE0VRlEqzv5Y8n2J0MlEURakwGuhxkPL86r0vOb0+h/ABBQos2Cw3v1u27xTlSI3ccK9rkOVy2dzI\ntuiXi99PARL9ss3FomxTqeSQgtQlH2cmQ3s5FBY7GOhe183nHUxuSZgOh2nTn3XtrKwjHjc34LnO\nYoE8y3v4PtEunw8AuEmUD4TkHpTXR97/UbkPwM8HDpu30Wj3/dfaKu81m5XjIBAwX79crnt7GB95\nmvN92jYHyXTYdC51b2yQz8s64m3SAMLrM9vddJCMYhAim5NsVtbJY5P7xuMx31kfPbNiQe6r5HJk\nnEBplW2HoBN8zulTzGP2BZW5FEVRlH6jKxNFURSl36g11yAlnd6zlOWlspNUkSezUpYSfAEpmZQK\ncv3c0S7ljTLbPQIIV3WfN6FE59glWfY6ZHLL5Uj+6ZDtCNCyP0/Sg9dH0QEcpAb+zOORv8BYjsvl\nqG9IMnHCRb/qCiRFuKgNw0dJ82MnEgnpc+Dxkm9FD7GUuK8AoL2dfCfomfB9hMOy/90O17Qp3XCA\n8qzwUOKxlmgnn6eE2d/VMSlJjT+k+/4LBmW7exN3atcuCn5Kz9Dvl396uC8yKdOPqkx+UCwTZrKy\ncz7+p8yhUj9MmvQDwOgx3UfC3Vd0ZaIoiqL0H90z2b/Yto05c+YgFothzpw52LlzJ+677z4kEgmM\nGzcO11xzDTweD3bt2oWf/vSniMfjiEQiuOaaa1BXVwcAaG5uxs9+9rPOIGY33XQTGhsr47WqKIpS\nCUwDjsHPgE4mTz31FEaMGIFMZvfS+9FHH8VZZ52FU045BT//+c/x7LPPYsaMGXjkkUdw2mmn4fTT\nT8c777yDX//617jmmmsAAD/5yU/wr//6rzjyyCORzWZ79dCiNXskpcY6ectBPx9tejuPaJIe2MWS\nLGco8kOBrMHiCVMiYWutxrruPYaZSNC0yCmW5L21JeTN8cq7tlpexOOW5WTGQUoj9aE6LM8JeGU5\nkZFtaGww5b14kqzOSPKoJSmysZbSswZNSYS999tSUfk9eX37qN0hyh/jdZB2dnRI2SVDhmrcv+ms\nrKM9YT7kjL97rZ0t28aOpcgIHlnOOwR48NMj4HcoSxZlo0bIZxiNmO0uknVhIk3SGHUft4Hbmc6a\n1wj6ZSX84z9foEoPkX3B4wowZcNKMRStuQbsjltaWrB69WqcccYZAHabg65ZswYnnngiAOD000/H\nqlWrAOxOOTllym4bvcmTJ+ONN97o/LxUKuHII48EAAQCAfj9DrOBoijKAcTyuHv1bzAxYJPJ0qVL\ncdFFF3X+CkokEgiFQnC7d3do10xgo0ePxuuvvw5gd1TMTCaDRCKBrVu3IhwO4+6778b/+3//D488\n8kif8xoriqLsbyqdA/6zwIDIXG+++Sai0SjGjRuHNWvW9Hj8zJkz8fDDD2PlypWYOHEiYrEYXC4X\nbNvGu+++i7vuugv19fVYtGgRVq5ciS9+8Yvi/OXLl+PJJ59EOp3GkiVL0BDbc5tVIbmurQ07pAd1\nCKLYldYUOzrKObmhRl5jRKM5ZxtB6SiwIMsGHEDR5zHX5/xZwCfr4OCFZQoimMh2H7gQANz0Y4oD\nJhYo+J6Lgl5Gq8wXqKFWnhOiwIIc7djj6j6yLgB4vBQ9l+Sjole2I0gpXzkoo22b7a6rktfw1sh2\ncf+ar5tZZ3VEtpMDIjIBH4/nngOX7opLOYiDjvL4jfjlAT6HwJo23WtthKJzU/8VHJxuu1JfY36W\nJTUzT2MvFJDlSKD71NiAmbulUr+vLWvoyVwDMpmsX78eb7zxBt566y3k83lkMhksXboU6XQapVIJ\nbrdbZAKLxWK44YYbAADZbBavvfYawuEwYrEYxowZg6amJgDA8ccfj/fee8+YTCqVR1lRFGWfGGSr\njt4wIJPJBRdcgAsuuADA7vSTf/zjH3Httdfi3nvvxauvvopTTjkFK1euxLHHHgsAnVZcLpcLf/jD\nH/CFL3wBADBhwgSk02nE43FUV1fjnXfewbhx4wbiFhRFUXrNUNyAP6B+JhdeeCHuu+8+/OY3v8HY\nsWM7Vxhr167Fr3/9a1iWhYkTJ+Kyyy4DALhcLsycORM/+tGPUC6XMW7cuF6tQLpa/+xskw85mzdj\nc1WH5TKe85PkyRKIj2+ISLMep5wpbZSX4r3NlByILFdYnmuOm5t3vJ/HMkzEJ9vBliyJLCWl8pvy\nEcsqHSnZnxmSIuqiXEfP5jM+khk5pwfnsWApDTBlKT4mQrJWnqy/yj3kaQFMi7AgyYr1lJukJiAd\nCJvTZtI0zgPCMiI/Y5Y203k5jnjsAmZ+kjIl/uJrsJVfyGtaz3ldsj93pWR8ukROVsrvEF/DaW+a\n75UlQDfda8Db/TgCTImvUgy2/ZDeMOCTyeTJkzF58mQAQFNTE3784x8bx5x44omdVl7MkUceibvv\nvnu/tlFRFKU/WLy5OARQD3hFUZRKM0Ay1+OPP44VK1agunp3qJhvf/vbOProo7s9Z/HixTjmmGNw\n4oknYu7cuWhra4PP50OxWMRZZ521z/vNQ2IyWb1272vZVNLB4Y1CcIcj0vqlulqWAwE5cN7NyO8z\naadfKXIJnkrKvPIcIp1jYgWDZp2RiHycLe3ymDhZ8XD8r1BQlqsi5lLdS+1IpijEfEL29Y5d8niv\n16yT4zK5LAprT5IBZw1IO6SmjcelpMfOkn4Ko85R2dnxtLXNlCr5mQwfRrGi8qaM1ZWWDvOzjz6S\n0hjHS+NnXF3VvfWX0w/kIr0OCXqGfO/5PEl+OVMaBuRnVVWcslheY8d2mTaAn/GoUVImA4DaapJU\nyVm4p
ZXe27DsK7b2AoB8Yf94LVbaA37NmjVYuXIlrrrqKuO7s846C2efffY+133ttddi/PjxSCaT\nuOaaa3D66afD4+n71DAkJhNFUZQB5VO0AV8ul/Hwww/j7bffRn19/V4nimw2C7/fD9c+tl0nE0VR\nlArTlw34OXPmdP5/X9wa/vKXv+CFF17AuHHjcPHFFyMSkZGQX3/9dWzduhWLFi1Ce3s7vv/973da\nyALAAw88AK/Xi23btuGSSy7RyaQ7usbBat4pLWo41hEAhCgrX5kkkM0fJ0Q5FZd11tbLhxkImt3M\n2eycjukKhxpvazEdx2rrupdVOJsdh8b3emS2QY+DRlKkUPgZCntfou/Z+Y9jkgGmrNLWJvULMwul\nLPM1AfNl3rpN9h/LEOFI9/3vFI6f5bkPN3Xfbj6+WDSfIfu6cQh/Hq6csZDva/gwU5IKS6M9JFKy\nvG2rlFw5HLzLwZOS0zAUKNuoTWMtlZR9VUWZLtvaTVlx5055jVbKgNq2Q+qGviDJ07WmdBar53em\nQuGZ+uC0uGDBgr1+d/PNN6NQKCCbzSKZTGL27NkAdlvCTp06FTNmzMB5550HAPjtb3+LX/3qV7jy\nyitFHe+++y5OOeUUuFwuxGKxzlBVn/CJzBWPx3Hrrbdi6tSpaGho6HX7P2FITCaKoigDSaWsuebP\nnw9g73smNTV7QgWcccYZuPPOO/f5WtXV1Rg7dizef//9fZpMPj3CnqIoymDBZfXuXz9pa2vr/P/r\nr7+OUaNGGcdMnDgRr7zyCmzbRltb215DWuVyOWzcuBHDhg3bp7YMiZWJu8tDi1CGw3g8x4cb0hdL\nHJyt0V0rl8q9kbCqqB1ZkiuZBYsNAAAgAElEQVQ4M2A6IWWa6pgpaTll7utKiDIttrVKeW7bFilv\n5B2skerqZLtrqmVftJFywxKUkzVXKEgxlUiHSaVlpS0tlEEyYPavj6ygiuT8V6ZnnOigrIl0fixm\nhs5nmbBEWRL5Gmxh5iRz8VhraJDyD/dVjsLFs1VaS5sZn4qlMn5GNTHZ/9Ea2Qan/QD+KBqVY42v\nwRaSXsrY6SQr1lTL5xyNyjrqSFpji0eWBAHnjKWVYKA84B999FFs3LgRlmWhoaEBV1xxhXHM8ccf\nj3feeQezZs1CfX09Dj30UPH9Aw880GkaPG3atH2OKjIkJhNFUZQBpcKmwV2dvbvySZ6n7ptidUYR\nYebOndvfpnWik4miKEql+RSZBg8UQ2IySXdZ3rLTV8DB+S+ZkHJEgSxVPLQ0dpG85KYlfUcbpeCD\nKSXUkowyfDhZVnlkpsCSQ3RtHr/sUOgiLYKdujwkNbCk4lQHOxAePILqpJhLHUmHbHc2y4ryGLYA\na2oicyQHslnKxkiyC99HNis7lJ8p17e7DlnmdnaQdOZ2s3RmWg7xD1qWh9JSmTT6KhruPmvi7jrk\nvYTD8h04iN4RrqM9bg4+lmmTSWk5GAhQmgF673K5nvMSBcjpMEpOjFUU9n77DmmFlkyYknY4Ylp4\nVQINp6IoiqL0H81noiiKovQbjRo8ONm8cY/5XNPwavHduDGmZDLhYLlE5VDjbDHjdch62BWnbHmp\nrKwzlZZ1cJbDIMkZsWrzmhyuvERh1TmKQiQkv+cMh5xhDwCKtjwnlSOrKcqgx2HDR9SbcgZn3eP+\n9VNmP86Ol82bL66PpMiaIFlruSkEPYW15zD3Dr6tqPJLGSVVkFLauo2yHAn3/Gs1QMpXdUjeezon\n7zWZJkmQjM6cLPwiJGvVROQ1OAVAOk/xwULmn42SLT/rKUMk05GQJ4RDZrsPilHGR2pntiDvKxqR\nnZkvVsghsRdopkVFURSl/+jKRFEURek3ujIZnBwxtbHz/2wtk8ma+gXHLvJ5u7dIYmmH8TtE7GZp\nIUSZFVmySpIM5hS+PEMWR2yBxNZGLN3YJGHF0+YL0UJeiV66t4PqZaVs8dSWNK1cWNbiH3VteYrF\nRdZf1WEHCzG6t11Fqf/wOAj7ydHUI8s+j2nB1JaWssmWZtnO+pg8fngtyWI5sy9aEvIzzgyay7Pl\nmzw/SwZLLHs5sWmb7Iy6GnlNltpYggWAbErWwZG1OKZbmCwFa6rJItLhbzFnz/RytkwabHlSaSNB\n811nGbZiQUHUmktRFEXpN7oyURRFUfqN7pkMTiJdLEOCZNDhZInFn7WRpcnOFrl+LuSlDMAxgZys\neNJZtoKS12Q5o1ikWFJJ09KKx29DPVkTOVjIdMVDK3O3y+wbTz05bBpVkpMd3SdfAwD8JCOyxVLE\nCBEmv29PmO20STrj/uR210Tlq1Auy3I6Y14jQNIky6NMc0I+Dyd5tED6EGcCZEfScEB+72TZxrD0\n2BjjuGXy+2SGsyaadbKclqf74BhiDPdla8o8fvN2WY7VyA7nNrBjb0uHk8Nst83ad3RloiiKovSb\nCsfm+iygk4miKEql0dhcg5OPt+6xoilQeHin+EgcDymblVY46bSUmJp3yMyLdY0y0yIQAOOUcbAr\nLEmxJOLnoFgOsDRTIGUsRLGOKNI7PA4Ob14aMduaOXS+vAjHQuP4SYApgeRINuQfeakUxX1yiK/W\nWMeylfy+tY3tjShmW6DnX5aFIksxVrffc386xc2KJxxMpbrWWSCnUXJaZEu3WNS8D7bGYktBlofY\n2o4lVwAIUn/5KNUAZ+TkGGOhIElpDnsO/M44hdfvCo+1aMTJ6m8/rSBcas2lKIqi9BddmSiKoij9\nRvdMBif1sT2SUIBMPto6zKUySwW8rK+tldJYXZ0st7RIz7H2dimTAWYmOZZqyuXul8lOzmgs5eQL\nUlrYuVO2y02/nmpqKQx+k9kGjt+Vq+4+hhX35a5mlpeAVMr8rCsHDZNWO6H6nj3xWEIynBQp/H4q\nLceBTf3vJO3wvfKPUc4eWEUyC2eQBIAd22S2y9FjZSy5+hg7McrzUxm2pjMuYVqyVckyW9exNVc6\na/6hZNWV+99NFy2wdkY4yXPhgDwnXyTJLyPPSSTl8dsdHJRNKrSiUGsuRVEUpd/oykRRFEXpL+UB\nCqeSTCaxaNEi7Nq1Cw0NDZg1axYiETYAksydOxczZ87E+PHjcdVVVyEQCMDlcsG2bZx//vk47rjj\n9qktQ2Iy6bpkDpJMU19j/oIok4VHS4fsJs5gOKJJHj9lvJRhWhJmN3MMJV71t5P8FqZw8aObnOQ5\niu9F4eF9jVIuqqslmYvihbksU4roSLE1kTyGLWjYUqs6Yr5kNTXdW6Y11HLcMvl9POngtEiyVIzi\nTbGMws6sbD330TazXVXkjFpLaQHyFF8tkZbnc6ZAAKiZXCPK7OTJUk+I2u0ira0jbj5DjlsWDNA4\nIaddlr06pPEiAHM8s+zF1lpFsvbijJFcH2CGdWeZkaU1dhZurO25LypGhWWuNWvWYOXKlbjqqqvE\n58uWLcMRRxyBc845B8uWLcOyZctw0UUX9anuH/7wh6iursbWrV
tx++237/NkMvSEPUVRlP2N5erd\nv36yatUqTJs2DQAwbdo0rFq1yjgmn8/jvvvuw6xZs7Bw4ULk8+YeLgCk02mEw/uexnhIrEwURVEG\nknIf9kzmzJnT+f/p06dj+vTpvT63o6MDtbW1AICamhp0dJjhxJ955hn4fD4sWrQImzZtwo033ii+\nv+222wAAO3bswKxZs3p9bUYnE0VRlErTh1XHggUL9vrdzTffjEKhgGw2i2QyidmzZwMALrzwQkyd\nOlVe0rJgOUxia9euxVe/+lUAwOjRozF69Gjx/Scy1/bt2zFv3jxMnjwZgYDpaN0TQ2IyObhuj7tt\nIifFXN4fAcw0pfx8hjXIDwJeqcVm6Hz2PAfMoIBJ0tPZy766Su7D5IrmYGWTT24n36uf2h2i/BAR\nv7kcZuf+ETHZn2nKPcIexpxjAjBNTUfVScGcc4nsTMiBzumJAaAjwSayFDCRPLbZjJe9xFnTB0y9\nnc/hgKFNMVnmcQMA8bTsv9a4/J7z3HCOjyL1r1PEgVi1vG7YJzehSvTMEhnZJg7K6ATnWeGcPpyv\nhANUOu1lcF4VzjnTFJXH2+i5nW5rP+UzqZA11/z58wHsfc8kGo2ira0NtbW1aGtrQ3V1tVM1vWLY\nsGGIRqPYvHkzJkyY0Ofzdc9EURSlwpTd7l796y/HHnssnn/+eQDA888/77h5PmnSJPz1r38FAHz0\n0UfYtGmTY10dHR3YuXMn6uvr96ktQ2JloiiKMqAMkNPiOeecg0WLFuHZZ5/tNA1mZsyYgQcffBCz\nZs3CiBEjMG7cOPH9bbfdBpfLhVKphAsuuAA1NTVGHb3BKpfZwG7wsWp9e+f/O3JSX4oFknw4vC6p\nFWSKUlZJFUhyKpG3NKW/5eU4AET9MtIdn5PMS5vPVF7O+x63+dhqA9LdOeBxttrovEZB3hdLJk64\nXWw+LL8PeWQbgm5ZdsPsi0RJamf5krxXy5AiJH636UHPkp5dlv3rdsl2lGz5DPma2aLpdV+kOrn/\nWiitL8tahaLZ35y2d1itHIv8jLmdmaLUkzwuU0rz0fjmscaUwRKg2W6Wxrxued1ID1Iaj/+Qw9jl\n8VzlcrBR7kK2LN/1vG2aoPO7fsSEpm7r7C3JV/+nV8dFTjy7Itf7NDAgK5Pm5mYsXrwY7e3tsCwL\n06dPx1e/+tUeHW4++OAD3Hrrrbjuuutw4oknAgAeffRRrF69GuVyGUcccQQuvfRSx00nRVGUA8YQ\n/Js0IJOJ2+3GzJkzMW7cOGQyGcyZMwdHHnkkVq5cuVeHG9u28dhjj+Goo47qrGf9+vVYv3497r77\nbgDAD37wA6xduxaTJ08eiNtQFEXpFWWNzbV/qK2t7bSFDgaDGDFiBFpbW7Fq1SrMnTsXwG6Hm7lz\n53ZOJn/+859xwgknYMOGDZ31WJaFfD6PYrGIcrmMUqmEaDRqXI8Ju/eYSgVC0lIobJkyVxFyOVxw\nyW6qD0qpIeRKiXLGlnlmd2RMDTJNsglLDwdHdoqyC1I2SNqmc5HHktJNkYMV2h4qk7d0RrbJSZ4L\n+WQ7fW4pPfA1dhXkvfvcpjmXhzztG3wtouwty2t0lGtFuS1vho8Ie+Rz9lD/hlxSZixY8pmXqO8C\nPtMlm+WfTImszEgGY0k1b5uvXyNFCIj65DleS8pFIVtKPWWyOsu5jJzHBlEvWTj2cF9OkRFyJVkH\nS2c8vlki5LFY7ZXvlNN1k7Z87gWSsVoy8t7TebO/a4MOrvaVYAjmMxnw6XPnzp348MMPMWHChL06\n3LS2tuL111/HjBkzxLmHHnooJk+ejCuuuAJXXHEFjjrqKIwcOdK4xvLly3HNNdfgsssu2/83pCiK\nQpQtq1f/BhMDas2VzWZxzz334JJLLkEoJH81dHW4Wbp0KS688EIjztD27duxZcsW/OxnPwMAzJs3\nD++++y4mTpwojuurF6miKEpFUZlr/1EsFnHPPffg1FNPxQknnABg7w43GzZswP333w8AiMfjeOut\nt+ByubB9+3Yccsghnd6Zn/vc5/Dee+8ZkwnTUdizHI56pWwQKJrL6axHSkhs8cGylseW0oMFCjLo\nYDGWZksqWvbnbCkTsANWoWRapnhIlgq45BLeJomkSDlVSpQWNew1LWoO8u+Q59AQait0b1bo9Iqx\nZVW8JB2vfC7qX7JgGhGQbQKAUFHKP3mLpBpbXtPjkhJfaykmyh150xlsWLBNlCMeOS4KZCGWKcpn\n6mSlxlJOB0l4LBO6POSAWJThNCyY4zvvln1RXZJ95baL3ZY9Rcq5C6AjfJAot7pl/3Ff8DvSGJTt\nDllmu236A52DfGZsqVkflHWEq8gzGKY0BgSNY/YFlgqHAgMymZTLZfzsZz/DiBEj8LWvfa3z808c\nbs455xzhcLN48eLOYxYvXoxjjjkGxx9/PF5++WWsWLECpVIJ5XJZhAlQFEX5tKAb8PuJ9evX44UX\nXsDBBx/cGVvm29/+dq8cbrpy4okn4p133sENN9wAAJg6dSqOPfbY/d5+RVGUPjEEJ5Mh4bS4bsPm\nzv97ICWTsoPwYh4jl6ztJSnlsAzGzlG70qa1Ub4ol/2NESmFNXl3yWuUpAVZ3CVlBABoL8j8qyyZ\nVHnkMp+lhDykDPNRstG4RtAr75Xlijw5cDYG20U54DIlkpwtZZd8Wfafj+Q5N1mtsUUTAFhlee8J\nt7QAK5bl76iIJetgi74WB/mORw7LXDYd0RurKHbYZIdMtthjqYyt/rgNgGnlF3VJiYnHQd6WclLJ\noc4dKSkD7uyQ53CsrcOaZNCxYT5pvciSFgCkyYKR77WNpEjuO34+AJAqyr3bYw+rNY7ZF1rffrFX\nx8WOPLUi1/s0oOFUFEVRKs0gs9TqDTqZKIqiVBjdMxmkDO94t/P/tkvKF61Vo4zjk2UpFwUtKQ81\nYasoZ11y+e2xpExwWFWrcY0cpOTBclFVRspcgaQsx/JrjTqLYenAubVGRgZIFKXcZlF8rxKk/OEU\n1ylIzoAxt3lvXeH7tMumM1ekLGWWYD5uHNOV9sAwUW6DGeU06pKWVmFb1hm3pJzBz5ydAxu9zcY1\n2E+ArfoCBSldukvSOs7pD07eK2WXnFuWvTY5YxZlnSmvHAPRnBw3AODLUl9UDRflgC3loNqt/5AV\nOLS7fsRRojwyImXYkcl1ouxKy/vo8Mj3sMMyZVyOt8ZWgDGfHEc1JfnMIh2m1Z9t8Xj8onHMvqDW\nXIqiKEq/0ZWJoiiK0n90z2RwsjO6J2sYW4D4S6YjE8s/bFnCS2NviZwDKS6P2zZDpHvomNaCXNYH\nArJcckvrmKLbDInuI2eyqpKUevhps0UTSztjQluMa7Bs1W5LKyeWIni572TB1EbWcQVLSh5VXvmM\nApCWbY0ls53c5/GAlML8ZXpmZKGULEnpMm859LdLSkx8bwU3WUV5pEMcjxsACKdlXDIf3XsyUCfK\nRRdbWslnmvFK+c6pH
QWXbCfXmWg6TJStshmzjZ9zXXE7HSD7pkzvEEuCBb8ZFn9nUT7DjR2yLxrC\n5KTolRZ6HVUjjDq53fuWycPElM8GP0NiMlEURRlIVOZSFEVR+o1uwA9SqvJ7LI5KZM1VdJkxrtjh\nLVygMN80UBJeKUllyQkvS3GhAMBNUkE1LcnJuMuUTNxmDKG0WzptFcj5L1uSdbCzpdeSso2Tw1s0\nLy1kGkmecJF1UdEr29kelJZYu0+Sxa7Pa3clspjzSAsnJ0mhTCkgXfRMWe702HTvbpI2y2Zf+Elu\nY9hCL+uSfeFksRSqks+wKidlr1BeWiwl/dIqjcP187gBgFRZXsNDHey3pbTGVmu2y5T8jPHp6t5a\nkS3bUj4pMPHzAIBDC9Kq7DCvfIfK5AhcoDhbLttMf5DzmqkcKoGuTBRFUZR+sz/Cy69ZswZ33XUX\nGht3R6Y44YQTcN5553V7zsqVK7FhwwZcdtllePzxx7FixQpUV1ejUChg8uTJuOyyy4zo7PuKTiaK\noigVZn9twE+cOBFz5szZ5/PPOussnH322bBtGz/84Q+xdu1aTJkypSJtGxKTSSCzx6qp6JMSST5g\nOrzxQAilZNwgd15aTUU80okRFO7M9phSQy7Qvd2ItyAtU8pk/eXymVZRHDo/RI56cEnrlwzJcQ0U\nvpzvGwBKHoovVaLYUeQQ5y7Ivhre+rFRpytLodtrZEywdJWUxsI5aaUWbDOtuVxp2Y5SVfcxl1x5\naVkVqZWOfCzXAUAgLeW4bFBegy2WqhJynBT8pqWVLyNjmXl3bTaO6UqoQSaHi0elJZyTdh/LyGfg\nydNYc0t51KJw/VbJtE70hilzJVmMhZPSYdBVkBKhNyjPdzlYQHrbpYWY1ULjk9qJmBlbjom46Y/+\noZX5w3og90yee+45LFu2DKFQCKNHj4bXa0r5xWIRhUIBkYgZN3Bf6fVkctttt+Hzn/88zjjjDPH5\nj3/8Y9x0000Va5CiKMpnnb7smXRdafSU2O+9997D7NmzUVtbi5kzZ2LUKPnjoa2tDY8//jjuvPNO\nhEIh3HbbbRgzZkzn93/605/w4osvorm5GVOnThXf9ZdeTybvvfceOjo6sHHjRlx66aWdOtu6det6\nOFNRFGVo0ZeVyYIFC3p13NixY/Hggw8iEAhg9erVWLhwIR544AFxzPvvv4/Jkyd3Jho86aSTsG3b\nts7vP5G5isUi7r33Xrz00ks45ZRTet3W7uj1ZOLxeHDHHXfg/vvvx7x583D99dcjEongsxDBvjU6\npvP/HbaMXVRXNmMXRVPbRJktlFwZaXllffRPUS62krMgx98GEKmV7XCNHC3K+YaDRTnjk5Y/bNEE\nALH4JlH2JaUlUFVIWvFkg1L2ylFcqG01k4xr+MtSjmjc9ndRdsXlNd1ZebzdTn0DoJiVkofLt0GU\nI9R/7lqSkxpklj8AsOleUzWyP1lG8eW7j6PlzZrxwoo+KSv6c/IYlnJYLvJt+cCoM7NmjSwnKW3A\nKHmv7qh8hjXN78vvPzavYeCn+GkxKStmYlJK8+RNR98iWXPFvbJdbbVScorYFEer4yNRZlkXAFKN\nE0Q5QtIkErJOtMuxaCfMZ1hokVJl8IszjWP2hUpYcz399NNYsWIFAOCmm25CLLbnb8DRRx+NJUuW\nIB6Pd04cfcHj8WDq1Kl49913KzaZ9OmOg8EgbrzxRhxyyCGYM2cONm3a1Jm3XVEURdlNGVav/nXH\nmWeeiYULF2LhwoWIxWJob2/v/PH+wQcfwLZtVFXJfbdDDjkEa9euRSKRQLFYxKuvvurcvnIZ69at\nQ1NTU2VuGH1YmXxyE5Zl4YILLsDo0aMxb9485POmPbiiKMpQxslHq7+8+uqreOaZZ+B2u+Hz+XDd\nddcZP+Zra2vxjW98A7feeitCoZCxJ/LJnkmpVMLBBx+ML3/5yxVrX68zLb788ss4+eSTxWcbN27E\nG2+80aOt84GmY/Xyzv+XXXL+tBwcmZIROVtz7KL6lvWi7CbLoWJYWmqVfKZjFMfvSgalVVnOJSUn\nC6b1FhMuyGV+ICfLFluZcQyxopQNHEOk++WSOpCRUoI3IWUD2y+telrrDzXq3FWW/V3jlhZN0Yy0\nBLJdPf8G4gyaKb98JtG0tAzypWS7y255DZY6AVOKYSsnthAzpBu2JAJQDFCaAHpmbE2XCJPlm0v+\nUq3NmWHXGQ/FdGOLMpbzWqulJAsA7rJ8j8JZ2Z9p6n9OAVBrS7k5SGMXMJ+BuyhlRO4bm6zS/ElT\n0mZLwkrJXO9v2NTzQQAOGW/25WeVHt/K//iP/+ic/f7yl784HvNpn0wURVEGEg2n4sAXvyiTxSxZ\nsgSXXXbZfmuQoijKZx2dTBw4/fTTRfmXv/yl8dmnHVcXy5wyxcRyki9qWqQ1ke2l2Fo9KIPunLR2\nYcc+AHDlpLQQan9LHhCQMldypMyamPP17GzEFkmelJQv2MnL1SElK6f7DESkzMVhIyySdjgmVphk\nMQBw+6VEUtUinepYckpGpEUTpwjY/ZmUkPxFCmPfThZ7beQARxZOZa8Zj4olKIOMlFAQks8sUT/R\nOCVPUk1N24eiHOiQ7QxsfU+2M0iS1IgjjWv4yXKNx2c2LCXXIrWpKmtmnQw3y3ZyNkZ/QMpeDTQW\nS0Epz+XDZtwydgrluHocOt9XkjJYiqwXAcBD74jpmrpv6GSiKIqi9BudTBRFUZR+4xRlerDT42Ty\nzjvviLJt28ZnlQoUtr9w/+O1zv+7ghS+fMRY4/gSOby5c1KuKJHslWoYJ8rBOEkoZDECAHZAyhFu\ndmxsk3JQxCud0ZxELq6zGCRnJpa18hRCnSQSpEwnL3RIp0PLR/IPyUGGox5ZCgGApyAlKDc5hRar\npDwRyMt2+RNmDLFstXS8aw/JchXJKhbJnWyFxvU5wffm9lDfkDVX0CH2mY9jn5EciiQ9E498hVku\niiRNa64CWWdxmVPORtql7OjKOWQnpedsB+R9uOkdsEhS9dBY8+w0Y7iFwvLe4gdJp9pY81rZzqR8\nHvkm03LKdgjRXwl0ZeLAT3/6U1GORCLiM8uy8JOf/KTyLVMURfmMopOJA4sXLx6IdiiKogwaymWd\nTAYl+V17ltSBCdJ5imNJAYBru4wTBM7S56WMhX5aKqcppDfFpwIAN8lDdlJa2JSLFPab6iylzTpd\nAWoXSXpWkGxVyDILOZK9nKIbeCicNTvilSgMOMlinlbTcQw1ZGVD8psvQ9ZHG6W1XWarKRcFhktn\nvhGjSOJgSyyK2+Qmh8JQg9nfLIdyncWItD5iy0GWhgCHVAMkKdnN8l5dISlReagNHp9Dls+omXah\nK+yk6NkqLbXKaQfZdtzh3dbpaqfnTnJoiWJ32T4z9hynAQhkKc7b+1J+L+Zkf/vKpuNvmcYnpp5u\nHLMv2LoyURRFUfqLbsAriqIo/Ub3TAYpvpOndf7fNqxOzLhZLD+42aGNsCkMeJ
mW7ByjCQBsdoKz\n5RKc432xfGSx5RUAK05LdpacolJ2KZCVVCEow+L7k6ZzGlsXFcPynBKFsef4X04OnJyN0ZOUDm5W\nmsLDR6RVjy9m9q+LssuVQ/IcttZyU5kt2Sy2qgLgJse8LGVn3BiSjqY7UlIWGx42LdtqIGVXV53s\nLy9b7JHkyjJZySFDZC5A6Q/IkddDz8Oul06iTtaJbpIiC9VSSrMPklaTfLxVkJJUKWhmIjXuDfId\nKn/uNOMcebxJdpx8JyrmtKh7JoqiKEp/0ZWJoiiK0m90ZTIEcLVL6aa8/R/GMZZXdkt5rAybzk6K\nAQptbcTAYrkJgNUiQ6CXmmW78imyCKM6ihlT2imx9Uq1dG301kh5g621PCylVUtpCACseinheXdu\nEWVXBznucRa4egfnP5L4QHJdqVVKP5ZLykveOjOOk+FMuYscSeneyy6KMcbxvhwy/1mU2S8Yl/Lc\n+BHSKi1QI517PTDTH3iKpgwomsGWbXmSpHxSqOG4ZoBDFsm8HGv+jTLbY+Fjad2YTcg2AECZnG79\njVJC9VB2TI49x9Zdfgc5tBiRz5klPE4vEWih7I1byUoTgJfH3sRjjGP2hZ4TRgw+htxkoiiKsr9R\nay5FURSl36jMpSiKovQb3YDfTzz44INYvXo1otEo7rnnHgBAMpnEokWLsGvXLjQ0NGDWrFmIRCJ4\n8cUX8eSTT6JcLiMYDOLyyy8XeYxt28acOXMQi8UwZ86cXl0/WbfHLNETlZq9d/g4PtzwZE5ER4ny\ndmuEKFc1SJNQf530WvaWzP0NL6dKHUOmqGzCTAEAvWVzHyZEuS4KUbm/sbFaavYHZaQnOQdMzFbJ\n8wFgW3B8t3V4SH/Pkalwzmfuw3R4pRlpoSzNemvKcs8kUJCaPeexAIAS6eecgphTMVflyCSX0jln\nHdrtomfkzye6/T5sy2ecdZnm3Twu8n553R3jzxDl1oLcByva8r5H+OVeEQDUdUiPds790jr5C7JN\nh8u9H07vDABeCtbJewYcS4FT7CZDDaLs1DcuS/ZnoSyfe6BM711I7tvUcMQCmCbJlYLjtlaCLVu2\n4MEHH8SHH36I888/H2effXbnd3/729/wi1/8ArZt44wzzsA555zTbV07d+7EnXfeiXvuuQdr1qzB\nXXfdhcbGRpTLZUSjUVx77bWIRqPd1sEMiLB3+umn4+abbxafLVu2DEcccQQeeOABHHHEEVi2bBkA\noLGxEXPnzsU999yDc889Fz//+c/FeU899RRGjJB/zBVFUT5NlGH16l9fiEQiuPTSS/H1r39dfG7b\nNpYsWYKbb74ZixYtwksvvYTNmzf3qe6JEydi4cKFuPvuuzF+/Pi9pmjvjgGZTCZNmoRIRFoWrVq1\nCtOm7XYmnDZtGlatWsn/rTUAACAASURBVAUAOOywwzqPPeSQQ9DSsucXY0tLC1avXo0zzpC/zhRF\nUT5NlMtWr/71hWg0igkTJhhx4z744AMMGzYMTU1N8Hg8OPnkkzv/nnbln//8J2bPno3Zs2fvdbIo\nl8vIZDIIh82VYU8csD2Tjo4O1P7/5oI1NTXo6Ogwjnn22Wfxuc99rrO8dOlSXHTRRchkTG/krixf\nvhxPPvkk0uk0lixZIr4reaS80R6WEhUAfJiWK59yTj70WEDKLG0F6a0bpCX8sNJG4xoW5DqYU8C6\nyeTTKknZxcmTPFcjPZU3V00yjulK1kfLflJyUgHT5NZnSVmAveQtMm3N1kmpIeU1l84sa3Xk5Q+P\n5pJs56iQNKv22+Z44F99NQn5S41z1HBq5ni1HANOUlrjljdE2SLz43KDfB65KukFnrZNf+u8Xwak\n9FtSUsqV5fhNFWS5ISClNKdfv3FKe5x1yz8cHSX5jBq8MidK2Wc+wzjkO1BXkudEW/8pytz/1TSe\nPSEzGGWbR8qu8aJsd7Yox9rBwa2inI6a7zqbSZvhJfeNUh8miq5y/fTp0zF9+vQ+Xau1tRV1Xd6z\nuro6vP/++8ZxDz74IL7zne9g0qRJeOSRR8R37777LmbPno1kMgm/349vf/vbfWoD8CnZgLcsCxYl\n5HnnnXfw3HPP4Uc/+hEA4M0330Q0GsW4ceOwZs0ap2o62ZcHoiiKUin6supYsGDBfmzJblKpFFKp\nFCZN2v0D87TTTsPf/va3zu8nTpzYOaktW7YMjz76KK644oo+XeOATSbRaBRtbW2ora1FW1sbqrs4\nt23atAkPPfQQbrrpJlRV7f65vH79erzxxht46623kM/nkclk8MADD+Daa689ULegKIriCGc42Bee\nfvpprFixAgBw0003IRYzlQIAiMVixnbA3o7tDccee2ynoVRfOGCTybHHHovnn38e55xzDp5//nkc\nd9xxAIDm5mbcfffduPrqqzF8+J5l6QUXXIALLrgAALBmzRr88Y9/7PVEku4i52zOS/mikDa3jUaG\npEe7l2xRwnnp5R1KSNmFvY5T4SbjGqG0lIfc7DVPeUM8CeldjYwZbM9DaWJrbXkfNc1y6csBKDn/\nhuU3g+3FS1ILczceIcpsKRRKSOkn3G56IdskPTbXHSbKvhIFgiTv9azXTGLM8k6aZJNIgSyUKLBm\nLckwllOqWs5T45dSGQcRjRSkZ3+e0/oCKJTl2ImVyMvekhZjkZAs5yDbkC6b2nfWJeU1lhUti7zZ\nyUoqHJfyEQA0UPBHTrXcVjdBlNm6LmFRniHL9CEPl+W9+rxy/O4sy2f8XkJaYU6MmBEHqjIO+XUq\nQCVMg88880yceeaZPR43fvx4bNu2DTt37kQsFsPLL79s/G0Mh8MIh8NYt24dDj/8cLz44ot7rW/d\nunVoajL/ZvXEgEwm9913H9auXYtEIoHvfve7+OY3v4lzzjkHixYtwrPPPttpGgwATzzxBJLJJP7r\nv/4LwO4kRQOxDFQURakU+8M0uL29HXPmzEEmk4FlWXjqqadw7733IhQK4Tvf+Q7uuOMO2LaNL3zh\nCxg1apRx/pVXXtmZcv2oo44S332yZwIAoVAI//Zv/9bn9lnlciUWZJ9uNn7wXuf/jZVJyVyZDA/J\nVcNArEy8CQr3TisTV1r+KnNamRSbDhbljtoxotzXlUlH9UjjGjsgNzFr3fLXNq9MOLy85WA40OeV\nSanvK5NAUfZXpENuyHNofZDFTK9WJhxGvVH2X5yeR6vHHBe8Mmm05crOZn8Zt1yJ8MqkWDZ/L7rI\nC6Sj0P3KZGz5A1F2WplwWHpemfCm/76sTIJleY2CJVd2O/NyZRLPyXE1MSLHJmCuTGJHfN44Zl94\nanX3MdY+4atHe3s+6DPCp2IDXlEUZTChaXsHKV1/ldb6pDZeXW7jwxHMSJ06ESTvXPolHCJLNC95\note2ml7IvENX9shHYSWoXRTdtMz52x2IbXmbrinryNSPEeV0QK5MSi7zV1M95C85i9rFexPBLEUR\nLlKeeQAW5UY/aAPpubwqo0RXoRrTU
z8Tkb/6M+TB7g3LX85un2wXJ2Ly+MyVYLlG9k+RTMKNfO4U\ntcBrmb9em3PS7LbOJ8dFNCGjNLP3Oq8ACm756xwAqunX+DBa6RU9ck/FopVMNmya7SbqJ4qy25b3\nFszL945Ncqts+c6kyHsdANpc8j0s2vLe63xyrB3slu+xP2+uLvcXg1/vMRkSk4miKMpAooEeFUVR\nlH6zPzbgP+3oZKIoilJhVOYapAQye/YfRuc+Ft95Wrfz4SgHKIPbMKnNsk7tSVAmwB3S2qVcNi1T\nLPJJsGgPpNAkw2owJa8Z+KFIVlF+0uzTtI+w1SOv0WTLdlu2GVG13SX18saC7E9/VurURcqG5zZN\n/WFx/+RpX4X8SuDr2VIm7peaO4dc4SjANvnUcFThVNm0GKuG1Oh5XHhL8j5qm98T5WjKzPI5vEo6\nmxWCcg/F207jlayign7Zzto2GcYEANxZc/9HVEkWd4Vq+czbomOMczps2c6ROWkBFmrZJMou2icr\n076jZZsDJUVRbO2yfIf85e7DLIV3mX3hokyhqJA1V1/CqQwWhsRkoiiKMpDoykRRFEXpNzqZKIqi\nKP3GVplrcFLqEl7cx57mDp7k5bDcv/jYI7MLjirK7IJ5Cv1eaJRxiDwFU8v1paUfSTEgte7WqAxX\nXrSkT0OwJMPgA4A/Lz9LVklvdc5oWG2R7T+Fj/flzWsUw9LrOOE3/QG6whkLk5FhxjFpCoUfqJP9\nXbtjXbfXaI1NMD4rQvZXtCB9KzhbID8P3sdJ1pjhKTi7INeZC8h9mHhMZvX0VpvjwkN1pINyD6V1\njGxHY/O7ohxMyNDvRb+ZIZIpU4wwF8Ut81DcsoY0+S8BqKeIDTny42GPeA/HonPJvZ+yy/zTNHzX\n30SZ6+C9HU+SninvxQGwm8znWgl0ZaIoiqL0m5JpczPo0clEURSlwqjToqIoitJvVOYapIS37NGV\ni1EZ38dlmVGDXSmpEU/c+owoWy1k688Rfg+S+x276g43rlHPcZrIf6Ca/Eg8pM/7tkg7/t0fSt8I\nOyzt8iN+aVPvIn3evV36jNgJM5VyUz2lTh17rKyT9kjCm9fKctH0Eaml+F6lbdLfxS7JvnLXyPtq\niks/HwAoU/wui/fGUhTvy5bXQED2f7TFjK9WbidNniIN+w6SUZw5NprFMcccrutvlc/EKpL/RZbi\nTdH5HtAeIYDyTkovTNGjmUKLzKVjec2YbXzvHnpmRp1F+b2nSu4ZehzGSeKfMtJzKSePCY+Q73a6\nXfZvaodDHL46uV8XOu2be2lx31APeEVRFKXf6MpEURRF6Tc6mSiKoij9Rq25FEVRlH5j62QyOMk2\n7tkQ98VlEh6EzQB+vEnPDmvlYTIRUHWrDCDnokB5VTlzg5iDApZCUeMY0SbakC+OOdI4pi0iUxLn\nbBkILwTphOgu02ZugzQUCGflxitgJswqkXNZkNIRl/2caMmk3EzPxCWPcpMTqdUk7zPdKJ0BAcBL\nCc68vMFOgTYz9TLoZXNkjCg3tZuOk96wvNdUAxleBOUG/LC4DPTo88k2ANLBFgAKAXnvGXKErEqQ\nsQI9n7zfTKIWrJUBPzmJV5ISbPE12KkRcHYI7EqmTvaFTePGlZJ96UmZxh81k2TecisuN9TtNvme\nBUbL97baySiADAcqxf6QubZs2YIHH3wQH374Ic4//3ycffbZnd9dddVVCAQCcLlccLvdWLBgQY/1\nzZw5E4888gh27tyJWbNmYfjw3U7Ofr8fV155ZWe5twyJyURRFGUg2R+TSSQSwaWXXopVq1Y5fv/D\nH/4Q1dU9Z2B1YtiwYVi4cCEA4P/+7//w+9//HldffXWf6tDJRFEUpcLsD9PgaDSKaDSK1atX79P5\nO3fuxP33349sNovjjjtur8dlMhlEIqZi0xM6mSiKolSYcq+XJpXzlL/jjjsAAF/60pcwffp04/tf\n/OIXmDFjBqZNm4ann35afLd9+3bMnj0b2WwWuVwO8+fP7/P1h8Rk8mFkz/7CwV6pW/tT5n6Gi/Tf\nqtYPRbkQksH3clXSkc+flPpvZKcMDAkAFmvCpJ9z8D13RjpgZWrlvgEAbMlILXx4QO5FhHMyMB4H\nJgQlKMr4zX2cvFvugVRnaL+DHPNYj0erDLgIAFZQ1mkfKrVxTi5W9IVFuT0kNX4AyIfGiHK4Vjqi\n8p7Veyl5fLZdfl8dMp3/OMGTm/YSat0y6GIuIPvTkzMDaXpbyTmyTj6TREiOtZJHjhtORpbxmYEe\nQ3FKglaSe2duW+757ag9TJTbirVGnROyfxfl4HbpVBveQL+m2dmSHH8LB8trAoA7LwNjWpTEDuPk\nXqbtle8QB58EzD0o863aN3rw2RTMmTOn8//Tp093nAh6Yt68eYjFYujo6MDtt9+O4cOHY9KkSeKY\n9evX4/rrrwcAnHbaaXjsscc6v+sqc7388st46KGHcMstt/SpDUNiMlEURRlI+rJnsrfN8qeffhor\nVqwAANx0002IxWKOxwHo/C4ajeK4447DBx98YEwmAGBZPa+Ejj32WDz44IO9abpAJxNFUZQKU4k9\nkzPPPBNnnnlmj8dls1mUy2UEg0Fks1m8/fbbOO+884zjDjvsMLz00ks47bTT8Ne//nWv9a1btw5N\nTU17/X5v6GSiKIpSYfaHNVd7ezvmzJmDTCYDy7Lw1FNP4d5770UikcDdd98NACiVSvj85z+PqVOn\nGudfeumluP/++/Hkk08aG/Cf7JkAgMfjwXe/+90+t88q936n6DNLfPX/df4/GZYzrq9gJsfiZFbu\notTC2c6+5Jcavu2mgIu8bwDAl5H7Fxb5puRJ380EpE6d9EitFwC2Z+U5vGcSzdL+BhFIy/0jTi4E\nAKWQND10Z6Tub+wFUeBBOyD7CgByEZnUyJuNG8eIOsgXg/dpANMvh/eHOPnSlojU6DMleY1ar9kX\njS3S98QiX5ZUtbTTT/vknkl9y3qjTm+7fEalCO1bUWBSV1aOX+7fQtAcJwUfPRPaP3KX5F6QL0dJ\n1PJmUi++90Kwe78pF13D2CvKOSSyisrxnWg8RLaB/pRxcrcC7ScBgIsCrtZNOdm5wX3k7t/3zmvx\nhn81A81+VtGViaIoSoXRcCqKoihKv7GHYAx6nUwURVEqzODfPDAZGpNJF009kJOaftFj6qjN1cPk\nMZA28F5L6r3VGek7wZoz27IDMHw62GehSP4DoYyMk+X1mZpyrSUTbFkZ0vAD0rQwb8lrVLmlXX40\nYfrgeLZvkh+QLb9dI+OaZavlHlXOa3rWZrzSF6LGJfVzjrPF8B4VAOR98jo++p6TjR2Uln4Radqj\nCjgksnKn5V5CMSLPafdJn5Cqotx3KbvN168YlftHvBfBfjv5qOxfTnjmTZvx1dx071wnj0UPJwbL\nmnsmHOMuXyvjYjX7pQdHALKOcFj2lcs2k2Pl6JkWXPK5N+74h6yD9pMQG2nU6Uube2GVQCcTRV
EU\npd/YQ3A20clEURSlwjgYGA56dDJRFEWpMKWSrkwGJV1jOXG8Kd6LAICR29aKMvtWpKvNWFBdCSSk\nr0CgbYtxTL5aasS2m/Zl8qT3kn8Ba+kAEIxLbdsqyphLrhpZ5jhPca+04w/WmPfpDsv9n47aMaKc\ndpMfCuQ120vm/lGpSH4OQXlOXUo+I0+73BsqRswwE9UJuY9VpHa/FzlelIfjY1H2kW+RBfOPg+2n\nHDMUMyxky30W3q9zFSg2GgCLftKyDxPH9/JSfC93Tu6HcNwtwBwXRjw1ihNnR+iZRc0YV8a+iy2v\nUVOQz4PHWrUt92k4vh0AeAKyv5z2Vbprky9p7gGW6BlWiiHgvmcwJCYTRVGUgWQIWgYf+MnEKUPY\nK6+8gt/97nfYsmUL5s+fj/HjxwMA3n77bTz22GMoFovweDyYOXMmpkyZcoDvQFEURVIegrPJAZ9M\nADND2KhRo3DDDTfg5z//uTiuqqoKN954I2KxGD766CPccccdeOihhwa6uYqiKN0yBFWuT8dkwowc\nadqDA8DYsXtybI8aNQr5fB6FQgFer9fx+E8Ibdpjfx4MSVt1yyHxgP2xzF/irqJUmJNlzKWiS3ox\nGPpxiXI3APB3UK4L8hfg3N0cH8mfMe3jWQu3SfvmeGD1Saljc56WvN/MheH2SNv+SFLehzsk28A+\nJNVuM+6WG/IZ+MkPwvZJXyB3Wu49eNJmnZyT3BeXevmUgNxf4jzyvLcW2f5P4xou2p/gPDiu6u73\nJizb3M/I0H6ch/xG+Llng3VUNnONMEGKwWZRfCrej/OQv4bF/hsA8vXync1SHhXOI1+d3yjK7m3k\nv+SAPfpweQ7FCHMl5fg2/qI7mFi599NfffWAP0D0lCHMiddeew3jxo1znEiWL1+OJ598Eul0GkuW\nLKloWxVFUXrCVmuugac3GcKYjz/+GI899theM4Hta7YyRVGUSjAUnRYPePxjpwxh3dHS0oK7774b\nV111FYYNG9btsYqiKAeCcrncq3+DiQO6MulthrBPSKVSWLBgAS644AIcfvjhez2OKcX27Ecka0eL\n74pujtoExLxyX4BzkPuzUrMPpzg3idTCrYK0oQcAOyDlObaZD6alnb13x0Z5/napQQOAXZB1WG7K\no0I6ruWX9x6g+EqBGoec2Y1jRZnzmEcoF4w/IPd+POw/AzMWVIn2SMqUbyPVOEGU2b8GANy7ZP/Y\nKcq3bstnGtz6kSiHwlLzL9c4pEyl5+rKyXsPkIZfDkqfEd5bA4DI5jXyA4rhxnsTaZ/s3/pmmWPF\n0yp9cgAARfLP4PHJ9+6n5+Ex/2y4c/K5RgsbRZn3kyx+5sNkLC/bZ/p/eLL0DPk+OGYYt9Nl5tJx\nbf/I+KwS6J7JANPR0eGYIez111/Hww8/jHg8jgULFmDMmDG45ZZb8PTTT2P79u144okn8MQTTwAA\nbr31VkSj3SfiURRFGUgG2aKjVxzQyaSpqQkLFy40Pj/++ONx/PHHG5+fe+65OPfccweiaYqiKPtM\naQhmxzrgG/CKoiiDDXVaHKTYb77S+X9vx1/Ed34H0+K8S+rU/oOlnuvZJI0E8jukv4anivIutJn5\nOLKt8rOqQ+RejmcYxcXinNikpQNAuWD6LQjIp8ZdLfV2O07tTJj+G8GM1MYLH0r/i3yHjEflCcr9\np0y7mRck3yHr9Ialf0xk3MGi7D9I+vlwG4DdsmlXLHrOhQ7Kax6Q1/RUkQ9D1vQVMvLbt8sYYpnN\nci+nlJP7BIE6M06Zm8ZOKSWvW2h/RZRrI3IfwKLz8ylzj6pMfeONSd8UV7UsG3HgEuTPAcDdLP2N\nOL8JyOcp97c3RbllDfl2+cw/TfwHOtshn1EhI/dQfGG5JxhuIH8xB0Z+s8dDesX+mExefPFFPPnk\nk537zJdffjnGjBkDAPjb3/6GX/ziF7Dt/6+9c4+J6vr2+Jd5MMCA4KAD+lPxQqmgrVYuGmvVxqgt\nyf2j2pq0JvqHtkUqaLU61mgUa7UPBVvb+IzWV20jtWrT+2uwptpEwV7pw1YFvVIKVXkMj/IaGGA4\n5/7BdWTtM8IMwzgjrE9CMpuzzz7rPGDPWd+91pIwY8YMzJ49u8uxzGYzPvzwQ2RmZuL69evYunUr\njEYjZFlGaGgoli1b5rJ84PXVXAzDMH0NSXbuxxWMRiM2btyIzMxMvPTSS/YMIZIk4cCBA1i7di0+\n+ugj5OTk4M6dOy6NHR8fj23btiEjIwMxMTE4c+ZM9zsJ9Is3E4ZhmIeJJ95MRo0aZf8cGxuL6uqO\nTAaFhYWIjIxERETHqtXJkycjLy9PkUmkqKgIu3fvBgCMHTvWsd2yjObm5h6FXfBkwjAM08u4EkOy\nZs0a+2dnA67PnTuH8ePHAwBqamoQHn5/GX94eDhu3bql2GfXrl1YtGgRRo8ejaNHj5JtBQUFMJlM\naGxshE6nw7x585y2/x79YjJpLL5fT0T8xmBrVtaU0IVSP7ToxxZRCfEaUiv1jYtxKgCg0gg1PPRC\nDILgtxbrmajENfZQxjnAQvWJ5kLql7Y1Un+8qPVowh3EVgixK2KsSuAIWutbFUbPQ1un9LfLVnoP\n1AOFc9dQvUP6h+aWarcq76E6gGo1tkaqHVirqWaiCaJjiP39rYJmBaDZTO2oLXIQ09EJXQjVDXRh\nytxnokZirVTW2+mMVEvPo12w02ZR1mtXCfqRIh7p+h9d2iDbHNTSGTKYtP3H0G++ciXVj9oa6Hn6\nh1D9KWCg8toEDqc6oq2WanzWKvpsKWwaTrVPAJBFLbKXcGU11wcffODS2NeuXcP58+exadMmp/ex\nWCywWCz27CLTpk3DlStX7Nvj4+Ptk9rp06fx+eefIzk52SW7WDNhGIbpZWRJduqnK7Kzs2EymWAy\nmVBT0zGhl5SUYO/evTCZTAgJ6ZhwDQaD3eUFdGQJuZdZpCckJiaioKDA5f36xZsJwzDMw6Q3NJOk\npCQkJSXZ21VVVcjIyEBaWhqGDr2/ojEmJgZlZWUwm80wGAzIzc3FsmXLyFh6vR56vR43btxAXFwc\nLly48MDj3rhxw66/uAJPJgzDML2MJxI9njhxAo2Njdi/fz8A2IsJqtVqLFq0CFu2bIEkSZg+fTqG\nO3DpLVmyxC7Ajxs3jmy7p5kAQFBQEBYvXuyyfTyZMAzD9DKeWM2VkpKClJQUh9sSEhKQkJDQ5f7R\n0dEk48j8+fMBAGPGjMHhw4fdtq9fTCZh//nk/YYYzBaqLCZkM9JZvV1IOqdzUCipK/wdiOXtg6iY\nWGegCRQlIbmh3mImba1FWRxLpaISmBxMg+ICBREfQVT0b4p6krTNQULgJAB9KxU5gw10CaEkJC9s\nF46p1dMATwCQgqjYKglJGMVkfKpAQaxNVKbekbRU7NYKYwRECUkxw410gFoqrosBigCgjR1F2gMS\nqGgvG+iYsoYuVlA5KDIFISGiWk/jBcTgS9W/aLCrW
BBN1SwkRwQABwkmCUJgqvapCaTdNmCQcsg2\narckJnKMpdci8D/iSTu4WkjW6SAoty1iJO0inEdII/2bUFxfBwth5MgRit/1Bn0tI7Az9IvJhGEY\n5mHSbuPcXAzDMIyb8JsJwzAM4zayxG8mfRKpU7K79gbqQ9Y4uOnqYKGgk/k2abeVFAsHoGNowmlR\nKT+tg2JCd2kAYajg3xULarUW0KJJrS3Kglv+ETRIS7SrrZ4GMar1VAfQ/O9N0jbqlIXDNIMFbUHQ\nGqQAoQCRYINlyOOKMUX0+bmkbauihcI0w6im1RqmXMaoaqNBiKoAWuAJjVQzke6UkLaYnNNPpfTh\n+8fQIl0KjUTQL/xaaKCeXKEscCb69dURXSf8lIVCbiJSoDLgtjaSlsVWC4XZ9HV3SVssbKW2KpN1\ntgdQ3UtTR++ZykKDK1vDabJOtVCAS9RtAEBrprqXFEL1znbh2VNV0vNwFKDop1NqYb0BF8diGIZh\n3IbdXAzDMIzbSCzAMwzDMO4iOViG3NfpF5OJ2ng/FkKylnTR8wHYaNEpMfGj3Cb435uFQkrKXHuK\nYgZqIXYCgg9ZM5DGjPiplbdOTNjXItQ0UAfRMf38qSYiFm+q/0vp01cX0TGDR9LEjtrRNFalXfDZ\n6yxC/AYAWYx70Alag6g5CcWaWgKVsUI6P2VBMjJmGNW15GH0+vq3C3EoVQ6SODYJcSVCAkpJSLqo\nKHg2QFl8yHaH6nNSJdUe/IfQMRoHR5N2SDnVvRxpD0GhNHGj2ibEiPz0I2mLhdz0j8coxlQPprpV\nezGNxVIJfzP+4j9bIf7LVqqsxyHqnbooIcYmisb9NMRNJm1tq/Ja+Dcp47V6A660yDAMw7gNTyYM\nwzCM27AAzzAMw7iNmBKoP+An94MptOHyv+2f1c10jbyfpCz0YzXQcpc2DfXhBzZUkLbGIhR8Etez\nW5WiiSz42zvHwgCAn1A8SzVEsClcmTfLT/Dzqxu71g0kQZcRi2uJcRIdO1H9yE8oeoRAIc5EzBem\npzE8jvAT/fxibjPxkdU5sFMrFCwTcnXVCLEWwU1UmwiopjENYqExR0hhNM5H9Q/NpyZV07xkYp4t\nAMBwqkdI/jSORFbR739i3jhtgxDf0eqg+NM/tI8YfyFqaWJxMjlcWdLVr1aZc40guH2kBvpsqgYL\nYzrQekRETSSgmeof2tI/6Q6C9umIgBff7LaPM/zXa9ec6vfv/U/0yvF8AX4zYRiG6WVYM2EYhmHc\nhicThmEYxm04zqSP0vLfJ+2fm6uovqEfqqzNEDCM5g1qLSgk7Yoyuk5fH0HjHPxDqW5gKaU+agCQ\nbFSrCY2hmoiYC0rVRPUMTTX1xwNAu6C7NJbSPupA6n8PGC7oMHX02jj6diUJsSgtQg4rqZX6pXUG\nqpG01Svra4jHCX6Mxg+I9Utqf6P+6MoCZTxM4EC6T8SEONIOC/iNtG11VBO5e53mTmu1KHOhhY2k\nGomtmeYDMxfQ2BSblV6b0GHKOBNdyGXSFp8TbVDXubhUQkyOPjJc0cdSLsT6CBqULozGhIjnFRKj\nrAEixtDYymheLMvfVFsTn29tKb1WGiEmCgCazdRuTcEN2o4eSdpNRcWk3dYgxH8B0Ih/Ey8quvQI\nfjNhGIZh3IazBjMMwzBuI7UrV4m6y4ULF/DNN99AlmUEBgbitddew8iRIwEAqampCAgIgEqlsteG\n744FCxbg6NGjMJvNWLFiBYYO7fDI6HQ6LFmyxN52Fp5MGIZhehlPpKA3Go3YuHEjgoOD8dtvv2Hf\nvn1477337NvT09MxYED3S+8dERkZaa8Pf/bsWZw8eRJpaWkujdEvJpOWuvt+ejF/Vf5XeYr+9beo\nbzV0FNVAjKNpHiKblfrTKwuovzjYSGs9AEBYLNUrWmtpvYfaYkHv0FK7DfFUVwAA7QDq6xb90k2C\nr7z2Fs1/1NJgVYm6JgAACg1JREFU7bLtiJBI+vD6h1CtorWE+sJr/6Z6kyMsFTReIDSafkMaEEfz\nUYm6AgC0WajtjSXUZ1/3N9WxbC1Uzwgy0PPQhXStVQCATtDKQiLpfW8op7pMQxm95wBgLqDxGio1\nvYdBg6hdA6MMpC1qgNqBSl3GEEm1HtElU/k/V0m75i/63DSUKu9h0CCai6uxnOpv1jqq+YWNoHYP\n0FONpKFYqYNV3exadzEKmsidy9Sm8otK7TLsCfo388xGRZce4Qk316hR93OPxcbGorpameeuK8xm\nM3bs2AGr1YoJEyY8sF9zczOCg5V1cLqjX0wmDMMwDxNXBPg1a9bYP8+cORMzZ87sdp9z585h/Pjx\n5HdbtmwBAMyaNcvhGAcPHsRzzz2HZ599FtnZ2WRbeXk5TCYTrFYrWlpayBuPs/BkwjAM08vILiwN\ndkbf6My1a9dw/vx5bNq0yf67d999FwaDAXV1ddi8eTOGDh2K0aNploebN29i5cqVAIBp06bh2LFj\n9m2d3Vy5ubnYu3cv1q1b55Jdqu67MAzDMK4g2dqd+umK7OxsmEwmmEwm1NR0uBZLSkqwd+9emEwm\nhITcd6MaDB1uw9DQUEyYMAGFhYUOx/TzU5afFklMTERBQYGzp2qnX7yZ/GvH8QduG/bALT2nN8Y0\ndt+lW1z3enoeZYSC++i776LAtXUqPaM37qE30L9O2yO9YoUSV+9ZRPddPMbFb591e4ykpCQkJSXZ\n21VVVcjIyEBaWhpZaWW1Wu0rvKxWK/744w/MnTtXMd6oUaOQk5ODadOm4eLFiw887o0bNxAR4frV\n6/NvJp39kb4I2+cevmyfL9sGsH2PGidOnEBjYyP2798Pk8lkvz51dXXYsGEDTCYT1q5di4SEBDz1\n1FOK/RcuXIgzZ85g5cqV9jede9zTTEwmE7788kukpKS4bF+/eDNhGIZ51ElJSXH4Tz4iIsKud3SF\n0Wi0i/QA8Morr9h/31k/6Sl9/s2EYRiG8TzqjRs3bvS2EZ4mOjq6+05ehO1zD1+2z5dtA9g+pvfo\nF8WxGIZhGM/Cbi6GYRjGbXgyYRiGYdymT6/munLlCg4ePAhJkjBjxgzMnj3bp46fn5+Pw4cPo6Sk\nBMuXL8ekSZPs215++WWMGNERlTFo0CC8/fbbXrX1+++/x5kzZ6BSqRAQEIDFixdj2DBPROk4Z889\nfvrpJ2zfvh3vv/8+YmJiFBlQY2NjkZyc7DE7nbU1NzcXX331Ffz8/BAVFYU33+ydWuM9sefQoUO4\nfv06AKC1tRV1dXU4dOgQAN977iorK7F7927U19cjODgYS5cuRXi4skYL4wPIfZT29nY5LS1NLi8v\nl9va2uRVq1bJt2/f9qnjV1RUyMXFxfKnn34qX7p0iWybP3++T9lqsVjsn/Py8uTNmzd71R5ZluWm\npiZ5w4YN8tq1a+XCwkJZljuu6VtvveUx23pia2lpqWwymeSGhgZZlmW5trbWq/Z05rvvvpN37txp\nb/vac5eZ
mSmfP39elmVZvnr1qvzJJ588NPsY1+izbq7CwkJERkYiIiICGo0GkydPRl6eMkOwN49v\nNBoRFRXlVIoDT+KMrUFB97PVWq1Wj9rs7L07fvw4XnjhBWi1Wo/Z0h3O2PrDDz/g+eeft2diDQ1V\nZvJ9mPZ0JicnB1OmTPGYPV3hjK137tzBE088AQAYM2YMfv75Z2+YyjhBn51MampqyOtweHi4IurT\nl4/f1taGNWvWYN26dbh8+XL3O7iBs7ZmZ2dj6dKlOHbsGBYuXOhVe4qKilBVVYWEhATF/mazGatX\nr0Z6enqPcgz1tq2lpaUoKyvD+vXrsW7dOly5csWr9tyjsrISZrPZ/s8a8L3nLioqym7H5cuX0dzc\njIYGmsqf8Q36tGbyKLNr1y4YDAZUVFRg06ZNGDFiBCIjI71q071cQRcvXsTXX3/tcvGc3kKSJBw5\ncgRLlixRbBs4cCB27dqFkJAQFBUVYdu2bcjMzCRvVg8bSZJQVlaG9PR01NTUID09HRkZGdDre5JV\nrPfIycnBpEmToFLd/07pa8/dggUL8Nlnn+HHH39EfHw8DAYDsZfxHfrsXTEYDKR4THV1tT2z5qNw\n/Ht9IyIiMHr0aBQXF/e2ieRYrtjqaZdhd/ZYrVbcvn0b77zzDlJTU3Hr1i1s3boVf/75J7RarT2b\nanR0NCIiIlBWVqY4xsOy9V6fxMREaDQaGI1GDBkyxGM2uXIvc3Nz8cwzzyj2B3znuTMYDFi1ahW2\nbt2KefPmAYDXJ2HGMX12MomJiUFZWRnMZjNsNhtyc3ORmJj4SBy/sbERbW1tAID6+nrcvHnToyun\nnLG18z+/X3/9FUOGDPGaPUFBQThw4AB27tyJnTt3IjY2FqtXr0ZMTAzq6+sh/X+Vu4qKCpSVlfUo\nA2pv2QoAEydOtK+eqq+v96hNzj53d+/ehcViweOPP27/nS8+d53v56lTpzB9+nSP2cO4R591c6nV\naixatAhbtmyBJEmYPn06hg8f7vXjHz9+HDExMUhMTERhYSEyMjJgsVjwyy+/ICsrC9u3b8fdu3ex\nb98+qFQqSJKE2bNne/SP2hlbs7OzcfXqVajVagQHByM1NdWr9jyI/Px8ZGVlQa1WQ6VS4fXXX+9R\nCdLetHXcuHH4/fffsWLFCqhUKsyfP5/UonjY9gAdLq7JkyeThRS++Nzl5+fjiy++gJ+fH+Lj4/Hq\nq696zB7GPTidCsMwDOM2fdbNxTAMwzw8eDJhGIZh3IYnE4ZhGMZteDJhGIZh3IYnE4ZhGMZteDJh\n+iwnT57Enj17vG0Gw/QLeGkw88iyYMEC++fW1lZoNBp7qo3k5GRMnTrVW6YxTL+DJxOmT5CamorF\nixdj7Nix3jaFYfolfTYCnmGysrJQXl6OZcuWwWw2Iy0tDW+88QaysrJgtVoxb948REdHY8+ePaiq\nqsLUqVNJhPW5c+fw7bffora2Fo899hiSk5MxePBgL54Rw/gurJkw/Ypbt25hx44dWL58OQ4fPoyT\nJ09i/fr12L59Oy5duoT8/HwAQF5eHk6dOoWVK1di//79iIuLw44dO7xsPcP4LjyZMP2KuXPnwt/f\nH+PGjYNOp8OUKVMQGhoKg8GAuLg4/PXXXwCAs2fPYs6cORg2bBjUajXmzJmD4uJiVFZWevkMGMY3\nYTcX06/oXOXQ399f0bZarQA6CkcdPHgQR44csW+XZRk1NTXs6mIYB/BkwjAOGDRoEF588UVeEcYw\nTsJuLoZxwKxZs3D69Gncvn0bANDU1IRLly552SqG8V34zYRhHDBx4kRYrVZ8/PHHqKqqQlBQEJ58\n8kk8/fTT3jaNYXwSjjNhGIZh3IbdXAzDMIzb8GTCMAzDuA1PJgzDMIzb8GTCMAzDuA1PJgzDMIzb\n8GTCMAzDuA1PJgzDMIzb8GTCMAzDuM3/AWl0w6bG6QWzAAAAAElFTkSuQmCC\n", 193 | "text/plain": [ 194 | "
" 195 | ] 196 | }, 197 | "metadata": {}, 198 | "output_type": "display_data" 199 | } 200 | ], 201 | "source": [ 202 | "show_melspectrogram(conf, XX[0, 0, :, :, -1], title=conf.labels[y[0]])\n", 203 | "show_melspectrogram(conf, X_train[0, :, :, -1], title=conf.labels[y_train[0]])" 204 | ] 205 | }, 206 | { 207 | "cell_type": "code", 208 | "execution_count": 7, 209 | "metadata": {}, 210 | "outputs": [ 211 | { 212 | "data": { 213 | "text/plain": [ 214 | "array([1, 1, 2, 3, 2, 1, 4, 3, 3, 4, 2, 5, 1, 1, 5, 5, 3, 1, 3, 3, 2, 3,\n", 215 | " 2, 2, 4, 3, 4, 3, 1, 2, 2, 3, 5, 2, 1, 2, 5, 2, 5, 2, 4, 1, 3, 5,\n", 216 | " 4, 1, 3, 1, 3, 0, 0, 5, 2, 1, 1, 2, 2, 0, 3, 1, 4, 2, 1, 4, 5, 3,\n", 217 | " 3, 5, 2, 2, 2, 2, 2, 3, 3, 2, 3, 3, 1, 3, 2, 3, 4, 0, 3, 3, 3, 1,\n", 218 | " 1, 1, 5, 3, 3, 1, 3, 1, 5, 1, 5, 1, 2, 2, 2, 2, 3, 0, 4, 2, 1, 3,\n", 219 | " 5, 2, 2, 3, 1, 1, 1, 1, 5, 2, 2, 5, 2, 4, 3, 2, 2, 2, 3, 1, 2, 1,\n", 220 | " 3, 3, 1, 4, 3, 1, 3, 1, 2, 3, 2, 1, 0, 1, 3, 2, 3, 2, 3, 5, 4, 0,\n", 221 | " 4, 5, 3, 5, 2, 2, 3, 1, 0, 1, 3, 1, 5, 2, 3, 0, 3, 1, 2, 3, 4, 1,\n", 222 | " 3, 2, 1, 4, 4, 1, 1, 1, 3, 2, 1, 2, 5, 3, 2, 3, 0, 0, 2, 1, 4, 1,\n", 223 | " 2, 2, 5, 1, 3, 3, 3, 3, 4, 3, 4, 5, 0, 1, 1, 2, 1, 1, 2, 2, 2, 1,\n", 224 | " 2, 1, 1, 2, 2, 5, 0, 2, 2, 2, 3, 0, 4, 0, 3, 2, 0, 2, 3, 0, 2, 2,\n", 225 | " 4, 1, 3, 1, 5, 1, 3, 1, 1, 1, 2, 0, 0, 5, 2, 1, 2, 5, 0, 0, 2, 2,\n", 226 | " 4, 1, 5, 1, 2, 1, 2, 1, 3, 3, 5, 3, 3, 1, 2, 3, 1, 4, 2, 3, 3, 1,\n", 227 | " 1, 3])" 228 | ] 229 | }, 230 | "execution_count": 7, 231 | "metadata": {}, 232 | "output_type": "execute_result" 233 | } 234 | ], 235 | "source": [ 236 | "y_train" 237 | ] 238 | }, 239 | { 240 | "cell_type": "code", 241 | "execution_count": 8, 242 | "metadata": {}, 243 | "outputs": [ 244 | { 245 | "data": { 246 | "text/plain": [ 247 | "array([2, 0, 1, 1, 0, 1, 4, 0, 2, 3, 3, 1, 2, 1, 3, 3, 2, 1, 3, 1, 3, 2,\n", 248 | " 1, 0, 1, 2, 3, 1, 4, 3, 2, 4, 1, 3, 3, 0, 4, 1, 3, 3, 2, 0, 1, 3,\n", 249 | " 5, 2, 0, 4, 0, 1, 1, 2, 2, 3, 1, 1, 4, 3, 3, 1, 3, 0, 2, 3, 3, 3,\n", 250 | " 1, 2, 1, 3, 2, 2])" 251 | ] 252 | }, 253 | "execution_count": 8, 254 | "metadata": {}, 255 | "output_type": "execute_result" 256 | } 257 | ], 258 | "source": [ 259 | "y_test" 260 | ] 261 | }, 262 | { 263 | "cell_type": "code", 264 | "execution_count": null, 265 | "metadata": { 266 | "collapsed": true 267 | }, 268 | "outputs": [], 269 | "source": [] 270 | } 271 | ], 272 | "metadata": { 273 | "kernelspec": { 274 | "display_name": "Python 3", 275 | "language": "python", 276 | "name": "python3" 277 | }, 278 | "language_info": { 279 | "codemirror_mode": { 280 | "name": "ipython", 281 | "version": 3 282 | }, 283 | "file_extension": ".py", 284 | "mimetype": "text/x-python", 285 | "name": "python", 286 | "nbconvert_exporter": "python", 287 | "pygments_lexer": "ipython3", 288 | "version": "3.6.2" 289 | } 290 | }, 291 | "nbformat": 4, 292 | "nbformat_minor": 2 293 | } 294 | -------------------------------------------------------------------------------- /apps/cnn-laser-machine-listener/README.md: -------------------------------------------------------------------------------- 1 | ## Running predictor 2 | 3 | ```sh 4 | python ../../realtime_predictor.py 5 | ``` 6 | 7 | This will load .pb model file specified by `conf.runtime_model_file`, and start listening to the audio input. 8 | 9 | or specifically set .pb file. 
10 | 11 | ```sh 12 | python ../../realtime_predictor.py -pb cnn-model-laser-machine-listener.pb 13 | ``` 14 | 15 | The following example does not use CUDA and predicts from a file instead. 16 | 17 | ```sh 18 | CUDA_VISIBLE_DEVICES= python ../../realtime_predictor.py -f laser-machine-listener/data/cutting_not_in_focus/paper_cutting_not_in_focus.wav 19 | ``` 20 | 21 | ## Training model 22 | 23 | See [EXAMPLE_APPS.md](../../EXAMPLE_APPS.md). 24 | 25 | -------------------------------------------------------------------------------- /apps/cnn-laser-machine-listener/cnn-alexbased-laser-machine-listener.pb: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/daisukelab/ml-sound-classifier/fdee30de02d33948a301e3d56b2b890f04810158/apps/cnn-laser-machine-listener/cnn-alexbased-laser-machine-listener.pb -------------------------------------------------------------------------------- /apps/cnn-laser-machine-listener/cnn-model-laser-machine-listener.pb: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/daisukelab/ml-sound-classifier/fdee30de02d33948a301e3d56b2b890f04810158/apps/cnn-laser-machine-listener/cnn-model-laser-machine-listener.pb -------------------------------------------------------------------------------- /apps/cnn-laser-machine-listener/config.py: -------------------------------------------------------------------------------- 1 | # CNN Version Laser Machine Listener 2 | # Application configurations 3 | 4 | from easydict import EasyDict 5 | 6 | conf = EasyDict() 7 | 8 | # Basic configurations 9 | conf.sampling_rate = 16000 10 | conf.duration = 1 11 | conf.hop_length = 253 # to make time steps 64 12 | conf.fmin = 20 13 | conf.fmax = conf.sampling_rate // 2 14 | conf.n_mels = 64 15 | conf.n_fft = conf.n_mels * 20 16 | 17 | # Labels 18 | conf.labels = ['background', 'cutting_in_focus', 'cutting_not_in_focus', 19 |                'marking', 'sleeping', 'waiting'] 20 | 21 | # Training configurations 22 | conf.folder = '.'
23 | conf.n_fold = 1 24 | conf.normalize = 'samplewise' 25 | conf.valid_limit = None 26 | conf.random_state = 42 27 | conf.samples_per_file = 30 28 | conf.test_size = 0.2 29 | conf.batch_size = 32 30 | conf.learning_rate = 0.0001 31 | conf.epochs = 50 32 | conf.verbose = 2 33 | conf.best_weight_file = 'best_model_weight.h5' 34 | conf.eval_ensemble = False # This solution shuffles samples, ensemble not available 35 | 36 | # Runtime configurations 37 | conf.rt_process_count = 1 38 | conf.rt_oversamples = 10 39 | conf.pred_ensembles = 10 40 | conf.runtime_model_file = 'cnn-model-laser-machine-listener.pb' 41 | -------------------------------------------------------------------------------- /apps/cnn-laser-machine-listener/download.sh: -------------------------------------------------------------------------------- 1 | git clone https://github.com/kotobuki/laser-machine-listener.git 2 | 3 | -------------------------------------------------------------------------------- /apps/cnn-laser-machine-listener/model_based_on_fsd.h5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/daisukelab/ml-sound-classifier/fdee30de02d33948a301e3d56b2b890f04810158/apps/cnn-laser-machine-listener/model_based_on_fsd.h5 -------------------------------------------------------------------------------- /apps/fsdkaggle2018/config.py: -------------------------------------------------------------------------------- 1 | # Freesound Dataset Kaggle 2018 2 | # Application configurations 3 | 4 | from easydict import EasyDict 5 | 6 | conf = EasyDict() 7 | 8 | # Basic configurations 9 | conf.sampling_rate = 44100 10 | conf.duration = 1 11 | conf.hop_length = 347 # to make time steps 128 12 | conf.fmin = 20 13 | conf.fmax = conf.sampling_rate // 2 14 | conf.n_mels = 128 15 | conf.n_fft = conf.n_mels * 20 16 | conf.model = 'mobilenetv2' # 'alexnet' 17 | 18 | # Labels 19 | conf.labels = ['Hi-hat', 'Saxophone', 'Trumpet', 'Glockenspiel', 'Cello', 'Knock', 20 |                'Gunshot_or_gunfire', 'Clarinet', 'Computer_keyboard', 21 |                'Keys_jangling', 'Snare_drum', 'Writing', 'Laughter', 'Tearing', 22 |                'Fart', 'Oboe', 'Flute', 'Cough', 'Telephone', 'Bark', 'Chime', 23 |                'Bass_drum', 'Bus', 'Squeak', 'Scissors', 'Harmonica', 'Gong', 24 |                'Microwave_oven', 'Burping_or_eructation', 'Double_bass', 'Shatter', 25 |                'Fireworks', 'Tambourine', 'Cowbell', 'Electric_piano', 'Meow', 26 |                'Drawer_open_or_close', 'Applause', 'Acoustic_guitar', 27 |                'Violin_or_fiddle', 'Finger_snapping'] 28 | 29 | # Training configurations 30 | conf.folder = '.'
31 | conf.n_fold = 1 32 | conf.normalize = 'samplewise' 33 | conf.valid_limit = None 34 | conf.random_state = 42 35 | conf.test_size = 0.01 36 | conf.samples_per_file = 5 37 | conf.batch_size = 32 38 | conf.learning_rate = 0.0001 39 | conf.epochs = 500 40 | conf.verbose = 2 41 | conf.best_weight_file = 'best_mobilenetv2_weight.h5' 42 | 43 | # Runtime conficurations 44 | conf.rt_process_count = 1 45 | conf.rt_oversamples = 10 46 | conf.pred_ensembles = 10 47 | conf.runtime_model_file = 'model/mobilenetv2_fsd2018_41cls.pb' 48 | -------------------------------------------------------------------------------- /apps/fsdkaggle2018/train_alexbased.py: -------------------------------------------------------------------------------- 1 | # Train FSDKaggle2018 model 2 | # 3 | import sys 4 | sys.path.append('../..') 5 | from lib_train import * 6 | 7 | conf.model = 'alexnet' 8 | conf.logdir = 'logs_alexbased' 9 | conf.best_weight_file = 'best_alexbased_weight.h5' 10 | 11 | # 1. Load Meta data 12 | DATAROOT = Path.home() / '.kaggle/competitions/freesound-audio-tagging' 13 | #Data frame for training dataset 14 | df_train = pd.read_csv(DATAROOT / 'train.csv') 15 | #Plain y_train label 16 | plain_y_train = np.array([conf.label2int[l] for l in df_train.label]) 17 | 18 | # 2. Preprocess data if it's not ready 19 | def fsdkaggle2018_map_y_train(idx_train, plain_y_train): 20 | return np.array([plain_y_train[i] for i in idx_train]) 21 | def fsdkaggle2018_make_preprocessed_train_data(): 22 | conf.folder.mkdir(parents=True, exist_ok=True) 23 | if not os.path.exists(conf.X_train): 24 | XX = mels_build_multiplexed_X(conf, [DATAROOT/'audio_train'/fname for fname in df_train.fname]) 25 | X_train, y_train, X_test, y_test = \ 26 | train_valid_split_multiplexed(conf, XX, plain_y_train, demux=True) 27 | np.save(conf.X_train, X_train) 28 | np.save(conf.y_train, y_train) 29 | np.save(conf.X_test, X_test) 30 | np.save(conf.y_test, y_test) 31 | 32 | fsdkaggle2018_make_preprocessed_train_data() 33 | 34 | # 3. Load all dataset & normalize 35 | X_train, y_train = load_audio_datafiles(conf, conf.X_train, conf.y_train, normalize=True) 36 | X_test, y_test = load_audio_datafiles(conf, conf.X_test, conf.y_test, normalize=True) 37 | print('Loaded train:test = {}:{} samples.'.format(len(X_train), len(X_test))) 38 | 39 | # 4. Train folds 40 | history, model = train_classifier(conf, fold=0, 41 | dataset=[X_train, y_train, X_test, y_test], 42 | model=None, 43 | init_weights=None, # from scratch 44 | #init_weights='../../model/mobilenetv2_fsd2018_41cls.h5' 45 | ) 46 | 47 | # 5. Evaluate 48 | evaluate_model(conf, model, X_test, y_test) 49 | 50 | print('___ training finished ___') 51 | -------------------------------------------------------------------------------- /apps/fsdkaggle2018/train_mobilenetv2.py: -------------------------------------------------------------------------------- 1 | # Train FSDKaggle2018 model 2 | # 3 | import sys 4 | sys.path.append('../..') 5 | from lib_train import * 6 | 7 | conf.logdir = 'logs_mobilenetv2' 8 | conf.best_weight_file = 'best_mobilenetv2_weight.h5' 9 | 10 | # 1. Load Meta data 11 | DATAROOT = Path.home() / '.kaggle/competitions/freesound-audio-tagging' 12 | #Data frame for training dataset 13 | df_train = pd.read_csv(DATAROOT / 'train.csv') 14 | #Plain y_train label 15 | plain_y_train = np.array([conf.label2int[l] for l in df_train.label]) 16 | 17 | # 2. 
Preprocess data if it's not ready 18 | def fsdkaggle2018_map_y_train(idx_train, plain_y_train): 19 | return np.array([plain_y_train[i] for i in idx_train]) 20 | def fsdkaggle2018_make_preprocessed_train_data(): 21 | conf.folder.mkdir(parents=True, exist_ok=True) 22 | if not os.path.exists(conf.X_train): 23 | XX = mels_build_multiplexed_X(conf, [DATAROOT/'audio_train'/fname for fname in df_train.fname]) 24 | X_train, y_train, X_test, y_test = \ 25 | train_valid_split_multiplexed(conf, XX, plain_y_train, demux=True) 26 | np.save(conf.X_train, X_train) 27 | np.save(conf.y_train, y_train) 28 | np.save(conf.X_test, X_test) 29 | np.save(conf.y_test, y_test) 30 | 31 | fsdkaggle2018_make_preprocessed_train_data() 32 | 33 | # 3. Load all dataset & normalize 34 | X_train, y_train = load_audio_datafiles(conf, conf.X_train, conf.y_train, normalize=True) 35 | X_test, y_test = load_audio_datafiles(conf, conf.X_test, conf.y_test, normalize=True) 36 | print('Loaded train:test = {}:{} samples.'.format(len(X_train), len(X_test))) 37 | 38 | # 4. Train folds 39 | history, model, plain_datagen = train_model(conf, fold=0, 40 | dataset=[X_train, y_train, X_test, y_test], 41 | model=None, 42 | init_weights=None, # from scratch 43 | #init_weights='../../model/mobilenetv2_fsd2018_41cls.h5' 44 | ) 45 | 46 | # 5. Evaluate 47 | evaluate_model(conf, model, X_test, y_test) 48 | 49 | print('___ training finished ___') 50 | -------------------------------------------------------------------------------- /apps/fsdkaggle2018small/config.py: -------------------------------------------------------------------------------- 1 | # Freesound Dataset Kaggle 2018 2 | # Application configurations 3 | 4 | from easydict import EasyDict 5 | 6 | conf = EasyDict() 7 | 8 | # Basic configurations 9 | conf.sampling_rate = 16000 10 | conf.duration = 1 11 | conf.hop_length = 253 # to make time steps 64 12 | conf.fmin = 20 13 | conf.fmax = conf.sampling_rate // 2 14 | conf.n_mels = 64 15 | conf.n_fft = conf.n_mels * 20 16 | conf.model = 'mobilenetv2' # 'alexnet' 17 | 18 | # Labels 19 | conf.labels = ['Hi-hat', 'Saxophone', 'Trumpet', 'Glockenspiel', 'Cello', 'Knock', 20 | 'Gunshot_or_gunfire', 'Clarinet', 'Computer_keyboard', 21 | 'Keys_jangling', 'Snare_drum', 'Writing', 'Laughter', 'Tearing', 22 | 'Fart', 'Oboe', 'Flute', 'Cough', 'Telephone', 'Bark', 'Chime', 23 | 'Bass_drum', 'Bus', 'Squeak', 'Scissors', 'Harmonica', 'Gong', 24 | 'Microwave_oven', 'Burping_or_eructation', 'Double_bass', 'Shatter', 25 | 'Fireworks', 'Tambourine', 'Cowbell', 'Electric_piano', 'Meow', 26 | 'Drawer_open_or_close', 'Applause', 'Acoustic_guitar', 27 | 'Violin_or_fiddle', 'Finger_snapping'] 28 | 29 | # Training Configurations 30 | conf.folder = '.' 
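# The 16 kHz / 64-mel settings above yield a smaller 64x64 input per 1-second
# sample (1 + floor(16000 / 253) = 64 time steps); auto_complete_conf() in
# common.py derives conf.dims accordingly.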
31 | conf.n_fold = 1 32 | conf.normalize = 'samplewise' 33 | conf.valid_limit = None 34 | conf.random_state = 42 35 | conf.test_size = 0.01 36 | conf.samples_per_file = 5 37 | conf.batch_size = 32 38 | conf.learning_rate = 0.0001 39 | conf.epochs = 500 40 | conf.verbose = 2 41 | conf.best_weight_file = 'best_mobilenetv2_weight.h5' 42 | 43 | # Runtime conficurations 44 | conf.rt_process_count = 1 45 | conf.rt_oversamples = 10 46 | conf.pred_ensembles = 10 47 | conf.runtime_model_file = 'please make yourself' ### NOT PROVIDED 48 | -------------------------------------------------------------------------------- /apps/fsdkaggle2018small/train_alexbased.py: -------------------------------------------------------------------------------- 1 | # Train FSDKaggle2018 model 2 | # 3 | import sys 4 | sys.path.append('../..') 5 | from lib_train import * 6 | 7 | conf.model = 'alexnet' 8 | conf.logdir = 'logs_alexbased_small' 9 | conf.best_weight_file = 'best_alexbased_small_weight.h5' 10 | 11 | # 1. Load Meta data 12 | DATAROOT = Path.home() / '.kaggle/competitions/freesound-audio-tagging' 13 | #Data frame for training dataset 14 | df_train = pd.read_csv(DATAROOT / 'train.csv') 15 | #Plain y_train label 16 | plain_y_train = np.array([conf.label2int[l] for l in df_train.label]) 17 | 18 | # 2. Preprocess data if it's not ready 19 | def fsdkaggle2018_map_y_train(idx_train, plain_y_train): 20 | return np.array([plain_y_train[i] for i in idx_train]) 21 | def fsdkaggle2018_make_preprocessed_train_data(): 22 | conf.folder.mkdir(parents=True, exist_ok=True) 23 | if not os.path.exists(conf.X_train): 24 | XX = mels_build_multiplexed_X(conf, [DATAROOT/'audio_train'/fname for fname in df_train.fname]) 25 | X_train, y_train, X_test, y_test = \ 26 | train_valid_split_multiplexed(conf, XX, plain_y_train, demux=True) 27 | np.save(conf.X_train, X_train) 28 | np.save(conf.y_train, y_train) 29 | np.save(conf.X_test, X_test) 30 | np.save(conf.y_test, y_test) 31 | 32 | fsdkaggle2018_make_preprocessed_train_data() 33 | 34 | # 3. Load all dataset & normalize 35 | X_train, y_train = load_audio_datafiles(conf, conf.X_train, conf.y_train, normalize=True) 36 | X_test, y_test = load_audio_datafiles(conf, conf.X_test, conf.y_test, normalize=True) 37 | print('Loaded train:test = {}:{} samples.'.format(len(X_train), len(X_test))) 38 | 39 | # 4. Train folds 40 | history, model = train_classifier(conf, fold=0, 41 | dataset=[X_train, y_train, X_test, y_test], 42 | model=None, 43 | init_weights=None, # from scratch 44 | #init_weights='../../model/mobilenetv2_small_fsd2018_41cls.h5' 45 | ) 46 | 47 | # 5. Evaluate 48 | evaluate_model(conf, model, X_test, y_test) 49 | 50 | print('___ training finished ___') 51 | -------------------------------------------------------------------------------- /apps/fsdkaggle2018small/train_mobilenetv2.py: -------------------------------------------------------------------------------- 1 | # Train FSDKaggle2018 model 2 | # 3 | import sys 4 | sys.path.append('../..') 5 | from lib_train import * 6 | 7 | conf.logdir = 'logs_mobilenetv2_small' 8 | conf.best_weight_file = 'best_mobilenetv2_small_weight.h5' 9 | 10 | # 1. Load Meta data 11 | DATAROOT = Path.home() / '.kaggle/competitions/freesound-audio-tagging' 12 | #Data frame for training dataset 13 | df_train = pd.read_csv(DATAROOT / 'train.csv') 14 | #Plain y_train label 15 | plain_y_train = np.array([conf.label2int[l] for l in df_train.label]) 16 | 17 | # 2. 
Preprocess data if it's not ready 18 | def fsdkaggle2018_map_y_train(idx_train, plain_y_train): 19 | return np.array([plain_y_train[i] for i in idx_train]) 20 | def fsdkaggle2018_make_preprocessed_train_data(): 21 | conf.folder.mkdir(parents=True, exist_ok=True) 22 | if not os.path.exists(conf.X_train): 23 | XX = mels_build_multiplexed_X(conf, [DATAROOT/'audio_train'/fname for fname in df_train.fname]) 24 | X_train, y_train, X_test, y_test = \ 25 | train_valid_split_multiplexed(conf, XX, plain_y_train, demux=True) 26 | np.save(conf.X_train, X_train) 27 | np.save(conf.y_train, y_train) 28 | np.save(conf.X_test, X_test) 29 | np.save(conf.y_test, y_test) 30 | 31 | fsdkaggle2018_make_preprocessed_train_data() 32 | 33 | # 3. Load all dataset & normalize 34 | X_train, y_train = load_audio_datafiles(conf, conf.X_train, conf.y_train, normalize=True) 35 | X_test, y_test = load_audio_datafiles(conf, conf.X_test, conf.y_test, normalize=True) 36 | print('Loaded train:test = {}:{} samples.'.format(len(X_train), len(X_test))) 37 | 38 | # 4. Train folds 39 | history, model, plain_datagen = train_model(conf, fold=0, 40 | dataset=[X_train, y_train, X_test, y_test], 41 | model=None, 42 | init_weights=None, # from scratch 43 | #init_weights='../../model/mobilenetv2_small_fsd2018_41cls.h5' 44 | ) 45 | 46 | # 5. Evaluate 47 | evaluate_model(conf, model, X_test, y_test) 48 | 49 | print('___ training finished ___') 50 | -------------------------------------------------------------------------------- /apps/urban-sound/UrbanSound8K-Overview&Preprocess.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# A Classification Application For UrbanSound8K Dataset\n", 8 | "\n", 9 | "[UrbanSound8K](https://urbansounddataset.weebly.com/urbansound8k.html) is publically available audio dataset:\n", 10 | "\n", 11 | "> This dataset contains 8732 labeled sound excerpts (<=4s) of urban sounds from 10 classes: air_conditioner, car_horn, children_playing, dog_bark, drilling, enginge_idling, gun_shot, jackhammer, siren, and street_music. The classes are drawn from the [urban sound taxonomy](https://urbansounddataset.weebly.com/taxonomy.html).\n", 12 | "\n", 13 | "The audio files in this dataset are also from [freesound](https://datasets.freesound.org/), but different classes with [FSDKaggle2018](https://www.kaggle.com/c/freesound-audio-tagging) dataset.\n", 14 | "\n", 15 | "As you can find in [\"Recognition of Acoustic Events Using\n", 16 | "Masked Conditional Neural Networks\"](https://arxiv.org/pdf/1802.02617.pdf), state of the art accuracy of this dataset is about 73-74% so far.\n", 17 | "\n", 18 | "- References\n", 19 | "\n", 20 | " - [1] J. Salamon, C. Jacoby and J. P. Bello, [\"A Dataset and Taxonomy for Urban Sound Research\"](http://www.justinsalamon.com/uploads/4/3/9/4/4394963/salamon_urbansound_acmmm14.pdf), 22nd ACM International Conference on Multimedia, Orlando USA, Nov. 2014.\n", 21 | " - [2] Fady Medhat, David Chesmore and John Robinson, [\"Recognition of Acoustic Events Using\n", 22 | "Masked Conditional Neural Networks\"](https://arxiv.org/pdf/1802.02617.pdf), 2018." 
23 | ] 24 | }, 25 | { 26 | "cell_type": "code", 27 | "execution_count": 1, 28 | "metadata": {}, 29 | "outputs": [ 30 | { 31 | "name": "stdout", 32 | "output_type": "stream", 33 | "text": [ 34 | "{'sampling_rate': 44100, 'duration': 1, 'hop_length': 347, 'fmin': 20, 'fmax': 22050, 'n_mels': 128, 'n_fft': 2560, 'model': 'mobilenetv2', 'labels': ['dog_bark', 'children_playing', 'car_horn', 'air_conditioner', 'street_music', 'gun_shot', 'siren', 'engine_idling', 'jackhammer', 'drilling'], 'folder': PosixPath('.'), 'n_fold': 1, 'valid_limit': None, 'random_state': 42, 'test_size': 0.01, 'samples_per_file': 5, 'batch_size': 32, 'learning_rate': 0.0001, 'metric_save_ckpt': 'val_acc', 'epochs': 100, 'verbose': 2, 'best_weight_file': 'best_model_weight.h5', 'rt_process_count': 1, 'rt_oversamples': 10, 'pred_ensembles': 10, 'runtime_model_file': 'model/mobilenetv2_fsd2018_41cls.pb', 'label2int': {'dog_bark': 0, 'children_playing': 1, 'car_horn': 2, 'air_conditioner': 3, 'street_music': 4, 'gun_shot': 5, 'siren': 6, 'engine_idling': 7, 'jackhammer': 8, 'drilling': 9}, 'num_classes': 10, 'samples': 44100, 'rt_chunk_samples': 4410, 'mels_onestep_samples': 4410, 'mels_convert_samples': 48510, 'dims': [128, 128, 1], 'metric_save_mode': 'auto', 'logdir': 'logs', 'data_balancing': 'oversampling', 'X_train': 'X_train.npy', 'y_train': 'y_train.npy', 'X_test': 'X_test.npy', 'y_test': 'y_test.npy', 'steps_per_epoch_limit': None}\n" 35 | ] 36 | }, 37 | { 38 | "name": "stderr", 39 | "output_type": "stream", 40 | "text": [ 41 | "Using TensorFlow backend.\n" 42 | ] 43 | } 44 | ], 45 | "source": [ 46 | "import sys\n", 47 | "sys.path.append('../..')\n", 48 | "from lib_train import *\n", 49 | "%matplotlib inline\n", 50 | "\n", 51 | "DATAROOT = Path('/mnt/dataset/UrbanSound8K') ## Set folder of your copy\n", 52 | "df = pd.read_csv(DATAROOT/'metadata/UrbanSound8K.csv')\n", 53 | "folds = list(set(df.fold))" 54 | ] 55 | }, 56 | { 57 | "cell_type": "markdown", 58 | "metadata": {}, 59 | "source": [ 60 | "## Inside this dataset" 61 | ] 62 | }, 63 | { 64 | "cell_type": "code", 65 | "execution_count": 3, 66 | "metadata": {}, 67 | "outputs": [ 68 | { 69 | "data": { 70 | "text/html": [ 71 | "
\n", 72 | "\n", 85 | "\n", 86 | " \n", 87 | " \n", 88 | " \n", 89 | " \n", 90 | " \n", 91 | " \n", 92 | " \n", 93 | " \n", 94 | " \n", 95 | " \n", 96 | " \n", 97 | " \n", 98 | " \n", 99 | " \n", 100 | " \n", 101 | " \n", 102 | " \n", 103 | " \n", 104 | " \n", 105 | " \n", 106 | " \n", 107 | " \n", 108 | " \n", 109 | " \n", 110 | " \n", 111 | " \n", 112 | " \n", 113 | " \n", 114 | " \n", 115 | " \n", 116 | " \n", 117 | " \n", 118 | " \n", 119 | " \n", 120 | " \n", 121 | " \n", 122 | " \n", 123 | " \n", 124 | " \n", 125 | " \n", 126 | " \n", 127 | " \n", 128 | " \n", 129 | " \n", 130 | " \n", 131 | " \n", 132 | " \n", 133 | " \n", 134 | " \n", 135 | " \n", 136 | " \n", 137 | " \n", 138 | " \n", 139 | " \n", 140 | " \n", 141 | " \n", 142 | " \n", 143 | " \n", 144 | " \n", 145 | " \n", 146 | " \n", 147 | " \n", 148 | " \n", 149 | " \n", 150 | " \n", 151 | " \n", 152 | " \n", 153 | " \n", 154 | " \n", 155 | " \n", 156 | " \n", 157 | " \n", 158 | " \n", 159 | " \n", 160 | " \n", 161 | " \n", 162 | " \n", 163 | " \n", 164 | " \n", 165 | " \n", 166 | " \n", 167 | " \n", 168 | " \n", 169 | " \n", 170 | " \n", 171 | " \n", 172 | " \n", 173 | " \n", 174 | " \n", 175 | " \n", 176 | " \n", 177 | " \n", 178 | " \n", 179 | " \n", 180 | " \n", 181 | " \n", 182 | " \n", 183 | " \n", 184 | " \n", 185 | " \n", 186 | " \n", 187 | " \n", 188 | " \n", 189 | " \n", 190 | " \n", 191 | " \n", 192 | " \n", 193 | " \n", 194 | " \n", 195 | " \n", 196 | " \n", 197 | " \n", 198 | " \n", 199 | " \n", 200 | " \n", 201 | " \n", 202 | " \n", 203 | " \n", 204 | " \n", 205 | " \n", 206 | " \n", 207 | " \n", 208 | " \n", 209 | " \n", 210 | " \n", 211 | "
slice_file_namefsIDstartendsaliencefoldclassIDclass
0100032-3-0-0.wav1000320.0000000.317551153dog_bark
1100263-2-0-117.wav10026358.50000062.500000152children_playing
2100263-2-0-121.wav10026360.50000064.500000152children_playing
3100263-2-0-126.wav10026363.00000067.000000152children_playing
4100263-2-0-137.wav10026368.50000072.500000152children_playing
5100263-2-0-143.wav10026371.50000075.500000152children_playing
6100263-2-0-161.wav10026380.50000084.500000152children_playing
7100263-2-0-3.wav1002631.5000005.500000152children_playing
8100263-2-0-36.wav10026318.00000022.000000152children_playing
9100648-1-0-0.wav1006484.8234025.4719272101car_horn
\n", 212 | "
" 213 | ], 214 | "text/plain": [ 215 | " slice_file_name fsID start end salience fold classID \\\n", 216 | "0 100032-3-0-0.wav 100032 0.000000 0.317551 1 5 3 \n", 217 | "1 100263-2-0-117.wav 100263 58.500000 62.500000 1 5 2 \n", 218 | "2 100263-2-0-121.wav 100263 60.500000 64.500000 1 5 2 \n", 219 | "3 100263-2-0-126.wav 100263 63.000000 67.000000 1 5 2 \n", 220 | "4 100263-2-0-137.wav 100263 68.500000 72.500000 1 5 2 \n", 221 | "5 100263-2-0-143.wav 100263 71.500000 75.500000 1 5 2 \n", 222 | "6 100263-2-0-161.wav 100263 80.500000 84.500000 1 5 2 \n", 223 | "7 100263-2-0-3.wav 100263 1.500000 5.500000 1 5 2 \n", 224 | "8 100263-2-0-36.wav 100263 18.000000 22.000000 1 5 2 \n", 225 | "9 100648-1-0-0.wav 100648 4.823402 5.471927 2 10 1 \n", 226 | "\n", 227 | " class \n", 228 | "0 dog_bark \n", 229 | "1 children_playing \n", 230 | "2 children_playing \n", 231 | "3 children_playing \n", 232 | "4 children_playing \n", 233 | "5 children_playing \n", 234 | "6 children_playing \n", 235 | "7 children_playing \n", 236 | "8 children_playing \n", 237 | "9 car_horn " 238 | ] 239 | }, 240 | "execution_count": 3, 241 | "metadata": {}, 242 | "output_type": "execute_result" 243 | } 244 | ], 245 | "source": [ 246 | "df[:10]" 247 | ] 248 | }, 249 | { 250 | "cell_type": "markdown", 251 | "metadata": {}, 252 | "source": [ 253 | "The following shows that most of data length are 4 seconds." 254 | ] 255 | }, 256 | { 257 | "cell_type": "code", 258 | "execution_count": 4, 259 | "metadata": {}, 260 | "outputs": [ 261 | { 262 | "data": { 263 | "text/plain": [ 264 | "count 8732.000000\n", 265 | "mean 3.607904\n", 266 | "std 0.973570\n", 267 | "min 0.054517\n", 268 | "25% 4.000000\n", 269 | "50% 4.000000\n", 270 | "75% 4.000000\n", 271 | "max 4.000000\n", 272 | "dtype: float64" 273 | ] 274 | }, 275 | "execution_count": 4, 276 | "metadata": {}, 277 | "output_type": "execute_result" 278 | } 279 | ], 280 | "source": [ 281 | "(df.end - df.start).describe()" 282 | ] 283 | }, 284 | { 285 | "cell_type": "markdown", 286 | "metadata": {}, 287 | "source": [ 288 | "## Duplication with FSDKaggle2018\n", 289 | "\n", 290 | "As shown below, there are duplicated Freesound ID with FSDKaggle2018 dataset.\n", 291 | "\n", 292 | "___Due to these duplication, we are NOT using FSDKaggle2018 pretrained model to evaluate performance on this dataset. 
Or performance would be too good to be true.___\n" 293 | ] 294 | }, 295 | { 296 | "cell_type": "code", 297 | "execution_count": 5, 298 | "metadata": { 299 | "collapsed": true 300 | }, 301 | "outputs": [], 302 | "source": [ 303 | "fsd = pd.read_csv('~/.kaggle/competitions/freesound-audio-tagging/train_post_competition.csv')" 304 | ] 305 | }, 306 | { 307 | "cell_type": "code", 308 | "execution_count": 6, 309 | "metadata": {}, 310 | "outputs": [ 311 | { 312 | "name": "stdout", 313 | "output_type": "stream", 314 | "text": [ 315 | "Number of duplicated Freesound ID: 130\n", 316 | "Duplicated samples are distributed over folds: [ 5 2 10 1 4 3 8 6 9 7]\n" 317 | ] 318 | } 319 | ], 320 | "source": [ 321 | "usids = np.array(df.fsID.unique(), dtype=np.int)\n", 322 | "fsids = np.array(fsd.freesound_id.unique(), dtype=np.int)\n", 323 | "dup_ids = [uid for uid in usids if uid in fsids]\n", 324 | "print('Number of duplicated Freesound ID:', len(dup_ids))\n", 325 | "print('Duplicated samples are distributed over folds:', df[df.fsID.isin(dup_ids)].fold.unique())" 326 | ] 327 | }, 328 | { 329 | "cell_type": "markdown", 330 | "metadata": {}, 331 | "source": [ 332 | "## Convert to numpy array files" 333 | ] 334 | }, 335 | { 336 | "cell_type": "code", 337 | "execution_count": 7, 338 | "metadata": { 339 | "collapsed": true 340 | }, 341 | "outputs": [], 342 | "source": [ 343 | "# Convert data for each fold\n", 344 | "for fold in folds:\n", 345 | " cur_df =df[df.fold.isin([fold])]\n", 346 | " Xfiles = [str(DATAROOT/'audio'/('fold%d' % (r.fold))/r.slice_file_name) for i, r in cur_df.iterrows()]\n", 347 | " y = [conf.label2int[r['class']] for i, r in cur_df.iterrows()]\n", 348 | " XX = mels_build_multiplexed_X(conf, Xfiles)\n", 349 | " X, y = mels_demux_XX_y(XX, y)\n", 350 | " np.save('X_fold%d.npy' % fold, X)\n", 351 | " np.save('y_fold%d.npy' % fold, y)" 352 | ] 353 | }, 354 | { 355 | "cell_type": "code", 356 | "execution_count": null, 357 | "metadata": { 358 | "collapsed": true 359 | }, 360 | "outputs": [], 361 | "source": [] 362 | } 363 | ], 364 | "metadata": { 365 | "kernelspec": { 366 | "display_name": "Python 3", 367 | "language": "python", 368 | "name": "python3" 369 | }, 370 | "language_info": { 371 | "codemirror_mode": { 372 | "name": "ipython", 373 | "version": 3 374 | }, 375 | "file_extension": ".py", 376 | "mimetype": "text/x-python", 377 | "name": "python", 378 | "nbconvert_exporter": "python", 379 | "pygments_lexer": "ipython3", 380 | "version": "3.6.2" 381 | } 382 | }, 383 | "nbformat": 4, 384 | "nbformat_minor": 2 385 | } 386 | -------------------------------------------------------------------------------- /apps/urban-sound/config.py: -------------------------------------------------------------------------------- 1 | # Freesound Dataset Kaggle 2018 2 | # Application configurations 3 | 4 | from easydict import EasyDict 5 | 6 | conf = EasyDict() 7 | 8 | # Basic configurations 9 | conf.sampling_rate = 44100 10 | conf.duration = 1 11 | conf.hop_length = 347 # to make time steps 128 12 | conf.fmin = 20 13 | conf.fmax = conf.sampling_rate // 2 14 | conf.n_mels = 128 15 | conf.n_fft = conf.n_mels * 20 16 | conf.model = 'mobilenetv2' # 'alexnet' 17 | 18 | # Labels 19 | conf.labels = ['dog_bark', 'children_playing', 'car_horn', 'air_conditioner', 20 | 'street_music', 'gun_shot', 'siren', 'engine_idling', 'jackhammer', 21 | 'drilling'] 22 | 23 | # Training configurations 24 | conf.folder = '.' 
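# Note that checkpointing in this app monitors validation accuracy
# (conf.metric_save_ckpt = 'val_acc' below) rather than the 'val_loss' default
# that auto_complete_conf() in common.py would otherwise set. The per-fold
# X_fold*.npy / y_fold*.npy files are produced by UrbanSound8K-Overview&Preprocess.ipynb.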
25 | conf.n_fold = 1 26 | conf.valid_limit = None 27 | conf.random_state = 42 28 | conf.test_size = 0.01 29 | conf.samples_per_file = 5 30 | conf.batch_size = 32 31 | conf.learning_rate = 0.0001 32 | conf.metric_save_ckpt = 'val_acc' 33 | conf.epochs = 100 34 | conf.verbose = 2 35 | conf.best_weight_file = 'best_model_weight.h5' 36 | 37 | # Runtime conficurations 38 | conf.rt_process_count = 1 39 | conf.rt_oversamples = 10 40 | conf.pred_ensembles = 10 41 | conf.runtime_model_file = 'model/mobilenetv2_fsd2018_41cls.pb' 42 | -------------------------------------------------------------------------------- /apps/urban-sound/train.py: -------------------------------------------------------------------------------- 1 | # Train FSDKaggle2018 model 2 | # 3 | import sys 4 | sys.path.append('../..') 5 | from lib_train import * 6 | 7 | # Load all dataset & normalize 8 | X_train, y_train = load_dataset(conf, 'X_train.npy', 'y_train.npy', normalize=True) 9 | X_test, y_test = load_dataset(conf, 'X_test.npy', 'y_test.npy', normalize=True) 10 | print('Loaded train:test = {}:{} samples.'.format(len(X_train), len(X_test))) 11 | 12 | # Train folds 13 | history, model, plain_datagen = train_classifier(conf, fold=0, 14 | dataset=[X_train, y_train], 15 | model=None, 16 | #init_weights=None, # from scratch 17 | init_weights='../../model/mobilenetv2_fsd2018_41cls.h5') 18 | acc = evaluate_model(conf, model, plain_datagen, X_test, y_test) 19 | 20 | print('___ training finished ___') 21 | -------------------------------------------------------------------------------- /common.py: -------------------------------------------------------------------------------- 1 | # Common functions and definitions 2 | # 3 | # This file defines commonly used parts for ease of programming. 4 | # Import as follows: 5 | # 6 | # > from common import * 7 | # 8 | # Private notation(s): 9 | # - mels = melspectrogram 10 | # 11 | 12 | # # Basic definitions 13 | import warnings 14 | warnings.simplefilter('ignore') 15 | import numpy as np 16 | np.warnings.filterwarnings('ignore') 17 | np.random.seed(1001) 18 | 19 | import sys 20 | import shutil 21 | from pathlib import Path 22 | import pandas as pd 23 | import matplotlib.pyplot as plt 24 | sys.path.insert(0, str(Path.cwd())) 25 | 26 | # # Configration 27 | def is_handling_audio(conf): 28 | return 'sampling_rate' in conf 29 | 30 | def test_conf(conf): 31 | if conf.model not in ['mobilenetv2', 'alexnet']: 32 | raise Exception('conf.model not recognized: {}'.format(conf.model)) 33 | if conf.data_balancing not in ['over_sampling', 'under_sampling', 34 | 'by_generator', 'dont_balance']: 35 | raise Exception('conf.data_balancing not recognized: {}'.format(conf.data_balancing)) 36 | 37 | def auto_complete_conf(conf): 38 | if 'folder' in conf: 39 | conf.folder = Path(conf.folder) 40 | conf.label2int = {l:i for i, l in enumerate(conf.labels)} 41 | conf.num_classes = len(conf.labels) 42 | # audio auto configurations 43 | if is_handling_audio(conf): 44 | conf.samples = conf.sampling_rate * conf.duration 45 | conf.rt_chunk_samples = conf.sampling_rate // conf.rt_oversamples 46 | conf.mels_onestep_samples = conf.rt_chunk_samples * conf.rt_process_count 47 | conf.mels_convert_samples = conf.samples + conf.mels_onestep_samples 48 | conf.dims = (conf.n_mels, 1 + int(np.floor(conf.samples/conf.hop_length)), 1) 49 | # optional configurations 50 | if 'model' not in conf: 51 | conf.model = 'mobilenetv2' 52 | if 'metric_save_ckpt' not in conf: 53 | conf.metric_save_ckpt = 'val_loss' 54 | if 'metric_save_mode' not 
in conf: 55 | conf.metric_save_mode='auto' 56 | if 'logdir' not in conf: 57 | conf.logdir = 'logs' 58 | if 'data_balancing' not in conf: 59 | conf.data_balancing = 'over_sampling' 60 | if 'X_train' not in conf: 61 | conf.X_train = 'X_train.npy' 62 | conf.y_train = 'y_train.npy' 63 | conf.X_test = 'X_test.npy' 64 | conf.y_test = 'y_test.npy' 65 | if 'steps_per_epoch_limit' not in conf: 66 | conf.steps_per_epoch_limit = None 67 | if 'aug_mixup_alpha' not in conf: 68 | conf.aug_mixup_alpha = 1.0 69 | if 'samples_per_file' not in conf: 70 | conf.samples_per_file = 1 71 | if 'eval_ensemble' not in conf: 72 | conf.eval_ensemble = True # Set False if samples_per_file > 1 but ensemble is not available 73 | if 'what_is_sample' not in conf: 74 | conf.what_is_sample = 'log mel-spectrogram' 75 | if 'use_audio_training_model' not in conf: 76 | conf.use_audio_training_model = True 77 | 78 | from config import * 79 | auto_complete_conf(conf) 80 | print(conf) 81 | 82 | # # Data utilities 83 | def load_labels(conf): 84 | conf.labels = load_npy(conf, 'labels.npy') 85 | auto_complete_conf(conf) 86 | print('Labels are', conf.labels) 87 | 88 | def datapath(conf, filename): 89 | return conf.folder / filename 90 | 91 | def load_npy(conf, filename): 92 | return np.load(conf.folder / filename) 93 | 94 | # # Model 95 | if conf.use_audio_training_model: 96 | from sound_models import create_model, freeze_model_layers 97 | 98 | # # Audio Utilities 99 | import librosa 100 | import librosa.display 101 | 102 | def read_audio(conf, pathname, trim_long_data): 103 | y, sr = librosa.load(pathname, sr=conf.sampling_rate) 104 | # trim silence 105 | if 0 < len(y): # workaround: 0 length causes error 106 | y, _ = librosa.effects.trim(y) # trim, top_db=default(60) 107 | # make it unified length to conf.samples 108 | if len(y) > conf.samples: # long enough 109 | if trim_long_data: 110 | y = y[0:0+conf.samples] 111 | else: # pad blank 112 | padding = conf.samples - len(y) # add padding at both ends 113 | offset = padding // 2 114 | y = np.pad(y, (offset, conf.samples - len(y) - offset), 'constant') 115 | return y 116 | 117 | def audio_to_melspectrogram(conf, audio): 118 | spectrogram = librosa.feature.melspectrogram(audio, 119 | sr=conf.sampling_rate, 120 | n_mels=conf.n_mels, 121 | hop_length=conf.hop_length, 122 | n_fft=conf.n_fft, 123 | fmin=conf.fmin, 124 | fmax=conf.fmax) 125 | spectrogram = librosa.power_to_db(spectrogram) 126 | spectrogram = spectrogram.astype(np.float32) 127 | return spectrogram 128 | 129 | def show_melspectrogram(conf, mels, title='Log-frequency power spectrogram'): 130 | import IPython 131 | import matplotlib 132 | from sklearn.model_selection import StratifiedKFold 133 | matplotlib.style.use('ggplot') 134 | 135 | librosa.display.specshow(mels, x_axis='time', y_axis='mel', 136 | sr=conf.sampling_rate, hop_length=conf.hop_length, 137 | fmin=conf.fmin, fmax=conf.fmax) 138 | plt.colorbar(format='%+2.0f dB') 139 | plt.title(title) 140 | plt.show() 141 | 142 | def read_as_melspectrogram(conf, pathname, trim_long_data, debug_display=False): 143 | x = read_audio(conf, pathname, trim_long_data) 144 | mels = audio_to_melspectrogram(conf, x) 145 | if debug_display: 146 | IPython.display.display(IPython.display.Audio(x, rate=conf.sampling_rate)) 147 | show_melspectrogram(conf, mels) 148 | return mels 149 | 150 | # # Dataset Utilities 151 | def deprecated_samplewise_mean_audio_X(X): 152 | for i in range(len(X)): 153 | X[i] -= np.mean(X[i]) 154 | X[i] /= (np.std(X[i]) + 1.0) 155 | 156 | def 
samplewise_normalize_audio_X(X): 157 | for i in range(len(X)): 158 | X[i] -= np.min(X[i]) 159 | X[i] /= (np.max(np.abs(X[i])) + 1.0) 160 | 161 | def samplewise_normalize_X(X): 162 | for i in range(len(X)): 163 | X[i] -= np.min(X[i]) 164 | X[i] /= (np.max(np.abs(X[i])) + 1e-07) # same as K.epsilon() 165 | 166 | def split_long_data(conf, X): 167 | # Splits long mel-spectrogram data with small overlap 168 | L = X.shape[1] 169 | one_length = conf.dims[1] 170 | loop_length = int(one_length * 0.9) 171 | min_length = int(one_length * 0.2) 172 | print(' sample length', L, 'to split by', one_length) 173 | for idx in range(L // loop_length): 174 | cur = loop_length * idx 175 | rest = L - cur 176 | if one_length <= rest: 177 | yield X[:, cur:cur+one_length] 178 | elif min_length <= rest: 179 | cur = L - one_length 180 | yield X[:, cur:cur+one_length] 181 | 182 | def mels_len(mels): 183 | """Gets lenfth of log mel-spectrogram.""" 184 | return mels.shape[1] 185 | 186 | def audio_sample_to_X(conf, wave): 187 | mels = audio_to_melspectrogram(conf, wave) 188 | X = [] 189 | for s in range(0, mels_len(mels) // conf.dims[1]): 190 | cur = s * conf.dims[1] 191 | X.append(mels[:, cur:cur + conf.dims[1]][..., np.newaxis]) 192 | X = np.array(X) 193 | samplewise_normalize_audio_X(X) 194 | return X 195 | 196 | def load_sample_as_X(conf, filename, trim_long_data): 197 | wave = read_audio(conf, filename, trim_long_data) 198 | return audio_sample_to_X(conf, wave) 199 | 200 | def geometric_mean_preds(_preds): 201 | preds = _preds.copy() 202 | for i in range(1, preds.shape[0]): 203 | preds[0] = np.multiply(preds[0], preds[i]) 204 | return np.power(preds[0], 1/preds.shape[0]) 205 | 206 | # # Tensorflow Utilities 207 | import tensorflow as tf 208 | 209 | def load_graph(model_file): 210 | graph = tf.Graph() 211 | graph_def = tf.GraphDef() 212 | 213 | with open(model_file, "rb") as f: 214 | graph_def.ParseFromString(f.read()) 215 | with graph.as_default(): 216 | tf.import_graph_def(graph_def) 217 | return graph 218 | 219 | class KerasTFGraph: 220 | def __init__(self, model_pb_filename, input_name, 221 | keras_learning_phase_name, output_name): 222 | self.graph = load_graph(model_pb_filename) 223 | self.layer_in = self.graph.get_operation_by_name(input_name) 224 | self.leayer_klp = self.graph.get_operation_by_name(keras_learning_phase_name) 225 | self.layer_out = self.graph.get_operation_by_name(output_name) 226 | self.sess = tf.Session(graph=self.graph) 227 | def predict(self, X): 228 | preds = self.sess.run(self.layer_out.outputs[0], 229 | {self.layer_in.outputs[0]: X, 230 | self.leayer_klp.outputs[0]: 0}) 231 | return preds 232 | def close(self): 233 | self.sess.close() 234 | 235 | def load_keras_tf_graph(conf, graph_file): 236 | model_node = { 237 | 'alexnet': ['import/conv2d_1_input', 238 | 'import/batch_normalization_1/keras_learning_phase', 239 | 'import/output0'], 240 | 'mobilenetv2': ['import/input_1', 241 | 'import/bn_Conv1/keras_learning_phase', 242 | 'import/output0'] 243 | } 244 | return KerasTFGraph( 245 | conf.runtime_model_file if graph_file == '' else graph_file, 246 | input_name=model_node[conf.model][0], 247 | keras_learning_phase_name=model_node[conf.model][1], 248 | output_name=model_node[conf.model][2]) 249 | 250 | # # Pyaudio Utilities 251 | if is_handling_audio(conf): 252 | import pyaudio 253 | 254 | def print_pyaudio_devices(): 255 | p = pyaudio.PyAudio() 256 | count = p.get_device_count() 257 | for i in range(count): 258 | dev = p.get_device_info_by_index(i) 259 | print (i, dev['name'], dev) 260 | 
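# To find the input device index to pass to realtime_predictor.py's --input
# option, call print_pyaudio_devices() above; deskwork_detector.py, for example,
# lists devices this way when run with a negative index (-i -1).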
261 | # # Test Utilities 262 | def recursive_test(a, b, fn): 263 | """Greedy test every single corresponding contents between a & b recursively.""" 264 | if isinstance(a, (list, set, tuple, np.ndarray)): 265 | results = np.array([test_equal(aa, bb) for aa, bb in zip(a, b)]) 266 | #print(results) # for debug 267 | return np.all(results == 1) 268 | else: 269 | return 1 if np.all(fn(a, b)) else 0 270 | 271 | def test_equal(a, b): 272 | """Exhaustively test if a equals b""" 273 | return recursive_test(a, b, lambda a, b: a == b) 274 | 275 | def test_not_equal(a, b): 276 | """Exhaustively test if a != b""" 277 | return not test_equal(a, b) 278 | 279 | -------------------------------------------------------------------------------- /config.py: -------------------------------------------------------------------------------- 1 | apps/fsdkaggle2018/config.py -------------------------------------------------------------------------------- /convert_keras_to_tf.py: -------------------------------------------------------------------------------- 1 | import sys 2 | sys.path.append('../..') 3 | from common import * 4 | 5 | import argparse 6 | 7 | parser = argparse.ArgumentParser(description='Keras to Tensorflow converter') 8 | parser.add_argument('model_type', type=str, 9 | help='Model type "alexnet" or "mobilenetv2".') 10 | parser.add_argument('keras_weight', type=str, 11 | help='Keras model weight file.') 12 | parser.add_argument('out_prefix', type=str, 13 | help='Prefix of your tensorflow model name.') 14 | args = parser.parse_args() 15 | print(args, dir(args)) 16 | 17 | # create model 18 | conf.model = args.model_type 19 | model = create_model(conf) 20 | model.load_weights(args.keras_weight) 21 | 22 | # load tensorflow and keras backend 23 | import tensorflow as tf 24 | from tensorflow.python.framework import graph_util 25 | from tensorflow.python.framework import graph_io 26 | from keras import backend as K 27 | ksess = K.get_session() 28 | print(ksess) 29 | 30 | # transform keras model to tensorflow graph 31 | # the output will be json-like format 32 | K.set_learning_phase(0) 33 | graph = ksess.graph 34 | kgraph = graph.as_graph_def() 35 | print(kgraph) 36 | 37 | import os 38 | num_output = 1 39 | prefix = "output" 40 | pred = [None]*num_output 41 | outputName = [None]*num_output 42 | for i in range(num_output): 43 | outputName[i] = prefix + str(i) 44 | pred[i] = tf.identity(model.get_output_at(i), name=outputName[i]) 45 | print('output name: ', outputName) 46 | 47 | # convert variables in the model graph to constants 48 | constant_graph = graph_util.convert_variables_to_constants( 49 | ksess, ksess.graph.as_graph_def(), outputName) 50 | 51 | # save the model in .pb and .txt 52 | output_dir = "./" 53 | output_graph_name = args.out_prefix+".pb" 54 | output_text_name = args.out_prefix+".txt" 55 | graph_io.write_graph(constant_graph, output_dir, output_graph_name, as_text=False) 56 | graph_io.write_graph(constant_graph, output_dir, output_text_name, as_text=True) 57 | print('saved graph .pb at: {0}\nsaved graph .txt at: {1}'.format( 58 | os.path.join(output_dir, output_graph_name), 59 | os.path.join(output_dir, output_text_name))) 60 | -------------------------------------------------------------------------------- /deskwork_detector.py: -------------------------------------------------------------------------------- 1 | from realtime_predictor import * 2 | 3 | emoji = {'Writing': '\U0001F4DD ', 'Scissors': '\u2701 ', 4 | 'Computer_keyboard': '\u2328 '} 5 | 6 | def 
on_predicted_deskwork(ensembled_pred): 7 | result = np.argmax(ensembled_pred) 8 | label = conf.labels[result] 9 | if label in ['Writing', 'Scissors', 'Computer_keyboard']: 10 | p = ensembled_pred[result] 11 | level = int(p*10) + 1 12 | print(emoji[label] * level, label, p) 13 | 14 | if __name__ == '__main__': 15 | model = get_model(args.model_pb_graph) 16 | # file mode 17 | if args.input_file != '': 18 | process_file(model, args.input_file, on_predicted_deskwork) 19 | my_exit(model) 20 | # device list display mode 21 | if args.input < 0: 22 | print_pyaudio_devices() 23 | my_exit(model) 24 | # normal: realtime mode 25 | FORMAT = pyaudio.paInt16 26 | CHANNELS = 1 27 | audio = pyaudio.PyAudio() 28 | stream = audio.open( 29 | format=FORMAT, 30 | channels=CHANNELS, 31 | rate=conf.sampling_rate, 32 | input=True, 33 | input_device_index=args.input, 34 | frames_per_buffer=conf.rt_chunk_samples, 35 | start=False, 36 | stream_callback=callback # uncomment for non_blocking 37 | ) 38 | # main loop 39 | stream.start_stream() 40 | while stream.is_active(): 41 | main_process(model, on_predicted_deskwork) 42 | time.sleep(0.001) 43 | stream.stop_stream() 44 | stream.close() 45 | # finish 46 | audio.terminate() 47 | my_exit(model) 48 | 49 | 50 | -------------------------------------------------------------------------------- /ext/download.sh: -------------------------------------------------------------------------------- 1 | curl -O https://raw.githubusercontent.com/bckenstler/CLR/master/clr_callback.py 2 | curl -O https://raw.githubusercontent.com/daisukelab/aug-agg/master/background_mixer.py 3 | curl -O https://raw.githubusercontent.com/daisukelab/balanced-mixup-generator/master/mixup_generator/balanced_mixup_generator.py 4 | curl -O https://raw.githubusercontent.com/daisukelab/balanced-mixup-generator/master/mixup_generator/mixup_generator.py 5 | curl -O https://raw.githubusercontent.com/daisukelab/balanced-mixup-generator/master/mixup_generator/random_eraser.py 6 | curl -O https://raw.githubusercontent.com/daisukelab/project-ml/master/projectml.py 7 | -------------------------------------------------------------------------------- /lib_train.py: -------------------------------------------------------------------------------- 1 | # Training functions and definitions 2 | # 3 | # This contains all we need to train. 
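# Training depends on the helper modules fetched into ext/ by ext/download.sh
# (clr_callback, mixup_generator, balanced_mixup_generator, random_eraser).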
4 | # Import as follows: 5 | # 6 | # > from lib_train import * 7 | # 8 | 9 | from common import * 10 | import os 11 | from ext.random_eraser import get_random_eraser 12 | from ext.mixup_generator import MixupGenerator 13 | from ext.balanced_mixup_generator import BalancedMixupGenerator 14 | from ext.clr_callback import CyclicLR 15 | from sklearn.model_selection import train_test_split 16 | from sklearn.metrics import f1_score, precision_score, recall_score, accuracy_score 17 | import keras 18 | from keras.preprocessing.image import ImageDataGenerator 19 | from keras.callbacks import (EarlyStopping, LearningRateScheduler, 20 | ModelCheckpoint, TensorBoard, ReduceLROnPlateau) 21 | from keras import backend as K 22 | 23 | from imblearn.over_sampling import RandomOverSampler 24 | from imblearn.under_sampling import RandomUnderSampler 25 | 26 | # # Dataset Utilities 27 | def mels_populate_samples(org_mels, targ_len, targ_samples): 28 | """Populates training samples from original full length wave's log mel-spectrogram.""" 29 | org_len = mels_len(org_mels) 30 | assert org_len >= targ_len 31 | one_step = np.min([float(targ_len), 32 | (org_len - targ_len + 1) / targ_samples]) 33 | generated = [] 34 | for i in range(targ_samples): 35 | cur = int(one_step * i) 36 | generated.append(org_mels[:, cur:cur+targ_len]) 37 | return np.array(generated) 38 | 39 | def test_mels_populate_samples(): 40 | if True: # Brute force test 41 | T = 10 42 | raw = np.zeros((64, T)) 43 | for i in range(T): 44 | raw[:, i] = i 45 | raw[0, :] = 0 # placing 0 at the bottom 46 | for k in range(1, T + 1): 47 | for i in range(1, 100 + 1): 48 | try: 49 | generated = mels_populate_samples(raw, k, i) 50 | if generated.shape[0] != i or generated.shape[2] != k: 51 | print('Fail at i={}, k={}, generated.shape={}'.format(i, k, generated.shape)) 52 | except: 53 | print('Exception at i={}, k={}'.format(i, k)) 54 | break 55 | show_melspectrogram(conf, raw) 56 | print('Test finished.') 57 | else: # short test 58 | k, i = 1, 11 59 | generated = mels_populate_samples(raw, k, i) 60 | #test_mels_populate_samples() 61 | 62 | def mels_build_multiplexed_X(conf, X_files): 63 | """Builds multiplexed input data (X) from list of files.""" 64 | XX = np.zeros((len(X_files), conf.samples_per_file, conf.dims[0], conf.dims[1])) 65 | for i, file in enumerate(X_files): 66 | whole_mels = read_as_melspectrogram(conf, file, trim_long_data=False) 67 | multiplexed = mels_populate_samples(whole_mels, conf.dims[1], conf.samples_per_file) 68 | XX[i, :, :, :] = multiplexed 69 | return XX[..., np.newaxis] 70 | 71 | def mels_demux_XX_y(XX, y): 72 | """De-multiplex data.""" 73 | dims = XX.shape 74 | X = XX.reshape(dims[0] * dims[1], dims[2], dims[3], dims[4]) 75 | y = np.repeat(y, dims[1], axis=0) # ex. if dims[1]==2) [1, 2, 3] -> [1, 1, 2, 2, 3, 3] 76 | return X, y 77 | 78 | def generate_y_from_pathname(X_files): 79 | """Generates y data from pathname of files.""" 80 | labeled_y = [Path(f).parent.name for f in X_files] 81 | labels = sorted(list(set(labeled_y))) 82 | label2int = {label: i for i, label in enumerate(labels)} 83 | y = np.array([label2int[a_y] for a_y in labeled_y]) 84 | print('Set labels to config.py; conf.labels =', labels) 85 | return y, labels, label2int 86 | 87 | def train_valid_split_multiplexed(conf, XX, y, demux=True, delta_random_state=0): 88 | """Splits multiplexed set of data. 89 | 90 | XX[sample_no, multiplex_no, ...] to be 3 or more dimentional data vector. 91 | y[sample_no] can be both one-hot or label index. 
92 | demux==True will output X, False will output raw split XX""" 93 | assert len(XX) == len(y) 94 | def split_fold(XX, y, fold_list, demux): 95 | _XX, _y = XX[fold_list], y[fold_list] 96 | if demux: 97 | return mels_demux_XX_y(_XX, _y) 98 | else: 99 | return _XX, _y 100 | # decode y if it is one-hot vector 101 | _y = flatten_y_if_onehot(y) 102 | print() 103 | # split train/valid 104 | train_fold, valid_fold, _, _ = train_test_split(list(range(len(_y))), _y, 105 | test_size=conf.test_size, 106 | random_state=conf.random_state + delta_random_state) 107 | X_or_XX_train, y_train = split_fold(XX, y, train_fold, demux) 108 | X_or_XX_valid, y_valid = split_fold(XX, y, valid_fold, demux) 109 | # train/valid are ready 110 | return X_or_XX_train, y_train, X_or_XX_valid, y_valid 111 | 112 | def load_audio_datafiles(conf, X_or_XX_file, y_file, normalize): 113 | X_or_XX, y = load_npy(conf, X_or_XX_file), \ 114 | keras.utils.to_categorical(load_npy(conf, y_file)) 115 | if normalize: 116 | print(' normalize samplewise') 117 | if len(X_or_XX.shape) == 5: 118 | for X in X_or_XX: # it is XX 119 | samplewise_normalize_audio_X(X) 120 | else: 121 | samplewise_normalize_audio_X(X_or_XX) # it is X 122 | return X_or_XX, y 123 | 124 | def load_datafiles(conf, X_file, y_file=None, normalize=True): 125 | X = load_npy(conf, X_file) 126 | if y_file is not None: 127 | y = keras.utils.to_categorical(load_npy(conf, y_file)) 128 | if normalize: 129 | print(' normalize samplewise') 130 | samplewise_normalize_X(X) 131 | return (X, y) if y_file is not None else X 132 | 133 | # # Data Distribution Utilities 134 | def flatten_y_if_onehot(y): 135 | """De-one-hot y, i.e. [0,1,0,0,...] to 1 for all y.""" 136 | return y if len(np.array(y).shape) == 1 else np.argmax(y, axis = -1) 137 | 138 | def get_class_distribution(y): 139 | """Calculate number of samples per class.""" 140 | # y_cls can be one of [OH label, index of class, class label name string] 141 | # convert OH to index of class 142 | y_cls = flatten_y_if_onehot(y) 143 | # y_cls can be one of [index of class, class label name] 144 | classset = sorted(list(set(y_cls))) 145 | sample_distribution = {cur_cls:len([one for one in y_cls if one == cur_cls]) for cur_cls in classset} 146 | return sample_distribution 147 | 148 | def get_class_distribution_list(y, num_classes): 149 | """Calculate number of samples per class as list""" 150 | dist = get_class_distribution(y) 151 | assert(y[0].__class__ != str) # class index or class OH label only 152 | list_dist = np.zeros((num_classes)) 153 | for i in range(num_classes): 154 | if i in dist: 155 | list_dist[i] = dist[i] 156 | return list_dist 157 | 158 | def _balance_class(X, y, min_or_max, sampler_class, random_state): 159 | """Balance class distribution with sampler_class.""" 160 | y_cls = flatten_y_if_onehot(y) 161 | distribution = get_class_distribution(y_cls) 162 | classes = list(distribution.keys()) 163 | counts = list(distribution.values()) 164 | nsamples = np.max(counts) if min_or_max == 'max' \ 165 | else np.min(counts) 166 | flat_ratio = {cls:nsamples for cls in classes} 167 | Xidx = [[xidx] for xidx in range(len(X))] 168 | sampler_instance = sampler_class(ratio=flat_ratio, random_state=random_state) 169 | Xidx_resampled, y_cls_resampled = sampler_instance.fit_sample(Xidx, y_cls) 170 | sampled_index = [idx[0] for idx in Xidx_resampled] 171 | return np.array([X[idx] for idx in sampled_index]), np.array([y[idx] for idx in sampled_index]) 172 | 173 | def balance_class_by_over_sampling(X, y, random_state=42): 174 | """Balance 
class distribution with imbalanced-learn RandomOverSampler.""" 175 | return _balance_class(X, y, 'max', RandomOverSampler, random_state) 176 | 177 | def balance_class_by_under_sampling(X, y, random_state=42): 178 | """Balance class distribution with imbalanced-learn RandomUnderSampler.""" 179 | return _balance_class(X, y, 'min', RandomUnderSampler, random_state) 180 | 181 | def visualize_class_balance(title, y, labels): 182 | sample_dist_list = get_class_distribution_list(y, len(labels)) 183 | index = range(len(labels)) 184 | fig, ax = plt.subplots(1, 1, figsize = (16, 5)) 185 | ax.bar(index, sample_dist_list) 186 | ax.set_xlabel('Label') 187 | ax.set_xticks(index) 188 | ax.set_xticklabels(labels, rotation='vertical') 189 | ax.set_ylabel('Number of Samples') 190 | ax.set_title(title) 191 | fig.show() 192 | 193 | def print_class_balance(title, y, labels): 194 | distributions = get_class_distribution(y) 195 | dist_dic = {labels[cls]:distributions[cls] for cls in distributions} 196 | print(title, '=', dist_dic) 197 | zeroclasses = [label for i, label in enumerate(labels) if i not in distributions.keys()] 198 | if 0 < len(zeroclasses): 199 | print(' 0 sample classes:', zeroclasses) 200 | 201 | # # Training Functions 202 | def create_train_generator(conf, _Xtrain, _ytrain, image_data_generator=None): 203 | # Create Keras ImageDataGenerator 204 | def print_generator_use(message): 205 | print(' {}{}'.format(message, 206 | ', with class balancing' if conf.data_balancing == 'by_generator' else '')) 207 | if image_data_generator is None: 208 | aug_datagen = ImageDataGenerator( 209 | rotation_range=0, 210 | width_shift_range=0.4, 211 | height_shift_range=0.0, 212 | horizontal_flip=True, 213 | preprocessing_function=get_random_eraser(v_l=0, v_h=1)) 214 | print_generator_use('Using normal data generator') 215 | else: 216 | aug_datagen = image_data_generator 217 | print_generator_use('Using Special data generator') 218 | # Create Generators 219 | mixup_class = MixupGenerator if conf.data_balancing != 'by_generator' \ 220 | else BalancedMixupGenerator 221 | train_generator = mixup_class(_Xtrain, _ytrain, 222 | alpha=conf.aug_mixup_alpha, batch_size=conf.batch_size, 223 | datagen=aug_datagen)() 224 | return train_generator 225 | 226 | def get_steps_per_epoch(conf, _Xtrain): 227 | train_steps_per_epoch = (len(_Xtrain) + conf.batch_size - 1) // conf.batch_size 228 | if conf.steps_per_epoch_limit is not None: 229 | train_steps_per_epoch = np.clip(train_steps_per_epoch, train_steps_per_epoch, 230 | conf.steps_per_epoch_limit) 231 | if conf.verbose > 0: 232 | print(' train_steps_per_epoch, {}'.format(train_steps_per_epoch)) 233 | return train_steps_per_epoch 234 | 235 | def balance_dataset(conf, X, y): 236 | if conf.data_balancing == 'over_sampling' or conf.data_balancing == 'under_sampling': 237 | print_class_balance(' Current category distribution', y, conf.labels) 238 | X, y = balance_class_by_over_sampling(X, y) if conf.data_balancing == 'over_sampling' \ 239 | else balance_class_by_under_sampling(X, y) 240 | print_class_balance(' Balanced distribution', y, conf.labels) 241 | else: 242 | print(' Dataset is not balanced so far, conf.data_balancing =', conf.data_balancing) 243 | return X, y 244 | 245 | def calculate_metrics(conf, y_true, y_pred): 246 | """Calculate possible metrics.""" 247 | y_true = flatten_y_if_onehot(y_true) 248 | y_pred = flatten_y_if_onehot(y_pred) 249 | average = 'weighted' if conf.num_classes > 2 else 'binary' 250 | f1 = f1_score(y_true, y_pred, average=average) 251 | recall = 
recall_score(y_true, y_pred, average=average) 252 | precision = precision_score(y_true, y_pred, average=average) 253 | accuracy = accuracy_score(y_true, y_pred) 254 | return f1, recall, precision, accuracy 255 | 256 | def skew_preds(y_pred, binary_bias=None): 257 | _preds = y_pred.copy() 258 | if binary_bias is not None: 259 | ps = np.power(_preds[:, 1], binary_bias) 260 | _preds[:, 1] = ps 261 | _preds[:, 0] = 1 - ps 262 | print(' @skew', binary_bias) 263 | return _preds 264 | 265 | def print_metrics(conf, y_true, y_pred, binary_bias=None, title_prefix=''): 266 | # Add bias if binary_bias is set 267 | _preds = skew_preds(y_pred, binary_bias) 268 | # Calculate metrics 269 | f1, recall, precision, acc = calculate_metrics(conf, y_true, _preds) 270 | print('{0:s}F1/Recall/Precision/Accuracy = {1:.4f}/{2:.4f}/{3:.4f}/{4:.4f}' \ 271 | .format(title_prefix, f1, recall, precision, acc)) 272 | 273 | def summarize_metrics_history(metrics_history, show_graph=True): 274 | """Summarize history of metrics.""" 275 | metrics_history = np.array(metrics_history) 276 | df=pd.DataFrame({'x': np.arange(1, metrics_history.shape[0]+1), 277 | 'f1': metrics_history[:, 0], 278 | 'recall': metrics_history[:, 1], 279 | 'precision': metrics_history[:, 2], 280 | 'accuracy': metrics_history[:, 3]}) 281 | print(df[['f1', 'recall', 'precision', 'accuracy']].describe()) 282 | 283 | if show_graph: 284 | plt.plot('x', 'f1', data=df, marker='', color='blue', markersize=2, linewidth=1) 285 | plt.plot('x', 'recall', data=df, marker='', color='olive', linewidth=1) 286 | plt.plot('x', 'precision', data=df, marker='o', color='pink', markerfacecolor='red', linewidth=4) 287 | plt.plot('x', 'accuracy', data=df, marker='', color='black', linewidth=1) 288 | plt.legend() 289 | plt.show() 290 | 291 | return df 292 | 293 | def evaluate_model(conf, model, X, y): 294 | # Predict 295 | preds = model.predict(X) 296 | # Calculate metrics 297 | f1, recall, precision, acc = calculate_metrics(conf, y, preds) 298 | print('F1/Recall/Precision/Accuracy = {0:.4f}/{1:.4f}/{2:.4f}/{3:.4f}' \ 299 | .format(f1, recall, precision, acc)) 300 | # Calculate ensemble accuracy 301 | if conf.samples_per_file > 1 and conf.eval_ensemble: 302 | sample_preds_list = np.array([preds[i::conf.samples_per_file] 303 | for i in range(conf.samples_per_file)]) 304 | one_y = y[::conf.samples_per_file] 305 | ensemble_preds = geometric_mean_preds(sample_preds_list) 306 | f1, recall, precision, acc = calculate_metrics(conf, one_y, ensemble_preds) 307 | print('Ensemble F1/Recall/Precision/Accuracy = {0:.4f}/{1:.4f}/{2:.4f}/{3:.4f}' \ 308 | .format(f1, recall, precision, acc)) 309 | return f1, recall, precision, acc 310 | 311 | def train_classifier(conf, fold, dataset, model=None, init_weights=None, 312 | image_data_generator=None): 313 | # Test configuration to make sure training properly 314 | test_conf(conf) 315 | # Split train/valid 316 | if len(dataset) == 2: # Auto train/valid split 317 | print('----- Fold #%d ----' % fold) 318 | _X_train, _y_train = dataset 319 | # Cross validation split & balance # of samples 320 | Xtrain, Xvalid, ytrain, yvalid = \ 321 | train_test_split(_X_train, _y_train, 322 | test_size=conf.test_size, 323 | random_state=conf.random_state) 324 | else: # Or predetermined train/valid split 325 | Xtrain, ytrain, Xvalid, yvalid = dataset 326 | 327 | # Balamce train set 328 | Xtrain, ytrain = balance_dataset(conf, Xtrain, ytrain) 329 | 330 | # Get generators, steps, callbacks, and model 331 | train_generator = create_train_generator(conf, Xtrain, ytrain, 
image_data_generator) 332 | train_steps_per_epoch = get_steps_per_epoch(conf, Xtrain) 333 | callbacks = [ 334 | ModelCheckpoint(str(datapath(conf, conf.best_weight_file)), 335 | monitor=conf.metric_save_ckpt, mode=conf.metric_save_mode, 336 | verbose=1 if conf.verbose > 0 else 0, 337 | save_best_only=True, save_weights_only=True), 338 | CyclicLR(base_lr=conf.learning_rate / 10.0, max_lr=conf.learning_rate, 339 | step_size=train_steps_per_epoch, mode='triangular'), 340 | TensorBoard(log_dir=str(datapath(conf, 'logs/fold_%d' % fold)), write_graph=True) 341 | ] 342 | if model is None: 343 | model = create_model(conf, weights=init_weights) 344 | # Train model 345 | history = model.fit_generator(train_generator, 346 | steps_per_epoch=train_steps_per_epoch, 347 | epochs=conf.epochs, 348 | validation_data=(Xvalid, yvalid), 349 | callbacks=callbacks, 350 | verbose=conf.verbose) 351 | # Load best weight 352 | model.load_weights(datapath(conf, conf.best_weight_file)) 353 | return history, model 354 | 355 | ## More visualization 356 | 357 | # Thanks to http://scikit-learn.org/stable/auto_examples/model_selection/plot_confusion_matrix.html#sphx-glr-auto-examples-model-selection-plot-confusion-matrix-py 358 | import itertools 359 | from sklearn.metrics import confusion_matrix 360 | 361 | def plot_confusion_matrix(y_test, y_pred, classes, 362 | normalize=True, 363 | title=None, 364 | cmap=plt.cm.Blues): 365 | """Plot confusion matrix. 366 | """ 367 | po = np.get_printoptions() 368 | np.set_printoptions(precision=2) 369 | 370 | y_test = flatten_y_if_onehot(y_test) 371 | y_pred = flatten_y_if_onehot(y_pred) 372 | cm = confusion_matrix(y_test, y_pred) 373 | 374 | if normalize: 375 | cm = cm.astype('float') / cm.sum(axis=1)[:, np.newaxis] 376 | if title is None: title = 'Normalized confusion matrix' 377 | else: 378 | if title is None: title = 'Confusion matrix (not normalized)' 379 | 380 | plt.imshow(cm, interpolation='nearest', cmap=cmap) 381 | plt.title(title) 382 | plt.colorbar() 383 | tick_marks = np.arange(len(classes)) 384 | plt.xticks(tick_marks, classes, rotation=45) 385 | plt.yticks(tick_marks, classes) 386 | 387 | fmt = '.2f' if normalize else 'd' 388 | thresh = cm.max() / 2. 
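    # Annotate every cell with its count (or normalized rate), switching the
    # text color for contrast: white on cells above half the maximum, black otherwise.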
389 | for i, j in itertools.product(range(cm.shape[0]), range(cm.shape[1])): 390 | plt.text(j, i, format(cm[i, j], fmt), 391 | horizontalalignment="center", 392 | color="white" if cm[i, j] > thresh else "black") 393 | 394 | plt.ylabel('True label') 395 | plt.xlabel('Predicted label') 396 | plt.tight_layout() 397 | np.set_printoptions(**po) -------------------------------------------------------------------------------- /model/alexbased_small_fsd2018_41cls.h5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/daisukelab/ml-sound-classifier/fdee30de02d33948a301e3d56b2b890f04810158/model/alexbased_small_fsd2018_41cls.h5 -------------------------------------------------------------------------------- /model/alexbased_small_fsd2018_41cls.pb: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/daisukelab/ml-sound-classifier/fdee30de02d33948a301e3d56b2b890f04810158/model/alexbased_small_fsd2018_41cls.pb -------------------------------------------------------------------------------- /model/mobilenetv2_fsd2018_41cls.h5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/daisukelab/ml-sound-classifier/fdee30de02d33948a301e3d56b2b890f04810158/model/mobilenetv2_fsd2018_41cls.h5 -------------------------------------------------------------------------------- /model/mobilenetv2_fsd2018_41cls.pb: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/daisukelab/ml-sound-classifier/fdee30de02d33948a301e3d56b2b890f04810158/model/mobilenetv2_fsd2018_41cls.pb -------------------------------------------------------------------------------- /model/mobilenetv2_small_fsd2018_41cls.h5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/daisukelab/ml-sound-classifier/fdee30de02d33948a301e3d56b2b890f04810158/model/mobilenetv2_small_fsd2018_41cls.h5 -------------------------------------------------------------------------------- /premitive_file_predictor.py: -------------------------------------------------------------------------------- 1 | # Predict from a sound file 2 | # 3 | # This is simplest premitive example to predict from a sound file. 4 | # Audio content will be just split and predict for each split portions. 5 | # No ensemble applied, sometimes weak to make predictions. 
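# For sliding-window prediction with ensembling over multiple windows, use
# realtime_predictor.py instead (its -f option also reads from a file).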
6 | #
7 | # Example:
8 | # $ CUDA_VISIBLE_DEVICES= python premitive_file_predictor.py sample/fireworks.wav
9 | #
10 | 
11 | from common import *
12 | import argparse
13 | 
14 | parser = argparse.ArgumentParser(description='Run sound classifier')
15 | parser.add_argument('audio_file', type=str,
16 |                     help='audio file to predict.')
17 | parser.add_argument('--model-pb-graph', '-pb', default='model/mobilenetv2_fsd2018_41cls.pb', type=str,
18 |                     help='Model .pb graph file to run')
19 | args = parser.parse_args()
20 | 
21 | model = KerasTFGraph(args.model_pb_graph,
22 |                      input_name='import/input_1',
23 |                      keras_learning_phase_name='import/bn_Conv1/keras_learning_phase',
24 |                      output_name='import/output0')
25 | 
26 | X = load_sample_as_X(conf, args.audio_file, trim_long_data=False)
27 | 
28 | preds = model.predict(X)
29 | for pred in preds:
30 |     result = np.argmax(pred)
31 |     print(conf.labels[result], pred[result])
32 | 
--------------------------------------------------------------------------------
/realtime_predictor.py:
--------------------------------------------------------------------------------
1 | #-*-coding:utf-8-*-
2 | #!/usr/bin/python
3 | #
4 | # Run sound classifier in realtime.
5 | #
6 | from common import *
7 | 
8 | import pyaudio
9 | import sys
10 | import time
11 | import array
12 | import numpy as np
13 | import queue
14 | from collections import deque
15 | import argparse
16 | 
17 | parser = argparse.ArgumentParser(description='Run sound classifier')
18 | parser.add_argument('--input', '-i', default='0', type=int,
19 |                     help='Audio input device index. Set -1 to list devices')
20 | parser.add_argument('--input-file', '-f', default='', type=str,
21 |                     help='If set, predict this audio file.')
22 | #parser.add_argument('--save_file', default='recorded.wav', type=str,
23 | #                    help='File to save samples captured while running.')
24 | parser.add_argument('--model-pb-graph', '-pb', default='', type=str,
25 |                     help='Model .pb graph file to run; conf.runtime_model_file is used if not set.')
26 | args = parser.parse_args()
27 | 
28 | # # Capture & prediction jobs
29 | raw_frames = queue.Queue(maxsize=100)
30 | def callback(in_data, frame_count, time_info, status):
31 |     wave = array.array('h', in_data)
32 |     raw_frames.put(wave, True)
33 |     return (None, pyaudio.paContinue)
34 | 
35 | def on_predicted(ensembled_pred):
36 |     result = np.argmax(ensembled_pred)
37 |     print(conf.labels[result], ensembled_pred[result])
38 | 
39 | raw_audio_buffer = []
40 | pred_queue = deque(maxlen=conf.pred_ensembles)
41 | def main_process(model, on_predicted):
42 |     # Pool audio data
43 |     global raw_audio_buffer
44 |     while not raw_frames.empty():
45 |         raw_audio_buffer.extend(raw_frames.get())
46 |         if len(raw_audio_buffer) >= conf.mels_convert_samples: break
47 |     if len(raw_audio_buffer) < conf.mels_convert_samples: return
48 |     # Convert to log mel-spectrogram
49 |     audio_to_convert = np.array(raw_audio_buffer[:conf.mels_convert_samples]) / 32767
50 |     raw_audio_buffer = raw_audio_buffer[conf.mels_onestep_samples:]
51 |     mels = audio_to_melspectrogram(conf, audio_to_convert)
52 |     # Predict, ensemble
53 |     X = []
54 |     for i in range(conf.rt_process_count):
55 |         cur = int(i * conf.dims[1] / conf.rt_oversamples)
56 |         X.append(mels[:, cur:cur+conf.dims[1], np.newaxis])
57 |     X = np.array(X)
58 |     samplewise_normalize_audio_X(X)
59 |     raw_preds = model.predict(X)
60 |     for raw_pred in raw_preds:
61 |         pred_queue.append(raw_pred)
62 |         ensembled_pred = geometric_mean_preds(np.array([pred for pred in pred_queue]))
63 |         on_predicted(ensembled_pred)
64 | 
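# Ensembling note: pred_queue keeps the last conf.pred_ensembles raw softmax vectors
# and geometric_mean_preds combines them. A geometric mean damps one-off spikes: e.g.
# two frames [0.9, 0.1] and [0.4, 0.6] combine to roughly [0.60, 0.24], and whether or
# not the result is renormalized, np.argmax picks the same class.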
65 | # # Main controller 66 | def process_file(model, filename, on_predicted=on_predicted): 67 | # Feed audio data as if it was recorded in realtime 68 | audio = read_audio(conf, filename, trim_long_data=False) * 32767 69 | while len(audio) > conf.rt_chunk_samples: 70 | raw_frames.put(audio[:conf.rt_chunk_samples]) 71 | audio = audio[conf.rt_chunk_samples:] 72 | main_process(model, on_predicted) 73 | 74 | def my_exit(model): 75 | model.close() 76 | exit(0) 77 | 78 | def get_model(graph_file): 79 | model_node = { 80 | 'alexnet': ['import/conv2d_1_input', 81 | 'import/batch_normalization_1/keras_learning_phase', 82 | 'import/output0'], 83 | 'mobilenetv2': ['import/input_1', 84 | 'import/bn_Conv1/keras_learning_phase', 85 | 'import/output0'] 86 | } 87 | return KerasTFGraph( 88 | conf.runtime_model_file if graph_file == '' else graph_file, 89 | input_name=model_node[conf.model][0], 90 | keras_learning_phase_name=model_node[conf.model][1], 91 | output_name=model_node[conf.model][2]) 92 | 93 | def run_predictor(): 94 | model = get_model(args.model_pb_graph) 95 | # file mode 96 | if args.input_file != '': 97 | process_file(model, args.input_file) 98 | my_exit(model) 99 | # device list display mode 100 | if args.input < 0: 101 | print_pyaudio_devices() 102 | my_exit(model) 103 | # normal: realtime mode 104 | FORMAT = pyaudio.paInt16 105 | CHANNELS = 1 106 | audio = pyaudio.PyAudio() 107 | stream = audio.open( 108 | format=FORMAT, 109 | channels=CHANNELS, 110 | rate=conf.sampling_rate, 111 | input=True, 112 | input_device_index=args.input, 113 | frames_per_buffer=conf.rt_chunk_samples, 114 | start=False, 115 | stream_callback=callback # uncomment for non_blocking 116 | ) 117 | # main loop 118 | stream.start_stream() 119 | while stream.is_active(): 120 | main_process(model, on_predicted) 121 | time.sleep(0.001) 122 | stream.stop_stream() 123 | stream.close() 124 | # finish 125 | audio.terminate() 126 | my_exit(model) 127 | 128 | if __name__ == '__main__': 129 | run_predictor() 130 | -------------------------------------------------------------------------------- /rpi/config.py: -------------------------------------------------------------------------------- 1 | # Freesound Dataset Kaggle 2018 2 | # Application configurations 3 | 4 | from easydict import EasyDict 5 | 6 | conf = EasyDict() 7 | 8 | # Basic configurations 9 | conf.sampling_rate = 16000 10 | conf.duration = 1 11 | conf.hop_length = 253 # to make time steps 64 12 | conf.fmin = 20 13 | conf.fmax = conf.sampling_rate // 2 14 | conf.n_mels = 64 15 | conf.n_fft = conf.n_mels * 20 16 | conf.model = 'alexnet' 17 | 18 | # Labels 19 | conf.labels = ['Hi-hat', 'Saxophone', 'Trumpet', 'Glockenspiel', 'Cello', 'Knock', 20 | 'Gunshot_or_gunfire', 'Clarinet', 'Computer_keyboard', 21 | 'Keys_jangling', 'Snare_drum', 'Writing', 'Laughter', 'Tearing', 22 | 'Fart', 'Oboe', 'Flute', 'Cough', 'Telephone', 'Bark', 'Chime', 23 | 'Bass_drum', 'Bus', 'Squeak', 'Scissors', 'Harmonica', 'Gong', 24 | 'Microwave_oven', 'Burping_or_eructation', 'Double_bass', 'Shatter', 25 | 'Fireworks', 'Tambourine', 'Cowbell', 'Electric_piano', 'Meow', 26 | 'Drawer_open_or_close', 'Applause', 'Acoustic_guitar', 27 | 'Violin_or_fiddle', 'Finger_snapping'] 28 | 29 | # Training Configurations 30 | conf.folder = '.' 
31 | conf.n_fold = 1
32 | conf.normalize = 'samplewise'
33 | conf.valid_limit = None
34 | conf.random_state = 42
35 | conf.test_size = 0.01
36 | conf.samples_per_file = 5
37 | conf.batch_size = 32
38 | conf.learning_rate = 0.0001
39 | conf.epochs = 500
40 | conf.verbose = 2
41 | conf.best_weight_file = 'best_alexbased_weight.h5'
42 | 
43 | # Runtime configurations
44 | conf.rt_process_count = 5
45 | conf.rt_oversamples = 10
46 | conf.pred_ensembles = 10
47 | conf.runtime_model_file = '../model/alexbased_small_fsd2018_41cls.pb'
--------------------------------------------------------------------------------
/sample/desktop_sounds.wav:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daisukelab/ml-sound-classifier/fdee30de02d33948a301e3d56b2b890f04810158/sample/desktop_sounds.wav
--------------------------------------------------------------------------------
/sample/fireworks.wav:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daisukelab/ml-sound-classifier/fdee30de02d33948a301e3d56b2b890f04810158/sample/fireworks.wav
--------------------------------------------------------------------------------
/sound_models.py:
--------------------------------------------------------------------------------
1 | import keras
2 | from keras.layers import Dense, Conv2D, SeparableConv2D, Convolution2D, AveragePooling2D
3 | from keras.layers import MaxPooling2D, GlobalAveragePooling2D, GlobalMaxPooling2D, Activation, Dropout, BatchNormalization, Flatten, Input
4 | from keras.models import Model, Sequential
5 | from keras.applications.mobilenetv2 import MobileNetV2
6 | 
7 | def model_cnn_alexnet(input_shape, num_classes, time_compress=[2, 1, 1], early_strides=(2,3)):
8 |     model = Sequential()
9 | 
10 |     model.add(Conv2D(48, 11, input_shape=input_shape, strides=early_strides, activation='relu', padding='same'))
11 |     model.add(MaxPooling2D(3, strides=(1,2)))
12 |     model.add(BatchNormalization())
13 | 
14 |     model.add(Conv2D(128, 5, strides=early_strides, activation='relu', padding='same'))
15 |     model.add(MaxPooling2D(3, strides=2))
16 |     model.add(BatchNormalization())
17 | 
18 |     model.add(Conv2D(192, 3, strides=(1, time_compress[0]), activation='relu', padding='same'))
19 |     model.add(Conv2D(192, 3, strides=(1, time_compress[1]), activation='relu', padding='same'))
20 |     model.add(Conv2D(128, 3, strides=(1, time_compress[2]), activation='relu', padding='same', name='last_conv'))
21 |     model.add(MaxPooling2D(3, strides=(1,2)))
22 |     model.add(BatchNormalization())
23 | 
24 |     model.add(Flatten())
25 |     model.add(Dense(256, activation='relu'))
26 |     model.add(Dropout(0.5))
27 |     model.add(Dense(256, activation='relu'))
28 |     model.add(Dropout(0.5))
29 |     model.add(Dense(num_classes, activation='softmax'))
30 |     return model
31 | 
32 | def model_mobilenetv2(input_shape, num_classes):
33 |     base_model = MobileNetV2(weights=None, input_shape=input_shape, include_top=False,
34 |                              alpha=0.35, depth_multiplier=0.5)
35 |     x = base_model.output
36 |     x = GlobalAveragePooling2D()(x)
37 |     x = Dense(1024, activation='relu')(x)
38 |     predictions = Dense(num_classes, activation='softmax')(x)
39 |     model = Model(inputs=base_model.input, outputs=predictions)
40 |     return model
41 | 
42 | def create_model(conf, weights=None, show_detail=False):
43 |     if conf.model == 'alexnet':
44 |         print('Model: AlexNet based')
45 |         model = model_cnn_alexnet(conf.dims, conf.num_classes,
46 |                                   time_compress=[1, 1, 1], early_strides=(3,2))
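        # early_strides sets the strides of the first two Conv2D blocks, and
        # time_compress the time-axis strides of the three middle Conv2D layers;
        # these values presumably keep downsampling modest for the smaller
        # 64x64 log-mel inputs used by the "small" configurations.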
47 | else: 48 | print('Model: MobileNetV2') 49 | model = model_mobilenetv2(input_shape=conf.dims, num_classes=conf.num_classes) 50 | model.compile(loss='categorical_crossentropy', 51 | optimizer=keras.optimizers.Adam(lr=conf.learning_rate), 52 | metrics=['accuracy']) 53 | if weights is not None: 54 | print('Loading weights:', weights) 55 | model.load_weights(weights, by_name=True, skip_mismatch=True) 56 | if show_detail: 57 | model.summary() 58 | return model 59 | 60 | def freeze_model_layers(model, trainable_after_this=''): 61 | trainable = False 62 | for layer in model.layers: 63 | if layer.name == trainable_after_this: 64 | trainable = True 65 | layer.trainable = trainable 66 | -------------------------------------------------------------------------------- /template_realtime_mels_viewer.py: -------------------------------------------------------------------------------- 1 | #-*-coding:utf-8-*- 2 | #!/usr/bin/python 3 | # 4 | # This is template python application for printing log mel-spectrogram as text. 5 | # You can edit for your purpose. 6 | # 7 | from common import * 8 | import pyaudio 9 | import sys 10 | import time 11 | import array 12 | import numpy as np 13 | import queue 14 | import argparse 15 | 16 | parser = argparse.ArgumentParser(description='Sample log mel-spectrogram viewer') 17 | parser.add_argument('--input-file', '-f', default='', type=str, 18 | help='If set, process this audio file.') 19 | args = parser.parse_args() 20 | 21 | conf={} 22 | conf['sampling_rate'] = 44100 23 | conf['duration'] = 1 24 | conf['hop_length'] = conf['sampling_rate'] // 10 # 347 -> to make time steps 128 25 | conf['fmin'] = 20 26 | conf['fmax'] = conf['sampling_rate'] // 2 27 | conf['n_mels'] = 64 28 | conf['n_fft'] = conf['n_mels'] * 20 29 | conf['rt_process_count'] = 1 30 | conf['rt_oversamples'] = 10 31 | conf['rt_chunk_samples'] = conf['sampling_rate'] // conf['rt_oversamples'] 32 | conf['audio_split'] = 'dont_crop' 33 | auto_complete_conf(conf) 34 | 35 | mels_onestep_samples = conf['rt_chunk_samples'] * conf['rt_process_count'] 36 | mels_convert_samples = conf['samples'] + mels_onestep_samples 37 | 38 | raw_frames = queue.Queue(maxsize=100) 39 | def callback(in_data, frame_count, time_info, status): 40 | wave = array.array('h', in_data) 41 | raw_frames.put(wave, True) 42 | return (None, pyaudio.paContinue) 43 | 44 | def level2char(level, amin=0, amax=40): 45 | chrs = ['\u2581', '\u2582', '\u2583', '\u2584', '\u2585', '\u2586', '\u2587', '\u2588'] 46 | level = np.clip(level, amin, amax) 47 | index = int((level - amin - 1e-3) / (amax - amin) * len(chrs)) 48 | return chrs[index] 49 | 50 | raw_audio_buffer = [] 51 | def main_process(): 52 | global raw_audio_buffer 53 | while not raw_frames.empty(): 54 | raw_audio_buffer.extend(raw_frames.get()) 55 | if len(raw_audio_buffer) >= mels_convert_samples: break 56 | if len(raw_audio_buffer) < mels_convert_samples: return 57 | 58 | audio_to_convert = np.array(raw_audio_buffer[:mels_convert_samples]) / 32767.0 59 | raw_audio_buffer = raw_audio_buffer[mels_onestep_samples:] 60 | mels = audio_to_melspectrogram(conf, audio_to_convert) 61 | for i in range(mels.shape[1]): 62 | #print(' '.join(['%2d' % int(x) for x in mels[:,0]])) 63 | print(''.join([level2char(x) for x in mels[:,i]])) 64 | 65 | def process_file(filename): 66 | # Feed audio data as if it was recorded in realtime 67 | audio = read_audio(conf, filename) * 32767 68 | while len(audio) > conf['rt_chunk_samples']: 69 | raw_frames.put(audio[:conf['rt_chunk_samples']]) 70 | audio = 
audio[conf['rt_chunk_samples']:] 71 | main_process() 72 | 73 | if __name__ == '__main__': 74 | if args.input_file != '': 75 | process_file(args.input_file) 76 | exit(0) 77 | 78 | FORMAT = pyaudio.paInt16 79 | CHANNELS = 1 80 | audio = pyaudio.PyAudio() 81 | stream = audio.open( 82 | format=FORMAT, 83 | channels=CHANNELS, 84 | rate=conf['sampling_rate'], 85 | input=True, 86 | #input_device_index=1, 87 | frames_per_buffer=conf['rt_chunk_samples'], 88 | start=False, 89 | stream_callback=callback # uncomment for non_blocking 90 | ) 91 | 92 | stream.start_stream() 93 | while stream.is_active(): 94 | main_process() 95 | time.sleep(0.001) 96 | stream.stop_stream() 97 | stream.close() 98 | 99 | audio.terminate() 100 | -------------------------------------------------------------------------------- /test/test_lib_train.py: -------------------------------------------------------------------------------- 1 | import sys 2 | sys.path.append('..') 3 | from lib_train import * 4 | 5 | X = [ 6 | [1.0,0.0,0.0], 7 | [0.8,0.0,0.1], 8 | [0.5,0.5,0.0], 9 | [0.4,0.4,0.0], 10 | [0.6,0.7,0.0], 11 | [0.1,0.5,0.5], 12 | ] 13 | y = [ 14 | 'kylin', 15 | 'kylin', 16 | 'tiger', 17 | 'tiger', 18 | 'tiger', 19 | 'eleph', 20 | ] 21 | 22 | # balance_class_by_over_sampling, balance_class_by_under_sampling 23 | correct = (np.array([[1. , 0. , 0. ], 24 | [0.8, 0. , 0.1], 25 | [0.5, 0.5, 0. ], 26 | [0.4, 0.4, 0. ], 27 | [0.6, 0.7, 0. ], 28 | [0.1, 0.5, 0.5], 29 | [0.1, 0.5, 0.5], 30 | [0.1, 0.5, 0.5], 31 | [1. , 0. , 0. ]]), np.array(['kylin', 'kylin', 'tiger', 'tiger', 'tiger', 'eleph', 'eleph', 32 | 'eleph', 'kylin'])) 33 | print('balance_class_by_over_sampling?', 34 | test_equal(correct, balance_class_by_over_sampling(X, y))) 35 | correct = (np.array([[0.1, 0.5, 0.5], 36 | [0.8, 0. , 0.1], 37 | [0.4, 0.4, 0. ]]), np.array(['eleph', 'kylin', 'tiger'])) 38 | print('balance_class_by_under_sampling?', 39 | test_equal(correct, balance_class_by_under_sampling(X, y))) -------------------------------------------------------------------------------- /title.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/daisukelab/ml-sound-classifier/fdee30de02d33948a301e3d56b2b890f04810158/title.jpg -------------------------------------------------------------------------------- /visualize.py: -------------------------------------------------------------------------------- 1 | import cv2 2 | import matplotlib.pyplot as plt 3 | import numpy as np 4 | import keras.backend as K 5 | 6 | def normalize_2d(two_d): 7 | img_temp = two_d - np.min(two_d) 8 | return img_temp/np.max(img_temp) 9 | 10 | def colormap_2d_flipud(norm_2d): 11 | #return cv2.applyColorMap(np.uint8(255 * np.flipud(norm_2d)), cv2.COLORMAP_JET) 12 | return np.flipud(norm_2d) 13 | 14 | def visualize_cam_audio(conf, model, x, name, layer='Conv_1'): 15 | """Grad-CAM visualization for audio spectrogram.""" 16 | last_conv_layer = model.get_layer(layer) # MobileNetV2 last conv layer 17 | if len(x.shape) == 3: 18 | X = x[np.newaxis, ...] 
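        # a single [n_mels, time, 1] spectrogram was passed: add a batch axis so
        # model.predict() below receives a 4-d tensor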
19 | else: 20 | X = x 21 | preds = model.predict(X) 22 | targ_class = np.argmax(preds[0]) 23 | 24 | output = model.output[:, targ_class] 25 | grads = K.gradients(output, last_conv_layer.output)[0] 26 | pooled_grads = K.mean(grads, axis=(0, 1, 2)) 27 | iterate = K.function([model.input], [pooled_grads, last_conv_layer.output[0]]) 28 | pooled_grads_value, conv_layer_output_value = iterate([X]) 29 | for i in range(int(last_conv_layer.output.shape[3])): 30 | conv_layer_output_value[:, :, i] *= pooled_grads_value[i] 31 | 32 | img = normalize_2d(X[-1, :, :, -1]) 33 | 34 | heatmap = np.mean(conv_layer_output_value, axis=-1) 35 | heatmap = np.maximum(heatmap, 0) 36 | heatmap /= np.max(heatmap) 37 | heatmap = cv2.resize(heatmap, (img.shape[1], img.shape[0])) 38 | 39 | superimposed = (heatmap + img) / 2 40 | 41 | # img, superimposed, heatmap - all 2d [n_mels, time hop] 42 | fig = plt.figure(figsize=(10, 5), dpi=100) 43 | ax = fig.add_subplot(131) 44 | ax.set_axis_off() 45 | ax.set_title('predicted class activation map', fontsize=8) 46 | ax.imshow(colormap_2d_flipud(heatmap)) 47 | ax = fig.add_subplot(132) 48 | ax.set_axis_off() 49 | ax.imshow(colormap_2d_flipud(superimposed)) 50 | ax.set_title('CAM {} - {}\n-- overlay --'.format(conf.labels[targ_class], name), fontsize=9) 51 | ax = fig.add_subplot(133) 52 | ax.set_axis_off() 53 | ax.set_title(conf.what_is_sample, fontsize=8) 54 | ax.imshow(colormap_2d_flipud(img)) 55 | fig.show() 56 | 57 | """ 58 | TBD 59 | def _imshow_friendly(img): 60 | img_temp = img - np.min(img) 61 | img_temp = img_temp/np.max(img_temp) 62 | friendly = np.uint8(255 * img_temp) 63 | return friendly 64 | 65 | def visualize_cam_image(conf, model, model_weight, test_file_index, datapath, 66 | expected_preds, test_time_aug_param={}): 67 | ""Grad-CAM visualization for image."" 68 | d.load_test_as_image(datapath) 69 | d.create_test_generator(test_time_aug_param) 70 | model.load_weights(model_weight) 71 | last_conv_layer = model.get_layer('block5_conv3') 72 | cur_X_test, cur_y_test = next(d.test_gen) 73 | x = np.array([cur_X_test[test_file_index]]) 74 | preds = model.predict(x) 75 | targ_class = np.argmax(preds[0]) 76 | result = calc_soft_acc(expected_preds[test_file_index], preds[0]) 77 | 78 | output = model.output[:, targ_class] 79 | grads = K.gradients(output, last_conv_layer.output)[0] 80 | pooled_grads = K.mean(grads, axis=(0, 1, 2)) 81 | iterate = K.function([model.input], [pooled_grads, last_conv_layer.output[0]]) 82 | pooled_grads_value, conv_layer_output_value = iterate([x]) 83 | for i in range(int(last_conv_layer.output.shape[3])): 84 | conv_layer_output_value[:, :, i] *= pooled_grads_value[i] 85 | heatmap = np.mean(conv_layer_output_value, axis=-1) 86 | heatmap = np.maximum(heatmap, 0) 87 | heatmap /= np.max(heatmap) 88 | 89 | img = next(d.test_gen)[0][test_file_index] 90 | fig = plt.figure(figsize=(10, 5), dpi=100) 91 | ax = fig.add_subplot(131) 92 | heatmap = cv2.resize(heatmap, (img.shape[1], img.shape[0])) 93 | ax.set_axis_off() 94 | ax.set_title('predicted class activation map', fontsize=8) 95 | ax.matshow(heatmap) 96 | heatmap = np.uint8(255 * heatmap) 97 | heatmap = cv2.applyColorMap(heatmap, cv2.COLORMAP_JET) 98 | superimposed = ((heatmap*0.5/np.max(heatmap) + img)) / 1.5 99 | ax = fig.add_subplot(132) 100 | ax.set_axis_off() 101 | ax.imshow(_imshow_friendly(superimposed)) 102 | ax.set_title('%s? 
%s' % (d.labels[targ_class], 'yes' if result == 1 else 'no'), fontsize=9) 103 | ax = fig.add_subplot(133) 104 | ax.set_axis_off() 105 | ax.set_title(conf.what_is_sample, fontsize=8) 106 | ax.imshow(_imshow_friendly(img)) 107 | fig.show() 108 | """ --------------------------------------------------------------------------------