├── .gitignore
├── Custom_SLAM_Landmark_Detection_Tracking.ipynb
├── LICENSE
├── README.md
├── __init__.py
├── helpers.py
├── images
│   ├── Project 1 Thumbnail_preview.png
│   ├── dummy.jpg
│   ├── gaussian_edited1.jpg
│   ├── matrix1_edited1.png
│   ├── matrix2_edited1.jpg
│   ├── preview.png
│   ├── slam.gif
│   ├── slam_viz.gif
│   ├── thumbnail_preview.png
│   └── video_preview.png
├── mqtt-slamviz.py
├── robot_class.py
├── rpslam-mqtt.py
├── rpslam-thread.py
└── slam.gif
/.gitignore:
--------------------------------------------------------------------------------
1 | # Byte-compiled / optimized / DLL files
2 | __pycache__/
3 | *.py[cod]
4 | *$py.class
5 |
6 | # C extensions
7 | *.so
8 |
9 | # Distribution / packaging
10 | .Python
11 | build/
12 | develop-eggs/
13 | dist/
14 | downloads/
15 | eggs/
16 | .eggs/
17 | lib/
18 | lib64/
19 | parts/
20 | sdist/
21 | var/
22 | wheels/
23 | pip-wheel-metadata/
24 | share/python-wheels/
25 | *.egg-info/
26 | .installed.cfg
27 | *.egg
28 | MANIFEST
29 |
30 | # PyInstaller
31 | # Usually these files are written by a python script from a template
32 | # before PyInstaller builds the exe, so as to inject date/other infos into it.
33 | *.manifest
34 | *.spec
35 |
36 | # Installer logs
37 | pip-log.txt
38 | pip-delete-this-directory.txt
39 |
40 | # Unit test / coverage reports
41 | htmlcov/
42 | .tox/
43 | .nox/
44 | .coverage
45 | .coverage.*
46 | .cache
47 | nosetests.xml
48 | coverage.xml
49 | *.cover
50 | *.py,cover
51 | .hypothesis/
52 | .pytest_cache/
53 |
54 | # Translations
55 | *.mo
56 | *.pot
57 |
58 | # Django stuff:
59 | *.log
60 | local_settings.py
61 | db.sqlite3
62 | db.sqlite3-journal
63 |
64 | # Flask stuff:
65 | instance/
66 | .webassets-cache
67 |
68 | # Scrapy stuff:
69 | .scrapy
70 |
71 | # Sphinx documentation
72 | docs/_build/
73 |
74 | # PyBuilder
75 | target/
76 |
77 | # Jupyter Notebook
78 | .ipynb_checkpoints
79 |
80 | # IPython
81 | profile_default/
82 | ipython_config.py
83 |
84 | # pyenv
85 | .python-version
86 |
87 | # pipenv
88 | # According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
89 | # However, in case of collaboration, if having platform-specific dependencies or dependencies
90 | # having no cross-platform support, pipenv may install dependencies that don't work, or not
91 | # install all needed dependencies.
92 | #Pipfile.lock
93 |
94 | # PEP 582; used by e.g. github.com/David-OConnor/pyflow
95 | __pypackages__/
96 |
97 | # Celery stuff
98 | celerybeat-schedule
99 | celerybeat.pid
100 |
101 | # SageMath parsed files
102 | *.sage.py
103 |
104 | # Environments
105 | .env
106 | .venv
107 | env/
108 | venv/
109 | ENV/
110 | env.bak/
111 | venv.bak/
112 |
113 | # Spyder project settings
114 | .spyderproject
115 | .spyproject
116 |
117 | # Rope project settings
118 | .ropeproject
119 |
120 | # mkdocs documentation
121 | /site
122 |
123 | # mypy
124 | .mypy_cache/
125 | .dmypy.json
126 | dmypy.json
127 |
128 | # Pyre type checker
129 | .pyre/
130 |
--------------------------------------------------------------------------------
/Custom_SLAM_Landmark_Detection_Tracking.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# Custom SLAM Implementation\n",
8 | "\n",
9 | "---\n",
10 | "\n",
11 | "## Project Overview\n",
12 | "\n",
13 | "In this project, we'll implement SLAM for robot that moves and senses in a 2 dimensional, grid world!\n",
14 | "\n",
15 | "SLAM gives us a way to both localize a robot and build up a map of its environment as a robot moves and senses in real-time. The core is the function, `slam` takes in six parameters as input and returns the vector `mu`. `mu` contains the (x,y) coordinate locations of the robot as it moves, and the positions of landmarks that it senses in the world\n",
16 | "\n",
17 | "You can implement helper functions as you see fit, but your function must return `mu`. The vector, `mu`, should have (x, y) coordinates interlaced, for example, if there were 2 poses and 2 landmarks, `mu` will look like the following, where `P` is the robot position and `L` the landmark position:\n",
18 | "```\n",
19 | "mu = matrix([[Px0],\n",
20 | " [Py0],\n",
21 | " [Px1],\n",
22 | " [Py1],\n",
23 | " [Lx0],\n",
24 | " [Ly0],\n",
25 | " [Lx1],\n",
26 | " [Ly1]])\n",
27 | "```\n",
28 | "\n",
29 | "You can see that `mu` holds the poses first `(x0, y0), (x1, y1), ...,` then the landmark locations at the end of the matrix; we consider a `nx1` matrix to be a vector.\n",
30 | "\n",
31 | "## To generate an environment\n",
32 | "\n",
33 | "In a real SLAM problem, you may be given a map that contains information about landmark locations.The data is collected as an instantiated robot moves and senses in a world. The SLAM function will take in this data as input. The data can come from the point cloud senbsed by LIDAR.\n",
34 | "\n",
35 | "---"
36 | ]
37 | },
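{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick illustration of the interlaced layout (a sketch, not part of the required project code; the helper names `get_pose` and `get_landmark` are hypothetical), this is how pose `i` and landmark `j` would be read back out of such a `mu`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# sketch: indexing into an interlaced mu that stacks N (x, y) poses, then landmarks\n",
"def get_pose(mu, i):\n",
"    # pose i occupies rows 2*i (x) and 2*i+1 (y)\n",
"    return (mu[2*i].item(), mu[2*i+1].item())\n",
"\n",
"def get_landmark(mu, N, j):\n",
"    # landmark j starts after all N poses, at row 2*(N+j)\n",
"    return (mu[2*(N+j)].item(), mu[2*(N+j)+1].item())"
]
},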
38 | {
39 | "cell_type": "markdown",
40 | "metadata": {},
41 | "source": [
42 | "## Create the world\n",
43 | "\n",
44 | "Use the code below to generate a world of a specified size with randomly generated landmark locations. You can change these parameters and see how your implementation of SLAM responds! \n",
45 | "\n",
46 | "`data` holds the sensors measurements and motion of your robot over time. It stores the measurements as `data[i][0]` and the motion as `data[i][1]`.\n",
47 | "\n",
48 | "#### Helper functions\n",
49 | "\n",
50 | "You can see the robot move/sense cycle in the `robot` class\n",
51 | "\n",
52 | "The `helpers.py` file, contains the data generation"
53 | ]
54 | },
55 | {
56 | "cell_type": "code",
57 | "execution_count": 285,
58 | "metadata": {},
59 | "outputs": [
60 | {
61 | "name": "stdout",
62 | "output_type": "stream",
63 | "text": [
64 | " \n",
65 | "Landmarks: [[46, 39], [47, 48], [34, 30], [34, 44], [80, 22]]\n",
66 | "Robot: [x=75.77127 y=41.76748]\n"
67 | ]
68 | }
69 | ],
70 | "source": [
71 | "import numpy as np\n",
72 | "from helpers import make_data\n",
73 | "\n",
74 | "# your implementation of slam should work with the following inputs\n",
75 | "# feel free to change these input values and see how it responds!\n",
76 | "\n",
77 | "# world parameters\n",
78 | "num_landmarks = 5 # number of landmarks\n",
79 | "N = 20 # time steps\n",
80 | "world_size = 100.0 # size of world (square)\n",
81 | "\n",
82 | "# robot parameters\n",
83 | "measurement_range = 50.0 # range at which we can sense landmarks\n",
84 | "motion_noise = 2.0 # noise in robot motion\n",
85 | "measurement_noise = 2.0 # noise in the measurements\n",
86 | "distance = 20.0 # distance by which robot (intends to) move each iteratation \n",
87 | "\n",
88 | "\n",
89 | "# make_data instantiates a robot, AND generates random landmarks for a given world size and number of landmarks\n",
90 | "data = make_data(N, num_landmarks, world_size, measurement_range, motion_noise, measurement_noise, distance)"
91 | ]
92 | },
93 | {
94 | "cell_type": "markdown",
95 | "metadata": {},
96 | "source": [
97 | "### A note on `make_data`\n",
98 | "\n",
99 | "The function above, `make_data`, takes in so many world and robot motion/sensor parameters because it is responsible for:\n",
100 | "1. Instantiating a robot (using the robot class)\n",
101 | "2. Creating a grid world with landmarks in it\n",
102 | "\n",
103 | "**This function also prints out the true location of landmarks and the *final* robot location, which you should refer back to when you test your implementation of SLAM.**\n",
104 | "\n",
105 | "The `data` this returns is an array that holds information about **robot sensor measurements** and **robot motion** `(dx, dy)` that is collected over a number of time steps, `N`. You will have to use *only* these readings about motion and measurements to track a robot over time and find the determine the location of the landmarks using SLAM. We only print out the true landmark locations for comparison, later.\n",
106 | "\n",
107 | "\n",
108 | "In `data` the measurement and motion data can be accessed from the first and second index in the columns of the data array. See the following code for an example, where `i` is the time step:\n",
109 | "```\n",
110 | "measurement = data[i][0]\n",
111 | "motion = data[i][1]\n",
112 | "```\n"
113 | ]
114 | },
115 | {
116 | "cell_type": "code",
117 | "execution_count": 286,
118 | "metadata": {},
119 | "outputs": [
120 | {
121 | "name": "stdout",
122 | "output_type": "stream",
123 | "text": [
124 | "Example measurements: \n",
125 | " [[0, -2.5999772475370366, -11.341737096922019], [1, -2.5492738206336245, -3.4680929858835317], [2, -17.998764419531803, -18.422814837895277], [3, -16.03548246418916, -4.153298799924289], [4, 29.037065744381053, -26.259416576279754]]\n",
126 | "\n",
127 | "\n",
128 | "Example motion: \n",
129 | " [2.531159403366524, 19.839184259307366]\n"
130 | ]
131 | }
132 | ],
133 | "source": [
134 | "# print out some stats about the data\n",
135 | "time_step = 0\n",
136 | "\n",
137 | "print('Example measurements: \\n', data[time_step][0])\n",
138 | "print('\\n')\n",
139 | "print('Example motion: \\n', data[time_step][1])"
140 | ]
141 | },
142 | {
143 | "cell_type": "markdown",
144 | "metadata": {},
145 | "source": [
146 | "Try changing the value of `time_step`, you should see that the list of measurements varies based on what in the world the robot sees after it moves. As you know from the first notebook, the robot can only sense so far and with a certain amount of accuracy in the measure of distance between its location and the location of landmarks. The motion of the robot always is a vector with two values: one for x and one for y displacement. This structure will be useful to keep in mind as you traverse this data in your implementation of slam."
147 | ]
148 | },
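{
"cell_type": "markdown",
"metadata": {},
"source": [
"To see this effect without editing `time_step` by hand, the sketch below (illustration only, reusing `data` from the cells above) prints how many landmarks were measured at each time step; steps where the robot was far from most landmarks show fewer measurements:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# sketch: per time step, count the measured landmarks and show the (dx, dy) motion\n",
"for t in range(len(data)):\n",
"    print('t =', t, '| landmarks measured:', len(data[t][0]), '| motion:', data[t][1])"
]
},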
149 | {
150 | "cell_type": "markdown",
151 | "metadata": {},
152 | "source": [
153 | "## Initialize Constraints\n",
154 | "\n",
155 | "One of the most challenging tasks here will be to create and modify the constraint matrix and vector: omega and xi. In the second notebook, you saw an example of how omega and xi could hold all the values the define the relationships between robot poses `xi` and landmark positions `Li` in a 1D world, as seen below, where omega is the blue matrix and xi is the pink vector.\n",
156 | "\n",
157 | "
\n",
158 | "\n",
159 | "\n",
160 | "In *this* project, you are tasked with implementing constraints for a 2D world. We are referring to robot poses as `Px, Py` and landmark positions as `Lx, Ly`, and one way to approach this challenge is to add *both* x and y locations in the constraint matrices.\n",
161 | "\n",
162 | "
\n",
163 | "\n",
164 | "You may also choose to create two of each omega and xi (one for x and one for y positions)."
165 | ]
166 | },
167 | {
168 | "cell_type": "markdown",
169 | "metadata": {},
170 | "source": [
171 | "### : Function that initializes omega and xi\n",
172 | "\n",
173 | "Complete the function `initialize_constraints` so that it returns `omega` and `xi` constraints for the starting position of the robot. Any values that we do not yet know should be initialized with the value `0`. You may assume that our robot starts out in exactly the middle of the world with 100% confidence (no motion or measurement noise at this point). The inputs `N` time steps, `num_landmarks`, and `world_size` should give you all the information you need to construct intial constraints of the correct size and starting values.\n"
174 | ]
175 | },
176 | {
177 | "cell_type": "code",
178 | "execution_count": 287,
179 | "metadata": {},
180 | "outputs": [],
181 | "source": [
182 | "def initialize_constraints(N, num_landmarks, world_size):\n",
183 | " ''' This function takes in a number of time steps N, number of landmarks, and a world_size,\n",
184 | " and returns initialized constraint matrices, omega and xi.'''\n",
185 | " \n",
186 | " ## Recommended: Define and store the size (rows/cols) of the constraint matrix in a variable\n",
187 | " dim = N + num_landmarks\n",
188 | " \n",
189 | " ## TODO: Define the constraint matrix, Omega, with two initial \"strength\" values\n",
190 | " ## for the initial x, y location of our robot\n",
191 | " \n",
192 | " omega_1 = np.zeros((dim,dim))\n",
193 | " xi_1 = np.zeros((dim))\n",
194 | " \n",
195 | " # setting initial position\n",
196 | " omega_1[0][0] = 1\n",
197 | " xi_1[0] = world_size/2\n",
198 | " \n",
199 | " omega_2 = np.zeros((dim,dim))\n",
200 | " xi_2 = np.zeros((dim))\n",
201 | " \n",
202 | " # setting initial position\n",
203 | " omega_2[0][0] = 1\n",
204 | " xi_2[0] = world_size/2\n",
205 | " \n",
206 | " return omega_1, xi_1, omega_2, xi_2\n",
207 | " "
208 | ]
209 | },
210 | {
211 | "cell_type": "markdown",
212 | "metadata": {},
213 | "source": [
214 | "\n",
215 | "\n",
216 | "Below, you'll find some test code that allows you to visualize the results of your function `initialize_constraints`. We are using the [seaborn](https://seaborn.pydata.org/) library for visualization.\n",
217 | "\n",
218 | "**Please change the test values of N, landmarks, and world_size and see the results**. Be careful not to use these values as input into your final smal function.\n",
219 | "\n",
220 | "This code assumes that you have created one of each constraint: `omega` and `xi`, but you can change and add to this code, accordingly. The constraints should vary in size with the number of time steps and landmarks as these values affect the number of poses a robot will take `(Px0,Py0,...Pxn,Pyn)` and landmark locations `(Lx0,Ly0,...Lxn,Lyn)` whose relationships should be tracked in the constraint matrices. Recall that `omega` holds the weights of each variable and `xi` holds the value of the sum of these variables, as seen in Notebook 2. You'll need the `world_size` to determine the starting pose of the robot in the world and fill in the initial values for `xi`."
221 | ]
222 | },
223 | {
224 | "cell_type": "code",
225 | "execution_count": 288,
226 | "metadata": {},
227 | "outputs": [],
228 | "source": [
229 | "# import data viz resources\n",
230 | "import matplotlib.pyplot as plt\n",
231 | "from pandas import DataFrame\n",
232 | "import seaborn as sns\n",
233 | "%matplotlib inline"
234 | ]
235 | },
236 | {
237 | "cell_type": "code",
238 | "execution_count": 289,
239 | "metadata": {},
240 | "outputs": [],
241 | "source": [
242 | "# define a small N and world_size (small for ease of visualization)\n",
243 | "N_test = 5\n",
244 | "num_landmarks_test = 2\n",
245 | "small_world = 10\n",
246 | "\n",
247 | "# initialize the constraints\n",
248 | "initial_omega_1, initial_xi_1, initial_omega_2, initial_xi_2 = initialize_constraints(N_test, num_landmarks_test, small_world)"
249 | ]
250 | },
251 | {
252 | "cell_type": "code",
253 | "execution_count": 290,
254 | "metadata": {},
255 | "outputs": [
256 | {
257 | "data": {
258 | "text/plain": [
259 | ""
260 | ]
261 | },
262 | "execution_count": 290,
263 | "metadata": {},
264 | "output_type": "execute_result"
265 | },
266 | {
267 | "data": {
268 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAhwAAAGcCAYAAACSpnk5AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvpW3flQAAIABJREFUeJzt3XuQ3fV53/H3EYu8jsG4E7orFa2JM1KelMvU2ARn6jZcbIxAlMuQNuCkqR1jZhjjy7hNa0KLbTIQO57GoWMIAeEEu6kZ2ym2glQ5HYICocbBV4FNnlQQLK1BK19SX1HErrZ/7EGzrA67ktjf76fvft8v5gx7zh799HzsXfbZ5/n9zulNT08jSZLUpGVdFyBJkpY+Gw5JktQ4Gw5JktQ4Gw5JktQ4Gw5JktQ4Gw5JktS4oa4LkCRJh5eI+ChwPrArM08a8PkecCNwHvAT4E2Z+eX5jumEQ5IkzfXHwNp5Pn8usKZ/uwL4g4UOaMMhSZKeIzPvA743z1MuBD6WmdOZ+SDwsohYOd8xbTgkSdLBOg7YMev+eP+x59XUORy+XrokqSa9Nv+yF59y1aL9nH36Kx85lNoH/Zl5a2qk4XjxKVc1cdgiPP2Vj7B7susqujHc/2oyf7d1dMX89WaHuvMP13n5xTgwNuv+KuDJ+f5Anf8zSZJUsl7nZ0RsAK6KiDuB1wDfz8yn5vsDNhySJJWm1+wGJyI+AZwBHBsR48B7gSMBMvMWYBMzl8RuY+ay2DcvdEwbDkmS9ByZedkCn58G3nYwx7ThkCSpNN2vVA6aDYckSaVpeKXShPJaJEmSVBwnHJIklcaViiRJapwrFUmSpP054ZAkqTSuVCRJUuNcqUiSJO3PCYckSaVxpSJJkhrnSkWSJGl/TjgkSSqNKxVJktQ4VyqSJEn7c8IhSVJpXKlIkqTGFdhwlFfxPG5576/yzXt+hy9+6re6LqUTD9x/HxesO4fz157N7bfd2nU5ras5f83Zwfzmrzt/KZZUw/HxP3uQC992U9dldGJqaoobrr+Om29Zz10bNrJ50908tm1b12W1pub8NWcH85u/0vzLeot3a6vk1v6mFjzw5cf43vd/0nUZnXjk4a2MjR3PqrExjly+nLXnrWPLvfd0XVZras5fc3Ywv/krzd9btni3lix4DkdE/DxwIXAcMA08CWzIzEcbrk0HYdfEBCtWrth3f2R0lIe3bu2wonbVnL/m7GB+89edvyTztjYR8Z+AO4Ee8NfAQ/2PPxER72m+PB2oaab3e6xX4HXah6rm/DVnB/Obv9L8vd7i3Vqy0ITjLcCJmfnM7Acj4veArwMfaKowHZzR0RXsfGrnvvu7JiYYGRnpsKJ21Zy/5uxgfvNXmn8JXqWyF/gnAx5f2f+cDhMnnnQy27c/wfj4Dp7Zs4fNmzZy+plndV1Wa2rOX3N2ML/5685fkoUmHO8C7omI/wvs6D/2cmA1cFWThR2KO37nTfzLV6/h2JcdxbbNv81v37KJOz7z+a7LasXQ0BBXX3MtV15xOXv3TnHRxZewevWarstqTc35a84O5jd/pfkLXBv1pqf333/NFhHLgNOYOWm0B4wDD2Xm1PP9mRefctX8B13Cnv7KR9g92XUV3Rjut6/m77aOrpi/3uxQd/7+136rHcCL3/ChRfs5+/Sf/2YrtS94lUpm7gUebKEWSZJ0IAqccJR31okkSSqO76UiSVJpCrxKxYZDkqTSuFKRJEnanxMOSZJK40pFkiQ1zpWKJEnS/pxwSJJUGlcqkiSpcQU2HOVVLEmSiuOEQ5Kk0hR40qgNhyRJpXGlIkmStD8nHJIklcaViiRJapwrFUmSpP054ZAkqTSuVCRJUtN6BTYcrlQkSVLjnHBIklSYEiccNhySJJWmvH6jmYbj6a98pInDFmO48jbO/F1X0K2a89ecHcyv+fnlIUlSYVyp9O2ebOKoZRgeqjf/s7/dmL/bOrpi/nqzQ935u5jslNhweJWKJElqnCsVSZIKU+KEw4ZDkqTClNhwuFKRJEmNc8IhSVJpyhtw2HBIklQaVyqSJEkDOOGQJKkwJU44bDgkSSpMiQ2HKxVJktQ4JxySJBWmxAmHDYckSaUpr99wpSJJkprnhEOSpMK4UpEkSY0rseFwpSJJkhrnhEOSpMI0PeGIiLXAjcARwPrM/MCcz78cuAN4Wf8578nMTfMd0wmHJEml6S3ibY6IOAK4CTgXOAG4LCJOmPO0/wx8MjNPAS4Fbl6oZBsOSZI022nAtsx8PDP3AHcCF855zjTw0v7HxwBPLnRQVyqSJBWm4ZXKccCOWffHgdfMec77gD+PiLcDLwFev9BBnXBIklSYXq+3aLdBhx/w2PSc+5cBf5yZq4DzgI9HxLw9xZJqOB64/z4uWHcO5689m9tvu7Xrclpn/nrz15wdzG/+uvM3YBwYm3V/FfuvTN4CfBIgMz8PDAPHznfQJdNwTE1NccP113HzLeu5a8NGNm+6m8e2beu6rNaYv978NWcH85u/zvwNTzgeAtZExCsiYjkzJ4VumPOc7cDrACLinzLTcHx7vpqXTMPxyMNbGRs7nlVjYxy5fDlrz1vHlnvv6bqs1pi/3vw1Zwfzm7/O/E02HJk5CVwFfA54lJmrUb4eEddFxAX9p/174K0R8TXgE8CbMnPu2uU5Dvmk0Yh4c2b+0aH++cW2a2KCFStX7Ls/MjrKw1u3dlhRu8xfb/6as4P5zV93/qb0X1Nj05zHrp318TeA1x7MMV/IhOP9L+DPLrrp/c5nKfOlXw+V+evNX3N2ML/5K83f4OtwNGXeCUdEPF+b2ANGF7+cQzc6uoKdT+3cd3/XxAQjIyMdVtQu89ebv+bsYH7z15m/xKZqoQnHKPDrwL8acPtus6UdnBNPOpnt259gfHwHz+zZw+ZNGzn9zLO6Lqs15q83f83Zwfzmrzt/SRY6h+Nu4KjM/OrcT0TElkYqOkRDQ0Ncfc21XHnF5ezdO8VFF1/C6tVrui6rNeavN3/N2cH85q8zf4kTjt709LwnlR6S3ZMDlmqVGB6C3ZNdV9GN4X77av5u6+iK+evNDnXn73/tt9oBjL3ts4v2c3bHTRe2UrsvbS5JUmnKG3AsndfhkCRJhy8nHJIkFabEczhsOCRJKkyJDYcrFUmS1DgnHJIkFabECYcNhyRJhSmx4XClIkmSGueEQ5Kk0pQ34LDhkCSpNK5UJEmSBnDCIUlSYUqccNhwSJJUmAL7DVcqkiSpeU44JEkqjCsVSZLUuAL7DVcqkiSpeU44JEkqjCsVSZLUuAL7DVcqkiSpeU44JEkqzLJl5Y04bDgkSSpMiSuVRhqO4crbGPN3XUG3zN91Bd2pOTuYX/Nr5Mtj92QTRy3D8FC9+Z/9j435u62jK+avNzvUnb+LRsurVCRJUuMK7De8SkWSJDXPCYckSYVxpSJJkhpXYsPhSkWSJDXOCYckSYUpcMBhwyFJUmlcqUiSJA3ghEOSpMIUOOCw4ZAkqTSuVCRJkgZwwiFJUmE
KHHDYcEiSVBpXKpIkSQM44ZAkqTAFDjhsOCRJKo0rFUmSpAGccEiSVJgCBxw2HJIklcaViiRJ0gBOOCRJKkyBA46lNeF44P77uGDdOZy/9mxuv+3WrstpnfnrzV9zdjC/+evL3+v1Fu3WliXTcExNTXHD9ddx8y3ruWvDRjZvupvHtm3ruqzWmL/e/DVnB/Obv+78JVmw4YiIn4+I10XEUXMeX9tcWQfvkYe3MjZ2PKvGxjhy+XLWnreOLffe03VZrTF/vflrzg7mN3+d+Xu9xbu1Zd6GIyLeAXwWeDvwSERcOOvTNzRZ2MHaNTHBipUr9t0fGR1lYmKiw4raZf5689ecHcxv/jrzL8WVyluBV2fmRcAZwH+JiHf2P3dYnbIyzfR+j5V42dChMn+9+WvODuY3f935S7LQVSpHZOaPADLziYg4A/h0RBzPYdZwjI6uYOdTO/fd3zUxwcjISIcVtcv89eavOTuY3/x15i+xqVpowrEzIl757J1+83E+cCxwcpOFHawTTzqZ7dufYHx8B8/s2cPmTRs5/cyzui6rNeavN3/N2cH85q8zf4nncCw04fh1YHL2A5k5Cfx6RPxhY1UdgqGhIa6+5lquvOJy9u6d4qKLL2H16jVdl9Ua89ebv+bsYH7z152/JL3p6f33Xy/U7skBS7VKDA/B7smFn7cUDffbV/N3W0dXzF9vdqg7f/9rv9Udxxm//38W7efslnf981Zq95VGJUkqTIGncNhwSJJUmqV40qgkSdIL5oRDkqTCFDjgsOGQJKk0ywrsOFypSJKkxjnhkCSpMAUOOGw4JEkqTYlXqdhwSJKk54iItcCNwBHA+sz8wIDn/BvgfcA08LXMfON8x/QcDkmSCrOst3i3uSLiCOAm4FzgBOCyiDhhznPWAFcDr83ME4F3LVSzEw5JkgrT8ErlNGBbZj4OEBF3AhcC35j1nLcCN2Xm3wNk5q6FDmrDIUmSZjsO2DHr/jjwmjnP+TmAiHiAmbXL+zJz83wHdaUiSVJhGn57+kGPzn2zuCFgDXAGcBmwPiJeNl/NTjgkSSpMr9k3px0HxmbdXwU8OeA5D2bmM8DfRUQy04A89HwHdcIhSZJmewhYExGviIjlwKXAhjnP+QxwJkBEHMvMiuXx+Q5qwyFJUmGavEolMyeBq4DPAY8Cn8zMr0fEdRFxQf9pnwO+GxHfAO4FfjMzvztfzb3p6blrmRdu9+R+u55qDA/B7smuq+jGcH9BZ/5u6+iK+evNDnXn73/tt/pKXBfe9sVF+zn72bee2krtTjgkSVLjPGlUkqTCFPjK5jYckiSVxrenlyRJGsAJhyRJhSlwwNFMwzFceRtj/q4r6Jb5u66gOzVnB/O3qcS3p3elIkmSGtdIP1rrtdjgtehgfvN3W0dXav7eh7rzdzHZKXDA4TkckiSVxqtUJEmSBnDCIUlSYcqbb9hwSJJUHK9SkSRJGsAJhyRJhRn0tvKHOxsOSZIK40pFkiRpACcckiQVpsABhw2HJEmlcaUiSZI0gBMOSZIK41UqkiSpca5UJEmSBnDCIUlSYcqbb9hwSJJUHN+eXpIkaQAnHJIkFabAAYcNhyRJpfEqFUmSpAGWVMPxwP33ccG6czh/7dncftutXZfTOvPXm7/m7GB+89eXv9dbvFtblkzDMTU1xQ3XX8fNt6znrg0b2bzpbh7btq3rslpj/nrz15wdzG/+OvMv6/UW7dZaza39TQ175OGtjI0dz6qxMY5cvpy1561jy733dF1Wa8xfb/6as4P5zV93/pIs2HBExGkR8Qv9j0+IiHdHxHnNl3Zwdk1MsGLlin33R0ZHmZiY6LCidpm/3vw1Zwfzm7/O/CWuVOa9SiUi3gucCwxFxP8GXgNsAd4TEadk5vXNl3hgppne77ESz+I9VOavN3/N2cH85q8zf4kZF7os9peBVwIvAnYCqzLzBxHxIeALwGHTcIyOrmDnUzv33d81McHIyEiHFbXL/PXmrzk7mN/8decvyUIrlcnMnMrMnwCPZeYPADLzaWBv49UdhBNPOpnt259gfHwHz+zZw+ZNGzn9zLO6Lqs15q83f83ZwfzmrzP/skW8tWWhCceeiPipfsPx6mcfjIhjOMwajqGhIa6+5lquvOJy9u6d4qKLL2H16jVdl9Ua89ebv+bsYH7z15m/xJVKb3p6//3XsyLiRZn5DwMePxZYmZkPD/pzuycHLNUqMTwEuye7rqIbw/321fzd1tEV89ebHerO3//ab7UDeMdn/mbRfs7+t4t+vpXa551wDGo2+o9/B/hOIxVJkqR5LStvwOF7qUiSVBobDkmS1LgSz+FYMq80KkmSDl9OOCRJKowrFUmS1LgCNyquVCRJUvOccEiSVJg231Z+sdhwSJJUmBLXEyXWLEmSCuOEQ5KkwhS4UbHhkCSpNCWew+FKRZIkNc4JhyRJhSlwwGHDIUlSaUp8pVFXKpIkqXFOOCRJKkyJJ43acEiSVJgC+w1XKpIkqXlOOCRJKkyJJ43acEiSVJge5XUcrlQkSVLjGplwDFc+NzF/1xV0y/xdV9CdmrOD+dvkSkWSJDXOhqNv92QTRy3D8FC9+Z/97cb83dbRFfPXmx3qzu9k58D4P5MkSYXpFfhCHDYckiQVpsSVilepSJKkxjnhkCSpMAVuVGw4JEkqTYlv3uZKRZIkNc4JhyRJhWn6pNGIWAvcCBwBrM/MDzzP834Z+BTwC5n5xfmO6YRDkqTC9HqLd5srIo4AbgLOBU4ALouIEwY872jgHcAXDqRmGw5JkjTbacC2zHw8M/cAdwIXDnjebwO/C+w+kIPacEiSVJhl9BbtNsBxwI5Z98f7j+0TEacAY5l594HW7DkckiQVpuGLVAYdffrZDyJiGfBh4E0Hc1AnHJIkabZxYGzW/VXAk7PuHw2cBGyJiCeAXwQ2RMSp8x3UCYckSYVp+CqVh4A1EfEK4FvApcAbn/1kZn4fOPbZ+xGxBfgPXqUiSdISs6zXW7TbXJk5CVwFfA54FPhkZn49Iq6LiAsOtebe9PT0ws86SLsnWfyDFsK3aDa/+butoys1f+9D3fn7X/utvvTnrQ9+c9F+zl7xi8e3UrsrFUmSClPgK5vbcEiSVBrfS0WSJGkAJxySJBWmwAGHDYckSaUpcT1RYs2SJKkwS6rheOD++7hg3Tmcv/Zsbr/t1q7LaZ35681fc3Ywv/nry9/r9Rbt1pYl03BMTU1xw/XXcfMt67lrw0Y2b7qbx7Zt67qs1pi/3vw1Zwfzm7/O/L1FvLXloBuOiPhYE4W8UI88vJWxseNZNTbGkcuXs/a8dWy5956uy2qN+evNX3N2ML/5685fknlPGo2IDXMe6gFnRsTLADLzkF/idLHtmphgxcoV++6PjI7y8NatHVbULvPXm7/m7GB+89eZv8TX4VjoKpVVwDeA9cy8NW0POBX4rw3XddCmB7yaepu7qa6Zv978NWcH85u/zvwlJlxopXIq8CXgGuD7mbkFeDoz/zIz/7Lp4g7G6OgKdj61c9/9XRMTjIyMdFhRu8xfb/6as4P5zV93/p
LM23Bk5t7M/DDwZuCaiPgIh+lrd5x40sls3/4E4+M7eGbPHjZv2sjpZ57VdVmtMX+9+WvODuY3f535e73Fu7XlgJqHzBwH/nVErAN+0GxJh2ZoaIirr7mWK6+4nL17p7jo4ktYvXpN12W1xvz15q85O5jf/HXmL3Ft5NvTLzLfotn85u+2jq7U/L0Pdefv4u3pP/GVby3az9nLTjnOt6eXJEn7K/FFtGw4JEkqTIkrFRsOSZIKU167UeZURpIkFcYJhyRJhXGlIkmSGlfieqLEmiVJUmGccEiSVBhXKpIkqXHltRuuVCRJUguccEiSVJgCNyo2HJIklWZZgUsVVyqSJKlxTjgkSSqMKxVJktS4nisVSZKk/TnhkCSpMK5UJElS47xKRZIkaQAnHJIkFcaViiRJalyJDYcrFUmS1LhGJhzDlc9NzN91Bd0yf9cVdKfm7GD+NpX4OhyNfHnsnmziqGUYHqo3/7P/sTF/t3V0xfz1Zoe683fRaC0rr99wpSJJkprnAEySpMK4UpEkSY3zKhVJkqQBnHBIklQYVyqSJKlxXqUiSZI0gBMOSZIK40pFkiQ1zqtUJEmSBnDCIUlSYQoccNhwSJJUmmUF7lRcqUiSpMY54ZAkqTDlzTdsOCRJKk+BHYcrFUmS1DgnHJIkFcYX/pIkSY0r8CIVVyqSJKl5TjgkSSpMgQMOGw5JkopTYMexpFYqD9x/HxesO4fz157N7bfd2nU5rTN/vflrzg7mN3/d+UuxZBqOqakpbrj+Om6+ZT13bdjI5k1389i2bV2X1Rrz15u/5uxgfvPXmb+3iP+0Zck0HI88vJWxseNZNTbGkcuXs/a8dWy5956uy2qN+evNX3N2ML/568zf6y3erS0H1XBExL+IiHdHxBuaKuhQ7ZqYYMXKFfvuj4yOMjEx0WFF7TJ/vflrzg7mN3/d+Usyb8MREX896+O3Ah8BjgbeGxHvabi2gzLN9H6P9Uq8UPkQmb/e/DVnB/Obv878vUW8tWWhCceRsz6+Ajg7M98PvAH41caqOgSjoyvY+dTOffd3TUwwMjLSYUXtMn+9+WvODuY3f6X5C+w4Fmo4lkXEP4qInwZ6mfltgMz8MTDZeHUH4cSTTmb79icYH9/BM3v2sHnTRk4/86yuy2qN+evNX3N2ML/5685fkoVeh+MY4EvM9EDTEbEiM3dGxFEcZlcBDw0NcfU113LlFZezd+8UF118CatXr+m6rNaYv978NWcH85u/zvxNX10SEWuBG4EjgPWZ+YE5n383cDkzw4dvA7+Rmd+c75i96en9918HUMhPAaOZ+XeDPr97csBSrRLDQ7D7sJr9tGe4376av9s6umL+erND3fn7X/ut/hL+1e0/XLSfs698+dHPqT0ijgD+FjgbGAceAi7LzG/Mes6ZwBcy8ycRcSVwRmb+ynx/zyG90mhm/gQY2GxIkqSinQZsy8zHASLiTuBCYF/DkZn3znr+g8CvLXTQJfM6HJIk1aLhc0aPA3bMuj/ef+z5vAX4XwvV7HupSJJUmmYXOIOOPnCFExG/BpwKnL7QQW04JEkqTMMnjY4DY7PurwKenPukiHg9cA1wemb+w0IHteGQJEmzPQSsiYhXAN8CLgXeOPsJEXEK8IfA2szcdSAH9RwOSZIK0+R7qWTmJHAV8DngUeCTmfn1iLguIi7oP+1DwFHApyLiqxGxYcGaD+Wy2IV4WWzXVXTDyyJn/m3+buvoSs3f+1B3/i4ui31k/EeL9nP2pFVHtVK7Ew5JktQ4z+GQJKk0h9VrfR8YGw5JkgrT9EubN8GViiRJapwTDkmSCjPo6pLDnQ2HJEmFKbDfcKUiSZKa54RDkqTSFDjisOGQJKkwXqUiSZI0gBMOSZIK41UqkiSpcQX2G65UJElS85xwSJJUmgJHHDYckiQVpsSrVBppOIYrb2PM33UF3TJ/1xV0p+bsYH7Nzy8PSZIK41UqfbsnmzhqGYaH6s3/7G835u+2jq6Yv97sUHf+LiY7BfYbXqUiSZKa50pFkqTSFDjisOGQJKkwJV6l4kpFkiQ1zgmHJEmF8SoVSZLUuAL7DVcqkiSpeU44JEkqTYEjDhsOSZIK41UqkiRJAzjhkCSpMF6lIkmSGldgv+FKRZIkNc8JhyRJhXGlIkmSWlBex+FKRZIkNc4JhyRJhXGlIkmSGldgv+FKRZIkNc8JhyRJhXGlIkmSGud7qXTsgfvv44J153D+2rO5/bZbuy6ndeavN3/N2cH85q87fymWTMMxNTXFDddfx823rOeuDRvZvOluHtu2reuyWmP+evPXnB3Mb/5K8/cW8daSeRuOiHhNRLy0//GLI+L9EfFnEfHBiDimnRIPzCMPb2Vs7HhWjY1x5PLlrD1vHVvuvafrslpj/nrz15wdzG/+OvMX2G8sOOH4KPCT/sc3AscAH+w/9kcN1nXQdk1MsGLlin33R0ZHmZiY6LCidpm/3vw1Zwfzm7/u/CVZ6KTRZZk52f/41Mx8Vf/jv4qIrzZY10GbZnq/x3olnsZ7iMxfb/6as4P5zV9n/hIjLjTheCQi3tz/+GsRcSpARPwc8EyjlR2k0dEV7Hxq5777uyYmGBkZ6bCidpm/3vw1Zwfzm7/O/L1F/KctCzUclwOnR8RjwAnA5yPiceC2/ucOGyeedDLbtz/B+PgOntmzh82bNnL6mWd1XVZrzF9v/pqzg/nNX3f+ksy7UsnM7wNvioijgZ/tP388Mw+7BdnQ0BBXX3MtV15xOXv3TnHRxZewevWarstqjfnrzV9zdjC/+SvNX+BKpTc9vf/+64XaPTlgqVaJ4SHYPbnw85ai4X77av5u6+iK+evNDnXn73/tt9oCfOdHk4v2c/bYo4ZaqX3JvA6HJEk6fPnS5pIkFabEq1RsOCRJKkyJ76ViwyFJUmFKnHB4DockSWqcDYckSWqcKxVJkgrjSkWSJGkAJxySJBXGq1QkSVLjXKlIkiQN4IRDkqTCFDjgsOGQJKk4BXYcrlQkSVLjnHBIklQYr1KRJEmN8yoVSZKkAZxwSJJUmAIHHDYckiQVp8COw4ZDkiQ9R0SsBW4EjgDWZ+YH5nz+RcDHgFcD3wV+JTOfmO+YnsMhSVJheov4z1wRcQRwE3AucAJwWUScMOdpbwH+PjNXAx8GPrhQzTYckiQVptdbvNsApwHbMvPxzNwD3AlcOOc5FwJ39D/+NPC6iJh30dPISmV4qMTt0uIZrnxRZf6uK+hWzflrzg7mb1PDP2ePA3bMuj8OvOb5npOZkxHxfeCnge8830GdcEiSpNkGNTPTh/Cc57DhkCRJs40DY7PurwKefL7nRMQQcAzwvfkO6gBMkiTN9hCwJiJeAXwLuBR445znbAD+HfB54JeBv8jMeSccS6rhWOgynqUsIj4KnA/sysyTuq6nbRExxswlWiuAvcCtmXljt1W1JyKGgfuAFzHzff3pzHxvt1W1q39m/ReBb2Xm+V3X06aIeAL4ITAFTGbmqZ0W1LKIeBmwHjiJmbH+b2Tm57utqlz9czKuAj7HzM/Tj2bm1yPiOuCLmbkBu
B34eERsY2aycelCx+1NT8/bkBSj/x+bvwXOZmbU8xBwWWZ+o9PCWhIRvwT8CPhYpQ3HSmBlZn45Io4GvgRcVNH//z3gJZn5o4g4Evgr4J2Z+WDHpbUmIt4NnAq8tNKG49TMfN4T9payiLgDuD8z10fEcuCnMvP/dV2XnmspncNxIJfxLFmZeR8L7M+Wssx8KjO/3P/4h8CjzJxFXYXMnM7MH/XvHtm/LY3fJg5ARKwC1jHzW64qEhEvBX6Jmd+4ycw9NhuHp6W0UjmQy3hUgYj4GeAU4Asdl9Kq/pTvS8Bq4KbMrCn/7wP/ETi660I6Mg38eURMA3+Ymbd2XVCLfhb4NvBHEfHPmPkeeGdm/rjbsjTXUppwHPQlOlp6IuIo4E+Bd2XmD7qup02ZOZWZr2TmjPLTIqKK1VpEPHvu0pe6rqVDr83MVzHzypBv669YazEEvAr4g8w8Bfgx8J5uS9IgS6kMPGClAAABTElEQVThOJDLeLSE9c9d+FPgTzLzf3ZdT1f64+QtwNqOS2nLa4EL+ucx3AmcFRH/vdOKWpaZT/b/vQu4i5kVcy3GgfFZE71PM9OA6DCzlBqOfZfx9E8aupSZy3ZUgf5Jk7cDj2bm73VdT9si4h/3z9QnIl4MvB74m26rakdmXp2ZqzLzZ5j5vv+LzPy1jstqTUS8pH+iNBHxEuANwCPdVtWezNwJ7IiI6D/0OqCKk8VLs2TO4Xi+y3g6Lqs1EfEJ4Azg2IgYB96bmbd3W1WrXgv8W+DhiPhq/7HfysxNHdbUppXAHf3zOJYBn8zMuzuuSe0YBe7q/7wdAv5HZm7utqTWvR34k/4vm48Db+64Hg2wZC6LlSRJh6+ltFKRJEmHKRsOSZLUOBsOSZLUOBsOSZLUOBsOSZLUOBsOSZLUOBsOSZLUOBsOSZLUuP8P0vwA3cv6dz8AAAAASUVORK5CYII=\n",
269 | "text/plain": [
270 | ""
271 | ]
272 | },
273 | "metadata": {
274 | "needs_background": "light"
275 | },
276 | "output_type": "display_data"
277 | }
278 | ],
279 | "source": [
280 | "# define figure size\n",
281 | "plt.rcParams[\"figure.figsize\"] = (10,7)\n",
282 | "\n",
283 | "# display omega\n",
284 | "sns.heatmap(DataFrame(initial_omega_1), cmap='Blues', annot=True, linewidths=.5)"
285 | ]
286 | },
287 | {
288 | "cell_type": "code",
289 | "execution_count": 291,
290 | "metadata": {},
291 | "outputs": [
292 | {
293 | "data": {
294 | "text/plain": [
295 | ""
296 | ]
297 | },
298 | "execution_count": 291,
299 | "metadata": {},
300 | "output_type": "execute_result"
301 | },
302 | {
303 | "data": {
304 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAF0AAAGYCAYAAAAgI+5wAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvpW3flQAADohJREFUeJztnX+wXGV5xz+7m6slhLFqkhubREXAJ9WqiGlkjAqCERwyxJn6A+iAwxSdUamgtpWp0wadkdJpUXHwDzGkaqtGBn8OKKCZQQSrxHQQA8mDEKlcQnITrSUQB5Kb7R+7pPcedu/ZPbvnfe659/uZOcPu2bP3feeTh2ffPfuc59SazSYiLfXoCcxFJD0ASQ9A0gOQ9AAkPYB5Jf3dqq1DaykHK0X65StGyvizpXH5joNJx1N6CUDSA5D0ACQ9AEkPQNIDkPQAJD0ASQ9A0gOQ9AAkPQBJD0DSA5D0ACQ9AEkPQNIDkPQAJD2AsqoBhs6lm3/Fk088TnNigsMTh7j27SdHT6kwlZEO8KUL3syB3/82ehoDo/QSQG6km9kKYB2wlFYR0S7gu+6+veS5TaHZbHL+dd+nSZOtX/8CW6/fkHL4oTKtdDP7KHAusAm4q717GfA1M9vk7leWPL8jbDzvFPaPP8rRz1vE+RtvZt/OHfz3z+9INfxQyYv0vwJe7u5TSqDM7FPAvUAy6fvHHwXgid/tZccPv83SV/55ZaXn5fTDwJ902P+C9mtJGDlqPs86esGRx8etXsP4/femGn7o5EX6pcBmM/sV8HB73wuB44GLy5zYZBY8f5R3XXMDAPVGg1/euIkH7rg11fBDp5Z3oZeZ1YFVtD5Ia8AYsMXdJ7q95/IVI5Wq2r18x8GZVbXr7oeBnyaYy5xB6/QAJD0ASQ9A0gOQ9AAkPQBJD0DSA5D0ACQ9AEkPQNIDkPQAJD0ASQ9A0gOQ9AAkPQBJD0DSA5D0AHJLMApSqRIMEjdOU6QHUE59+oF9pfzZ0pi/MOlwivQAJD0ASQ9A0gOQ9AAkPQBJD0DSA5D0ACQ9AEkPQNIDkPQAJD0ASQ9A0gOQ9AAkPQBJD0DSA5D0ACoj/fY7f8oZbzuHNWe/k2s3/nv0dAaiEtInJib4xJVXseGaq7jpG1/hxpt/yAMP/jp6WoWphPR7tm3nRcuXsXzZUp41MsJZZ5zO5tt+HD2twhSWbmYXDnMi07FnfC9LRhcfeT46upg9e/emGn7oDBLpHx/aLHJodiiNrKUtPxwqeX0Z7+nyUg0YHf50OrNk8WJ27xk/8nzPnnEWL0pbCjdM8moZR4EzgP/J7K8BPyllRh14xctX8NBvxnj4kV2MLl7ETbds5qp/Wp9q+KGTJ/1GYIG73519wcxuK2VGHZg3bx7/+NEPcdH7P8zE4Qn+Yt1aTjjuJamGHzrl1Kcf2Fet+vT5C1WfPtuR9AAkPQBJD0DSA5D0ACQ9AEkPQNIDkPQAJD0ASQ9A0gOQ9AAkPQBJD0DSA6jU/UhnMmb2ELAfmAAOufvKbsdK+nB5k7vndhhSeglAkZ6hl/vudbnXXRO41cyawOfd/dpu71ekZ6j1sHVhtbufBLwV+ICZvbHbgeVEeuJGZMOkVrAYw913tf87bmbfonVny9s7HatudTAlSBoFpJvZ0UDd3fe3H78F+ES345XTMxQM9FHgW2YGLadfdfebux0s6RmKRLq77wRe1evxkp6haE7vB0nPUJf09KSoJJX0DEVyer9Iegbl9AAatfJL6yU9g3J6AFq9BKBID0CrlwC0eglAkR6APkgD0AdpAEovAeiDNABFegDK6QHMS2C9MnUvqbrV1Wr526BUQnrKbnWNWv42KLnSzWyFmZ1uZgsy+88cfPjeSNmtrl5r5m4DjzHdi2b2QeA7wF8D28xs3aSXrxh49B5J2a1uJkT6e4DXuPvbgFOBfzCzS9qvJesGlLJbXb2HbVDyVi8Nd38cwN0fMrNTgRvM7EUklJ6yW13RSDazBvBz4BF3XzvdsXn/cLvN7MSnn7T/AdYCC4FXFJte/0zuVvfUwYPcdMtmTjv19aWMVa/lb124BNjeyxh5kX4BcGjyDnc/BFxgZp/vZYBhkLJbXcEC0mXAWcAngQ/nHT+tdHcfm+a1O/ue3QCc8obXccobXlf6OAXX4Z8B/g44ppeDK7FOT8m8Wv42GTNbC4y7+9Zex5D0DAWWjKuBs9tX120CTjOz/5huDDXDhCnNMO86vZE791WbJzomofbq7m/yVi864ZVhXl0VXskZ5MuHu98G3JZ3nKRn0I8YAagaIABFegCK9ADqCcoBJD2DSjACqCXIL5Keoab0kp5aguWLpGdQpAdQV04PQKuX9CjSA6jVy/9dR9Iz6MtRAEovEVR2yVjhbnWK9ACq++Wowi0CdcIrAK1eAlBOj6CyOb3CKNIDqO7qpcLUGv2fezGzP6LVQfrZtJze4O7rux2vqt0sxS4kfRI4zd1fBZwInGlmJ3cbQpGeoVZv9P0ed28Cj7efjrS3rpWokp6lYE5vX+i1FTge+Jy7/6zbsUovGWr1eu7WCXefcPcTgWXAKjP7s25jSHqGWqORu02Hu/+eVrl01yvKJT1LgQ9SM1tkZn/cfnwU8GZgR7chlNMzFPkgBV4AfKmd1+vA9e5+Y7eDJT1LgQ9Sd78HeHWvx0t6hoKR3heSnkWnAdKTtzoZBpKeRZGeHhUbRVArX3plvhwl61bXqOdug1IJ6Sm71dVq9dxtUCohPWW3Our1/G1AcnO6ma0Cmu6+xcxeRutEzg53/97Ao/dIp25192y7t5Sxwn+uM7P1tO4kO8/MfgC8ltYZtMvM7NXu/snSZ0jabnXMgG+kb6f189Ozgd3AMnd/zMz+BfgZrZ5VpZOyW12KSM9LUIfaJ+cPAA+6+2MA7v4H4HDps2uTslsdjUb+NiB5kf6Umc1vS3/N0zvN7DkklJ6yW91MSC9vdPcnAdx9suQR4N2lzaoDqbrVpfhylNci8Mku+/cBFSvN7ZEZEOlzj+gl45xEkR6AIj0ARXoA0auXOYkiPQDl9AAU6QEUyOlmthz4MrCE1umRa9396m7HV+JHjKTUG/nbMzkEfMTd/xQ4GfhA+7eHzkOUNPXqUqCA1N0fdff/aj/eT+veGEu7DaH0kmXAnG5mL6ZV19j1ogBJzzLAOr1917NvAJc+/dtDJyQ9S8FIN7MRWsK/4u7fnO5YSc9S4OJdM6sB1wHb3f1TecdLepZikb4aOB/4pZnd3d73990qJiQ9S7FLGu+gj+aCkp5FJ7wC0GmAABTpAVQ20ivcrU6RHkG9fCXqVgdT/8/UjxgBVDbSq4wu9AqgsquXKqPVSwCK9AAU6QEo0gNQpAegSA9AX44CUHoJQE12AlCkB6AP0gh0ajc9ivQIlNPTo/PpERS6EmMjsBYYd/euLbyLjzDbKdZO6otM07r7GUMMa66zh3oP21Tc/Xbgd/2MUAlStQgseM1Rf0MMYZqlk7JFYGudnrcNRt/SzezLA4/aJ2lbBJYf6Xn
d6r6b2VUD3vT0rQjc/eyBZ9ADc6pFIK07mdwHbKB1354asBK4quR5TSFpi8Ba/5FsZl8DTgUWmtkYsN7dr+t2fJ70lcAlwMeAv3X3u83sD+7+o75nNgApWwQWke7u5/Zz/LQ53d0Pu/ungQuBj5nZNQR8oUraIrDYbdT6oieB7j4GvMPMzgK6Xh9ZFklbBBaI9L6HaDa73mKtOAf2lfBHS2T+wiPh2/zNnblzr71w9UDhrnMvWRJEuqRnmQFLxrmHfsSIQJGeHuX0AJTTA9Atd2YnivQsqvAKQDk9AEV6BIr09CjSA9CScXaiSM8wjLsw5iHpWbRkDECRHoEiPT2K9ACU0wMoKN3MzgSuBhrABne/susQpZRg0KEObmbz/6Yf350/9wVLpvzLmFkDuB9YA4wBW4Bz3f2+Tm9XtzrIdKsrlNNXAQ+4+04AM9sErKNVB/oM9I00S7GyuqXAw5Oej6F7YvTB/EVFknqn93RNU4r04TAGLJ/0fBmwq9vBivThsAU4wcyOBR4BzgHO63awIn0IuPsh4GLgFlr3OLre3bteKqKqXZhStZsCRXoAkh6ApAcg6QFIegCSHoCkByDpAUh6AJIegKQHIOkBSHoAkh6ApAdQGenJutUloBLS03arK59KSE/arS4Bff0wbWavp1VYs83dby1nSs8kZbe6FEwb6WZ216TH7wGuAY4B1pvZZSXP7QhJu9UlIC+9jEx6/F5gjbt/HHgL8JelzSpD0m51CciTXjez55rZ84Gau+8FcPcngEOlz65N0m51CcjL6c8BttIqG2ua2RJ3321mC0hxyUKbpN3qElCo7sXM5gOj7t553aa6l2lRsRGo2GguIOkBSHoAkh6ApAcg6QFIegCSHoCkByDpAUh6AJIegKQHIOkBSHoAkh6ApAcg6QFIegCSHoCkB6BudS1UDTDbUYtAmNoiMAGK9AAkPQBJD0DSA5D0ACQ9AEkPQNIDkPQAJD0ASQ9A0gOQ9AAkPQBJD0DSA5D0ACQ9AEkPQNIDkPQAKiNdLQITM9taBE5b92JmrwW2u/tjZnYUcBlwEq1bPl7h7v+bYI5TWgQCR1oEHn/csSmGHzp5kb4RONB+fDWt9lL/3N73byXOawqdWgTu2bs31fBDJ6/Cq96+LxvASnc/qf34DjO7u8R5TWGutQjcZmYXth//wsxWApjZS4GDpc5sEnOtReBFwClm9iDwMuA/zWwn8IX2a0mYbS0CeyqVNrNjgJfQSkdj7r5n2jeU0DjtRz/+CVf862ePtAh830XvHt4fV7e6ANStbvYj6QFIegCSHoCkByDpAUh6AJIegKQHIOkBSHoAkh6ApAcg6QFIegCSHoCkByDpAUh6AJIegKQHIOkBlNUiUEyDIj0ASQ9A0gMopxlmCZjZmbRq5BvABne/MnhKhanEB6mZNYD7gTXAGLAFONfd7wudWEGqkl5WAQ+4+053fwrYBKwLnlNhqiJ9KfDwpOdj7X2VpCrSO5Uyz/y82IWqSB8Dlk96vgzYFTSXganK6mULcIKZHQs8ApwDnBc7peJUItLbV/hdDNwCbAeud/d7Y2dVnEosGWcblYj02YakByDpAUh6AJIegKQHIOkBSHoA/wde+B7BTJL64AAAAABJRU5ErkJggg==\n",
305 | "text/plain": [
306 | ""
307 | ]
308 | },
309 | "metadata": {
310 | "needs_background": "light"
311 | },
312 | "output_type": "display_data"
313 | }
314 | ],
315 | "source": [
316 | "# define figure size\n",
317 | "plt.rcParams[\"figure.figsize\"] = (1,7)\n",
318 | "\n",
319 | "# display xi\n",
320 | "sns.heatmap(DataFrame(initial_xi_1), cmap='Oranges', annot=True, linewidths=.5)"
321 | ]
322 | },
323 | {
324 | "cell_type": "markdown",
325 | "metadata": {},
326 | "source": [
327 | "---\n",
328 | "## SLAM inputs \n",
329 | "\n",
330 | "In addition to `data`, your slam function will also take in:\n",
331 | "* N - The number of time steps that a robot will be moving and sensing\n",
332 | "* num_landmarks - The number of landmarks in the world\n",
333 | "* world_size - The size (w/h) of your world\n",
334 | "* motion_noise - The noise associated with motion; the update confidence for motion should be `1.0/motion_noise`\n",
335 | "* measurement_noise - The noise associated with measurement/sensing; the update weight for measurement should be `1.0/measurement_noise`\n",
336 | "\n",
337 | "#### A note on noise\n",
338 | "\n",
339 | "Recall that `omega` holds the relative \"strengths\" or weights for each position variable, and you can update these weights by accessing the correct index in omega `omega[row][col]` and *adding/subtracting* `1.0/noise` where `noise` is measurement or motion noise. `Xi` holds actual position values, and so to update `xi` you'll do a similar addition process only using the actual value of a motion or measurement. So for a vector index `xi[row][0]` you will end up adding/subtracting one measurement or motion divided by their respective `noise`.\n",
340 | "\n",
341 | "## Graph SLAM Implementation\n",
342 | "\n",
343 | "With a 2D omega and xi structure as shown above (in earlier cells), you'll have to be mindful about how you update the values in these constraint matrices to account for motion and measurement constraints in the x and y directions. Recall that the solution to these matrices (which holds all values for robot poses `P` and landmark locations `L`) is the vector, `mu`, which can be computed at the end of the construction of omega and xi as the inverse of omega times xi: $\\mu = \\Omega^{-1}\\xi$\n",
344 | "\n",
345 | "**You may also choose to return the values of `omega` and `xi` if you want to visualize their final state!**"
346 | ]
347 | },
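{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before the full 2D implementation, here is a minimal 1D sketch of the add/subtract pattern described above (illustration only; `apply_motion_constraint` is a hypothetical helper, not part of the project API). A single motion constraint between poses `i` and `i+1` touches four entries of `omega` and two entries of `xi`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"# sketch (assumes a 1D numpy omega/xi): fold one motion of size dx between\n",
"# pose i and pose i+1 into the constraints, weighted by 1.0/motion_noise\n",
"def apply_motion_constraint(omega, xi, i, dx, motion_noise):\n",
"    w = 1.0 / motion_noise\n",
"    omega[i, i] += w\n",
"    omega[i, i+1] -= w\n",
"    omega[i+1, i] -= w\n",
"    omega[i+1, i+1] += w\n",
"    xi[i] -= dx / motion_noise\n",
"    xi[i+1] += dx / motion_noise\n",
"    return omega, xi\n",
"\n",
"# the same pattern, weighted by 1.0/measurement_noise, links a pose index\n",
"# to a landmark index instead of to the next pose"
]
},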
348 | {
349 | "cell_type": "code",
350 | "execution_count": 292,
351 | "metadata": {},
352 | "outputs": [
353 | {
354 | "name": "stdout",
355 | "output_type": "stream",
356 | "text": [
357 | "[1 3 6]\n",
358 | "[[2 2]\n",
359 | " [5 4]\n",
360 | " [5 9]]\n"
361 | ]
362 | }
363 | ],
364 | "source": [
365 | "import numpy as np \n",
366 | " \n",
367 | "a = np.array([[1 ,2 ],[3 ,4 ],[5 ,6 ]]) \n",
368 | "print(a[[0 ,1 ,2 ],[0 ,0 ,1]]) \n",
369 | "\n",
370 | "a[[0 ,1 ,2 ],[0 ,0 ,1]] = a[[0 ,1 ,2 ],[0 ,0 ,1]] + [1, 2, 3]\n",
371 | "print(a)"
372 | ]
373 | },
374 | {
375 | "cell_type": "code",
376 | "execution_count": 293,
377 | "metadata": {},
378 | "outputs": [],
379 | "source": [
380 | "## Optimized implementation of Graph SLAM. Minimum Lines.\n",
381 | "\n",
382 | "## slam takes in 6 arguments and returns mu, \n",
383 | "## mu is the entire path traversed by a robot (all x,y poses) *and* all landmarks locations\n",
384 | "def slam(data, N, num_landmarks, world_size, motion_noise, measurement_noise):\n",
385 | " \n",
386 | " coefficients = [1, -1, -1, 1]\n",
387 | " \n",
388 | " # initialize the constraints\n",
389 | " initial_omega_1, initial_xi_1, initial_omega_2, initial_xi_2 = initialize_constraints(N, num_landmarks, world_size)\n",
390 | " \n",
391 | " ## get all the motion and measurement data as you iterate\n",
392 | " for i in range(len(data)):\n",
393 | " \n",
394 | " landmarks = data[i][0] # measurement\n",
395 | " motion = data[i][1] # motion\n",
396 | " \n",
397 | " # setting measurement constraints\n",
398 | " for landmark in landmarks:\n",
399 | " \n",
400 | " # calculate indices in the same order as coefficients (to meaningfully add)\n",
401 | " index1 = [i, i, N+landmark[0], N+landmark[0]]\n",
402 | " index2 = [i, N+landmark[0], i, N+landmark[0]]\n",
403 | " \n",
404 | " # dx update\n",
405 | " initial_omega_1[index1, index2] = initial_omega_1[index1, index2] + np.divide(coefficients, measurement_noise)\n",
406 | " initial_xi_1[[i, N+landmark[0]]] = initial_xi_1[[i, N+landmark[0]]] + np.divide([-landmark[1], landmark[1]], measurement_noise)\n",
407 | " \n",
408 | " # dy update\n",
409 | " initial_omega_2[index1, index2] = initial_omega_2[index1, index2] + np.divide(coefficients, measurement_noise)\n",
410 | " initial_xi_2[[i, N+landmark[0]]] = initial_xi_2[[i, N+landmark[0]]] + np.divide([-landmark[2], landmark[2]], measurement_noise)\n",
411 | " \n",
412 | " index1 = [i, i, i+1, i+1]\n",
413 | " index2 = [i, i+1, i, i+1]\n",
414 | " \n",
415 | " # dx update\n",
416 | " initial_omega_1[index1, index2] = initial_omega_1[index1, index2] + np.divide(coefficients, motion_noise)\n",
417 | " initial_xi_1[[i, i+1]] = initial_xi_1[[i, i+1]] + np.divide([-motion[0], motion[0]], motion_noise)\n",
418 | " \n",
419 | " # dy update\n",
420 | " initial_omega_2[index1, index2] = initial_omega_2[index1, index2] + np.divide(coefficients, motion_noise)\n",
421 | " initial_xi_2[[i, i+1]] = initial_xi_2[[i, i+1]] + np.divide([-motion[1], motion[1]], motion_noise)\n",
422 | " \n",
423 | " ## TODO: update the constraint matrix/vector to account for all measurements, measurement noise,\n",
424 | " ## motion and motion noise. Compute best estimate of poses and landmark positions using the formula, omega_inverse * Xi\n",
425 | " mu_1 = np.linalg.inv(np.matrix(initial_omega_1)) * np.expand_dims(initial_xi_1, 0).transpose()\n",
426 | " mu_2 = np.linalg.inv(np.matrix(initial_omega_2)) * np.expand_dims(initial_xi_2, 0).transpose()\n",
427 | " \n",
428 | " mu = []\n",
429 | " for i in range(len(mu_1)):\n",
430 | " mu.extend((mu_1[i], mu_2[i]))\n",
431 | " \n",
432 | " return mu # return 2d `mu`\n"
433 | ]
434 | },
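{
"cell_type": "markdown",
"metadata": {},
"source": [
"A note on the design choice above: because the x constraints and the y constraints never reference each other, the implementation keeps two separate `dim x dim` systems (`omega_1`/`xi_1` for x and `omega_2`/`xi_2` for y) rather than one interlaced `2*dim` system, solves each independently, and only interlaces the two solutions into `mu` at the end."
]
},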
435 | {
436 | "cell_type": "markdown",
437 | "metadata": {},
438 | "source": [
439 | "## Helper functions\n",
440 | "\n",
441 | "To check that your implementation of SLAM works for various inputs, we have provided two helper functions that will help display the estimated pose and landmark locations that your function has produced. First, given a result `mu` and number of time steps, `N`, we define a function that extracts the poses and landmarks locations and returns those as their own, separate lists. \n",
442 | "\n",
443 | "Then, we define a function that nicely print out these lists; both of these we will call, in the next step.\n"
444 | ]
445 | },
446 | {
447 | "cell_type": "code",
448 | "execution_count": 294,
449 | "metadata": {},
450 | "outputs": [],
451 | "source": [
452 | "# a helper function that creates a list of poses and of landmarks for ease of printing\n",
453 | "# this only works for the suggested constraint architecture of interlaced x,y poses\n",
454 | "def get_poses_landmarks(mu, N):\n",
455 | " # create a list of poses\n",
456 | " poses = []\n",
457 | " for i in range(N):\n",
458 | " poses.append((mu[2*i].item(), mu[2*i+1].item()))\n",
459 | "\n",
460 | " # create a list of landmarks\n",
461 | " landmarks = []\n",
462 | " for i in range(num_landmarks):\n",
463 | " landmarks.append((mu[2*(N+i)].item(), mu[2*(N+i)+1].item()))\n",
464 | "\n",
465 | " # return completed lists\n",
466 | " return poses, landmarks\n"
467 | ]
468 | },
469 | {
470 | "cell_type": "code",
471 | "execution_count": 295,
472 | "metadata": {},
473 | "outputs": [],
474 | "source": [
475 | "def print_all(poses, landmarks):\n",
476 | " print('\\n')\n",
477 | " print('Estimated Poses:')\n",
478 | " for i in range(len(poses)):\n",
479 | " print('['+', '.join('%.3f'%p for p in poses[i])+']')\n",
480 | " print('\\n')\n",
481 | " print('Estimated Landmarks:')\n",
482 | " for i in range(len(landmarks)):\n",
483 | " print('['+', '.join('%.3f'%l for l in landmarks[i])+']')\n"
484 | ]
485 | },
486 | {
487 | "cell_type": "markdown",
488 | "metadata": {},
489 | "source": [
490 | "## Run SLAM\n",
491 | "\n",
492 | "Once you've completed your implementation of `slam`, see what `mu` it returns for different world sizes and different landmarks!\n",
493 | "\n",
494 | "### What to Expect\n",
495 | "\n",
496 | "The `data` that is generated is random, but you did specify the number, `N`, or time steps that the robot was expected to move and the `num_landmarks` in the world (which your implementation of `slam` should see and estimate a position for. Your robot should also start with an estimated pose in the very center of your square world, whose size is defined by `world_size`.\n",
497 | "\n",
498 | "With these values in mind, you should expect to see a result that displays two lists:\n",
499 | "1. **Estimated poses**, a list of (x, y) pairs that is exactly `N` in length since this is how many motions your robot has taken. The very first pose should be the center of your world, i.e. `[50.000, 50.000]` for a world that is 100.0 in square size.\n",
500 | "2. **Estimated landmarks**, a list of landmark positions (x, y) that is exactly `num_landmarks` in length. \n",
501 | "\n",
502 | "#### Landmark Locations\n",
503 | "\n",
504 | "If you refer back to the printout of *exact* landmark locations when this data was created, you should see values that are very similar to those coordinates, but not quite (since `slam` must account for noise in motion and measurement)."
505 | ]
506 | },
507 | {
508 | "cell_type": "code",
509 | "execution_count": 296,
510 | "metadata": {
511 | "scrolled": false
512 | },
513 | "outputs": [
514 | {
515 | "name": "stdout",
516 | "output_type": "stream",
517 | "text": [
518 | "\n",
519 | "\n",
520 | "Estimated Poses:\n",
521 | "[50.000, 50.000]\n",
522 | "[52.480, 70.055]\n",
523 | "[56.108, 90.923]\n",
524 | "[35.630, 94.832]\n",
525 | "[16.401, 99.498]\n",
526 | "[35.537, 97.548]\n",
527 | "[54.325, 96.304]\n",
528 | "[73.259, 93.547]\n",
529 | "[93.515, 93.291]\n",
530 | "[82.962, 76.024]\n",
531 | "[72.818, 59.175]\n",
532 | "[62.115, 41.686]\n",
533 | "[50.447, 23.174]\n",
534 | "[40.059, 5.881]\n",
535 | "[21.148, 5.680]\n",
536 | "[1.185, 6.857]\n",
537 | "[20.096, 14.759]\n",
538 | "[37.264, 23.434]\n",
539 | "[55.876, 33.656]\n",
540 | "[73.653, 42.820]\n",
541 | "\n",
542 | "\n",
543 | "Estimated Landmarks:\n",
544 | "[45.901, 39.276]\n",
545 | "[46.580, 47.959]\n",
546 | "[33.748, 30.970]\n",
547 | "[33.779, 44.856]\n",
548 | "[79.896, 23.079]\n"
549 | ]
550 | }
551 | ],
552 | "source": [
553 | "# call your implementation of slam, passing in the necessary parameters\n",
554 | "mu = slam(data, N, num_landmarks, world_size, motion_noise, measurement_noise)\n",
555 | "\n",
556 | "\n",
557 | "# print(mu)\n",
558 | "# print out the resulting landmarks and poses\n",
559 | "if(mu is not None):\n",
560 | " # get the lists of poses and landmarks\n",
561 | " # and print them out\n",
562 | " poses, landmarks = get_poses_landmarks(mu, N)\n",
563 | " print_all(poses, landmarks)"
564 | ]
565 | },
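{
"cell_type": "markdown",
"metadata": {},
"source": [
"A few quick sanity checks against the expectations above (a sketch; the numerical tolerance is an assumption):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# sketch: the result should contain N poses and num_landmarks landmarks,\n",
"# with the first pose at the center of the world (up to float tolerance)\n",
"assert len(poses) == N\n",
"assert len(landmarks) == num_landmarks\n",
"assert abs(poses[0][0] - world_size/2) < 1e-3\n",
"assert abs(poses[0][1] - world_size/2) < 1e-3\n",
"print('sanity checks passed')"
]
},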
566 | {
567 | "cell_type": "markdown",
568 | "metadata": {},
569 | "source": [
570 | "## Visualize the constructed world\n",
571 | "\n",
572 | "Finally, using the `display_world` code from the `helpers.py` file (which was also used in the first notebook), we can actually visualize what you have coded with `slam`: the final position of the robot and the positon of landmarks, created from only motion and measurement data!\n",
573 | "\n",
574 | "**Note that these should be very similar to the printed *true* landmark locations and final pose from our call to `make_data` early in this notebook.**"
575 | ]
576 | },
577 | {
578 | "cell_type": "code",
579 | "execution_count": 297,
580 | "metadata": {},
581 | "outputs": [
582 | {
583 | "name": "stdout",
584 | "output_type": "stream",
585 | "text": [
586 | "Last pose: (73.65291181422072, 42.82015482215206)\n"
587 | ]
588 | },
589 | {
590 | "data": {
591 | "image/png": "iVBORw0KGgoAAAANSUhEUgAABIoAAARfCAYAAABz4ghpAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvpW3flQAAIABJREFUeJzs3WvsbXddJvBnt5vedBdpWcdTKEZDRyhq8NaSI3IrtoVUtCCGYS4OjeJMcKilIRMJQ6jhNhrSaZ0MgQqG4ui84AjFtAKHHoGCFAGnKCOgow0RiO3ZQC9/bKHtYc2LWVNDENn/s89ifX+rn8/L04fffpzMN5onZ5VF3/d9AAAAAHjAO2bqAgAAAADUYCgCAAAAIImhCAAAAICBoQgAAACAJIYiAAAAAAaGIgAAAACSbDAUveQlL8m+ffvy0z/90/f/2e23356LLroo5513Xi666KLccccdSZK+7/PKV74y5557bp7xjGfkL//yL8drDgAAAMBR9S2Homc961l54xvf+HV/dtVVV2Xfvn05cOBA9u3bl6uuuipJcsMNN+Qzn/lMDhw4kFe84hW57LLLRikNAAAAwNH3LYeis846Kw9+8IO/7s8OHjyYCy+8MEly4YUX5vrrr/+6P18sFvnhH/7h3HnnnTl06NAItQEAAAA42o7o31H0xS9+MXv27EmS7NmzJ1/60peSJLfeemv27t17f27v3r259dZbj0JNAAAAAMZ2VP9l1n3ff8OfLRaLo/kTAAAAAIxkeST/oVNPPTWHDh3Knj17cujQoZxyyilJ/t/fILrlllvuz91yyy33/82jf8p6vbPR73XdarbZKj1ay1bpUSG7m3zXrZLM9/aq9KiQrdKjtexYb7d4e1V6VMhW6VEhW6VHa7c35tutZav0qJCt0qPC7Y3VY+7ZKj0qZKv0qJLd1hH9jaJzzjkn11xzTZLkmmuuyVOf+tSv+/O+7/Pxj388q9Xqnx2KAAAAAKjjW/6NoksvvTQf+chHctttt+WJT3xiXvjCF+aXf/mXc8kll2T//v057bTTcuWVVyZJnvSkJ+X9739/zj333Jx44ol59atfPfr/AAAAAAAcHd9yKLr88sv/yT+/+uqrv+HPFotFXv7yl2/fCgAAAIBvu6P6L7MGAAAAoF2GIgAAAACSGIoAAAAAGBiKAAAAAEhiKAIAAABgYCgCAAAAIEmy6Pu+n7oEAAAAANNbTvnj6/XORrmuW802W6VHa9kqPSpkd5PvulWS+d5elR4VslV6tJYd6+0Wb69KjwrZKj0qZKv0aO32xny7tWyVHhWyVXpUuL2xesw9W6VHhWyVHlWy2/LpGQAAAABJDEUAAAAADAxFAAAAACQxFAEAAAAwMBQBAAAAkMRQBAAAAMDAUAQAAABAEkMRAAAAAANDEQAAAABJkkXf9/3UJQAAAACY3nLKH1+vdzbKdd1qttkqPVrLVulRIbubfNetksz39qr0qJCt0qO17Fhvt3h7VXpUyFbpUSFbpUdrtzfm261lq/SokK3So8LtjdVj7tkqPSpkq/Sokt2WT88AAAAASGIoAgAAAGBgKAIAAAAgiaEIAAAAgIGhCAAAAIAkhiIAAAAABoYiAAAAAJIYigAAAAAYGIoAAAAASJIs+r7vpy4BAAAAwPSWU/74er2zUa7rVrPNVunRWrZKjwrZ3eS7bpVkvrdXpUeFbJUerWXHervF26vSo0K2So8K2So9Wru9Md9uLVulR4VslR4Vbm+sHnPPVulRIVulR5Xstnx6BgAAAEASQxEAAAAAA0MRAAAAAEkMRQAAAAAMDEUAAAAAJDEUAQAAADAwFAEAAACQxFAEAAAAwMBQBAAAAECSZNH3fT91CQAAAACmt5zyx9frnY1yXbeabbZKj9ayVXpUyO4m33WrJPO9vSo9KmSr9GgtO9bbLd5elR4VslV6VMhW6dHa7Y35dmvZKj0qZKv0qHB7Y/WYe7ZKjwrZKj2qZLfl0zMAAAAAkhiKAAAAABgYigAAAABIYigCAAAAYGAoAgAAACCJoQgAAACAgaEIAAAAgCSGIgAAAAAGi77v+6lLAAAAADC95ZQ/vl7vbJTrutVss1V6tJat0qNCdjf5rlslme/tVelRIVulR2vZsd5u8faq9KiQrdKjQrZKj9Zub8y3W8tW6VEhW6VHhdsbq8fcs1V6VMhW6VEluy2fngEAAACQxFAEAAAAwMBQBAAAAEASQxEAAAAAA0MRAAAAAEkMRQAAAAAMDEUAAAAAJDEUAQAAADAwFAEAAACQJFn0fd9PXQIAAACA6S2n/PH1emejXNetZput0qO1bJUeFbK7yXfdKsl8b69KjwrZKj1ay471dou3V6VHhWyVHhWyVXq0dntjvt1atkqPCtkqPSrc3lg95p6t0qNCtkqPKtlt+fQMAAAAgCSGIgAAAAAGhiIAAAAAkhiKAAAAABgYigAAAABIYigCAAAAYGAoAgAAACCJoQgAAACAgaEIAAAAgCTJou/7fuoSAAAAAExvOeWPr9c7G+W6bjXbbJUerWWr9KiQ3U2+61ZJ5nt7VXpUyFbp0Vp2rLdbvL0qPSpkq/SokK3So7XbG/Pt1rJVelTIVulR4fbG6jH3bJUeFbJVelTJbsunZwAAAAAkMRQBAAAAMDAUAQAAAJDEUAQAAADAwFAEAAAAQBJDEQAAAAADQxEAAAAASQxFAAAAAAwMRQAAAAAkSRZ93/dTlwAAAABgesspf3y93tko13Wr2War9GgtW6VHhexu8l23SjLf26vSo0K2So/WsmO93eLtVelRIVulR4VslR6t3d6Yb7eWrdKjQrZKjwq3N1aPuWer9KiQrdKjSnZbPj0DAAAAIImhCAAAAICBoQgAAACAJIYiAAAAAAaGIgAAAACSGIoAAAAAGBiKAAAAAEhiKAIAAABgYCgCAAAAIEmy6Pu+n7oEAAAAANNbTvnj6/XORrmuW802W6VHa9kqPSpkd5PvulWS+d5elR4VslV6tJYd6+0Wb69KjwrZKj0qZKv0aO32xny7tWyVHhWyVXpUuL2xesw9W6VHhWyVHlWy2/LpGQAAAABJDEUAAAAADAxFAAAAACQxFAEAAAAwMBQBAAAAkMRQBAAAAMDAUAQAAABAEkMRAAAAAANDEQAAAABJkkXf9/3UJQAAAACY3nLKH1+vdzbKdd1qttkqPVrLVulRIbubfNetksz39qr0qJCt0qO17Fhvt3h7VXpUyFbpUSFbpUdrtzfm261lq/SokK3So8LtjdVj7tkqPSpkq/Sokt2WT88AAAAASGIoAgAAAGBgKAIAAAAgiaEIAAAAgIGhCAAAAIAkhiIAAAAABoYiAAAAAJIYigAAAAAYGIoAAAAASJIs+r7vpy4BAAAAwPSWU/74er2zUa7rVrPNVunRWrZKjwrZ3eS7bpVkvrdXpUeFbJUerWXHervF26vSo0K2So8K2So9Wru9Md9uLVulR4VslR4Vbm+sHnPPVulRIVulR5Xstnx6BgAAAEASQxE
AAAAAA0MRAAAAAEkMRQAAAAAMDEUAAAAAJDEUAQAAADAwFAEAAACQxFAEAAAAwMBQBAAAAECSZNH3fT91CQAAAACmt5zyx9frnY1yXbeabbZKj9ayVXpUyO4m33WrJPO9vSo9KmSr9GgtO9bbLd5elR4VslV6VMhW6dHa7Y35dmvZKj0qZKv0qHB7Y/WYe7ZKjwrZKj2qZLfl0zMAAAAAkhiKAAAAABgYigAAAABIYigCAAAAYGAoAgAAACCJoQgAAACAgaEIAAAAgCSGIgAAAAAGi77v+6lLAAAAADC95ZQ/vl7vbJTrutVss1V6tJat0qNCdjf5rlslme/tVelRIVulR2vZsd5u8faq9KiQrdKjQrZKj9Zub8y3W8tW6VEhW6VHhdsbq8fcs1V6VMhW6VEluy2fngEAAACQxFAEAAAAwMBQBAAAAEASQxEAAAAAA0MRAAAAAEkMRQAAAAAMDEUAAAAAJDEUAQAAADAwFAEAAACQJFn0fd9PXQIAAACA6S2n/PH1emejXNetZput0qO1bJUeFbK7yXfdKsl8b69KjwrZKj1ay471dou3V6VHhWyVHhWyVXq0dntjvt1atkqPCtkqPSrc3lg95p6t0qNCtkqPKtlt+fQMAAAAgCSGIgAAAAAGhiIAAAAAkhiKAAAAABgYigAAAABIYigCAAAAYGAoAgAAACCJoQgAAACAgaEIAAAAgCTJou/7fuoSAAAAAExvOeWPr9c7G+W6bjXbbJUerWWr9KiQ3U2+61ZJ5nt7VXpUyFbp0Vp2rLdbvL0qPSpkq/SokK3So7XbG/Pt1rJVelTIVulR4fbG6jH3bJUeFbJVelTJbsunZwAAAAAkMRQBAAAAMDAUAQAAAJDEUAQAAADAwFAEAAAAQBJDEQAAAAADQxEAAAAASQxFAAAAAAwMRQAAAAAkSRZ93/dTlwAAAABgesspf3y93tko13Wr2War9GgtW6VHhexu8l23SjLf26vSo0K2So/WsmO93eLtVelRIVulR4VslR6t3d6Yb7eWrdKjQrZKjwq3N1aPuWer9KiQrdKjSnZbPj0DAAAAIImhCAAAAICBoQgAAACAJIYiAAAAAAaGIgAAAACSGIoAAAAAGBiKAAAAAEhiKAIAAABgYCgCAAAAIEmy6Pu+n7oEAAAAANNbTvnj6/XORrmuW802W6VHa9kqPSpkd5PvulWS+d5elR4VslV6tJYd6+0Wb69KjwrZKj0qZKv0aO32xny7tWyVHhWyVXpUuL2xesw9W6VHhWyVHlWy2/LpGQAAAABJDEUAAAAADAxFAAAAACQxFAEAAAAwMBQBAAAAkMRQBAAAAMDAUAQAAABAEkMRAAAAAANDEQAAAABJkkXf9/3UJQAAAACY3nLKH1+vdzbKdd1qttkqPVrLVulRIbubfNetksz39qr0qJCt0qO17Fhvt3h7VXpUyFbpUSFbpUdrtzfm261lq/SokK3So8LtjdVj7tkqPSpkq/Sokt2WT88AAAAASGIoAgAAAGBgKAIAAAAgiaEIAAAAgIGhCAAAAIAkhiIAAAAABoYiAAAAAJIYigAAAAAYGIoAAAAASJIs+r7vpy4BAAAAwPSWU/74er2zUa7rVrPNVunRWrZKjwrZ3eS7bpVkvrdXpUeFbJUerWXHervF26vSo0K2So8K2So9Wru9Md9uLVulR4VslR4Vbm+sHnPPVulRIVulR5Xstnx6BgAAAEASQxEAAAAAA0MRAAAAAEkMRQAAAAAMDEUAAAAAJNnyv/XszW9+c9761rdmsVjk+7//+/Oa17wmhw4dyqWXXpo77rgjj3nMY/Kbv/mbOe64445WXwAAAABGcsR/o+jWW2/NW97ylvzBH/xBrr322hw+fDjXXXddXvva1+Z5z3teDhw4kJNPPjn79+8/mn0BAAAAGMlWn54dPnw4X/nKV3LfffflK1/5Srquy4c//OGcf/75SZJnPvOZOXjw4FEpCgAAAMC4Fn3f90f6H7766qtzxRVX5Pjjj8/jH//4vPSlL81znvOcvOc970mS/P3f/32e//zn59prrz1qhQEAAAAYxxH/jaI77rgjBw8ezMGDB/OBD3wgd999d2644YZvyC0Wi60KAgAAAPDtccT/MusPfehDOf3003PKKackSc4777zcdNNNufPOO3PfffdluVzmlltuyZ49e77pG+v1zka/1XWr2War9GgtW6VHhexu8l23SjLf26vSo0K2So/WsmO93eLtVelRIVulR4VslR6t3d6Yb7eWrdKjQrZKjwq3N1aPuWer9KiQrdKjSnZbR/w3ih72sIflz//8z3P33Xen7/vceOONOeOMM/K4xz0u7373u5Mkb3/723POOedsXRIAAACA8R3x3yh67GMfm/PPPz/PfOYzs1wuc+aZZ+Y5z3lOnvzkJ+dFL3pRrrjiipx55pn5+Z//+aPZFwAAAICRHPFQlCQXX3xxLr744q/7s0c84hHZv3//VqUAAAAA+PY74k/PAAAAAJgXQxEAAAAASQxFAAAAAAwMRQAAAAAkMRQBAAAAMFj0fd9PXQIAAACA6S2n/PH1emejXNetZput0qO1bJUeFbK7yXfdKsl8b69KjwrZKj1ay471dou3V6VHhWyVHhWyVXq0dntjvt1atkqPCtkqPSrc3lg95p6t0qNCtkqPKtlt+fQMAAAAgCSGIgAAAAAGhiIAAAAAkhiKAAAAABgYigAAAABIYigCAAAAYGAoAgAAACCJoQgAAACAgaEIAAAAgCTJou/7fuoSAAAAAExvOeWPr9c7G+W6bjXbbJUerWWr9KiQ3U2+61ZJ5nt7VXpUyFbp0Vp2rLdbvL0qPSpkq/SokK3So7XbG/Pt1rJVelTIVulR4fbG6jH3bJUeFbJVelTJbsunZwAAAAAkMRQBAAAAMDAUAQAAAJDEUAQAAADAwFAEAAAAQBJDEQAAAAADQxEAAAAASQxFAAAAAAwMRQAAAAAkSRZ93/dTlwAAAABgesspf3y93tko13Wr2War9GgtW6VHhexu8l23SjLf26vSo0K2So/WsmO93eLtVelRIVulR4VslR6t3d6Yb7eWrdKjQrZKjwq3N1aPuWer9KiQrdKjSnZbPj0DAAAAIImhCAAAAICBoQgAAACAJIYiAAAAAAaGIgAAAACSGIoAAAAAGBiKAAAAAEhiKAIAAABgYCgCAAAAIEmy6Pu+n7oEAAAAANNbTvnj6/XORrmuW802W6VHa9kqPSpkd5PvulWS+d5elR4VslV6tJYd6+0Wb69KjwrZKj0qZKv0aO32xny7tWyVHhWyVXpUuL2xesw9W6VHhWyVHlWy2/LpGQAAAABJDEUAAAAADAxFAAAAACQxFAEAAAAwMBQBAAAAkMRQBAAAAMDAUAQAAABAEkMRAAAAAANDEQAAAABJkkXf9/3UJQAAAACY3nLKH1+vdzbKdd1qttkqPVrLVulRIbubfNetksz39qr0qJCt0qO17Fhvt3h7VXpUyFbpUSFbpUdrtzfm261lq/SokK3So8LtjdVj7tkqPSpkq/Sokt2WT88AAA
AASGIoAgAAAGBgKAIAAAAgiaEIAAAAgIGhCAAAAIAkhiIAAAAABoYiAAAAAJIYigAAAAAYGIoAAAAASJIs+r7vpy4BAAAAwPSWU/74er2zUa7rVrPNVunRWrZKjwrZ3eS7bpVkvrdXpUeFbJUerWXHervF26vSo0K2So8K2So9Wru9Md9uLVulR4VslR4Vbm+sHnPPVulRIVulR5Xstnx6BgAAAEASQxEAAAAAA0MRAAAAAEkMRQAAAAAMDEUAAAAAJDEUAQAAADAwFAEAAACQxFAEAAAAwMBQBAAAAECSZNH3fT91CQAAAACmt5zyx9frnY1yXbeabbZKj9ayVXpUyO4m33WrJPO9vSo9KmSr9GgtO9bbLd5elR4VslV6VMhW6dHa7Y35dmvZKj0qZKv0qHB7Y/WYe7ZKjwrZKj2qZLfl0zMAAAAAkhiKAAAAABgYigAAAABIYigCAAAAYGAoAgAAACCJoQgAAACAgaEIAAAAgCSGIgAAAAAGhiIAAAAAkiSLvu/7qUsAAAAAML3llD++Xu9slOu61WyzVXq0lq3So0J2N/muWyWZ7+1V6VEhW6VHa9mx3m7x9qr0qJCt0qNCtkqP1m5vzLdby1bpUSFbpUeF2xurx9yzVXpUyFbpUSW7LZ+eAQAAAJDEUAQAAADAwFAEAAAAQBJDEQAAAAADQxEAAAAASQxFAAAAAAwMRQAAAAAkMRQBAAAAMFj0fd9PXQIAAACA6S2n/PH1emejXNetZput0qO1bJUeFbK7yXfdKsl8b69KjwrZKj1ay471dou3V6VHhWyVHhWyVXq0dntjvt1atkqPCtkqPSrc3lg95p6t0qNCtkqPKtlt+fQMAAAAgCSGIgAAAAAGhiIAAAAAkhiKAAAAABgYigAAAABIYigCAAAAYGAoAgAAACCJoQgAAACAgaEIAAAAgCTJou/7fuoSAAAAAExvOeWPr9c7G+W6bjXbbJUerWWr9KiQ3U2+61ZJ5nt7VXpUyFbp0Vp2rLdbvL0qPSpkq/SokK3So7XbG/Pt1rJVelTIVulR4fbG6jH3bJUeFbJVelTJbsunZwAAAAAkMRQBAAAAMDAUAQAAAJDEUAQAAADAwFAEAAAAQBJDEQAAAAADQxEAAAAASQxFAAAAAAwMRQAAAAAkSRZ93/dTlwAAAABgesspf3y93tko13Wr2War9GgtW6VHhexu8l23SjLf26vSo0K2So/WsmO93eLtVelRIVulR4VslR6t3d6Yb7eWrdKjQrZKjwq3N1aPuWer9KiQrdKjSnZbPj0DAAAAIImhCAAAAICBoQgAAACAJIYiAAAAAAaGIgAAAACSGIoAAAAAGBiKAAAAAEhiKAIAAABgYCgCAAAAIEmy6Pu+n7oEAAAAANNbTvnj6/XORrmuW802W6VHa9kqPSpkd5PvulWS+d5elR4VslV6tJYd6+0Wb69KjwrZKj0qZKv0aO32xny7tWyVHhWyVXpUuL2xesw9W6VHhWyVHlWy2/LpGQAAAABJDEUAAAAADAxFAAAAACQxFAEAAAAwMBQBAAAAkMRQBAAAAMDAUAQAAABAEkMRAAAAAANDEQAAAABJkkXf9/3UJQAAAACY3nLKH1+vdzbKdd1qttkqPVrLVulRIbubfNetksz39qr0qJCt0qO17Fhvt3h7VXpUyFbpUSFbpUdrtzfm261lq/SokK3So8LtjdVj7tkqPSpkq/Sokt2WT88AAAAASGIoAgAAAGBgKAIAAAAgiaEIAAAAgIGhCAAAAIAkhiIAAAAABoYiAAAAAJIYigAAAAAYGIoAAAAASJIs+r7vpy4BAAAAwPSWU/74er2zUa7rVrPNVunRWrZKjwrZ3eS7bpVkvrdXpUeFbJUerWXHervF26vSo0K2So8K2So9Wru9Md9uLVulR4VslR4Vbm+sHnPPVulRIVulR5Xstnx6BgAAAEASQxEAAAAAA0MRAAAAAEkMRQAAAAAMDEUAAAAAJDEUAQAAADAwFAEAAACQxFAEAAAAwMBQBAAAAECSZNH3fT91CQAAAACmt5zyx9frnY1yXbeabbZKj9ayVXpUyO4m33WrJPO9vSo9KmSr9GgtO9bbLd5elR4VslV6VMhW6dHa7Y35dmvZKj0qZKv0qHB7Y/WYe7ZKjwrZKj2qZLfl0zMAAAAAkhiKAAAAABgYigAAAABIYigCAAAAYGAoAgAAACCJoQgAAACAgaEIAAAAgCSGIgAAAAAGi77v+6lLAAAAADC95ZQ/vl7vbJTrutVss1V6tJat0qNCdjf5rlslme/tVelRIVulR2vZsd5u8faq9KiQrdKjQrZKj9Zub8y3W8tW6VEhW6VHhdsbq8fcs1V6VMhW6VEluy2fngEAAACQxFAEAAAAwMBQBAAAAEASQxEAAAAAA0MRAAAAAEkMRQAAAAAMDEUAAAAAJDEUAQAAADAwFAEAAACQJFn0fd9PXQIAAACA6S2n/PH1emejXNetZput0qO1bJUeFbK7yXfdKsl8b69KjwrZKj1ay471dou3V6VHhWyVHhWyVXq0dntjvt1atkqPCtkqPSrc3lg95p6t0qNCtkqPKtlt+fQMAAAAgCSGIgAAAAAGhiIAAAAAkhiKAAAAABgYigAAAABIYigCAAAAYGAoAgAAACCJoQgAAACAgaEIAAAAgCTJou/7fuoSAAAAAExvOeWPr9c7G+W6bjXbbJUerWWr9KiQ3U2+61ZJ5nt7VXpUyFbp0Vp2rLdbvL0qPSpkq/SokK3So7XbG/Pt1rJVelTIVulR4fbG6jH3bJUeFbJVelTJbsunZwAAAAAkMRQBAAAAMDAUAQAAAJDEUAQAAADAwFAEAAAAQBJDEQAAAAADQxEAAAAASQxFAAAAAAwMRQAAAAAkSRZ93/dTlwAAAABgesspf3y93tko13Wr2War9GgtW6VHhexu8l23SjLf26vSo0K2So/WsmO93eLtVelRIVulR4VslR6t3d6Yb7eWrdKjQrZKjwq3N1aPuWer9KiQrdKjSnZbPj0DAAAAIImhCAAAAICBoQgAAACAJIYiAAAAAAaGIgAAAACSbDkU3Xnnnbn44ovztKc9LU9/+tNz00035fbbb89FF12U8847LxdddFHuuOOOo9UVAAAAgBFtNRS96lWvyhOe8IS8613vyjve8Y488pGPzFVXXZV9+/blwIED2bdvX6666qqj1RUAAACAER3xUPTlL385H/3oR/PsZz87SXLcccfl5JNPzsGDB3PhhRcmSS688MJcf/31R6cpAAAAAKNa9H3fH8l/8FOf+lRe9rKX5YwzzsinP/3p/MAP/EBe+tKX5olPfGI+9rGP3Z8766yz8tGPfvSoFQYAAABgHEf8N4ruu+++fPKTn8xzn/vcXHPNNTnxxBN9ZgYAAADQsOWR/gf37t2bvXv35rGPfWyS5GlPe1quuuqqnHrqqTl06FD27NmTQ4cO5ZRTTvmmb6zXOxv9VtetZput0qO1bJUeFbK7yXfdKsl8b69KjwrZKj1ay471dou3V6VHhWyVHhWyVXq0dntjvt1atkqPC
<remaining base64-encoded PNG data (the final world visualization with robot pose and landmarks) omitted>\n",
592 | "text/plain": [
593 | ""
594 | ]
595 | },
596 | "metadata": {},
597 | "output_type": "display_data"
598 | }
599 | ],
600 | "source": [
601 | "# import the helper function\n",
602 | "from helpers import display_world\n",
603 | "\n",
604 | "# Display the final world!\n",
605 | "\n",
606 | "# define figure size\n",
607 | "plt.rcParams[\"figure.figsize\"] = (20,20)\n",
608 | "\n",
609 | "# check if poses has been created\n",
610 | "if 'poses' in locals():\n",
611 | " # print out the last pose\n",
612 | " print('Last pose: ', poses[-1])\n",
613 | " # display the last position of the robot *and* the landmark positions\n",
614 | " display_world(int(world_size), poses[-1], landmarks)"
615 | ]
616 | },
617 | {
618 | "cell_type": "markdown",
619 | "metadata": {},
620 | "source": [
621 | "## Test Cases\n",
622 | "\n",
623 | "The output should be **close-to or exactly** identical to the given results. If there are minor discrepancies it could be a matter of floating point accuracy or in the calculation of the inverse matrix.\n",
624 | "\n"
625 | ]
626 | },
627 | {
628 | "cell_type": "code",
629 | "execution_count": 298,
630 | "metadata": {},
631 | "outputs": [
632 | {
633 | "name": "stdout",
634 | "output_type": "stream",
635 | "text": [
636 | "\n",
637 | "\n",
638 | "Estimated Poses:\n",
639 | "[50.000, 50.000]\n",
640 | "[37.973, 33.652]\n",
641 | "[26.185, 18.155]\n",
642 | "[13.745, 2.116]\n",
643 | "[28.097, 16.783]\n",
644 | "[42.384, 30.902]\n",
645 | "[55.831, 44.497]\n",
646 | "[70.857, 59.699]\n",
647 | "[85.697, 75.543]\n",
648 | "[74.011, 92.434]\n",
649 | "[53.544, 96.454]\n",
650 | "[34.525, 100.080]\n",
651 | "[48.623, 83.953]\n",
652 | "[60.197, 68.107]\n",
653 | "[73.778, 52.935]\n",
654 | "[87.132, 38.538]\n",
655 | "[80.303, 20.508]\n",
656 | "[72.798, 2.945]\n",
657 | "[55.245, 13.255]\n",
658 | "[37.416, 22.317]\n",
659 | "\n",
660 | "\n",
661 | "Estimated Landmarks:\n",
662 | "[82.956, 13.539]\n",
663 | "[70.495, 74.141]\n",
664 | "[36.740, 61.281]\n",
665 | "[18.698, 66.060]\n",
666 | "[20.635, 16.875]\n"
667 | ]
668 | }
669 | ],
670 | "source": [
671 | "# Here is the data and estimated outputs for test case 1\n",
672 | "\n",
673 | "test_data1 = [[[[1, 19.457599255548065, 23.8387362100849], [2, -13.195807561967236, 11.708840328458608], [3, -30.0954905279171, 15.387879242505843]], [-12.2607279422326, -15.801093326936487]], [[[2, -0.4659930049620491, 28.088559771215664], [4, -17.866382374890936, -16.384904503932]], [-12.2607279422326, -15.801093326936487]], [[[4, -6.202512900833806, -1.823403210274639]], [-12.2607279422326, -15.801093326936487]], [[[4, 7.412136480918645, 15.388585962142429]], [14.008259661173426, 14.274756084260822]], [[[4, -7.526138813444998, -0.4563942429717849]], [14.008259661173426, 14.274756084260822]], [[[2, -6.299793150150058, 29.047830407717623], [4, -21.93551130411791, -13.21956810989039]], [14.008259661173426, 14.274756084260822]], [[[1, 15.796300959032276, 30.65769689694247], [2, -18.64370821983482, 17.380022987031367]], [14.008259661173426, 14.274756084260822]], [[[1, 0.40311325410337906, 14.169429532679855], [2, -35.069349468466235, 2.4945558982439957]], [14.008259661173426, 14.274756084260822]], [[[1, -16.71340983241936, -2.777000269543834]], [-11.006096015782283, 16.699276945166858]], [[[1, -3.611096830835776, -17.954019226763958]], [-19.693482634035977, 3.488085684573048]], [[[1, 18.398273354362416, -22.705102332550947]], [-19.693482634035977, 3.488085684573048]], [[[2, 2.789312482883833, -39.73720193121324]], [12.849049222879723, -15.326510824972983]], [[[1, 21.26897046581808, -10.121029799040915], [2, -11.917698965880655, -23.17711662602097], [3, -31.81167947898398, -16.7985673023331]], [12.849049222879723, -15.326510824972983]], [[[1, 10.48157743234859, 5.692957082575485], [2, -22.31488473554935, -5.389184118551409], [3, -40.81803984305378, -2.4703329790238118]], [12.849049222879723, -15.326510824972983]], [[[0, 10.591050242096598, -39.2051798967113], [1, -3.5675572049297553, 22.849456408289125], [2, -38.39251065320351, 7.288990306029511]], [12.849049222879723, -15.326510824972983]], [[[0, -3.6225556479370766, -25.58006865235512]], [-7.8874682868419965, -18.379005523261092]], [[[0, 1.9784503557879374, -6.5025974151499]], [-7.8874682868419965, -18.379005523261092]], [[[0, 10.050665232782423, 11.026385307998742]], [-17.82919359778298, 9.062000642947142]], [[[0, 26.526838150174818, -0.22563393232425621], [4, -33.70303936886652, 2.880339841013677]], [-17.82919359778298, 9.062000642947142]]]\n",
674 | "\n",
675 | "## Test Case 1\n",
676 | "##\n",
677 | "# Estimated Pose(s):\n",
678 | "# [50.000, 50.000]\n",
679 | "# [37.858, 33.921]\n",
680 | "# [25.905, 18.268]\n",
681 | "# [13.524, 2.224]\n",
682 | "# [27.912, 16.886]\n",
683 | "# [42.250, 30.994]\n",
684 | "# [55.992, 44.886]\n",
685 | "# [70.749, 59.867]\n",
686 | "# [85.371, 75.230]\n",
687 | "# [73.831, 92.354]\n",
688 | "# [53.406, 96.465]\n",
689 | "# [34.370, 100.134]\n",
690 | "# [48.346, 83.952]\n",
691 | "# [60.494, 68.338]\n",
692 | "# [73.648, 53.082]\n",
693 | "# [86.733, 38.197]\n",
694 | "# [79.983, 20.324]\n",
695 | "# [72.515, 2.837]\n",
696 | "# [54.993, 13.221]\n",
697 | "# [37.164, 22.283]\n",
698 | "\n",
699 | "\n",
700 | "# Estimated Landmarks:\n",
701 | "# [82.679, 13.435]\n",
702 | "# [70.417, 74.203]\n",
703 | "# [36.688, 61.431]\n",
704 | "# [18.705, 66.136]\n",
705 | "# [20.437, 16.983]\n",
706 | "\n",
707 | "\n",
708 | "### Uncomment the following three lines for test case 1 and compare the output to the values above ###\n",
709 | "\n",
710 | "mu_1 = slam(test_data1, 20, 5, 100.0, 2.0, 2.0)\n",
711 | "poses, landmarks = get_poses_landmarks(mu_1, 20)\n",
712 | "print_all(poses, landmarks)"
713 | ]
714 | },
715 | {
716 | "cell_type": "code",
717 | "execution_count": 299,
718 | "metadata": {},
719 | "outputs": [
720 | {
721 | "name": "stdout",
722 | "output_type": "stream",
723 | "text": [
724 | "\n",
725 | "\n",
726 | "Estimated Poses:\n",
727 | "[50.000, 50.000]\n",
728 | "[69.181, 45.665]\n",
729 | "[87.743, 39.703]\n",
730 | "[76.270, 56.311]\n",
731 | "[64.317, 72.176]\n",
732 | "[52.257, 88.154]\n",
733 | "[44.059, 69.401]\n",
734 | "[37.002, 49.918]\n",
735 | "[30.924, 30.955]\n",
736 | "[23.508, 11.419]\n",
737 | "[34.180, 27.133]\n",
738 | "[44.155, 43.846]\n",
739 | "[54.806, 60.920]\n",
740 | "[65.698, 78.546]\n",
741 | "[77.468, 95.626]\n",
742 | "[96.802, 98.821]\n",
743 | "[75.957, 99.971]\n",
744 | "[70.200, 81.181]\n",
745 | "[64.054, 61.723]\n",
746 | "[58.107, 42.628]\n",
747 | "\n",
748 | "\n",
749 | "Estimated Landmarks:\n",
750 | "[76.779, 42.887]\n",
751 | "[85.065, 77.438]\n",
752 | "[13.548, 95.652]\n",
753 | "[59.449, 39.595]\n",
754 | "[69.263, 94.240]\n"
755 | ]
756 | }
757 | ],
758 | "source": [
759 | "# Here is the data and estimated outputs for test case 2\n",
760 | "\n",
761 | "test_data2 = [[[[0, 26.543274387283322, -6.262538160312672], [3, 9.937396825799755, -9.128540360867689]], [18.92765331253674, -6.460955043986683]], [[[0, 7.706544739722961, -3.758467215445748], [1, 17.03954411948937, 31.705489938553438], [3, -11.61731288777497, -6.64964096716416]], [18.92765331253674, -6.460955043986683]], [[[0, -12.35130507136378, 2.585119104239249], [1, -2.563534536165313, 38.22159657838369], [3, -26.961236804740935, -0.4802312626141525]], [-11.167066095509824, 16.592065417497455]], [[[0, 1.4138633151721272, -13.912454837810632], [1, 8.087721200818589, 20.51845934354381], [3, -17.091723454402302, -16.521500551709707], [4, -7.414211721400232, 38.09191602674439]], [-11.167066095509824, 16.592065417497455]], [[[0, 12.886743222179561, -28.703968411636318], [1, 21.660953298391387, 3.4912891084614914], [3, -6.401401414569506, -32.321583037341625], [4, 5.034079343639034, 23.102207946092893]], [-11.167066095509824, 16.592065417497455]], [[[1, 31.126317672358578, -10.036784369535214], [2, -38.70878528420893, 7.4987265861424595], [4, 17.977218575473767, 6.150889254289742]], [-6.595520680493778, -18.88118393939265]], [[[1, 41.82460922922086, 7.847527392202475], [3, 15.711709540417502, -30.34633659912818]], [-6.595520680493778, -18.88118393939265]], [[[0, 40.18454208294434, -6.710999804403755], [3, 23.019508919299156, -10.12110867290604]], [-6.595520680493778, -18.88118393939265]], [[[3, 27.18579315312821, 8.067219022708391]], [-6.595520680493778, -18.88118393939265]], [[], [11.492663265706092, 16.36822198838621]], [[[3, 24.57154567653098, 13.461499960708197]], [11.492663265706092, 16.36822198838621]], [[[0, 31.61945290413707, 0.4272295085799329], [3, 16.97392299158991, -5.274596836133088]], [11.492663265706092, 16.36822198838621]], [[[0, 22.407381798735177, -18.03500068379259], [1, 29.642444125196995, 17.3794951934614], [3, 4.7969752441371645, -21.07505361639969], [4, 14.726069092569372, 32.75999422300078]], [11.492663265706092, 16.36822198838621]], [[[0, 10.705527984670137, -34.589764174299596], [1, 18.58772336795603, -0.20109708164787765], [3, -4.839806195049413, -39.92208742305105], [4, 4.18824810165454, 14.146847823548889]], [11.492663265706092, 16.36822198838621]], [[[1, 5.878492140223764, -19.955352450942357], [4, -7.059505455306587, -0.9740849280550585]], [19.628527845173146, 3.83678180657467]], [[[1, -11.150789592446378, -22.736641053247872], [4, -28.832815721158255, -3.9462962046291388]], [-19.841703647091965, 2.5113335861604362]], [[[1, 8.64427397916182, -20.286336970889053], [4, -5.036917727942285, -6.311739993868336]], [-5.946642674882207, -19.09548221169787]], [[[0, 7.151866679283043, -39.56103232616369], [1, 16.01535401373368, -3.780995345194027], [4, -3.04801331832137, 13.697362774960865]], [-5.946642674882207, -19.09548221169787]], [[[0, 12.872879480504395, -19.707592098123207], [1, 22.236710716903136, 16.331770792606406], [3, -4.841206109583004, -21.24604435851242], [4, 4.27111163223552, 32.25309748614184]], [-5.946642674882207, -19.09548221169787]]] \n",
762 | "\n",
763 | "\n",
764 | "## Test Case 2\n",
765 | "##\n",
766 | "# Estimated Pose(s):\n",
767 | "# [50.000, 50.000]\n",
768 | "# [69.035, 45.061]\n",
769 | "# [87.655, 38.971]\n",
770 | "# [76.084, 55.541]\n",
771 | "# [64.283, 71.684]\n",
772 | "# [52.396, 87.887]\n",
773 | "# [44.674, 68.948]\n",
774 | "# [37.532, 49.680]\n",
775 | "# [31.392, 30.893]\n",
776 | "# [24.796, 12.012]\n",
777 | "# [33.641, 26.440]\n",
778 | "# [43.858, 43.560]\n",
779 | "# [54.735, 60.659]\n",
780 | "# [65.884, 77.791]\n",
781 | "# [77.413, 94.554]\n",
782 | "# [96.740, 98.020]\n",
783 | "# [76.149, 99.586]\n",
784 | "# [70.211, 80.580]\n",
785 | "# [64.130, 61.270]\n",
786 | "# [58.183, 42.175]\n",
787 | "\n",
788 | "\n",
789 | "# Estimated Landmarks:\n",
790 | "# [76.777, 42.415]\n",
791 | "# [85.109, 76.850]\n",
792 | "# [13.687, 95.386]\n",
793 | "# [59.488, 39.149]\n",
794 | "# [69.283, 93.654]\n",
795 | "\n",
796 | "\n",
797 | "### Uncomment the following three lines for test case 2 and compare to the values above ###\n",
798 | "\n",
799 | "mu_2 = slam(test_data2, 20, 5, 100.0, 2.0, 2.0)\n",
800 | "poses, landmarks = get_poses_landmarks(mu_2, 20)\n",
801 | "print_all(poses, landmarks)\n"
802 | ]
803 | },
804 | {
805 | "cell_type": "code",
806 | "execution_count": null,
807 | "metadata": {},
808 | "outputs": [],
809 | "source": []
810 | }
811 | ],
812 | "metadata": {
813 | "kernelspec": {
814 | "display_name": "Python 3",
815 | "language": "python",
816 | "name": "python3"
817 | },
818 | "language_info": {
819 | "codemirror_mode": {
820 | "name": "ipython",
821 | "version": 3
822 | },
823 | "file_extension": ".py",
824 | "mimetype": "text/x-python",
825 | "name": "python",
826 | "nbconvert_exporter": "python",
827 | "pygments_lexer": "ipython3",
828 | "version": "3.8.10"
829 | }
830 | },
831 | "nbformat": 4,
832 | "nbformat_minor": 2
833 | }
834 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | Creative Commons Legal Code
2 |
3 | CC0 1.0 Universal
4 |
5 | CREATIVE COMMONS CORPORATION IS NOT A LAW FIRM AND DOES NOT PROVIDE
6 | LEGAL SERVICES. DISTRIBUTION OF THIS DOCUMENT DOES NOT CREATE AN
7 | ATTORNEY-CLIENT RELATIONSHIP. CREATIVE COMMONS PROVIDES THIS
8 | INFORMATION ON AN "AS-IS" BASIS. CREATIVE COMMONS MAKES NO WARRANTIES
9 | REGARDING THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS
10 | PROVIDED HEREUNDER, AND DISCLAIMS LIABILITY FOR DAMAGES RESULTING FROM
11 | THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS PROVIDED
12 | HEREUNDER.
13 |
14 | Statement of Purpose
15 |
16 | The laws of most jurisdictions throughout the world automatically confer
17 | exclusive Copyright and Related Rights (defined below) upon the creator
18 | and subsequent owner(s) (each and all, an "owner") of an original work of
19 | authorship and/or a database (each, a "Work").
20 |
21 | Certain owners wish to permanently relinquish those rights to a Work for
22 | the purpose of contributing to a commons of creative, cultural and
23 | scientific works ("Commons") that the public can reliably and without fear
24 | of later claims of infringement build upon, modify, incorporate in other
25 | works, reuse and redistribute as freely as possible in any form whatsoever
26 | and for any purposes, including without limitation commercial purposes.
27 | These owners may contribute to the Commons to promote the ideal of a free
28 | culture and the further production of creative, cultural and scientific
29 | works, or to gain reputation or greater distribution for their Work in
30 | part through the use and efforts of others.
31 |
32 | For these and/or other purposes and motivations, and without any
33 | expectation of additional consideration or compensation, the person
34 | associating CC0 with a Work (the "Affirmer"), to the extent that he or she
35 | is an owner of Copyright and Related Rights in the Work, voluntarily
36 | elects to apply CC0 to the Work and publicly distribute the Work under its
37 | terms, with knowledge of his or her Copyright and Related Rights in the
38 | Work and the meaning and intended legal effect of CC0 on those rights.
39 |
40 | 1. Copyright and Related Rights. A Work made available under CC0 may be
41 | protected by copyright and related or neighboring rights ("Copyright and
42 | Related Rights"). Copyright and Related Rights include, but are not
43 | limited to, the following:
44 |
45 | i. the right to reproduce, adapt, distribute, perform, display,
46 | communicate, and translate a Work;
47 | ii. moral rights retained by the original author(s) and/or performer(s);
48 | iii. publicity and privacy rights pertaining to a person's image or
49 | likeness depicted in a Work;
50 | iv. rights protecting against unfair competition in regards to a Work,
51 | subject to the limitations in paragraph 4(a), below;
52 | v. rights protecting the extraction, dissemination, use and reuse of data
53 | in a Work;
54 | vi. database rights (such as those arising under Directive 96/9/EC of the
55 | European Parliament and of the Council of 11 March 1996 on the legal
56 | protection of databases, and under any national implementation
57 | thereof, including any amended or successor version of such
58 | directive); and
59 | vii. other similar, equivalent or corresponding rights throughout the
60 | world based on applicable law or treaty, and any national
61 | implementations thereof.
62 |
63 | 2. Waiver. To the greatest extent permitted by, but not in contravention
64 | of, applicable law, Affirmer hereby overtly, fully, permanently,
65 | irrevocably and unconditionally waives, abandons, and surrenders all of
66 | Affirmer's Copyright and Related Rights and associated claims and causes
67 | of action, whether now known or unknown (including existing as well as
68 | future claims and causes of action), in the Work (i) in all territories
69 | worldwide, (ii) for the maximum duration provided by applicable law or
70 | treaty (including future time extensions), (iii) in any current or future
71 | medium and for any number of copies, and (iv) for any purpose whatsoever,
72 | including without limitation commercial, advertising or promotional
73 | purposes (the "Waiver"). Affirmer makes the Waiver for the benefit of each
74 | member of the public at large and to the detriment of Affirmer's heirs and
75 | successors, fully intending that such Waiver shall not be subject to
76 | revocation, rescission, cancellation, termination, or any other legal or
77 | equitable action to disrupt the quiet enjoyment of the Work by the public
78 | as contemplated by Affirmer's express Statement of Purpose.
79 |
80 | 3. Public License Fallback. Should any part of the Waiver for any reason
81 | be judged legally invalid or ineffective under applicable law, then the
82 | Waiver shall be preserved to the maximum extent permitted taking into
83 | account Affirmer's express Statement of Purpose. In addition, to the
84 | extent the Waiver is so judged Affirmer hereby grants to each affected
85 | person a royalty-free, non transferable, non sublicensable, non exclusive,
86 | irrevocable and unconditional license to exercise Affirmer's Copyright and
87 | Related Rights in the Work (i) in all territories worldwide, (ii) for the
88 | maximum duration provided by applicable law or treaty (including future
89 | time extensions), (iii) in any current or future medium and for any number
90 | of copies, and (iv) for any purpose whatsoever, including without
91 | limitation commercial, advertising or promotional purposes (the
92 | "License"). The License shall be deemed effective as of the date CC0 was
93 | applied by Affirmer to the Work. Should any part of the License for any
94 | reason be judged legally invalid or ineffective under applicable law, such
95 | partial invalidity or ineffectiveness shall not invalidate the remainder
96 | of the License, and in such case Affirmer hereby affirms that he or she
97 | will not (i) exercise any of his or her remaining Copyright and Related
98 | Rights in the Work or (ii) assert any associated claims and causes of
99 | action with respect to the Work, in either case contrary to Affirmer's
100 | express Statement of Purpose.
101 |
102 | 4. Limitations and Disclaimers.
103 |
104 | a. No trademark or patent rights held by Affirmer are waived, abandoned,
105 | surrendered, licensed or otherwise affected by this document.
106 | b. Affirmer offers the Work as-is and makes no representations or
107 | warranties of any kind concerning the Work, express, implied,
108 | statutory or otherwise, including without limitation warranties of
109 | title, merchantability, fitness for a particular purpose, non
110 | infringement, or the absence of latent or other defects, accuracy, or
111 | the present or absence of errors, whether or not discoverable, all to
112 | the greatest extent permissible under applicable law.
113 | c. Affirmer disclaims responsibility for clearing rights of other persons
114 | that may apply to the Work or any use thereof, including without
115 | limitation any person's Copyright and Related Rights in the Work.
116 | Further, Affirmer disclaims responsibility for obtaining any necessary
117 | consents, permissions or other rights required for any use of the
118 | Work.
119 | d. Affirmer understands and acknowledges that Creative Commons is not a
120 | party to this document and has no duty or obligation with respect to
121 | this CC0 or use of the Work.
122 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # SLAM on Raspberry Pi
2 |
3 | _Simultaneous Localization and Mapping (SLAM) navigation using an RPLidar A1 on a Raspberry Pi, with remote point cloud visualization over MQTT_
4 |
5 | Project Blog here:
6 | https://medium.com/towards-data-science/indoor-robot-localization-with-slam-f8b447bcb865
7 |
8 | SLAM (Simultaneous Localization And Mapping) algorithms use **LiDAR and IMU data to locate the robot in real time while simultaneously building a coherent map of surrounding landmarks such as buildings, trees, rocks, and other world features.**
9 |
10 | Though a classic chicken-and-egg problem, it has been approximately solved using methods like **Particle Filter, Extended Kalman Filter (EKF), Covariance Intersection, and GraphSLAM**. SLAM enables accurate mapping where GPS localization is unavailable, such as in indoor spaces.
11 |
12 | Unsurprisingly, **SLAM is a core algorithm in robot navigation, autonomous cars, robotic mapping, virtual reality, augmented reality, and more. If we can do robot localization on an RPi, then it is easy to build a moving car or a walking robot that can navigate indoors autonomously.**
13 |
14 | Below you can see the **visualization of the LIDAR point cloud map and the estimated robot trajectory.** The robot moves across different rooms on the floor, as shown in the project demo video further down.
15 |
16 |
17 |
18 |
19 |
20 | First, let's see SLAM in action, and then the theory behind it. SLAM is deployed on a Raspberry Pi with an RPLidar A1 M8, running on a 5V 3A battery. As you can see in the video, **the robot (portable unit) is taken across various rooms of my house, and the real-time trajectory is transmitted to the MQTT server and also stored on the SD card of the RPi.**
21 |
22 | ## Watch Project Demo
23 |
24 | [](https://youtu.be/HrBcQpvx8gg)
25 |
26 |
27 | ## How to use this repo?
28 |
29 | ```
30 | git clone https://github.com/simondlevy/BreezySLAM.git
31 | cd BreezySLAM/examples
32 | sudo python3 setup.py install
33 | ```
34 |
35 | ```
36 | git clone https://github.com/simondlevy/PyRoboViz.git
37 | cd PyRoboViz
38 | sudo python3 setup.py install
39 | ```
40 |
41 | To execute SLAM on Raspberry Pi,
42 | ```
43 | python3 rpslam-thread.py
44 | ```
45 |
46 | The map will be visualized in RPi itself. To enable visualization in a remote system, you can execute the MQTT version,
47 | ```
48 | python3 rpslam-mqtt.py
49 | ```
50 |
51 | **Note:** To enable persistence of the visualization map, replace the **__init__.py** file in the roboviz directory with the one in this repo and execute the PyRoboViz setup script again from the PyRoboViz base directory, using the command below.
52 |
53 | ```
54 | sudo python3 setup.py install
55 | ```
56 |
57 | Before execution, create a directory named 'gif' inside the base directory of the SLAM script so that the map images can be saved there. These images are saved in time sequence, so a gif animation can easily be created from them.
58 |
59 | _If the MQTT visualization is slow, the bytearray transfer might be the bottleneck. You can either connect the RPi to the router with a LAN cable (to improve speed) or reduce the dimensions of the map to shrink the bytearray. Alternatively, you can reduce the MQTT publish frequency._
60 |
61 |
62 | ## Custom Graph SLAM Implementation
63 |
64 | Let's discuss the theory behind GraphSLAM and see a custom implementation. Then we will integrate the RPi with the RPLidar A1 M8, running on a battery, and **do SLAM along with visualization of the LIDAR point cloud map to assist navigation or even to generate a floor map. Finally, the LIDAR point cloud map is visualized on a remote machine using MQTT.**
65 |
66 | See the SLAM gadget below: **a Raspberry Pi integrated with an RPLidar A1 M8, running on a 5V 3A battery.**
67 |
68 |
69 |
70 |
71 |
72 |
73 | ### Graph SLAM Algorithm
74 | - **Assume a robot in a 2D world** tries to move 10 units to the right, from x to x'. Due to motion uncertainty, x' = x + 10 may not hold exactly; instead, **x' will follow a Gaussian centered around x + 10.** The Gaussian peaks when x' approaches x + 10.
75 |
76 | 
77 |
78 | - Robot movement from x0 to x1 to x2 is **characterized by 2 Gaussian functions.** If x1 is 10 units away from x0, **the Kalman Filter models the uncertainty with a Gaussian in (x1 – x0 – 10).** Hence, there is still some probability associated with locations < 10 and > 10 units away.
79 | - There is another similar Gaussian at x2, with a higher spread. **The total probability of the entire route is the product of the two Gaussians.** We can drop the constants, as we just need to maximize the likelihood of the position x1 given x0. Thus **the product of Gaussians becomes a sum of exponent terms, i.e. the constraint contains only the x's and sigma.**
80 | - **Graph SLAM models the constraints as a System of Linear Equations (SLEs), with an Ω matrix containing the coefficients of the variables and a ξ vector containing the limiting values of the constraints.** Every time an observation is made between 2 poses, a 'local addition' is done on the 4 corresponding matrix elements (since the product of Gaussians becomes a sum of exponents).
81 |
82 | Let's say the robot moves from x0 to x1 to x2, which are 5 and -4 units apart, respectively.
83 |
84 | 
85 |
86 | The coefficients of the x's and the RHS values are added to the corresponding cells. Now consider a landmark L0 at a distance of 9 units from x1.
87 |
88 | 
89 |
90 | You need to **update the values in the 2D Ω matrix and ξ vector to account for motion and measurement constraints in both the x and y directions.** You can find the full source code of the custom Graph SLAM implementation 
91 |
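To make the 'local addition' above concrete, here is a minimal 1-D sketch of the Ω/ξ bookkeeping for exactly this example (motions of 5 and -4 units, landmark L0 at 9 units from x1). The `add_constraint` helper is illustrative rather than part of this repo; the notebook implementation runs the same update over both the x and y dimensions.

```
import numpy as np

# Variable ordering: [x0, x1, x2, L0]
omega = np.zeros((4, 4))
xi = np.zeros((4, 1))

# Anchor the graph with the initial position constraint: x0 = 0
omega[0, 0] += 1
xi[0, 0] += 0

def add_constraint(omega, xi, i, j, d):
    # 'Local addition' for one relative constraint: x_j - x_i = d
    omega[i, i] += 1; omega[j, j] += 1
    omega[i, j] -= 1; omega[j, i] -= 1
    xi[i, 0] -= d
    xi[j, 0] += d

add_constraint(omega, xi, 0, 1, 5)    # motion: x1 is 5 units from x0
add_constraint(omega, xi, 1, 2, -4)   # motion: x2 is -4 units from x1
add_constraint(omega, xi, 1, 3, 9)    # measurement: L0 is 9 units from x1

# The best estimate of all poses and landmarks is mu = inverse(omega) * xi
mu = np.linalg.inv(omega) @ xi
print(mu.ravel())    # [ 0.  5.  1. 14.]  -> x0, x1, x2, L0
```
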
92 | Interestingly, we have re-routed the real-time visualization to a remote machine using MQTT. The robot position, angle, and map are encoded as a byte array that is decoded by the MQTT client, as sketched below.
93 |
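On the publishing side (the RPi), the encoding can be sketched as below; the 24-byte pose header (three float64 values) is exactly what `mqtt-slamviz.py` decodes with `np.frombuffer`. The `publish_slam_frame` helper is illustrative, not part of the repo.

```
import numpy as np
import paho.mqtt.client as mqtt

MQTT_SERVER = "test.mosquitto.org"   # the same public broker used by mqtt-slamviz.py
MQTT_TOPIC = "safetycam/topic/slamviz"

client = mqtt.Client()
client.connect(MQTT_SERVER, 1883, 60)

def publish_slam_frame(x_mm, y_mm, theta_deg, mapbytes):
    # First 24 bytes: the pose (x, y, theta) as three float64 values;
    # the rest of the payload is the raw occupancy-map bytearray.
    pose = np.array([x_mm, y_mm, theta_deg], dtype='float64')
    client.publish(MQTT_TOPIC, pose.tobytes() + bytes(mapbytes))
```
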
94 | Note that the dense, linear point clouds in the LIDAR maps represent stable obstacles like walls. Hence, we can **use algorithms like the Hough Transform to find best-fit lines on these linear point clouds and generate floor maps**, as sketched below.
95 |
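As a rough illustration of that idea, the sketch below assumes OpenCV is installed and that obstacles appear as dark pixels in the grayscale occupancy map; the `extract_wall_segments` helper and its threshold values are illustrative and would need tuning on real maps.

```
import cv2
import numpy as np

def extract_wall_segments(map_bytes, map_size_pixels=200):
    # Reshape the raw map bytearray into a grayscale image,
    # the same way mqtt-slamviz.py does before display
    mapimg = np.frombuffer(map_bytes, dtype=np.uint8).reshape(
        map_size_pixels, map_size_pixels)

    # Obstacles (walls) are dark; invert-threshold them to white
    _, walls = cv2.threshold(mapimg, 100, 255, cv2.THRESH_BINARY_INV)

    # Probabilistic Hough transform: fit line segments to the point cloud
    lines = cv2.HoughLinesP(walls, rho=1, theta=np.pi / 180, threshold=50,
                            minLineLength=20, maxLineGap=5)
    return [] if lines is None else [tuple(seg) for seg in lines[:, 0]]
```
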
96 | **We can use this idea of SLAM-based indoor navigation to deploy autonomous mobile robots inside closed environments like airports, warehouses, or industrial plants.**
97 |
98 | If you have any queries or suggestions, you can reach me 
99 |
100 |
101 | ## References
102 | 1. https://github.com/simondlevy/BreezySLAM
103 | 2. https://github.com/simondlevy/PyRoboViz
104 | 3. https://www.udacity.com/course/computer-vision-nanodegree--nd891
105 | 4. https://www.thinkautonomous.ai/blog/?p=lidar-and-camera-sensor-fusion-in-self-driving-cars
106 |
107 |
108 |
--------------------------------------------------------------------------------
/__init__.py:
--------------------------------------------------------------------------------
1 | '''
2 |
3 | This file is a modification of the file below to enable map save
4 | https://github.com/simondlevy/PyRoboViz/blob/master/roboviz/__init__.py
5 |
6 | roboviz.py - Python classes for displaying maps and robots
7 |
8 | Requires: numpy, matplotlib
9 |
10 | Copyright (C) 2018 Simon D. Levy
11 |
12 | This file is part of PyRoboViz.
13 |
14 | PyRoboViz is free software: you can redistribute it and/or modify
15 | it under the terms of the GNU Lesser General Public License as
16 | published by the Free Software Foundation, either version 3 of the
17 | License, or (at your option) any later version.
18 |
19 | PyRoboViz is distributed in the hope that it will be useful,
20 | but WITHOUT ANY WARRANTY; without even the implied warranty of
21 | MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
22 | GNU General Public License for more details.
23 |
24 | '''
25 | # Essential imports
26 | import matplotlib.pyplot as plt
27 | import matplotlib.cm as colormap
28 | import matplotlib.lines as mlines
29 | from mpl_toolkits.mplot3d import Axes3D
30 | import numpy as np
31 | import datetime
32 |
33 | # This helps with Raspberry Pi
34 | import matplotlib
35 | matplotlib.use('TkAgg')
36 |
37 | class Visualizer(object):
38 |
39 | # Robot display params
40 | ROBOT_HEIGHT_M = 0.5
41 | ROBOT_WIDTH_M = 0.3
42 |
43 | def __init__(self, map_size_pixels, map_size_meters, title, show_trajectory=False, zero_angle=0):
44 |
45 | # Put origin in center
46 | self._init(map_size_pixels, map_size_meters, title, -map_size_pixels / 2, show_trajectory, zero_angle)
47 |
48 | def display(self, x_m, y_m, theta_deg):
49 |
50 | self._setPose(x_m, y_m, theta_deg)
51 |
52 | return self._refresh()
53 |
54 | def _init(self, map_size_pixels, map_size_meters, title, shift, show_trajectory=False, zero_angle=0):
55 |
56 | # Store constants for update
57 | self.map_size_meters = map_size_meters
58 | self.map_size_pixels = map_size_pixels
59 | self.map_scale_meters_per_pixel = map_size_meters / float(map_size_pixels)
60 |
61 | # Create a byte array to display the map with a color overlay
62 | self.bgrbytes = bytearray(map_size_pixels * map_size_pixels * 3)
63 |
64 | # Make a nice big (10"x10") figure
65 | fig = plt.figure(figsize=(10,10), facecolor="white")
66 | fig.set_facecolor("white")
67 | # Added this line to make sure the map background is white
68 | plt.rcParams['figure.facecolor'] = 'white'
69 |
70 | # Store Python ID of figure to detect window close
71 | self.figid = id(fig)
72 |
73 | fig.canvas.set_window_title('SLAM')
74 | plt.title(title)
75 |
76 | # Use an "artist" to speed up map drawing
77 | self.img_artist = None
78 |
79 | # No vehicle to show yet
80 | self.vehicle = None
81 |
82 | # Create axes
83 | self.ax = fig.gca()
84 | self.ax.set_xlabel('X (m)')
85 | self.ax.set_ylabel('Y (m)')
86 | # self.ax.grid(False)
87 |
88 | # Relabel the axis ticks to show meters instead of pixels
89 | ticks = np.arange(shift,self.map_size_pixels+shift+100,100)
90 | labels = [str(self.map_scale_meters_per_pixel * tick) for tick in ticks]
91 | self.ax.set_xticklabels(labels)
92 | self.ax.set_yticklabels(labels)
93 |
94 | self.ax.set_facecolor('w')
95 |
96 | # Store previous position for trajectory
97 | self.prevpos = None
98 | self.showtraj = show_trajectory
99 |
100 | # We base the axis on pixels, to support displaying the map
101 | self.ax.set_xlim([shift, self.map_size_pixels+shift])
102 | self.ax.set_ylim([shift, self.map_size_pixels+shift])
103 |
104 | # Set up default shift for centering at origin
105 | shift = -self.map_size_pixels / 2
106 | # print("shift = " + str(shift))
107 |
108 | self.zero_angle = zero_angle
109 | self.start_angle = None
110 | self.rotate_angle = 0
111 |
112 | def _setPose(self, x_m, y_m, theta_deg):
113 | '''
114 | Sets vehicle pose:
115 | X: left/right (m)
116 | Y: forward/back (m)
117 | theta: rotation (degrees)
118 | '''
119 |
120 | # If zero-angle was indicated, grab first angle to compute rotation
121 | if self.start_angle is None and self.zero_angle != 0:
122 | self.start_angle = theta_deg
123 | self.rotate_angle = self.zero_angle - self.start_angle
124 |
125 | # Rotate by computed angle, or zero if no zero-angle indicated
126 | d = self.rotate_angle
127 | a = np.radians(d)
128 | c = np.cos(a)
129 | s = np.sin(a)
130 | x_m,y_m = x_m*c-y_m*s, y_m*c+x_m*s
131 |
132 | # Erase previous vehicle image after first iteration
133 | if not self.vehicle is None:
134 | self.vehicle.remove()
135 |
136 | # Use a very short arrow shaft to orient the head of the arrow
137 | theta_rad = np.radians(theta_deg+d)
138 | c = np.cos(theta_rad)
139 | s = np.sin(theta_rad)
140 | l = 0.1
141 | dx = l * c
142 | dy = l * s
143 |
144 | s = self.map_scale_meters_per_pixel
145 |
146 | self.vehicle=self.ax.arrow(x_m/s, y_m/s,
147 | dx, dy, head_width=Visualizer.ROBOT_WIDTH_M/s,
148 | head_length=Visualizer.ROBOT_HEIGHT_M/s, fc='r', ec='r')
149 |
150 | # Show trajectory if indicated
151 | currpos = self._m2pix(x_m,y_m)
152 | if self.showtraj and not self.prevpos is None:
153 | if (self.prevpos[0] != 0 and self.prevpos[1] != 0):
154 | self.ax.add_line(mlines.Line2D((self.prevpos[0],currpos[0]), (self.prevpos[1],currpos[1])))
155 | self.prevpos = currpos
156 |
157 | def _refresh(self):
158 |
159 | # If we have a new figure, something went wrong (closing figure failed)
160 | if self.figid != id(plt.gcf()):
161 | return False
162 |
163 | # Added this line to make sure the map background is white
164 | plt.rcParams['figure.facecolor'] = 'white'
165 | plt.rcParams['axes.facecolor'] = 'white'
166 | plt.rcParams['savefig.facecolor'] = 'white'
167 |
168 | # Redraw current objects without blocking
169 | plt.draw()
170 | now = datetime.datetime.now()
171 | # Create a directory named 'gif' inside the base directory
172 | plt.savefig('gif/slamMap' + '- ' + str(now.hour).zfill(2) + '- ' + str(now.minute).zfill(2) + '- ' + str(now.second).zfill(2) + '.png')
173 |
174 | # Refresh display, setting flag on window close or keyboard interrupt
175 | try:
176 | plt.pause(.01) # Arbitrary pause to force redraw
177 | return True
178 | except:
179 | return False
180 |
181 | return True
182 |
183 | def _m2pix(self, x_m, y_m):
184 |
185 | s = self.map_scale_meters_per_pixel
186 |
187 | return x_m/s, y_m/s
188 |
189 | class MapVisualizer(Visualizer):
190 |
191 | def __init__(self, map_size_pixels, map_size_meters, title='MapVisualizer', show_trajectory=False):
192 |
193 | # Put origin in lower left; disallow zero-angle setting
194 | Visualizer._init(self, map_size_pixels, map_size_meters, title, 0, show_trajectory, 0)
195 |
196 | def display(self, x_m, y_m, theta_deg, mapbytes):
197 |
198 | self._setPose(x_m, y_m, theta_deg)
199 |
200 | mapimg = np.reshape(np.frombuffer(mapbytes, dtype=np.uint8), (self.map_size_pixels, self.map_size_pixels))
201 |
202 | # Pause to allow display to refresh
203 | plt.pause(.001)
204 |
205 | if self.img_artist is None:
206 | self.img_artist = self.ax.imshow(mapimg, cmap=colormap.gray)
207 |
208 | else:
209 | self.img_artist.set_data(mapimg)
210 |
211 | return self._refresh()
212 |
213 |
--------------------------------------------------------------------------------
/helpers.py:
--------------------------------------------------------------------------------
1 | from robot_class import robot
2 | from math import *
3 | import random
4 | import numpy as np
5 | import matplotlib.pyplot as plt
6 | import seaborn as sns
7 |
8 |
9 | # --------
10 | # this helper function displays the world that a robot is in
11 | # it assumes the world is a square grid of some given size
12 | # and that landmarks is a list of landmark positions(an optional argument)
13 | def display_world(world_size, position, landmarks=None):
14 |
15 | # using seaborn, set background grid to gray
16 | sns.set_style("dark")
17 |
18 | # Plot grid of values
19 | world_grid = np.zeros((world_size+1, world_size+1))
20 |
21 | # Set minor axes in between the labels
22 | ax=plt.gca()
23 | cols = world_size+1
24 | rows = world_size+1
25 |
26 | ax.set_xticks([x for x in range(1,cols)],minor=True )
27 | ax.set_yticks([y for y in range(1,rows)],minor=True)
28 |
29 | # Plot grid on minor axes in gray (width = 1)
30 | plt.grid(which='minor',ls='-',lw=1, color='white')
31 |
32 | # Plot grid on major axes in larger width
33 | plt.grid(which='major',ls='-',lw=2, color='white')
34 |
35 | # Create an 'o' character that represents the robot
36 | # ha = horizontal alignment, va = vertical
37 | ax.text(position[0], position[1], 'o', ha='center', va='center', color='r', fontsize=30)
38 |
39 | # Draw landmarks if they exist
40 | if(landmarks is not None):
41 | # loop through all path indices and draw a dot (unless it's at the car's location)
42 | for pos in landmarks:
43 | if(pos != position):
44 | ax.text(pos[0], pos[1], 'x', ha='center', va='center', color='purple', fontsize=20)
45 |
46 | # Display final result
47 | plt.show()
48 |
49 |
50 | # --------
51 | # this routine makes the robot data
52 | # the data is a list of measurements and movements: [measurements, [dx, dy]]
53 | # collected over a specified number of time steps, N
54 | #
55 | def make_data(N, num_landmarks, world_size, measurement_range, motion_noise,
56 | measurement_noise, distance):
57 |
58 | # check that data has been made
59 | try:
60 | check_for_data(num_landmarks, world_size, measurement_range, motion_noise, measurement_noise)
61 | except ValueError:
62 | print('Error: You must implement the sense function in robot_class.py.')
63 | return []
64 |
65 | complete = False
66 |
67 | r = robot(world_size, measurement_range, motion_noise, measurement_noise)
68 | r.make_landmarks(num_landmarks)
69 |
70 | while not complete:
71 |
72 | data = []
73 |
74 | seen = [False for row in range(num_landmarks)]
75 |
76 | # guess an initial motion
77 | orientation = random.random() * 2.0 * pi
78 | dx = cos(orientation) * distance
79 | dy = sin(orientation) * distance
80 |
81 | for k in range(N-1):
82 |
83 | # collect sensor measurements in a list, Z
84 | Z = r.sense()
85 |
86 | # check off all landmarks that were observed
87 | for i in range(len(Z)):
88 | seen[Z[i][0]] = True
89 |
90 | # move
91 | while not r.move(dx, dy):
92 | # if we'd be leaving the robot world, pick instead a new direction
93 | orientation = random.random() * 2.0 * pi
94 | dx = cos(orientation) * distance
95 | dy = sin(orientation) * distance
96 |
97 | # collect/memorize all sensor and motion data
98 | data.append([Z, [dx, dy]])
99 |
100 | # we are done when all landmarks were observed; otherwise re-run
101 | complete = (sum(seen) == num_landmarks)
102 |
103 | print(' ')
104 | print('Landmarks: ', r.landmarks)
105 | print(r)
106 |
107 |
108 | return data
109 |
110 |
111 | def check_for_data(num_landmarks, world_size, measurement_range, motion_noise, measurement_noise):
112 | # make robot and landmarks
113 | r = robot(world_size, measurement_range, motion_noise, measurement_noise)
114 | r.make_landmarks(num_landmarks)
115 |
116 |
117 | # check that sense has been implemented/data has been made
118 | test_Z = r.sense()
119 | if(test_Z is None):
120 | raise ValueError
121 |
--------------------------------------------------------------------------------
/images/Project 1 Thumbnail_preview.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/AdroitAnandAI/SLAM-on-Raspberry-Pi/0e3023efb5e6b6fd64e12f34deabea45ac7cadf2/images/Project 1 Thumbnail_preview.png
--------------------------------------------------------------------------------
/images/dummy.jpg:
--------------------------------------------------------------------------------
1 |
2 |
--------------------------------------------------------------------------------
/images/gaussian_edited1.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/AdroitAnandAI/SLAM-on-Raspberry-Pi/0e3023efb5e6b6fd64e12f34deabea45ac7cadf2/images/gaussian_edited1.jpg
--------------------------------------------------------------------------------
/images/matrix1_edited1.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/AdroitAnandAI/SLAM-on-Raspberry-Pi/0e3023efb5e6b6fd64e12f34deabea45ac7cadf2/images/matrix1_edited1.png
--------------------------------------------------------------------------------
/images/matrix2_edited1.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/AdroitAnandAI/SLAM-on-Raspberry-Pi/0e3023efb5e6b6fd64e12f34deabea45ac7cadf2/images/matrix2_edited1.jpg
--------------------------------------------------------------------------------
/images/preview.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/AdroitAnandAI/SLAM-on-Raspberry-Pi/0e3023efb5e6b6fd64e12f34deabea45ac7cadf2/images/preview.png
--------------------------------------------------------------------------------
/images/slam.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/AdroitAnandAI/SLAM-on-Raspberry-Pi/0e3023efb5e6b6fd64e12f34deabea45ac7cadf2/images/slam.gif
--------------------------------------------------------------------------------
/images/slam_viz.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/AdroitAnandAI/SLAM-on-Raspberry-Pi/0e3023efb5e6b6fd64e12f34deabea45ac7cadf2/images/slam_viz.gif
--------------------------------------------------------------------------------
/images/thumbnail_preview.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/AdroitAnandAI/SLAM-on-Raspberry-Pi/0e3023efb5e6b6fd64e12f34deabea45ac7cadf2/images/thumbnail_preview.png
--------------------------------------------------------------------------------
/images/video_preview.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/AdroitAnandAI/SLAM-on-Raspberry-Pi/0e3023efb5e6b6fd64e12f34deabea45ac7cadf2/images/video_preview.png
--------------------------------------------------------------------------------
/mqtt-slamviz.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python
2 | import time
3 | from sys import exit
4 | from roboviz import MapVisualizer
5 | import numpy as np
6 |
7 |
8 |
9 | try:
10 | import paho.mqtt.client as mqtt
11 | except ImportError:
12 | exit("This example requires the paho-mqtt module\nInstall with: sudo pip install paho-mqtt")
13 |
14 |
15 | MQTT_SERVER = "test.mosquitto.org"
16 | MQTT_PORT = 1883
17 | MQTT_TOPIC = "safetycam/topic/slamviz"
18 |
19 | # Set these to use authentication
20 | MQTT_USER = None
21 | MQTT_PASS = None
22 |
23 |
24 | MAP_SIZE_PIXELS = 200
25 | MAP_SIZE_METERS = 10
26 | # Set up a SLAM display
27 | viz = MapVisualizer(MAP_SIZE_PIXELS, MAP_SIZE_METERS, 'SLAM', show_trajectory=True)
28 |
29 | print("""
30 | Listening for public MQTT messages from {server} on port {port} to visualize SLAM!
31 |
32 | It will monitor the {topic} topic by default, and decode the bytearray payload.
33 | """.format(
34 | server=MQTT_SERVER,
35 | port=MQTT_PORT,
36 | topic=MQTT_TOPIC
37 | ))
38 |
39 | def on_connect(client, userdata, flags, rc):
40 | print("Connected with result code "+str(rc))
41 |
42 | client.subscribe(MQTT_TOPIC)
43 |
44 | def on_message(client, userdata, msg):
45 |
46 |     robotPos_bytes = msg.payload[:24]   # first 24 bytes: x, y, theta as three float64 values
47 |     map_bytes = msg.payload[24:]        # remaining bytes: the grayscale occupancy map
48 |
49 |     robotPos = np.frombuffer(robotPos_bytes, dtype='float64')
50 |     robotPos = np.array(robotPos)       # copy the read-only buffer into a writable array
51 |
52 |     x, y, theta = robotPos
53 |     viz.display(x / 1000., y / 1000., theta, map_bytes)   # pose arrives in mm; display in meters
54 |
55 | return
56 |
57 |
58 |
59 | client = mqtt.Client()
60 | client.on_connect = on_connect
61 | client.on_message = on_message
62 |
63 | if MQTT_USER is not None and MQTT_PASS is not None:
64 | print("Using username: {un} and password: {pw}".format(un=MQTT_USER, pw="*" * len(MQTT_PASS)))
65 | client.username_pw_set(username=MQTT_USER, password=MQTT_PASS)
66 |
67 | client.connect(MQTT_SERVER, MQTT_PORT, 60)
68 |
69 | client.loop_forever()
--------------------------------------------------------------------------------
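The subscriber above assumes every message is the robot pose (three float64 values, i.e. the first 24 bytes) followed by the raw grayscale map. A minimal sketch, using the same broker and topic, of publishing one synthetic payload to exercise the visualizer (the pose values here are made up):

import numpy as np
import paho.mqtt.client as mqtt

MAP_SIZE_PIXELS = 200   # must match the subscriber's map size

client = mqtt.Client()
client.connect("test.mosquitto.org", 1883, 60)

pose = np.array([5000.0, 5000.0, 0.0])              # x, y, theta in the units slam.getpos() reports
mapbytes = bytearray([127] * MAP_SIZE_PIXELS ** 2)  # uniform mid-gray "unknown" map
client.publish("safetycam/topic/slamviz", pose.tobytes() + mapbytes)
client.loop(2)  # give the network loop a moment to flush the message
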
/robot_class.py:
--------------------------------------------------------------------------------
1 | from math import *
2 | import random
3 |
4 |
5 | ### ------------------------------------- ###
6 | # Below is the robot class
7 | #
8 | # This robot lives in 2D, x-y space, and its motion is
9 | # pointed in a random direction, initially.
10 | # It moves in a straight line until it comes close to a wall
11 | # at which point it stops.
12 | #
13 | # For measurements, it senses the x- and y-distance
14 | # to landmarks. This is different from range and bearing as
15 | # commonly studied in the literature, but this makes it much
16 | # easier to implement the essentials of SLAM without
17 | # cluttered math.
18 | #
19 | class robot:
20 |
21 | # --------
22 | # init:
23 | # creates a robot with the specified parameters and initializes
24 | # the location (self.x, self.y) to the center of the world
25 | #
26 | def __init__(self, world_size = 100.0, measurement_range = 30.0,
27 | motion_noise = 1.0, measurement_noise = 1.0):
28 | self.measurement_noise = 0.0
29 | self.world_size = world_size
30 | self.measurement_range = measurement_range
31 | self.x = world_size / 2.0
32 | self.y = world_size / 2.0
33 | self.motion_noise = motion_noise
34 | self.measurement_noise = measurement_noise
35 | self.landmarks = []
36 | self.num_landmarks = 0
37 |
38 |
39 |     # returns a random float in the range [-1.0, 1.0)
40 | def rand(self):
41 | return random.random() * 2.0 - 1.0
42 |
43 |
44 | # --------
45 |     # move: attempts to move the robot by dx, dy. If the move would leave
46 |     #        the world boundary, it does nothing and returns False
47 | #
48 | def move(self, dx, dy):
49 |
50 | x = self.x + dx + self.rand() * self.motion_noise
51 | y = self.y + dy + self.rand() * self.motion_noise
52 |
53 | if x < 0.0 or x > self.world_size or y < 0.0 or y > self.world_size:
54 | return False
55 | else:
56 | self.x = x
57 | self.y = y
58 | return True
59 |
60 |
61 | # --------
62 | # sense: returns x- and y- distances to landmarks within visibility range
63 | # because not all landmarks may be in this range, the list of measurements
64 | # is of variable length. Set measurement_range to -1 if you want all
65 | # landmarks to be visible at all times
66 | #
67 |
68 |     ## The completed sense function is pasted below;
69 |     ## make sure the indentation of the code is correct
70 |     ## whenever it is copied between the notebook and this file
71 | def sense(self):
72 |         ''' This function does not take in any parameters; instead it references internal variables
73 |             (such as self.landmarks) to measure the distance between the robot and any landmarks
74 | that the robot can see (that are within its measurement range).
75 | This function returns a list of landmark indices, and the measured distances (dx, dy)
76 | between the robot's position and said landmarks.
77 | This function should account for measurement_noise and measurement_range.
78 | One item in the returned list should be in the form: [landmark_index, dx, dy].
79 | '''
80 |
81 | measurements = []
82 |
83 | ## TODO: iterate through all of the landmarks in a world
84 |
85 | ## TODO: For each landmark
86 | ## 1. compute dx and dy, the distances between the robot and the landmark
87 | ## 2. account for measurement noise by *adding* a noise component to dx and dy
88 | ## - The noise component should be a random value between [-1.0, 1.0)*measurement_noise
89 | ## - Feel free to use the function self.rand() to help calculate this noise component
90 | ## - It may help to reference the `move` function for noise calculation
91 | ## 3. If either of the distances, dx or dy, fall outside of the internal var, measurement_range
92 | ## then we cannot record them; if they do fall in the range, then add them to the measurements list
93 | ## as list.append([index, dx, dy]), this format is important for data creation done later
94 |
95 | for i, landmark in enumerate(self.landmarks):
96 | dx, dy = landmark[0] - self.x, landmark[1] - self.y
97 | dx, dy = dx + self.rand() * self.measurement_noise, dy + self.rand() * self.measurement_noise
98 |
99 |             if (self.measurement_range == -1) or (abs(dx) <= self.measurement_range and abs(dy) <= self.measurement_range):
100 | measurements.append([i, dx, dy])
101 |
102 |         ## return the final, complete list of measurements
103 | return measurements
104 |
105 |
106 |
107 | # --------
108 | # make_landmarks:
109 | # make random landmarks located in the world
110 | #
111 | def make_landmarks(self, num_landmarks):
112 | self.landmarks = []
113 | for i in range(num_landmarks):
114 | self.landmarks.append([round(random.random() * self.world_size),
115 | round(random.random() * self.world_size)])
116 | self.num_landmarks = num_landmarks
117 |
118 |
119 |     # called when print(robot) is called; returns the robot's location as a string
120 | def __repr__(self):
121 | return 'Robot: [x=%.5f y=%.5f]' % (self.x, self.y)
122 |
123 |
124 |
125 | ####### END robot class #######
--------------------------------------------------------------------------------
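A small usage sketch for the class above (the parameter values are just the defaults, repeated for clarity):

from robot_class import robot

r = robot(world_size=100.0, measurement_range=30.0,
          motion_noise=1.0, measurement_noise=1.0)
r.make_landmarks(3)

print(r)                  # Robot: [x=50.00000 y=50.00000]
Z = r.sense()             # [[index, dx, dy], ...] for landmarks within range
ok = r.move(1.0, 2.0)     # True on success, False if the move would leave the world
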
/rpslam-mqtt.py:
--------------------------------------------------------------------------------
1 | """
2 |
3 | This file has been modified extensively to implement SLAM with the RPLidar.
4 |
5 | Concepts taken from rpslam.py: BreezySLAM in Python with the SLAMTECH RP A1 Lidar
6 | https://github.com/simondlevy/BreezySLAM
7 |
8 | Consume LIDAR measurement file and create an image for display.
9 |
10 | Adafruit invests time and resources providing this open source code.
11 | Please support Adafruit and open source hardware by purchasing
12 | products from Adafruit!
13 |
14 | Written by Dave Astels for Adafruit Industries
15 | Copyright (c) 2019 Adafruit Industries
16 | Licensed under the MIT license.
17 |
18 | All text above must be included in any redistribution.
19 | """
20 |
21 | import os
22 | from math import cos, sin, pi, floor
23 | # import pygame
24 | from adafruit_rplidar import RPLidar, RPLidarException
25 | import numpy as np
26 | import matplotlib.pyplot as plt
27 | import paho.mqtt.client as mqtt
28 |
29 | from breezyslam.algorithms import RMHC_SLAM
30 | from breezyslam.sensors import RPLidarA1 as LaserModel
31 |
32 |
33 | # Screen width & height
34 | W = 640
35 | H = 480
36 |
37 |
38 | MAP_SIZE_PIXELS = 200
39 | MAP_SIZE_METERS = 10
40 | MIN_SAMPLES = 150
41 |
42 | SCAN_BYTE = b'\x20'
43 | SCAN_TYPE = 129
44 |
45 | slamData = []
46 |
47 | # This is the Publisher
48 | client = mqtt.Client()
49 | client.connect("test.mosquitto.org", 1883, 600)
50 |
51 | # Setup the RPLidar
52 | PORT_NAME = '/dev/ttyUSB0'
53 | lidar = RPLidar(None, PORT_NAME)
54 |
55 | # Create an RMHC SLAM object with a laser model and optional robot model
56 | slam = RMHC_SLAM(LaserModel(), MAP_SIZE_PIXELS, MAP_SIZE_METERS)
57 |
58 | # Initialize an empty trajectory
59 | trajectory = []
60 |
61 | # Initialize empty map
62 | mapbytes = bytearray(MAP_SIZE_PIXELS * MAP_SIZE_PIXELS)
63 |
64 | # used to scale data to fit on the screen
65 | max_distance = 0
66 |
67 | #pylint: disable=redefined-outer-name,global-statement
68 | def process_data(data):  # unused pygame visualization kept from the original rpslam.py; requires pygame and an 'lcd' surface
69 | global max_distance
70 | lcd.fill((0,0,0))
71 | point = ( int(W / 2) , int(H / 2) )
72 |
73 | pygame.draw.circle(lcd,pygame.Color(255, 255, 255),point,10 )
74 | pygame.draw.circle(lcd,pygame.Color(100, 100, 100),point,100 , 1 )
75 | pygame.draw.line( lcd,pygame.Color(100, 100, 100) , ( 0, int(H/2)),( W , int(H/2) ) )
76 | pygame.draw.line( lcd,pygame.Color(100, 100, 100) , ( int(W/2),0),( int(W/2) , H ) )
77 |
78 | for angle in range(360):
79 | distance = data[angle]
80 | if distance > 0: # ignore initially ungathered data points
81 | max_distance = max([min([5000, distance]), max_distance])
82 | radians = angle * pi / 180.0
83 | x = distance * cos(radians)
84 | y = distance * sin(radians)
85 | point = ( int(W / 2) + int(x / max_distance * (W/2)), int(H/2) + int(y / max_distance * (H/2) ))
86 | pygame.draw.circle(lcd,pygame.Color(255, 0, 0),point,2 )
87 | pygame.display.update()
88 |
89 |
90 | scan_data = [0]*360
91 |
92 | def _process_scan(raw):
93 |     '''Processes input raw data and returns measurement data'''
94 | new_scan = bool(raw[0] & 0b1)
95 | inversed_new_scan = bool((raw[0] >> 1) & 0b1)
96 | quality = raw[0] >> 2
97 | if new_scan == inversed_new_scan:
98 | raise RPLidarException('New scan flags mismatch')
99 | check_bit = raw[1] & 0b1
100 | if check_bit != 1:
101 | raise RPLidarException('Check bit not equal to 1')
102 | angle = ((raw[1] >> 1) + (raw[2] << 7)) / 64.
103 | distance = (raw[3] + (raw[4] << 8)) / 4.
104 | return new_scan, quality, angle, distance
105 |
106 | def lidar_measurments(self, max_buf_meas=500):
107 |
108 | lidar.set_pwm(800)
109 | status, error_code = self.health
110 |
111 | cmd = SCAN_BYTE
112 | self._send_cmd(cmd)
113 | dsize, is_single, dtype = self._read_descriptor()
114 | if dsize != 5:
115 | raise RPLidarException('Wrong info reply length')
116 | if is_single:
117 | raise RPLidarException('Not a multiple response mode')
118 | if dtype != SCAN_TYPE:
119 | raise RPLidarException('Wrong response data type')
120 | while True:
121 | raw = self._read_response(dsize)
122 | self.log_bytes('debug', 'Received scan response: ', raw)
123 | if max_buf_meas:
124 | data_in_buf = self._serial_port.in_waiting
125 | if data_in_buf > max_buf_meas*dsize:
126 | self.log('warning',
127 |                          'Too many measurements in the input buffer: %d/%d. '
128 | 'Clearing buffer...' %
129 | (data_in_buf//dsize, max_buf_meas))
130 | self._serial_port.read(data_in_buf//dsize*dsize)
131 | yield _process_scan(raw)
132 |
133 |
134 | def lidar_scans(self, max_buf_meas=800, min_len=100):
135 |
136 | scan = []
137 | iterator = lidar_measurments(lidar,max_buf_meas)
138 | for new_scan, quality, angle, distance in iterator:
139 | if new_scan:
140 | if len(scan) > min_len:
141 | yield scan
142 | scan = []
143 | if quality > 0 and distance > 0:
144 | scan.append((quality, angle, distance))
145 |
146 | try:
147 |
148 | # We will use these to store previous scan in case current scan is inadequate
149 | previous_distances = None
150 | previous_angles = None
151 | scan_count = 0
152 |
153 | for scan in lidar_scans(lidar):
154 |
155 | scan_count += 1
156 |
157 | # Extract (quality, angle, distance) triples from current scan
158 | items = [item for item in scan]
159 |
160 | # Extract distances and angles from triples
161 | distances = [item[2] for item in items]
162 | angles = [item[1] for item in items]
163 |
164 | # Update SLAM with current Lidar scan and scan angles if adequate
165 | if len(distances) > MIN_SAMPLES:
166 | slam.update(distances, scan_angles_degrees=angles)
167 | previous_distances = distances.copy()
168 | previous_angles = angles.copy()
169 |
170 | # If not adequate, use previous
171 | elif previous_distances is not None:
172 | slam.update(previous_distances, scan_angles_degrees=previous_angles)
173 |
174 | # Get current robot position
175 | x, y, theta = slam.getpos()
176 |
177 | # print("x = " + str(x) + " y = " + str(y) + "theta = " + str(theta))
178 |
179 | # Get current map bytes as grayscale
180 | slam.getmap(mapbytes)
181 |
182 | data2Transmit = np.array([x, y, theta])
183 |
184 | if scan_count % 30 == 0:
185 | client.publish("safetycam/topic/slamviz", data2Transmit.tobytes() + mapbytes)
186 |
187 |
188 |
189 | except KeyboardInterrupt:
190 | #print('Stopping the LIDAR SLAM')
191 | raise
192 |
193 | finally:
194 | lidar.stop()
195 | lidar.disconnect()
196 |
--------------------------------------------------------------------------------
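As a worked example of the bit layout `_process_scan` decodes, here is a made-up 5-byte packet pushed through the same arithmetic (values chosen by hand, not captured from a real device):

raw = bytes([0b00111101, 0b10000001, 0x30, 0x10, 0x01])

new_scan = bool(raw[0] & 0b1)                      # bit 0 of byte 0: start-of-scan flag -> True
quality = raw[0] >> 2                              # bits 2..7 of byte 0 -> 15
check_bit = raw[1] & 0b1                           # must be 1 for a valid packet -> 1
angle = ((raw[1] >> 1) + (raw[2] << 7)) / 64.      # 7 + 8 bits, in 1/64 degree -> 97.0
distance = (raw[3] + (raw[4] << 8)) / 4.           # 16 bits, in 1/4 mm -> 68.0 mm
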
/rpslam-thread.py:
--------------------------------------------------------------------------------
1 | """
2 |
3 | This file has been modified extensively to implement SLAM with the RPLidar.
4 | It runs well on a Raspberry Pi 3B or 4.
5 |
6 | Many concepts are taken from rpslam.py: BreezySLAM in Python with the SLAMTECH RP A1 Lidar
7 | https://github.com/simondlevy/BreezySLAM
8 |
9 | Consume LIDAR measurement file and create an image for display.
10 |
11 | Adafruit invests time and resources providing this open source code.
12 | Please support Adafruit and open source hardware by purchasing
13 | products from Adafruit!
14 |
15 | Written by Dave Astels for Adafruit Industries
16 | Copyright (c) 2019 Adafruit Industries
17 | Licensed under the MIT license.
18 |
19 | All text above must be included in any redistribution.
20 | """
21 |
22 | import os
23 | import time
24 | from math import cos, sin, pi, floor
25 | # import pygame
26 | from adafruit_rplidar import RPLidar, RPLidarException
27 | import numpy as np
28 | import matplotlib.pyplot as plt
29 | import paho.mqtt.client as mqtt
30 | from threading import Thread
31 |
32 |
33 | from breezyslam.algorithms import RMHC_SLAM
34 | from breezyslam.sensors import RPLidarA1 as LaserModel
35 | # from rplidar import RPLidar as Lidar
36 | # from adafruit_rplidar import RPLidar as Lidar
37 | from roboviz import MapVisualizer
38 |
39 |
40 |
41 | # Screen width & height
42 | W = 640
43 | H = 480
44 |
45 |
46 | MAP_SIZE_PIXELS = 250
47 | MAP_SIZE_METERS = 15
48 | MIN_SAMPLES = 150
49 |
50 | SCAN_BYTE = b'\x20'
51 | SCAN_TYPE = 129
52 |
53 | slamData = []
54 |
55 | # Setup the RPLidar
56 | PORT_NAME = '/dev/ttyUSB0'
57 | lidar = RPLidar(None, PORT_NAME)
58 |
59 | # Create an RMHC SLAM object with a laser model and optional robot model
60 | slam = RMHC_SLAM(LaserModel(), MAP_SIZE_PIXELS, MAP_SIZE_METERS)
61 |
62 | # Set up a SLAM display
63 | viz = MapVisualizer(MAP_SIZE_PIXELS, MAP_SIZE_METERS, 'SLAM', show_trajectory=True)
64 |
65 | # Initialize an empty trajectory
66 | trajectory = []
67 |
68 | # To exit lidar scan thread gracefully
69 | runThread = True
70 |
71 | # Initialize empty map
72 | mapbytes = bytearray(MAP_SIZE_PIXELS * MAP_SIZE_PIXELS)
73 |
74 | # used to scale data to fit on the screen
75 | max_distance = 0
76 | # x, y, theta = 0, 0, 0
77 |
78 | # Pose will be modified in our threaded code
79 | pose = [0, 0, 0]
80 |
81 | scan_data = [0]*360
82 |
83 | def _process_scan(raw):
84 |     '''Processes input raw data and returns measurement data'''
85 | new_scan = bool(raw[0] & 0b1)
86 | inversed_new_scan = bool((raw[0] >> 1) & 0b1)
87 | quality = raw[0] >> 2
88 | if new_scan == inversed_new_scan:
89 | raise RPLidarException('New scan flags mismatch')
90 | check_bit = raw[1] & 0b1
91 | if check_bit != 1:
92 | raise RPLidarException('Check bit not equal to 1')
93 | angle = ((raw[1] >> 1) + (raw[2] << 7)) / 64.
94 | distance = (raw[3] + (raw[4] << 8)) / 4.
95 | return new_scan, quality, angle, distance
96 |
97 | def lidar_measurments(self, max_buf_meas=500):
98 |
99 | lidar.set_pwm(800)
100 | status, error_code = self.health
101 |
102 | cmd = SCAN_BYTE
103 | self._send_cmd(cmd)
104 | dsize, is_single, dtype = self._read_descriptor()
105 | if dsize != 5:
106 | raise RPLidarException('Wrong info reply length')
107 | if is_single:
108 | raise RPLidarException('Not a multiple response mode')
109 | if dtype != SCAN_TYPE:
110 | raise RPLidarException('Wrong response data type')
111 | while True:
112 | raw = self._read_response(dsize)
113 | self.log_bytes('debug', 'Received scan response: ', raw)
114 | if max_buf_meas:
115 | data_in_buf = self._serial_port.in_waiting
116 | if data_in_buf > max_buf_meas*dsize:
117 | self.log('warning',
118 |                          'Too many measurements in the input buffer: %d/%d. '
119 | 'Clearing buffer...' %
120 | (data_in_buf//dsize, max_buf_meas))
121 | self._serial_port.read(data_in_buf//dsize*dsize)
122 | yield _process_scan(raw)
123 |
124 |
125 | def lidar_scans(self, max_buf_meas=800, min_len=100):
126 |
127 | scan = []
128 | iterator = lidar_measurments(lidar,max_buf_meas)
129 | for new_scan, quality, angle, distance in iterator:
130 | if new_scan:
131 | if len(scan) > min_len:
132 | yield scan
133 | scan = []
134 | if quality > 0 and distance > 0:
135 | scan.append((quality, angle, distance))
136 |
137 |
138 | def slam_compute(pose, mapbytes):
139 |
140 | try:
141 |
142 | # We will use these to store previous scan in case current scan is inadequate
143 | previous_distances = None
144 | previous_angles = None
145 | scan_count = 0
146 |
147 | for scan in lidar_scans(lidar):
148 |
149 | # To stop the thread
150 | if not runThread:
151 | break
152 |
153 | scan_count += 1
154 |
155 | # Extract (quality, angle, distance) triples from current scan
156 | items = [item for item in scan]
157 |
158 | # Extract distances and angles from triples
159 | distances = [item[2] for item in items]
160 | angles = [item[1] for item in items]
161 |
162 | # Update SLAM with current Lidar scan and scan angles if adequate
163 | if len(distances) > MIN_SAMPLES:
164 | slam.update(distances, scan_angles_degrees=angles)
165 | previous_distances = distances.copy()
166 | previous_angles = angles.copy()
167 |
168 | # If not adequate, use previous
169 | elif previous_distances is not None:
170 | slam.update(previous_distances, scan_angles_degrees=previous_angles)
171 |
172 | # Get new position
173 | pose[0], pose[1], pose[2] = slam.getpos()
174 |
175 | # Get current map bytes as grayscale
176 | slam.getmap(mapbytes)
177 |
178 | except KeyboardInterrupt:
179 | lidar.stop()
180 | lidar.disconnect()
181 | raise
182 |
183 |
184 |
185 | # Launch the slam computation thread
186 | thread = Thread(target=slam_compute,
187 | args=(pose, mapbytes))
188 | thread.daemon = True
189 | thread.start()
190 |
191 | try:
192 |     # Loop forever, displaying the current map and pose
193 | while True:
194 |
195 | #print("x = " + str(pose[0]) + " y = " + str(pose[1]) + "theta = " + str(pose[2]))
196 | if not viz.display(pose[0]/1000., pose[1]/1000., pose[2], mapbytes):
197 | raise KeyboardInterrupt
198 |
199 |
200 | except KeyboardInterrupt:
201 | runThread = False
202 | thread.join()
203 | lidar.stop()
204 | lidar.disconnect()
205 | exit(0)
--------------------------------------------------------------------------------
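The script above shares state between threads by mutating a list and a bytearray in place rather than reassigning them, so the display loop always reads the worker's latest values through the same objects. A minimal sketch of the same pattern with a hypothetical worker:

import time
from threading import Thread

pose = [0.0, 0.0, 0.0]
runThread = True

def worker(pose):
    step = 0
    while runThread:               # same graceful-exit flag idea as slam_compute
        step += 1
        pose[0], pose[1], pose[2] = float(step), 2.0 * step, 0.0   # in-place update
        time.sleep(0.05)

thread = Thread(target=worker, args=(pose,), daemon=True)
thread.start()
time.sleep(0.2)
print(pose)        # reflects the worker's most recent in-place update

runThread = False  # ask the worker to stop, then wait for it
thread.join()
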
/slam.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/AdroitAnandAI/SLAM-on-Raspberry-Pi/0e3023efb5e6b6fd64e12f34deabea45ac7cadf2/slam.gif
--------------------------------------------------------------------------------