├── .gitignore
├── Code
│ ├── Fit Data.ipynb
│ ├── Load Data.ipynb
│ ├── Old Code
│ │ ├── __init__.py
│ │ ├── gps_pos_plot.py
│ │ ├── plot-stuff.py
│ │ ├── plot_test.py
│ │ ├── test.py
│ │ └── ublox_script.py
│ ├── Old Notebooks
│ │ ├── GPS Error Classifier.ipynb
│ │ └── GPS Pos Plot.ipynb
│ ├── Port Data.ipynb
│ ├── Untitled0.ipynb
│ ├── Untitled1.ipynb
│ ├── Utils
│ │ ├── __init__.py
│ │ ├── __init__.pyc
│ │ ├── __pycache__
│ │ │ ├── __init__.cpython-33.pyc
│ │ │ ├── ephem_utils.cpython-33.pyc
│ │ │ ├── file_utils.cpython-33.pyc
│ │ │ ├── gpstime.cpython-33.pyc
│ │ │ └── rtklib_utils.cpython-33.pyc
│ │ ├── ephem_utils.py
│ │ ├── file_utils.py
│ │ ├── gpstime.py
│ │ ├── plot_utils.py
│ │ ├── plot_utils.pyc
│ │ ├── rtklib_utils.py
│ │ └── testClassificationModel.py
│ ├── __init__.py
│ └── __pycache__
│   ├── ephem_utils.cpython-33.pyc
│   ├── file_utils.cpython-33.pyc
│   ├── fileutils.cpython-33.pyc
│   ├── gpstime.cpython-33.pyc
│   ├── plot_utils.cpython-33.pyc
│   ├── rtklib_utils.cpython-33.pyc
│   └── testClassificationModel.cpython-33.pyc
├── Docs
│ └── Poster
│   ├── Micro Aerial Vehicle Localization with Real Time Kinematic GPS - small.png
│   └── Micro Aerial Vehicle Localization with Real Time Kinematic GPS.png
└── README.md
/.gitignore:
--------------------------------------------------------------------------------
1 | /.project
2 |
--------------------------------------------------------------------------------
/Code/Load Data.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "Load Data"
4 | },
5 | "nbformat": 3,
6 | "nbformat_minor": 0,
7 | "worksheets": [
8 | {
9 | "cells": [
10 | {
11 | "cell_type": "heading",
12 | "level": 1,
13 | "metadata": {},
14 | "source": [
15 | "Load Data"
16 | ]
17 | },
18 | {
19 | "cell_type": "heading",
20 | "level": 2,
21 | "metadata": {},
22 | "source": [
23 | "Load Data from raw ublox GPS observations"
24 | ]
25 | },
26 | {
27 | "cell_type": "markdown",
28 | "metadata": {},
29 | "source": [
30 | "This notebook is used to convert raw observation data in the form of ublox\u2019s proprietary message format into industry standard RINEX v3.02 format. This is then used to calculate static and kinematic positional solution files."
31 | ]
32 | },
33 | {
34 | "cell_type": "markdown",
35 | "metadata": {},
36 | "source": [
37 | "First we need to import the relevant utilities that contain the necessary functions to operate on the data.\n",
38 | "file_utils is used for navigating and verifying the directories that contains or will contain the observation data.\n",
39 | "rtklib_utils is for calling RTK and DGPS processes to observation. \n"
40 | ]
41 | },
42 | {
43 | "cell_type": "code",
44 | "collapsed": false,
45 | "input": [
46 | "from Utils.file_utils import *\n",
47 | "from Utils.rtklib_utils import *"
48 | ],
49 | "language": "python",
50 | "metadata": {},
51 | "outputs": [],
52 | "prompt_number": 1
53 | },
54 | {
55 | "cell_type": "markdown",
56 | "metadata": {},
57 | "source": [
58 | "Here we define the specific constants required to perform several tasks in the workflow. For downloading and fetching base station data for differential GPS corrections, we need to specify the FTP server address as well as the host path on the server containing ephemeris position data of satellite corrections. The station name of the particular continuously operated reference station is also the specified will be used as a part of the directory path for downloading the relevant observations that are on the same day as the starting observation specified in the converted RINEX files."
59 | ]
60 | },
61 | {
62 | "cell_type": "code",
63 | "collapsed": false,
64 | "input": [
65 | "server = 'ftp.ngs.noaa.gov'\n",
66 | "hostPath = '/cors/rinex/'\n",
67 | "station = 'paap'"
68 | ],
69 | "language": "python",
70 | "metadata": {},
71 | "outputs": [],
72 | "prompt_number": 2
73 | },
74 | {
75 | "cell_type": "markdown",
76 | "metadata": {},
77 | "source": [
78 | "In order to calculate the constellations or position of a given navigation satellites at any time during the observation, we need to specify the URL database containing up-to-date NORAD Two-Line Element Sets that define each satellite trajectory, as well as the name of the downloaded you would like to specify."
79 | ]
80 | },
81 | {
82 | "cell_type": "code",
83 | "collapsed": false,
84 | "input": [
85 | "noradUrl = 'http://www.celestrak.com/NORAD/elements/gps-ops.txt'\n",
86 | "noradFile = 'gps-ops.txt'"
87 | ],
88 | "language": "python",
89 | "metadata": {},
90 | "outputs": [],
91 | "prompt_number": 3
92 | },
93 | {
94 | "cell_type": "markdown",
95 | "metadata": {},
96 | "source": [
97 | "Here we specify the complete path for RTKLIB\u2019s compiled binaries for command user interface programs. Rather on the fact whether these CUI have been defined in your shell environment, path can be made explicitly here so that it is possible to conveniently change or upgrade used versions of RTKLIB"
98 | ]
99 | },
100 | {
101 | "cell_type": "code",
102 | "collapsed": false,
103 | "input": [
104 | "rnx2rtkp = '/home/ruffin/git/RTKLIB/app/rnx2rtkp/gcc/rnx2rtkp'\n",
105 | "pos2kml = '/home/ruffin/git/RTKLIB/app/pos2kml/gcc/pos2kml'\n",
106 | "convbin = '/home/ruffin/git/RTKLIB/app/convbin/gcc/convbin'"
107 | ],
108 | "language": "python",
109 | "metadata": {},
110 | "outputs": [],
111 | "prompt_number": 4
112 | },
113 | {
114 | "cell_type": "markdown",
115 | "metadata": {},
116 | "source": [
117 | "Here we specify the \u201cin\u201d and \u201cout\u201d file path directories. These directories are used to hold multiple observation files for batch operations. For every ublox log file, we will generate a folder of the same name that will contain observations in RINEX format with in \u201cindir\u201d. \u201coutdir\u201d will be used later to store position solutions. We also checked to make sure that both directories exist in their path contains proper syntax."
118 | ]
119 | },
120 | {
121 | "cell_type": "code",
122 | "collapsed": false,
123 | "input": [
124 | "indir = '/home/ruffin/Documents/Data/in/'\n",
125 | "outdir = '/home/ruffin/Documents/Data/out/'\n",
126 | "indir = checkDir(indir,'r')\n",
127 | "outdir = checkDir(outdir,'r')"
128 | ],
129 | "language": "python",
130 | "metadata": {},
131 | "outputs": [],
132 | "prompt_number": 5
133 | },
134 | {
135 | "cell_type": "markdown",
136 | "metadata": {},
137 | "source": [
138 | "Here for every observation file found within the \u201cindir\u201d directory, we convert the log file into RINEX format, then download the relevant CORS data for that day and date."
139 | ]
140 | },
141 | {
142 | "cell_type": "code",
143 | "collapsed": false,
144 | "input": [
145 | "files = findFiles(indir,'.ubx')\n",
146 | "for file in files:\n",
147 | " convbinFile(indir, file, convbin)\n",
148 | " #fetchData(indir, file, server, hostPath, station)"
149 | ],
150 | "language": "python",
151 | "metadata": {},
152 | "outputs": [
153 | {
154 | "output_type": "stream",
155 | "stream": "stdout",
156 | "text": [
157 | "Running \n",
158 | "/home/ruffin/git/RTKLIB/app/convbin/gcc/convbin /home/ruffin/Documents/Data/in/CMU_center/CMU_center.ubx -v 3.02 -od -os -oi -ot -ol\n"
159 | ]
160 | }
161 | ],
162 | "prompt_number": 4
163 | },
164 | {
165 | "cell_type": "markdown",
166 | "metadata": {},
167 | "source": [
168 | "The \u201cindir\u201d directory will now be filled with CORS compressed archives files that will need to be decompressed."
169 | ]
170 | },
171 | {
172 | "cell_type": "code",
173 | "collapsed": false,
174 | "input": [
175 | "decompressData(indir)"
176 | ],
177 | "language": "python",
178 | "metadata": {},
179 | "outputs": [],
180 | "prompt_number": 8
181 | },
182 | {
183 | "cell_type": "markdown",
184 | "metadata": {},
185 | "source": [
186 | "Here we download the necessary NORAD Two-Line Element Sets that define each satellite trajectory."
187 | ]
188 | },
189 | {
190 | "cell_type": "code",
191 | "collapsed": false,
192 | "input": [
193 | "getURL(noradUrl,indir,noradFile)"
194 | ],
195 | "language": "python",
196 | "metadata": {},
197 | "outputs": []
198 | },
199 | {
200 | "cell_type": "markdown",
201 | "metadata": {},
202 | "source": [
203 | "(Optional for training machine learning)\n",
204 | "Here we apply a function that will generate more observation by extrapolating measurements from the recorded data set and applying a series of virtual instructions; a rotating half hemisphere that iterates between every ephemeris of all the observed satellites. This will create additional folders or versions of the original given dataset. Retrieving NORAD Two-Line Element data will be necessary before this step.\n"
205 | ]
206 | },
207 | {
208 | "cell_type": "code",
209 | "collapsed": false,
210 | "input": [
211 | "generateData(indir, noradFile)"
212 | ],
213 | "language": "python",
214 | "metadata": {},
215 | "outputs": [
216 | {
217 | "output_type": "stream",
218 | "stream": "stdout",
219 | "text": [
220 | "reading: /home/ruffin/Documents/Data/in/CMU_center/CMU_center.obs\n",
221 | "['G04', 'G23', 'G13', 'G17', 'G10']"
222 | ]
223 | },
224 | {
225 | "output_type": "stream",
226 | "stream": "stdout",
227 | "text": [
228 | "\n",
229 | "['G23', 'G13', 'G17', 'G10', 'G05']"
230 | ]
231 | },
232 | {
233 | "output_type": "stream",
234 | "stream": "stdout",
235 | "text": [
236 | "\n",
237 | "['G13', 'G17', 'G10', 'G05']"
238 | ]
239 | },
240 | {
241 | "output_type": "stream",
242 | "stream": "stdout",
243 | "text": [
244 | "\n",
245 | "['G17', 'G10', 'G05']"
246 | ]
247 | },
248 | {
249 | "output_type": "stream",
250 | "stream": "stdout",
251 | "text": [
252 | "\n",
253 | "['G10', 'G05', 'G12', 'G02']"
254 | ]
255 | },
256 | {
257 | "output_type": "stream",
258 | "stream": "stdout",
259 | "text": [
260 | "\n",
261 | "['G05', 'G12', 'G02', 'G25']"
262 | ]
263 | },
264 | {
265 | "output_type": "stream",
266 | "stream": "stdout",
267 | "text": [
268 | "\n",
269 | "['G04', 'G23', 'G12', 'G02', 'G25']"
270 | ]
271 | },
272 | {
273 | "output_type": "stream",
274 | "stream": "stdout",
275 | "text": [
276 | "\n",
277 | "['G04', 'G23', 'G13', 'G02', 'G25']"
278 | ]
279 | },
280 | {
281 | "output_type": "stream",
282 | "stream": "stdout",
283 | "text": [
284 | "\n",
285 | "['G04', 'G23', 'G13', 'G17', 'G25']"
286 | ]
287 | },
288 | {
289 | "output_type": "stream",
290 | "stream": "stdout",
291 | "text": [
292 | "\n"
293 | ]
294 | }
295 | ],
296 | "prompt_number": 9
297 | },
298 | {
299 | "cell_type": "markdown",
300 | "metadata": {},
301 | "source": [
302 | "Here we will iterate through every folder we've created, being careful we process in numerical order. This is where positional solutions are solved for and saved in to \u201coutdir\u201d, using the combination of observations from both the rover and base station. Static and kinematic solutions are generated for each unique observation. Where duplicated versions used for machine mining; kinematic pose solutions are also generated but static pose solutions are simply duplicated from the original observations folder to reduce competition time and provide the most accurate ground truth for error estimation in static points."
303 | ]
304 | },
305 | {
306 | "cell_type": "code",
307 | "collapsed": false,
308 | "input": [
309 | "folders = findFolders(indir)\n",
310 | "folders.sort(key=lambda x: x.lower())\n",
311 | "for folder in folders:\n",
312 | " print(folder)\n",
313 | " rnx2rtkpFile(indir, folder, outdir, station, rnx2rtkp)\n",
314 | " #pos2kmlFile(outdir, folder, outdir, pos2kml)\n",
315 | " "
316 | ],
317 | "language": "python",
318 | "metadata": {},
319 | "outputs": [
320 | {
321 | "output_type": "stream",
322 | "stream": "stdout",
323 | "text": [
324 | "CMU_center\n",
325 | "CMU_center\n",
326 | "\n",
327 | "Running \n",
328 | "/home/ruffin/git/RTKLIB/app/rnx2rtkp/gcc/rnx2rtkp -k /home/ruffin/Documents/Data/in/rtkoptions_static.conf -o /home/ruffin/Documents/Data/out/CMU_center/CMU_center_static.pos /home/ruffin/Documents/Data/in/CMU_center/CMU_center.obs /home/ruffin/Documents/Data/in/paap2090.13o /home/ruffin/Documents/Data/in/CMU_center/CMU_center.nav /home/ruffin/Documents/Data/in/igr17510.sp3 /home/ruffin/Documents/Data/in/CMU_center/CMU_center.sbs\n",
329 | "\n",
330 | "Running "
331 | ]
332 | },
333 | {
334 | "output_type": "stream",
335 | "stream": "stdout",
336 | "text": [
337 | "\n",
338 | "/home/ruffin/git/RTKLIB/app/rnx2rtkp/gcc/rnx2rtkp -k /home/ruffin/Documents/Data/in/rtkoptions_kinetic.conf -o /home/ruffin/Documents/Data/out/CMU_center/CMU_center_kinetic.pos /home/ruffin/Documents/Data/in/CMU_center/CMU_center.obs /home/ruffin/Documents/Data/in/paap2090.13o /home/ruffin/Documents/Data/in/CMU_center/CMU_center.nav /home/ruffin/Documents/Data/in/igr17510.sp3 /home/ruffin/Documents/Data/in/CMU_center/CMU_center.sbs\n",
339 | "CMU_center.v0"
340 | ]
341 | },
342 | {
343 | "output_type": "stream",
344 | "stream": "stdout",
345 | "text": [
346 | "\n",
347 | "CMU_center.v0\n",
348 | "Copy static from original\n",
349 | "\n",
350 | "Running \n",
351 | "/home/ruffin/git/RTKLIB/app/rnx2rtkp/gcc/rnx2rtkp -k /home/ruffin/Documents/Data/in/rtkoptions_kinetic.conf -o /home/ruffin/Documents/Data/out/CMU_center.v0/CMU_center.v0_kinetic.pos /home/ruffin/Documents/Data/in/CMU_center.v0/CMU_center.v0.obs /home/ruffin/Documents/Data/in/paap2090.13o /home/ruffin/Documents/Data/in/CMU_center.v0/CMU_center.v0.nav /home/ruffin/Documents/Data/in/igr17510.sp3 /home/ruffin/Documents/Data/in/CMU_center.v0/CMU_center.v0.sbs\n",
352 | "CMU_center.v1"
353 | ]
354 | },
355 | {
356 | "output_type": "stream",
357 | "stream": "stdout",
358 | "text": [
359 | "\n",
360 | "CMU_center.v1\n",
361 | "Copy static from original\n",
362 | "\n",
363 | "Running \n",
364 | "/home/ruffin/git/RTKLIB/app/rnx2rtkp/gcc/rnx2rtkp -k /home/ruffin/Documents/Data/in/rtkoptions_kinetic.conf -o /home/ruffin/Documents/Data/out/CMU_center.v1/CMU_center.v1_kinetic.pos /home/ruffin/Documents/Data/in/CMU_center.v1/CMU_center.v1.obs /home/ruffin/Documents/Data/in/paap2090.13o /home/ruffin/Documents/Data/in/CMU_center.v1/CMU_center.v1.nav /home/ruffin/Documents/Data/in/igr17510.sp3 /home/ruffin/Documents/Data/in/CMU_center.v1/CMU_center.v1.sbs\n",
365 | "CMU_center.v2"
366 | ]
367 | },
368 | {
369 | "output_type": "stream",
370 | "stream": "stdout",
371 | "text": [
372 | "\n",
373 | "CMU_center.v2\n",
374 | "Copy static from original\n",
375 | "\n",
376 | "Running \n",
377 | "/home/ruffin/git/RTKLIB/app/rnx2rtkp/gcc/rnx2rtkp -k /home/ruffin/Documents/Data/in/rtkoptions_kinetic.conf -o /home/ruffin/Documents/Data/out/CMU_center.v2/CMU_center.v2_kinetic.pos /home/ruffin/Documents/Data/in/CMU_center.v2/CMU_center.v2.obs /home/ruffin/Documents/Data/in/paap2090.13o /home/ruffin/Documents/Data/in/CMU_center.v2/CMU_center.v2.nav /home/ruffin/Documents/Data/in/igr17510.sp3 /home/ruffin/Documents/Data/in/CMU_center.v2/CMU_center.v2.sbs\n",
378 | "CMU_center.v3"
379 | ]
380 | },
381 | {
382 | "output_type": "stream",
383 | "stream": "stdout",
384 | "text": [
385 | "\n",
386 | "CMU_center.v3\n",
387 | "Copy static from original\n",
388 | "\n",
389 | "Running \n",
390 | "/home/ruffin/git/RTKLIB/app/rnx2rtkp/gcc/rnx2rtkp -k /home/ruffin/Documents/Data/in/rtkoptions_kinetic.conf -o /home/ruffin/Documents/Data/out/CMU_center.v3/CMU_center.v3_kinetic.pos /home/ruffin/Documents/Data/in/CMU_center.v3/CMU_center.v3.obs /home/ruffin/Documents/Data/in/paap2090.13o /home/ruffin/Documents/Data/in/CMU_center.v3/CMU_center.v3.nav /home/ruffin/Documents/Data/in/igr17510.sp3 /home/ruffin/Documents/Data/in/CMU_center.v3/CMU_center.v3.sbs\n",
391 | "CMU_center.v4"
392 | ]
393 | },
394 | {
395 | "output_type": "stream",
396 | "stream": "stdout",
397 | "text": [
398 | "\n",
399 | "CMU_center.v4\n",
400 | "Copy static from original\n",
401 | "\n",
402 | "Running \n",
403 | "/home/ruffin/git/RTKLIB/app/rnx2rtkp/gcc/rnx2rtkp -k /home/ruffin/Documents/Data/in/rtkoptions_kinetic.conf -o /home/ruffin/Documents/Data/out/CMU_center.v4/CMU_center.v4_kinetic.pos /home/ruffin/Documents/Data/in/CMU_center.v4/CMU_center.v4.obs /home/ruffin/Documents/Data/in/paap2090.13o /home/ruffin/Documents/Data/in/CMU_center.v4/CMU_center.v4.nav /home/ruffin/Documents/Data/in/igr17510.sp3 /home/ruffin/Documents/Data/in/CMU_center.v4/CMU_center.v4.sbs\n",
404 | "CMU_center.v5"
405 | ]
406 | },
407 | {
408 | "output_type": "stream",
409 | "stream": "stdout",
410 | "text": [
411 | "\n",
412 | "CMU_center.v5\n",
413 | "Copy static from original\n",
414 | "\n",
415 | "Running \n",
416 | "/home/ruffin/git/RTKLIB/app/rnx2rtkp/gcc/rnx2rtkp -k /home/ruffin/Documents/Data/in/rtkoptions_kinetic.conf -o /home/ruffin/Documents/Data/out/CMU_center.v5/CMU_center.v5_kinetic.pos /home/ruffin/Documents/Data/in/CMU_center.v5/CMU_center.v5.obs /home/ruffin/Documents/Data/in/paap2090.13o /home/ruffin/Documents/Data/in/CMU_center.v5/CMU_center.v5.nav /home/ruffin/Documents/Data/in/igr17510.sp3 /home/ruffin/Documents/Data/in/CMU_center.v5/CMU_center.v5.sbs\n",
417 | "CMU_center.v6"
418 | ]
419 | },
420 | {
421 | "output_type": "stream",
422 | "stream": "stdout",
423 | "text": [
424 | "\n",
425 | "CMU_center.v6\n",
426 | "Copy static from original\n",
427 | "\n",
428 | "Running \n",
429 | "/home/ruffin/git/RTKLIB/app/rnx2rtkp/gcc/rnx2rtkp -k /home/ruffin/Documents/Data/in/rtkoptions_kinetic.conf -o /home/ruffin/Documents/Data/out/CMU_center.v6/CMU_center.v6_kinetic.pos /home/ruffin/Documents/Data/in/CMU_center.v6/CMU_center.v6.obs /home/ruffin/Documents/Data/in/paap2090.13o /home/ruffin/Documents/Data/in/CMU_center.v6/CMU_center.v6.nav /home/ruffin/Documents/Data/in/igr17510.sp3 /home/ruffin/Documents/Data/in/CMU_center.v6/CMU_center.v6.sbs\n",
430 | "CMU_center.v7"
431 | ]
432 | },
433 | {
434 | "output_type": "stream",
435 | "stream": "stdout",
436 | "text": [
437 | "\n",
438 | "CMU_center.v7\n",
439 | "Copy static from original\n",
440 | "\n",
441 | "Running \n",
442 | "/home/ruffin/git/RTKLIB/app/rnx2rtkp/gcc/rnx2rtkp -k /home/ruffin/Documents/Data/in/rtkoptions_kinetic.conf -o /home/ruffin/Documents/Data/out/CMU_center.v7/CMU_center.v7_kinetic.pos /home/ruffin/Documents/Data/in/CMU_center.v7/CMU_center.v7.obs /home/ruffin/Documents/Data/in/paap2090.13o /home/ruffin/Documents/Data/in/CMU_center.v7/CMU_center.v7.nav /home/ruffin/Documents/Data/in/igr17510.sp3 /home/ruffin/Documents/Data/in/CMU_center.v7/CMU_center.v7.sbs\n",
443 | "CMU_center.v8"
444 | ]
445 | },
446 | {
447 | "output_type": "stream",
448 | "stream": "stdout",
449 | "text": [
450 | "\n",
451 | "CMU_center.v8\n",
452 | "Copy static from original\n",
453 | "\n",
454 | "Running \n",
455 | "/home/ruffin/git/RTKLIB/app/rnx2rtkp/gcc/rnx2rtkp -k /home/ruffin/Documents/Data/in/rtkoptions_kinetic.conf -o /home/ruffin/Documents/Data/out/CMU_center.v8/CMU_center.v8_kinetic.pos /home/ruffin/Documents/Data/in/CMU_center.v8/CMU_center.v8.obs /home/ruffin/Documents/Data/in/paap2090.13o /home/ruffin/Documents/Data/in/CMU_center.v8/CMU_center.v8.nav /home/ruffin/Documents/Data/in/igr17510.sp3 /home/ruffin/Documents/Data/in/CMU_center.v8/CMU_center.v8.sbs\n"
456 | ]
457 | }
458 | ],
459 | "prompt_number": 12
460 | }
461 | ],
462 | "metadata": {}
463 | }
464 | ]
465 | }
--------------------------------------------------------------------------------
/Code/Old Code/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ruffsl/RTKLIB-Tools/9c2fd3e149efc23c4252650085cacae54bb104f1/Code/Old Code/__init__.py
--------------------------------------------------------------------------------
/Code/Old Code/gps_pos_plot.py:
--------------------------------------------------------------------------------
1 | '''
2 | Created on Jun 26, 2013
3 |
4 | @author: ruffin
5 | '''
6 |
7 | if __name__ == '__main__':
8 | pass
9 | # from pylab import *
10 | from matplotlib import pyplot, mpl
11 | import numpy as np
12 | from geopy import point, distance
13 | from plot_utils import *
14 | from file_utils import *
15 |
16 | # str = input("Enter your input: ");
17 | # print("Received input is : ", str)
18 |
19 | #------------------------------------------------------------------------------
20 | # File Directory
21 | #------------------------------------------------------------------------------
22 | indir = '/home/ruffin/Documents/Data/in/'
23 | outdir = '/home/ruffin/Documents/Data/out/'
24 | # reference = point.Point(40.442635758, -79.943065017, 257)
25 | reference = point.Point(40.443874, -79.945517, 272)
26 |
27 | #------------------------------------------------------------------------------
28 | # Check output directory can be dumped to
29 | #------------------------------------------------------------------------------
30 | indir = checkDir(indir,'r')
31 | outdir = checkDir(outdir,'r')
32 |
33 | #------------------------------------------------------------------------------
34 | # Get log gps file name
35 | #------------------------------------------------------------------------------
36 | posFile = findFile(outdir,'.pos')
37 |
38 | #------------------------------------------------------------------------------
39 | # Parse data from log file
40 | #------------------------------------------------------------------------------
41 | data = parsePosFile(posFile)
42 |
43 |
44 | #------------------------------------------------------------------------------
45 | # Compute offsets and distances from the reference point
46 | #------------------------------------------------------------------------------
47 | print('Interpolating Data')
48 | data = np.delete(data, 0, 0)
49 | data = data.T
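  | # .pos columns after the timestamp: lat, lon, height, Q, ns, sdn, sde, sdu, sdne, sdeu, sdun, age, ratio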
50 | tdate = data[0]-data[0][0]
51 | lat = data[1]
52 | lon = data[2]
53 | elv = data[3]
54 | q = data[4]
55 | ns = data[5]
56 | sdn = data[6]
57 | sde = data[7]
58 | sdu = data[8]
59 | sdne = data[9]
60 | sdeu = data[10]
61 | sdun = data[11]
62 | age = data[12]
63 | ratio = data[13]
64 | dist = np.array([])
65 | distn = np.array([])
66 | diste = np.array([])
67 | distu = np.array([])
68 | distne = np.array([])
69 | disteu = np.array([])
70 | distun = np.array([])
71 | d = distance.distance
72 | for i in data.T:
73 | j = point.Point(i[1],i[2])
74 | k = point.Point(reference.latitude,i[2])
75 | l = point.Point(i[1],reference.longitude)
76 |
77 | j = d(reference, j).meters
78 | k = d(reference, k).meters*np.sign(i[1]-reference.latitude)
79 | l = d(reference, l).meters*np.sign(i[2]-reference.longitude)
80 | m = (i[3] - reference.altitude)
81 |
82 | n = np.core.sqrt(k**2+l**2)*np.sign(k)
83 | o = np.core.sqrt(l**2+m**2)*np.sign(l)
84 | p = np.core.sqrt(m**2+k**2)*np.sign(m)
85 |
86 | dist = np.append(dist, [j], axis=0)
87 | distn = np.append(distn, [k], axis=0)
88 | diste = np.append(diste, [l], axis=0)
89 | distu = np.append(distu, [m], axis=0)
90 | distne = np.append(distne, [n], axis=0)
91 | disteu = np.append(disteu, [o], axis=0)
92 | distun = np.append(distun, [p], axis=0)
93 |
94 | sd = ['sdn','sde','sdu','sdne','sdeu','sdun']
95 | sdd = ['distn','diste','distu','distne','disteu','distun']
96 | sddist = [distn,diste,distu,distne,disteu,distun]
97 |
98 | color = colors()
99 | latmin = lat.min()
100 | latmax = lat.max()
101 | lonmin = lon.min()
102 | lonmax = lon.max()
103 |
104 | latdif = latmax - latmin
105 | londif = lonmax - lonmin
106 |
107 | latminlim = lat.min() - latdif*0.1
108 | latmaxlim = lat.max() + latdif*0.1
109 | lonminlim = lon.min() - londif*0.1
110 | lonmaxlim = lon.max() + londif*0.1
111 |
112 | distnmin = distn.min()
113 | distnmax = distn.max()
114 | distemin = diste.min()
115 | distemax = diste.max()
116 |
117 | distndif = distnmax - distnmin
118 | distedif = distemax - distemin
119 |
120 | distnminlim = distn.min() - distndif*0.1
121 | distnmaxlim = distn.max() + distndif*0.1
122 | disteminlim = diste.min() - distedif*0.1
123 | distemaxlim = diste.max() + distedif*0.1
124 |
125 | distmax = dist.max()
126 | distmin = dist.min()
127 | distnorm = mpl.colors.Normalize(vmin=-distmax, vmax=0)
128 |
129 | #------------------------------------------------------------------------------
130 | # Plot data
131 | #------------------------------------------------------------------------------
132 | print('Generating Fig 1')
133 | # Create a new figure of size 10x6 points, using 80 dots per inch
134 | fig1 = pyplot.figure(figsize=(10,6), dpi=80)
135 | fig1.suptitle('Dist and Standard Deviation vs Time', fontsize=14, fontweight='bold')
136 | ax = fig1.add_subplot(1,1,1)
137 |
138 | # Make plots
139 | for i in range(6):
140 | sdx = data[i+6]
141 | ax.plot(tdate, sdx, label=sd[i])
142 |
143 | ax.plot(tdate,dist, label='dist')
144 | ax.grid(True)
145 | ax.set_ylabel('Meters')
146 | ax.set_xlabel('Seconds')
147 |
148 | # Make legend
149 | pyplot.legend(loc='upper left')
150 |
151 | # Save figure using 72 dots per inch
152 | # pyplot.savefig("exercice_2.png",dpi=72)
153 |
154 |
155 | print('Generating Fig 2')
156 | fig2 = pyplot.figure(figsize=(10,6), dpi=80)
157 | fig2.suptitle('Correlation of Standard Distribution vs Distance', fontsize=14, fontweight='bold')
158 |
159 | for i in range(6):
160 | sdx = data[i+6]
161 | color = -np.core.sqrt(sdx**2+sddist[i]**2)
162 | ax = fig2.add_subplot(2,3,i+1)
163 | ax.scatter(sdx,sddist[i], c=color, alpha=.2)
164 | ax.set_title(sd[i]+' vs ' +sdd[i])
165 | ax.set_ylabel(sdd[i]+' (m)')
166 | ax.set_xlabel(sd[i]+' (m)')
167 | ax.grid(True)
168 | print('Cross-correlation: ' + sd[i]+' vs ' +sdd[i])
169 | print(np.correlate(sdx, sddist[i]))
170 |
171 | # # Show result on screen
172 | # pyplot.show()
173 |
174 |
175 |
176 | print('Generating Fig 3')
177 | fig3 = pyplot.figure(figsize=(10,6), dpi=80)
178 | fig3.suptitle('Position and Standard Distribution', fontsize=14, fontweight='bold')
179 |
180 | sd = ['sdn','sde','sdu','sdne','sdeu','sdun']
181 |
182 | for i in range(6):
183 | sdx = data[i+6]
184 | ax = fig3.add_subplot(2, 3, i+1)
185 | p = ax.scatter(distn, diste, c=-abs(sdx), alpha=.2)
186 | fig3.colorbar(p,norm=distnorm)
187 | pyplot.xlim(distnminlim, distnmaxlim)
188 | pyplot.ylim(disteminlim, distemaxlim)
189 | ax.set_title('Pos w/ ' + sd[i])
190 | ax.set_ylabel('North')
191 | ax.set_xlabel('East')
192 | ax.get_xaxis().set_ticks([])
193 | ax.get_yaxis().set_ticks([])
194 | ax.grid(True)
195 |
196 | # 3D stuff
197 | # for i in range(1):
198 | # sdx = data[i+6]
199 | # ax = fig3.add_subplot(1, 1, i+1, projection='3d')
200 | #
201 | # ax.scatter(lat,lon, sdx, c=dist, edgecolors = 'none')
202 | # xlim(latminlim, latmaxlim)
203 | # ylim(lonminlim, lonmaxlim)
204 | # ax.set_title('Pos w/ ' + sd[i])
205 | # ax.set_xlabel('lat (deg)')
206 | # ax.set_ylabel('lon (deg)')
207 | # ax.set_ylabel(sd[i] +' (m)')
208 |
209 |
210 | # Show result on screen
211 |
212 | print('Generating Fig 4')
213 | fig4 = pyplot.figure(figsize=(10,6), dpi=80)
214 | fig4.suptitle('Position and Standard Distribution', fontsize=14, fontweight='bold')
215 | ax = fig4.add_subplot(1, 1, 1)
216 | p = ax.scatter(distn,diste, c=-dist, alpha=.2)
217 | fig4.colorbar(p, norm=distnorm)
218 | pyplot.xlim(distnminlim, distnmaxlim)
219 | pyplot.ylim(disteminlim, distemaxlim)
220 | ax.set_xlabel('East (meters)')
221 | ax.set_ylabel('North (meters)')
222 | ax.grid(True)
223 |
224 |
225 | pyplot.show()
--------------------------------------------------------------------------------
/Code/Old Code/plot-stuff.py:
--------------------------------------------------------------------------------
1 | '''
2 | Created on Jun 26, 2013
3 |
4 | @author: ruffin
5 | '''
6 |
7 | if __name__ == '__main__':
8 | pass
9 |
10 | # Import everything from matplotlib (numpy is accessible via 'np' alias)
11 | from pylab import *
12 |
13 | # Create a new figure of size 8x6 points, using 80 dots per inch
14 | # figure(figsize=(8,6), dpi=80)
15 | figure(figsize=(10,6), dpi=80)
16 |
17 | # Create a new subplot from a grid of 1x1
18 | subplot(1,1,1)
19 |
20 | X = np.linspace(-np.pi, np.pi, 256,endpoint=True)
21 | C,S = np.cos(X), np.sin(X)
22 |
23 | # Plot cosine using blue color with a continuous line of width 1 (pixels)
24 | # plot(X, C, color="blue", linewidth=1.0, linestyle="-")
25 | # Plot sine using green color with a continuous line of width 1 (pixels)
26 | # plot(X, S, color="green", linewidth=1.0, linestyle="--")
27 | plot(X, C, color="blue", linewidth=2.5, linestyle="-")
28 | plot(X, S, color="red", linewidth=2.5, linestyle="-")
29 |
30 | # Set x limits
31 | # xlim(-4.0,4.0)
32 | # Set y limits
33 | # ylim(-1.0,1.0)
34 | xlim(X.min()*1.1, X.max()*1.1)
35 | ylim(C.min()*1.1, C.max()*1.1)
36 |
37 | ax = gca()
38 | ax.spines['right'].set_color('none')
39 | ax.spines['top'].set_color('none')
40 | ax.xaxis.set_ticks_position('bottom')
41 | ax.spines['bottom'].set_position(('data',0))
42 | ax.yaxis.set_ticks_position('left')
43 | ax.spines['left'].set_position(('data',0))
44 |
45 | # Set x ticks
46 | # xticks(np.linspace(-4,4,9,endpoint=True))
47 | # Set y ticks
48 | # yticks(np.linspace(-1,1,5,endpoint=True))
49 | # xticks( [-np.pi, -np.pi/2, 0, np.pi/2, np.pi])
50 | # yticks([-1, 0, +1])
51 | xticks([-np.pi, -np.pi/2, 0, np.pi/2, np.pi],
52 | [r'$-\pi$', r'$-\pi/2$', r'$0$', r'$+\pi/2$', r'$+\pi$'])
53 |
54 | yticks([-1, 0, +1],
55 | [r'$-1$', r'', r'$+1$'])
56 |
57 | for label in ax.get_xticklabels() + ax.get_yticklabels():
58 | label.set_fontsize(16)
59 | label.set_bbox(dict(facecolor='white', edgecolor='None', alpha=0.65 ))
60 |
61 | plot(X, C, color="blue", linewidth=2.5, linestyle="-", label="cosine")
62 | plot(X, S, color="red", linewidth=2.5, linestyle="-", label="sine")
63 |
64 | legend(loc='upper left')
65 |
66 | t = 2*np.pi/3
67 | plot([t,t],[0,np.cos(t)], color ='blue', linewidth=2.5, linestyle="--")
68 | scatter([t,],[np.cos(t),], 50, color ='blue')
69 |
70 | annotate(r'$\sin(\frac{2\pi}{3})=\frac{\sqrt{3}}{2}$',
71 | xy=(t, np.sin(t)), xycoords='data',
72 | xytext=(+10, +30), textcoords='offset points', fontsize=16,
73 | arrowprops=dict(arrowstyle="->", connectionstyle="arc3,rad=.2"))
74 |
75 | plot([t,t],[0,np.sin(t)], color ='red', linewidth=2.5, linestyle="--")
76 | scatter([t,],[np.sin(t),], 50, color ='red')
77 |
78 | annotate(r'$\cos(\frac{2\pi}{3})=-\frac{1}{2}$',
79 | xy=(t, np.cos(t)), xycoords='data',
80 | xytext=(-90, -50), textcoords='offset points', fontsize=16,
81 | arrowprops=dict(arrowstyle="->", connectionstyle="arc3,rad=.2"))
82 |
83 | # Save figure using 72 dots per inch
84 | # savefig("exercice_2.png",dpi=72)
85 |
86 | # Show result on screen
87 | show()
--------------------------------------------------------------------------------
/Code/Old Code/plot_test.py:
--------------------------------------------------------------------------------
1 | '''
2 | Created on Jul 1, 2013
3 |
4 | @author: ruffin
5 | '''
6 |
7 | if __name__ == '__main__':
8 | pass
9 | from mpl_toolkits.mplot3d import Axes3D
10 | import numpy as np
11 | import matplotlib.pyplot as plt
12 |
13 | fig = plt.figure()
14 | ax = fig.gca(projection='3d')
15 |
16 | x = np.linspace(0, 1, 100)
17 | y = np.sin(x * 2 * np.pi) / 2 + 0.5
18 | ax.plot(x, y, zs=0, zdir='z', label='zs=0, zdir=z')
19 |
20 | colors = ('r', 'g', 'b', 'k')
21 | for c in colors:
22 | x = np.random.sample(20)
23 | y = np.random.sample(20)
24 | ax.scatter(x, y, 0, zdir='y', c=c)
25 |
26 | ax.legend()
27 | ax.set_xlim3d(0, 1)
28 | ax.set_ylim3d(0, 1)
29 | # ax.set_zlim3d(0, 1)
30 |
31 | plt.show()
32 |
--------------------------------------------------------------------------------
/Code/Old Code/test.py:
--------------------------------------------------------------------------------
1 | '''
2 | Created on Jun 14, 2013
3 |
4 | @author: ruffin
5 | '''
6 |
7 |
8 | if __name__ == '__main__':
9 | pass
10 |
11 | import numpy as np
12 | import matplotlib.pyplot as plt
13 | import scipy.spatial as spatial
14 |
15 | def fmt(x, y):
16 | return 'x: {x:0.2f}\ny: {y:0.2f}'.format(x=x, y=y)
17 |
18 | class FollowDotCursor(object):
19 | """Display the x,y location of the nearest data point.
20 | """
21 | def __init__(self, ax, x, y, tolerance=5, formatter=fmt, offsets=(-20, 20)):
22 | try:
23 | x = np.asarray(x, dtype='float')
24 | except (TypeError, ValueError):
25 | x = np.asarray(mdates.date2num(x), dtype='float')
26 | y = np.asarray(y, dtype='float')
27 | self._points = np.column_stack((x, y))
28 | self.offsets = offsets
29 | self.scale = x.ptp()
30 | self.scale = y.ptp() / self.scale if self.scale else 1
31 | self.tree = spatial.cKDTree(self.scaled(self._points))
32 | self.formatter = formatter
33 | self.tolerance = tolerance
34 | self.ax = ax
35 | self.fig = ax.figure
36 | self.ax.xaxis.set_label_position('top')
37 | self.dot = ax.scatter(
38 | [x.min()], [y.min()], s=130, color='green', alpha=0.7)
39 | self.annotation = self.setup_annotation()
40 | plt.connect('motion_notify_event', self)
41 |
42 | def scaled(self, points):
43 | points = np.asarray(points)
44 | return points * (self.scale, 1)
45 |
46 | def __call__(self, event):
47 | ax = self.ax
48 | # event.inaxes is always the current axis. If you use twinx, ax could be
49 | # a different axis.
50 | if event.inaxes == ax:
51 | x, y = event.xdata, event.ydata
52 | elif event.inaxes is None:
53 | return
54 | else:
55 | inv = ax.transData.inverted()
56 | x, y = inv.transform([(event.x, event.y)]).ravel()
57 | annotation = self.annotation
58 | x, y = self.snap(x, y)
59 | annotation.xy = x, y
60 | annotation.set_text(self.formatter(x, y))
61 | self.dot.set_offsets((x, y))
62 | bbox = ax.viewLim
63 | event.canvas.draw()
64 |
65 | def setup_annotation(self):
66 | """Draw and hide the annotation box."""
67 | annotation = self.ax.annotate(
68 | '', xy=(0, 0), ha = 'right',
69 | xytext = self.offsets, textcoords = 'offset points', va = 'bottom',
70 | bbox = dict(
71 | boxstyle='round,pad=0.5', fc='yellow', alpha=0.75),
72 | arrowprops = dict(
73 | arrowstyle='->', connectionstyle='arc3,rad=0'))
74 | return annotation
75 |
76 | def snap(self, x, y):
77 | """Return the value in self.tree closest to x, y."""
78 | dist, idx = self.tree.query(self.scaled((x, y)), k=1, p=1)
79 | try:
80 | return self._points[idx]
81 | except IndexError:
82 | # IndexError: index out of bounds
83 | return self._points[0]
84 |
85 | x = np.random.normal(0,1.0,100)
86 | y = np.random.normal(0,1.0,100)
87 | fig, ax = plt.subplots()
88 |
89 | cursor = FollowDotCursor(ax, x, y, formatter=fmt, tolerance=20)
90 | scatter_plot = plt.scatter(x, y, facecolor="b", marker="o")
91 |
92 | #update the colour
93 | new_facecolors = ["r","g"]*50
94 | scatter_plot.set_facecolors(new_facecolors)
95 |
96 | plt.show()
97 |
98 | # suffix = 'wow!!!';
99 | # print(suffix)
100 | # print(suffix.endswith(suffix))
101 | #
102 | # weight = float(input("How many pounds does your suitcase weigh? "))
103 | # if weight > 50:
104 | # print("There is a $25 charge for luggage that heavy.")
105 | # print("Thank you for your business.")
106 |
107 | #
108 | # num = int(ymdhms.split()[0])
109 | # print(num)
110 | # print(type(num))
111 |
112 | #
113 | # print (tdate.strftime("%d-%b-%Y %H:%M:%S"))
114 | # print (tnow.strftime("%d-%b-%Y %H:%M:%S"))
115 |
116 |
117 | # lol = subprocess.Popen('ls', cwd= outdir)
118 | # lol.wait()
119 | # print(lol.stdout)
120 | # subprocess.call(['ls', '-l'])
121 | # output = subprocess.check_output('grep "TIME OF FIRST OBS" /home/ruffin/Documents/Data/in/COM10_130604_230821.obs')
122 |
123 |
124 | # fcorfile = open(indir + corfile,'wb')
125 | # ftp.retrbinary('RETR ' + corfile, fcorfile.write)
126 | # fcorfile.close()
127 |
128 |
129 | # subprocess.check_output(['rnx2rtkp','-k', indir + 'rtkoptions.conf','-o', outdir + namefile + '.pos', indir + obsfile, indir + navfile, indir + sp3file, '-d', '-r', indir])
130 | # subprocess.check_output(['gzip', '-d', '-r', indir])
131 | #
132 | # pos2kml COM10_130604_230821.pos
133 |
134 | # rnx2rtkp -k rtkoptions.conf -o COM10_130604_230821.pos COM10_130604_230821.obs COM10_130604_230821.nav paap1550.13o igr17432.sp3
135 | # pos2kml COM10_130604_230821.pos
136 |
137 |
138 | # subprocess.check_output(' '.join(command1), shell=True)
139 | # subprocess.check_output(' '.join(command2), shell=True)
--------------------------------------------------------------------------------
/Code/Old Code/ublox_script.py:
--------------------------------------------------------------------------------
1 | '''
2 | Created on Jun 10, 2013
3 |
4 | @author: whitemrj
5 | '''
6 | if __name__ == '__main__':
7 | pass
8 |
9 | import subprocess
10 | import os, sys
11 | import locale
12 | from datetime import datetime
13 | from file_utils import *
14 |
15 | encoding = locale.getdefaultlocale()[1]
16 | #------------------------------------------------------------------------------
17 | # File Directory
18 | #------------------------------------------------------------------------------
19 | indir = '/home/ruffin/Documents/Data/in/'
20 | outdir = '/home/ruffin/Documents/Data/out/'
21 | server = 'ftp.ngs.noaa.gov'
22 | hostPath = '/cors/rinex/'
23 | station = 'paap'
24 | rnx2rtkp = '/home/ruffin/git/RTKLIB/app/rnx2rtkp/gcc/rnx2rtkp'
25 | pos2kml = '/home/ruffin/git/RTKLIB/app/pos2kml/gcc/pos2kml'
26 | convbin = '/home/ruffin/git/RTKLIB/app/convbin/gcc/convbin'
27 | google_earth = '/usr/bin/google-earth'
28 |
29 |
30 |
31 | #------------------------------------------------------------------------------
32 | # Check output directory can be dumped to
33 | #------------------------------------------------------------------------------
34 | # print('Starting ublox script')
35 | if not outdir.endswith('/'):
36 | outdir += '/'
37 | if not os.path.exists(outdir):
38 | os.makedirs(outdir)
39 |
40 | #------------------------------------------------------------------------------
41 | # Convert log files
42 | #------------------------------------------------------------------------------
43 | # print('Checking ublox logs')
44 | os.chdir(indir)
45 | for file in os.listdir("."):
46 | if file.endswith(".ubx"):
47 | ubxfile = file
48 | namefile = os.path.splitext(ubxfile)[0]
49 | print('File name base found to be:\n' + namefile, end='\n\n')
50 |
51 | # print('Converting ublox logs')
52 | command0 = ([convbin, namefile + '.ubx'])
53 | print('Running ')
54 | print(' '.join(command0))
55 | subprocess.check_output(command0)
56 |
57 |
58 | #------------------------------------------------------------------------------
59 | # Extract time stamp from log files
60 | #------------------------------------------------------------------------------
61 | for file in os.listdir("."):
62 | if file.endswith(".obs"):
63 | obsfile = file
64 |
65 | # print('Parsing time from data')
66 | ymdhms = subprocess.check_output(['grep', 'TIME OF FIRST OBS', indir+obsfile]).decode(encoding)
67 | tdate = datetime.strptime(ymdhms[:42], ' %Y %m %d %H %M %S.%f')
68 | tnow = datetime.now()
69 | print('Recorded Date')
70 | print(tdate, end='\n\n')
71 | print('Current Date')
72 | print(tnow, end='\n\n')
73 | dt = tnow - tdate
74 | print('Date Difference')
75 | print(dt, end='\n\n')
76 |
77 | #------------------------------------------------------------------------------
78 | # Get files from FTP server
79 | #------------------------------------------------------------------------------
80 | # print('Fetching NOAA CORS corrections')
81 | corfile = station + tdate.strftime("%j0.%y") + 'o.gz'
82 | navfile = station + tdate.strftime("%j0.%y") + 'd.Z'
83 | hostPath = hostPath + tdate.strftime("%Y/%j/")
84 |
85 | ftp = FTP(server)
86 | ftp.login()
87 | print('FTP Login', end = '\n\n')
88 | # print(ftp.getwelcome(), end='\n\n')
89 |
90 | print('FTP Current Working Directory\n' + hostPath, end='\n\n')
91 |
92 | # print('Fetching updated ephemerides')
93 | if fetchFiles(ftp, hostPath, indir, 'igs*'):
94 | print('No IGS file yet')
95 | if fetchFiles(ftp, hostPath, indir, 'igr*'):
96 | print('No IGR file yet')
97 | if fetchFiles(ftp, hostPath, indir, 'igu*'):
98 | print('Not even an IGU file yet')
99 | print('Have a little patience!')
100 |
101 | hostPath = hostPath + station
102 | print('FTP Current Working Directory\n' + hostPath, end='\n\n')
103 |
104 | # print('FTP List')
105 | # ftp.retrlines('LIST')
106 | # print()
107 |
108 | # print('Fetching station broadcasts')
109 | if fetchFiles(ftp, hostPath, indir):
110 | print('No data files yet')
111 |
112 | ftp.quit()
113 |
114 |
115 | #------------------------------------------------------------------------------
116 | # Decompress downloaded files
117 | #------------------------------------------------------------------------------
118 | # print('Uncompressing fetched data')
119 | subprocess.check_output(['gzip', '-d', '-f', '-r', indir])
120 |
121 |
122 | #------------------------------------------------------------------------------
123 | # Sort downloaded files
124 | #------------------------------------------------------------------------------
125 | os.chdir(indir)
126 | for file in os.listdir("."):
127 | if file.endswith(".nav"):
128 | navfile = file
129 | if file.endswith(".13o"):
130 | o13file = file
131 | if file.endswith(".sp3"):
132 | sp3file = file
133 |
134 |
135 | #------------------------------------------------------------------------------
136 | # Run RTKLIB process
137 | #------------------------------------------------------------------------------
138 | # print('Running RTK solution')
139 | command1 = ([rnx2rtkp,'-k', indir + 'rtkoptions.conf','-o', outdir + namefile + '.pos', indir + obsfile, indir + navfile, indir + o13file, indir + sp3file])
140 | command2 = ([pos2kml, outdir + namefile + '.pos'])
141 | command3 = ([google_earth, outdir + namefile + '.kml'])
142 |
143 |
144 | print('\nRunning ')
145 | print(' '.join(command1))
146 | subprocess.check_output(command1)
147 | print('\nRunning ')
148 | print(' '.join(command2))
149 | subprocess.check_output(command2)
150 | # print(' '.join(command3))
151 | # subprocess.check_output(command3)
--------------------------------------------------------------------------------
/Code/Old Notebooks/GPS Error Classifier.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": ""
4 | },
5 | "nbformat": 3,
6 | "nbformat_minor": 0,
7 | "worksheets": [
8 | {
9 | "cells": [
10 | {
11 | "cell_type": "code",
12 | "collapsed": false,
13 | "input": [
14 | "from matplotlib import pyplot, mpl\n",
15 | "from pandas.lib import Timestamp\n",
16 | "import pandas as pd\n",
17 | "import scipy as sp\n",
18 | "import numpy as np\n",
19 | "import pylab as pl\n",
20 | "import geopy as gp\n",
21 | "import sklearn as skp\n",
22 | "from Utils.plot_utils import *\n",
23 | "from Utils.file_utils import *"
24 | ],
25 | "language": "python",
26 | "metadata": {},
27 | "outputs": [],
28 | "prompt_number": 8
29 | },
30 | {
31 | "cell_type": "code",
32 | "collapsed": false,
33 | "input": [
34 | "indir = '/home/ruffin/Documents/Data/in/'\n",
35 | "outdir = '/home/ruffin/Documents/Data/out/'\n",
36 | "indir = checkDir(indir,'r')\n",
37 | "outdir = checkDir(outdir,'r')\n",
38 | "posFile = findFile(outdir,'.pos')\n",
39 | "print(posFile)"
40 | ],
41 | "language": "python",
42 | "metadata": {},
43 | "outputs": [
44 | {
45 | "output_type": "stream",
46 | "stream": "stdout",
47 | "text": [
48 | "Checking directory: /home/ruffin/Documents/Data/in/\n",
49 | "Checking directory: /home/ruffin/Documents/Data/out/\n",
50 | "CMU Center static part.pos\n"
51 | ]
52 | }
53 | ],
54 | "prompt_number": 11
55 | },
56 | {
57 | "cell_type": "code",
58 | "collapsed": false,
59 | "input": [
60 | "skiprow = 0\n",
61 | "with open(posFile) as search:\n",
62 | " for i, line in enumerate(search):\n",
63 | " if \"% GPST\" in line:\n",
64 | " skiprow = i\n",
65 | " break\n",
66 | "df = pd.read_csv(posFile, skiprows=skiprow, delim_whitespace=True, parse_dates=[[0, 1]])\n",
67 | "df.head(10)"
68 | ],
69 | "language": "python",
70 | "metadata": {},
71 | "outputs": [
72 | {
73 | "html": [
74 | "
\n",
75 | "
\n",
76 | " \n",
77 | " \n",
78 | " | \n",
79 | " %_GPST | \n",
80 | " latitude(deg) | \n",
81 | " longitude(deg) | \n",
82 | " height(m) | \n",
83 | " Q | \n",
84 | " ns | \n",
85 | " sdn(m) | \n",
86 | " sde(m) | \n",
87 | " sdu(m) | \n",
88 | " sdne(m) | \n",
89 | " sdeu(m) | \n",
90 | " sdun(m) | \n",
91 | " age(s) | \n",
92 | " ratio | \n",
93 | "
\n",
94 | " \n",
95 | " \n",
96 | " \n",
97 | " 0 | \n",
98 | " 2013-06-04 23:09:00 | \n",
99 | " 40.442636 | \n",
100 | " -79.943065 | \n",
101 | " 258.6469 | \n",
102 | " 2 | \n",
103 | " 11 | \n",
104 | " 0.0208 | \n",
105 | " 0.0156 | \n",
106 | " 0.0446 | \n",
107 | " -0.0062 | \n",
108 | " 0.0030 | \n",
109 | " -0.0178 | \n",
110 | " 0.0 | \n",
111 | " 1.7 | \n",
112 | "
\n",
113 | " \n",
114 | " 1 | \n",
115 | " 2013-06-04 23:09:00.200000 | \n",
116 | " 40.442636 | \n",
117 | " -79.943064 | \n",
118 | " 258.9419 | \n",
119 | " 2 | \n",
120 | " 11 | \n",
121 | " 0.0382 | \n",
122 | " 0.0269 | \n",
123 | " 0.0641 | \n",
124 | " -0.0098 | \n",
125 | " 0.0178 | \n",
126 | " -0.0233 | \n",
127 | " -29.8 | \n",
128 | " 1.7 | \n",
129 | "
\n",
130 | " \n",
131 | " 2 | \n",
132 | " 2013-06-04 23:09:00.400000 | \n",
133 | " 40.442636 | \n",
134 | " -79.943064 | \n",
135 | " 258.9461 | \n",
136 | " 2 | \n",
137 | " 11 | \n",
138 | " 0.0380 | \n",
139 | " 0.0267 | \n",
140 | " 0.0639 | \n",
141 | " -0.0097 | \n",
142 | " 0.0177 | \n",
143 | " -0.0233 | \n",
144 | " -29.6 | \n",
145 | " 1.7 | \n",
146 | "
\n",
147 | " \n",
148 | " 3 | \n",
149 | " 2013-06-04 23:09:00.600000 | \n",
150 | " 40.442636 | \n",
151 | " -79.943064 | \n",
152 | " 258.9477 | \n",
153 | " 2 | \n",
154 | " 11 | \n",
155 | " 0.0378 | \n",
156 | " 0.0266 | \n",
157 | " 0.0636 | \n",
158 | " -0.0097 | \n",
159 | " 0.0175 | \n",
160 | " -0.0232 | \n",
161 | " -29.4 | \n",
162 | " 1.7 | \n",
163 | "
\n",
164 | " \n",
165 | " 4 | \n",
166 | " 2013-06-04 23:09:00.800000 | \n",
167 | " 40.442636 | \n",
168 | " -79.943064 | \n",
169 | " 258.9534 | \n",
170 | " 2 | \n",
171 | " 11 | \n",
172 | " 0.0376 | \n",
173 | " 0.0265 | \n",
174 | " 0.0634 | \n",
175 | " -0.0097 | \n",
176 | " 0.0174 | \n",
177 | " -0.0231 | \n",
178 | " -29.2 | \n",
179 | " 1.6 | \n",
180 | "
\n",
181 | " \n",
182 | " 5 | \n",
183 | " 2013-06-04 23:09:01 | \n",
184 | " 40.442636 | \n",
185 | " -79.943064 | \n",
186 | " 258.9547 | \n",
187 | " 2 | \n",
188 | " 11 | \n",
189 | " 0.0375 | \n",
190 | " 0.0264 | \n",
191 | " 0.0632 | \n",
192 | " -0.0096 | \n",
193 | " 0.0173 | \n",
194 | " -0.0230 | \n",
195 | " -29.0 | \n",
196 | " 1.6 | \n",
197 | "
\n",
198 | " \n",
199 | " 6 | \n",
200 | " 2013-06-04 23:09:01.200000 | \n",
201 | " 40.442636 | \n",
202 | " -79.943064 | \n",
203 | " 258.9594 | \n",
204 | " 2 | \n",
205 | " 11 | \n",
206 | " 0.0373 | \n",
207 | " 0.0262 | \n",
208 | " 0.0629 | \n",
209 | " -0.0096 | \n",
210 | " 0.0171 | \n",
211 | " -0.0229 | \n",
212 | " -28.8 | \n",
213 | " 1.6 | \n",
214 | "
\n",
215 | " \n",
216 | " 7 | \n",
217 | " 2013-06-04 23:09:01.400000 | \n",
218 | " 40.442636 | \n",
219 | " -79.943064 | \n",
220 | " 258.9595 | \n",
221 | " 2 | \n",
222 | " 11 | \n",
223 | " 0.0371 | \n",
224 | " 0.0261 | \n",
225 | " 0.0627 | \n",
226 | " -0.0096 | \n",
227 | " 0.0170 | \n",
228 | " -0.0229 | \n",
229 | " -28.6 | \n",
230 | " 1.5 | \n",
231 | "
\n",
232 | " \n",
233 | " 8 | \n",
234 | " 2013-06-04 23:09:01.600000 | \n",
235 | " 40.442636 | \n",
236 | " -79.943064 | \n",
237 | " 258.9645 | \n",
238 | " 2 | \n",
239 | " 11 | \n",
240 | " 0.0369 | \n",
241 | " 0.0260 | \n",
242 | " 0.0625 | \n",
243 | " -0.0095 | \n",
244 | " 0.0169 | \n",
245 | " -0.0228 | \n",
246 | " -28.4 | \n",
247 | " 1.4 | \n",
248 | "
\n",
249 | " \n",
250 | " 9 | \n",
251 | " 2013-06-04 23:09:01.800000 | \n",
252 | " 40.442636 | \n",
253 | " -79.943064 | \n",
254 | " 258.9656 | \n",
255 | " 2 | \n",
256 | " 11 | \n",
257 | " 0.0367 | \n",
258 | " 0.0259 | \n",
259 | " 0.0622 | \n",
260 | " -0.0095 | \n",
261 | " 0.0167 | \n",
262 | " -0.0227 | \n",
263 | " -28.2 | \n",
264 | " 1.4 | \n",
265 | "
\n",
266 | " \n",
267 | "
\n",
268 | "
"
269 | ],
270 | "output_type": "pyout",
271 | "prompt_number": 19,
272 | "text": [
273 | " %_GPST latitude(deg) longitude(deg) height(m) Q ns sdn(m) \\\n",
274 | "0 2013-06-04 23:09:00 40.442636 -79.943065 258.6469 2 11 0.0208 \n",
275 | "1 2013-06-04 23:09:00.200000 40.442636 -79.943064 258.9419 2 11 0.0382 \n",
276 | "2 2013-06-04 23:09:00.400000 40.442636 -79.943064 258.9461 2 11 0.0380 \n",
277 | "3 2013-06-04 23:09:00.600000 40.442636 -79.943064 258.9477 2 11 0.0378 \n",
278 | "4 2013-06-04 23:09:00.800000 40.442636 -79.943064 258.9534 2 11 0.0376 \n",
279 | "5 2013-06-04 23:09:01 40.442636 -79.943064 258.9547 2 11 0.0375 \n",
280 | "6 2013-06-04 23:09:01.200000 40.442636 -79.943064 258.9594 2 11 0.0373 \n",
281 | "7 2013-06-04 23:09:01.400000 40.442636 -79.943064 258.9595 2 11 0.0371 \n",
282 | "8 2013-06-04 23:09:01.600000 40.442636 -79.943064 258.9645 2 11 0.0369 \n",
283 | "9 2013-06-04 23:09:01.800000 40.442636 -79.943064 258.9656 2 11 0.0367 \n",
284 | "\n",
285 | " sde(m) sdu(m) sdne(m) sdeu(m) sdun(m) age(s) ratio \n",
286 | "0 0.0156 0.0446 -0.0062 0.0030 -0.0178 0.0 1.7 \n",
287 | "1 0.0269 0.0641 -0.0098 0.0178 -0.0233 -29.8 1.7 \n",
288 | "2 0.0267 0.0639 -0.0097 0.0177 -0.0233 -29.6 1.7 \n",
289 | "3 0.0266 0.0636 -0.0097 0.0175 -0.0232 -29.4 1.7 \n",
290 | "4 0.0265 0.0634 -0.0097 0.0174 -0.0231 -29.2 1.6 \n",
291 | "5 0.0264 0.0632 -0.0096 0.0173 -0.0230 -29.0 1.6 \n",
292 | "6 0.0262 0.0629 -0.0096 0.0171 -0.0229 -28.8 1.6 \n",
293 | "7 0.0261 0.0627 -0.0096 0.0170 -0.0229 -28.6 1.5 \n",
294 | "8 0.0260 0.0625 -0.0095 0.0169 -0.0228 -28.4 1.4 \n",
295 | "9 0.0259 0.0622 -0.0095 0.0167 -0.0227 -28.2 1.4 "
296 | ]
297 | }
298 | ],
299 | "prompt_number": 19
300 | },
301 | {
302 | "cell_type": "code",
303 | "collapsed": false,
304 | "input": [
305 | "df.describe()"
306 | ],
307 | "language": "python",
308 | "metadata": {},
309 | "outputs": []
310 | },
311 | {
312 | "cell_type": "code",
313 | "collapsed": false,
314 | "input": [
315 | "# Center CMU\n",
316 | "reference = point.Point(40.442635758, -79.943065017, 258.7)\n",
317 | "# Center CMU 2\n",
318 | "#reference = point.Point(40.4410859, -79.94852589, 243.5994)\n",
319 | "# Office Window\n",
320 | "#reference = point.Point(40.443874, -79.945517, 272)\n",
321 | "# Under Bridge Center\n",
322 | "#reference = point.Point(40.441369, -79.949325, 254)\n",
323 | "\n",
324 | "df['sd(m)'] = np.sqrt(df['sdn(m)']**2+df['sde(m)']**2+df['sdu(m)']**2+df['sdne(m)']**2+df['sdeu(m)']**2+df['sdun(m)']**2)\n",
325 | "df['dist(m)'] = 0.0\n",
326 | "df['distn(m)'] = 0.0\n",
327 | "df['diste(m)'] = 0.0\n",
328 | "df['distu(m)'] = 0.0\n",
329 | "d = distance.distance\n",
330 | "for i in df.index :\n",
331 | " j = point.Point(df['latitude(deg)'][i],df['longitude(deg)'][i])\n",
332 | " k = point.Point(reference.latitude,df['longitude(deg)'][i])\n",
333 | " l = point.Point(df['latitude(deg)'][i],reference.longitude)\n",
334 | " \n",
335 | " lat1 = math.radians(reference.latitude)\n",
336 | " lon1 = math.radians(reference.longitude)\n",
337 | " lat2 = math.radians(df['latitude(deg)'][i])\n",
338 | " lon2 = math.radians(df['longitude(deg)'][i])\n",
339 | " \n",
340 | " dLon = lon2 - lon1;\n",
341 | " y = math.sin(dLon) * math.cos(lat2);\n",
342 | " x = math.cos(lat1)*math.sin(lat2) - math.sin(lat1)*math.cos(lat2)*math.cos(dLon);\n",
343 | " brng = math.atan2(y, x);\n",
344 | " north = math.cos(brng)\n",
345 | " east = math.sin(brng)\n",
346 | " \n",
347 | " j = d(reference, j).meters\n",
348 | " k = d(reference, k).meters*np.sign(east)\n",
349 | " l = d(reference, l).meters*np.sign(north)\n",
350 | " m = (df['height(m)'][i] - reference.altitude)\n",
351 | " q = np.core.sqrt(j**2+m**2)\n",
352 | " \n",
353 | " df['dist(m)'][i] = q\n",
354 | " df['distn(m)'][i] = k\n",
355 | " df['diste(m)'][i] = l\n",
356 | " df['distu(m)'][i] = m\n",
357 | "df.describe()"
358 | ],
359 | "language": "python",
360 | "metadata": {},
361 | "outputs": []
362 | },
363 | {
364 | "cell_type": "code",
365 | "collapsed": false,
366 | "input": [
367 | "plt.figure()\n",
368 | "df[['sd(m)','sdn(m)','sde(m)','sdu(m)']].plot()\n",
369 | "plt.legend(loc='upper right')"
370 | ],
371 | "language": "python",
372 | "metadata": {},
373 | "outputs": []
374 | },
375 | {
376 | "cell_type": "code",
377 | "collapsed": false,
378 | "input": [
379 | "plt.figure()\n",
380 | "df[['distn(m)','diste(m)','distu(m)','dist(m)']].plot()\n",
381 | "plt.legend(loc='upper right')"
382 | ],
383 | "language": "python",
384 | "metadata": {},
385 | "outputs": []
386 | },
387 | {
388 | "cell_type": "code",
389 | "collapsed": false,
390 | "input": [
391 | "dff = df[df['dist(m)'] <= .1]\n",
392 | "print('Generating Fig 8')\n",
393 | "fig8 = pyplot.figure(figsize=(18,12), dpi=80)\n",
394 | "ax = fig8.add_subplot(1,1,1)\n",
395 | "dff.hist(column=['sdn(m)','sde(m)','sdu(m)'], bins=50,ax=ax)"
396 | ],
397 | "language": "python",
398 | "metadata": {},
399 | "outputs": []
400 | },
401 | {
402 | "cell_type": "code",
403 | "collapsed": false,
404 | "input": [
405 | "#df.describe"
406 | ],
407 | "language": "python",
408 | "metadata": {},
409 | "outputs": []
410 | },
411 | {
412 | "cell_type": "code",
413 | "collapsed": false,
414 | "input": [
415 | "import numpy as np\n",
416 | "import pylab as pl\n",
417 | "from sklearn import svm, datasets\n",
418 | "from sklearn.utils import shuffle\n",
419 | "from sklearn.metrics import roc_curve, auc\n",
420 | "random_state = np.random.RandomState(0)"
421 | ],
422 | "language": "python",
423 | "metadata": {},
424 | "outputs": []
425 | },
426 | {
427 | "cell_type": "code",
428 | "collapsed": false,
429 | "input": [
430 | "#X = np.asarray(df[['Q','ns','sdn(m)','sde(m)','sdu(m)','ratio']])\n",
431 | "dff = df\n",
432 | "X = ['Q','ns','sd(m)','ratio']\n",
433 | "y = 'y'\n",
434 | "n_samples, n_features = dff.shape\n",
435 | "half = int(n_samples / 2)\n",
436 | "dff[y] = np.less(np.abs(np.asarray(dff['dist(m)'])),.2)\n",
437 | "\n",
438 | "dff_train = dff[::2]\n",
439 | "dff_test = dff[1::2]\n",
440 | "\n",
441 | "dff_train = dff_train.set_index('%_GPST')\n",
442 | "dff_test = dff_test.set_index('%_GPST')\n",
443 | "\n",
444 | "#print(dff_train)\n",
445 | "#print(dff_test)"
446 | ],
447 | "language": "python",
448 | "metadata": {},
449 | "outputs": []
450 | },
451 | {
452 | "cell_type": "code",
453 | "collapsed": false,
454 | "input": [
455 | "# Run classifier\n",
456 | "classifier = svm.SVC(kernel='rbf', probability=True)\n",
457 | "classifier.fit(dff_train[X], dff_train[y])\n",
458 | "probas_ = classifier.predict_proba(dff_test[X])\n",
459 | "dff_test['probas_'] = probas_[:, 1]\n",
460 | "y_ = classifier.predict(dff_test[X])\n",
461 | "dff_test['y_'] = np.asarray(y_)"
462 | ],
463 | "language": "python",
464 | "metadata": {},
465 | "outputs": []
466 | },
467 | {
468 | "cell_type": "code",
469 | "collapsed": false,
470 | "input": [
471 | "# Compute ROC curve and area the curve\n",
472 | "fpr, tpr, thresholds = roc_curve(dff_test['y'], dff_test['probas_'])\n",
473 | "roc_auc = auc(fpr, tpr)\n",
474 | "print(\"Area under the ROC curve : %f\" % roc_auc)\n",
475 | "\n",
476 | "# Plot ROC curve\n",
477 | "pl.clf()\n",
478 | "pl.figure(figsize=(18,12), dpi=80)\n",
479 | "pl.plot(fpr, tpr, label='ROC curve (area = %0.2f)' % roc_auc)\n",
480 | "pl.plot([0, 1], [0, 1], 'k--')\n",
481 | "pl.xlim([0.0, 1.0])\n",
482 | "pl.ylim([0.0, 1.0])\n",
483 | "pl.xlabel('False Positive Rate')\n",
484 | "pl.ylabel('True Positive Rate')\n",
485 | "pl.title('Receiver operating characteristic example')\n",
486 | "pl.legend(loc=\"lower right\")\n",
487 | "pl.show()"
488 | ],
489 | "language": "python",
490 | "metadata": {},
491 | "outputs": []
492 | },
493 | {
494 | "cell_type": "code",
495 | "collapsed": false,
496 | "input": [
497 | "#from sklearn.decomposition import PCA\n",
498 | "#pca = PCA(n_components=2)\n",
499 | "#pca.fit(X)\n",
500 | "#X_reduced = pca.transform(X)\n",
501 | "#print(\"Reduced dataset shape:\", X_reduced.shape)\n",
502 | "\n",
503 | "#import pylab as pl\n",
504 | "#pl.figure(figsize=(18,12), dpi=80)\n",
505 | "#pl.scatter(X_reduced[:, 0], X_reduced[:, 1], c=y)\n",
506 | "\n",
507 | "#print(\"Meaning of the 2 components:\")\n",
508 | "#for component in pca.components_:\n",
509 | "# print(\" + \".join(\"%.3f x %s\" % (value, name))\n",
510 | "# for value, name in zip(component, dff.columns))"
511 | ],
512 | "language": "python",
513 | "metadata": {},
514 | "outputs": []
515 | },
516 | {
517 | "cell_type": "code",
518 | "collapsed": false,
519 | "input": [
520 | "from sklearn.metrics import classification_report\n",
521 | "target_names = ['Outside', 'Inside']\n",
522 | "print(classification_report(dff_test['y'], dff_test['y_'], target_names=target_names))"
523 | ],
524 | "language": "python",
525 | "metadata": {},
526 | "outputs": []
527 | },
528 | {
529 | "cell_type": "code",
530 | "collapsed": false,
531 | "input": [
532 | "from sklearn.metrics import confusion_matrix\n",
533 | "# Compute confusion matrix\n",
534 | "cm = confusion_matrix(dff_test['y'], dff_test['y_'])\n",
535 | "\n",
536 | "print(cm)\n",
537 | "\n",
538 | "# Show confusion matrix\n",
539 | "pl.matshow(cm)\n",
540 | "pl.title('Confusion matrix')\n",
541 | "pl.colorbar()\n",
542 | "pl.show()"
543 | ],
544 | "language": "python",
545 | "metadata": {},
546 | "outputs": []
547 | },
548 | {
549 | "cell_type": "code",
550 | "collapsed": false,
551 | "input": [
552 | "dff_test['TFPN'] = ''\n",
553 | "for i in dff_test.index:\n",
554 | " yp = dff_test['y'][i]\n",
555 | " y_ = dff_test['y_'][i]\n",
556 | " s = ''\n",
557 | " if ~ (yp ^ y_):\n",
558 | " s += 'T'\n",
559 | " if y_:\n",
560 | " s += 'P'\n",
561 | " else:\n",
562 | " s += 'N'\n",
563 | " else:\n",
564 | " s += 'F'\n",
565 | " if y_:\n",
566 | " s += 'P'\n",
567 | " else:\n",
568 | " s += 'N'\n",
569 | " \n",
570 | " dff_test['TFPN'][i] = s"
571 | ],
572 | "language": "python",
573 | "metadata": {},
574 | "outputs": []
575 | },
576 | {
577 | "cell_type": "code",
578 | "collapsed": false,
579 | "input": [
580 | "dff_test.to_csv('/home/ruffin/Documents/Data/test.csv')"
581 | ],
582 | "language": "python",
583 | "metadata": {},
584 | "outputs": []
585 | },
586 | {
587 | "cell_type": "code",
588 | "collapsed": false,
589 | "input": [
590 | "dff_test_TP = dff_test[dff_test['TFPN'].str.contains(\"TP\")]\n",
591 | "dff_test_TN = dff_test[dff_test['TFPN'].str.contains(\"TN\")]\n",
592 | "dff_test_FP = dff_test[dff_test['TFPN'].str.contains(\"FP\")]\n",
593 | "dff_test_FN = dff_test[dff_test['TFPN'].str.contains(\"FN\")]\n",
594 | "\n",
595 | "print('Generating Fig')\n",
596 | "fig1 = pyplot.figure(figsize=(18,18), dpi=200)\n",
597 | "fig1.suptitle('Labeled Solutions', fontsize=14, fontweight='bold')\n",
598 | "#ax = fig1.add_subplot(111)\n",
599 | "ax = fig1.add_subplot(111,aspect='equal')\n",
600 | "p = ax.scatter(dff_test_TP['diste(m)'],dff_test_TP['distn(m)'], label= 'TP (Green) ', alpha=.2, c='green')\n",
601 | "p = ax.scatter(dff_test_TN['diste(m)'],dff_test_TN['distn(m)'], label= 'TN (Blue) ', alpha=.2, c='blue')\n",
602 | "p = ax.scatter(dff_test_FP['diste(m)'],dff_test_FP['distn(m)'], label= 'FP (Red) ', alpha=.2, c='red')\n",
603 | "p = ax.scatter(dff_test_FN['diste(m)'],dff_test_FN['distn(m)'], label= 'FN (Orange)', alpha=.2, c='orange')\n",
604 | "pl.legend(loc=\"lower right\")\n",
605 | "#xlim(distnminlim, distnmaxlim)\n",
606 | "#ylim(disteminlim, distemaxlim)\n",
607 | "ax.set_xlabel('East (meters)')\n",
608 | "ax.set_ylabel('North (meters)')\n",
609 | "ax.grid(True)"
610 | ],
611 | "language": "python",
612 | "metadata": {},
613 | "outputs": []
614 | },
615 | {
616 | "cell_type": "code",
617 | "collapsed": false,
618 | "input": [],
619 | "language": "python",
620 | "metadata": {},
621 | "outputs": []
622 | },
623 | {
624 | "cell_type": "code",
625 | "collapsed": false,
626 | "input": [],
627 | "language": "python",
628 | "metadata": {},
629 | "outputs": []
630 | },
631 | {
632 | "cell_type": "code",
633 | "collapsed": false,
634 | "input": [],
635 | "language": "python",
636 | "metadata": {},
637 | "outputs": []
638 | }
639 | ],
640 | "metadata": {}
641 | }
642 | ]
643 | }
--------------------------------------------------------------------------------
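A minimal standalone sketch of the classification workflow in the notebook above, using synthetic stand-in features so it runs without the ported DataFrame.csv; the feature columns and the 0.2 m label threshold follow the notebook, while the random data itself is purely illustrative.

import numpy as np
from sklearn import svm
from sklearn.metrics import roc_curve, auc

rng = np.random.RandomState(0)
X = rng.randn(400, 4)                        # stand-ins for Q, ns, sd(m), ratio
y = (X[:, 2] + 0.5 * rng.randn(400)) < 0.2   # stand-in for the |dist(m)| < 0.2 label

X_train, X_test = X[::2], X[1::2]            # even/odd split, as in the notebook
y_train, y_test = y[::2], y[1::2]

classifier = svm.SVC(kernel='rbf', probability=True)
classifier.fit(X_train, y_train)
probas = classifier.predict_proba(X_test)[:, 1]

fpr, tpr, thresholds = roc_curve(y_test, probas)
print("Area under the ROC curve : %f" % auc(fpr, tpr))
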
/Code/Port Data.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "Port Data"
4 | },
5 | "nbformat": 3,
6 | "nbformat_minor": 0,
7 | "worksheets": [
8 | {
9 | "cells": [
10 | {
11 | "cell_type": "heading",
12 | "level": 1,
13 | "metadata": {},
14 | "source": [
15 | "Port Data"
16 | ]
17 | },
18 | {
19 | "cell_type": "heading",
20 | "level": 2,
21 | "metadata": {},
22 | "source": [
23 | "Port data from positional solutions"
24 | ]
25 | },
26 | {
27 | "cell_type": "markdown",
28 | "metadata": {},
29 | "source": [
      "Port data from positional solution files and combine batch solutions into a single usable database or .csv file. This will also perform some additional feature extraction, using the static position to calculate the true error."
31 | ]
32 | },
33 | {
34 | "cell_type": "markdown",
35 | "metadata": {},
36 | "source": [
      "First we need to import the relevant utilities that contain the necessary functions to operate on the data. \"file_utils\" is used for navigating and verifying the directories that contain or will contain the position files and data frame. \"rtklib_utils\" is used for parsing the RINEX and position files.\n"
38 | ]
39 | },
40 | {
41 | "cell_type": "code",
42 | "collapsed": false,
43 | "input": [
44 | "from Utils.file_utils import *\n",
45 | "from Utils.rtklib_utils import *"
46 | ],
47 | "language": "python",
48 | "metadata": {},
49 | "outputs": [],
50 | "prompt_number": 1
51 | },
52 | {
53 | "cell_type": "markdown",
54 | "metadata": {},
55 | "source": [
      "Here we specify the \u201cin\u201d and \u201cout\u201d file path directories. These directories are used to hold multiple observation files for batch operations. For every ublox log file, we will generate a folder of the same name within \u201cindir\u201d that will contain the observations in RINEX format. \u201coutdir\u201d will be used later to store position solutions. We also check to make sure that both directories exist and that their paths use proper syntax."
57 | ]
58 | },
59 | {
60 | "cell_type": "code",
61 | "collapsed": false,
62 | "input": [
63 | "indir = '/home/ruffin/Documents/Data/in/'\n",
64 | "outdir = '/home/ruffin/Documents/Data/out/'\n",
65 | "indir = checkDir(indir,'r')\n",
66 | "outdir = checkDir(outdir,'r')"
67 | ],
68 | "language": "python",
69 | "metadata": {},
70 | "outputs": [],
71 | "prompt_number": 2
72 | },
73 | {
74 | "cell_type": "markdown",
75 | "metadata": {},
76 | "source": [
      "Here we iterate through each position data set to build our data frame. After each folder is read, its data frame is appended to the growing master data frame. These data frames are built using the kinematic and static positional solutions: the kinematic solution gives the specific path over time, while the static solution is used to calculate relative motion with respect to a stationary point."
78 | ]
79 | },
80 | {
81 | "cell_type": "code",
82 | "collapsed": false,
83 | "input": [
84 | "folders = findFolders(outdir)\n",
85 | "folders.sort(key=lambda x: x.lower())\n",
86 | "for i, folder in enumerate(folders):\n",
87 | " print(folder)\n",
88 | " if(i==0):\n",
89 | " df = buildDataFrame(outdir, folder)\n",
90 | " else:\n",
91 | " dff = buildDataFrame(outdir, folder)\n",
92 | " if(len(dff.index)>1):\n",
93 | " df = df.append(dff)"
94 | ],
95 | "language": "python",
96 | "metadata": {},
97 | "outputs": [
98 | {
99 | "output_type": "stream",
100 | "stream": "stdout",
101 | "text": [
102 | "CMU_center\n",
103 | "qmin:"
104 | ]
105 | },
106 | {
107 | "ename": "AttributeError",
108 | "evalue": "'module' object has no attribute 'Dataframe'",
109 | "output_type": "pyerr",
110 | "traceback": [
111 | "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m\n\u001b[0;31mAttributeError\u001b[0m Traceback (most recent call last)",
112 | "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 5\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mfolder\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mfolders\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 6\u001b[0m \u001b[0mprint\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mfolder\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 7\u001b[0;31m \u001b[0mdff\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mbuildDataFrame\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0moutdir\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mfolder\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 8\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mdff\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mindex\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m>\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 9\u001b[0m \u001b[0mdf\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mdf\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mdff\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
113 | "\u001b[0;32m/home/ruffin/Documents/Data/out/CMU_center/rtklib_utils.py\u001b[0m in \u001b[0;36mbuildDataFrame\u001b[0;34m(dir, folder)\u001b[0m\n",
114 | "\u001b[0;31mAttributeError\u001b[0m: 'module' object has no attribute 'Dataframe'"
115 | ]
116 | },
117 | {
118 | "output_type": "stream",
119 | "stream": "stdout",
120 | "text": [
121 | " 1\n",
122 | "qmins: 4676\n",
123 | "40.44181870789327 -79.94396109462126 260.6512470491796\n"
124 | ]
125 | }
126 | ],
127 | "prompt_number": 4
128 | },
129 | {
130 | "cell_type": "markdown",
131 | "metadata": {},
132 | "source": [
      "Here we simply save our finished master data frame to a comma-separated file within the 'outdir' directory for convenient retrieval."
134 | ]
135 | },
136 | {
137 | "cell_type": "code",
138 | "collapsed": false,
139 | "input": [
140 | "df.to_csv(outdir + 'DataFrame.csv')"
141 | ],
142 | "language": "python",
143 | "metadata": {},
144 | "outputs": []
145 | }
146 | ],
147 | "metadata": {}
148 | }
149 | ]
150 | }
--------------------------------------------------------------------------------
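The loop above grows the master frame with DataFrame.append inside a loop; below is a sketch of the same batch-porting step written with pd.concat, which avoids repeated copying. The directory path is the one assumed by the notebook, and the Utils helpers must be importable from the Code directory.

import pandas as pd
from Utils.file_utils import findFolders
from Utils.rtklib_utils import buildDataFrame

outdir = '/home/ruffin/Documents/Data/out/'           # layout assumed by the notebook
frames = []
for folder in sorted(findFolders(outdir), key=str.lower):
    dff = buildDataFrame(outdir, folder)              # kinematic + static solutions per folder
    if len(dff.index) > 1:
        frames.append(dff)
df = pd.concat(frames)                                # single master data frame
df.to_csv(outdir + 'DataFrame.csv')
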
/Code/Untitled0.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "Untitled0"
4 | },
5 | "nbformat": 3,
6 | "nbformat_minor": 0,
7 | "worksheets": [
8 | {
9 | "cells": [
10 | {
11 | "cell_type": "code",
12 | "collapsed": false,
13 | "input": [
14 | "from matplotlib import pyplot, mpl\n",
15 | "from pandas.lib import Timestamp\n",
16 | "import pandas as pd\n",
17 | "import scipy as sp\n",
18 | "import numpy as np\n",
19 | "import pylab as pl\n",
20 | "import geopy as gp\n",
21 | "import sklearn as sk\n",
22 | "import ephem\n",
23 | "from plot_utils import *\n",
24 | "from file_utils import *\n",
25 | "from ephem_utils import *\n",
26 | "\n",
27 | "from sklearn import svm\n",
28 | "from sklearn.metrics import roc_curve, auc"
29 | ],
30 | "language": "python",
31 | "metadata": {},
32 | "outputs": [
33 | {
34 | "output_type": "stream",
35 | "stream": "stderr",
36 | "text": [
37 | "/usr/lib/python3.3/importlib/_bootstrap.py:313: UserWarning: Module pytz was already imported from /usr/local/lib/python3.3/dist-packages/pytz/__init__.py, but /usr/lib/python3/dist-packages is being added to sys.path\n",
38 | " return f(*args, **kwds)\n",
39 | "/usr/lib/python3.3/importlib/_bootstrap.py:313: UserWarning: Module dateutil was already imported from /usr/local/lib/python3.3/dist-packages/dateutil/__init__.py, but /usr/lib/python3/dist-packages is being added to sys.path\n",
40 | " return f(*args, **kwds)\n"
41 | ]
42 | }
43 | ],
44 | "prompt_number": 1
45 | },
46 | {
47 | "cell_type": "code",
48 | "collapsed": false,
49 | "input": [
50 | "#import datetime\n",
51 | "# Setup lat long of telescope\n",
52 | "oxford = ephem.Observer()\n",
53 | "oxford.lat = np.deg2rad(40.441814610)\n",
54 | "oxford.long = np.deg2rad(-79.943959327)\n",
55 | "oxford.date = datetime.datetime.now()\n",
56 | "\n",
57 | "# Load Satellite TLE data.\n",
58 | "l1 = 'GPS BIIR-12 (PRN 23)'\n",
59 | "l2 = '1 28361U 04023A 13213.45207733 -.00000028 00000-0 00000+0 0 6256'\n",
60 | "l3 = '2 28361 54.7212 277.7598 0088963 198.1655 188.7394 2.00563458 66743' \n",
61 | "biif1 = ephem.readtle(l1,l2,l3)\n",
62 | "\n",
63 | "# Make some datetimes\n",
64 | "midnight = datetime.datetime(2013,7,28,0,4,42)\n",
65 | "dt = [midnight + timedelta(seconds=.1*x) for x in range(0, 15*60*10)]\n",
66 | "\n",
67 | "# Compute satellite locations at each datetime\n",
68 | "sat_alt, sat_az = [], []\n",
69 | "for date in dt:\n",
70 | " oxford.date = date\n",
71 | " biif1.compute(oxford)\n",
72 | " sat_alt.append(np.rad2deg(biif1.alt))\n",
73 | " sat_az.append(np.rad2deg(biif1.az))\n",
74 | " \n",
75 | "# Plot satellite tracks\n",
76 | "pl.subplot(211)\n",
77 | "pl.plot(dt, sat_alt)\n",
78 | "pl.ylabel(\"Altitude (deg)\")\n",
79 | "pl.xticks(rotation=25)\n",
80 | "pl.subplot(212)\n",
81 | "pl.plot(dt, sat_az)\n",
82 | "pl.ylabel(\"Azimuth (deg)\")\n",
83 | "pl.xticks(rotation=25)\n",
84 | "pl.show()\n",
85 | "\n",
86 | "# Plot satellite track in polar coordinates\n",
87 | "pl.polar(np.deg2rad(sat_az), 90-np.array(sat_alt))\n",
88 | "pl.ylim(0,90)\n",
89 | "pl.show()"
90 | ],
91 | "language": "python",
92 | "metadata": {},
93 | "outputs": [
94 | {
95 | "ename": "AttributeError",
96 | "evalue": "type object 'datetime.datetime' has no attribute 'datetime'",
97 | "output_type": "pyerr",
98 | "traceback": [
99 | "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m\n\u001b[0;31mAttributeError\u001b[0m Traceback (most recent call last)",
100 | "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 4\u001b[0m \u001b[0moxford\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlat\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdeg2rad\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;36m40.441814610\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 5\u001b[0m \u001b[0moxford\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlong\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdeg2rad\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m79.943959327\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 6\u001b[0;31m \u001b[0moxford\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdate\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mdatetime\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdatetime\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mnow\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 7\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 8\u001b[0m \u001b[0;31m# Load Satellite TLE data.\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
101 | "\u001b[0;31mAttributeError\u001b[0m: type object 'datetime.datetime' has no attribute 'datetime'"
102 | ]
103 | }
104 | ],
105 | "prompt_number": 2
106 | },
107 | {
108 | "cell_type": "code",
109 | "collapsed": false,
110 | "input": [
111 | "gatech = ephem.Observer()\n",
112 | "gatech.lon, gatech.lat = '-84.39733', '33.775867'\n",
113 | "gatech.date = '1984/5/30 16:22:56' # 12:22:56 EDT\n",
114 | "sun, moon = ephem.Sun(), ephem.Moon()\n",
115 | "sun.compute(gatech)\n",
116 | "moon.compute(gatech)\n",
117 | "print(sun.alt, sun.az)\n",
118 | "print(moon.alt, moon.az)"
119 | ],
120 | "language": "python",
121 | "metadata": {},
122 | "outputs": []
123 | },
124 | {
125 | "cell_type": "code",
126 | "collapsed": false,
127 | "input": [
128 | "obsfile = '/home/ruffin/Documents/Data/in/FlagStaff_center/FlagStaff_center.obs'"
129 | ],
130 | "language": "python",
131 | "metadata": {},
132 | "outputs": []
133 | },
134 | {
135 | "cell_type": "code",
136 | "collapsed": false,
137 | "input": [
138 | "df = pd.read_csv(obsfile, skiprows=23)"
139 | ],
140 | "language": "python",
141 | "metadata": {},
142 | "outputs": []
143 | },
144 | {
145 | "cell_type": "code",
146 | "collapsed": false,
147 | "input": [
148 | "df.head(20)"
149 | ],
150 | "language": "python",
151 | "metadata": {},
152 | "outputs": []
153 | },
154 | {
155 | "cell_type": "code",
156 | "collapsed": false,
157 | "input": [
158 | "# Download the file from `url` and save it locally under `file_name`:\n",
159 | "url = 'http://www.celestrak.com/NORAD/elements/gps-ops.txt'\n",
160 | "indir = '/home/ruffin/Documents/Data/in/'\n",
161 | "filename = 'gps-ops.txt'\n",
162 | "getURL(url,indir,filename)\n",
163 | "satlist = loadTLE(indir+filename)\n",
164 | "print(satlist)"
165 | ],
166 | "language": "python",
167 | "metadata": {},
168 | "outputs": [
169 | {
170 | "output_type": "stream",
171 | "stream": "stdout",
172 | "text": [
173 | "32 satellites loaded into list\n",
174 | "[, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , ]\n"
175 | ]
176 | }
177 | ],
178 | "prompt_number": 2
179 | },
180 | {
181 | "cell_type": "code",
182 | "collapsed": false,
183 | "input": [
184 | "line = 'GPS BIIA-10 (PRN 32)'\n",
185 | "[s[:-1] for s in line.split() if s[:-1].isdigit()]"
186 | ],
187 | "language": "python",
188 | "metadata": {},
189 | "outputs": [
190 | {
191 | "output_type": "pyout",
192 | "prompt_number": 1,
193 | "text": [
194 | "['32']"
195 | ]
196 | }
197 | ],
198 | "prompt_number": 1
199 | },
200 | {
201 | "cell_type": "code",
202 | "collapsed": false,
203 | "input": [],
204 | "language": "python",
205 | "metadata": {},
206 | "outputs": [],
207 | "prompt_number": 11
208 | },
209 | {
210 | "cell_type": "code",
211 | "collapsed": false,
212 | "input": [],
213 | "language": "python",
214 | "metadata": {},
215 | "outputs": [],
216 | "prompt_number": 12
217 | },
218 | {
219 | "cell_type": "code",
220 | "collapsed": false,
221 | "input": [
222 | "reference = reference = gp.point.Point(40.441814610, -79.943959327, 250)\n",
223 | "getSatConsts(satlist,['06','32'], datetime(2013,7,28,0,4,42), reference)"
224 | ],
225 | "language": "python",
226 | "metadata": {},
227 | "outputs": [
228 | {
229 | "output_type": "pyout",
230 | "prompt_number": 4,
231 | "text": [
232 | "[['32', -23.591353864913568, 27.200166275159845],\n",
233 | " ['06', -85.32314658352783, 264.7794958553078]]"
234 | ]
235 | }
236 | ],
237 | "prompt_number": 4
238 | },
239 | {
240 | "cell_type": "code",
241 | "collapsed": false,
242 | "input": [],
243 | "language": "python",
244 | "metadata": {},
245 | "outputs": []
246 | }
247 | ],
248 | "metadata": {}
249 | }
250 | ]
251 | }
--------------------------------------------------------------------------------
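The AttributeError recorded in the satellite-track cell above comes from a name collision: file_utils does `from datetime import *`, so the notebook's wildcard imports rebind the name datetime to the class, and datetime.datetime.now() no longer resolves. A small sketch of the fix, with the observer coordinates taken from the notebook (assumes pyephem is installed):

from datetime import datetime
import ephem

oxford = ephem.Observer()
oxford.lat, oxford.long = '40.441814610', '-79.943959327'  # pyephem parses strings as degrees
oxford.date = datetime.now()                               # not datetime.datetime.now()
print(oxford.date)
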
/Code/Untitled1.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "metadata": {
3 | "name": "Untitled1"
4 | },
5 | "nbformat": 3,
6 | "nbformat_minor": 0,
7 | "worksheets": [
8 | {
9 | "cells": [
10 | {
11 | "cell_type": "code",
12 | "collapsed": false,
13 | "input": [
14 | "from matplotlib import pyplot, mpl\n",
15 | "from pandas.lib import Timestamp\n",
16 | "import pandas as pd\n",
17 | "import scipy as sp\n",
18 | "import numpy as np\n",
19 | "import pylab as pl\n",
20 | "import geopy as gp\n",
21 | "import sklearn as sk\n",
22 | "import ephem\n",
23 | "from plot_utils import *\n",
24 | "from file_utils import *\n",
25 | "from rtklib_utils import *\n",
26 | "from ephem_utils import *\n",
27 | "\n",
28 | "from sklearn import svm\n",
29 | "from sklearn.metrics import roc_curve, auc"
30 | ],
31 | "language": "python",
32 | "metadata": {},
33 | "outputs": [
34 | {
35 | "output_type": "stream",
36 | "stream": "stderr",
37 | "text": [
38 | "/usr/lib/python3.3/importlib/_bootstrap.py:313: UserWarning: Module pytz was already imported from /usr/local/lib/python3.3/dist-packages/pytz/__init__.py, but /usr/lib/python3/dist-packages is being added to sys.path\n",
39 | " return f(*args, **kwds)\n",
40 | "/usr/lib/python3.3/importlib/_bootstrap.py:313: UserWarning: Module dateutil was already imported from /usr/local/lib/python3.3/dist-packages/dateutil/__init__.py, but /usr/lib/python3/dist-packages is being added to sys.path\n",
41 | " return f(*args, **kwds)\n"
42 | ]
43 | }
44 | ],
45 | "prompt_number": 1
46 | },
47 | {
48 | "cell_type": "code",
49 | "collapsed": false,
50 | "input": [],
51 | "language": "python",
52 | "metadata": {},
53 | "outputs": [],
54 | "prompt_number": 48
55 | },
56 | {
57 | "cell_type": "code",
58 | "collapsed": false,
59 | "input": [
60 | "indir = '/home/ruffin/Documents/Data/in/'\n",
61 | "filename = 'CMU_center'\n",
62 | "df, header = readObs(indir + filename + '/', filename + '.obs')\n",
63 | "writeObs(indir + filename + '/', filename + '2.obs', df, header)"
64 | ],
65 | "language": "python",
66 | "metadata": {},
67 | "outputs": [
68 | {
69 | "html": [
70 | "\n",
      "(HTML table markup lost in extraction; see the text output below)"
109 | ],
110 | "output_type": "pyout",
111 | "prompt_number": 4,
112 | "text": [
113 | " satData\n",
114 | "%_GPST satID \n",
115 | "2013-07-28 00:04:41.800000 G02 G 2 20798587.222 64245.344 -292....\n",
116 | " S38 S38 38119695.104 334980.3572 -1274....\n",
117 | " G04 G 4 20340108.358 679975.293 -2616....\n",
118 | " G25 G25 23622313.488 -364769.798 1299....\n",
119 | " G05 G 5 22881315.972 -596016.782 2204...."
120 | ]
121 | }
122 | ],
123 | "prompt_number": 4
124 | },
125 | {
126 | "cell_type": "code",
127 | "collapsed": false,
128 | "input": [
129 | "uniq_satID = getSatsList(df)\n",
130 | "print(uniq_satID)"
131 | ],
132 | "language": "python",
133 | "metadata": {},
134 | "outputs": [
135 | {
136 | "output_type": "stream",
137 | "stream": "stdout",
138 | "text": [
139 | "['04', '25', '05', '23', '17', '13', '10', '12', '02']\n"
140 | ]
141 | }
142 | ],
143 | "prompt_number": 4
144 | },
145 | {
146 | "cell_type": "code",
147 | "collapsed": false,
148 | "input": [],
149 | "language": "python",
150 | "metadata": {},
151 | "outputs": [],
152 | "prompt_number": 4
153 | },
154 | {
155 | "cell_type": "code",
156 | "collapsed": false,
157 | "input": [
158 | "noradUrl = 'http://www.celestrak.com/NORAD/elements/gps-ops.txt'\n",
159 | "noradFile = 'gps-ops.txt'"
160 | ],
161 | "language": "python",
162 | "metadata": {},
163 | "outputs": [],
164 | "prompt_number": 5
165 | },
166 | {
167 | "cell_type": "code",
168 | "collapsed": false,
169 | "input": [
170 | "dir = indir\n",
171 | "dfObs = df\n",
172 | "headerObs = header\n",
173 | "\n",
174 | "\n",
175 | "headerObsLines = headerObs.split('\\n')\n",
176 | "for line in headerObsLines:\n",
177 | " if 'APPROX POSITION XYZ' in line:\n",
178 | " list = line.split()\n",
179 | " x = float(list[0])\n",
180 | " y = float(list[1])\n",
181 | " z = float(list[2])\n",
182 | " lat, lon, elv = xyz2plh(x,y,z)\n",
183 | " break\n",
184 | "lat, lon, elv = xyz2plh(x,y,z)\n",
185 | "reference = gp.point.Point(lat,lon,elv)\n",
186 | "date = df.index[0][0]\n",
187 | "satlist = loadTLE(dir + noradFile)\n",
188 | "satObs = getSatsList(dfObs)\n",
189 | "satConsts = getSatConsts(satlist, satObs, date, reference)"
190 | ],
191 | "language": "python",
192 | "metadata": {},
193 | "outputs": [
194 | {
195 | "output_type": "stream",
196 | "stream": "stdout",
197 | "text": [
198 | "04\n",
199 | "10\n",
200 | "13\n",
201 | "23\n",
202 | "02\n",
203 | "17\n",
204 | "12\n",
205 | "05\n",
206 | "25\n"
207 | ]
208 | }
209 | ],
210 | "prompt_number": 6
211 | },
212 | {
213 | "cell_type": "code",
214 | "collapsed": false,
215 | "input": [
216 | "print(satConsts)"
217 | ],
218 | "language": "python",
219 | "metadata": {},
220 | "outputs": [
221 | {
222 | "output_type": "stream",
223 | "stream": "stdout",
224 | "text": [
225 | "[['G04', 64.492722520768083, 39.861305015085406], ['G10', 68.004730659907764, 132.30105109109212], ['G13', 14.311659075501021, 83.831289834824474], ['G23', 12.377180455371452, 56.434697564418812], ['G02', 65.141139699434063, 290.09231352315845], ['G17', 32.983024483126592, 112.96004539950204], ['G12', 50.408466055946171, 280.67913805434364], ['G05', 29.301436407273833, 199.40500957306944], ['G25', 20.05583109200882, 317.15125539194457]]\n"
226 | ]
227 | }
228 | ],
229 | "prompt_number": 7
230 | },
231 | {
232 | "cell_type": "code",
233 | "collapsed": false,
234 | "input": [
235 | "import ephem"
236 | ],
237 | "language": "python",
238 | "metadata": {},
239 | "outputs": [],
240 | "prompt_number": 9
241 | },
242 | {
243 | "cell_type": "code",
244 | "collapsed": false,
245 | "input": [
246 | "def generateData(dir, noradFile):\n",
247 | " folders = findFolders(dir)\n",
248 | " for folder in folders:\n",
249 | " print('reading: ' + dir + folder + '/' + folder + '.obs')\n",
250 | " df, header = readObs(dir + folder + '/', folder + '.obs')\n",
251 | " headerLines = header.split('\\n')\n",
252 | " for line in headerLines:\n",
253 | " if 'APPROX POSITION XYZ' in line:\n",
254 | " list = line.split()\n",
255 | " x = float(list[0])\n",
256 | " y = float(list[1])\n",
257 | " z = float(list[2])\n",
258 | " lat, lon, elv = xyz2plh(x,y,z)\n",
259 | " break\n",
260 | " lat, lon, elv = xyz2plh(x,y,z)\n",
261 | " reference = gp.point.Point(lat,lon,elv)\n",
262 | " date = df.index[0][0]\n",
263 | " satlist = loadTLE(dir + noradFile)\n",
264 | " satObs = getSatsList(df)\n",
265 | " satConsts = getSatConsts(satlist, satObs, date, reference)\n",
266 | " dfs = generateMasks(df, satConsts)\n",
267 | " \n",
268 | " for x, dfx in enumerate(dfs):\n",
269 | " folderx = folder + '.v' + str(x)\n",
270 | " dirx = dir + folderx + '/'\n",
271 | " filepathx = dirx + folderx\t\n",
272 | " filepath = dir + folder + '/' + folder\n",
273 | " checkDir(dirx,'w')\n",
274 | " print(dirx + folderx)\n",
275 | " writeObs(dirx, folderx + '.obs', df, header)\n",
276 | " shutil.copyfile(filepath + '.nav', filepathx + '.nav')\n",
277 | " shutil.copyfile(filepath + '.sbs', filepathx + '.sbs')\n"
278 | ],
279 | "language": "python",
280 | "metadata": {},
281 | "outputs": [],
282 | "prompt_number": 25
283 | },
284 | {
285 | "cell_type": "code",
286 | "collapsed": false,
287 | "input": [
288 | "dir = indir\n",
289 | "generateData(dir, noradFile)"
290 | ],
291 | "language": "python",
292 | "metadata": {},
293 | "outputs": [
294 | {
295 | "output_type": "stream",
296 | "stream": "stdout",
297 | "text": [
298 | "reading: /home/ruffin/Documents/Data/in/CMU_center/CMU_center.obs\n",
299 | "04\n",
300 | "10\n",
301 | "13\n",
302 | "23\n",
303 | "02\n",
304 | "17\n",
305 | "12\n",
306 | "05\n",
307 | "25\n",
308 | "Checking directory: /home/ruffin/Documents/Data/in/CMU_center.v0/\n",
309 | "/home/ruffin/Documents/Data/in/CMU_center.v0/CMU_center.v0\n",
310 | "Checking directory: /home/ruffin/Documents/Data/in/CMU_center.v1/\n",
311 | "/home/ruffin/Documents/Data/in/CMU_center.v1/CMU_center.v1\n",
312 | "Checking directory: /home/ruffin/Documents/Data/in/CMU_center.v2/"
313 | ]
314 | },
315 | {
316 | "output_type": "stream",
317 | "stream": "stdout",
318 | "text": [
319 | "\n",
320 | "/home/ruffin/Documents/Data/in/CMU_center.v2/CMU_center.v2\n"
321 | ]
322 | }
323 | ],
324 | "prompt_number": 26
325 | },
326 | {
327 | "cell_type": "code",
328 | "collapsed": false,
329 | "input": [
330 | "dict = {}"
331 | ],
332 | "language": "python",
333 | "metadata": {},
334 | "outputs": [],
335 | "prompt_number": 3
336 | },
337 | {
338 | "cell_type": "code",
339 | "collapsed": false,
340 | "input": [
341 | "dict['cat'] = 2"
342 | ],
343 | "language": "python",
344 | "metadata": {},
345 | "outputs": [],
346 | "prompt_number": 4
347 | },
348 | {
349 | "cell_type": "code",
350 | "collapsed": false,
351 | "input": [
352 | "dict['cat']"
353 | ],
354 | "language": "python",
355 | "metadata": {},
356 | "outputs": [
357 | {
358 | "output_type": "pyout",
359 | "prompt_number": 5,
360 | "text": [
361 | "2"
362 | ]
363 | }
364 | ],
365 | "prompt_number": 5
366 | },
367 | {
368 | "cell_type": "code",
369 | "collapsed": false,
370 | "input": [],
371 | "language": "python",
372 | "metadata": {},
373 | "outputs": []
374 | }
375 | ],
376 | "metadata": {}
377 | }
378 | ]
379 | }
--------------------------------------------------------------------------------
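A usage sketch of the observation round trip exercised above: read a RINEX obs file with readObs, inspect the satellite-indexed frame, and write it back out with writeObs. The CMU_center path is the notebook's example and exists only on the author's machine.

from Utils.rtklib_utils import readObs, writeObs

indir = '/home/ruffin/Documents/Data/in/'
name = 'CMU_center'
df, header = readObs(indir + name + '/', name + '.obs')    # multi-indexed by (%_GPST, satID)
print(df.head())
writeObs(indir + name + '/', name + '2.obs', df, header)   # re-serialize the observations
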
/Code/Utils/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ruffsl/RTKLIB-Tools/9c2fd3e149efc23c4252650085cacae54bb104f1/Code/Utils/__init__.py
--------------------------------------------------------------------------------
/Code/Utils/__init__.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ruffsl/RTKLIB-Tools/9c2fd3e149efc23c4252650085cacae54bb104f1/Code/Utils/__init__.pyc
--------------------------------------------------------------------------------
/Code/Utils/__pycache__/__init__.cpython-33.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ruffsl/RTKLIB-Tools/9c2fd3e149efc23c4252650085cacae54bb104f1/Code/Utils/__pycache__/__init__.cpython-33.pyc
--------------------------------------------------------------------------------
/Code/Utils/__pycache__/ephem_utils.cpython-33.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ruffsl/RTKLIB-Tools/9c2fd3e149efc23c4252650085cacae54bb104f1/Code/Utils/__pycache__/ephem_utils.cpython-33.pyc
--------------------------------------------------------------------------------
/Code/Utils/__pycache__/file_utils.cpython-33.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ruffsl/RTKLIB-Tools/9c2fd3e149efc23c4252650085cacae54bb104f1/Code/Utils/__pycache__/file_utils.cpython-33.pyc
--------------------------------------------------------------------------------
/Code/Utils/__pycache__/gpstime.cpython-33.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ruffsl/RTKLIB-Tools/9c2fd3e149efc23c4252650085cacae54bb104f1/Code/Utils/__pycache__/gpstime.cpython-33.pyc
--------------------------------------------------------------------------------
/Code/Utils/__pycache__/rtklib_utils.cpython-33.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ruffsl/RTKLIB-Tools/9c2fd3e149efc23c4252650085cacae54bb104f1/Code/Utils/__pycache__/rtklib_utils.cpython-33.pyc
--------------------------------------------------------------------------------
/Code/Utils/ephem_utils.py:
--------------------------------------------------------------------------------
1 | '''
2 | Created on Aug 2, 2013
3 |
4 | @author: ruffin
5 | '''
6 | import pandas as pd
7 | import scipy as sp
8 | import numpy as np
9 | import geopy as gp
10 | import ephem
11 | import urllib.request
12 | import shutil
13 | from Utils.file_utils import *
14 |
15 | def loadTLE(filename):
16 | """ Loads a TLE file and creates a list of satellites."""
17 | f = open(filename)
18 | satlist = []
19 | l1 = f.readline()
20 | while l1:
21 | l1 = [s[:-1] for s in l1.split() if s[:-1].isdigit()][0]
22 | l2 = f.readline()
23 | l3 = f.readline()
24 | sat = ephem.readtle(l1,l2,l3)
25 | satlist.append(sat)
26 | l1 = f.readline()
27 |
28 | f.close()
29 | #print("%i satellites loaded into list"%len(satlist))
30 | return(satlist)
31 |
32 | def getSatConsts(satlist, satObs, date, reference):
33 | constellations = []
34 | for sat in satlist:
35 | if sat.name in satObs:
36 | #print(sat.name)
37 | constellation = getSatConst(sat, date, reference)
38 | constellations.append(constellation)
39 | return(constellations)
40 |
41 | def getSatConst(sat, date, reference):
42 | observer = ephem.Observer()
43 | observer.lat = np.deg2rad(reference.latitude)
44 | observer.long = np.deg2rad(reference.longitude)
45 | observer.date = date
46 | sat.compute(observer)
47 | sat_alt = np.rad2deg(sat.alt)
48 | sat_az = np.rad2deg(sat.az)
49 | constellation = ['G' + sat.name, sat_alt, sat_az]
50 | return(constellation)
--------------------------------------------------------------------------------
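A usage sketch for the helpers above, mirroring the notebook cells: load the Celestrak GPS TLE set and compute altitude/azimuth for two PRNs as seen from a fixed reference point. The local gps-ops.txt path is an assumption; the file can be fetched with file_utils.getURL as shown in the notebooks.

from datetime import datetime
import geopy as gp
from Utils.ephem_utils import loadTLE, getSatConsts

satlist = loadTLE('/home/ruffin/Documents/Data/in/gps-ops.txt')   # TLEs from celestrak.com
reference = gp.point.Point(40.441814610, -79.943959327, 250)
consts = getSatConsts(satlist, ['06', '32'], datetime(2013, 7, 28, 0, 4, 42), reference)
print(consts)   # e.g. [['G32', alt_deg, az_deg], ['G06', alt_deg, az_deg]]
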
/Code/Utils/file_utils.py:
--------------------------------------------------------------------------------
1 | '''
2 | Created on Jul 1, 2013
3 |
4 | @author: ruffin
5 | '''
6 |
7 | import os
8 | from pylab import *
9 | from datetime import *
10 | import numpy as np
11 | import ftplib
12 | from ftplib import FTP
13 | import urllib.request
14 | import shutil
15 |
16 | def checkDir(dir,option):
17 | """Check Directory"""
18 | if not dir.endswith('/'):
19 | dir += '/'
20 | if option == 'w':
21 | if not os.path.exists(dir):
22 | os.makedirs(dir)
23 | #print('Checking directory: ' + dir)
24 | return dir
25 |
26 | def findFile(dir,extension):
27 | file = 0
28 | os.chdir(dir)
29 | for files in os.listdir("."):
30 | if files.endswith(extension):
31 | file = files
32 | return file
33 |
34 | def findFiles(dir,extension):
35 | os.chdir(dir)
36 | allFiles = os.listdir(".")
37 | foundFiles = []
38 | for file in allFiles:
39 | if file.endswith(extension):
40 | foundFiles.append(file)
41 | return foundFiles
42 |
43 | def findFolders(dir):
44 | os.chdir(dir)
45 | foundFolders = []
46 | allFiles = os.listdir(".")
47 | for file in allFiles:
48 | if os.path.isdir(file):
49 | foundFolders.append(file)
50 | return foundFolders
51 |
52 |
53 | def parsePosFile(posFile):
54 | """Parses pos file for data"""
55 | print('Parsing file: ' + posFile)
56 | data = np.zeros([1,14])
57 | with open(posFile, 'r') as f:
58 | for line in f:
59 | if(line[0]!='%'):
60 | tdate = datetime.strptime(line[:23], '%Y/%m/%d %H:%M:%S.%f')
61 | tdate = datetime.timestamp(tdate)
62 | lon = float(line[25:38])
63 | lat = float(line[40:53])
64 | elv = float(line[55:64])
65 | q = int(line[66:68])
66 | ns = int(line[70:72])
67 | sdn = float(line[74:81])
68 | sde = float(line[83:90])
69 | sdu = float(line[92:99])
70 | sdne = float(line[101:108])
71 | sdeu = float(line[110:117])
72 | sdun = float(line[119:126])
73 | age = float(line[127:133])
74 | ratio = float(line[135:])
75 | temp = np.array([tdate, lon, lat, elv, q, ns, sdn, sde, sdu, sdne, sdeu, sdun, age, ratio])
76 | data = vstack((data, temp))
77 | f.closed
78 | return data
79 |
80 | # def parseObsFile(obsFile):
81 | # """Parses RINEX v3.04 obs file for data"""
82 | # print('Parsing file: ' + obsFile)
83 | # df = pd.Da
84 | # data = np.zeros([1,14])
85 | # with open(obsFile, 'r') as file:
86 | # for line in file:
87 | # if(line[0]!='%'):
88 | # f.closed
89 | # return data
90 |
91 | def fetchFiles(ftp, path, dir, key=None):
92 | try:
93 | ftp.cwd(path)
94 | os.chdir(dir)
95 | if key == None:
96 | list = ftp.nlst()
97 | else:
98 | list = ftp.nlst(key)
99 | for filename in list:
100 | haveFile =False
101 | for file in os.listdir("."):
102 | if file in filename:
103 | haveFile = True
104 | if haveFile:
105 | print('Found local\n' + filename, end = '\n\n')
106 | else:
107 | print('Downloading\n' + filename, end = '\n\n')
108 | fhandle = open(os.path.join(dir, filename), 'wb')
109 | ftp.retrbinary('RETR ' + filename, fhandle.write)
110 | fhandle.close()
111 | return False
112 | except ftplib.error_perm:
113 | return True
114 |
115 | def getURL(url,dir,filename):
116 | # Download the file from `url` and save it locally under `dir/file_name`:
117 | #filename = dir+url.split('/')[-1]
118 | with urllib.request.urlopen(url) as response, open(dir + filename, 'wb') as out_file:
119 | shutil.copyfileobj(response, out_file)
120 |
121 |
--------------------------------------------------------------------------------
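A usage sketch for the download helpers above: create the data directory if needed, fetch the GPS TLE file used by the notebooks, and list what landed there. The URL is taken from the notebooks; the local path is illustrative.

from Utils.file_utils import checkDir, getURL, findFiles

indir = checkDir('/home/ruffin/Documents/Data/in', 'w')    # 'w' creates the directory if missing
getURL('http://www.celestrak.com/NORAD/elements/gps-ops.txt', indir, 'gps-ops.txt')
print(findFiles(indir, '.txt'))
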
/Code/Utils/gpstime.py:
--------------------------------------------------------------------------------
1 | """
2 | A Python implementation of GPS related time conversions.
3 |
4 | Copyright 2002 by Bud P. Bruegger, Sistema, Italy
5 | mailto:bud@sistema.it
6 | http://www.sistema.it
7 |
8 | Modifications for GPS seconds by Duncan Brown
9 |
10 | PyUTCFromGpsSeconds added by Ben Johnson
11 |
12 | This program is free software; you can redistribute it and/or modify it under
13 | the terms of the GNU Lesser General Public License as published by the Free
14 | Software Foundation; either version 2 of the License, or (at your option) any
15 | later version.
16 |
17 | This program is distributed in the hope that it will be useful, but WITHOUT ANY
18 | WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
19 | PARTICULAR PURPOSE. See the GNU Lesser General Public License for more
20 | details.
21 |
22 | You should have received a copy of the GNU Lesser General Public License along
23 | with this program; if not, write to the Free Software Foundation, Inc., 59
24 | Temple Place, Suite 330, Boston, MA 02111-1307 USA
25 |
26 | GPS Time Utility functions
27 |
28 | This file contains a Python implementation of GPS related time conversions.
29 |
30 | The two main functions convert between UTC and GPS time (GPS-week, time of
31 | week in seconds, GPS-day, time of day in seconds). The other functions are
32 | convenience wrappers around these base functions.
33 |
34 | A good reference for GPS time issues is:
35 | http://www.oc.nps.navy.mil/~jclynch/timsys.html
36 |
37 | Note that python time types are represented in seconds since (a platform
38 | dependent Python) Epoch. This makes the implementation quite straightforward
39 | as compared to some algorithms found in the literature and on the web.
40 | """
41 |
42 | __author__ = 'Duncan Brown '
43 | __date__ = '$Date: 2006/02/16 04:36:09 $'
44 | __version__ = '$Revision: 1.6 $'[11:-2]
45 | # $Source: /usr/local/cvs/lscsoft/glue/glue/gpstime.py,v $
46 |
47 | import time, math
48 |
49 | secsInWeek = 604800
50 | secsInDay = 86400
51 | gpsEpoch = (1980, 1, 6, 0, 0, 0) # (year, month, day, hh, mm, ss)
52 |
53 | def dayOfWeek(year, month, day):
54 | "returns day of week: 0=Sun, 1=Mon, .., 6=Sat"
55 |     hr = 12  #make sure you fall into right day, middle is safe
56 | t = time.mktime((year, month, day, hr, 0, 0, 0, 0, -1))
57 | pyDow = time.localtime(t)[6]
58 | gpsDow = (pyDow + 1) % 7
59 | return gpsDow
60 |
61 | def gpsWeek(year, month, day):
62 | "returns (full) gpsWeek for given date (in UTC)"
63 |     hr = 12  #make sure you fall into right day, middle is safe
64 | return gpsFromUTC(year, month, day, hr, 0, 0)[0]
65 |
66 |
67 | def julianDay(year, month, day):
68 | "returns julian day=day since Jan 1 of year"
69 |     hr = 12  #make sure you fall into right day, middle is safe
70 | t = time.mktime((year, month, day, hr, 0, 0, 0, 0, -1))
71 | julDay = time.localtime(t)[7]
72 | return julDay
73 |
74 | def mkUTC(year, month, day, hour, min, sec):
75 | "similar to python's mktime but for utc"
76 | spec = (year, month, day, hour, min, int(sec), 0, 0, 0)
77 | utc = time.mktime(spec) - time.timezone
78 | return utc
79 |
80 | def ymdhmsFromPyUTC(pyUTC):
81 | "returns tuple from a python time value in UTC"
82 | ymdhmsXXX = time.gmtime(pyUTC)
83 | return ymdhmsXXX[:-3]
84 |
85 | # def wtFromUTCpy(pyUTC, leapSecs=14):
86 | # """convenience function:
87 | # allows to use python UTC times and
88 | # returns only week and tow"""
89 | # ymdhms = ymdhmsFromPyUTC(pyUTC)
90 | # wSowDSoD = apply(gpsFromUTC, ymdhms + (leapSecs,))
91 | # return wSowDSoD[0:2]
92 |
93 | def gpsFromUTC(year, month, day, hour, min, sec, leapSecs=0):
94 | # leapSecs=14 from UTC, but I am going from GPS to GPS time
95 | """converts UTC to: gpsWeek, secsOfWeek, gpsDay, secsOfDay
96 |
97 | a good reference is: http://www.oc.nps.navy.mil/~jclynch/timsys.html
98 |
99 | This is based on the following facts (see reference above):
100 |
101 | GPS time is basically measured in (atomic) seconds since
102 | January 6, 1980, 00:00:00.0 (the GPS Epoch)
103 |
104 | The GPS week starts on Saturday midnight (Sunday morning), and runs
105 | for 604800 seconds.
106 |
107 | Currently, GPS time is 13 seconds ahead of UTC (see above reference).
108 | While GPS SVs transmit this difference and the date when another leap
109 | second takes effect, the use of leap seconds cannot be predicted. This
110 | routine is precise until the next leap second is introduced and has to be
111 | updated after that.
112 |
113 | SOW = Seconds of Week
114 | SOD = Seconds of Day
115 |
116 | Note: Python represents time in integer seconds, fractions are lost!!!
117 | """
118 | secFract = sec % 1
119 | epochTuple = gpsEpoch + (-1, -1, 0)
120 | t0 = time.mktime(epochTuple)
121 | t = time.mktime((year, month, day, hour, min, int(sec), -1, -1, 0))
122 | # Note: time.mktime strictly works in localtime and to yield UTC, it should be
123 | # corrected with time.timezone
124 | # However, since we use the difference, this correction is unnecessary.
125 | # Warning: trouble if daylight savings flag is set to -1 or 1 !!!
126 | t = t + leapSecs
127 | tdiff = t - t0
128 | gpsSOW = (tdiff % secsInWeek) + secFract
129 | gpsWeek = int(math.floor(tdiff/secsInWeek))
130 | gpsDay = int(math.floor(gpsSOW/secsInDay))
131 | gpsSOD = (gpsSOW % secsInDay)
132 | return (gpsWeek, gpsSOW, gpsDay, gpsSOD)
133 |
134 |
135 | def UTCFromGps(gpsWeek, SOW, leapSecs=14):
136 | """converts gps week and seconds to UTC
137 |
138 | see comments of inverse function!
139 |
140 | SOW = seconds of week
141 | gpsWeek is the full number (not modulo 1024)
142 | """
143 | secFract = SOW % 1
144 | epochTuple = gpsEpoch + (-1, -1, 0)
145 | t0 = time.mktime(epochTuple) - time.timezone #mktime is localtime, correct for UTC
146 | tdiff = (gpsWeek * secsInWeek) + SOW - leapSecs
147 | t = t0 + tdiff
148 | (year, month, day, hh, mm, ss, dayOfWeek, julianDay, daylightsaving) = time.gmtime(t)
149 |     #use gmtime since localtime does not allow switching off the daylight savings correction!!!
150 | return (year, month, day, hh, mm, ss + secFract)
151 |
152 | def GpsSecondsFromPyUTC( pyUTC, leapSecs=14 ):
153 | """converts the python epoch to gps seconds
154 |
155 | pyEpoch = the python epoch from time.time()
156 | """
157 |     t = gpsFromUTC(*ymdhmsFromPyUTC(pyUTC))
158 | return int(t[0] * 60 * 60 * 24 * 7 + t[1])
159 |
160 | def PyUTCFromGpsSeconds(gpsseconds, leapSecs=14):
161 |     """converts gps seconds to the
162 |     python epoch. That is, the time
163 |     that would be returned from time.time()
164 |     at gpsseconds."""
165 |     week, sow = divmod(gpsseconds, secsInWeek)
166 |     return mkUTC(*UTCFromGps(int(week), sow, leapSecs))
167 |
168 | #===== Tests =========================================
169 |
170 | def testTimeStuff():
171 | print("-"*20)
172 | print()
173 | print( "The GPS Epoch when everything began (1980, 1, 6, 0, 0, 0, leapSecs=0)")
174 | (w, sow, d, sod) = gpsFromUTC(1980, 1, 6, 0, 0, 0, leapSecs=0)
175 | print( "**** week: %s, sow: %s, day: %s, sod: %s" % (w, sow, d, sod))
176 | print( " and hopefully back:")
177 | print( "**** %s, %s, %s, %s, %s, %s\n" % UTCFromGps(w, sow, leapSecs=0))
178 |
179 | print( "The time of first Rollover of GPS week (1999, 8, 21, 23, 59, 47)")
180 | (w, sow, d, sod) = gpsFromUTC(1999, 8, 21, 23, 59, 47)
181 | print( "**** week: %s, sow: %s, day: %s, sod: %s" % (w, sow, d, sod))
182 | print( " and hopefully back:")
183 | print( "**** %s, %s, %s, %s, %s, %s\n" % UTCFromGps(w, sow, leapSecs=14))
184 |
185 | print( "Today is GPS week 1186, day 3, seems to run ok (2002, 10, 2, 12, 6, 13.56)")
186 | (w, sow, d, sod) = gpsFromUTC(2002, 10, 2, 12, 6, 13.56)
187 | print( "**** week: %s, sow: %s, day: %s, sod: %s" % (w, sow, d, sod))
188 | print( " and hopefully back:")
189 | print( "**** %s, %s, %s, %s, %s, %s\n" % UTCFromGps(w, sow))
190 |
191 | def testJulD():
192 | print( '2002, 10, 11 -> 284 ==??== ', julianDay(2002, 10, 11))
193 |
194 | def testGpsWeek():
195 | print( '2002, 10, 11 -> 1187 ==??== ', gpsWeek(2002, 10, 11))
196 |
197 | def testDayOfWeek():
198 | print( '2002, 10, 12 -> 6 ==??== ', dayOfWeek(2002, 10, 12))
199 | print( '2002, 10, 6 -> 0 ==??== ', dayOfWeek(2002, 10, 6))
200 |
201 | # def testPyUtilties():
202 | # ymdhms = (2002, 10, 12, 8, 34, 12.3)
203 | # print( "testing for: ", ymdhms)
204 | # #pyUtc = apply(mkUTC, ymdhms)
205 | # pyUtc = mkUTC(2002, 10, 12, 8, 34, 12.3)
206 | # back = ymdhmsFromPyUTC(pyUtc)
207 | # print( "yields : ", back)
208 | # #*********************** !!!!!!!!
209 | # #assert(ymdhms == back)
210 | # #! TODO: this works only with int seconds!!! fix!!!
211 | # (w, t) = wtFromUTCpy(pyUtc)
212 | # print( "week and time: ", (w,t))
213 |
214 |
215 | #===== Main =========================================
216 | if __name__ == "__main__":
217 | pass
218 | testTimeStuff()
219 | testGpsWeek()
220 | testJulD()
221 | testDayOfWeek()
222 | # testPyUtilties()
--------------------------------------------------------------------------------
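A round-trip usage sketch for the conversions above, following the module's own tests; with leapSecs=0 the pair of calls should reproduce the input date, subject to the mktime and daylight-savings caveats noted in the source.

from Utils.gpstime import gpsFromUTC, UTCFromGps

week, sow, day, sod = gpsFromUTC(2013, 7, 28, 0, 4, 42, leapSecs=0)
print("week: %s, sow: %s, day: %s, sod: %s" % (week, sow, day, sod))
print(UTCFromGps(week, sow, leapSecs=0))   # expect approximately (2013, 7, 28, 0, 4, 42.0)
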
/Code/Utils/plot_utils.py:
--------------------------------------------------------------------------------
1 | '''
2 | Created on Jul 11, 2013
3 |
4 | @author: ruffin
5 | '''
6 |
7 | from matplotlib import pyplot, mpl
8 | import pandas as pd
9 | import scipy as sp
10 | import numpy as np
11 | import pylab as pl
12 | from file_utils import *
13 |
14 |
15 |
16 | def to_percent(y, position):
17 | # Ignore the passed in position. This has the effect of scaling the default
18 | # tick locations.
19 | s = str(100 * y)
20 |
21 | # The percent symbol needs escaping in latex
22 | if mpl.rcParams['text.usetex'] == True:
23 | return s + r'$\%$'
24 | else:
25 | return s + '%'
26 |
27 | def neuTFPNplot(dff_test, y_, fig, tittle = 'Labeled Solutions', radius = 0.3 , aspect='equal'):
28 | dff_test['y_'] = y_
29 |
30 | dff_test['TFPN'] = ''
31 | for i in dff_test.index:
32 | yp = dff_test['y'][i]
33 | y_ = dff_test['y_'][i]
34 | s = ''
35 | if ~ (yp ^ y_):
36 | s += 'T'
37 | if y_:
38 | s += 'P'
39 | else:
40 | s += 'N'
41 | else:
42 | s += 'F'
43 | if y_:
44 | s += 'P'
45 | else:
46 | s += 'N'
47 |
48 | dff_test['TFPN'][i] = s
49 |
50 | dff_test_TP = dff_test[dff_test['TFPN'].str.contains("TP")]
51 | dff_test_TN = dff_test[dff_test['TFPN'].str.contains("TN")]
52 | dff_test_FP = dff_test[dff_test['TFPN'].str.contains("FP")]
53 | dff_test_FN = dff_test[dff_test['TFPN'].str.contains("FN")]
54 |
55 | print('Generating Fig')
56 | #fig = pyplot.figure(figsize=(20,18), dpi=200)
57 | fig.suptitle(tittle, fontsize=14, fontweight='bold')
58 | #ax = fig.add_subplot(111)
59 | ax = fig.add_subplot(1,3,1,aspect=aspect)
60 | p = ax.scatter(dff_test_TP['diste(m)'],dff_test_TP['distn(m)'], label= 'TP (Green) ', alpha=.1, c='green')
61 | p = ax.scatter(dff_test_TN['diste(m)'],dff_test_TN['distn(m)'], label= 'TN (Blue) ', alpha=.1, c='blue')
62 | p = ax.scatter(dff_test_FP['diste(m)'],dff_test_FP['distn(m)'], label= 'FP (Red) ', alpha=.1, c='red')
63 | p = ax.scatter(dff_test_FN['diste(m)'],dff_test_FN['distn(m)'], label= 'FN (Orange)', alpha=.1, c='orange')
64 |     circle = Circle((0,0), radius=radius, color='gray', fill=False)
65 | fig.gca().add_artist(circle)
66 | pl.legend(loc="lower right")
67 | #xlim(distnminlim, distnmaxlim)
68 | #ylim(disteminlim, distemaxlim)
69 | ax.set_xlabel('East (meters)')
70 | ax.set_ylabel('North (meters)')
71 | ax.grid(True)
72 |
73 | ax = fig.add_subplot(1,3,3,aspect=aspect)
74 | p = ax.scatter(dff_test_TP['distn(m)'],dff_test_TP['distu(m)'], label= 'TP (Green) ', alpha=.1, c='green')
75 | p = ax.scatter(dff_test_TN['distn(m)'],dff_test_TN['distu(m)'], label= 'TN (Blue) ', alpha=.1, c='blue')
76 | p = ax.scatter(dff_test_FP['distn(m)'],dff_test_FP['distu(m)'], label= 'FP (Red) ', alpha=.1, c='red')
77 | p = ax.scatter(dff_test_FN['distn(m)'],dff_test_FN['distu(m)'], label= 'FN (Orange)', alpha=.1, c='orange')
78 | circle = Circle((0,0), radius=radius, color='gray', fill=False)
79 | fig.gca().add_artist(circle)
80 | pl.legend(loc="lower right")
81 | #xlim(distnminlim, distnmaxlim)
82 | #ylim(disteminlim, distemaxlim)
83 | ax.set_xlabel('North (meters)')
84 | ax.set_ylabel('Up (meters)')
85 | ax.grid(True)
86 |
87 | ax = fig.add_subplot(1,3,2,aspect=aspect)
88 | p = ax.scatter(dff_test_TP['diste(m)'],dff_test_TP['distu(m)'], label= 'TP (Green) ', alpha=.1, c='green')
89 | p = ax.scatter(dff_test_TN['diste(m)'],dff_test_TN['distu(m)'], label= 'TN (Blue) ', alpha=.1, c='blue')
90 | p = ax.scatter(dff_test_FP['diste(m)'],dff_test_FP['distu(m)'], label= 'FP (Red) ', alpha=.1, c='red')
91 | p = ax.scatter(dff_test_FN['diste(m)'],dff_test_FN['distu(m)'], label= 'FN (Orange)', alpha=.1, c='orange')
92 | circle = Circle((0,0), radius=radius, color='gray', fill=False)
93 | fig.gca().add_artist(circle)
94 | pl.legend(loc="lower right")
95 | #xlim(distnminlim, distnmaxlim)
96 | #ylim(disteminlim, distemaxlim)
97 | ax.set_xlabel('East (meters)')
98 | ax.set_ylabel('Up (meters)')
99 | ax.grid(True)
100 |
101 | def neTFPNplot(dff_test, y_, fig, tittle = 'Labeled Solutions', radius = 0.3, aspect='equal'):
102 | dff_test['y_'] = y_
103 |
104 | dff_test['TFPN'] = ''
105 | for i in dff_test.index:
106 | yp = dff_test['y'][i]
107 | y_ = dff_test['y_'][i]
108 | s = ''
109 | if ~ (yp ^ y_):
110 | s += 'T'
111 | if y_:
112 | s += 'P'
113 | else:
114 | s += 'N'
115 | else:
116 | s += 'F'
117 | if y_:
118 | s += 'P'
119 | else:
120 | s += 'N'
121 |
122 | dff_test['TFPN'][i] = s
123 |
124 | dff_test_TP = dff_test[dff_test['TFPN'].str.contains("TP")]
125 | dff_test_TN = dff_test[dff_test['TFPN'].str.contains("TN")]
126 | dff_test_FP = dff_test[dff_test['TFPN'].str.contains("FP")]
127 | dff_test_FN = dff_test[dff_test['TFPN'].str.contains("FN")]
128 |
129 | print('Generating Fig')
130 | #fig = pyplot.figure(figsize=(20,18), dpi=200)
131 | fig.suptitle(tittle, fontsize=14, fontweight='bold')
132 | #ax = fig.add_subplot(111)
133 | ax = fig.add_subplot(1,1,1,aspect=aspect)
134 | p = ax.scatter(dff_test_TP['diste(m)'],dff_test_TP['distn(m)'], label= 'TP (Green) ', alpha=.1, c='green')
135 | p = ax.scatter(dff_test_TN['diste(m)'],dff_test_TN['distn(m)'], label= 'TN (Blue) ', alpha=.1, c='blue')
136 | p = ax.scatter(dff_test_FP['diste(m)'],dff_test_FP['distn(m)'], label= 'FP (Red) ', alpha=.1, c='red')
137 | p = ax.scatter(dff_test_FN['diste(m)'],dff_test_FN['distn(m)'], label= 'FN (Orange)', alpha=.1, c='orange')
138 |     circle = Circle((0,0), radius=radius, color='gray', fill=False)
139 | fig.gca().add_artist(circle)
140 | pl.legend(loc="lower right")
141 | #xlim(distnminlim, distnmaxlim)
142 | #ylim(disteminlim, distemaxlim)
143 | ax.set_xlabel('East (meters)')
144 | ax.set_ylabel('North (meters)')
145 | ax.grid(True)
--------------------------------------------------------------------------------
/Code/Utils/plot_utils.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ruffsl/RTKLIB-Tools/9c2fd3e149efc23c4252650085cacae54bb104f1/Code/Utils/plot_utils.pyc
--------------------------------------------------------------------------------
/Code/Utils/rtklib_utils.py:
--------------------------------------------------------------------------------
1 | '''
2 | Created on Jul 26, 2013
3 |
4 | @author: ruffin
5 | '''
6 | import os, sys
7 | import shutil
8 | import subprocess
9 | import locale
10 | from ftplib import FTP
11 | from Utils.ephem_utils import *
12 | from Utils.file_utils import *
13 | from Utils.gpstime import *
14 |
15 | from pandas.lib import Timestamp
16 | import pandas as pd
17 | import numpy as np
18 | import geopy as gp
19 | from geopy import distance
20 |
21 |
22 |
23 | #------------------------------------------------------------------------------
24 | # Convert ubx files
25 | #------------------------------------------------------------------------------
26 | def convbinFile(dir, file, convbin):
27 | os.chdir(dir)
28 | filename = os.path.splitext(file)[0]
29 | if (os.path.isdir(filename) == 0):
30 | os.mkdir(filename)
31 | #shutil.move(file, dir + filename)
32 | shutil.copyfile(file,dir + filename + '/' + file)
33 | os.chdir(dir + filename)
34 | command = ([convbin, dir + filename + '/' + file, '-v', '3.02', '-od', '-os', '-oi', '-ot', '-ol'])
35 | print('Running ')
36 | print(' '.join(command))
37 | subprocess.check_output(command)
38 |
39 | #------------------------------------------------------------------------------
40 | # Run rnx2rtkp to compute static and kinematic position solutions
41 | #------------------------------------------------------------------------------
42 | def rnx2rtkpFile(indir, filename, outdir, station, rnx2rtkp):
43 | #filename = os.path.splitext(file)[0]
44 | #filename = file
45 | os.chdir(indir + filename)
46 | obsfile = findFile(indir + filename,".obs")
47 | command = ['grep', 'TIME OF FIRST OBS', indir + filename + '/' + obsfile]
48 | ymdhms = subprocess.check_output(command).decode(locale.getdefaultlocale()[1])
49 | tdate = datetime.strptime(ymdhms[:42], ' %Y %m %d %H %M %S.%f')
50 | gpsweek = gpsWeek(tdate.year, tdate.month, tdate.day)
51 | gpsweekday = dayOfWeek(tdate.year, tdate.month, tdate.day)
52 | julianday = julianDay(tdate.year, tdate.month, tdate.day)
53 |
54 | sp3file = str(gpsweek) + str(gpsweekday) +'.sp3'
55 |
56 | os.chdir(indir)
57 | for file in os.listdir("."):
58 | if file.endswith(sp3file):
59 | sp3file = file
60 | if file.endswith(sp3file):
61 | sp3file = file
62 | if file.endswith('rtkoptions_static.conf'):
63 | staticConf = file
64 | if file.endswith('rtkoptions_kinetic.conf'):
65 | kineticConf = file
66 |
67 |
68 | navfilePath = indir + filename + '/' + filename + '.nav'
69 | obsfilePath = indir + filename + '/' + filename + '.obs'
70 | sbsfilePath = indir + filename + '/' + filename + '.sbs'
71 | o13filePath = indir + station + str(julianday) + '0.13o'
72 | sp3filePath = indir + sp3file
73 | staticConfPath = indir + staticConf
74 | kineticConfPath = indir + kineticConf
75 | staticPosPath = outdir + filename + '/' + filename + '_static.pos'
76 | kineticPosPath = outdir + filename + '/' + filename + '_kinetic.pos'
77 |
78 | if('.v' in filename):
79 | originalFile = filename.split('.v')[0]
80 | originalstaticPosPath = outdir + originalFile + '/' + originalFile + '_static.pos'
81 | checkDir(staticPosPath[:-11],'w')
82 | shutil.copyfile(originalstaticPosPath, staticPosPath)
83 | print('Copy static from original')
84 | else:
85 | command0 = ([rnx2rtkp,'-k', staticConfPath,'-o', staticPosPath, obsfilePath, o13filePath, navfilePath, sp3filePath, sbsfilePath])
86 | print('\nRunning ')
87 | print(' '.join(command0))
88 | subprocess.check_output(command0)
89 |
90 | command1 = ([rnx2rtkp,'-k', kineticConfPath,'-o', kineticPosPath, obsfilePath, o13filePath, navfilePath, sp3filePath, sbsfilePath])
91 | print('\nRunning ')
92 | print(' '.join(command1))
93 | subprocess.check_output(command1)
94 |
95 |
96 | #------------------------------------------------------------------------------
98 | # Fetch base station observations and precise ephemerides from the FTP server
98 | #------------------------------------------------------------------------------
99 | def fetchData(dir, file, server, hostPath, station):
100 | # Extract time stamp from log files
101 | filename = os.path.splitext(file)[0]
102 | os.chdir(dir + filename)
103 | obsfile = findFile(dir + filename,".obs")
104 |
105 | # Parsing time from data
106 | command = ['grep', 'TIME OF FIRST OBS', dir + filename + '/' + obsfile]
107 | #print('Running ')
108 | #print(' '.join(command))
109 | ymdhms = subprocess.check_output(command).decode(locale.getdefaultlocale()[1])
110 | tdate = datetime.strptime(ymdhms[:42], ' %Y %m %d %H %M %S.%f')
111 | tnow = datetime.now()
112 | #print('Recorded Date')
113 | #print(tdate, end='\n\n')
114 | #print('Current Date')
115 | #print(tnow, end='\n\n')
116 | dt = tnow - tdate
117 |     #print('Date Difference')
118 | #print(dt, end='\n\n')
119 |
120 | # Get files from FTP server
121 | corfile = station + tdate.strftime("%j0.%y") + 'o.gz'
122 | navfile = station + tdate.strftime("%j0.%y") + 'd.Z'
123 | hostPath = hostPath + tdate.strftime("%Y/%j/")
124 |
125 | ftp = FTP(server)
126 | ftp.login()
127 |
128 | #print('Fetching updated ephemerides')
129 | if fetchFiles(ftp, hostPath, dir, 'igs*'):
130 | print('No IGS file yet')
131 | if fetchFiles(ftp, hostPath, dir, 'igr*'):
132 | print('No IGR file yet')
133 | if fetchFiles(ftp, hostPath, dir, 'igu*'):
134 | print('Not even an IGU file yet')
135 |         print('Have a little patience!')
136 | hostPath = hostPath + station
137 | #print('FTP Current Working Directory\n' + hostPath, end='\n\n')
138 | #print('Fetching station broadcasts')
139 | if fetchFiles(ftp, hostPath, dir):
140 | print('No data files yet')
141 |
142 | ftp.quit()
143 |
144 | def buildDataFrame(dir, folder):
145 | staticPosFile = findFile(dir + folder,'_static.pos')
146 | kineticPosFile = findFile(dir + folder,'_kinetic.pos')
147 |
148 | skiprow = 0
149 | with open(staticPosFile) as search:
150 | for i, line in enumerate(search):
151 | if "% GPST" in line:
152 | skiprow = i
153 | break
154 | dff = pd.read_csv(staticPosFile, skiprows=skiprow, delim_whitespace=True, parse_dates=[[0, 1]])
155 | qmin = dff['Q'].min()
156 | print('qmin:', qmin)
157 | qmins = dff['Q'] == qmin
158 | print('qmins:',len(qmins))
159 | if (len(qmins) > 1):
160 | dff = dff[qmins]
161 | reference = gp.point.Point(dff['latitude(deg)'].mean(), dff['longitude(deg)'].mean(), dff['height(m)'].mean())
162 | print(reference.latitude, reference.longitude, reference.altitude)
163 |
164 | skiprow = 0
165 | with open(kineticPosFile) as search:
166 | for i, line in enumerate(search):
167 | if "% GPST" in line:
168 | skiprow = i
169 | break
170 | df = pd.read_csv(kineticPosFile, skiprows=skiprow, delim_whitespace=True, parse_dates=[[0, 1]])
171 | #reference = gp.point.Point(df['latitude(deg)'][0], df['longitude(deg)'][0], df['height(m)'][0])
172 |
173 | df['sd(m)'] = np.sqrt(df['sdn(m)']**2+df['sde(m)']**2+df['sdu(m)']**2+df['sdne(m)']**2+df['sdeu(m)']**2+df['sdun(m)']**2)
174 | df['dist(m)'] = 0.0
175 | df['distn(m)'] = 0.0
176 | df['diste(m)'] = 0.0
177 | df['distu(m)'] = 0.0
178 | d = distance.distance
179 | for i in df.index :
180 | j = gp.point.Point(df['latitude(deg)'][i],df['longitude(deg)'][i])
181 | k = gp.point.Point(reference.latitude,df['longitude(deg)'][i])
182 | l = gp.point.Point(df['latitude(deg)'][i],reference.longitude)
183 |
184 | lat1 = math.radians(reference.latitude)
185 | lon1 = math.radians(reference.longitude)
186 | lat2 = math.radians(df['latitude(deg)'][i])
187 | lon2 = math.radians(df['longitude(deg)'][i])
188 |
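# Initial bearing from the reference point to the current fix, using the
# standard great-circle bearing formula:
#   brng = atan2( sin(dLon)*cos(lat2), cos(lat1)*sin(lat2) - sin(lat1)*cos(lat2)*cos(dLon) )
# north/east below are only used for their signs, so each horizontal distance
# component gets the correct direction relative to the reference.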
189 | dLon = lon2 - lon1
190 | y = math.sin(dLon) * math.cos(lat2)
191 | x = math.cos(lat1)*math.sin(lat2) - math.sin(lat1)*math.cos(lat2)*math.cos(dLon)
192 | brng = math.atan2(y, x)
193 | north = math.cos(brng)
194 | east = math.sin(brng)
195 |
196 | j = d(reference, j).meters
197 | k = d(reference, k).meters*np.sign(east)
198 | l = d(reference, l).meters*np.sign(north)
199 | m = (df['height(m)'][i] - reference.altitude)
200 | q = np.core.sqrt(j**2+m**2)
201 |
202 | df['dist(m)'][i] = q
203 | df['distn(m)'][i] = l
204 | df['diste(m)'][i] = k
205 | df['distu(m)'][i] = m
206 | return df
207 |
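# Example (hedged): load one processed flight into a DataFrame and summarise the
# distance-from-reference columns; the directory and folder names are placeholders.
# df = buildDataFrame('/data/flights/', 'flight01')
# print(df[['dist(m)', 'distn(m)', 'diste(m)', 'distu(m)', 'sd(m)']].describe())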
208 |
209 | #------------------------------------------------------------------------------
210 | # Decompress downloaded files
211 | #------------------------------------------------------------------------------
212 | def decompressData(dir):
213 | # print('Uncompressing fetched data')
214 | subprocess.check_output(['gzip', '-d', '-f', '-r', dir])
215 |
216 | def readObs(dir, file):
217 | #Read header
218 | header = ''
219 | with open(dir + file) as handler:
220 | for i, line in enumerate(handler):
221 | header += line
222 | if 'END OF HEADER' in line:
223 | break
224 | #Read Data
225 | with open(dir + file) as handler:
226 | #Make a list to hold dictionaries
227 | date_list = []
228 | for line in handler:
229 | #Check for a timestamp label
230 | if '> ' in line:
231 | #Grab Timestamp
232 | links = line.split()
233 | index = datetime.strptime(' '.join(links[1:7]), '%Y %m %d %H %M %S.%f0')
234 | #Identify number of satellites
235 | satNum = int(links[8])
236 | #For every sat
237 | for sat in range(satNum):
238 | #Make a holder dictionary
239 | sat_dict = {}
240 | #just save the data as a string for now
241 | satData = handler.readline()
242 | #Fix the names
243 | satID = satData.replace("G ", "G0").split()[0]
244 | #Add holder dictionary
245 | sat_dict['%_GPST'] = index
246 | sat_dict['satID'] = satID
247 | sat_dict['satData'] = satData
248 | #Add to the growing list
249 | date_list.append(sat_dict)
250 | #Convert to dataframe
251 | df = pd.DataFrame(date_list)
252 | #Set the multi index
253 | df = df.set_index(['%_GPST', 'satID'])
254 | return df, header
255 |
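# The epoch records parsed by readObs look like the following (values illustrative):
#   >  2014 03 15 18 30  0.0000000  0  9
#   G05  20835487.123  ...
# i.e. a '> ' line carrying the epoch timestamp and the satellite count (links[8]),
# followed by one observation line per satellite, keyed by its PRN (e.g. 'G05').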
256 | def writeObs(dir, file, df, header):
257 | #Write header
258 | with open(dir + file, 'w') as handler:
259 | handler.write(header)
260 |
261 | with open(dir + file, 'a') as handler:
262 | for group in df.groupby(level=0, axis=0):
263 | timestampObj = group[0]
264 | dff = group[1]
265 | sats = len(dff.index)
266 | timestampLine = timestampObj.strftime('> %Y %m %d %H %M %S.%f0 0 ') + str(sats) + '\n'
267 | handler.write(timestampLine)
268 | for sat in range(sats):
269 | satDataLine = dff.ix[sat][0]
270 | handler.write(satDataLine)
271 | return 0
272 |
273 |
274 | def getSatsList(df):
275 | df = df.reset_index()
276 | uniq_satID = df['satID'].unique()
277 | uniq_satID = [x[1:] for x in uniq_satID if not 'S' in x]
278 | return uniq_satID
279 |
280 | def xyz2plh(x,y,z):
281 | emajor = 6378137.0
282 | eflat = 0.00335281068118
283 | A = emajor
284 | FL = eflat
285 | ZERO = 0.0
286 | ONE = 1.0
287 | TWO = 2.0
288 | THREE = 3.0
289 | FOUR = 4.0
290 |
291 | '''
292 | 1.0 compute semi-minor axis and set sign to that of z in order
293 | to get sign of Phi correct
294 | '''
295 | B = A * (ONE - FL)
296 | if( z < ZERO ):
297 | B = -B
298 | '''
299 | 2.0 compute intermediate values for latitude
300 | '''
301 | r = math.sqrt( x*x + y*y )
302 | e = ( B*z - (A*A - B*B) ) / ( A*r )
303 | f = ( B*z + (A*A - B*B) ) / ( A*r )
304 |
305 | '''
306 | 3.0 find solution to:
307 | t^4 + 2*E*t^3 + 2*F*t - 1 = 0
308 | '''
309 | p = (FOUR / THREE) * (e*f + ONE)
310 | q = TWO * (e*e - f*f)
311 | d = p*p*p + q*q
312 |
313 | if( d >= ZERO ):
314 | v = pow( (math.sqrt( d ) - q), (ONE / THREE) ) - pow( (math.sqrt( d ) + q), (ONE / THREE) )
315 | else:
316 | v = TWO * math.sqrt( -p ) * math.cos( math.acos( q/(p * math.sqrt( -p )) ) / THREE )
317 | '''
318 | 4.0 improve v
319 | NOTE: not really necessary unless point is near pole
320 | '''
321 | if( v*v < math.fabs(p) ):
322 | v = -(v*v*v + TWO*q) / (THREE*p)
323 | g = (math.sqrt( e*e + v ) + e) / TWO
324 | t = math.sqrt( g*g + (f - v*g)/(TWO*g - e) ) - g
325 |
326 | lat = math.atan( (A*(ONE - t*t)) / (TWO*B*t) )
327 |
328 | '''
329 | 5.0 compute height above ellipsoid
330 | '''
331 | elv = (r - A*t)*math.cos( lat ) + (z - B)*math.sin( lat )
332 |
333 | '''
334 | 6.0 compute longitude east of Greenwich
335 | '''
336 | zlong = math.atan2( y, x )
337 | if( zlong < ZERO ):
338 | zlong = zlong + math.pi*2
339 | lon = zlong
340 |
341 | '''
342 | 7.0 convert latitude and longitude to degrees
343 | '''
344 | lat = np.rad2deg(lat)
345 | lon = np.rad2deg(lon)
346 | lon = -math.fmod((360.0-lon), 360.0)
347 |
348 | return lat, lon, elv
349 |
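# Sanity check (hedged): a point on the equator at the prime meridian,
# xyz2plh(6378137.0, 0.0, 0.0), should come back as approximately
# (lat, lon, elv) == (0.0, 0.0, 0.0).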
350 | def diff2Angles(firstAngle, secondAngle):
351 | # wrap the difference into [-180, 180)
352 | difference = (secondAngle - firstAngle + 180) % 360 - 180
353 | return difference
354 |
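# e.g. diff2Angles(350.0, 10.0) returns 20.0 and diff2Angles(10.0, 350.0)
# returns -20.0, so azimuths that straddle north still compare correctly.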
355 | def generateMaskList(satConsts):
356 | satConsts = sorted(satConsts, key=lambda tup: tup[2])
357 | angles = []
358 | for i in range(len(satConsts)):
359 | a = np.deg2rad(satConsts[i-1][2])
360 | b = np.deg2rad(satConsts[i][2])
361 | c = math.atan2((math.sin(a)+math.sin(b)),(math.cos(a)+math.cos(b)))
362 | c = np.rad2deg(c)
363 | c = math.fmod((c+360), 360.0)
364 | angles.append(c)
365 | maskList = []
366 | for i, mask in enumerate(angles):
367 | masks = []
368 | for j, sat in enumerate(satConsts):
369 | diff = diff2Angles(sat[2],mask)
370 | if diff < 0:
371 | masks.append(sat[0])
372 | maskList.append(masks)
373 | return maskList
374 |
375 | def applyMasks(df, satConsts):
376 | dfs = []
377 | maskList = generateMaskList(satConsts)
378 | for mask in maskList:
379 | dff = df
380 | dff = dff.reset_index()
381 | print(mask)
382 | for sat in mask:
383 | dff = dff[dff['satID'] != sat]
384 | dff = dff.set_index(['%_GPST', 'satID'])
385 | dfs.append(dff)
386 | return dfs
387 |
388 | def generateData(dir, noradFile):
389 | folders = findFolders(dir)
390 | for folder in folders:
391 | print('reading: ' + dir + folder + '/' + folder + '.obs')
392 | df, header = readObs(dir + folder + '/', folder + '.obs')
393 | headerLines = header.split('\n')
394 | for line in headerLines:
395 | if 'APPROX POSITION XYZ' in line:
396 | fields = line.split()
397 | x = float(fields[0])
398 | y = float(fields[1])
399 | z = float(fields[2])
400 | lat, lon, elv = xyz2plh(x,y,z)
401 | break
402 | # lat, lon, elv were already computed from the APPROX POSITION XYZ header line above
403 | reference = gp.point.Point(lat,lon,elv)
404 | date = df.index[0][0]
405 | satlist = loadTLE(dir + noradFile)
406 | satObs = getSatsList(df)
407 | satConsts = getSatConsts(satlist, satObs, date, reference)
408 | dfs = applyMasks(df, satConsts)
409 |
410 | for x, dfx in enumerate(dfs):
411 | folderx = folder + '.v' + str(x)
412 | dirx = dir + folderx + '/'
413 | filepathx = dirx + folderx
414 | filepath = dir + folder + '/' + folder
415 | checkDir(dirx,'w')
416 | writeObs(dirx, folderx + '.obs', dfx, header)
417 | shutil.copyfile(filepath + '.nav', filepathx + '.nav')
418 | shutil.copyfile(filepath + '.sbs', filepathx + '.sbs')
419 |
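# Each masked variant is written out as '<folder>.vN/<folder>.vN.obs' together
# with copies of the original .nav and .sbs files; the '.v' check in the
# position-processing step near the top of this file then reuses the static
# solution of the unmasked original for these variant folders.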
--------------------------------------------------------------------------------
/Code/Utils/testClassificationModel.py:
--------------------------------------------------------------------------------
1 | '''
2 | By Brian Boates
3 | boates
4 | https://gist.github.com/boates/5127281
5 | Modified by Ruffsl for Python 3.3
6 | '''
7 |
8 | def splitData(df, trainPerc=0.6, cvPerc=0.2, testPerc=0.2, seed=42):
9 | """
10 | By Brian Boates: boates
11 | return: training, cv, test
12 | (as pandas dataframes)
13 | params:
14 | df: pandas dataframe
15 | trainPerc: float | percentage of data for training set (default=0.6)
16 | cvPerc: float | percentage of data for cross validation set (default=0.2)
17 | testPerc: float | percentage of data for test set (default=0.2)
18 | (trainPerc + cvPerc + testPerc must equal 1.0)
19 | """
20 | assert trainPerc + cvPerc + testPerc == 1.0
21 |
22 | # create random list of indices
23 | import random
24 | random.seed(seed)
25 | N = len(df)
26 | l = list(range(N))
27 | random.shuffle(l)
28 |
29 | # get splitting indices
30 | trainLen = int(N*trainPerc)
31 | cvLen = int(N*cvPerc)
32 | testLen = int(N*testPerc)
33 |
34 | # get training, cv, and test sets
35 | training = df.ix[l[:trainLen]]
36 | cv = df.ix[l[trainLen:trainLen+cvLen]]
37 | test = df.ix[l[trainLen+cvLen:]]
38 |
39 | #print len(cl), len(training), len(cv), len(test)
40 |
41 | return training, cv, test
42 |
43 |
44 | def getScore(df, classifier, classTitle, trainPerc, testPerc):
45 | """
46 | return: float | accuracy score for classification model (e[0,1])
47 | params:
48 | df: pandas dataframe
49 | classifier: sklearn classifier
50 | classTitle: string | title of class column in df
51 | trainPerc: percentage of data to train on (default=0.80)
52 | testPerc: percentage of data to test on (default=0.20)
53 | (trainPerc + testPerc = 1.0)
54 | """
55 | assert trainPerc + testPerc == 1.0
56 |
57 | # split the dataset
58 | training, cv, test = splitData(df, trainPerc=trainPerc, cvPerc=0.00, testPerc=testPerc)
59 |
60 | # get the features and classes
61 | featureNames = [col for col in df.columns if col != classTitle]
62 | trainFeatures = training[ featureNames ].values
63 | trainClasses = training[ classTitle ].values
64 |
65 | # create class dict to track numeric classes
66 | classToString = {}
67 | classToNumber = {}
68 | for i, c in enumerate( sorted(set(trainClasses)) ):
69 | classToString[i] = c
70 | classToNumber[c] = i
71 |
72 | # change classes to numbers (if not already)
73 | trainClasses = [classToNumber[c] for c in trainClasses]
74 |
75 | # fit the model
76 | classifier.fit(trainFeatures, trainClasses)
77 |
78 | # format the test set
79 | testFeatures = test[ featureNames ].values
80 | testClasses = [classToNumber[c] for c in test[classTitle].values]
81 |
82 | # compute the score on the test set
83 | score = classifier.score(testFeatures, testClasses)
84 |
85 | return score
86 |
87 |
88 | def testModel(df, classifier, classTitle, N=1, trainPerc=0.80, testPerc=0.20):
89 | """
90 | return: list[float] | list of scores for model (e[0,1])
91 | params:
92 | df: pandas dataframe
93 | classifier: sklearn classifier
94 | classTitle: string | title of class column in df
95 | N: int | number of tests to run (default=1)
96 | trainPerc: percentage of data to train on (default=0.80)
97 | testPerc: percentage of data to test on (default=0.20)
98 | (trainPerc + testPerc = 1.0)
99 | """
100 | # compute N scores
101 | scores = []
102 | for i in range(N):
103 | score = getScore(df=df, classifier=classifier, classTitle=classTitle, trainPerc=trainPerc, testPerc=testPerc)
104 | scores.append(score)
105 |
106 | return scores
107 |
108 |
109 |
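# Example usage (hedged): scoring a scikit-learn classifier against a labelled
# DataFrame; the classifier choice and the 'Q' class column are illustrative only.
# from sklearn.ensemble import RandomForestClassifier
# scores = testModel(df, RandomForestClassifier(), classTitle='Q', N=10)
# print(sum(scores) / len(scores))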
--------------------------------------------------------------------------------
/Code/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ruffsl/RTKLIB-Tools/9c2fd3e149efc23c4252650085cacae54bb104f1/Code/__init__.py
--------------------------------------------------------------------------------
/Code/__pycache__/ephem_utils.cpython-33.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ruffsl/RTKLIB-Tools/9c2fd3e149efc23c4252650085cacae54bb104f1/Code/__pycache__/ephem_utils.cpython-33.pyc
--------------------------------------------------------------------------------
/Code/__pycache__/file_utils.cpython-33.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ruffsl/RTKLIB-Tools/9c2fd3e149efc23c4252650085cacae54bb104f1/Code/__pycache__/file_utils.cpython-33.pyc
--------------------------------------------------------------------------------
/Code/__pycache__/fileutils.cpython-33.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ruffsl/RTKLIB-Tools/9c2fd3e149efc23c4252650085cacae54bb104f1/Code/__pycache__/fileutils.cpython-33.pyc
--------------------------------------------------------------------------------
/Code/__pycache__/gpstime.cpython-33.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ruffsl/RTKLIB-Tools/9c2fd3e149efc23c4252650085cacae54bb104f1/Code/__pycache__/gpstime.cpython-33.pyc
--------------------------------------------------------------------------------
/Code/__pycache__/plot_utils.cpython-33.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ruffsl/RTKLIB-Tools/9c2fd3e149efc23c4252650085cacae54bb104f1/Code/__pycache__/plot_utils.cpython-33.pyc
--------------------------------------------------------------------------------
/Code/__pycache__/rtklib_utils.cpython-33.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ruffsl/RTKLIB-Tools/9c2fd3e149efc23c4252650085cacae54bb104f1/Code/__pycache__/rtklib_utils.cpython-33.pyc
--------------------------------------------------------------------------------
/Code/__pycache__/testClassificationModel.cpython-33.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ruffsl/RTKLIB-Tools/9c2fd3e149efc23c4252650085cacae54bb104f1/Code/__pycache__/testClassificationModel.cpython-33.pyc
--------------------------------------------------------------------------------
/Docs/Poster/Micro Aerial Vehicle Localization with Real Time Kinematic GPS - small.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ruffsl/RTKLIB-Tools/9c2fd3e149efc23c4252650085cacae54bb104f1/Docs/Poster/Micro Aerial Vehicle Localization with Real Time Kinematic GPS - small.png
--------------------------------------------------------------------------------
/Docs/Poster/Micro Aerial Vehicle Localization with Real Time Kinematic GPS.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ruffsl/RTKLIB-Tools/9c2fd3e149efc23c4252650085cacae54bb104f1/Docs/Poster/Micro Aerial Vehicle Localization with Real Time Kinematic GPS.png
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | RTKLIB-Tools
2 | ============
3 |
4 | Python utilities and IPython notebooks for batch-processing GPS observations with RTKLIB (rnx2rtkp), fetching reference-station data, and analysing the resulting position solutions.
5 |
--------------------------------------------------------------------------------