--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Data/ne_10m_admin_0_countries.VERSION.txt:
--------------------------------------------------------------------------------
1 | 4.1.0
2 |
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Data/ne_10m_admin_0_countries.cpg:
--------------------------------------------------------------------------------
1 | UTF-8
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Data/ne_10m_admin_0_countries.dbf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/franchyze923/Code_From_Tutorials/6b6291f42c97abc0c33e3d8dc762dd6d5e048ee6/Arcgis Scripting with Python Arcpy/Data/ne_10m_admin_0_countries.dbf
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Data/ne_10m_admin_0_countries.prj:
--------------------------------------------------------------------------------
1 | GEOGCS["GCS_WGS_1984",DATUM["D_WGS_1984",SPHEROID["WGS_1984",6378137.0,298.257223563]],PRIMEM["Greenwich",0.0],UNIT["Degree",0.017453292519943295]]
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Data/ne_10m_admin_0_countries.shp:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/franchyze923/Code_From_Tutorials/6b6291f42c97abc0c33e3d8dc762dd6d5e048ee6/Arcgis Scripting with Python Arcpy/Data/ne_10m_admin_0_countries.shp
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Data/ne_10m_admin_0_countries.shx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/franchyze923/Code_From_Tutorials/6b6291f42c97abc0c33e3d8dc762dd6d5e048ee6/Arcgis Scripting with Python Arcpy/Data/ne_10m_admin_0_countries.shx
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Data/ne_10m_populated_places.CPG:
--------------------------------------------------------------------------------
1 | UTF-8
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Data/ne_10m_populated_places.README.html:
--------------------------------------------------------------------------------
Populated Places

About

Point symbols with name attributes. Includes all admin-0 and many admin-1 capitals, major cities and towns, plus a sampling of smaller towns in sparsely inhabited regions. We favor regional significance over population census in determining our selection of places. Use the scale rankings to filter the number of towns that appear on your map.
184 |
185 |
LandScan-derived population estimates are provided for 90% of our cities. Those lacking population estimates are often in sparsely inhabited areas. We provide a range of population values that account for the total “metropolitan” population rather than its administrative-boundary population. Use the PopMax column to size your town labels. Starting in version 1.1, popMax has been throttled down to the UN-estimated metro population for the ~500 largest urban areas in the world. This affects towns in China, India, and parts of Africa, where our LandScan counting method usually overestimated.
186 |
Population estimates were derived from the LandScan dataset maintained and distributed by the Oak Ridge National Laboratory. These data were converted from raster to vector, and pixels with fewer than 200 persons per square kilometer were removed from the dataset as they were classified as rural. Once urban pixels were selected, they were aggregated into contiguous units. Concurrently, Thiessen polygons were created based on the selected city points. The Thiessen polygons were used to intersect the contiguous city boundaries to produce bounded areas for the cities. As a result, our estimates capture metropolitan and micropolitan populations per city regardless of administrative units.
187 |
Once intersected, the contiguous polygons were recalculated, using areal interpolation assuming uniform population distribution within each pixel, to determine the population total. This process was conducted multiple times, once for each scale level, to produce population estimates for each city at nested scales of 1:300 million, 1:110 million, 1:50 million, 1:20 million, and 1:10 million.
188 |
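Under the uniform-distribution assumption, the areal interpolation step reduces to scaling each pixel's population by the fraction of its area that falls inside the city polygon. A minimal sketch (the function name and numbers are illustrative, not part of the Natural Earth tooling):

```python
def areal_interpolate(pixel_pop, pixel_area, overlap_area):
    # Population assigned to the overlapping piece, assuming the pixel's
    # population is spread uniformly across its area.
    return pixel_pop * (overlap_area / pixel_area)

# A 1 km^2 pixel holding 800 people, with 0.25 km^2 inside the city boundary:
print(areal_interpolate(800, 1.0, 0.25))  # -> 200.0
```

Summing these per-piece values over every pixel intersecting a city's bounded area gives that city's population estimate.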
191 |
Population ranks

Ranks are calculated as rank_max and rank_min using this general VB formula, which can be pasted into the ArcMap Field Calculator advanced area (set your output variable to x):
193 |
194 | a = [pop_max]
195 | if( a > 10000000 ) then
196 | x = 14
197 | elseif( a > 5000000 ) then
198 | x = 13
199 | elseif( a > 1000000 ) then
200 | x = 12
201 | elseif( a > 500000 ) then
202 | x = 11
203 | elseif( a > 200000 ) then
204 | x = 10
205 | elseif( a > 100000 ) then
206 | x = 9
207 | elseif( a > 50000 ) then
208 | x = 8
209 | elseif( a > 20000 ) then
210 | x = 7
211 | elseif( a > 10000 ) then
212 | x = 6
213 | elseif( a > 5000 ) then
214 | x = 5
215 | elseif( a > 2000 ) then
216 | x = 4
217 | elseif( a > 1000 ) then
218 | x = 3
219 | elseif( a > 200 ) then
220 | x = 2
221 | elseif( a > 0 ) then
222 | x = 1
223 | else
224 | x = 0
225 | end if
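For reference, the same thresholds can be expressed in Python (e.g., for the Field Calculator's Python parser) — a sketch mirroring the VB logic above, not part of the original download:

```python
def pop_rank(pop_max):
    # Natural Earth population rank (0-14), mirroring the VB thresholds:
    # the first threshold that pop_max exceeds determines the rank.
    thresholds = [
        (10000000, 14), (5000000, 13), (1000000, 12), (500000, 11),
        (200000, 10), (100000, 9), (50000, 8), (20000, 7),
        (10000, 6), (5000, 5), (2000, 4), (1000, 3), (200, 2), (0, 1),
    ]
    for threshold, rank in thresholds:
        if pop_max > threshold:
            return rank
    return 0  # zero or missing population

print(pop_rank(150000))  # -> 9
```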
226 |
Issues
227 |
While we don’t want to show every admin-1 capital, for those countries where we show most admin-1 capitals we should have a complete set. If you find one missing, please log it in the Cx (corrections) tool.
228 |
Version History
229 |
230 |
231 | 4.1.0
232 |
233 |
234 | 4.0.0
235 |
236 |
237 | 3.3.1
238 |
239 |
240 | 3.0.0
241 |
242 |
243 | 2.0.0
244 |
245 |
246 | 1.4.0
247 |
248 |
249 | 1.3.0
250 |
251 |
252 | 1.1.0
253 |
254 |
255 | 0.9.0
256 |
257 |
258 |
259 |
The master changelog is available on GitHub.
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Data/ne_10m_populated_places.VERSION.txt:
--------------------------------------------------------------------------------
1 | 4.1.0
2 |
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Data/ne_10m_populated_places.dbf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/franchyze923/Code_From_Tutorials/6b6291f42c97abc0c33e3d8dc762dd6d5e048ee6/Arcgis Scripting with Python Arcpy/Data/ne_10m_populated_places.dbf
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Data/ne_10m_populated_places.prj:
--------------------------------------------------------------------------------
1 | GEOGCS["GCS_WGS_1984",DATUM["D_WGS_1984",SPHEROID["WGS_1984",6378137.0,298.257223563]],PRIMEM["Greenwich",0.0],UNIT["Degree",0.017453292519943295]]
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Data/ne_10m_populated_places.shp:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/franchyze923/Code_From_Tutorials/6b6291f42c97abc0c33e3d8dc762dd6d5e048ee6/Arcgis Scripting with Python Arcpy/Data/ne_10m_populated_places.shp
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Data/ne_10m_populated_places.shx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/franchyze923/Code_From_Tutorials/6b6291f42c97abc0c33e3d8dc762dd6d5e048ee6/Arcgis Scripting with Python Arcpy/Data/ne_10m_populated_places.shx
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/Toolbox.tbx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/franchyze923/Code_From_Tutorials/6b6291f42c97abc0c33e3d8dc762dd6d5e048ee6/Arcgis Scripting with Python Arcpy/Scripts/Toolbox.tbx
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/gis_script_vid_10.py:
--------------------------------------------------------------------------------
1 | import arcpy
2 |
3 | arcpy.env.overwriteOutput = True
4 |
5 | points = r"E:\Files\GIS\_Tutorial\Data\ne_10m_populated_places.shp"
6 | countries = r"E:\Files\GIS\_Tutorial\Data\ne_10m_admin_0_countries.shp"
7 | outpath = r"E:\Files\GIS\_Tutorial\outputs"
8 |
9 | arcpy.MakeFeatureLayer_management(points, 'points_layer')
10 |
11 | with arcpy.da.SearchCursor(countries, ['FID', 'SOVEREIGNT']) as country_cursor:
12 | for x in country_cursor:
13 | print x[1]
14 | arcpy.MakeFeatureLayer_management(countries, 'countries_layer', """ "FID" = {} """.format(x[0]))
15 | arcpy.SelectLayerByLocation_management('points_layer', 'WITHIN', 'countries_layer')
16 | # 2021 Update - had to replace '-' and also encode into utf-8
17 | formatted_output_name = x[1].replace('(', '_').replace(')', '_').replace('-', '_').encode('utf-8')
18 | arcpy.FeatureClassToFeatureClass_conversion('points_layer', outpath, 'cities_in_{}_{}'.format(formatted_output_name, x[0]))
19 | print 'Successfully Converted {} \n'.format(formatted_output_name)
20 |
21 | print 'Finished'
22 |
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/gis_script_vid_11.py:
--------------------------------------------------------------------------------
1 | import arcpy
2 |
3 | arcpy.env.overwriteOutput = True
4 |
5 | points = r"E:\Files\GIS\_Tutorial\Data\ne_10m_populated_places.shp"
6 | countries = r"E:\Files\GIS\_Tutorial\Data\ne_10m_admin_0_countries.shp"
7 | outpath = r"E:\Files\GIS\_Tutorial\outputs"
8 | total_count = 0
9 | created_count = 0
10 |
11 | arcpy.MakeFeatureLayer_management(points, 'points_layer')
12 |
13 | with arcpy.da.SearchCursor(countries, ['FID', 'SOVEREIGNT', 'POP_EST']) as country_cursor:
14 | for x in country_cursor:
15 | total_count += 1
16 | if x[2] > 50000000:
17 |
18 | created_count += 1
19 | print x[1]
20 | arcpy.MakeFeatureLayer_management(countries, 'countries_layer', """ "FID" = {} """.format(x[0]))
21 | arcpy.SelectLayerByLocation_management('points_layer', 'WITHIN', 'countries_layer')
22 | # 2021 Update - had to replace '-' and also encode into utf-8
23 | formatted_output_name = x[1].replace('(', '_').replace(')', '_').replace('-', '_').encode('utf-8')
24 | arcpy.FeatureClassToFeatureClass_conversion('points_layer', outpath, 'cities_in_{}_{}'.format(formatted_output_name, x[0]))
25 | print 'Successfully Converted {} \n'.format(formatted_output_name)
26 | else:
27 | # 2021 update - encode into utf-8
28 | print "{} didn't meet the criteria".format(x[1].encode('utf-8'))
29 |
30 | print 'Finished'
31 | print '{0} met the criteria out of {1} countries'.format(created_count, total_count)
32 |
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/gis_script_vid_12.py:
--------------------------------------------------------------------------------
1 | import arcpy
2 |
3 | arcpy.env.overwriteOutput = True
4 |
5 | points = arcpy.GetParameterAsText(0)
6 | countries = arcpy.GetParameterAsText(1)
7 | outpath = arcpy.GetParameterAsText(2)
8 | pop_number = arcpy.GetParameterAsText(3)
9 |
10 | total_count = 0
11 | created_count = 0
12 |
13 | arcpy.MakeFeatureLayer_management(points, 'points_layer')
14 |
15 | with arcpy.da.SearchCursor(countries, ['FID', 'SOVEREIGNT', 'POP_EST']) as country_cursor:
16 | for x in country_cursor:
17 | total_count += 1
18 | if x[2] > float(pop_number):
19 |
20 | created_count += 1
21 | print x[1]
22 | arcpy.MakeFeatureLayer_management(countries, 'countries_layer', """ "FID" = {} """.format(x[0]) )
23 | arcpy.SelectLayerByLocation_management('points_layer', 'WITHIN', 'countries_layer')
24 | # 2021 Update - had to replace '-' and also encode into utf-8
25 | formatted_output_name = x[1].replace('(', '_').replace(')', '_').replace('-', '_').encode('utf-8')
26 | arcpy.FeatureClassToFeatureClass_conversion('points_layer', outpath, 'cities_in_{0}_{1}'.format(formatted_output_name, x[0]))
27 | print 'Successfully converted {} \n'.format(formatted_output_name)
28 | arcpy.AddMessage('Successfully converted {} \n'.format(formatted_output_name))
29 |
30 | else:
31 | # 2021 update - encode into utf-8
32 | print "{} didn't meet the criteria".format(x[1].encode('utf-8'))
33 | arcpy.AddMessage("{} didn't meet the criteria".format(x[1].encode('utf-8')))
34 |
35 | print 'Finished'
36 | arcpy.AddMessage('Finished!!!')
37 | print '{0} met the criteria out of {1} countries'.format(created_count, total_count)
38 | arcpy.AddMessage('{0} met the criteria out of {1} countries'.format(created_count, total_count))
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/gis_script_vid_13.py:
--------------------------------------------------------------------------------
1 | import arcpy
2 |
3 | points = r"E:\Files\GIS\_Tutorial\Data\ne_10m_populated_places.shp"
4 |
5 | with arcpy.da.UpdateCursor(points, ['NAMEPAR']) as city_cursor:
6 | for x in city_cursor:
7 | print x[0]
8 | x[0]= 'WE_JUST_UPDATED_THIS'
9 | city_cursor.updateRow(x)
10 | print 'We updated this value to {}'.format(x[0])
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/gis_script_vid_14.py:
--------------------------------------------------------------------------------
1 | import arcpy
2 |
3 | points = r"E:\Files\GIS\_Tutorial\Data\ne_10m_populated_places.shp"
4 |
5 | field_list = arcpy.ListFields(points)
6 | print field_list
7 | list_field = []
8 |
9 | for x in field_list:
10 | print x.name
11 | print x.type
12 | if x.type == 'String':
13 | list_field.append(x.name)
14 | else:
15 | print 'This is not a String, it is {}'.format(x.type)
16 |
17 | for field in list_field:
18 |
19 | with arcpy.da.UpdateCursor(points, [field]) as city_cursor:
20 | for x in city_cursor:
21 | print x[0]
22 | if x[0] == ' ':
23 | x[0] = 'STARWARS'
24 | city_cursor.updateRow(x)
25 | print 'We updated this value to {}'.format(x[0])
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/gis_script_vid_3.py:
--------------------------------------------------------------------------------
1 | import arcpy
2 |
3 | arcpy.env.workspace = r"E:\Files\GIS\_Tutorial\Data"
4 |
5 | feature_list = arcpy.ListFeatureClasses()
6 |
7 | print feature_list
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/gis_script_vid_4.py:
--------------------------------------------------------------------------------
1 | import arcpy
2 |
3 | points = r"E:\Files\GIS\_Tutorial\Data\ne_10m_populated_places.shp"
4 | countries = r"E:\Files\GIS\_Tutorial\Data\ne_10m_admin_0_countries.shp"
5 | outpath = r"E:\Files\GIS\_Tutorial\outputs"
6 |
7 | arcpy.MakeFeatureLayer_management(points, 'points_layer')
8 | # In 2021, the United States "Name" value has changed from 'United States' to 'United States of America'
9 | arcpy.MakeFeatureLayer_management(countries, 'countries_layer', """ "NAME" = 'United States of America' """)
10 |
11 | arcpy.SelectLayerByLocation_management('points_layer', 'WITHIN', 'countries_layer')
12 | arcpy.FeatureClassToFeatureClass_conversion('points_layer', outpath, 'cities_in_us')
13 |
14 |
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/gis_script_vid_5.py:
--------------------------------------------------------------------------------
1 | import arcpy
2 |
3 | arcpy.env.overwriteOutput = True
4 |
5 | points = r"E:\Files\GIS\_Tutorial\Data\ne_10m_populated_places.shp"
6 | countries = r"E:\Files\GIS\_Tutorial\Data\ne_10m_admin_0_countries.shp"
7 | outpath = r"E:\Files\GIS\_Tutorial\outputs"
8 |
9 | countries_of_interest = ['United States', 'Italy', 'Kenya', 'Jordan', 'Lebanon', 'Scotland', 'France']
10 |
11 | arcpy.MakeFeatureLayer_management(points, 'points_layer')
12 |
13 | for x in countries_of_interest:
14 | print x
15 | arcpy.MakeFeatureLayer_management(countries, 'countries_layer', """ "NAME" = '{}' """.format(x))
16 | arcpy.SelectLayerByLocation_management('points_layer', 'WITHIN', 'countries_layer')
17 | arcpy.FeatureClassToFeatureClass_conversion('points_layer', outpath, 'cities_in_{}'.format(x))
18 |
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/gis_script_vid_6.py:
--------------------------------------------------------------------------------
1 | import arcpy
2 |
3 | arcpy.env.overwriteOutput = True
4 |
5 | points = r"E:\Files\GIS\_Tutorial\Data\ne_10m_populated_places.shp"
6 | countries = r"E:\Files\GIS\_Tutorial\Data\ne_10m_admin_0_countries.shp"
7 | outpath = r"E:\Files\GIS\_Tutorial\outputs"
8 |
9 | with arcpy.da.SearchCursor(points, ['Name', 'POP_MAX', 'TIMEZONE']) as cities_cursor:
10 | for x in cities_cursor:
11 | print x[0]
12 | print x[1]
13 | print x[2] + '\n'
14 | print x
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/gis_script_vid_7.py:
--------------------------------------------------------------------------------
1 | import arcpy
2 |
3 | arcpy.env.overwriteOutput = True
4 |
5 | points = r"E:\Files\GIS\_Tutorial\Data\ne_10m_populated_places.shp"
6 | countries = r"E:\Files\GIS\_Tutorial\Data\ne_10m_admin_0_countries.shp"
7 | outpath = r"E:\Files\GIS\_Tutorial\outputs"
8 |
9 | arcpy.MakeFeatureLayer_management(points, 'points_layer')
10 |
11 | with arcpy.da.SearchCursor(countries, ['FID', 'SOVEREIGNT']) as country_cursor:
12 | for x in country_cursor:
13 | print x[0]
14 | arcpy.MakeFeatureLayer_management(countries, 'countries_layer', """ "FID" = {} """.format(x[0]))
15 | arcpy.SelectLayerByLocation_management('points_layer', 'WITHIN', 'countries_layer')
16 | arcpy.FeatureClassToFeatureClass_conversion('points_layer', outpath, 'cities_in_{}'.format(x[1]))
17 |
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/gis_script_vid_8.py:
--------------------------------------------------------------------------------
1 | import arcpy
2 |
3 | arcpy.env.overwriteOutput = True
4 |
5 | points = r"E:\Files\GIS\_Tutorial\Data\ne_10m_populated_places.shp"
6 | countries = r"E:\Files\GIS\_Tutorial\Data\ne_10m_admin_0_countries.shp"
7 | outpath = r"E:\Files\GIS\_Tutorial\outputs"
8 |
9 | arcpy.MakeFeatureLayer_management(points, 'points_layer')
10 |
11 | with arcpy.da.SearchCursor(countries, ['FID', 'SOVEREIGNT']) as country_cursor:
12 | for x in country_cursor:
13 | print x[0]
14 | arcpy.MakeFeatureLayer_management(countries, 'countries_layer', """ "FID" = {} """.format(x[0]))
15 | arcpy.SelectLayerByLocation_management('points_layer', 'WITHIN', 'countries_layer')
16 | # 2021 update - had to replace '-' and also encode into utf-8
17 | formatted_output_name = x[1].replace('(', '_').replace(')', '_').replace('-', '_').encode('utf-8')
18 | arcpy.FeatureClassToFeatureClass_conversion('points_layer', outpath, 'cities_in_{}'.format(formatted_output_name))
19 | print 'Successfully Converted {} \n'.format(formatted_output_name)
20 |
21 | print 'Finished'
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/gis_script_vid_9.py:
--------------------------------------------------------------------------------
1 | import arcpy
2 |
3 | arcpy.env.overwriteOutput = True
4 |
5 | points = r"E:\Files\GIS\_Tutorial\Data\ne_10m_populated_places.shp"
6 | countries = r"E:\Files\GIS\_Tutorial\Data\ne_10m_admin_0_countries.shp"
7 | outpath = r"E:\Files\GIS\_Tutorial\outputs"
8 |
9 | arcpy.MakeFeatureLayer_management(points, 'points_layer')
10 |
11 | with arcpy.da.SearchCursor(countries, ['FID', 'SOVEREIGNT']) as country_cursor:
12 | for x in country_cursor:
13 | print x[1]
14 | arcpy.MakeFeatureLayer_management(countries, 'countries_layer', """ "FID" = {} """.format(x[0]))
15 | arcpy.SelectLayerByLocation_management('points_layer', 'WITHIN', 'countries_layer')
16 | # 2021 Update - had to replace '-' and also encode into utf-8
17 | formatted_output_name = x[1].replace('(', '_').replace(')', '_').replace('-', '_').encode('utf-8')
18 | arcpy.FeatureClassToFeatureClass_conversion('points_layer', outpath, 'cities_in_{}_{}'.format(formatted_output_name, x[0]))
19 | print 'Successfully Converted {} \n'.format(formatted_output_name)
20 |
21 | print 'Finished'
22 |
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/img_to_shp_vid_15.py:
--------------------------------------------------------------------------------
1 | import os
2 |
3 | img_folder = r"D:\Files\GIS\_Tutorial\Data\imgs"
4 | img_contents = os.listdir(img_folder)
5 |
6 | for image in img_contents:
7 | print(image)
8 | full_path = os.path.join(img_folder, image)
9 | print(full_path)
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/img_to_shp_vid_16.py:
--------------------------------------------------------------------------------
1 | import os
2 | from PIL import Image, ExifTags
3 |
4 | img_folder = r"D:\Files\GIS\_Tutorial\Data\imgs"
5 | img_contents = os.listdir(img_folder)
6 |
7 | for image in img_contents:
8 |
9 | full_path = os.path.join(img_folder, image)
10 | pil_img = Image.open(full_path)
11 |
12 | exif = {ExifTags.TAGS[k]: v for k, v in pil_img._getexif().items() if k in ExifTags.TAGS}
13 | print(exif)
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/img_to_shp_vid_17.py:
--------------------------------------------------------------------------------
1 | import os
2 | from PIL import Image, ExifTags
3 |
4 | img_folder = r"D:\Files\GIS\_Tutorial\Data\imgs"
5 | img_contents = os.listdir(img_folder)
6 |
7 | for image in img_contents:
8 |
9 | full_path = os.path.join(img_folder, image)
10 | pil_img = Image.open(full_path)
11 |
12 | exif = {ExifTags.TAGS[k]: v for k, v in pil_img._getexif().items() if k in ExifTags.TAGS}
13 |
14 | gps_all = {}
15 |
16 | try:
17 | for key in exif['GPSInfo'].keys():
18 |
19 | decoded_value = ExifTags.GPSTAGS.get(key)
20 | gps_all[decoded_value] = exif['GPSInfo'][key]
21 |
22 | long_ref = gps_all.get('GPSLongitudeRef')
23 | lat_ref = gps_all.get('GPSLatitudeRef')
24 |
25 | long = gps_all.get('GPSLongitude')
26 | lat = gps_all.get('GPSLatitude')
27 |
28 | print(long_ref)
29 | print(lat_ref)
30 |
31 | print(long)
32 | print(lat)
33 |
34 | except:
35 | print("This image has no GPS info in it")
36 | print(full_path)
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/img_to_shp_vid_18.py:
--------------------------------------------------------------------------------
1 | import os
2 | from PIL import Image, ExifTags
3 |
4 | img_folder = r"D:\Files\GIS\_Tutorial\Data\imgs"
5 | img_contents = os.listdir(img_folder)
6 |
7 |
8 | def convert_to_degrees(value):
9 |
10 | d0 = value[0][0]
11 | d1 = value[0][1]
12 | d = float(d0) / float(d1)
13 |
14 | m0 = value[1][0]
15 | m1 = value[1][1]
16 | m = float(m0) / float(m1)
17 |
18 | s0 = value[2][0]
19 | s1 = value[2][1]
20 | s = float(s0) / float(s1)
21 |
22 | return d + (m / 60.0) + (s / 3600.0)
23 |
24 |
25 | for image in img_contents:
26 |
27 | full_path = os.path.join(img_folder, image)
28 | pil_img = Image.open(full_path)
29 |
30 | exif = {ExifTags.TAGS[k]: v for k, v in pil_img._getexif().items() if k in ExifTags.TAGS}
31 |
32 | gps_all = {}
33 |
34 | try:
35 | for key in exif['GPSInfo'].keys():
36 |
37 | decoded_value = ExifTags.GPSTAGS.get(key)
38 | gps_all[decoded_value] = exif['GPSInfo'][key]
39 |
40 | long_ref = gps_all.get('GPSLongitudeRef')
41 | lat_ref = gps_all.get('GPSLatitudeRef')
42 |
43 | long = gps_all.get('GPSLongitude')
44 | lat = gps_all.get('GPSLatitude')
45 |
46 | print(long_ref)
47 | print(lat_ref)
48 |
49 | if long_ref == "W":
50 | long_in_degrees = -abs(convert_to_degrees(long))
51 | else:
52 | long_in_degrees = convert_to_degrees(long)
53 |
54 | if lat_ref == "S":
55 | lat_in_degrees = -abs(convert_to_degrees(lat))
56 | else:
57 | lat_in_degrees = convert_to_degrees(lat)
58 |
59 | print(lat_in_degrees)
60 | print(long_in_degrees)
61 |
62 | except:
63 | print("This image has no GPS info in it")
64 | print(full_path)
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/img_to_shp_vid_19.py:
--------------------------------------------------------------------------------
1 | import os
2 | from PIL import Image, ExifTags
3 | import arcpy
4 |
5 | img_folder = r"D:\Files\GIS\_Tutorial\Data\imgs"
6 | img_contents = os.listdir(img_folder)
7 | out_shapefile = r"D:\Files\GIS\_Tutorial\outputs\out_shape.shp"
8 | shp_list = []
9 | spatial_ref = arcpy.env.outputCoordinateSystem = arcpy.SpatialReference(4326)
10 |
11 |
12 | def shape_creator():
13 |
14 | pt = arcpy.Point()
15 | ptGeoms = []
16 |
17 | for p in shp_list:
18 |
19 | pt.X = p[0]
20 | pt.Y = p[1]
21 |
22 | ptGeoms.append(arcpy.PointGeometry(pt, spatial_ref))
23 |
24 | arcpy.CopyFeatures_management(ptGeoms, out_shapefile)
25 |
26 |
27 | def convert_to_degrees(value):
28 |
29 | d0 = value[0][0]
30 | d1 = value[0][1]
31 | d = float(d0) / float(d1)
32 |
33 | m0 = value[1][0]
34 | m1 = value[1][1]
35 | m = float(m0) / float(m1)
36 |
37 | s0 = value[2][0]
38 | s1 = value[2][1]
39 | s = float(s0) / float(s1)
40 |
41 | return d + (m / 60.0) + (s / 3600.0)
42 |
43 |
44 | for image in img_contents:
45 |
46 | full_path = os.path.join(img_folder, image)
47 | pil_img = Image.open(full_path)
48 |
49 | exif = {ExifTags.TAGS[k]: v for k, v in pil_img._getexif().items() if k in ExifTags.TAGS}
50 |
51 | gps_all = {}
52 |
53 | try:
54 | for key in exif['GPSInfo'].keys():
55 |
56 | decoded_value = ExifTags.GPSTAGS.get(key)
57 | gps_all[decoded_value] = exif['GPSInfo'][key]
58 |
59 | long_ref = gps_all.get('GPSLongitudeRef')
60 | lat_ref = gps_all.get('GPSLatitudeRef')
61 |
62 | long = gps_all.get('GPSLongitude')
63 | lat = gps_all.get('GPSLatitude')
64 |
65 | print(long_ref)
66 | print(lat_ref)
67 |
68 | if long_ref == "W":
69 | long_in_degrees = -abs(convert_to_degrees(long))
70 | else:
71 | long_in_degrees = convert_to_degrees(long)
72 |
73 | if lat_ref == "S":
74 | lat_in_degrees = -abs(convert_to_degrees(lat))
75 | else:
76 | lat_in_degrees = convert_to_degrees(lat)
77 |
78 | shp_list.append([long_in_degrees, lat_in_degrees])
79 |
80 | except:
81 | print("This image has no GPS info in it")
82 | print(full_path)
83 | pass
84 |
85 | print(shp_list)
86 | shape_creator()
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/img_to_shp_vid_20.py:
--------------------------------------------------------------------------------
1 | import os
2 | from PIL import Image, ExifTags
3 | import arcpy
4 |
5 | img_folder = r"D:\Files\GIS\_Tutorial\Data\imgs"
6 | img_contents = os.listdir(img_folder)
7 | out_shapefile = r"D:\Files\GIS\_Tutorial\outputs\out_shape.shp"
8 | shp_list = []
9 | spatial_ref = arcpy.env.outputCoordinateSystem = arcpy.SpatialReference(4326)
10 |
11 |
12 | def shape_creator():
13 |
14 | pt = arcpy.Point()
15 | ptGeoms = []
16 |
17 | for p in shp_list:
18 |
19 | pt.X = p[0]
20 | pt.Y = p[1]
21 |
22 | ptGeoms.append(arcpy.PointGeometry(pt, spatial_ref))
23 |
24 | arcpy.CopyFeatures_management(ptGeoms, out_shapefile)
25 | arcpy.AddXY_management(out_shapefile)
26 | arcpy.AddField_management(out_shapefile, "timestamp", "TEXT", 9, "", "", "refcode", "NULLABLE", "REQUIRED")
27 | arcpy.AddField_management(out_shapefile, "img_path", "TEXT", 9, "", "", "refcode", "NULLABLE", "REQUIRED")
28 |
29 | for x in shp_list:
30 | with arcpy.da.UpdateCursor(out_shapefile, ["timestamp", "img_path"]) as our_cursor:
31 | for c in our_cursor:
32 |
33 | c[0] = x[3]
34 | c[1] = x[2]
35 |
36 | our_cursor.updateRow(c)
37 |
38 |
39 | def convert_to_degrees(value):
40 |
41 | d0 = value[0][0]
42 | d1 = value[0][1]
43 | d = float(d0) / float(d1)
44 |
45 | m0 = value[1][0]
46 | m1 = value[1][1]
47 | m = float(m0) / float(m1)
48 |
49 | s0 = value[2][0]
50 | s1 = value[2][1]
51 | s = float(s0) / float(s1)
52 |
53 | return d + (m / 60.0) + (s / 3600.0)
54 |
55 |
56 | for image in img_contents:
57 |
58 | full_path = os.path.join(img_folder, image)
59 | pil_img = Image.open(full_path)
60 |
61 | exif = {ExifTags.TAGS[k]: v for k, v in pil_img._getexif().items() if k in ExifTags.TAGS}
62 |
63 | gps_all = {}
64 |
65 | try:
66 | img_time = (exif['DateTime'])
67 | for key in exif['GPSInfo'].keys():
68 |
69 | decoded_value = ExifTags.GPSTAGS.get(key)
70 | gps_all[decoded_value] = exif['GPSInfo'][key]
71 |
72 | long_ref = gps_all.get('GPSLongitudeRef')
73 | lat_ref = gps_all.get('GPSLatitudeRef')
74 |
75 | long = gps_all.get('GPSLongitude')
76 | lat = gps_all.get('GPSLatitude')
77 |
78 | print(long_ref)
79 | print(lat_ref)
80 |
81 | if long_ref == "W":
82 | long_in_degrees = -abs(convert_to_degrees(long))
83 | else:
84 | long_in_degrees = convert_to_degrees(long)
85 |
86 | if lat_ref == "S":
87 | lat_in_degrees = -abs(convert_to_degrees(lat))
88 | else:
89 | lat_in_degrees = convert_to_degrees(lat)
90 |
91 | shp_list.append([long_in_degrees, lat_in_degrees, full_path, img_time])
92 |
93 | except:
94 | print("This image has no GPS info in it")
95 | print(full_path)
96 | pass
97 |
98 | print(shp_list)
99 | shape_creator()
--------------------------------------------------------------------------------
/Arcgis Scripting with Python Arcpy/Scripts/img_to_shp_vid_21.py:
--------------------------------------------------------------------------------
1 | import os
2 | from PIL import Image, ExifTags
3 | import arcpy
4 |
5 | img_folder = r"D:\Files\GIS\_Tutorial\Data\imgs"
6 | img_contents = os.listdir(img_folder)
7 | out_shapefile = r"D:\Files\GIS\_Tutorial\outputs\out_shape.shp"
8 | shp_list = []
9 | spatial_ref = arcpy.env.outputCoordinateSystem = arcpy.SpatialReference(4326)
10 |
11 |
12 | def shape_creator():
13 |
14 | pt = arcpy.Point()
15 | ptGeoms = []
16 |
17 | for p in shp_list:
18 |
19 | pt.X = p[0]
20 | pt.Y = p[1]
21 |
22 | ptGeoms.append(arcpy.PointGeometry(pt, spatial_ref))
23 |
24 | arcpy.CopyFeatures_management(ptGeoms, out_shapefile)
25 | arcpy.AddXY_management(out_shapefile)
26 | arcpy.AddField_management(out_shapefile, "timestamp", "TEXT", 9, "", "", "refcode", "NULLABLE", "REQUIRED")
27 | arcpy.AddField_management(out_shapefile, "img_path", "TEXT", 9, "", "", "refcode", "NULLABLE", "REQUIRED")
28 |
29 | count = 0
30 |
31 | with arcpy.da.UpdateCursor(out_shapefile, ["timestamp", "img_path"]) as our_cursor:
32 | for c in our_cursor:
33 |
34 | c[0] = shp_list[count][3]
35 | c[1] = shp_list[count][2]
36 | count +=1
37 |
38 | our_cursor.updateRow(c)
39 |
40 |
41 | def convert_to_degrees(value):
42 |
43 | d0 = value[0][0]
44 | d1 = value[0][1]
45 | d = float(d0) / float(d1)
46 |
47 | m0 = value[1][0]
48 | m1 = value[1][1]
49 | m = float(m0) / float(m1)
50 |
51 | s0 = value[2][0]
52 | s1 = value[2][1]
53 | s = float(s0) / float(s1)
54 |
55 | return d + (m / 60.0) + (s / 3600.0)
56 |
57 |
58 | for image in img_contents:
59 |
60 | full_path = os.path.join(img_folder, image)
61 | pil_img = Image.open(full_path)
62 |
63 |     exif = {ExifTags.TAGS[k]: v for k, v in (pil_img._getexif() or {}).items() if k in ExifTags.TAGS}
64 |
65 | gps_all = {}
66 |
67 | try:
68 |         img_time = exif['DateTime']
69 | for key in exif['GPSInfo'].keys():
70 |
71 | decoded_value = ExifTags.GPSTAGS.get(key)
72 | gps_all[decoded_value] = exif['GPSInfo'][key]
73 |
74 | long_ref = gps_all.get('GPSLongitudeRef')
75 | lat_ref = gps_all.get('GPSLatitudeRef')
76 |
77 | long = gps_all.get('GPSLongitude')
78 | lat = gps_all.get('GPSLatitude')
79 |
80 | print(long_ref)
81 | print(lat_ref)
82 |
83 | if long_ref == "W":
84 | long_in_degrees = -abs(convert_to_degrees(long))
85 | else:
86 | long_in_degrees = convert_to_degrees(long)
87 |
88 | if lat_ref == "S":
89 | lat_in_degrees = -abs(convert_to_degrees(lat))
90 | else:
91 | lat_in_degrees = convert_to_degrees(lat)
92 |
93 | shp_list.append([long_in_degrees, lat_in_degrees, full_path, img_time])
94 |
95 |     except (KeyError, TypeError):
96 | print("This image has no GPS info in it")
97 | print(full_path)
98 | pass
99 |
100 | print(shp_list)
101 | shape_creator()
102 |
--------------------------------------------------------------------------------
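The convert_to_degrees helper in img_to_shp_vid_21.py divides each EXIF rational pair (numerator, denominator) and combines degrees, minutes, and seconds. A standalone sketch of the same math; the function name and sample coordinate here are illustrative, not part of the script:

```python
def dms_rationals_to_degrees(value):
    """Convert EXIF DMS rationals ((d0, d1), (m0, m1), (s0, s1)) to decimal degrees."""
    d = value[0][0] / value[0][1]
    m = value[1][0] / value[1][1]
    s = value[2][0] / value[2][1]
    return d + m / 60.0 + s / 3600.0

# 38 deg 34 min 57.0936 sec, stored EXIF-style as rational pairs
coords = ((38, 1), (34, 1), (570936, 10000))
print(dms_rationals_to_degrees(coords))  # about 38.582526
```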
/Arcgis Scripting with Python Arcpy/readme.txt:
--------------------------------------------------------------------------------
1 | Data downloaded from here -
2 | Cities - https://www.naturalearthdata.com/downloads/10m-cultural-vectors/10m-populated-places/
3 | Countries - https://www.naturalearthdata.com/downloads/50m-cultural-vectors/50m-admin-0-countries-2/
4 |
5 | The data can also be found in this repo, though it may be an older version.
6 |
7 | To get the script in the toolbox working, open the toolbox in ArcMap, right-click the SelectByLocation script, and click Properties.
8 | Go to the Source tab and, under Script File, update the path to point to select_by_location_tool.py.
9 |
10 | Currently it points to where the script existed on my computer, but this will be different for you.
11 |
12 |
13 | If you have any questions, reach out on my YouTube channel!
14 |
15 |
16 |
17 |
--------------------------------------------------------------------------------
/FitBit API/fitbit_requester.py:
--------------------------------------------------------------------------------
1 | import requests
2 |
3 | # Implicit Grant Flow
4 | # Get this token from requesting in browser
5 |
6 | access_token = "paste your token that you got from browser url here"
7 |
8 | header = {'Authorization': 'Bearer {}'.format(access_token)}
9 | response = requests.get("https://api.fitbit.com/1/user/-/profile.json", headers=header).json()
10 |
11 | print(response['user'])
12 |
13 | for k, v in response['user'].items():
14 | print(k)
15 | print(v)
16 | print("\n")
17 |
18 |
19 |
--------------------------------------------------------------------------------
/FitBit API/get_fitbit_data.js:
--------------------------------------------------------------------------------
1 |
2 | const access_token = "your access token"
3 |
4 | fetch('https://api.fitbit.com/1/user/-/activities/steps/date/today/1y.json', {
5 | method: "GET",
6 | headers: {"Authorization": "Bearer " + access_token}
7 | })
8 | .then(response => response.json())
9 | .then(json => console.log(json));
10 |
--------------------------------------------------------------------------------
/FitBit API/index.html:
--------------------------------------------------------------------------------
1 | <!DOCTYPE html>
2 | <html>
3 | <head>
4 |     <title>FitBit API Demo</title>
5 | </head>
6 | <body>
7 |     <script src="get_fitbit_data.js"></script>
8 | </body>
9 | </html>
--------------------------------------------------------------------------------
/FitBit API/notes.txt:
--------------------------------------------------------------------------------
1 | FitBit API Info 11/30/2020
2 |
3 | https://dev.fitbit.com/build/reference/web-api/oauth2/#authorization-page
4 | We are using the Implicit Grant Flow. To use it, the app must be registered as a Personal or Client application.
5 |
6 | First make this request in a browser -
7 | https://www.fitbit.com/oauth2/authorize?response_type=token&client_id=22942C&redirect_uri=https%3A%2F%2Fexample.com%2Ffitbit_auth&scope=activity%20nutrition%20heartrate%20location%20nutrition%20profile%20settings%20sleep%20social%20weight&expires_in=604800
8 | Remember to replace client_id with your own client ID. Increase expires_in (in seconds) if you'd like the token to last longer.
9 | 86400 for 1 day
10 | 604800 for 1 week
11 | 2592000 for 30 days
12 | 31536000 for 1 year
13 |
14 | Allow all in the browser and then copy the access_token value from the response URL in the browser.
15 |
16 | Store the access token in a variable in Python and use it to make requests to the Fitbit API.
17 |
18 |
--------------------------------------------------------------------------------
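The notes above say to copy the access_token value out of the redirect URL. The token arrives in the URL fragment (after the #), not the query string. A sketch of extracting it with the standard library; the redirect URL and token value below are made up:

```python
from urllib.parse import urlparse, parse_qs

# Made-up redirect URL; a real one comes from the browser after authorizing.
redirect_url = "https://example.com/fitbit_auth#access_token=abc123&user_id=XYZ&expires_in=604800"

# The parameters live in the fragment, so parse that as a query string.
params = parse_qs(urlparse(redirect_url).fragment)
access_token = params["access_token"][0]
print(access_token)  # abc123
```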
/GIS Python Scripting with Arcpy in ArcGIS Pro/B21C0631.FIT:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/franchyze923/Code_From_Tutorials/6b6291f42c97abc0c33e3d8dc762dd6d5e048ee6/GIS Python Scripting with Arcpy in ArcGIS Pro/B21C0631.FIT
--------------------------------------------------------------------------------
/GIS Python Scripting with Arcpy in ArcGIS Pro/add_csv.py:
--------------------------------------------------------------------------------
1 | import arcpy
2 |
3 | working_directory = r"E:\Youtube_Videos\fgdb"
4 | input_csv_file = r"E:\Youtube_Videos\arcpy\weather_stations.csv"
5 |
6 | file_gdb_path = arcpy.management.CreateFileGDB(working_directory, "fgdb_1", "CURRENT")
7 | print(file_gdb_path)
8 | new_feature_class = arcpy.management.XYTableToPoint(input_csv_file, fr"{file_gdb_path}\new_feature_class", "longitude", "Latitude", None, "GEOGCS['GCS_WGS_1984',DATUM['D_WGS_1984',SPHEROID['WGS_1984',6378137.0,298.257223563]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]];-400 -400 1000000000;-100000 10000;-100000 10000;8.98315284119521E-09;0.001;0.001;IsHighPrecision")
9 |
10 | print(new_feature_class)
11 |
12 | arcpy.AddField_management(new_feature_class, "temperature__F", "DOUBLE")
13 | print("added new field")
14 |
15 | arcpy.management.CalculateField(new_feature_class, "temperature__F", "(!temperature__C! * 9/5) + 32", "PYTHON3")
16 |
17 | print("finished everything")
18 |
--------------------------------------------------------------------------------
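The CalculateField expression in add_csv.py applies the Celsius-to-Fahrenheit formula (C * 9/5) + 32 per row. The same arithmetic as plain Python, for reference (c_to_f is an illustrative name, not part of the script):

```python
def c_to_f(c):
    # The same expression CalculateField evaluates per row: (C * 9/5) + 32
    return (c * 9 / 5) + 32

print(c_to_f(0))    # 32.0
print(c_to_f(100))  # 212.0
```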
/GIS Python Scripting with Arcpy in ArcGIS Pro/create_points.py:
--------------------------------------------------------------------------------
1 | import arcpy
2 |
3 | coords = [(-78.144527, 38.582526), (-121.747833, 38.410558), (-59.825562, -7.710992), (72.023570, 22.755921)]
4 |
5 | new_shape_file = arcpy.CreateFeatureclass_management(r"C:\Fran_Temp_Working_Files\arc", "TEST_142.shp", "POINT", spatial_reference=4326)
6 | print(new_shape_file)
7 |
8 | with arcpy.da.InsertCursor(new_shape_file, ['SHAPE@XY']) as insert_cursor:
9 | for coord in coords:
10 | print("Inserted {} into {}".format(coord, new_shape_file))
11 | insert_cursor.insertRow([coord])
--------------------------------------------------------------------------------
/GIS Python Scripting with Arcpy in ArcGIS Pro/create_points_with_field.py:
--------------------------------------------------------------------------------
1 | import arcpy
2 |
3 | coords = [('VIRGINIA', (-78.144527, 38.582526)), ('CALIFORNIA', (-121.747833, 38.410558)), ('SOUTH AMERICA', (-59.825562, -7.710992)), ('India', (72.023570, 22.755921))]
4 |
5 | new_shape_file = arcpy.CreateFeatureclass_management(r"C:\Fran_Temp_Working_Files\arc", "TEST_143.shp", "POINT", spatial_reference=4326)
6 | print(new_shape_file)
7 | arcpy.AddField_management(new_shape_file, "NAME", "TEXT")
8 |
9 | with arcpy.da.InsertCursor(new_shape_file, ['NAME', 'SHAPE@XY']) as insert_cursor:
10 | for coord in coords:
11 | print("Inserted {} into {}".format(coord, new_shape_file))
12 | insert_cursor.insertRow(coord)
--------------------------------------------------------------------------------
/GIS Python Scripting with Arcpy in ArcGIS Pro/garmin_csv_script.py:
--------------------------------------------------------------------------------
1 | import csv
2 | import arcpy
3 |
4 | new_shape_file = arcpy.CreateFeatureclass_management(r"C:\Fran_Temp_Working_Files\garmin_fit", "garmin_shapefile_1.shp", "POINT", spatial_reference=4326)
5 | arcpy.AddField_management(new_shape_file, "HEARTRATE", "TEXT")
6 |
7 | with open(r"C:\Fran_Temp_Working_Files\garmin_fit\B21D3144.csv") as csv_file:
8 |     csv_reader = csv.reader(csv_file, delimiter=',')
9 |
10 | coords_list = []
11 |
12 | for row in csv_reader:
13 | if row[1] == '7' and row[0] == 'Data':
14 | heart_rate = row[22]
15 | position_lat_semi_circles = row[7]
16 | position_long_semi_circles = row[10]
17 |
18 | position_lat_degrees = float(position_lat_semi_circles) * (180 / 2**31)
19 | position_long_degrees = float(position_long_semi_circles) * (180 / 2 ** 31)
20 | print(f"position_lat_degrees: {position_lat_degrees}")
21 | print(f"position_long_degrees: {position_long_degrees}\n")
22 | coords_list.append((heart_rate, (position_long_degrees, position_lat_degrees)) )
23 |
24 | print(coords_list)
25 |
26 | with arcpy.da.InsertCursor(new_shape_file, ["HEARTRATE", 'SHAPE@XY']) as insert_cursor:
27 | for coord in coords_list:
28 | print("Inserted {} into {}".format(coord, new_shape_file))
29 | insert_cursor.insertRow(coord)
30 |
31 | arcpy.management.AddXY(new_shape_file)
--------------------------------------------------------------------------------
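garmin_csv_script.py converts Garmin's semicircle position units with degrees = semicircles * (180 / 2**31). The conversion in isolation; the sample semicircle value is illustrative:

```python
def semicircles_to_degrees(semicircles):
    # FIT files store position as signed 32-bit semicircles;
    # 2**31 semicircles span 180 degrees.
    return float(semicircles) * (180.0 / 2 ** 31)

print(semicircles_to_degrees(2 ** 31))    # 180.0
print(semicircles_to_degrees(460603373))  # about 38.6
```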
/GIS Python Scripting with Arcpy in ArcGIS Pro/gpx_to_shp.py:
--------------------------------------------------------------------------------
1 | import os
2 | import arcpy
3 |
4 |
5 | gpx_directory_path = r"C:\Users\Fran\Downloads\export_22552451\activities"
6 |
7 | gpx_list = os.listdir(gpx_directory_path)
8 |
9 |
10 | for gpx_file in gpx_list:
11 | if gpx_file.endswith(".gpx"):
12 | full_gpx_path = os.path.join(gpx_directory_path, gpx_file)
13 | print(f"Converting {full_gpx_path} to shapefile")
14 | output_shapefile = os.path.splitext(full_gpx_path)[0] + ".shp"
15 | try:
16 | arcpy.GPXtoFeatures_conversion(full_gpx_path, output_shapefile)
17 | print(f"Finished Converting {full_gpx_path} to shapefile \n")
18 | except Exception as e:
19 | print("failed to convert")
20 | print(e)
21 |
22 | print("Finished converting all gpx to shp")
23 |
--------------------------------------------------------------------------------
/GIS Python Scripting with Arcpy in ArcGIS Pro/image_processor.py:
--------------------------------------------------------------------------------
1 | import arcpy
2 | import boto3
3 | from botocore.handlers import disable_signing
4 | import os
5 |
6 | s3 = boto3.resource('s3')
7 | s3.meta.client.meta.events.register('choose-signer.s3.*', disable_signing)
8 | bucket = 'njogis-imagery'
9 |
10 |
11 | mosaic_dataset = r"E:\Youtube_Videos\New File Geodatabase.gdb\youtube_test_mosaic_dataset"
12 |
13 | my_bucket = s3.Bucket(bucket)
14 |
15 | prefix = "2015/cog/"
16 |
17 | counter = 0
18 |
19 | for file in my_bucket.objects.filter(Prefix=prefix):
20 |     counter += 1
21 | vsis3_image_path = os.path.join("/vsis3/", bucket, file.key).replace("\\", "/")
22 | print("Adding {}".format(vsis3_image_path))
23 | print("counter = {}".format(counter))
24 | arcpy.AddRastersToMosaicDataset_management(mosaic_dataset, "Raster Dataset", vsis3_image_path)
25 | print(arcpy.GetMessages())
26 |
27 | if counter > 5:
28 | break
29 |
30 | print("Program finished")
31 |
32 |
33 |
--------------------------------------------------------------------------------
/GIS Python Scripting with Arcpy in ArcGIS Pro/image_processor_2.py:
--------------------------------------------------------------------------------
1 | import arcpy
2 | import boto3
3 | from botocore.handlers import disable_signing
4 | import os
5 |
6 | s3 = boto3.resource('s3')
7 | s3.meta.client.meta.events.register('choose-signer.s3.*', disable_signing)
8 | bucket = 'njogis-imagery'
9 |
10 |
11 | mosaic_dataset = r"E:\Youtube_Videos\New File Geodatabase.gdb\youtube_test_mosaic_dataset"
12 |
13 | my_bucket = s3.Bucket(bucket)
14 |
15 | prefix = "2015/cog/"
16 |
17 | counter = 0
18 | images_to_add =[]
19 |
20 | for file in my_bucket.objects.filter(Prefix=prefix):
21 |
22 |     counter += 1
23 | vsis3_image_path = os.path.join("/vsis3/", bucket, file.key).replace("\\", "/")
24 | print("Adding {} to list".format(vsis3_image_path))
25 | print("counter = {}".format(counter))
26 | images_to_add.append(vsis3_image_path)
27 |
28 | if counter > 20:
29 | break
30 |
31 | print("This is the image list, to be used as input for AddRasters {}".format(images_to_add))
32 |
33 | arcpy.AddRastersToMosaicDataset_management(mosaic_dataset, "Raster Dataset", images_to_add)
34 | print(arcpy.GetMessages())
35 | print("Program finished")
--------------------------------------------------------------------------------
/GIS Python Scripting with Arcpy in ArcGIS Pro/landsat_8_processor.py:
--------------------------------------------------------------------------------
1 | import arcpy
2 | import boto3
3 | from botocore.handlers import disable_signing
4 | import os
5 |
6 | s3 = boto3.resource('s3')
7 | s3.meta.client.meta.events.register('choose-signer.s3.*', disable_signing)
8 | bucket = 'landsat-pds'
9 |
10 | my_bucket = s3.Bucket(bucket)
11 |
12 | path = "150"
13 | row = "045"
14 |
15 | download_folder = r"E:\Youtube_Videos\Landsat Data"
16 | prefix = "c1/L8/{}/{}/".format(path, row)
17 |
18 | landsat_bands = []
19 |
20 | how_many_times_to_run = 5
21 | counter = 0
22 |
23 | for file in my_bucket.objects.filter(Prefix=prefix):
24 |
25 | if counter == how_many_times_to_run:
26 | break
27 | vsis3_image_path = os.path.join("/vsis3/", bucket, file.key).replace("\\", "/")
28 | if vsis3_image_path.endswith(".TIF"):
29 | vsis3_list = vsis3_image_path.split("/")
30 |
31 | if vsis3_image_path[-6:-4] == 'B2' or vsis3_image_path[-6:-4] == 'B3' or vsis3_image_path[-6:-4] == 'B4':
32 | print("Downloading {}".format(vsis3_image_path))
33 | my_bucket.download_file(file.key, os.path.join(download_folder, vsis3_list[8]))
34 | print("Adding: {} to list".format(os.path.join(download_folder, vsis3_list[8])))
35 | landsat_bands.append(os.path.join(download_folder, vsis3_list[8]))
36 |
37 | if len(landsat_bands) == 3:
38 | print("Running composite bands with these values {}".format(list(reversed(landsat_bands))))
39 |
40 | arcpy.CompositeBands_management(list(reversed(landsat_bands)), os.path.join(download_folder, "landsat_composite_{}.tif".format(counter)))
41 | print(arcpy.GetMessages())
42 | counter += 1
43 | landsat_bands = []
44 | print("set landsat bands back to 0")
45 |
46 | print("Finished Processing {} Landsats".format(how_many_times_to_run))
--------------------------------------------------------------------------------
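landsat_8_processor.py picks out bands B2/B3/B4 with the slice vsis3_image_path[-6:-4], which assumes single-digit band numbers. An alternative sketch that parses the band tag from the filename instead (band_of is a hypothetical helper, not part of the script):

```python
import posixpath

def band_of(key):
    # Collection 1 scene files end in _B<n>.TIF; pull out the band tag.
    stem = posixpath.basename(key).rsplit(".", 1)[0]
    return stem.rsplit("_", 1)[-1]

key = "c1/L8/139/045/LC08_L1TP_139045_20170304_20170316_01_T1/LC08_L1TP_139045_20170304_20170316_01_T1_B2.TIF"
print(band_of(key))                        # B2
print(band_of(key) in ("B2", "B3", "B4"))  # True
```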
/GIS Python Scripting with Arcpy in ArcGIS Pro/make_fgdbs.py:
--------------------------------------------------------------------------------
1 | import arcpy
2 | import os
3 | import time
4 |
5 | start_time = time.time()
6 | # where you want the fgdbs to go
7 | working_directory = r"E:\Youtube_Videos\fgdb"
8 |
9 | for fgdb in range(5):
10 | fgdb_full_path = os.path.join(working_directory, f"fgdb_{fgdb}.gdb")
11 |
12 | print(f"Creating {fgdb_full_path}")
13 | arcpy.management.CreateFileGDB(working_directory, f"fgdb_{fgdb}", "CURRENT")
14 |
15 | print(f"Creating feature class in {fgdb_full_path}")
16 | arcpy.management.CreateFeatureclass(fgdb_full_path, "fc_1", "POINT", None, "DISABLED", "DISABLED", None, '', 0, 0, 0, '')
17 |
18 | print(f"Importing cities shapefile in {fgdb_full_path}\n")
19 | arcpy.FeatureClassToFeatureClass_conversion(r"C:\Users\Fran\Downloads\ne_10m_populated_places (2)\ne_10m_populated_places.shp", fgdb_full_path, "world_cities")
20 |
21 | print(f"Process Complete. Took --- {time.time() - start_time} seconds ---")
22 |
--------------------------------------------------------------------------------
/GIS Python Scripting with Arcpy in ArcGIS Pro/videos_10_to_16_notes.txt:
--------------------------------------------------------------------------------
1 | aws s3 cp s3://landsat-pds/c1/L8/139/045/LC08_L1TP_139045_20170304_20170316_01_T1/LC08_L1TP_139045_20170304_20170316_01_T1_B2.TIF . --no-sign-request
2 | aws s3 cp s3://landsat-pds/c1/L8/139/045/LC08_L1TP_139045_20170304_20170316_01_T1/LC08_L1TP_139045_20170304_20170316_01_T1_B3.TIF . --no-sign-request
3 | aws s3 cp s3://landsat-pds/c1/L8/139/045/LC08_L1TP_139045_20170304_20170316_01_T1/LC08_L1TP_139045_20170304_20170316_01_T1_B4.TIF . --no-sign-request
4 |
5 |
6 | https://www.usgs.gov/media/files/landsat-wrs-2-descending-path-row-shapefile
7 |
8 | https://landsat.usgs.gov/landsat_acq
9 |
10 |
11 | /vsis3/landsat-pds/c1/L8/150/045/LC08_L1GT_150045_20180624_20180624_01_RT/LC08_L1GT_150045_20180624_20180624_01_RT_B4.TIF
12 | /vsis3/landsat-pds/c1/L8/150/045/LC08_L1GT_150045_20180624_20180624_01_RT/LC08_L1GT_150045_20180624_20180624_01_RT_B3.TIF
13 | /vsis3/landsat-pds/c1/L8/150/045/LC08_L1GT_150045_20180624_20180624_01_RT/LC08_L1GT_150045_20180624_20180624_01_RT_B2.TIF
--------------------------------------------------------------------------------
/GIS Python Scripting with Arcpy in ArcGIS Pro/videos_17_to_18_notes.txt:
--------------------------------------------------------------------------------
1 | https://www.naturalearthdata.com/downloads/10m-cultural-vectors/10m-populated-places/
2 | https://pro.arcgis.com/en/pro-app/tool-reference/conversion/feature-class-to-feature-class.htm
--------------------------------------------------------------------------------
/GIS Python Scripting with Arcpy in ArcGIS Pro/videos_1_to_9_notes.txt:
--------------------------------------------------------------------------------
1 | NOTES for videos 1-9
2 |
3 | FREE DATA IN S3
4 |
5 | https://registry.opendata.aws/
6 | https://registry.opendata.aws/nj-imagery/
7 |
8 | Install AWS CLI https://aws.amazon.com/cli/, once that is installed access data via
9 |
10 | aws s3 ls s3://njogis-imagery/ --no-sign-request
11 |
12 | bucket: njogis-imagery
13 |
14 | key: /2015/cog/L9C9.tif
15 |
16 | Examples of s3 image paths (use this notation to access imagery in ArcGIS Pro):
17 |
18 | /vsis3/njogis-imagery/2015/cog/L9C9.tif
19 | /vsis3/njogis-imagery/2015/cog/L5C4.tif
20 | /vsis3/njogis-imagery/2015/cog/L6A3.tif
21 | /vsis3/njogis-imagery/2015/cog/L5B2.tif
22 | /vsis3/njogis-imagery/2015/cog/L5A6.tif
23 | /vsis3/njogis-imagery/2015/cog/L6A10.tif
24 | /vsis3/njogis-imagery/2015/cog/L6A6.tif
25 | /vsis3/njogis-imagery/2015/cog/L5A5.tif
26 |
27 |
28 | ArcPy command copied from Pro:
29 |
30 | with arcpy.EnvManager(scratchWorkspace=r"C:\Users\Fran\Documents\ArcGIS\Projects\MyProject11\MyProject11.gdb", workspace=r"C:\Users\Fran\Documents\ArcGIS\Projects\MyProject11\MyProject11.gdb"):
31 | arcpy.management.AddRastersToMosaicDataset(r"E:\Youtube_Videos\New File Geodatabase.gdb\youtube_test_mosaic_dataset", "Raster Dataset", "/vsis3/njogis-imagery/2015/cog/L6A6.tif", "UPDATE_CELL_SIZES", "UPDATE_BOUNDARY", "NO_OVERVIEWS", None, 0, 1500, None, '', "SUBFOLDERS", "ALLOW_DUPLICATES", "NO_PYRAMIDS", "NO_STATISTICS", "NO_THUMBNAILS", '', "NO_FORCE_SPATIAL_REFERENCE", "NO_STATISTICS", None, "NO_PIXEL_CACHE", r"C:\Users\Fran\AppData\Local\ESRI\rasterproxies\youtube_test_mosaic_dataset")
32 |
33 |
34 |
35 | Python Command Prompt RUN AS ADMIN
36 |
37 | conda env list
38 | conda create --name youtube_clone --clone arcgispro-py3
39 | conda list
40 | conda install -c anaconda boto3
41 |
42 | https://anaconda.org/anaconda/boto3
43 |
44 | https://stackoverflow.com/questions/30249069/listing-contents-of-a-bucket-with-boto3
45 | https://stackoverflow.com/questions/34865927/can-i-use-boto3-anonymously
46 | https://stackoverflow.com/questions/35803027/retrieving-subfolders-names-in-s3-bucket-from-boto3
47 |
48 | list objects in s3 bucket with boto3
49 |
50 | import boto3
51 | s3 = boto3.resource('s3')
52 |
53 | my_bucket = s3.Bucket('bucket_name')
54 |
55 | for file in my_bucket.objects.all():
56 | print(file.key)
57 |
58 | GOOD LUCK!! Comment on youtube if you have questions!
--------------------------------------------------------------------------------
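The notes above list /vsis3/ image paths, and the image_processor scripts build them with os.path.join followed by a backslash replace, which is only needed on Windows. A portable sketch of the same construction using posixpath (vsis3_path is an illustrative helper):

```python
import posixpath

def vsis3_path(bucket, key):
    # GDAL's /vsis3/ virtual filesystem expects forward slashes on every OS.
    return posixpath.join("/vsis3", bucket, key.lstrip("/"))

print(vsis3_path("njogis-imagery", "2015/cog/L9C9.tif"))
# /vsis3/njogis-imagery/2015/cog/L9C9.tif
```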
/Garmin API/B21C0631.FIT:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/franchyze923/Code_From_Tutorials/6b6291f42c97abc0c33e3d8dc762dd6d5e048ee6/Garmin API/B21C0631.FIT
--------------------------------------------------------------------------------
/Garmin API/garmin_csv_script.py:
--------------------------------------------------------------------------------
1 | import csv
2 | import arcpy
3 |
4 | new_shape_file = arcpy.CreateFeatureclass_management(r"C:\Fran_Temp_Working_Files\garmin_fit", "garmin_shapefile_1.shp", "POINT", spatial_reference=4326)
5 | arcpy.AddField_management(new_shape_file, "HEARTRATE", "TEXT")
6 |
7 | with open(r"C:\Fran_Temp_Working_Files\garmin_fit\B21D3144.csv") as csv_file:
8 |     csv_reader = csv.reader(csv_file, delimiter=',')
9 |
10 | coords_list = []
11 |
12 | for row in csv_reader:
13 | if row[1] == '7' and row[0] == 'Data':
14 | heart_rate = row[22]
15 | position_lat_semi_circles = row[7]
16 | position_long_semi_circles = row[10]
17 |
18 | position_lat_degrees = float(position_lat_semi_circles) * (180 / 2**31)
19 | position_long_degrees = float(position_long_semi_circles) * (180 / 2 ** 31)
20 | print(f"position_lat_degrees: {position_lat_degrees}")
21 | print(f"position_long_degrees: {position_long_degrees}\n")
22 | coords_list.append((heart_rate, (position_long_degrees, position_lat_degrees)) )
23 |
24 | print(coords_list)
25 |
26 | with arcpy.da.InsertCursor(new_shape_file, ["HEARTRATE", 'SHAPE@XY']) as insert_cursor:
27 | for coord in coords_list:
28 | print("Inserted {} into {}".format(coord, new_shape_file))
29 | insert_cursor.insertRow(coord)
30 |
31 | arcpy.management.AddXY(new_shape_file)
--------------------------------------------------------------------------------
/Garmin API/pyshp_shapefile.py:
--------------------------------------------------------------------------------
1 | import csv
2 | import shapefile
3 |
4 | with open(r"C:\Fran_Temp_Working_Files\obs_vids_temp\B21C0631.csv") as csv_file:
5 | csv_reader = csv.reader(csv_file, delimiter=',')
6 |
7 | with shapefile.Writer(r"C:\Fran_Temp_Working_Files\garmin_fit\pyshp_demo\new_file") as w:
8 | w.field('HeartRate', 'C')
9 |
10 | for row in csv_reader:
11 | if row[1] == '7' and row[0] == 'Data':
12 | heart_rate = row[22]
13 | position_lat_semi_circles = row[7]
14 | position_long_semi_circles = row[10]
15 |
16 | position_lat_degrees = float(position_lat_semi_circles) * (180 / 2**31)
17 | position_long_degrees = float(position_long_semi_circles) * (180 / 2 ** 31)
18 | print(f"position_lat_degrees: {position_lat_degrees}")
19 | print(f"position_long_degrees: {position_long_degrees}\n")
20 | w.point(position_long_degrees, position_lat_degrees)
21 | w.record(heart_rate)
22 |
23 |
24 | with open(r"C:\Fran_Temp_Working_Files\garmin_fit\pyshp_demo\new_file.prj", "w") as text_file:
25 | epsg = 'GEOGCS["WGS 84",'
26 | epsg += 'DATUM["WGS_1984",'
27 | epsg += 'SPHEROID["WGS 84",6378137,298.257223563]]'
28 | epsg += ',PRIMEM["Greenwich",0],'
29 | epsg += 'UNIT["degree",0.0174532925199433]]'
30 |
31 | print(epsg, file=text_file)
32 |
--------------------------------------------------------------------------------
/Garmin API/qgis_shapefile.py:
--------------------------------------------------------------------------------
1 | ## in order to use python from qgis in pycharm need to set interpreter to "C:\OSGeo4W64\bin\python-qgis.bat"
2 | ## https://pcjericks.github.io/py-gdalogr-cookbook/vector_layers.html#create-a-new-shapefile-and-add-data
3 | ## https://en.wikipedia.org/wiki/Well-known_text_representation_of_geometry
4 |
5 | import csv
6 | import osgeo.ogr as ogr
7 | import osgeo.osr as osr
8 |
9 | driver = ogr.GetDriverByName("ESRI Shapefile")
10 | data_source = driver.CreateDataSource(r"E:\garmin_points.shp")
11 | srs = osr.SpatialReference()
12 | srs.ImportFromEPSG(4326)
13 |
14 | # create the layer
15 | layer = data_source.CreateLayer("garmin_points", srs, ogr.wkbPoint)
16 |
17 | # Add the fields we're interested in
18 | field_name = ogr.FieldDefn("Heartrate", ogr.OFTString)
19 | field_name.SetWidth(24)
20 | layer.CreateField(field_name)
21 | layer.CreateField(ogr.FieldDefn("Latitude", ogr.OFTReal))
22 | layer.CreateField(ogr.FieldDefn("Longitude", ogr.OFTReal))
23 |
24 | # Read the CSV and create a point feature for each data record
25 |
26 | with open(r"C:\Fran_Temp_Working_Files\obs_vids_temp\B21C0631.csv") as csv_file:
27 | csv_reader = csv.reader(csv_file, delimiter=',')
28 |
29 | for row in csv_reader:
30 | if row[1] == '7' and row[0] == 'Data':
31 | heart_rate = row[22]
32 | position_lat_semi_circles = row[7]
33 | position_long_semi_circles = row[10]
34 |
35 | position_lat_degrees = float(position_lat_semi_circles) * (180 / 2**31)
36 | position_long_degrees = float(position_long_semi_circles) * (180 / 2 ** 31)
37 | print(f"position_lat_degrees: {position_lat_degrees}")
38 | print(f"position_long_degrees: {position_long_degrees}\n")
39 |
40 | # create the feature
41 | feature = ogr.Feature(layer.GetLayerDefn())
42 | # Set the attributes using the values from the delimited text file
43 | feature.SetField("Heartrate", heart_rate)
44 | feature.SetField("Latitude", position_lat_degrees)
45 | feature.SetField("Longitude", position_long_degrees)
46 |
47 | # create the WKT for the feature using Python string formatting
48 | wkt = f"POINT({position_long_degrees} {position_lat_degrees})"
49 | print("wkt: {}".format(wkt))
50 |
51 | # Create the point from the Well Known Txt
52 | point = ogr.CreateGeometryFromWkt(wkt)
53 | feature.SetGeometry(point)
54 | layer.CreateFeature(feature)
55 |
56 | feature = None
57 | data_source = None  # save and close the data source
58 |
59 |
60 |
--------------------------------------------------------------------------------
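qgis_shapefile.py builds each feature's geometry from a Well-Known Text string of the form POINT(x y), longitude first. The formatting step in isolation (point_wkt is an illustrative name):

```python
def point_wkt(lon, lat):
    # WKT for a 2D point puts X (longitude) before Y (latitude).
    return f"POINT({lon} {lat})"

print(point_wkt(-78.144527, 38.582526))  # POINT(-78.144527 38.582526)
```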
/Misc Projects/exiftool_gpx_creator.py:
--------------------------------------------------------------------------------
1 | import os
2 | import subprocess
3 | root_vid_directory = r"/Users/fpolig01/Videos/dashcam_footage/"
4 |
5 | for path, directories, files in os.walk(root_vid_directory):
6 | for video_file in files:
7 | if video_file.endswith("MP4"):
8 | full_mp4_path = os.path.join(path, video_file)
9 | full_gpx_output_path = full_mp4_path.replace(".MP4", ".GPX")
10 | print(f"Processing: {full_mp4_path}")
11 | with open(full_gpx_output_path, "w") as gpx_file:
12 | exiftool_command = ["exiftool", "-ee", "-m", "-p", "/Users/fpolig01/Videos/dashcam_footage/gpx.fmt", full_mp4_path]
13 | subprocess.run(exiftool_command, stdout=gpx_file)
14 |                 print(f"Successfully created: {full_gpx_output_path}\n")
15 |
16 |
--------------------------------------------------------------------------------
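exiftool_gpx_creator.py derives the .GPX output path with str.replace(".MP4", ".GPX"). A slightly safer sketch using os.path.splitext, so only the extension is swapped (gpx_path_for is a hypothetical helper, not part of the script):

```python
import os

def gpx_path_for(mp4_path):
    # Swap only the extension, so ".MP4" elsewhere in the path is untouched.
    root, _ext = os.path.splitext(mp4_path)
    return root + ".GPX"

print(gpx_path_for("/videos/trip/GC1U1218.MP4"))  # /videos/trip/GC1U1218.GPX
```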
/Misc Projects/exiftool_notes.txt:
--------------------------------------------------------------------------------
1 | Notes on how to use Exiftool Command Line Interface (CLI)
2 |
3 | https://exiftool.org/
4 | Documentation: https://exiftool.org/exiftool_pod.html
5 | location of gpx.fmt file: https://github.com/exiftool/exiftool/blob/master/fmt_files/gpx.fmt
6 |
7 | Print embedded metadata to screen:
8 |
9 | "C:\Users\Administrator\Downloads\exiftool-12.30\exiftool.exe" -ee "C:\Users\Administrator\Desktop\exif_tool_example\GC1U1218.MP4"
10 |
11 | Write embedded metadata to a text file:
12 | "C:\Users\Administrator\Downloads\exiftool-12.30\exiftool.exe" -ee "C:\Users\Administrator\Desktop\exif_tool_example\GC1U1218.MP4" > "C:\Users\Administrator\Desktop\exif_tool_example\exiftool_output_test.txt"
13 |
14 | Print embedded metadata to screen in .gpx format:
15 |
16 | "C:\Users\Administrator\Downloads\exiftool-12.30\exiftool.exe" -p "C:\Users\Administrator\Desktop\exif_tool_example\gpx.fmt" -ee "C:\Users\Administrator\Desktop\exif_tool_example\GC1U1218.MP4"
17 |
18 | Write embedded metadata to a file in .gpx format:
19 |
20 | "C:\Users\Administrator\Downloads\exiftool-12.30\exiftool.exe" -p "C:\Users\Administrator\Desktop\exif_tool_example\gpx.fmt" -ee "C:\Users\Administrator\Desktop\exif_tool_example\GC1U1218.MP4" > "C:\Users\Administrator\Desktop\exif_tool_example\exiftool_gpx_output_test.gpx"
21 |
22 | Create .gpx files for all .mp4 files in a given directory:
23 |
24 | "C:\Users\Administrator\Downloads\exiftool-12.30\exiftool.exe" -p "C:\\Users\\Administrator\\Desktop\\exif_tool_example\\gpx.fmt" -ee -ext MP4 -w "C:\\Users\\Administrator\\Desktop\\exif_tool_example\\%f.gpx" "C:\Users\Administrator\Desktop\exif_tool_example"
25 |
26 | used this to merge all gpx in a directory to a shapefile
27 |
28 | for file in /Volumes/FranArchives/Home_Media/Frans_Videos/2022/Dashcam_Footage/Lompac_Solvang_Partial/gpx/*GC1*.GPX; do ogr2ogr /Volumes/FranArchives/Home_Media/Frans_Videos/2022/Dashcam_Footage/Lompac_Solvang_Partial/gpx.shp -append "${file}" track_points -fieldTypeToString DateTime; done
29 |
30 |
--------------------------------------------------------------------------------
/Misc Projects/ip_location.py:
--------------------------------------------------------------------------------
1 | import requests
2 |
3 | ## single ip request
4 | # response = requests.get("http://ip-api.com/json/24.48.0.1").json()
5 | #
6 | # print(response['lat'])
7 | # print(response['lon'])
8 |
9 | # batch ip request
10 |
11 | response = requests.post("http://ip-api.com/batch", json=[
12 | {"query": "208.80.152.201"},
13 | {"query": "167.71.3.52"},
14 | {"query": "206.189.198.234"},
15 | {"query": "157.230.75.212"}
16 | ]).json()
17 |
18 | for ip_info in response:
19 | for k,v in ip_info.items():
20 | print(k,v)
21 | print("\n")
22 |
23 |
24 |
--------------------------------------------------------------------------------
/Misc Projects/ips.txt:
--------------------------------------------------------------------------------
1 | https://ip-api.com/
2 |
3 |
4 | 157.230.10.237
5 | 167.71.3.52
6 | 206.189.198.234
7 | 157.230.75.212
8 | 157.230.75.222
9 | 54.183.16.194
10 | 3.226.243.139
11 | 185.22.172.130
12 | 68.183.101.175
13 | 139.59.85.200
14 |
15 |
--------------------------------------------------------------------------------
/Misc Projects/openstreetmap_to_postgres_notes:
--------------------------------------------------------------------------------
1 | Youtube Videos
2 |
3 | https://youtu.be/decUXZZlstA
4 |
5 | 1) Download and Install Postgres and PostGIS
6 | 2) Create new OSM database
7 | 3) Enable PostGIS Extension
8 | 4) Enable hstore Extension
9 | 5) Download osm2pgsql Tool and Style File (https://osm2pgsql.org/doc/install.html) https://learnosm.org/en/osm-data/osm2pgsql/
10 | 6) Download OSM data (https://download.geofabrik.de/)
11 | 7) Import data using osm2pgsql
12 |
13 | Sample osm2pgsql command -
14 |
15 | "C:\Users\Administrator\Downloads\osm2pgsql-latest-x64\osm2pgsql-bin\osm2pgsql.exe" -c -d osm -U postgres -W -H localhost -S "C:\Users\Administrator\Downloads\default.style" "C:\Users\Administrator\Downloads\us-northeast-latest.osm.pbf"
16 |
17 |
18 |
19 |
20 |
21 |
--------------------------------------------------------------------------------
/Open Source GIS/fiona_shapely.py:
--------------------------------------------------------------------------------
1 | import csv
2 | import fiona
3 | from fiona.crs import from_epsg
4 | from shapely.geometry import Point, mapping
5 |
6 | with open(r"C:\Fran_Temp_Working_Files\obs_vids_temp\B21C0631.csv") as csv_file:
7 | csv_reader = csv.reader(csv_file, delimiter=',')
8 |
9 | yourschema = {'geometry': 'Point', 'properties': {'name': 'str', 'heart_rate': 'int'}}
10 |
11 | with fiona.open(r"C:\Fran_Temp_Working_Files\obs_vids_temp\fiona_shapely.shp", 'w', crs=from_epsg(4326), driver='ESRI Shapefile', schema=yourschema) as output:
12 | for row in csv_reader:
13 | if row[1] == '7' and row[0] == 'Data':
14 | heart_rate = row[22]
15 | position_lat_semi_circles = row[7]
16 | position_long_semi_circles = row[10]
17 |
18 | position_lat_degrees = float(position_lat_semi_circles) * (180 / 2**31)
19 | position_long_degrees = float(position_long_semi_circles) * (180 / 2 ** 31)
20 | print(f"position_lat_degrees: {position_lat_degrees}")
21 | print(f"position_long_degrees: {position_long_degrees}\n")
22 |
23 | the_point = Point(float(position_long_degrees), float(position_lat_degrees))
24 | prop = {'name': "this is the name value", 'heart_rate': int(heart_rate)}
25 | output.write({'geometry': mapping(the_point), 'properties': prop})
26 |
27 |
28 |
29 |
30 |
31 |
--------------------------------------------------------------------------------
/Open Source GIS/gpx_to_shp_ogr.py:
--------------------------------------------------------------------------------
1 | import os
2 | import ogr2ogr
3 |
4 | gpx_directory_path = r"C:\Users\Fran\Downloads\export_22552451\activities"
5 |
6 | gpx_list = os.listdir(gpx_directory_path)
7 |
8 |
9 | for gpx_file in gpx_list:
10 | if gpx_file.endswith(".gpx"):
11 | full_gpx_path = os.path.join(gpx_directory_path, gpx_file)
12 | print(f"Converting {full_gpx_path} to shapefile")
13 | output_shapefile = os.path.splitext(full_gpx_path)[0] + ".shp"
14 | try:
15 | ogr2ogr.main(["", "-f", "ESRI Shapefile", output_shapefile, full_gpx_path])
16 | ## download ogr2ogr.py from here https://svn.osgeo.org/gdal/trunk/gdal/swig/python/samples/ogr2ogr.py
17 | print(f"Finished Converting {full_gpx_path} to shapefile \n")
18 | except Exception as e:
19 | print("failed to convert")
20 | print(e)
21 |
22 | print("Finished converting all gpx to shp")
23 |
--------------------------------------------------------------------------------
/Open Source GIS/pyshp_shapefile.py:
--------------------------------------------------------------------------------
1 | import csv
2 | import shapefile
3 |
4 | with open(r"C:\Fran_Temp_Working_Files\obs_vids_temp\B21C0631.csv") as csv_file:
5 | csv_reader = csv.reader(csv_file, delimiter=',')
6 |
7 | with shapefile.Writer(r"C:\Fran_Temp_Working_Files\garmin_fit\pyshp_demo\new_file") as w:
8 | w.field('HeartRate', 'C')
9 |
10 | for row in csv_reader:
11 | if row[1] == '7' and row[0] == 'Data':
12 | heart_rate = row[22]
13 | position_lat_semi_circles = row[7]
14 | position_long_semi_circles = row[10]
15 |
16 | position_lat_degrees = float(position_lat_semi_circles) * (180 / 2**31)
17 | position_long_degrees = float(position_long_semi_circles) * (180 / 2 ** 31)
18 | print(f"position_lat_degrees: {position_lat_degrees}")
19 | print(f"position_long_degrees: {position_long_degrees}\n")
20 | w.point(position_long_degrees, position_lat_degrees)
21 | w.record(heart_rate)
22 |
23 |
24 | with open(r"C:\Fran_Temp_Working_Files\garmin_fit\pyshp_demo\new_file.prj", "w") as text_file:
25 | epsg = 'GEOGCS["WGS 84",'
26 | epsg += 'DATUM["WGS_1984",'
27 | epsg += 'SPHEROID["WGS 84",6378137,298.257223563]]'
28 | epsg += ',PRIMEM["Greenwich",0],'
29 | epsg += 'UNIT["degree",0.0174532925199433]]'
30 |
31 | print(epsg, file=text_file)
32 |
--------------------------------------------------------------------------------
/Python Boto3/boto3_practice.py:
--------------------------------------------------------------------------------
1 | import boto3
2 | import os
3 |
4 | s3 = boto3.resource('s3')
5 |
6 | # list all buckets and all objects
7 | # for bucket in s3.buckets.all():
8 | # my_bucket = s3.Bucket(bucket.name)
9 | #
10 | # for file in my_bucket.objects.all():
11 | # print(f"Bucket: {bucket.name} Key: {file.key}")
12 |
13 | my_bucket = s3.Bucket("fran-lambda-bucket")
14 |
15 | # List starting from a certain object
16 |
17 | working_directory = r"E:\Youtube_Videos\boto3_project"
18 |
19 | #Download objects
20 |
21 | # for file in my_bucket.objects.filter(Prefix="fran_testing/LAMBDA_GDAL/"):
22 | #
23 | # if file.key.endswith(".lrc"):
24 | #
25 | # local_file_name = os.path.join(working_directory, file.key.split("/")[2])
26 | #
27 | # print(f"Downloading {file.key} to {local_file_name}")
28 | # my_bucket.download_file(file.key, local_file_name)
29 | # print(f"Finished downloading {local_file_name}\n")
30 |
31 | # upload objects
32 |
33 | local_upload_directory = r"Z:\Frans_Files\Drone Files\20201010"
34 |
35 | for image in os.listdir(local_upload_directory):
36 | full_upload_path = os.path.join(local_upload_directory, image)
37 | print(f"uploading {full_upload_path} to youtube_examples/{image}")
38 | my_bucket.upload_file(full_upload_path, f"youtube_examples/{image}")
39 | print(f"done uploading {full_upload_path} to youtube_examples/{image}\n")
40 |
41 |
42 |
43 |
44 |
45 |
46 |
47 |
48 |
--------------------------------------------------------------------------------
/Python Boto3/boto3_storage_class.py:
--------------------------------------------------------------------------------
1 | import boto3
2 |
3 | s3_resource = boto3.resource('s3')
4 | bucket = s3_resource.Bucket("francis-destination-bucket")
5 |
6 | for bucket_obj in bucket.objects.all():
7 | object_name = bucket_obj.key
8 | storage_class = bucket_obj.storage_class
9 |
10 | print(f'This is the Key: {object_name} and this is the Storage Class: {storage_class}')
11 |
12 |
--------------------------------------------------------------------------------
/Python Boto3/launch_ec2_instances.py:
--------------------------------------------------------------------------------
1 | import boto3
2 |
3 | ec2 = boto3.resource('ec2')
4 |
5 | # instance = ec2.create_instances(
6 | # ImageId='ami-0885b1f6bd170450c',
7 | # MinCount=1,
8 | # MaxCount=2,
9 | # InstanceType='t2.micro',
10 | # )
11 | # print(instance)
12 |
13 | #ec2.instances.filter(InstanceIds=["i-0b92235b01c30cafd", "i-0d6cedd2bb5544c46"]).terminate()
14 |
15 | instances = ec2.instances.filter(Filters=[{'Name': 'instance-state-name', 'Values': ['running']}])
16 |
17 | for instance in instances:
18 | print(instance.id, instance.instance_type)
--------------------------------------------------------------------------------
/Python Boto3/list_objects_to_text_file.py:
--------------------------------------------------------------------------------
1 | import boto3
2 |
3 | s3 = boto3.resource('s3')
4 | my_bucket = s3.Bucket("fran-lambda-bucket")
5 |
6 | text_file_location = r"E:\Youtube_Videos\boto3_project\boto3_list.txt"
7 |
8 | with open(text_file_location, 'a') as text_file:
9 |
10 | for file in my_bucket.objects.all():
11 |
12 | if file.key.endswith(".lrc"):
13 | print(file.key)
14 | text_file.write(file.key + '\n')
15 |
16 |
17 |
18 |
19 |
--------------------------------------------------------------------------------
/Python Boto3/notes.txt:
--------------------------------------------------------------------------------
1 | Here are some of the posts I used in the video -
2 | Any questions please leave a comment!
3 |
4 | https://stackoverflow.com/questions/30249069/listing-contents-of-a-bucket-with-boto3
5 | https://stackoverflow.com/questions/36209068/boto3-grabbing-only-selected-objects-from-the-s3-resource
6 | https://stackoverflow.com/questions/29378763/how-to-save-s3-object-to-a-file-using-boto3
7 | https://stackoverflow.com/questions/3207219/how-do-i-list-all-files-of-a-directory
8 | https://stackoverflow.com/questions/20429246/how-to-write-to-txt-files-in-python-3
--------------------------------------------------------------------------------
/Python Folium Web Mapping/folium_convert_to_geojson.py:
--------------------------------------------------------------------------------
1 | from osgeo import gdal
2 | import folium
3 | import os
4 |
5 |
6 | def shapefile2geojson(infile, outfile):
7 |
8 | options = gdal.VectorTranslateOptions(format="GeoJSON", dstSRS="EPSG:4326")
9 | gdal.VectorTranslate(outfile, infile, options=options)
10 |
11 |
12 | infile = input("Copy/Paste the location of your shapefile: ").strip('"')
13 | output_geojson = os.path.splitext(infile)[0] + ".geojson"
14 |
15 | shapefile2geojson(infile, output_geojson)
16 |
17 |
18 | m = folium.Map(location=[37, 0],
19 | zoom_start=2.5,
20 | tiles='https://server.arcgisonline.com/arcgis/rest/services/Canvas/World_Dark_Gray_Base/MapServer/tile/{z}/{y}/{x}',
21 | attr='My Data Attribution')
22 |
23 |
24 | g = folium.GeoJson(
25 | output_geojson,
26 | name='geojson'
27 | ).add_to(m)
28 |
29 | folium.GeoJsonTooltip(fields=["NAME"]).add_to(g)
30 |
31 |
32 | m.save(r"E:\pythonProject\index_2.html")
--------------------------------------------------------------------------------
/Python Folium Web Mapping/folium_geojson.py:
--------------------------------------------------------------------------------
1 | import folium
2 |
3 | m = folium.Map(location=[37, 0],
4 | zoom_start=2.5,
5 | tiles='https://server.arcgisonline.com/arcgis/rest/services/Canvas/World_Dark_Gray_Base/MapServer/tile/{z}/{y}/{x}',
6 | attr='My Data Attribution')
7 |
8 |
9 | geojson = r"E:\pythonProject\gz_2010_us_040_00_500k\gz_2010_us_040_00_500k\gz_2010_us_040_00_500k.geojson"
10 |
11 | g = folium.GeoJson(
12 | geojson,
13 | name='geojson'
14 | ).add_to(m)
15 |
16 | folium.GeoJsonTooltip(fields=["NAME"]).add_to(g)
17 |
18 |
19 | m.save(r"E:\pythonProject\index.html")
--------------------------------------------------------------------------------
/Python Folium Web Mapping/folium_project.py:
--------------------------------------------------------------------------------
1 | import folium
2 | import requests
3 |
4 | m = folium.Map(location=[37, 0], zoom_start=2.5, tiles='Stamen Terrain')
5 |
6 |
7 | response = requests.get('https://restcountries.eu/rest/v2/all').json()
8 | #print(response)
9 |
10 | for country_data in response:
11 | #print(country_data)
12 | # print(country_data['name'])
13 |
14 | if country_data['latlng']:
15 | lat = country_data['latlng'][0]
16 | long = country_data['latlng'][1]
17 | print(country_data)
18 |         folium.Marker([lat, long], popup='{} {}'.format(country_data['name'], country_data['population']), icon=folium.Icon(icon='cloud', color='green')).add_to(m)
19 | else:
20 | continue
21 | #
22 | m.save(r"E:\pythonProject\index.html")
--------------------------------------------------------------------------------
/Python Folium Web Mapping/notes.txt:
--------------------------------------------------------------------------------
1 | https://python-visualization.github.io/folium/
2 | https://leafletjs.com/
3 | https://restcountries.eu/#api-endpoints-all
4 | https://realpython.com/python-requests/
5 |
6 | notes from videos 7-9
7 |
8 | where to download geojson and shapefile used in video -
9 | https://eric.clst.org/tech/usgeojson/
10 |
11 | leaflet basemaps providers -
12 |
13 | https://leaflet-extras.github.io/leaflet-providers/preview/
14 |
15 | esri map server -
16 | https://server.arcgisonline.com/arcgis/rest/services/
17 |
18 | shapefile to geojson command -
19 | https://shallowsky.com/blog/mapping/folium-with-shapefiles.html
20 |
21 | GDAL via Anaconda -
22 | https://anaconda.org/conda-forge/gdal
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Code From Youtube Tutorials
2 | Code from my YouTube Tutorials
3 |
--------------------------------------------------------------------------------
/Rasperry Pi Projects/GPS/gps_reader.py:
--------------------------------------------------------------------------------
1 | import pynmea2
2 |
3 | with open(r'C:\Users\Fran\Desktop\gps_10_samples.txt', mode='r') as in_file, open(r'C:\Users\Fran\Desktop\gps_10_samples_output.txt', mode='w') as out_file:
4 |
5 | for line in in_file:
6 | try:
7 | msg = pynmea2.parse(line)
8 |
9 | if msg.sentence_type == "RMC":
10 |
11 | gps_info_array = []
12 | gps_info_array.append(str(msg.datestamp))
13 | gps_info_array.append(str(msg.latitude))
14 | gps_info_array.append(str(msg.longitude))
15 | out_file.write(str(gps_info_array))
16 | out_file.write("\n")
17 |         except pynmea2.ParseError:
18 | continue
--------------------------------------------------------------------------------
/Rasperry Pi Projects/GPS/library_gps.py:
--------------------------------------------------------------------------------
1 | ## I got this code from https://maker.pro/raspberry-pi/tutorial/how-to-read-gps-data-with-python-on-a-raspberry-pi
2 |
3 | from gps import *
4 | import time
5 |
6 | running = True
7 |
8 | def getPositionData(gps):
9 |     nx = gps.next()
10 |     if nx['class'] == 'TPV':
11 |         latitude = getattr(nx, 'lat', "Unknown")
12 |         longitude = getattr(nx, 'lon', "Unknown")
13 |         print("Your position: lon = " + str(longitude) + ", lat = " + str(latitude))
14 | 
15 | gpsd = gps(mode=WATCH_ENABLE|WATCH_NEWSTYLE)
16 | 
17 | try:
18 |     print("Application started!")
19 |     while running:
20 |         getPositionData(gpsd)
21 |         time.sleep(1.0)
22 | 
23 | except KeyboardInterrupt:
24 |     running = False
25 |     print("Application closed!")
--------------------------------------------------------------------------------
/Rasperry Pi Projects/GPS/rpi_4_gps_notes.txt:
--------------------------------------------------------------------------------
1 | Raspberry Pi GPS NOTES
2 |
3 | 1) INSTALL FRESH RASPBIAN ==> Download Raspberry Pi Imager and write Raspbian Lite to sd card.
4 | 2) ENABLE SSH ==> Enable it by putting a file called ssh on the sd card before putting it in the PI
5 |
6 | https://www.raspberrypi.org/documentation/remote-access/ssh/
7 |
8 | 3) Plug in USB GPS and find it
9 |
10 | lsusb
11 | Bus 001 Device 006: ID 067b:2303 Prolific Technology, Inc. PL2303 Serial Port
12 |
13 | dmesg | grep tty
14 |
15 | 4) install GPSD and clients
16 |
17 | sudo apt update
18 | sudo apt upgrade
19 | sudo apt install gpsd gpsd-clients
20 |
21 | cgps
22 | not working... no output from the GPS receiver.
23 |
24 | need to edit config file then restart gpsd service
25 |
26 | sudo nano /etc/default/gpsd
27 | DEVICES=/dev/ttyUSB0
28 | sudo service gpsd restart
29 | sudo service gpsd status
30 |
31 | cgps
32 | now it's working.
33 | now write to a log file:
34 |
35 | gpspipe -r -o output.txt
36 | The -n 10 option prints 10 lines and then exits. Change it to a larger number if you want more GPS data logged.
37 | gpspipe -r -n 10 -o output10.txt
38 | https://gpsd.gitlab.io/gpsd/gpspipe.html
39 |
40 | install pip so we can install other packages
41 | sudo apt-get install python3-pip
42 |
43 | use this to parse a log file and generate a new one
44 | pip3 install pynmea2
45 |
46 | python3
47 | import pynmea2
48 | msg = pynmea2.parse("$GPGGA,184353.07,1929.045,S,02410.506,E,1,04,2.6,100.00,M,-33.9,M,,0000*6D")
49 | msg
50 |
51 | In the video, I modified the script on my local machine (since it's easier for me to use an IDE) and then copied it over to the Raspberry Pi via SCP.
52 | https://linuxize.com/post/how-to-use-scp-command-to-securely-transfer-files/
53 |
54 | sample scp -
55 | scp file.txt remote_username@10.10.0.2:/remote/directory
56 |
57 | Use this to print in real time gps with python and "gps" module. Similar to cgps
58 | pip3 install gps
59 | python3 library_gps.py
60 |
61 | Links/Guides I used -
62 | https://www.raspberrypi.org/software/
63 | https://maker.pro/raspberry-pi/tutorial/how-to-read-gps-data-with-python-on-a-raspberry-pi (sample scripts available at bottom of article)
64 | https://github.com/Knio/pynmea2
65 | https://ozzmaker.com/berrygps-setup-guide-raspberry-pi/
66 | https://docs.novatel.com/oem7/Content/Logs/GPRMC.htm?tocpath=Commands%20%2526%20Logs%7CLogs%7CAll%20Logs%7CGNSS%20Logs%7C_____65
67 | https://gist.github.com/wolfg1969/4653340
68 | https://stackoverflow.com/questions/50435295/read-from-file-and-write-to-another-python/50435495
69 | https://gpsd.gitlab.io/gpsd/gpspipe.html
70 | https://www.raspberrypi.org/documentation/remote-access/ssh/
71 | https://linuxize.com/post/how-to-use-scp-command-to-securely-transfer-files/
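pynmea2 performs the coordinate conversion for you (`msg.latitude` / `msg.longitude`), but for reference, here is a stdlib-only sketch of the math it applies: NMEA sentences encode latitude as `ddmm.mmmm` and longitude as `dddmm.mmmm`, with a hemisphere letter supplying the sign. The fields are taken from the sample sentence in these notes:

```python
def nmea_to_decimal(field: str, hemisphere: str) -> float:
    """Convert an NMEA ddmm.mmmm / dddmm.mmmm field to signed decimal degrees."""
    head, frac = field.split(".")
    degrees = int(head[:-2])                  # digits before the two minute digits
    minutes = float(head[-2:] + "." + frac)   # minutes, including the fraction
    value = degrees + minutes / 60.0
    return -value if hemisphere in ("S", "W") else value

# Fields taken from the sample sentence above:
# $GPGGA,184353.07,1929.045,S,02410.506,E,...
print(nmea_to_decimal("1929.045", "S"))    # about -19.4841
print(nmea_to_decimal("02410.506", "E"))   # about 24.1751
```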
--------------------------------------------------------------------------------
/React with Leaflet/my-app/.eslintcache:
--------------------------------------------------------------------------------
1 | [{"C:\\Fran_Temp_Working_Files\\leaflet_project_youtube\\my-app\\src\\index.tsx":"1","C:\\Fran_Temp_Working_Files\\leaflet_project_youtube\\my-app\\src\\reportWebVitals.ts":"2","C:\\Fran_Temp_Working_Files\\leaflet_project_youtube\\my-app\\src\\App.tsx":"3","E:\\Frans_Files\\Youtube_Videos\\github\\Code_From_Tutorials\\React with Leaflet\\my-app\\src\\index.tsx":"4","E:\\Frans_Files\\Youtube_Videos\\github\\Code_From_Tutorials\\React with Leaflet\\my-app\\src\\reportWebVitals.ts":"5","E:\\Frans_Files\\Youtube_Videos\\github\\Code_From_Tutorials\\React with Leaflet\\my-app\\src\\App.tsx":"6"},{"size":500,"mtime":499162500000,"results":"7","hashOfConfig":"8"},{"size":425,"mtime":499162500000,"results":"9","hashOfConfig":"8"},{"size":1041,"mtime":1610302293611,"results":"10","hashOfConfig":"8"},{"size":500,"mtime":499162500000,"results":"11","hashOfConfig":"12"},{"size":425,"mtime":499162500000,"results":"13","hashOfConfig":"12"},{"size":1041,"mtime":1610302293611,"results":"14","hashOfConfig":"12"},{"filePath":"15","messages":"16","errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":"17"},"1i77jhq",{"filePath":"18","messages":"19","errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0,"usedDeprecatedRules":"17"},{"filePath":"20","messages":"21","errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0},{"filePath":"22","messages":"23","errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0},"1lm6mar",{"filePath":"24","messages":"25","errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0},{"filePath":"26","messages":"27","errorCount":0,"warningCount":0,"fixableErrorCount":0,"fixableWarningCount":0},"C:\\Fran_Temp_Working_Files\\leaflet_project_youtube\\my-app\\src\\index.tsx",[],["28","29"],"C:\\Fran_Temp_Working_Files\\leaflet_project_youtube\\my-app\\src\\reportWebVitals.ts",[],"C:\\Fran_Temp_Working_Files\\leaflet_project_youtube
\\my-app\\src\\App.tsx",[],"E:\\Frans_Files\\Youtube_Videos\\github\\Code_From_Tutorials\\React with Leaflet\\my-app\\src\\index.tsx",[],"E:\\Frans_Files\\Youtube_Videos\\github\\Code_From_Tutorials\\React with Leaflet\\my-app\\src\\reportWebVitals.ts",[],"E:\\Frans_Files\\Youtube_Videos\\github\\Code_From_Tutorials\\React with Leaflet\\my-app\\src\\App.tsx",[],{"ruleId":"30","replacedBy":"31"},{"ruleId":"32","replacedBy":"33"},"no-native-reassign",["34"],"no-negated-in-lhs",["35"],"no-global-assign","no-unsafe-negation"]
--------------------------------------------------------------------------------
/React with Leaflet/my-app/README.md:
--------------------------------------------------------------------------------
1 | # Getting Started with Create React App
2 |
3 | This project was bootstrapped with [Create React App](https://github.com/facebook/create-react-app).
4 |
5 | ## Available Scripts
6 |
7 | In the project directory, you can run:
8 |
9 | ### `npm start`
10 |
11 | Runs the app in the development mode.\
12 | Open [http://localhost:3000](http://localhost:3000) to view it in the browser.
13 |
14 | The page will reload if you make edits.\
15 | You will also see any lint errors in the console.
16 |
17 | ### `npm test`
18 |
19 | Launches the test runner in the interactive watch mode.\
20 | See the section about [running tests](https://facebook.github.io/create-react-app/docs/running-tests) for more information.
21 |
22 | ### `npm run build`
23 |
24 | Builds the app for production to the `build` folder.\
25 | It correctly bundles React in production mode and optimizes the build for the best performance.
26 |
27 | The build is minified and the filenames include the hashes.\
28 | Your app is ready to be deployed!
29 |
30 | See the section about [deployment](https://facebook.github.io/create-react-app/docs/deployment) for more information.
31 |
32 | ### `npm run eject`
33 |
34 | **Note: this is a one-way operation. Once you `eject`, you can’t go back!**
35 |
36 | If you aren’t satisfied with the build tool and configuration choices, you can `eject` at any time. This command will remove the single build dependency from your project.
37 |
38 | Instead, it will copy all the configuration files and the transitive dependencies (webpack, Babel, ESLint, etc) right into your project so you have full control over them. All of the commands except `eject` will still work, but they will point to the copied scripts so you can tweak them. At this point you’re on your own.
39 |
40 | You don’t have to ever use `eject`. The curated feature set is suitable for small and middle deployments, and you shouldn’t feel obligated to use this feature. However we understand that this tool wouldn’t be useful if you couldn’t customize it when you are ready for it.
41 |
42 | ## Learn More
43 |
44 | You can learn more in the [Create React App documentation](https://facebook.github.io/create-react-app/docs/getting-started).
45 |
46 | To learn React, check out the [React documentation](https://reactjs.org/).
47 |
--------------------------------------------------------------------------------
/React with Leaflet/my-app/package.json:
--------------------------------------------------------------------------------
1 | {
2 | "name": "my-app",
3 | "version": "0.1.0",
4 | "private": true,
5 | "dependencies": {
6 | "@testing-library/jest-dom": "^5.11.8",
7 | "@testing-library/react": "^11.2.3",
8 | "@testing-library/user-event": "^12.6.0",
9 | "@types/jest": "^26.0.20",
10 | "@types/node": "^12.19.12",
11 | "@types/react": "^16.14.2",
12 | "@types/react-dom": "^16.9.10",
13 | "leaflet": "^1.7.1",
14 | "react": "^17.0.1",
15 | "react-dom": "^17.0.1",
16 | "react-leaflet": "^3.0.5",
17 | "react-scripts": "4.0.1",
18 | "typescript": "^4.1.3",
19 | "web-vitals": "^0.2.4"
20 | },
21 | "scripts": {
22 | "start": "react-scripts start",
23 | "build": "react-scripts build",
24 | "test": "react-scripts test",
25 | "eject": "react-scripts eject"
26 | },
27 | "eslintConfig": {
28 | "extends": [
29 | "react-app",
30 | "react-app/jest"
31 | ]
32 | },
33 | "browserslist": {
34 | "production": [
35 | ">0.2%",
36 | "not dead",
37 | "not op_mini all"
38 | ],
39 | "development": [
40 | "last 1 chrome version",
41 | "last 1 firefox version",
42 | "last 1 safari version"
43 | ]
44 | },
45 | "devDependencies": {
46 | "@types/leaflet": "^1.5.19"
47 | }
48 | }
49 |
--------------------------------------------------------------------------------
/React with Leaflet/my-app/public/favicon.ico:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/franchyze923/Code_From_Tutorials/6b6291f42c97abc0c33e3d8dc762dd6d5e048ee6/React with Leaflet/my-app/public/favicon.ico
--------------------------------------------------------------------------------
/React with Leaflet/my-app/public/index.html:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 |
5 |
6 |
7 |
8 |
12 |
13 |
17 |
18 |
19 |
22 |
31 |
React App
32 |
33 |
34 |
You need to enable JavaScript to run this app.
35 |
36 |
46 |
47 |
48 |
--------------------------------------------------------------------------------
/React with Leaflet/my-app/public/logo192.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/franchyze923/Code_From_Tutorials/6b6291f42c97abc0c33e3d8dc762dd6d5e048ee6/React with Leaflet/my-app/public/logo192.png
--------------------------------------------------------------------------------
/React with Leaflet/my-app/public/logo512.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/franchyze923/Code_From_Tutorials/6b6291f42c97abc0c33e3d8dc762dd6d5e048ee6/React with Leaflet/my-app/public/logo512.png
--------------------------------------------------------------------------------
/React with Leaflet/my-app/public/manifest.json:
--------------------------------------------------------------------------------
1 | {
2 | "short_name": "React App",
3 | "name": "Create React App Sample",
4 | "icons": [
5 | {
6 | "src": "favicon.ico",
7 | "sizes": "64x64 32x32 24x24 16x16",
8 | "type": "image/x-icon"
9 | },
10 | {
11 | "src": "logo192.png",
12 | "type": "image/png",
13 | "sizes": "192x192"
14 | },
15 | {
16 | "src": "logo512.png",
17 | "type": "image/png",
18 | "sizes": "512x512"
19 | }
20 | ],
21 | "start_url": ".",
22 | "display": "standalone",
23 | "theme_color": "#000000",
24 | "background_color": "#ffffff"
25 | }
26 |
--------------------------------------------------------------------------------
/React with Leaflet/my-app/public/robots.txt:
--------------------------------------------------------------------------------
1 | # https://www.robotstxt.org/robotstxt.html
2 | User-agent: *
3 | Disallow:
4 |
--------------------------------------------------------------------------------
/React with Leaflet/my-app/src/App.css:
--------------------------------------------------------------------------------
1 | .leaflet-container{
2 |
3 | width: 100vw;
4 | height: 100vh;
5 | }
--------------------------------------------------------------------------------
/React with Leaflet/my-app/src/App.tsx:
--------------------------------------------------------------------------------
1 | import React from 'react';
2 | import { MapContainer, TileLayer, Marker, Popup } from 'react-leaflet'
3 | import './App.css';
4 | import teslaData from "./data/tesla-sites.json"
5 |
6 | function App() {
7 |
8 | const filteredStations = teslaData.filter(tsla => tsla.address.country === "Italy")
9 |
10 | return (
11 |
12 |
16 |
17 | {filteredStations.map(tsla => (
18 |
19 |
20 |
21 |
{"Name: " + tsla.name}
22 |
{"Status: " + tsla.status}
23 |
{"Number of Charging Stations: " + tsla.stallCount}
24 |
25 |
26 |
27 | ))}
28 |
29 | );
30 | }
31 |
32 | export default App;
33 |
--------------------------------------------------------------------------------
/React with Leaflet/my-app/src/index.css:
--------------------------------------------------------------------------------
1 | body {
2 | margin: 0;
3 | font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', 'Roboto', 'Oxygen',
4 | 'Ubuntu', 'Cantarell', 'Fira Sans', 'Droid Sans', 'Helvetica Neue',
5 | sans-serif;
6 | -webkit-font-smoothing: antialiased;
7 | -moz-osx-font-smoothing: grayscale;
8 | }
9 |
10 | code {
11 | font-family: source-code-pro, Menlo, Monaco, Consolas, 'Courier New',
12 | monospace;
13 | }
14 |
--------------------------------------------------------------------------------
/React with Leaflet/my-app/src/index.tsx:
--------------------------------------------------------------------------------
1 | import React from 'react';
2 | import ReactDOM from 'react-dom';
3 | import './index.css';
4 | import App from './App';
5 | import reportWebVitals from './reportWebVitals';
6 |
7 | ReactDOM.render(
8 |   <React.StrictMode>
9 |     <App />
10 |   </React.StrictMode>,
11 | document.getElementById('root')
12 | );
13 |
14 | // If you want to start measuring performance in your app, pass a function
15 | // to log results (for example: reportWebVitals(console.log))
16 | // or send to an analytics endpoint. Learn more: https://bit.ly/CRA-vitals
17 | reportWebVitals();
18 |
--------------------------------------------------------------------------------
/React with Leaflet/my-app/src/react-app-env.d.ts:
--------------------------------------------------------------------------------
1 | /// <reference types="react-scripts" />
2 |
--------------------------------------------------------------------------------
/React with Leaflet/my-app/src/reportWebVitals.ts:
--------------------------------------------------------------------------------
1 | import { ReportHandler } from 'web-vitals';
2 |
3 | const reportWebVitals = (onPerfEntry?: ReportHandler) => {
4 | if (onPerfEntry && onPerfEntry instanceof Function) {
5 | import('web-vitals').then(({ getCLS, getFID, getFCP, getLCP, getTTFB }) => {
6 | getCLS(onPerfEntry);
7 | getFID(onPerfEntry);
8 | getFCP(onPerfEntry);
9 | getLCP(onPerfEntry);
10 | getTTFB(onPerfEntry);
11 | });
12 | }
13 | };
14 |
15 | export default reportWebVitals;
16 |
--------------------------------------------------------------------------------
/React with Leaflet/my-app/src/setupTests.ts:
--------------------------------------------------------------------------------
1 | // jest-dom adds custom jest matchers for asserting on DOM nodes.
2 | // allows you to do things like:
3 | // expect(element).toHaveTextContent(/react/i)
4 | // learn more: https://github.com/testing-library/jest-dom
5 | import '@testing-library/jest-dom';
6 |
--------------------------------------------------------------------------------
/React with Leaflet/my-app/tsconfig.json:
--------------------------------------------------------------------------------
1 | {
2 | "compilerOptions": {
3 | "target": "es5",
4 | "lib": [
5 | "dom",
6 | "dom.iterable",
7 | "esnext"
8 | ],
9 | "allowJs": true,
10 | "skipLibCheck": true,
11 | "esModuleInterop": true,
12 | "allowSyntheticDefaultImports": true,
13 | "strict": true,
14 | "forceConsistentCasingInFileNames": true,
15 | "noFallthroughCasesInSwitch": true,
16 | "module": "esnext",
17 | "moduleResolution": "node",
18 | "resolveJsonModule": true,
19 | "isolatedModules": true,
20 | "noEmit": true,
21 | "jsx": "react-jsx"
22 | },
23 | "include": [
24 | "src"
25 | ]
26 | }
27 |
--------------------------------------------------------------------------------
/Strava_Api/LeafletUpdates/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2016 Jan Pieter Waagmeester
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
--------------------------------------------------------------------------------
/Strava_Api/LeafletUpdates/Polyline.encoded.js:
--------------------------------------------------------------------------------
1 | /*
2 | * Utility functions to decode/encode numbers and arrays of numbers
3 | * to/from strings (Google maps polyline encoding)
4 | *
5 | * Extends the L.Polyline and L.Polygon object with methods to convert
6 | * to and create from these strings.
7 | *
8 | * Jan Pieter Waagmeester
9 | *
10 | * Original code from:
11 | * http://facstaff.unca.edu/mcmcclur/GoogleMaps/EncodePolyline/
12 | * (which is down as of December 2014)
13 | */
14 |
15 | // MIT License
16 |
17 | // Copyright (c) 2016 Jan Pieter Waagmeester
18 |
19 | // Permission is hereby granted, free of charge, to any person obtaining a copy
20 | // of this software and associated documentation files (the "Software"), to deal
21 | // in the Software without restriction, including without limitation the rights
22 | // to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
23 | // copies of the Software, and to permit persons to whom the Software is
24 | // furnished to do so, subject to the following conditions:
25 |
26 | // The above copyright notice and this permission notice shall be included in all
27 | // copies or substantial portions of the Software.
28 |
29 | // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
30 | // IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
31 | // FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
32 | // AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
33 | // LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
34 | // OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
35 | // SOFTWARE.
36 |
37 | (function () {
38 | 'use strict';
39 |
40 | var defaultOptions = function (options) {
41 | if (typeof options === 'number') {
42 | // Legacy
43 | options = {
44 | precision: options
45 | };
46 | } else {
47 | options = options || {};
48 | }
49 |
50 | options.precision = options.precision || 5;
51 | options.factor = options.factor || Math.pow(10, options.precision);
52 | options.dimension = options.dimension || 2;
53 | return options;
54 | };
55 |
56 | var PolylineUtil = {
57 | encode: function (points, options) {
58 | options = defaultOptions(options);
59 |
60 | var flatPoints = [];
61 | for (var i = 0, len = points.length; i < len; ++i) {
62 | var point = points[i];
63 |
64 | if (options.dimension === 2) {
65 | flatPoints.push(point.lat || point[0]);
66 | flatPoints.push(point.lng || point[1]);
67 | } else {
68 | for (var dim = 0; dim < options.dimension; ++dim) {
69 | flatPoints.push(point[dim]);
70 | }
71 | }
72 | }
73 |
74 | return this.encodeDeltas(flatPoints, options);
75 | },
76 |
77 | decode: function (encoded, options) {
78 | options = defaultOptions(options);
79 |
80 | var flatPoints = this.decodeDeltas(encoded, options);
81 |
82 | var points = [];
83 | for (var i = 0, len = flatPoints.length; i + (options.dimension - 1) < len;) {
84 | var point = [];
85 |
86 | for (var dim = 0; dim < options.dimension; ++dim) {
87 | point.push(flatPoints[i++]);
88 | }
89 |
90 | points.push(point);
91 | }
92 |
93 | return points;
94 | },
95 |
96 | encodeDeltas: function (numbers, options) {
97 | options = defaultOptions(options);
98 |
99 | var lastNumbers = [];
100 |
101 | for (var i = 0, len = numbers.length; i < len;) {
102 | for (var d = 0; d < options.dimension; ++d, ++i) {
103 | var num = numbers[i].toFixed(options.precision);
104 | var delta = num - (lastNumbers[d] || 0);
105 | lastNumbers[d] = num;
106 |
107 | numbers[i] = delta;
108 | }
109 | }
110 |
111 | return this.encodeFloats(numbers, options);
112 | },
113 |
114 | decodeDeltas: function (encoded, options) {
115 | options = defaultOptions(options);
116 |
117 | var lastNumbers = [];
118 |
119 | var numbers = this.decodeFloats(encoded, options);
120 | for (var i = 0, len = numbers.length; i < len;) {
121 | for (var d = 0; d < options.dimension; ++d, ++i) {
122 | numbers[i] = Math.round((lastNumbers[d] = numbers[i] + (lastNumbers[d] || 0)) * options.factor) / options.factor;
123 | }
124 | }
125 |
126 | return numbers;
127 | },
128 |
129 | encodeFloats: function (numbers, options) {
130 | options = defaultOptions(options);
131 |
132 | for (var i = 0, len = numbers.length; i < len; ++i) {
133 | numbers[i] = Math.round(numbers[i] * options.factor);
134 | }
135 |
136 | return this.encodeSignedIntegers(numbers);
137 | },
138 |
139 | decodeFloats: function (encoded, options) {
140 | options = defaultOptions(options);
141 |
142 | var numbers = this.decodeSignedIntegers(encoded);
143 | for (var i = 0, len = numbers.length; i < len; ++i) {
144 | numbers[i] /= options.factor;
145 | }
146 |
147 | return numbers;
148 | },
149 |
150 | encodeSignedIntegers: function (numbers) {
151 | for (var i = 0, len = numbers.length; i < len; ++i) {
152 | var num = numbers[i];
153 | numbers[i] = (num < 0) ? ~(num << 1) : (num << 1);
154 | }
155 |
156 | return this.encodeUnsignedIntegers(numbers);
157 | },
158 |
159 | decodeSignedIntegers: function (encoded) {
160 | var numbers = this.decodeUnsignedIntegers(encoded);
161 |
162 | for (var i = 0, len = numbers.length; i < len; ++i) {
163 | var num = numbers[i];
164 | numbers[i] = (num & 1) ? ~(num >> 1) : (num >> 1);
165 | }
166 |
167 | return numbers;
168 | },
169 |
170 | encodeUnsignedIntegers: function (numbers) {
171 | var encoded = '';
172 | for (var i = 0, len = numbers.length; i < len; ++i) {
173 | encoded += this.encodeUnsignedInteger(numbers[i]);
174 | }
175 | return encoded;
176 | },
177 |
178 | decodeUnsignedIntegers: function (encoded) {
179 | var numbers = [];
180 |
181 | var current = 0;
182 | var shift = 0;
183 |
184 | for (var i = 0, len = encoded.length; i < len; ++i) {
185 | var b = encoded.charCodeAt(i) - 63;
186 |
187 | current |= (b & 0x1f) << shift;
188 |
189 | if (b < 0x20) {
190 | numbers.push(current);
191 | current = 0;
192 | shift = 0;
193 | } else {
194 | shift += 5;
195 | }
196 | }
197 |
198 | return numbers;
199 | },
200 |
201 | encodeSignedInteger: function (num) {
202 | num = (num < 0) ? ~(num << 1) : (num << 1);
203 | return this.encodeUnsignedInteger(num);
204 | },
205 |
206 | // This function is very similar to Google's, but I added
207 | // some stuff to deal with the double slash issue.
208 | encodeUnsignedInteger: function (num) {
209 | var value, encoded = '';
210 | while (num >= 0x20) {
211 | value = (0x20 | (num & 0x1f)) + 63;
212 | encoded += (String.fromCharCode(value));
213 | num >>= 5;
214 | }
215 | value = num + 63;
216 | encoded += (String.fromCharCode(value));
217 |
218 | return encoded;
219 | }
220 | };
221 |
222 | // Export Node module
223 | if (typeof module === 'object' && typeof module.exports === 'object') {
224 | module.exports = PolylineUtil;
225 | }
226 |
227 | // Inject functionality into Leaflet
228 | if (typeof L === 'object') {
229 | if (!(L.Polyline.prototype.fromEncoded)) {
230 | L.Polyline.fromEncoded = function (encoded, options) {
231 | return L.polyline(PolylineUtil.decode(encoded), options);
232 | };
233 | }
234 | if (!(L.Polygon.prototype.fromEncoded)) {
235 | L.Polygon.fromEncoded = function (encoded, options) {
236 | return L.polygon(PolylineUtil.decode(encoded), options);
237 | };
238 | }
239 |
240 | var encodeMixin = {
241 | encodePath: function () {
242 | return PolylineUtil.encode(this.getLatLngs());
243 | }
244 | };
245 |
246 | if (!L.Polyline.prototype.encodePath) {
247 | L.Polyline.include(encodeMixin);
248 | }
249 | if (!L.Polygon.prototype.encodePath) {
250 | L.Polygon.include(encodeMixin);
251 | }
252 |
253 | L.PolylineUtil = PolylineUtil;
254 | }
255 | })();
--------------------------------------------------------------------------------
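For reference, the delta + zigzag + 5-bit varint scheme that Polyline.encoded.js implements can be sketched in Python. This is not the author's code, just a minimal standalone sketch assuming precision 5 and 2-D (lat, lng) points; the function names are made up for illustration:

```python
def _encode_uint(num):
    # Emit 5-bit chunks, least significant first; 0x20 flags "more chunks follow".
    out = []
    while num >= 0x20:
        out.append(chr((0x20 | (num & 0x1F)) + 63))
        num >>= 5
    out.append(chr(num + 63))
    return "".join(out)


def encode_polyline(points, precision=5):
    """Encode (lat, lng) pairs: scale, delta from previous point, zigzag, varint."""
    factor = 10 ** precision
    out, prev = [], [0, 0]
    for lat, lng in points:
        for d, value in enumerate((lat, lng)):
            scaled = round(value * factor)
            delta = scaled - prev[d]
            prev[d] = scaled
            # zigzag: fold the sign into the low bit
            out.append(_encode_uint(~(delta << 1) if delta < 0 else delta << 1))
    return "".join(out)


def decode_polyline(encoded, precision=5):
    """Inverse of encode_polyline: varints -> zigzag -> cumulative deltas -> floats."""
    factor = 10 ** precision
    points, i = [], 0
    lat = lng = 0
    while i < len(encoded):
        deltas = []
        for _ in range(2):
            result = shift = 0
            while True:
                b = ord(encoded[i]) - 63
                i += 1
                result |= (b & 0x1F) << shift
                shift += 5
                if b < 0x20:
                    break
            deltas.append(~(result >> 1) if result & 1 else result >> 1)
        lat += deltas[0]
        lng += deltas[1]
        points.append((lat / factor, lng / factor))
    return points
```

`decode_polyline` mirrors the plugin's `decodeUnsignedIntegers`/`decodeSignedIntegers`/`decodeDeltas` chain collapsed into one loop, and round-trips with `encode_polyline`.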
/Strava_Api/LeafletUpdates/index.html:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 |
5 |
8 |
9 |
12 |
13 |
14 |
15 |
16 |
17 |
18 | Hello
19 |
20 |
21 |
22 |
23 |
24 |
25 |
--------------------------------------------------------------------------------
/Strava_Api/LeafletUpdates/strava_api.js:
--------------------------------------------------------------------------------
1 |
2 | const auth_link = "https://www.strava.com/oauth/token"
3 |
4 | function getActivites(res){
5 |
6 | const activities_link = `https://www.strava.com/api/v3/athlete/activities?access_token=${res.access_token}`
7 | fetch(activities_link)
8 | .then((res) => res.json())
9 | .then(function (data){
10 |
11 | var map = L.map('map').setView([38.895417, -77.033616], 11);
12 |
13 | L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', {
14 | attribution: '© OpenStreetMap contributors'
15 | }).addTo(map);
16 |
17 | for(var x=0; x res.json())
59 | .then(res => getActivites(res))
60 | }
61 |
62 | reAuthorize()
--------------------------------------------------------------------------------
/Strava_Api/LeafletUpdates/styles.css:
--------------------------------------------------------------------------------
1 | #map {
2 |
3 | height: 800px;
4 | width: 800px;
5 | display: inline-block;
6 | }
7 |
8 | body{
9 |
10 | text-align: center;
11 | }
--------------------------------------------------------------------------------
/Strava_Api/index.html:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 |
5 |
6 | Hello
7 |
8 |
9 |
10 |
11 |
--------------------------------------------------------------------------------
/Strava_Api/request_links.txt:
--------------------------------------------------------------------------------
1 | 1) Get an authorization code from the authorization page. This is a one-time, manual step.
2 | Paste the URL below into a browser, hit enter, then grab the "code" parameter from the resulting URL.
3 |
4 | https://www.strava.com/oauth/authorize?client_id=your_client_id&redirect_uri=http://localhost&response_type=code&scope=activity:read_all
5 |
6 | 2) Exchange authorization code for access token & refresh token
7 |
8 | https://www.strava.com/oauth/token?client_id=your_client_id&client_secret=your_client_secret&code=your_code_from_previous_step&grant_type=authorization_code
9 |
10 | 3) View your activities using the access token just received
11 |
12 | https://www.strava.com/api/v3/athlete/activities?access_token=access_token_from_previous_step
13 |
14 | 4) Use the refresh token to get new access tokens
15 |
16 | https://www.strava.com/oauth/token?client_id=your_client_id&client_secret=your_client_secret&refresh_token=your_refresh_token_from_previous_step&grant_type=refresh_token
17 |
--------------------------------------------------------------------------------
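Step 2 above (the one-time code-for-token exchange) looks like this in Python with `requests`. The endpoint and `grant_type` come from the URL above; the function name and placeholder arguments are illustrative:

```python
import requests


def exchange_auth_code(client_id, client_secret, code):
    """One-time exchange (step 2 above): trade the authorization code
    for an access token and a refresh token."""
    res = requests.post("https://www.strava.com/oauth/token", data={
        "client_id": client_id,
        "client_secret": client_secret,
        "code": code,
        "grant_type": "authorization_code",
    })
    res.raise_for_status()
    tokens = res.json()  # response also includes expires_at
    return tokens["access_token"], tokens["refresh_token"]
```

Store the returned refresh token: it is what the scripts below use to mint fresh access tokens without repeating the manual browser step.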
/Strava_Api/strava_api.js:
--------------------------------------------------------------------------------
1 |
2 | const auth_link = "https://www.strava.com/oauth/token"
3 |
4 | function getActivites(res){
5 |
6 | const activities_link = `https://www.strava.com/api/v3/athlete/activities?access_token=${res.access_token}`
7 | fetch(activities_link)
8 | .then((res) => res.json()).then(data => console.log(data)) // res.json() returns a Promise, so resolve it before logging
9 | }
10 |
11 | function reAuthorize(){
12 | fetch(auth_link,{
13 | method: 'post',
14 | headers: {
15 | 'Accept': 'application/json, text/plain, */*',
16 | 'Content-Type': 'application/json'
17 |
18 | },
19 |
20 | body: JSON.stringify({
21 |
22 | client_id: 'xxxx',
23 | client_secret: 'xxxx',
24 | refresh_token: 'xxxx',
25 | grant_type: 'refresh_token'
26 | })
27 | }).then(res => res.json())
28 | .then(res => getActivites(res))
29 | }
30 |
31 | reAuthorize()
--------------------------------------------------------------------------------
/Strava_Api/strava_api.py:
--------------------------------------------------------------------------------
1 | import requests
2 | import urllib3
3 | urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
4 |
5 | auth_url = "https://www.strava.com/oauth/token"
6 | activites_url = "https://www.strava.com/api/v3/athlete/activities"
7 |
8 | payload = {
9 | 'client_id': "xxxx",
10 | 'client_secret': 'xxxx',
11 | 'refresh_token': 'xxxx',
12 | 'grant_type': "refresh_token",
13 | 'f': 'json'
14 | }
15 |
16 | print("Requesting Token...\n")
17 | res = requests.post(auth_url, data=payload, verify=False)
18 | access_token = res.json()['access_token']
19 | print("Access Token = {}\n".format(access_token))
20 |
21 | header = {'Authorization': 'Bearer ' + access_token}
22 | param = {'per_page': 200, 'page': 1}
23 | my_dataset = requests.get(activites_url, headers=header, params=param).json()
24 |
25 | print(my_dataset[0]["name"])
26 | print(my_dataset[0]["map"]["summary_polyline"])
--------------------------------------------------------------------------------
/Strava_Api/strava_api_ALL_activities.py:
--------------------------------------------------------------------------------
1 | import requests
2 | import urllib3
3 | urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
4 |
5 | auth_url = "https://www.strava.com/oauth/token"
6 | activites_url = "https://www.strava.com/api/v3/athlete/activities"
7 |
8 | payload = {
9 | 'client_id': "xxxx",
10 | 'client_secret': 'xxxx',
11 | 'refresh_token': 'xxxx',
12 | 'grant_type': "refresh_token",
13 | 'f': 'json'
14 | }
15 |
16 | print("Requesting Token...\n")
17 | res = requests.post(auth_url, data=payload, verify=False)
18 | access_token = res.json()['access_token']
19 |
20 | print("Access Token = {}\n".format(access_token))
21 | header = {'Authorization': 'Bearer ' + access_token}
22 |
23 | # On the first pass request_page_num is 1, so we request the first page. It is incremented after
24 | # each request, so the next pass requests the second page, then the third, and so on.
25 | request_page_num = 1
26 | all_activities = []
27 |
28 | while True:
29 | param = {'per_page': 200, 'page': request_page_num}
30 | # request the current page of activities
31 | my_dataset = requests.get(activites_url, headers=header, params=param).json()
32 |
33 | # An empty response means there is no data left. For example, with 1000 activities at 200 per
34 | # page, the 6th request (page 6) returns nothing, so we break out of the loop.
35 | if len(my_dataset) == 0:
36 | print("Empty response: no more activities, breaking out of the while loop")
37 | break
38 |
39 | # if the all_activities list is already populated, that means we want to add additional data to it via extend.
40 | if all_activities:
41 | print("all_activities is populated")
42 | all_activities.extend(my_dataset)
43 |
44 | # if the all_activities is empty, this is the first time adding data so we just set it equal to my_dataset
45 | else:
46 | print("all_activities is NOT populated")
47 | all_activities = my_dataset
48 |
49 | request_page_num += 1
50 |
51 | print(len(all_activities))
52 | for count, activity in enumerate(all_activities):
53 | print(activity["name"])
54 | print(count)
55 |
--------------------------------------------------------------------------------
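The while-loop above can also be packaged as a generator. Here is a minimal sketch; `iter_activities` and `get_page` are made-up names, and `get_page` stands in for the `requests.get(...).json()` call so the pagination logic can be shown (and exercised) without network access:

```python
def iter_activities(get_page, per_page=200):
    """Yield activities page by page until the API returns an empty page.

    get_page(page, per_page) must return a list (possibly empty) of
    activity dicts, mirroring Strava's paged activities endpoint.
    """
    page = 1
    while True:
        batch = get_page(page, per_page)
        if not batch:  # empty page: no activities left
            return
        yield from batch
        page += 1
```

With a generator the caller can stop early (e.g. `next()` only the most recent activity) without fetching every page first.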
/Strava_Api/strava_arc/script2.js:
--------------------------------------------------------------------------------
1 | function getMiles(i) {
2 | return i * 0.000621371192;
3 | }
4 |
5 | const auth_user = 'https://www.strava.com/oauth/authorize?client_id=xxxx&redirect_uri=http://localhost&response_type=code&scope=activity:read';
6 | // the above link is how the end user authorizes. This is done once manually; afterwards we use the access/refresh tokens
7 | const auth_link = 'https://www.strava.com/oauth/token';
8 | // above link used for getting token and refreshing token
9 | //const activites_link = 'https://www.strava.com/api/v3/athlete/activities?access_token=xxxxxx'
10 | // above link used for getting activities
11 |
12 | function getActivites(code) {
13 | const activites_link = `https://www.strava.com/api/v3/athlete/activities?access_token=${code.access_token}`
14 | console.log(code)
15 | fetch(activites_link)
16 | .then((res) => res.json())
17 | //.then(res => console.log(res));
18 | .then(function (data) {
19 |
20 | require([
21 | "esri/Map",
22 | "esri/views/MapView",
23 | "esri/Graphic",
24 | "esri/layers/FeatureLayer",
25 | ], function (Map, MapView, Graphic, FeatureLayer) {
26 |
27 | var map = new Map({
28 | basemap: "dark-gray"
29 | });
30 | // explore this later: this is how to add layers hosted on AGOL
31 | // var featureLayer = new FeatureLayer({
32 | // url: "https://services9.arcgis.com/X6ugfq6o3XVY10xo/arcgis/rest/services/pa_counties_clipshp/FeatureServer"
33 | // });
34 |
35 | // map.add(featureLayer);
36 | var view = new MapView({
37 | center: [-77.033616, 38.895417],
38 | container: "viewDiv",
39 | map: map,
40 | zoom: 11
41 | });
42 |
43 | for (var x = 0; x < data.length; x++) {
44 |
45 | console.log(data[x])
46 | var coordinates = L.Polyline.fromEncoded(data[x].map.summary_polyline).getLatLngs();
47 | console.log(coordinates);
48 | var coord_array = [];
49 |
50 | for (var y = 0; y < coordinates.length; y++) {
51 | var new_arr = [];
52 |
53 | new_arr.push(coordinates[y].lng);
54 | new_arr.push(coordinates[y].lat);
55 | coord_array.push(new_arr);
56 | // console.log(coordinates[x].lng);
57 | // console.log(coordinates[x].lat);
58 | }
59 | //console.log(coord_array);
60 |
61 | if (x === 0) {
62 | weight = 4;
63 | run_color = "yellow";
64 | } else if (x === 1) {
65 | weight = 3;
66 | run_color = "blue"
67 | } else if (x === 2) {
68 | weight = 2;
69 | run_color = "pink";
70 | } else {
71 | weight = 1;
72 | run_color = "red"
73 | }
74 |
75 | var polyline = {
76 | type: "polyline", // autocasts as new Polyline()
77 | paths: [
78 | coord_array
79 | ]
80 | };
81 |
82 | // Create a symbol for drawing the line
83 | var lineSymbol = {
84 | type: "simple-line", // autocasts as SimpleLineSymbol()
85 | color: run_color,
86 | width: weight
87 | };
88 |
89 | var lineAtt = {
90 | Name: data[x].name,
91 | Owner: data[x].athlete.id,
92 | Length: getMiles(data[x].distance)
93 | };
94 |
95 | var polylineGraphic = new Graphic({
96 | geometry: polyline,
97 | symbol: lineSymbol,
98 | attributes: lineAtt,
99 | popupTemplate: { // autocasts as new PopupTemplate()
100 | title: "{Name}",
101 | content: [{
102 | type: "fields",
103 | fieldInfos: [{
104 | fieldName: "Name"
105 | }, {
106 | fieldName: "Owner"
107 | }, {
108 | fieldName: "Length"
109 | }]
110 | }]
111 | }
112 | });
113 |
114 | // Add the graphics to the view's graphics layer
115 | view.graphics.addMany([polylineGraphic]);
116 | }
117 | });
118 | })
119 | .catch(function (error) {
120 | console.log(error);
121 | })
122 | };
123 |
124 | function reAuthorize() {
125 | fetch(auth_link, {
126 | method: 'post',
127 |
128 | headers: {
129 | 'Accept': 'application/json, text/plain, */*',
130 | 'Content-Type': 'application/json'
131 | },
132 | body: JSON.stringify({
133 |
134 | client_id: 'xxxx',
135 | client_secret: 'xxxxx',
136 | refresh_token: 'xxxxx',
137 | grant_type: 'refresh_token'
138 |
139 | })
140 |
141 | }).then(res => res.json())
142 | .then(res => getActivites(res))
143 | }
144 |
145 | reAuthorize()
--------------------------------------------------------------------------------
/Strava_Api/strava_arc/strava_arc.html:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 |
5 |
6 |
7 | FranRunningAppArcGIS
8 |
9 |
10 |
13 |
14 |
15 |
16 |
17 |
31 |
32 |
33 |
34 |
35 |
36 |
37 |
38 |
39 |
40 |
41 |
--------------------------------------------------------------------------------
/Strava_Api/strava_leaflet/script.js:
--------------------------------------------------------------------------------
1 | function getMiles(i) {
2 | return i * 0.000621371192;
3 | }
4 |
5 | const auth_user = 'https://www.strava.com/oauth/authorize?client_id=xxx&redirect_uri=http://localhost&response_type=code&scope=activity:read';
6 | // the above link is how the end user authorizes. This is done once manually; afterwards we use the access/refresh tokens
7 | const auth_link = 'https://www.strava.com/oauth/token';
8 | // above link used for getting token and refreshing token
9 | //const activites_link = 'https://www.strava.com/api/v3/athlete/activities?access_token=xxxx'
10 | // above link used for getting activities
11 |
12 | function getActivites(code) {
13 | const activites_link = `https://www.strava.com/api/v3/athlete/activities?access_token=${code.access_token}`
14 | console.log(code)
15 | fetch(activites_link)
16 | .then((res) => res.json())
17 | //.then(res => console.log(res));
18 | .then(function (data) {
19 |
20 | var today = new Date();
21 | var dd = today.getDate();
22 | var mm = today.getMonth() + 1; //January is 0!
23 | var yyyy = today.getFullYear();
24 | if (dd < 10) {
25 | dd = '0' + dd;
26 | }
27 |
28 | if (mm < 10) {
29 | mm = '0' + mm;
30 | }
31 |
32 | today = yyyy + '-' + mm + '-' + dd;
33 | console.log(today);
34 | console.log(data[0].start_date.slice(0, 10));
35 |
36 | if (today === data[0].start_date.slice(0, 10)) {
37 |
38 | document.getElementById('run').innerHTML = `user ran ${getMiles(data[0].distance)} miles today !`
39 | } else {
40 | document.getElementById('run').innerHTML = `user has not run today!!`;
41 | }
42 |
43 | var mymap = L.map('mapid').setView([38.895417, -77.033616], 11);
44 | L.tileLayer('https://api.mapbox.com/styles/v1/{id}/tiles/{z}/{x}/{y}?access_token={accessToken}', {
45 | attribution: '© Mapbox © OpenStreetMap Improve this map ',
46 | tileSize: 512,
47 | maxZoom: 18,
48 | zoomOffset: -1,
49 | id: 'mapbox/streets-v11',
50 | accessToken: 'xxxxx'
51 | }).addTo(mymap);
52 |
53 |
54 | var run_color = "red";
55 | var weight = 5;
56 |
57 | for (var x = 0; x < data.length; x++) {
58 |
59 | var act_id = data[x].id;
60 | console.log(act_id);
61 | console.log(data[x])
62 |
63 | var coordinates = L.Polyline.fromEncoded(data[x].map.summary_polyline).getLatLngs();
64 |
65 | if (x === 0) {
66 | weight = 7;
67 | run_color = "green";
68 | } else if (x === 1) {
69 | weight = 6;
70 | run_color = "blue"
71 | } else if (x === 2) {
72 | weight = 5;
73 | run_color = "pink";
74 | } else {
75 | weight = 2;
76 | run_color = "red"
77 | }
78 |
79 | L.polyline(
80 | coordinates,
81 | {
82 | color: run_color,
83 | weight: weight,
84 | opacity: .7,
85 | lineJoin: 'round'
86 |
87 | }
88 | ).addTo(mymap);
89 | }
90 | })
91 | .catch(function (error) {
92 | console.log(error);
93 | });
94 |
95 | }
96 |
97 | function reAuthorize() {
98 | fetch(auth_link, {
99 | method: 'post',
100 |
101 | headers: {
102 | 'Accept': 'application/json, text/plain, */*',
103 | 'Content-Type': 'application/json'
104 | },
105 | body: JSON.stringify({
106 |
107 | client_id: 'xxx',
108 | client_secret: 'xxx',
109 | refresh_token: 'xxx',
110 | grant_type: 'refresh_token'
111 |
112 | })
113 |
114 | }).then(res => res.json())
115 | .then(res => getActivites(res))
116 | }
117 |
118 | reAuthorize()
--------------------------------------------------------------------------------
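The hand-rolled zero-padding of `dd`/`mm` in the script above is what ISO date formatting gives for free. A Python sketch of the same "did the user run today" check (`start_date` is the field name from the Strava payload; the function name is illustrative):

```python
from datetime import date


def ran_today(start_date_iso, today=None):
    """True if an ISO timestamp like '2021-09-06T12:34:56Z' falls on 'today'
    (a 'YYYY-MM-DD' string, defaulting to the current date)."""
    if today is None:
        today = date.today().isoformat()  # already zero-padded YYYY-MM-DD
    return start_date_iso[:10] == today
```

Comparing the first ten characters works because ISO-8601 timestamps always start with the zero-padded `YYYY-MM-DD` date.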
/Strava_Api/strava_leaflet/strava_leaflet.html:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 |
5 |
6 |
7 |
8 | FranRunningAppLeaflet
9 |
12 |
15 |
16 |
17 |
18 |
19 |
20 |
21 |
22 | Fran page
23 |
24 |
25 |
26 |
27 |
28 |
--------------------------------------------------------------------------------
/Strava_Api/strava_leaflet/style.css:
--------------------------------------------------------------------------------
1 | #mapid{
2 | height: 800px;
3 | width:800px;
4 |
5 | display: inline-block;
6 | border-style: solid;
7 | border-width: medium;
8 |
9 | font-family: 'Open Sans Condensed', sans-serif;
10 |
11 | }
12 |
13 | body{
14 | text-align: center;
15 | }
--------------------------------------------------------------------------------
/Untappd API/config.py:
--------------------------------------------------------------------------------
1 | credentials = {"client_id":"put client id here", "client_secret":"put client secret here"}
--------------------------------------------------------------------------------
/Untappd API/untappd_api.geojson:
--------------------------------------------------------------------------------
1 | {"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.855003, 33.852501]}, "properties": {"beer_name": "Nelson X", "venue_name": "Untappd at Home", "checkin_comment": "1000th unique check in! Solid ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.855003, 33.852501]}, "properties": {"beer_name": "Amarillo X", "venue_name": "Brewery X", "checkin_comment": "Nice clean tame ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.855003, 33.852501]}, "properties": {"beer_name": "Slap & Tickle", "venue_name": "Brewery X", "checkin_comment": "Tasty west coast ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.855003, 33.852501]}, "properties": {"beer_name": "Pint Whacker", "venue_name": "Brewery X", "checkin_comment": "Solid ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-122.662003, 38.2724]}, "properties": {"beer_name": "Maximus Colossal IPA", "venue_name": "Untappd at Home", "checkin_comment": "Big ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.205002, 32.7719]}, "properties": {"beer_name": "Weekend Vibes", "venue_name": "Untappd at Home", "checkin_comment": "Nice ip"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-118.314003, 33.847099]}, "properties": {"beer_name": "10th Anniversary IPA", "venue_name": "Untappd at Home", "checkin_comment": "Nice ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.702004, 34.109001]}, "properties": {"beer_name": "Baseline", "venue_name": "Untappd at Home", "checkin_comment": "Very smooth , doesn\u2019t taste like 8%"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-119.236999, 34.255901]}, "properties": {"beer_name": "Short-Lived W/ There Does Not Exist", "venue_name": "Untappd at Home", "checkin_comment": "Reminds me of tired hands"}}, {"type": "Feature", 
"geometry": {"type": "Point", "coordinates": [-117.206001, 32.7542]}, "properties": {"beer_name": "Dungeon Map", "venue_name": "Untappd at Home", "checkin_comment": "Very tasty"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.348, 33.159599]}, "properties": {"beer_name": "Coastal Access", "venue_name": "Untappd at Home", "checkin_comment": "Tasty"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.120003, 33.1157]}, "properties": {"beer_name": "Stone Vengeful Spirit IPA", "venue_name": "Untappd at Home", "checkin_comment": "Nice ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.157997, 32.888199]}, "properties": {"beer_name": "Habanero Sculpin", "venue_name": "Ballast Point - Anaheim", "checkin_comment": "Spicy"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.157997, 32.888199]}, "properties": {"beer_name": "Big Gus", "venue_name": "Ballast Point - Anaheim", "checkin_comment": "Decent"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.926002, 33.815399]}, "properties": {"beer_name": "Bad Motivator IPA", "venue_name": "Oga's Cantina", "checkin_comment": "At the cantina! 
Good ipa, crisp"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.120003, 33.1157]}, "properties": {"beer_name": "Stone Ruination Double IPA 2.0 Sans Filtre", "venue_name": "Stone Brewing World Bistro & Gardens", "checkin_comment": "Very tasty"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.120003, 33.1157]}, "properties": {"beer_name": "Stone Imperial Star Fawker", "venue_name": "Stone Brewing World Bistro & Gardens", "checkin_comment": "Big hazy ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.120003, 33.1157]}, "properties": {"beer_name": "Stone Hazy IPA", "venue_name": "Stone Brewing World Bistro & Gardens", "checkin_comment": "Tasty hazy"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.120003, 33.1157]}, "properties": {"beer_name": "Stone 25th Anniversary Triple IPA", "venue_name": "Stone Brewing Tap Room", "checkin_comment": "Big ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-119.299004, 34.278]}, "properties": {"beer_name": "Chief Peak IPA", "venue_name": "La Grande Orange Cafe", "checkin_comment": "Citrusy and hoppy ipa , I like it"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-116.765999, 32.8354]}, "properties": {"beer_name": "Duet", "venue_name": "Crush & Brew", "checkin_comment": "Awesome ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.120003, 33.1157]}, "properties": {"beer_name": "Stone Enjoy By 09.06.21 IPA", "venue_name": "Stone Brewing Tap Room", "checkin_comment": "Solid ipa and fresh!"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.649002, 34.0956]}, "properties": {"beer_name": "Fun Finder IPA", "venue_name": "Rescue Brewing Co.", "checkin_comment": "Good ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.611999, 34.098499]}, "properties": {"beer_name": "Glass Bong Rip", "venue_name": "King's Brewing", 
"checkin_comment": "Tasty ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.599998, 34.092701]}, "properties": {"beer_name": "Sunsation IPA", "venue_name": "Untappd at Home", "checkin_comment": "Tasty ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.591896, 34.088039]}, "properties": {"beer_name": "Stud Breaker", "venue_name": "Hamilton Family Brewery", "checkin_comment": "Tasty ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.688003, 34.1035]}, "properties": {"beer_name": "2-Headed Monster Hazy IPA", "venue_name": "R\u00f6k House Brewing Company", "checkin_comment": "Tasty ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.688003, 34.1035]}, "properties": {"beer_name": "Hammer Of Thor", "venue_name": "R\u00f6k House Brewing Company", "checkin_comment": "Nice ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.691002, 34.1092]}, "properties": {"beer_name": "Cream Ale", "venue_name": "Last Name Brewing", "checkin_comment": "Very refreshing after a long hike"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-118.344002, 34.0602]}, "properties": {"beer_name": "Bullitt West Coast IPA", "venue_name": "All Season Brewing Company", "checkin_comment": "Tasty , hoppy ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-87.967697, 43.042198]}, "properties": {"beer_name": "Mickey's", "venue_name": "Untappd at Home", "checkin_comment": "Tasty if ice cold"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-117.231003, 32.8321]}, "properties": {"beer_name": "Boat Shoes Hazy IPA", "venue_name": "Copehouse Bar & Bistro", "checkin_comment": "Good ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-118.191002, 33.770302]}, "properties": {"beer_name": "LBC IPA", "venue_name": "The Whisper House", "checkin_comment": "Tasty ipa"}}, {"type": "Feature", 
"geometry": {"type": "Point", "coordinates": [-117.141998, 34.083099]}, "properties": {"beer_name": "Betty IPA", "venue_name": "Hangar 24 Craft Brewery", "checkin_comment": "Nice ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-111.843002, 33.3032]}, "properties": {"beer_name": "MoonJuice", "venue_name": "Fairmont Scottsdale Princess", "checkin_comment": "Solid ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-111.649002, 35.1964]}, "properties": {"beer_name": "Flagstaff IPA", "venue_name": "Open Range Grill and Tavern", "checkin_comment": "Good ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-111.943001, 33.4217]}, "properties": {"beer_name": "Church Music", "venue_name": "Vino loco", "checkin_comment": "Good ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-111.650002, 35.199402]}, "properties": {"beer_name": "Westie", "venue_name": "Dark Sky Brewing Company", "checkin_comment": "Great ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-106.647003, 35.0928]}, "properties": {"beer_name": "India Pale Ale", "venue_name": "Marble Brewery Downtown", "checkin_comment": "Tasty ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-106.647003, 35.0928]}, "properties": {"beer_name": "IPA-X", "venue_name": "Marble Brewery Downtown", "checkin_comment": "Good ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-106.613998, 35.117901]}, "properties": {"beer_name": "Elevated IPA", "venue_name": "10,400 Feet / Sandia Crest", "checkin_comment": "Good ipa and awesome view via America\u2019s largest Tram!"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-95.460999, 29.805901]}, "properties": {"beer_name": "Hopadillo IPA", "venue_name": "Polk St Eats", "checkin_comment": "Nice ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-123.107002, 44.058201]}, "properties": 
{"beer_name": "Bubble Stash", "venue_name": "Polk St Eats", "checkin_comment": "Decent , very light tasting like a session ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-97.503899, 35.334301]}, "properties": {"beer_name": "Kveiking Punch IPA", "venue_name": "Anthem Brewing Company", "checkin_comment": "Tasty IPA"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-97.518997, 35.480499]}, "properties": {"beer_name": "Magic Juice", "venue_name": "Bricktown Brewery", "checkin_comment": "Nice DIPA"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-97.503899, 35.334301]}, "properties": {"beer_name": "Hoparazzi IPA", "venue_name": "Bricktown Brewery", "checkin_comment": "Good ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-92.268601, 34.756802]}, "properties": {"beer_name": "Nine Killer Imperial IPA", "venue_name": "Flyway Brewing Company", "checkin_comment": "Awesome DIPA"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-92.268601, 34.756802]}, "properties": {"beer_name": "Early Bird IPA", "venue_name": "Flyway Brewing Company", "checkin_comment": "Tasty ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-86.786301, 36.181301]}, "properties": {"beer_name": "Homestyle", "venue_name": "Jason Aldean's Kitchen + Rooftop Bar", "checkin_comment": "Nice ipa"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [-89.968803, 35.146801]}, "properties": {"beer_name": "Ananda", "venue_name": "Pinewood Social", "checkin_comment": "Nice ipa"}}]}
--------------------------------------------------------------------------------
/Untappd API/untappd_api.py:
--------------------------------------------------------------------------------
1 | import requests
2 | from geojson import Point, Feature, FeatureCollection, dump
3 | import config
4 |
5 | client_id = config.credentials.get("client_id")
6 | client_secret = config.credentials.get("client_secret")
7 |
8 | headers = {
9 |
10 | 'User-Agent': 'franchyze923_beer_viewer',
11 | 'From': 'franchyze923@gmail.com'
12 |
13 | }
14 |
15 |
16 | response = requests.get(f"https://api.untappd.com/v4/user/checkins/franchyze923?client_id={client_id}&client_secret={client_secret}&limit=50", headers=headers).json()
17 |
18 | checkins = response['response']['checkins']['items']
19 |
20 | features = []
21 |
22 | for checkin in checkins:
23 |     # Use dict defaults so a missing "beer"/"venue"/"brewery" key doesn't raise AttributeError
24 |     checkin_comment = checkin.get("checkin_comment")
25 |     beer_name = checkin.get("beer", {}).get("beer_name")
26 |     venue_name = checkin.get("venue", {}).get("venue_name")
27 |     location = checkin.get("brewery", {}).get("location", {})
28 |     lat, lon = location.get("lat"), location.get("lng")
29 |
30 |     print(f"Check-in comment: {checkin_comment}")
31 |     print(f"Beer name: {beer_name}")
32 |     print(f"Venue name: {venue_name}")
33 |     print(f"lat: {lat}")
34 |     print(f"lon: {lon}\n")
35 |
36 |     # GeoJSON positions are (longitude, latitude) order
37 |     features.append(Feature(geometry=Point((lon, lat)), properties={"beer_name": beer_name, "venue_name": venue_name, "checkin_comment": checkin_comment}))
38 |
39 | feature_collection = FeatureCollection(features)
40 |
41 | with open("/Users/fpolig01/Desktop/untappd_project/untappd_api.geojson", "w") as geojson_file:
42 | dump(feature_collection, geojson_file)
--------------------------------------------------------------------------------
/ffmpeg_video_converter.py:
--------------------------------------------------------------------------------
1 | import subprocess
2 | import os
3 | from pathlib import Path
4 | import time
5 | import logging
6 | import sys
7 |
8 | master_start_time = time.time()
9 |
10 | root_video_directory = r"E:\Media\Videos"
11 | root_output_directory = r"D:\Videos"
12 | ffmpeg_location = r"C:\Users\franp\Downloads\ffmpeg-2021-03-28-git-8b2bde0494-full_build\bin\ffmpeg.exe"
13 |
14 | # noinspection PyArgumentList
15 | logging.basicConfig(level=logging.INFO, format="%(asctime)s [%(levelname)s] %(message)s",
16 | handlers=[
17 | logging.FileHandler(r"D:\export_log.log"),
18 | logging.StreamHandler(sys.stdout)
19 | ]
20 | )
21 |
22 | for subdir, dirs, files in os.walk(root_video_directory):
23 | for file in files:
24 | if file.endswith(".avi"):
25 | avi_file = os.path.join(subdir, file)
26 | logging.info(f"Detected .avi file: {avi_file}")
27 | logging.info(f"Creating directory Structure: {os.path.dirname(avi_file).replace('E', 'D', 1)}")
28 |
29 |             Path(os.path.dirname(avi_file).replace('E', 'D', 1)).mkdir(parents=True, exist_ok=True)
30 |             output_file = avi_file.replace('E', 'D', 1).replace(".avi", ".mp4")
31 | ## h.265 did not work with plex direct stream
32 | #ffmpeg_command = [ffmpeg_location, '-i', f'{avi_file}', "-c:v", "libx265", "-crf", "26", output_file]
33 | #ffmpeg_command = [ffmpeg_location, '-i', f'{avi_file}', output_file]
34 | #ffmpeg_command = [ffmpeg_location, '-i', f'{avi_file}', "-c:v", "libx264", "-crf", "20", output_file]
35 | handbrake_command = [r"C:\Users\franp\Downloads\HandBrakeCLI-1.3.3-win-x86_64\HandBrakeCLI.exe", '-i', f'{avi_file}',"-o", output_file, "-e", "x264", "-q", "20", "-B", "160"]
36 |
37 |
38 | logging.info(f"Converting: {avi_file} to .MP4 with x264. Output MP4: {output_file}")
39 | start_time = time.time()
40 |
41 | process = subprocess.Popen(handbrake_command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, universal_newlines=True)
42 | for line in process.stdout:
43 | print(line)
44 |             process.wait()  # block until HandBrake exits before timing/logging
45 | logging.info("Done")
46 |             logging.info('Conversion took {:.1f} seconds.\n'.format(time.time() - start_time))
47 |
48 | logging.info("Done")
49 | logging.info('Program took {} seconds to complete.'.format(time.time() - master_start_time))
--------------------------------------------------------------------------------
/handbrake_converter.py:
--------------------------------------------------------------------------------
1 | import subprocess
2 | import os
3 | import time
4 | import logging
5 | import sys
6 | from shutil import copyfile
7 | from pathlib import Path
8 |
9 | master_start_time = time.time()
10 |
11 | handbrake_cli_exe = r"C:\Users\franp\Downloads\HandBrakeCLI-1.3.3-win-x86_64\HandBrakeCLI.exe"
12 | root_video_directory = Path(r"path to directory containing input videos")
13 | root_video_output = Path(r"path to output directory")
14 |
15 | # noinspection PyArgumentList
16 | logging.basicConfig(level=logging.INFO, format="%(asctime)s [%(levelname)s] %(message)s",
17 | handlers=[
18 | logging.FileHandler(r"D:\export_log.log"),
19 | logging.StreamHandler(sys.stdout)
20 | ]
21 | )
22 |
23 | for path, directories, files in os.walk(root_video_directory):
24 |
25 | new_folder = root_video_output.joinpath(*Path(path).parts[1:])
26 | logging.info(f"Creating folder for {new_folder}")
27 | Path(new_folder).mkdir(parents=True, exist_ok=True)
28 |
29 | for file in files:
30 | if file.endswith(".avi"):
31 | avi_file = os.path.join(path, file)
32 | logging.info(f"Detected .avi file: {avi_file}")
33 | output_file = str(root_video_output.joinpath(*Path(avi_file).parts[1:])).replace(".avi", ".mp4")
34 |
35 | if Path(output_file).exists():
36 |                     logging.info(f"Detected already-converted .mp4: {output_file}; skipping")
37 | continue
38 | else:
39 |
40 | handbrake_command = [handbrake_cli_exe, '-i', f'{avi_file}',"-o", output_file, "-e", "x264", "-q", "20", "-B", "160"]
41 | logging.info(f"Converting: {avi_file} to .MP4 with x264. Output MP4: {output_file}")
42 | start_time = time.time()
43 |
44 | process = subprocess.Popen(handbrake_command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, universal_newlines=True)
45 | for line in process.stdout:
46 | print(line)
47 |                 process.wait()  # block until HandBrake exits before timing/logging
48 | logging.info("Done")
49 |                 logging.info('Conversion took {:.1f} seconds.\n'.format(time.time() - start_time))
50 |
51 | else:
52 | non_avi_file = os.path.join(path, file)
53 | output_file = str(root_video_output.joinpath(*Path(non_avi_file).parts[1:]))
54 |
55 | logging.info(f"Detected Non .avi file. Copying {non_avi_file} to {output_file}")
56 | copyfile(non_avi_file, output_file)
57 |
58 | logging.info("Done")
59 | logging.info('Program took {} seconds to complete.'.format(time.time() - master_start_time))
--------------------------------------------------------------------------------
/handbrake_script_vid1.py:
--------------------------------------------------------------------------------
1 | import subprocess
2 |
3 | handbrake_command = [r"D:\handbrake_cli\HandBrakeCLI.exe", "-i", r"D:\handbrake_cli\2020-09-13 09.12.17 gokart.avi", "-o", r"D:\handbrake_cli\2020-09-13 09.12.17 gokart_from_python.mp4", "-e", "x264", "-q", "20", "-B", "160"]
4 |
5 | subprocess.run(handbrake_command)  # pass the argument list directly; shell=True is unnecessary here
--------------------------------------------------------------------------------
/handbrake_script_vid2.py:
--------------------------------------------------------------------------------
1 | import subprocess
2 | import os
3 |
4 | input_directory = r"D:\handbrake_cli"
5 | directory_list = os.listdir(input_directory)
6 |
7 | for file in directory_list:
8 | full_path = os.path.join(input_directory, file)
9 | if full_path.endswith(".avi"):
10 | print("Converting {} to .mp4".format(full_path))
11 |
12 |         handbrake_command = [r"D:\handbrake_cli\HandBrakeCLI.exe", "-i", full_path, "-o", full_path.replace(".avi", ".mp4"), "-e", "x264", "-q", "20", "-B", "160"]
13 | #subprocess.run(handbrake_command, shell=True)
14 |
15 | process = subprocess.Popen(handbrake_command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, universal_newlines=True)
16 | for line in process.stdout:
17 | print(line)
18 |         process.wait()  # ensure the conversion finished before reporting
19 | print("Finished converting")
20 |
21 | print("Done")
--------------------------------------------------------------------------------