├── Lab 1 ├── LAB_1_Part1.ipynb ├── LAB_1_Part2.ipynb ├── LAB_1_Part3.ipynb ├── Lab 1.docx └── sample_surface.txt ├── Lab 10 ├── Lab 10.docx ├── Lab_10_PT_1.ipynb └── Lab_10_PT_2.ipynb ├── Lab 11 ├── Lab 11.docx └── Lab_11.ipynb ├── Lab 12 ├── Lab 12.docx ├── Lab_12_PT_1.ipynb └── Lab_12_PT_2.ipynb ├── Lab 2 ├── Lab 2.docx └── Lab_2.ipynb ├── Lab 3 ├── 20220914_00.txt ├── Lab 3.docx └── Lab_3.ipynb ├── Lab 4 ├── Lab 4.docx ├── Lab_4_PT_1.ipynb └── Lab_4_PT_2.ipynb ├── Lab 5 ├── Lab 5.docx ├── Lab_5_PT_1.ipynb └── Lab_5_PT_2.ipynb ├── Lab 6 ├── Lab 6.docx ├── Lab_6_PT_1.ipynb └── Lab_6_PT_2.ipynb ├── Lab 7 ├── Lab 7.docx ├── Lab_7_PT_1.ipynb └── Lab_7_PT_2.ipynb ├── Lab 8 ├── Lab 8.docx ├── Lab_8_PT_1.ipynb ├── Lab_8_PT_2.ipynb └── Lab_8_PT_3.ipynb ├── Lab 9 ├── Lab 9.docx └── Lab_9.ipynb ├── Modules ├── iowa_metar.py ├── min_max.py └── sample_metar.txt ├── README.md └── Scripts ├── Download_GFS.ipynb └── Download_Sat.ipynb /Lab 1/LAB_1_Part1.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "700343c7-a16d-4683-83c4-c6b073dbb9ee", 6 | "metadata": { 7 | "tags": [] 8 | }, 9 | "source": [ 10 | "## Lab 1 Part I: METARs\n", 11 | "#### 9/7/2022\n", 12 | "\n", 13 | "\n", 14 | "The following tutorial is the first of three parts of the Python portion of Lab 1. In this part, we will focus on how to work with METAR data in python using the MetPy and Pandas modules. As with every future lab, I will include a link to the documentation of each module that we introduce for the first time.\n", 15 | "
\n", 16 | "### Module Documentation\n", 17 | "1. MetPy Metar Parsing Function: https://unidata.github.io/MetPy/latest/api/generated/metpy.io.parse_metar_file.html\n", 18 | "2. Pandas DataFrame: https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.html\n", 19 | "3. The datetime function from the datetime module: https://docs.python.org/3/library/datetime.html\n", 20 | "\n", 21 | "\n", 22 | "

\n", 23 | "\n", 24 | "If you have any questions about the code below, always feel free to reach out to me at mpvossen@uwm.edu. I am always willing to further explain the code.

\n", 25 | "\n", 26 | "\n", 27 | "1. In most things we do in atmospheric science, we can save ourselves time by importing code (called modules) that someone else has written. In the section below, I load the Python modules we are going to need to complete this portion of the lab." 28 | ] 29 | }, 30 | { 31 | "cell_type": "code", 32 | "execution_count": null, 33 | "id": "0a23b168-628c-492a-9acb-67a4dcaffc64", 34 | "metadata": {}, 35 | "outputs": [], 36 | "source": [ 37 | "#from the data reading capabilities of metpy (metpy.io) import the metar reading capability (parse_metar_file)\n", 38 | "from metpy.io import parse_metar_file\n", 39 | "\n", 40 | "#import the data storage for the metar data. This package lays the data out in a table like format\n", 41 | "import pandas as pd\n", 42 | "\n", 43 | "#from the dates and time code(datetime), import the date and time reading capabilities (datetime).\n", 44 | "from datetime import datetime\n", 45 | "\n", 46 | "#from python's data import module (io) import the ability to read a string as a file. This allows us to avoid downloading files which speeds things up and keeps your files storage clean.\n", 47 | "from io import StringIO\n", 48 | "\n", 49 | "#import the module to download files from the internet\n", 50 | "import requests" 51 | ] 52 | }, 53 | { 54 | "cell_type": "markdown", 55 | "id": "0a28ab89-a468-4b0b-ba0e-37183058f488", 56 | "metadata": {}, 57 | "source": [ 58 | "

\n", 59 | "2. In the next part of the lab, we need to convert surface observation data between units since the data are not always in the desired units. Below is a sample function for how to convert wind from knots to mph.
" 60 | ] 61 | }, 62 | { 63 | "cell_type": "code", 64 | "execution_count": null, 65 | "id": "8dd40275-da7f-4e5a-89bc-c78ea0685a23", 66 | "metadata": {}, 67 | "outputs": [], 68 | "source": [ 69 | "#here the function is defined. The def command says to define the function of the name convert_knots_to_mph with the input variable of the name value\n", 70 | "def convert_knots_to_mph(value):\n", 71 | " \n", 72 | " #this line causes the function to return a value. Here I'm returning the input variable divided by 0.868976, the conversion factor between kt and MPH. We alternatively could multiply by 1.15 to get the same result.\n", 73 | " #note the indentation of this line - Python uses indentation to organize and execute its code.\n", 74 | " return value / 0.868976\n", 75 | " " 76 | ] 77 | }, 78 | { 79 | "cell_type": "markdown", 80 | "id": "83f30f58-9b85-4c41-8ce2-573bd9c8dfc7", 81 | "metadata": {}, 82 | "source": [ 83 | "

In the section below, create a function to convert a temperature value from °C to °F. Name the function convert_c_to_f.
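If you would like a pattern to check your work against, here is a minimal sketch of one possible answer; any algebraically equivalent formula is fine, and it simply mirrors the knots-to-mph function defined above.

```python
#a minimal sketch of one possible convert_c_to_f function; any equivalent formula works
def convert_c_to_f(value):
    #multiply by 9/5 and then add 32 to convert degrees Celsius to degrees Fahrenheit
    return (value * 9 / 5) + 32
```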
" 84 | ] 85 | }, 86 | { 87 | "cell_type": "code", 88 | "execution_count": null, 89 | "id": "0e3ee8fd-dc70-494c-b4e8-8f13915dc832", 90 | "metadata": {}, 91 | "outputs": [], 92 | "source": [] 93 | }, 94 | { 95 | "cell_type": "markdown", 96 | "id": "81ab8807-1996-4964-9d5f-113e3915bb88", 97 | "metadata": {}, 98 | "source": [ 99 | "

\n", 100 | "\n", 101 | "3. Let's acquire some METAR data. First, we will need to download the METAR data from the Unidata THREDDS server. This code relies on the modules imported earlier in this notebook. When you run this code, it may take up to 10 seconds to run. Note: If you want to get your own surface data, you would go to https://thredds-test.unidata.ucar.edu/thredds/catalog/noaaport/text/metar/catalog.html.

\n", 102 | "\n" 103 | ] 104 | }, 105 | { 106 | "cell_type": "code", 107 | "execution_count": null, 108 | "id": "d600f675-92d2-45e9-baaa-259ff1902d23", 109 | "metadata": {}, 110 | "outputs": [], 111 | "source": [ 112 | "#Here I set the create the variable that holds the time in UTC for which we want the METAR data.\n", 113 | "#datetime(year, month, day, hour)\n", 114 | "file_time = datetime(2022,9,7,18)\n", 115 | "\n", 116 | "#Here I build the url to get the data. Most of the url stays the same but the metar file name changes based on the date and hour that it is valid for.\n", 117 | "#right before the quote I have the letter f, which means the values in the quotes after are what is called a formatted string.\n", 118 | "#The formatted string allows us to use the file time that we defined before to generate the string. Below I use this file time when I have\n", 119 | "#{file_time:%Y%m%d_%H}. This stands for {time variable:datetime format}. For the datetime format there are special codes that \n", 120 | "#tell the string which value to insert. To see what each code stands for, refer to the datetime module's documentation.\n", 121 | "url = f\"https://thredds-test.unidata.ucar.edu/thredds/fileServer/noaaport/text/metar/metar_{file_time:%Y%m%d_%H}00.txt\"\n", 122 | "\n", 123 | "#using the request module use the get function to retrieve the raw website data from the url we defined above\n", 124 | "web_data = requests.get(url)\n", 125 | "\n", 126 | "#here we take the web data from before and pull of the content (.content) from the web_data object. \n", 127 | "#Then we take the content and decode it to something that we can use rather than the website html (.decode())\n", 128 | "web_content = web_data.content.decode()\n", 129 | "\n", 130 | "#Here we take the decoded web content from above and make it a file object. This avoids downloading the data to your file system which\n", 131 | "#speeds things up, and it keeps your file space from becoming cluttered. Instead, this puts the METAR file directly to the RAM of the machine\n", 132 | "data_file = StringIO(web_content)\n", 133 | "\n", 134 | "\n", 135 | "#we now tell metpy to parse out the file we have downloaded (data_file). Also Metpy only can get the day of the month\n", 136 | "#from the METAR, so we need to specify the month (file_time.month) and year (file_time.year) from the file time that we set before\n", 137 | "#or else it will assume the current month and year. \n", 138 | "metar_data = parse_metar_file(data_file, month = file_time.month, year=file_time.year)\n", 139 | "\n", 140 | "#below you can see that the data is parse out and now is in a form that is similar to a table. This is called a data frame.\n", 141 | "#also in Jupyter you can display one variable by typing out the variable name like I did below. (Note: this does not work outside Jupyter)\n", 142 | "#if you need to display multiple variables in a cell, you will need to use the print statement instead\n", 143 | "metar_data" 144 | ] 145 | }, 146 | { 147 | "cell_type": "markdown", 148 | "id": "6d0ab42a-3567-4ea3-9871-e30eb1f35312", 149 | "metadata": {}, 150 | "source": [ 151 | "\n", 152 | "

\n", 153 | "3. We now have parsed our data parsed. The data is stored in something that is called a Pandas DataFrame, which you can visualize to be just like a table of data that you would see in a textbook or an Excel spreadsheet. There are column names and row names for the table that we can use to access various parts of the data. \n", 154 | "\n", 155 | "With the way that MetPy prases the METAR data, the row names are the station names and the column names are the observation quantity names. This structure is useful because sometimes when working with METAR data we need to get an observation for a single location. In the code below, I use Pandas' syntax to get O'Hare Airport's observations in our sample file. Multiple times may appear since O'Hare may make multiple observations during the requested hour.
\n" 156 | ] 157 | }, 158 | { 159 | "cell_type": "code", 160 | "execution_count": null, 161 | "id": "82250fb0-23bb-44c4-96bf-ae8052b7d176", 162 | "metadata": {}, 163 | "outputs": [], 164 | "source": [ 165 | "#set the site variable to a string of O'Hare's 4 letter identifier\n", 166 | "site = \"KORD\"\n", 167 | "\n", 168 | "#from the metar data frame (the metar_data variable) slice out the row (.loc[]) that has the index that is for the site we want (site) and save it to the variable station.\n", 169 | "station = metar_data.loc[site]\n", 170 | "\n", 171 | "#display the sliced data for O'Hare. The data may look different, but it is still setup the same as the cell above.\n", 172 | "station" 173 | ] 174 | }, 175 | { 176 | "cell_type": "markdown", 177 | "id": "ffd6a290-2d19-42b1-a4f1-f782ff096644", 178 | "metadata": {}, 179 | "source": [ 180 | "

\n", 181 | "4. We can also parse out specific variables by using the syntax below.
" 182 | ] 183 | }, 184 | { 185 | "cell_type": "code", 186 | "execution_count": null, 187 | "id": "9f5f1542-f3a6-4755-a0d5-39dcff13da44", 188 | "metadata": {}, 189 | "outputs": [], 190 | "source": [ 191 | "#from the data that only contains the metar for KORD (station) slice out the column named \"windspeed\" and save it to the variable station_wind. \n", 192 | "#For columns we can just do the brackets and we don't need a function like the .loc() function that we needed before for the row.\n", 193 | "station_wind = station[\"wind_speed\"]\n", 194 | "\n", 195 | "#display the variable that we saved the wind speed data from KORD to.\n", 196 | "station_wind" 197 | ] 198 | }, 199 | { 200 | "cell_type": "markdown", 201 | "id": "da005306-7b70-4769-943b-9d6d27fda726", 202 | "metadata": {}, 203 | "source": [ 204 | "

\n", 205 | "5. The parsed observation is in the standard METAR wind speed units of kt. We can convert these units by using the conversion function we created before:\n", 206 | "
" 207 | ] 208 | }, 209 | { 210 | "cell_type": "code", 211 | "execution_count": null, 212 | "id": "b8ad0fb3-559d-4287-9f41-5a4568c40b11", 213 | "metadata": {}, 214 | "outputs": [], 215 | "source": [ 216 | "#using the convert_knots_to_mph function that I defined before to convert the wind speed for KORD (station_wind) from knots to mph and save the output from the function to the variable station_wind_mph.\n", 217 | "station_wind_mph = convert_knots_to_mph(station_wind)\n", 218 | "#display the station wind speed that resulted from the function above\n", 219 | "station_wind_mph" 220 | ] 221 | }, 222 | { 223 | "cell_type": "markdown", 224 | "id": "e5dbcbf6-1c2a-4c40-83e1-087388da2578", 225 | "metadata": {}, 226 | "source": [ 227 | "

\n", 228 | "6. In the code section below, parse out the temperature (air_temperature), dewpoint (dew_point_temperature), pressure (air_pressure_at_sea_level), wind speed (wind_speed), wind direction (wind_direction), and cloud coverage (cloud_coverage) for Madison (KMSN). Display the output so you can use it to answer question 6 in the lab. Be sure to convert temperature and dewpoint to the appropriate units.\n", 229 | "

" 230 | ] 231 | }, 232 | { 233 | "cell_type": "code", 234 | "execution_count": null, 235 | "id": "2cd2486e-b83e-4732-b84c-c606807f3e33", 236 | "metadata": {}, 237 | "outputs": [], 238 | "source": [] 239 | }, 240 | { 241 | "cell_type": "markdown", 242 | "id": "4638d98c-8893-4f93-ae75-3e5b6379ee22", 243 | "metadata": {}, 244 | "source": [ 245 | "

\n", 246 | "\n", 247 | "You have now completed Part I of the Python portion of Lab 1. Be sure to submit the fully rendered Jupyter Notebook on GitHub when you are finished." 248 | ] 249 | } 250 | ], 251 | "metadata": { 252 | "kernelspec": { 253 | "display_name": "Python 3 (ipykernel)", 254 | "language": "python", 255 | "name": "python3" 256 | }, 257 | "language_info": { 258 | "codemirror_mode": { 259 | "name": "ipython", 260 | "version": 3 261 | }, 262 | "file_extension": ".py", 263 | "mimetype": "text/x-python", 264 | "name": "python", 265 | "nbconvert_exporter": "python", 266 | "pygments_lexer": "ipython3", 267 | "version": "3.7.12" 268 | } 269 | }, 270 | "nbformat": 4, 271 | "nbformat_minor": 5 272 | } 273 | -------------------------------------------------------------------------------- /Lab 1/LAB_1_Part2.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "700343c7-a16d-4683-83c4-c6b073dbb9ee", 6 | "metadata": { 7 | "tags": [] 8 | }, 9 | "source": [ 10 | "## Lab 1 Part II: Accessing Rawinsonde Data\n", 11 | "#### 9/7/2022\n", 12 | "\n", 13 | "\n", 14 | "The following tutorial is the second of three parts of the Python portion of Lab 1. In this part, we will focus on how to work with rawinsonde data in Python using the Siphon and Pandas modules.\n", 15 | "
\n", 16 | "\n", 17 | "### Module Documentation\n", 18 | "1. Pandas DataFrame: https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.html\n", 19 | "2. The datetime function from the datetime module: https://docs.python.org/3/library/datetime.html\n", 20 | "3. Wyoming Upper Air Data via Siphon: https://unidata.github.io/siphon/latest/api/simplewebservice.html#module-siphon.simplewebservice.wyoming\n", 21 | "\n", 22 | "\n", 23 | "

\n", 24 | "\n", 25 | "If you have any questions about the code below, always feel free to reach out to me at mpvossen@uwm.edu. I am always willing to further explain the code.

\n", 26 | "\n", 27 | "\n", 28 | "1. Before we start downloading rawinsonde data, we first need to import the python modules that we need:\n", 29 | "
" 30 | ] 31 | }, 32 | { 33 | "cell_type": "code", 34 | "execution_count": null, 35 | "id": "190f19af-d025-4d20-ab09-a790a027678a", 36 | "metadata": {}, 37 | "outputs": [], 38 | "source": [ 39 | "#from python's date and time module (from datetime) import the ability to work with date and times (import datetime)\n", 40 | "from datetime import datetime\n", 41 | "\n", 42 | "#using the siphon module and its ability to remotely retrieve files (.simplewebservice) specifically from the University of Wyoming (.wyoming), \n", 43 | "#import the ability to download the University of Wyoming's upper-air data.\n", 44 | "from siphon.simplewebservice.wyoming import WyomingUpperAir" 45 | ] 46 | }, 47 | { 48 | "cell_type": "markdown", 49 | "id": "fbddbb06-6b98-493b-851e-b3b631638ccb", 50 | "metadata": {}, 51 | "source": [ 52 | "

\n", 53 | "2. Now with the packages we need, lets choose what rawinsonde observation we want. Below I set the date, time, and location of the rawinsonde observation that I want. For this assignment, we want Green Bay's (GRB) 1200 UTC upper-air observation from 9/7/2022." 54 | ] 55 | }, 56 | { 57 | "cell_type": "code", 58 | "execution_count": null, 59 | "id": "bd2e7f28-4b72-47ec-8377-b69045246997", 60 | "metadata": {}, 61 | "outputs": [], 62 | "source": [ 63 | "#Here I set the create the variable that holds the time in UTC for which we want the rawinsonde data.\n", 64 | "#datetime(year, month, day, hour)\n", 65 | "sounding_date = datetime(2022,9,7,12)\n", 66 | "\n", 67 | "#We want to set a variable to the identifier for the desired rawinsonde station.\n", 68 | "station = \"GRB\"" 69 | ] 70 | }, 71 | { 72 | "cell_type": "markdown", 73 | "id": "fd5f8e94-c1d3-45a8-a334-182fb0e38c7c", 74 | "metadata": {}, 75 | "source": [ 76 | "

\n", 77 | "3. With our location and time set, we are ready to download the rawinsonde file using Siphon." 78 | ] 79 | }, 80 | { 81 | "cell_type": "code", 82 | "execution_count": null, 83 | "id": "ed496ffe-1d58-46b1-9b50-41b656699951", 84 | "metadata": {}, 85 | "outputs": [], 86 | "source": [ 87 | "#using the Wyoming upper air rawinsonde downloader we imported above (WyomingUpperAir), get the data (.request_data) for the desired location and time. \n", 88 | "#also with the .set_index(\"pressure\") I set the index of the data frame to be pressure so we can use data.loc[pressure] to get the data at a specified pressure.\n", 89 | "upper_air_data = WyomingUpperAir.request_data(sounding_date, station).set_index(\"pressure\")\n", 90 | "\n", 91 | "#display the rawinsonde observation. Once again the data is in a Pandas DataFrame.\n", 92 | "upper_air_data" 93 | ] 94 | }, 95 | { 96 | "cell_type": "markdown", 97 | "id": "5506f77f-f4e2-45a4-b5e4-7a5cb12712f0", 98 | "metadata": {}, 99 | "source": [ 100 | "

\n", 101 | "4. Next, let's determine the Pandas DataFrame column names. By looking at the column names, we are also looking at what observation variables are contained in the downloaded rawinsonde data:" 102 | ] 103 | }, 104 | { 105 | "cell_type": "code", 106 | "execution_count": null, 107 | "id": "fb9bcacf-f059-4321-afeb-ee2c3f0ce020", 108 | "metadata": {}, 109 | "outputs": [], 110 | "source": [ 111 | "#display the column names for the rawinsonde data dataframe.\n", 112 | "upper_air_data.columns" 113 | ] 114 | }, 115 | { 116 | "cell_type": "markdown", 117 | "id": "89498bbc-6527-4164-bd53-68ea7c1f7354", 118 | "metadata": {}, 119 | "source": [ 120 | "

\n", 121 | "5. Just like with the METAR data, we are able to parse out specific variables for specific heights using Pandas DataFrame parsing syntax:" 122 | ] 123 | }, 124 | { 125 | "cell_type": "code", 126 | "execution_count": null, 127 | "id": "22f5416d-e2dd-4812-941d-3132a4e08d59", 128 | "metadata": {}, 129 | "outputs": [], 130 | "source": [ 131 | "#display the geopotential height at 500 hPa for the rawinsonde observation we downloaded\n", 132 | "#we only need to specify a location of 500 because we previously requested the data to vary by pressure\n", 133 | "upper_air_data[\"height\"].loc[500]" 134 | ] 135 | }, 136 | { 137 | "cell_type": "markdown", 138 | "id": "d5cb02d0-3e71-4050-bcf3-ab9102d5351c", 139 | "metadata": {}, 140 | "source": [ 141 | "

\n", 142 | "6. In the code section below, parse out the 1200 UTC 9/7/22 Green Bay, WI (KGRB) 300 hPa temperature, dewpoint, wind speed, wind direction, and geopotential height. Display all of the variables so you can use them to answer question 8 in Lab 1." 143 | ] 144 | }, 145 | { 146 | "cell_type": "code", 147 | "execution_count": null, 148 | "id": "9adf54a3-2fe3-4aef-b630-5fd418dcac8c", 149 | "metadata": {}, 150 | "outputs": [], 151 | "source": [] 152 | }, 153 | { 154 | "cell_type": "markdown", 155 | "id": "4638d98c-8893-4f93-ae75-3e5b6379ee22", 156 | "metadata": {}, 157 | "source": [ 158 | "

\n", 159 | "\n", 160 | "You have now completed Part II of the Python portion of Lab 1. Be sure to submit the fully rendered Jupyter Notebook on GitHub when you are finished." 161 | ] 162 | } 163 | ], 164 | "metadata": { 165 | "kernelspec": { 166 | "display_name": "Python 3 (ipykernel)", 167 | "language": "python", 168 | "name": "python3" 169 | }, 170 | "language_info": { 171 | "codemirror_mode": { 172 | "name": "ipython", 173 | "version": 3 174 | }, 175 | "file_extension": ".py", 176 | "mimetype": "text/x-python", 177 | "name": "python", 178 | "nbconvert_exporter": "python", 179 | "pygments_lexer": "ipython3", 180 | "version": "3.7.12" 181 | } 182 | }, 183 | "nbformat": 4, 184 | "nbformat_minor": 5 185 | } 186 | -------------------------------------------------------------------------------- /Lab 1/Lab 1.docx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/clarkevanswx/SynopticMet1/a6032312c07cb8a42daaea6bbb0f574d0d5c7b85/Lab 1/Lab 1.docx -------------------------------------------------------------------------------- /Lab 10/Lab 10.docx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/clarkevanswx/SynopticMet1/a6032312c07cb8a42daaea6bbb0f574d0d5c7b85/Lab 10/Lab 10.docx -------------------------------------------------------------------------------- /Lab 10/Lab_10_PT_1.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "440968c6-adc0-4451-b073-fdebb9bfa43e", 6 | "metadata": {}, 7 | "source": [ 8 | "## Lab 10 Part I: Plotting Kinematic Fields\n", 9 | "

\n", 10 | "\n", 11 | "In this lab, we create plots of the 300 hPa full (or total) wind and 500 hPa absolute vorticity to help us analyze the kinematic attributes of the horizontal wind field. On the Python end of things, we continue to work to create cleaner code while also working to become more independent in creating our own code.\n", 12 | "
\n", 13 | "\n", 14 | "### Module Documentation\n", 15 | "\n", 16 | "1. Xarray Dataset: https://docs.xarray.dev/en/stable/generated/xarray.Dataset.html\n", 17 | "2. Matplotlib Pyplot: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.html\n", 18 | "3. Caropy crs: https://scitools.org.uk/cartopy/docs/latest/reference/crs.html\n", 19 | "4. Cartopy Feature: https://scitools.org.uk/cartopy/docs/latest/matplotlib/feature_interface.html\n", 20 | "5. Matplotlib Colors: https://matplotlib.org/stable/gallery/color/named_colors.html\n", 21 | "6. Matplotlib Contour: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.contour.html\n", 22 | "7. Matplotlib Barbs: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.barbs.html\n", 23 | "8. Matplotlib Color Maps: https://matplotlib.org/stable/tutorials/colors/colormaps.html\n", 24 | "9. Matplotlib Contourf: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.contourf.html\n", 25 | "10. Scipy Gaussian Filter: https://docs.scipy.org/doc/scipy/reference/generated/scipy.ndimage.gaussian_filter.html\n", 26 | "\n", 27 | "\n", 28 | "\n", 29 | "If you have any questions about the code below, feel free to reach out to me at mpvossen@uwm.edu. I am always willing to further explain the code.

\n", 30 | "\n", 31 | "---\n", 32 | "\n", 33 | "
\n", 34 | "1. As usual, we start by importing the modules we need for our Python code." 35 | ] 36 | }, 37 | { 38 | "cell_type": "code", 39 | "execution_count": null, 40 | "id": "8be8eb37-15e7-48b8-9b3c-ac6aff02fa4e", 41 | "metadata": {}, 42 | "outputs": [], 43 | "source": [ 44 | "#from the dates and time code (datetime), import the date and time reading capabilities (datetime)\n", 45 | "from datetime import datetime\n", 46 | "\n", 47 | "#import the module numpy and save it to np\n", 48 | "import numpy as np\n", 49 | "\n", 50 | "#import the cartopy (cartopy) module's coordinate reference system (.crs) and save it to the variable crs\n", 51 | "import cartopy.crs as crs\n", 52 | "\n", 53 | "#import the cartopy (cartopy) module's ability to plot geographic data (.feature) and save it to the variable cfeature \n", 54 | "import cartopy.feature as cfeature\n", 55 | "\n", 56 | "#import the pyplot submodule from the matplotlib module\n", 57 | "import matplotlib.pyplot as plt\n", 58 | "\n", 59 | "#from the scipy module's ndimage submodule, import the function gaussian_filter\n", 60 | "from scipy.ndimage import gaussian_filter\n", 61 | "\n", 62 | "#import the module xarray and save it to xr\n", 63 | "import xarray as xr\n", 64 | "\n", 65 | "#from the metpy submodule units, import the units function\n", 66 | "from metpy.units import units" 67 | ] 68 | }, 69 | { 70 | "cell_type": "markdown", 71 | "id": "e6b9b4f2-3b2c-4da6-b505-497cfb3bf152", 72 | "metadata": {}, 73 | "source": [ 74 | "

\n", 75 | "2. As with previous labs, we start by creating a function to process our data. In the cell below, create a function that opens the dataset for the time passed to the function, selects data at the level we pass into the function, converts winds from m/s to kt, calculates the wind speed, and limits the extent of the data to 65°N-20°N and 145°W-45°W. Finally, have the function return the dataset we have created." 76 | ] 77 | }, 78 | { 79 | "cell_type": "code", 80 | "execution_count": null, 81 | "id": "af5f9294-b8bc-4d99-9fa8-5b77e6339a10", 82 | "metadata": {}, 83 | "outputs": [], 84 | "source": [ 85 | "\"\"\"\n", 86 | "Below, a function is defined to retrieve and process GFS analysis upper-air data. \n", 87 | "This function opens the GFS analysis data, retains only the data on the desired\n", 88 | "isobaric surface, converts the wind's units, calculates the ageostrophic wind and\n", 89 | "wind speed, and adds the calculated wind variables to the dataset.\n", 90 | "\n", 91 | "INPUT:\n", 92 | " level : INTEGER\n", 93 | " The level in hPa at which you want upper-air data.\n", 94 | " time : DATETIME\n", 95 | " The time at which you want upper-air data.\n", 96 | " \n", 97 | "OUTPUT:\n", 98 | " leveled_data : XARRAY DATASET\n", 99 | " The xarray containing your processed GFS analysis data.\n", 100 | "\"\"\"\n", 101 | "\n", 102 | "\n", 103 | "def process_upper_air_data(level, time):\n", 104 | " \"\"\"\n", 105 | " Specify the location of the upper-air data on the JupyterHub.\n", 106 | " \"\"\"\n", 107 | " lab_data_loc = \"/data/AtmSci360/Lab_10/\"\n", 108 | " \n", 109 | " " 110 | ] 111 | }, 112 | { 113 | "cell_type": "markdown", 114 | "id": "637f43e7-2d2f-4b14-97d1-4e79548f6928", 115 | "metadata": {}, 116 | "source": [ 117 | "

\n", 118 | "3. Using the data processing function you just created, get data at 500 hPa for October 12th, 2022 at 1200 UTC." 119 | ] 120 | }, 121 | { 122 | "cell_type": "code", 123 | "execution_count": null, 124 | "id": "198da684-d265-4250-8b1c-008cd95b8fdf", 125 | "metadata": {}, 126 | "outputs": [], 127 | "source": [] 128 | }, 129 | { 130 | "cell_type": "markdown", 131 | "id": "1bfe3055-68e4-4316-b9c6-763dad13217b", 132 | "metadata": {}, 133 | "source": [ 134 | "

\n", 135 | "4. We are ready to start creating our plots. There are some features that the plots to be created will have in common, so let's create a function to initialize our plots and eliminate redundant code. In the initialization function, we need to:\n", 136 | "\n", 144 | "\n", 145 | "At the function's end, we need to return the axes so that the resulting map can be passed into subsequent data plotting arguments." 146 | ] 147 | }, 148 | { 149 | "cell_type": "code", 150 | "execution_count": null, 151 | "id": "445af293-1eef-415b-a8cc-9295d7447d4a", 152 | "metadata": {}, 153 | "outputs": [], 154 | "source": [ 155 | "\"\"\"\n", 156 | "The function below is used to initialize a plot over the United States. This function creates the plot for a map,\n", 157 | "defines its projection, sets its extent, adds the desired geographic data, and plots isohypses (after smoothing the\n", 158 | "data using a Gaussian filter with an appropriate sigma value) in contours.\n", 159 | "\n", 160 | "INPUT:\n", 161 | " data : XARRAY DATASET\n", 162 | " The GFS data to use to plot isohypses.\n", 163 | "\n", 164 | "OUTPUT:\n", 165 | " ax : MATPLOTLIB AXES\n", 166 | " The initialized plot.\n", 167 | "\"\"\"\n", 168 | "\n", 169 | "def initialize_plot(data):\n", 170 | " " 171 | ] 172 | }, 173 | { 174 | "cell_type": "markdown", 175 | "id": "93adef74-298d-49e8-8169-319737c2e973", 176 | "metadata": {}, 177 | "source": [ 178 | "

\n", 179 | "5. We will first create a plot of absolute vorticity at 500 hPa. Follow along in the comments for areas you need to fill out." 180 | ] 181 | }, 182 | { 183 | "cell_type": "code", 184 | "execution_count": null, 185 | "id": "a248319d-0945-452f-a771-ead1b4718be0", 186 | "metadata": {}, 187 | "outputs": [], 188 | "source": [ 189 | "\"\"\"\n", 190 | "Below, a function is defined to plot absolute vorticity data. This function adds color-filled \n", 191 | "absoluted vorticity contours and wind barbs for the full wind.\n", 192 | "\n", 193 | "INPUT:\n", 194 | " time : DATETIME\n", 195 | " The date and time for which the plot is valid.\n", 196 | " level : INTEGER\n", 197 | " The level at which the plot is valid.\n", 198 | " data : XARRAY DATASET\n", 199 | " The GFS analysis data. \n", 200 | "\"\"\"\n", 201 | "\n", 202 | "def plot_vort(time, level, data):\n", 203 | " \"\"\"\n", 204 | " Initialize the plot using the function you created and save the axes to ax.\n", 205 | " \"\"\"\n", 206 | " \n", 207 | " \n", 208 | "\n", 209 | " \n", 210 | " \"\"\"\n", 211 | " Plot the 500 hPa absolute vorticity. Have the dataset display the variables \n", 212 | " contained within the dataset and find the variable for absolute vorticity. \n", 213 | " Next, apply a Gaussian filter with a sigma value of 2 to these data.\n", 214 | " \"\"\"\n", 215 | " \n", 216 | " \n", 217 | " \n", 218 | " \"\"\"\n", 219 | " Since absolute vorticity has very small values, multiply the values by 10**5 to get the values to something easier \n", 220 | " to comprehend and plot.\n", 221 | " \"\"\"\n", 222 | " \n", 223 | " \n", 224 | " \"\"\"\n", 225 | " Create color-filled contours for the absolute vorticity. Be sure to pick an appropriate color map and style your \n", 226 | " contours so they are easy to read.\n", 227 | " \"\"\"\n", 228 | " \n", 229 | " \n", 230 | " \"\"\"\n", 231 | " Next, we need to add a color bar to provide a reference to the shaded colors. In the lines below, create a\n", 232 | " color bar that is appropriately labeled. \n", 233 | " \n", 234 | " Hint: You can find the units if you print the dataset's variable. Also remember we multiplied vorticity by 10 ** 5. \n", 235 | " \"\"\"\n", 236 | " \n", 237 | " \n", 238 | " \n", 239 | " \"\"\"\n", 240 | " Next, we need to add wind barbs with an appropriate spacing.\n", 241 | " Add appropriate styling information to make the plot easier to understand.\n", 242 | " \"\"\"\n", 243 | " \n", 244 | " \n", 245 | " \n", 246 | " \"\"\"\n", 247 | " You can ignore this next section, which labels some points (A-C) that are referenced later in the lab.\n", 248 | " \"\"\" \n", 249 | " if str(data.time.values) == \"2022-10-12T12:00:00.000000000\":\n", 250 | " ax.text(-96,42,\"A\", size=12, weight=\"bold\", transform=crs.PlateCarree())\n", 251 | " ax.text(-96,53,\"B\", size=12, weight=\"bold\", transform=crs.PlateCarree())\n", 252 | " ax.text(-78,51,\"C\", size=12, weight=\"bold\", transform=crs.PlateCarree())\n", 253 | " \n", 254 | " \"\"\"\n", 255 | " Finally, add an appropriate title for the map that shows what is plotted and the time at which the map is valid.\n", 256 | " \n", 257 | " Hint: Remember that we multiplied the absolute vorticity by 10 ** 5, so make sure to include this information with its units.\n", 258 | " \"\"\"\n", 259 | "\n", 260 | " \n", 261 | " " 262 | ] 263 | }, 264 | { 265 | "cell_type": "markdown", 266 | "id": "7e3a9e6a-f3d3-4da7-9184-a62ba5caef93", 267 | "metadata": {}, 268 | "source": [ 269 | "

\n", 270 | "6. Call the function that you just created to plot 500 hPa absolute vorticity, using the data we retrieved earlier for October 12th, 2022 at 1200 UTC." 271 | ] 272 | }, 273 | { 274 | "cell_type": "code", 275 | "execution_count": null, 276 | "id": "38633660-67fb-4556-ae13-83efd1b7a682", 277 | "metadata": {}, 278 | "outputs": [], 279 | "source": [] 280 | }, 281 | { 282 | "cell_type": "markdown", 283 | "id": "1d2f73f6-1edd-4776-a4ea-39ac88e7d959", 284 | "metadata": {}, 285 | "source": [ 286 | "

\n", 287 | "7. We also need to create a map of the full wind at 300 hPa. Follow the instructions in the comments below for places where you need to finish the code." 288 | ] 289 | }, 290 | { 291 | "cell_type": "code", 292 | "execution_count": null, 293 | "id": "f5079fd0-5d7b-4880-b476-1ae231ac7dc7", 294 | "metadata": {}, 295 | "outputs": [], 296 | "source": [ 297 | "\"\"\"\n", 298 | "This function plots upper-air full wind data. This function creates line isopleths for the geopotential height, color-filled isopleths \n", 299 | "for the wind speed, and wind barbs for the full wind.\n", 300 | "\n", 301 | "INPUT:\n", 302 | " model_data : XARRAY DATASET\n", 303 | " The GFS analysis data.\n", 304 | " level : INTEGER\n", 305 | " The level at which the plot is valid.\n", 306 | " data : DATETIME\n", 307 | " The date and time at which the plot is valid.\n", 308 | "\"\"\"\n", 309 | "\n", 310 | "def plot_winds(time, level, data):\n", 311 | " \"\"\"\n", 312 | " Initialize a plot using the function you created.\n", 313 | " \"\"\"\n", 314 | "\n", 315 | " \n", 316 | " \"\"\"\n", 317 | " Next, we plot color-filled contours for wind speed. As with the geopotential height, we must first smooth the data.\n", 318 | " In the line below, use a Gaussian filter to smooth the wind speed data (wind_mag).\n", 319 | " \"\"\"\n", 320 | " \n", 321 | " \n", 322 | " \"\"\"\n", 323 | " Create color-filled contours for the wind speed. Be sure to pick an appropriate color map and style your \n", 324 | " contours so they are easy to read.\n", 325 | " \"\"\"\n", 326 | " \n", 327 | " \n", 328 | " \"\"\"\n", 329 | " Next, we need to add a color bar to provide a reference to the shaded colors. In the lines below, create a\n", 330 | " color bar that is appropriately labeled. \n", 331 | " \n", 332 | " Hint: You can find the units by printing out the dataset's variable. \n", 333 | " \"\"\"\n", 334 | " \n", 335 | " \n", 336 | " \n", 337 | " \"\"\"\n", 338 | " Next, we want to add wind barbs with an appropriate spacing.\n", 339 | " Add appropriate styling information to make the plot easier to understand.\n", 340 | " \"\"\"\n", 341 | " \n", 342 | " \n", 343 | " \n", 344 | " \"\"\"\n", 345 | " You can ignore this next section, which labels some points (A and B) that are referenced later in the lab.\n", 346 | " \"\"\" \n", 347 | " if str(data.time.values) == \"2022-10-12T12:00:00.000000000\":\n", 348 | " ax.text(-100,42.5,\"A\", size=12, weight=\"bold\", transform=crs.PlateCarree())\n", 349 | " ax.text(-86.5,44.5,\"B\", size=12, weight=\"bold\", transform=crs.PlateCarree())\n", 350 | " \n", 351 | " \"\"\"\n", 352 | " Finally, add an appropriate title for the map that shows what is plotted and the time at which the map is valid.\n", 353 | " \"\"\"\n", 354 | " \n", 355 | " \n" 356 | ] 357 | }, 358 | { 359 | "cell_type": "markdown", 360 | "id": "7b2d9e38-ee5b-4181-a344-e1be1ba603f2", 361 | "metadata": {}, 362 | "source": [ 363 | "

\n", 364 | "8. Since the filtered data earlier in the lab are at 500 hPa, we need to first re-run the data processing function to filter the data to 300 hPa on October 12th, 2022 at 1200 UTC. After doing so, run the full wind plotting function to plot the 300 hPa winds on October 12th, 2022 at 1200 UTC." 365 | ] 366 | }, 367 | { 368 | "cell_type": "code", 369 | "execution_count": null, 370 | "id": "215cb42f-b664-4843-aa1e-c423f8b13d98", 371 | "metadata": {}, 372 | "outputs": [], 373 | "source": [] 374 | }, 375 | { 376 | "cell_type": "markdown", 377 | "id": "c594bcfb-86f0-4651-8c98-5c4a354daffb", 378 | "metadata": {}, 379 | "source": [ 380 | "

\n", 381 | "9. To complete the analysis, we also need to plot 500 hPa absolute vorticity and 300 hPa winds for September 23rd, 2022 at 1800 UTC. Re-run the data processing and plotting functions to obtain these maps. Note: as you did above, you will need to re-run the data processing function between creating these plots." 382 | ] 383 | }, 384 | { 385 | "cell_type": "code", 386 | "execution_count": null, 387 | "id": "d7457295-f4ec-47f4-8fa5-cc78206ea26f", 388 | "metadata": {}, 389 | "outputs": [], 390 | "source": [] 391 | }, 392 | { 393 | "cell_type": "markdown", 394 | "id": "1eb7c2a9-0229-4206-a338-a3c595a07ea4", 395 | "metadata": {}, 396 | "source": [ 397 | "### You have now completed Part I of the Python portion of the lab. Be sure to submit the fully rendered Jupyter Notebook on GitHub when you are finished." 398 | ] 399 | } 400 | ], 401 | "metadata": { 402 | "kernelspec": { 403 | "display_name": "Python 3 (ipykernel)", 404 | "language": "python", 405 | "name": "python3" 406 | }, 407 | "language_info": { 408 | "codemirror_mode": { 409 | "name": "ipython", 410 | "version": 3 411 | }, 412 | "file_extension": ".py", 413 | "mimetype": "text/x-python", 414 | "name": "python", 415 | "nbconvert_exporter": "python", 416 | "pygments_lexer": "ipython3", 417 | "version": "3.7.12" 418 | } 419 | }, 420 | "nbformat": 4, 421 | "nbformat_minor": 5 422 | } 423 | -------------------------------------------------------------------------------- /Lab 10/Lab_10_PT_2.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "29c9fd6d-b76d-4f97-80d9-3d7d40990ae7", 6 | "metadata": {}, 7 | "source": [ 8 | "## Lab 10 Part II: Plotting Satellite Data\n", 9 | "

\n", 10 | "In this part of the tutorial, we create visible satellite imagery to compare with our model-derived upper-air plots from Part I.\n", 11 | "
\n", 12 | "### Module Documentation\n", 13 | "1. The datetime function from the datetime module: https://docs.python.org/3/library/datetime.html\n", 14 | "2. Cartopy Feature: https://scitools.org.uk/cartopy/docs/latest/matplotlib/feature_interface.html\n", 15 | "3. Matplotlib: https://matplotlib.org/\n", 16 | "4. Matplotlib Normalize: https://matplotlib.org/stable/api/_as_gen/matplotlib.colors.Normalize.html\n", 17 | "\n", 18 | "\n", 19 | "\n", 20 | "If you have any questions about the code below, feel free to reach out to me at mpvossen@uwm.edu. I am always willing to further explain the code.

\n", 21 | "\n", 22 | "---\n", 23 | "\n", 24 | "
\n", 25 | "1. As usual, we start by importing the modules we need for our Python code." 26 | ] 27 | }, 28 | { 29 | "cell_type": "code", 30 | "execution_count": null, 31 | "id": "82d63adc-bc01-4fe9-acd2-3f80cc8e5bb3", 32 | "metadata": {}, 33 | "outputs": [], 34 | "source": [ 35 | "#import everything from the metpy module\n", 36 | "import metpy\n", 37 | "\n", 38 | "#from the plotting portion of metpy, import the add_timestamp function\n", 39 | "from metpy.plots import add_timestamp\n", 40 | "\n", 41 | "#import the crs part of the cartopy module for geographic parts of the data such as map projections\n", 42 | "import cartopy.crs as ccrs\n", 43 | "\n", 44 | "#import the feature part of the cartopy module for things such as state lines\n", 45 | "import cartopy.feature as cfeature\n", 46 | "\n", 47 | "#import the pyplot part of the matplotlib module\n", 48 | "import matplotlib.pyplot as plt\n", 49 | "\n", 50 | "#import the datetime feature of the datetime module\n", 51 | "from datetime import datetime\n", 52 | "\n", 53 | "#from the matplotlib module, in the colors part, import the Normalize function. This is for adjusting the colors we use to plot the satellite data.\n", 54 | "from matplotlib.colors import Normalize\n", 55 | "\n", 56 | "#import the module xarray and save it to xr\n", 57 | "import xarray as xr\n", 58 | "\n", 59 | "#add this line so that the JupyterHub plots the maps within the Jupyter Notebook\n", 60 | "%matplotlib inline" 61 | ] 62 | }, 63 | { 64 | "cell_type": "markdown", 65 | "id": "448f6ee3-d34d-4f46-acdc-442b19a3daff", 66 | "metadata": {}, 67 | "source": [ 68 | "

\n", 69 | "2. Once again, the necessary satellite data have already been downloaded for you. The cell below opens a GOES-16 visible satellite file for September 23rd, 2022 at 1800 UTC. " 70 | ] 71 | }, 72 | { 73 | "cell_type": "code", 74 | "execution_count": null, 75 | "id": "f05e0462-63b4-4557-8dab-f2d4df2d2a02", 76 | "metadata": {}, 77 | "outputs": [], 78 | "source": [ 79 | "time = datetime(2022,9,23,18)\n", 80 | "\n", 81 | "#Specifying the data's location.\n", 82 | "lab_data_loc = \"/data/AtmSci360/Lab_10/\"\n", 83 | "\n", 84 | "#Open the satellite data with xarray. \n", 85 | "sat_data = xr.open_dataset(f\"{lab_data_loc}{time:%m%d%y_%H}_goes16.nc\", engine=\"netcdf4\")\n", 86 | "sat_data" 87 | ] 88 | }, 89 | { 90 | "cell_type": "markdown", 91 | "id": "e11e57ed-bc7d-43a8-9b53-051cdd0e93e1", 92 | "metadata": {}, 93 | "source": [ 94 | "

\n", 95 | "3. Since the data are in the same format as in Lab 4 Part II, use the code block below to parse out the satellite data, get the projection, set up the colormap, and plot the visible satellite data. Be sure to have an appropriate color map, number of labels, and geographical references for your satellite image. Finally, add an appropriate title." 96 | ] 97 | }, 98 | { 99 | "cell_type": "code", 100 | "execution_count": null, 101 | "id": "a5dc3638-07d8-4607-8ed9-29ea0d8acc96", 102 | "metadata": {}, 103 | "outputs": [], 104 | "source": [] 105 | }, 106 | { 107 | "cell_type": "markdown", 108 | "id": "2ed22858-bbe2-4f0b-94e1-8a7d91c63715", 109 | "metadata": {}, 110 | "source": [ 111 | "### You have now completed the Python portion part of Lab 10. Be sure upload a fully rendered version of this Jupyter Notebook to your GitHub repository." 112 | ] 113 | } 114 | ], 115 | "metadata": { 116 | "kernelspec": { 117 | "display_name": "Python 3 (ipykernel)", 118 | "language": "python", 119 | "name": "python3" 120 | }, 121 | "language_info": { 122 | "codemirror_mode": { 123 | "name": "ipython", 124 | "version": 3 125 | }, 126 | "file_extension": ".py", 127 | "mimetype": "text/x-python", 128 | "name": "python", 129 | "nbconvert_exporter": "python", 130 | "pygments_lexer": "ipython3", 131 | "version": "3.7.12" 132 | } 133 | }, 134 | "nbformat": 4, 135 | "nbformat_minor": 5 136 | } 137 | -------------------------------------------------------------------------------- /Lab 11/Lab 11.docx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/clarkevanswx/SynopticMet1/a6032312c07cb8a42daaea6bbb0f574d0d5c7b85/Lab 11/Lab 11.docx -------------------------------------------------------------------------------- /Lab 11/Lab_11.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "3e3cdb32-bcaa-4356-8dd1-7e103d465b3e", 6 | "metadata": {}, 7 | "source": [ 8 | "## Lab 11: Plotting Skew-T, ln-p diagrams\n", 9 | "

\n", 10 | "In this exercise, we are going to create Skew-T, ln-p diagrams from observed data at specific locations. We will once again use the MetPy functionality for plotting Skew-T, ln-p diagrams.\n", 11 | "
\n", 12 | "### Module Documentation\n", 13 | "\n", 14 | "1. Matplotlib Pyplot: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.html\n", 15 | "2. Siphon Wyoming Upper Air: https://unidata.github.io/siphon/latest/api/simplewebservice.html#siphon.simplewebservice.wyoming.WyomingUpperAir.request_data\n", 16 | "3. MetPy Skew T: https://unidata.github.io/MetPy/latest/api/generated/metpy.plots.SkewT.html\n", 17 | "\n", 18 | "\n", 19 | "\n", 20 | "\n", 21 | "If you have any questions about the code below, feel free to reach out to me at mpvossen@uwm.edu. I am always willing to further explain the code.

\n", 22 | "\n", 23 | "---\n", 24 | "\n", 25 | "
\n", 26 | "1. As usual, we start by importing the modules we need for our Python code." 27 | ] 28 | }, 29 | { 30 | "cell_type": "code", 31 | "execution_count": null, 32 | "id": "80c2e492-0d91-4e68-afb6-ee0bbb86fe0c", 33 | "metadata": {}, 34 | "outputs": [], 35 | "source": [ 36 | "#from python's date and time module (from datetime), import the ability to work with date and times (import datetime)\n", 37 | "from datetime import datetime\n", 38 | "\n", 39 | "#using the module siphon and its ability to retrieve files from online (.simplewebservice), specifically for the University of Wyoming (.wyoming), \n", 40 | "#import the ability to download from the University of Wyoming's upper-air data archive\n", 41 | "from siphon.simplewebservice.wyoming import WyomingUpperAir\n", 42 | "\n", 43 | "#from metpy's plotting abilities, import the SkewT plotting class\n", 44 | "from metpy.plots import SkewT\n", 45 | "\n", 46 | "#import the plotting abilities of the module matplotlib (import matplotlib.pyplot) and save it to plt\n", 47 | "import matplotlib.pyplot as plt\n", 48 | "\n", 49 | "#from the metpy units feature (metpy.units), import the ability to assign and convert units (units)\n", 50 | "from metpy.units import units" 51 | ] 52 | }, 53 | { 54 | "cell_type": "markdown", 55 | "id": "88de5db7-6322-4e3f-a09b-3d007ac913ed", 56 | "metadata": {}, 57 | "source": [ 58 | "

\n", 59 | "2. Like we have done in past labs, we first need to create variables to collect our desired data. For a rawinsonde observation, we need two variables: the date and time for the observation (in datetime format) and the three-character identifier for the observation location (as a string). In the code block below, define the time variable as sounding_date and set it to 1200 UTC September 29, 2012 and define the site variable as station and set it to MPX (Chanhassen, MN)." 60 | ] 61 | }, 62 | { 63 | "cell_type": "code", 64 | "execution_count": null, 65 | "id": "af32ce46-361e-4611-ab2b-15e1746b3456", 66 | "metadata": {}, 67 | "outputs": [], 68 | "source": [] 69 | }, 70 | { 71 | "cell_type": "markdown", 72 | "id": "99f47963-cb24-4432-8ead-17f38ac1bf1c", 73 | "metadata": {}, 74 | "source": [ 75 | "

\n", 76 | "3. In the code section below, take the variables you created above and pass them into the WyomingUpperAir.request_data() function that has been started for you below. To see how to pass in the data, go to https://unidata.github.io/siphon/latest/api/simplewebservice.html#siphon.simplewebservice.wyoming.WyomingUpperAir.request_data." 77 | ] 78 | }, 79 | { 80 | "cell_type": "code", 81 | "execution_count": null, 82 | "id": "3f4cb031-8786-4904-8e1d-548e61daacab", 83 | "metadata": {}, 84 | "outputs": [], 85 | "source": [ 86 | "upper_air_data = WyomingUpperAir.request_data()" 87 | ] 88 | }, 89 | { 90 | "cell_type": "markdown", 91 | "id": "57a47890-d1a1-456d-b8ca-c79f10942579", 92 | "metadata": {}, 93 | "source": [ 94 | "

\n", 95 | "4. We are now ready to plot. Like in previous labs, we will once again use a function for our plot since we will be plotting multiple Skew-T, ln-p diagrams. Watch out in the comments for areas you need to fill in." 96 | ] 97 | }, 98 | { 99 | "cell_type": "code", 100 | "execution_count": null, 101 | "id": "bcc8a9b3-3c57-422a-a6d7-298d2c0003cc", 102 | "metadata": {}, 103 | "outputs": [], 104 | "source": [ 105 | "\"\"\"\n", 106 | "Below, a function is defined to plot a Skew-T, ln-p diagram from observation data. This plot contains\n", 107 | "temperature and dewpoint traces.\n", 108 | "\n", 109 | "\n", 110 | "INPUT:\n", 111 | " data : PANDAS DATAFRAME\n", 112 | " The observed rawinsonde data.\n", 113 | " station : STRING\n", 114 | " The rawinsonde station at which the data are valid.\n", 115 | " valid_time : DATETIME\n", 116 | " The date and time of the rawinsonde observation.\n", 117 | "\"\"\"\n", 118 | "\n", 119 | "\n", 120 | "def plot_skewT(data, station, valid_time):\n", 121 | " \"\"\"\n", 122 | " For the Skew-T, ln-p plots, MetPy requires all inputs into the plot to have units.\n", 123 | " Assign the appropriate units for the pressure, temperature, dew point, and wind variables.\n", 124 | " Note: we don't want to convert the units, we just need to append the data's units to the data.\n", 125 | " Hint: Assign the units using MetPy.\n", 126 | " \"\"\"\n", 127 | " \n", 128 | " \n", 129 | " \"\"\"\n", 130 | " Define a matplotlib figure that is 900px x 900px with a resolution of 300 dots per inch and save it to the variable fig.\n", 131 | " \"\"\"\n", 132 | "\n", 133 | " \n", 134 | " \n", 135 | " \"\"\"\n", 136 | " Here I initialize a Skew-T, ln-p diagram similar to how we defined our axes in previous labs.\n", 137 | " \n", 138 | " The first argument is the figure variable so MetPy knows what to use to create the Skew-T, ln-p axes.\n", 139 | " The second argument is how to rotate the temperature axis. The standard Skew-T, ln-p diagram uses a 45° skew.\n", 140 | " \"\"\"\n", 141 | " skew = SkewT(fig, rotation=45)\n", 142 | " \n", 143 | " \n", 144 | " \"\"\"\n", 145 | " Like in previous labs, we want to limit the plot's extent. However, in the case of Skew-T, ln-p diagrams,\n", 146 | " limiting the plot's extent is handled different from before. For the Skew-T, ln-p diagram, we must have\n", 147 | " separate statements for the x and y axes with the minimum value in the first argument and the maximum value in the second\n", 148 | " argument. In the two lines of code below, I set the extent of the plot to 1000-100 hPa and -30°C to 40°C.\n", 149 | " \"\"\"\n", 150 | " skew.ax.set_ylim(1000, 100)\n", 151 | " skew.ax.set_xlim(-30, 40)\n", 152 | " \n", 153 | " \n", 154 | " \"\"\"\n", 155 | " We are ready to now start drawing our temperature and dewpoint traces. 
In the lines\n", 156 | " below, use MetPy's SkewT plotting abilities to plot the temperature and dewpoint trace.\n", 157 | " \n", 158 | " Hint: Documentation for this is available at https://unidata.github.io/MetPy/latest/api/generated/metpy.plots.SkewT.html#metpy.plots.SkewT.plot.\n", 159 | " \"\"\"\n", 160 | " \n", 161 | " \n", 162 | " \"\"\"\n", 163 | " We need to thin the wind data since there are too many observations to legibly plot.\n", 164 | " In the code below, set up a slice for the wind barbs (like in mapping labs) and have it skip\n", 165 | " every 3 observations.\n", 166 | " \"\"\"\n", 167 | " wind_slice = \n", 168 | " \n", 169 | " \n", 170 | " \"\"\"\n", 171 | " In the line below, plot the wind barbs on the right side of the diagram, \n", 172 | " outside the main Skew-T section, using the slice we created to thin the wind barbs.\n", 173 | " \n", 174 | " Hint: Documentation for this is available at https://unidata.github.io/MetPy/latest/api/generated/metpy.plots.SkewT.html#metpy.plots.SkewT.plot_barbs.\n", 175 | " \"\"\"\n", 176 | " \n", 177 | " \n", 178 | " \n", 179 | " \"\"\"\n", 180 | " In order to see some of the thermodynamic properties of the sounding, we need to add\n", 181 | " dry adiabats, moist/pseudoadiabats, and mixing ratio lines. Using the documentation\n", 182 | " given below, add these lines to your diagram.\n", 183 | " \n", 184 | " Moist/Pseudoadiabats:\n", 185 | " https://unidata.github.io/MetPy/latest/api/generated/metpy.plots.SkewT.html#metpy.plots.SkewT.plot_moist_adiabats\n", 186 | " \n", 187 | " Dry Adiabats:\n", 188 | " https://unidata.github.io/MetPy/latest/api/generated/metpy.plots.SkewT.html#metpy.plots.SkewT.plot_dry_adiabats\n", 189 | " \n", 190 | " Mixing Ratio Lines:\n", 191 | " https://unidata.github.io/MetPy/latest/api/generated/metpy.plots.SkewT.html#metpy.plots.SkewT.plot_mixing_lines\n", 192 | " \n", 193 | " Note that the default styling for these lines makes the plot messy. \n", 194 | " Luckily, however, these augments use the standard styling arguments for\n", 195 | " lines in matplotlib. Thus, add styling information to the plotting commands\n", 196 | " that you add below to ensure a legible plot.\n", 197 | " \"\"\"\n", 198 | " \n", 199 | " \n", 200 | "\n", 201 | " \"\"\"\n", 202 | " Since we are not plotting geographic data, we need to provide labels for the x and y axes.\n", 203 | " The two lines provide appropriate labels for these axes.\n", 204 | " \"\"\"\n", 205 | " skew.ax.set_xlabel(\"Temperature ($\\degree$C)\")\n", 206 | " skew.ax.set_ylabel(\"Pressure (hPa)\")\n", 207 | " \n", 208 | " \n", 209 | " \"\"\"\n", 210 | " Finally, like all plots we have created this semester, this plot needs a title. Create an appropriate title \n", 211 | " for this plot using the station name and time that we used to get the data.\n", 212 | " \"\"\"\n", 213 | " \n", 214 | " \n", 215 | " " 216 | ] 217 | }, 218 | { 219 | "cell_type": "markdown", 220 | "id": "8234b848-c018-4067-8a55-f2404ee5db22", 221 | "metadata": {}, 222 | "source": [ 223 | "
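As a compact reference for the MetPy SkewT calls the comments above ask for, a hedged sketch is given below; the variable names (pressure, temperature, dewpoint, u_wind, v_wind) and the styling values are placeholders, not required choices.

```python
#hedged examples of the SkewT plotting calls; variable names and styling are placeholders
skew.plot(pressure, temperature, "red", linewidth=2)      #temperature trace
skew.plot(pressure, dewpoint, "green", linewidth=2)       #dewpoint trace
wind_slice = slice(None, None, 3)                         #keep every third wind observation
skew.plot_barbs(pressure[wind_slice], u_wind[wind_slice], v_wind[wind_slice])
skew.plot_dry_adiabats(alpha=0.25, color="orangered")     #dry adiabats
skew.plot_moist_adiabats(alpha=0.25, color="tab:green")   #moist/pseudoadiabats
skew.plot_mixing_lines(alpha=0.25, color="tab:blue")      #mixing ratio lines
```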

\n", 224 | "5. Call the plotting function you just created." 225 | ] 226 | }, 227 | { 228 | "cell_type": "code", 229 | "execution_count": null, 230 | "id": "9a13e55c-bad0-41e3-8a93-d2886879dc8a", 231 | "metadata": {}, 232 | "outputs": [], 233 | "source": [] 234 | }, 235 | { 236 | "cell_type": "markdown", 237 | "id": "af39306b-f94a-4ed1-af52-ad4d0e37efe8", 238 | "metadata": {}, 239 | "source": [ 240 | "

\n", 241 | "6. We need a total of six Skew-T, ln-p plots for today's lab. We could manually type out the function for each sounding we want to view, but today we are going to use a loop to plot all of our Skew-T, ln-p plots. I've complete the code for this below, but follow along with the comments to see what I am doing since you will have to recreate this code in the next lab. \n", 242 | "\n", 243 | "For today, we want to generate soundings for the following times and locations:\n", 244 | "\n", 245 | "| Time | Sounding Site |\n", 246 | "|:----:|:----------:|\n", 247 | "| 1/13/2007 0000 UTC | SGF |\n", 248 | "| 1/17/2004 1200 UTC | ILN |\n", 249 | "| 10/5/2013 0000 UTC | RAP |\n", 250 | "| 4/24/2009 1200 UTC | MPX |\n", 251 | "| 5/5/2007 0000 UTC | OUN |\n", 252 | "| 8/7/2013 0000 UTC | MPX |\n" 253 | ] 254 | }, 255 | { 256 | "cell_type": "code", 257 | "execution_count": null, 258 | "id": "16e955e9-24c8-4798-8cf1-744fd99d0f20", 259 | "metadata": {}, 260 | "outputs": [], 261 | "source": [ 262 | "\"\"\"\n", 263 | "I first create a list (using standard Python syntax, []) containing the sounding sites. Because Python is zero-based, a list is referenced\n", 264 | "as [index_0, index_1, index_2, ...]. Each index in the list contains the sounding site as a string, just like how we have set up the sounding\n", 265 | "site in the past. Note that the order of the list is important. We will see why in the next list.\n", 266 | "\"\"\"\n", 267 | "\n", 268 | "sites = [\"SGF\", \"ILN\", \"RAP\", \"MPX\", \"OUN\", \"MPX\"]\n", 269 | "\n", 270 | "\"\"\"\n", 271 | "I next create a list of times. Each index is a datetime like we have used in the past. Note that each index of this\n", 272 | "list corresponds with the site contained in the same index in the sites list.\n", 273 | "\"\"\"\n", 274 | "\n", 275 | "times = [datetime(2007,1,13,0), datetime(2004,1,17,12), datetime(2013,10,5,0), datetime(2009,4,24,12), datetime(2007,5,5,0), datetime(2013,8,7,0)]\n", 276 | "\n", 277 | "\"\"\"\n", 278 | "Next, we use a for loop to iterate over each sounding.\n", 279 | "Unlike other programming languages that require you to iterate sequentially\n", 280 | "(e.g., 1, 2, 3, ..., N) over a set of elements, Python allows you to iterate\n", 281 | "directly over the elements. Here, we have two elements over which we wish to\n", 282 | "iterate: the sites and times. We combine these using Python's zip function and\n", 283 | "iterate over them concurrently.\n", 284 | "\n", 285 | "We then use the individual locations and times as input to the data requesting and\n", 286 | "skew-T, ln-p plotting functions.\n", 287 | "\"\"\"\n", 288 | "\n", 289 | "#get the sounding data and plot the Skew-T\n", 290 | "for site, time in zip(sites, times):\n", 291 | " #download the data from the University of Wyoming for the date and site that we want\n", 292 | " data = WyomingUpperAir.request_data(time, site)\n", 293 | " #plot the Skew-T using the data we just got from the University of Wyoming.\n", 294 | " plot_skewT(data, site, time)" 295 | ] 296 | }, 297 | { 298 | "cell_type": "markdown", 299 | "id": "3ef724af-8c2e-4f19-848b-9218dca03c25", 300 | "metadata": {}, 301 | "source": [ 302 | "### You have now completed the Python portion of the lab. Be sure to submit the fully rendered Jupyter Notebook on GitHub when you are finished." 
303 | ] 304 | } 305 | ], 306 | "metadata": { 307 | "kernelspec": { 308 | "display_name": "Python 3 (ipykernel)", 309 | "language": "python", 310 | "name": "python3" 311 | }, 312 | "language_info": { 313 | "codemirror_mode": { 314 | "name": "ipython", 315 | "version": 3 316 | }, 317 | "file_extension": ".py", 318 | "mimetype": "text/x-python", 319 | "name": "python", 320 | "nbconvert_exporter": "python", 321 | "pygments_lexer": "ipython3", 322 | "version": "3.7.12" 323 | } 324 | }, 325 | "nbformat": 4, 326 | "nbformat_minor": 5 327 | } 328 | -------------------------------------------------------------------------------- /Lab 12/Lab 12.docx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/clarkevanswx/SynopticMet1/a6032312c07cb8a42daaea6bbb0f574d0d5c7b85/Lab 12/Lab 12.docx -------------------------------------------------------------------------------- /Lab 12/Lab_12_PT_1.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "3e3cdb32-bcaa-4356-8dd1-7e103d465b3e", 6 | "metadata": {}, 7 | "source": [ 8 | "## Lab 12 Part I: Plotting Skew-T, ln-p diagrams\n", 9 | "

\n", 10 | "In this next to final exercise, we are going to create more Skew-T, ln-p diagrams at multiple specified locations using observations to aid in identifying common features. \n", 11 | "
\n", 12 | "### Module Documentation\n", 13 | "\n", 14 | "1. Matplotlib Pyplot: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.html\n", 15 | "2. Siphon Wyoming Upper Air: https://unidata.github.io/siphon/latest/api/simplewebservice.html#siphon.simplewebservice.wyoming.WyomingUpperAir.request_data\n", 16 | "3. MetPy Skew T: https://unidata.github.io/MetPy/latest/api/generated/metpy.plots.SkewT.html\n", 17 | "\n", 18 | "\n", 19 | "\n", 20 | "\n", 21 | "If you have any questions about the code below, feel free to reach out to me at mpvossen@uwm.edu. I am always willing to further explain the code.

\n", 22 | "\n", 23 | "---\n", 24 | "\n", 25 | "
\n", 26 | "1. As usual, we start by importing the modules we need for our Python code." 27 | ] 28 | }, 29 | { 30 | "cell_type": "code", 31 | "execution_count": null, 32 | "id": "80c2e492-0d91-4e68-afb6-ee0bbb86fe0c", 33 | "metadata": {}, 34 | "outputs": [], 35 | "source": [ 36 | "#from python's date and time module (from datetime) import the ability to work with date and times (import datetime)\n", 37 | "from datetime import datetime\n", 38 | "\n", 39 | "#using the module siphon and its ability to retrieve files from online resources (.simplewebservice), specifically for the University of Wyoming (.wyoming), \n", 40 | "#import the ability to download from the University of Wyoming's upper-air data archive\n", 41 | "from siphon.simplewebservice.wyoming import WyomingUpperAir\n", 42 | "\n", 43 | "#from metpy's plotting abilities, import the SkewT plotting class\n", 44 | "from metpy.plots import SkewT\n", 45 | "\n", 46 | "#import the plotting abilities of the module matplotlib (import matplotlib.pyplot) and save it to plt\n", 47 | "import matplotlib.pyplot as plt\n", 48 | "\n", 49 | "#from metpy's units feature (metpy.units), import the ability to assign and convert units (units)\n", 50 | "from metpy.units import units" 51 | ] 52 | }, 53 | { 54 | "cell_type": "markdown", 55 | "id": "88de5db7-6322-4e3f-a09b-3d007ac913ed", 56 | "metadata": {}, 57 | "source": [ 58 | "

\n", 59 | "2. To make things easy today, copy your skew-T plotting function from Lab 11 into the cell below since we need the same plots as in Lab 11." 60 | ] 61 | }, 62 | { 63 | "cell_type": "code", 64 | "execution_count": null, 65 | "id": "af32ce46-361e-4611-ab2b-15e1746b3456", 66 | "metadata": {}, 67 | "outputs": [], 68 | "source": [] 69 | }, 70 | { 71 | "cell_type": "markdown", 72 | "id": "99f47963-cb24-4432-8ead-17f38ac1bf1c", 73 | "metadata": {}, 74 | "source": [ 75 | "

\n", 76 | "3. In the code section below, we need to make 5 Skew-T plots for the dates and stations below. In the code cell below, setup a loop similar to that in Lab 11 to generate the required soundings.\n", 77 | "\n", 78 | "| Time | Sounding Site |\n", 79 | "|:----:|:----------:|\n", 80 | "| 5/20/2013 1200 UTC | OUN |\n", 81 | "| 7/12/2016 0000 UTC | MPX |\n", 82 | "| 2/2/2011 0000 UTC | ILX |\n", 83 | "| 5/3/1999 1200 UTC | OUN |\n", 84 | "| 5/19/2013 1200 UTC | OUN|\n" 85 | ] 86 | }, 87 | { 88 | "cell_type": "code", 89 | "execution_count": null, 90 | "id": "3f4cb031-8786-4904-8e1d-548e61daacab", 91 | "metadata": {}, 92 | "outputs": [], 93 | "source": [] 94 | }, 95 | { 96 | "cell_type": "markdown", 97 | "id": "3ef724af-8c2e-4f19-848b-9218dca03c25", 98 | "metadata": {}, 99 | "source": [ 100 | "### You have now completed Part I of the lab. Be sure to submit the fully rendered Jupyter Notebook on GitHub when you are finished." 101 | ] 102 | } 103 | ], 104 | "metadata": { 105 | "kernelspec": { 106 | "display_name": "Python 3 (ipykernel)", 107 | "language": "python", 108 | "name": "python3" 109 | }, 110 | "language_info": { 111 | "codemirror_mode": { 112 | "name": "ipython", 113 | "version": 3 114 | }, 115 | "file_extension": ".py", 116 | "mimetype": "text/x-python", 117 | "name": "python", 118 | "nbconvert_exporter": "python", 119 | "pygments_lexer": "ipython3", 120 | "version": "3.7.12" 121 | } 122 | }, 123 | "nbformat": 4, 124 | "nbformat_minor": 5 125 | } 126 | -------------------------------------------------------------------------------- /Lab 12/Lab_12_PT_2.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "3e3cdb32-bcaa-4356-8dd1-7e103d465b3e", 6 | "metadata": {}, 7 | "source": [ 8 | "## Lab 12 Part II: Plotting Vertical Equivalent Potential Temperature\n", 9 | "

\n", 10 | "In this final exercise, we are going to create a vertical profile of equivalent potential temperature to find our most unstable parcel.\n", 11 | "
\n", 12 | "\n", 13 | "If you have any questions about the code below, feel free to reach out to me at mpvossen@uwm.edu. I am always willing to further explain the code.

\n", 14 | "\n", 15 | "---\n", 16 | "\n", 17 | "
\n", 18 | "1. As usual, we start by importing the modules we need for our Python code." 19 | ] 20 | }, 21 | { 22 | "cell_type": "code", 23 | "execution_count": null, 24 | "id": "80c2e492-0d91-4e68-afb6-ee0bbb86fe0c", 25 | "metadata": { 26 | "tags": [] 27 | }, 28 | "outputs": [], 29 | "source": [ 30 | "#from python's date and time module (from datetime) import the ability to work with date and times (import datetime)\n", 31 | "from datetime import datetime\n", 32 | "\n", 33 | "#using the module siphon and its ability to retrieve files from online (.simplewebservice) specifically for the University of Wyoming (.wyoming), \n", 34 | "#import the ability to download from the University of Wyoming's upper-air data archive\n", 35 | "from siphon.simplewebservice.wyoming import WyomingUpperAir\n", 36 | "\n", 37 | "#import the plotting abilities of the module matplotlib (import matplotlib.pyplot) and save it to plt\n", 38 | "import matplotlib.pyplot as plt\n", 39 | "\n", 40 | "#import the metpy calc module as calc\n", 41 | "import metpy.calc as calc\n", 42 | "\n", 43 | "#from metpy units module import the units function\n", 44 | "from metpy.units import units\n", 45 | "\n", 46 | "#from the metpy's units feature (metpy.units) import the ability to assign and convert units (units)\n", 47 | "from metpy.units import units\n", 48 | "\n", 49 | "#import the numpy module as np\n", 50 | "import numpy as np" 51 | ] 52 | }, 53 | { 54 | "cell_type": "markdown", 55 | "id": "99f47963-cb24-4432-8ead-17f38ac1bf1c", 56 | "metadata": {}, 57 | "source": [ 58 | "

\n", 59 | "2. In the code section below, I retrieve the sounding data for May 20th, 2013 at 1200 UTC and calculate equivalent potential temperature for you. The resulting equivalent potential temperature is saved back to the pandas data frame as \"theta-e\"." 60 | ] 61 | }, 62 | { 63 | "cell_type": "code", 64 | "execution_count": null, 65 | "id": "3f4cb031-8786-4904-8e1d-548e61daacab", 66 | "metadata": {}, 67 | "outputs": [], 68 | "source": [ 69 | "site = \"OUN\"\n", 70 | "time = datetime(2013,5,20,12)\n", 71 | "data = WyomingUpperAir.request_data(time, site)\n", 72 | "data[\"theta-e\"] = calc.equivalent_potential_temperature(np.array(data[\"pressure\"]) * units.hPa, np.array(data[\"temperature\"]) * units.degC, np.array(data[\"dewpoint\"]) * units.degC)\n", 73 | "data" 74 | ] 75 | }, 76 | { 77 | "cell_type": "markdown", 78 | "id": "57a47890-d1a1-456d-b8ca-c79f10942579", 79 | "metadata": {}, 80 | "source": [ 81 | "

\n", 82 | "3. Now that you have the data to work with, we are ready to create an equivalent potential temperature plot. An example of this kind of plot can be found in the link below. Using the example plot as a guide create a plot in matplotlib to show theta-e with height. Be sure to have an appropriate axis labels and a title. I've given you reference guides to various functions that you may need below. (hint: you need to start by setting up the figure like normal)\n", 83 | "\n", 84 | "#### Example Plot:\n", 85 | "https://panthers.sharepoint.com/:i:/s/InnovativeWeatherWebsite-Group/EflvdcvIegNNvoeF5dYDTx0BZk8E2dSTefTWxVT3tdZy9g\n", 86 | "\n", 87 | "\n", 88 | "#### References\n", 89 | "Plot: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.plot.html
\n", 90 | "xlabel: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.xlabel.html
\n", 91 | "ylabel: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.ylabel.html
\n", 92 | "title: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.title.html
\n" 93 | ] 94 | }, 95 | { 96 | "cell_type": "code", 97 | "execution_count": null, 98 | "id": "bcc8a9b3-3c57-422a-a6d7-298d2c0003cc", 99 | "metadata": {}, 100 | "outputs": [], 101 | "source": [ 102 | "\n", 103 | "plt.ylim(1000,200)\n", 104 | "plt.xlim(275,400)\n" 105 | ] 106 | }, 107 | { 108 | "cell_type": "markdown", 109 | "id": "24da137c-8505-40a4-a2b2-29765a6769da", 110 | "metadata": {}, 111 | "source": [ 112 | "4. In the section below rerun your theta-e plot for ILX on Feburary 2nd, 2011." 113 | ] 114 | }, 115 | { 116 | "cell_type": "code", 117 | "execution_count": null, 118 | "id": "64e51e15-cfa7-4ec0-b939-59da57e71e7c", 119 | "metadata": {}, 120 | "outputs": [], 121 | "source": [] 122 | }, 123 | { 124 | "cell_type": "markdown", 125 | "id": "3ef724af-8c2e-4f19-848b-9218dca03c25", 126 | "metadata": {}, 127 | "source": [ 128 | "### You have now completed Python portion of the lab. Be sure to submit the fully rendered Jupyter Notebook on GitHub when you are finished." 129 | ] 130 | } 131 | ], 132 | "metadata": { 133 | "kernelspec": { 134 | "display_name": "Python 3 (ipykernel)", 135 | "language": "python", 136 | "name": "python3" 137 | }, 138 | "language_info": { 139 | "codemirror_mode": { 140 | "name": "ipython", 141 | "version": 3 142 | }, 143 | "file_extension": ".py", 144 | "mimetype": "text/x-python", 145 | "name": "python", 146 | "nbconvert_exporter": "python", 147 | "pygments_lexer": "ipython3", 148 | "version": "3.7.12" 149 | } 150 | }, 151 | "nbformat": 4, 152 | "nbformat_minor": 5 153 | } 154 | -------------------------------------------------------------------------------- /Lab 2/Lab 2.docx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/clarkevanswx/SynopticMet1/a6032312c07cb8a42daaea6bbb0f574d0d5c7b85/Lab 2/Lab 2.docx -------------------------------------------------------------------------------- /Lab 3/Lab 3.docx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/clarkevanswx/SynopticMet1/a6032312c07cb8a42daaea6bbb0f574d0d5c7b85/Lab 3/Lab 3.docx -------------------------------------------------------------------------------- /Lab 3/Lab_3.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "d0091d9e-2896-4ecf-95f3-143837bb976a", 6 | "metadata": {}, 7 | "source": [ 8 | "## Lab 3: Plotting Surface Data and Comparing Surface to Upper-Air Data\n", 9 | "

\n", 10 | "In this week's tutorial, we are going to cover how to plot surface observations using matplotlib, cartopy, and MetPy. Once the tutorial is complete, you will have the maps you need to complete the rest of Lab 3.\n", 11 | "
\n", 12 | "### Module Documentation\n", 13 | "1. MetPy Metar Parsing Function: https://unidata.github.io/MetPy/latest/api/generated/metpy.io.parse_metar_file.html\n", 14 | "2. Datetime: https://docs.python.org/3/library/datetime.html\n", 15 | "3. Matplotlib Pyplot: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.html\n", 16 | "4. Caropy crs: https://scitools.org.uk/cartopy/docs/latest/reference/crs.html\n", 17 | "5. Cartopy Feature: https://scitools.org.uk/cartopy/docs/latest/matplotlib/feature_interface.html\n", 18 | "6. Metpy Station Plot: https://unidata.github.io/MetPy/latest/examples/plots/Station_Plot.html\n", 19 | "\n", 20 | "\n", 21 | "\n", 22 | "\n", 23 | "If you have any questions about the code below, feel free to reach out to me at mpvossen@uwm.edu. I am always willing to further explain the code.

\n", 24 | "\n", 25 | "---\n", 26 | "\n", 27 | "### Part I: Surface Plots\n", 28 | "
\n", 29 | "1. As usual, we start by importing the modules we need for our Python code." 30 | ] 31 | }, 32 | { 33 | "cell_type": "code", 34 | "execution_count": null, 35 | "id": "533aca43-8c0d-4847-b989-22caa2da3d08", 36 | "metadata": {}, 37 | "outputs": [], 38 | "source": [ 39 | "#from python's datetime module (from datetime) import the date and time capabilities (import datetime)\n", 40 | "from datetime import datetime\n", 41 | "#import the plotting abilities of the module matplotlib (import matplotlib.pyplot) and save it to the variable plt (as plt)\n", 42 | "import matplotlib.pyplot as plt\n", 43 | "#from metpy's plotting abilities (metpy.plots) import the abilities to create a station plot (StationPlot) and the sky cover symbols (sky_cover).\n", 44 | "from metpy.plots import StationPlot, sky_cover, current_weather\n", 45 | "#import the cartopy (cartopy) module's coordinate reference system (.crs) and save it to the variable crs\n", 46 | "import cartopy.crs as crs\n", 47 | "#import the cartopy (cartopy) module's ability to plot geographic data (.feature) and save it to the variable cfeature \n", 48 | "import cartopy.feature as cfeature\n", 49 | "#from the metpy's units feature (metpy.units) import the ability to assign and convert units (units)\n", 50 | "from metpy.units import units\n", 51 | "#from python's data import module (io) import the ability to read a string as a file. This allows us to avoid downloading files which speeds things up and keeps your files storage clean.\n", 52 | "from io import StringIO\n", 53 | "#import the module to download files from the internet \n", 54 | "import requests\n", 55 | "#from metpy's calculation abilities import the function to reduce the number of observations\n", 56 | "from metpy.calc import reduce_point_density\n", 57 | "#from metpy's ability to read files, from the metar functions import the parse_metar_file function\n", 58 | "from metpy.io.metar import parse_metar_file\n", 59 | "\n", 60 | "#add this line so that the JupyterHub plots the maps within the Jupyter Notebook\n", 61 | "%matplotlib inline" 62 | ] 63 | }, 64 | { 65 | "cell_type": "markdown", 66 | "id": "3879e0cf-32a1-4000-8e98-03de222bfce8", 67 | "metadata": {}, 68 | "source": [ 69 | "

\n", 70 | "2. Like we have done in previous labs, set the obs_time variable to a datetime specifically for September 14, 2022 at 1200 UTC" 71 | ] 72 | }, 73 | { 74 | "cell_type": "code", 75 | "execution_count": null, 76 | "id": "1686d814-1305-4591-9a84-8878645e1f2d", 77 | "metadata": {}, 78 | "outputs": [], 79 | "source": [ 80 | "obs_time =" 81 | ] 82 | }, 83 | { 84 | "cell_type": "markdown", 85 | "id": "c1dd15a3-f186-470b-bc58-1b5b132e0a31", 86 | "metadata": {}, 87 | "source": [ 88 | "

\n", 89 | "3. In the code section below, I download the METAR data for the date and time that we have specified - just like we did in Lab 1 Part I." 90 | ] 91 | }, 92 | { 93 | "cell_type": "code", 94 | "execution_count": null, 95 | "id": "da08e8f5-88d1-43b5-aebc-0ac4734ed600", 96 | "metadata": {}, 97 | "outputs": [], 98 | "source": [ 99 | "#the url address to the metar file.\n", 100 | "url = f\"https://thredds-test.unidata.ucar.edu/thredds/fileServer/noaaport/text/metar/metar_{obs_time:%Y%m%d_%H}00.txt\"\n", 101 | "\n", 102 | "#from the request module, use the get function to retrieve the raw website data from the url we defined above\n", 103 | "web_data = requests.get(url)\n", 104 | "\n", 105 | "#Here we take the web data from before and pull the content (.content) from the web_data object. \n", 106 | "#Then we take the content and decode it to something that we can use rather than the website html (.decode()).\n", 107 | "web_content = web_data.content.decode()\n", 108 | "\n", 109 | "#Here we take the decoded web content from above and make it a file object. This avoids downloading the data to your file system which\n", 110 | "#speeds things up, and it keeps your file space from becoming cluttered. Instead, this puts the METAR file directly to the RAM of the machine.\n", 111 | "data_file = StringIO(web_content)\n", 112 | "\n", 113 | "\n", 114 | "#We now tell MetPy to parse out the file we have downloaded (data_file). MetPy only can get the day of the month from the METAR,\n", 115 | "#so we need to specify the month (file_time.month) and year (file_time.year) from the file time that we set before\n", 116 | "#or else it will assume the current month and year. \n", 117 | "metar_data = parse_metar_file(data_file, month = obs_time.month, year=obs_time.year)\n", 118 | "\n", 119 | "#Display the resulting METAR pandas-formatted DataFrame\n", 120 | "metar_data" 121 | ] 122 | }, 123 | { 124 | "cell_type": "markdown", 125 | "id": "e209cbe6-65d0-461c-96e9-914b3250bb09", 126 | "metadata": {}, 127 | "source": [ 128 | "

\n", 129 | "4. As with any plot we make, we must first set the projection that we want to use for the map. In the code block below, setup a Lambert Conformal Conic Projection centered at 40°N and 95°W. Have the cone of the Lambert Conformal Conic projection intersect at 30°N and 50°N." 130 | ] 131 | }, 132 | { 133 | "cell_type": "code", 134 | "execution_count": null, 135 | "id": "0abe2170-db48-4fe4-b7bd-3496afb5e022", 136 | "metadata": {}, 137 | "outputs": [], 138 | "source": [ 139 | "proj = " 140 | ] 141 | }, 142 | { 143 | "cell_type": "markdown", 144 | "id": "4ec3d88b-d646-4de1-a053-1674df4568cb", 145 | "metadata": {}, 146 | "source": [ 147 | "

\n", 148 | "5. Now that we have specified the projection, let's convert the points from the latitude and longitude coordinate system to a coordinate system that works with the projection we just specified. When complete, the function will output x/y (or Cartesian) distances from the map's center point to each other point. These x/y (or Cartesian) positions will be used later." 149 | ] 150 | }, 151 | { 152 | "cell_type": "code", 153 | "execution_count": null, 154 | "id": "6bd93999-76a5-4a79-a6fc-f55375076dfc", 155 | "metadata": {}, 156 | "outputs": [], 157 | "source": [ 158 | "#projection_variable.transform_points(coordinate system data is in, longitude value, latitude value)\n", 159 | "point_locs = proj.transform_points(crs.PlateCarree(), metar_data['longitude'].values,metar_data['latitude'].values)\n", 160 | "point_locs" 161 | ] 162 | }, 163 | { 164 | "cell_type": "markdown", 165 | "id": "c91f3bf8-cf4b-4373-852b-9971bdbec243", 166 | "metadata": {}, 167 | "source": [ 168 | "

\n", 169 | "6. If we were to plot every station available in the data we downloaded, the data would be so dense that we would never be able read the plot. So, in the code section below, we are going to reduce the number observations using MetPy's reduce_point_density function. The reduce_point_density function allows us to specify to keep only those observations that are not within a specified radius of another kept observation. In the case below, I've specified this radius to be 150 km.

The variable metar_data (created in step 3 above) is a pandas DataFrame. The function reduce_point_density returns a boolean mask telling us which rows of this DataFrame to keep in order to reduce the data's density by the just-specified amount. The point_locs variable that we created in step 5 (with the x and y values) is the first of the inputs for the reduce_point_density function and the second input is the spacing radius in meters." 170 | ] 171 | }, 172 | { 173 | "cell_type": "code", 174 | "execution_count": null, 175 | "id": "3e607efb-52a5-432e-bd7f-f9a99b17e18a", 176 | "metadata": {}, 177 | "outputs": [], 178 | "source": [ 179 | "data = metar_data[reduce_point_density(point_locs, 150000.)]" 180 | ] 181 | }, 182 | { 183 | "cell_type": "markdown", 184 | "id": "15d97907-c32e-4caa-91a2-43ea8f023ff9", 185 | "metadata": {}, 186 | "source": [ 187 | "

\n", 188 | "7. When using METAR data, the data are in a default set of units, which are not always the units in which we want our data. We created conversion functions to address this problem in Lab 1, but there are multiple ways to address the unit conversion issue. Today, we are going to see how to do unit conversions in MetPy. MetPy has the ability to attach units to a single value or to every value in an array. Attaching units with MetPy can be useful when using MetPy calculation functions (as we will be using throughout the semester) since MetPy calculation functions look at a value's units and start off by automatically converting them to the units that the calculation function needs. Adding units also allows us to use MetPy's unit conversion functionality, which is useful for when we aren't doing any calculations. The table below shows the unit codes for a few of the units for which MetPy can work.

\n", 189 | "\n", 190 | "| Unit | MetPy Code |\n", 191 | "|:----:|:----------:|\n", 192 | "| Degrees Fahrenheit | degF |\n", 193 | "| Degrees Celsius | degC |\n", 194 | "| Kelvin | kelvin |\n", 195 | "| Knots | kt |\n", 196 | "| Meters Per Second | mps |\n", 197 | "| Miles Per Hour | mph |\n", 198 | "
\n" 199 | ] 200 | }, 201 | { 202 | "cell_type": "code", 203 | "execution_count": null, 204 | "id": "b3ecb3f7-acb6-4d2b-96a1-2586bec67f01", 205 | "metadata": {}, 206 | "outputs": [], 207 | "source": [ 208 | "#assign units to temperature\n", 209 | "temps = data[\"air_temperature\"].values * units.degC\n", 210 | "\n", 211 | "#convert the temperature to the desired units and save it back to the dataframe\n", 212 | "data[\"air_temperature\"] = temps.to(units.degF)" 213 | ] 214 | }, 215 | { 216 | "cell_type": "markdown", 217 | "id": "fdc741a6-3b5b-4a82-bc7f-50d43c39982a", 218 | "metadata": {}, 219 | "source": [ 220 | "

\n", 221 | "8. In the cell, below convert the dew-point temperature (dew_point_temperature), u-wind component (eastward_wind), and the v-wind component (northward_wind) from the standard METAR units to the appropriate imperial units." 222 | ] 223 | }, 224 | { 225 | "cell_type": "code", 226 | "execution_count": null, 227 | "id": "47c2721c-9f7b-4d6a-b4ff-e93d693724d3", 228 | "metadata": {}, 229 | "outputs": [], 230 | "source": [] 231 | }, 232 | { 233 | "cell_type": "markdown", 234 | "id": "736dc68c-40a7-4677-b898-000d5b8f6e0f", 235 | "metadata": {}, 236 | "source": [ 237 | "

\n", 238 | "9. All of the data now having been wrangled (often 80+% of an atmospheric scientist's job!), we are ready to plot. In the section below, I have written out some of the plotting code for you. The rest is left for you to complete. Pay attention to the comments to know where you need to fill in with your own code." 239 | ] 240 | }, 241 | { 242 | "cell_type": "code", 243 | "execution_count": null, 244 | "id": "5fd220e0-ee3c-410f-b30c-810b9f49c57b", 245 | "metadata": {}, 246 | "outputs": [], 247 | "source": [ 248 | "\"\"\"\n", 249 | "Create a figure with a size of 2000x1200 with a dpi of 300.\n", 250 | "\"\"\"\n", 251 | "\n", 252 | "fig = \n", 253 | "\n", 254 | "\"\"\"\n", 255 | "Create the subplot with the projection we defined before.\n", 256 | "\"\"\"\n", 257 | "\n", 258 | "ax = \n", 259 | "\n", 260 | "\"\"\"\n", 261 | "Add the states, borders, and coastlines to your map. Be sure to have your borders setup with appropriate line colors and linewidths.\n", 262 | "\"\"\"\n", 263 | "\n", 264 | "# Set plot bounds (115-70°W, 24-53°N).\n", 265 | "ax.set_extent((-115, -70, 24, 53))\n", 266 | "\n", 267 | "\"\"\"\n", 268 | "Here we setup the station plot. This line defines the observation locations.\n", 269 | "\n", 270 | "StationPlot(plot variable, longitude of each station, latitude of each station, clip_on=whether to clip off values that fall outside the plot,\n", 271 | "transform=the data's coordinate system, fontsize=the size of the data values)\n", 272 | "\"\"\"\n", 273 | "stationplot = StationPlot(ax, data['longitude'].values, data['latitude'].values,\n", 274 | " clip_on=True, transform=crs.PlateCarree(), fontsize=7)\n", 275 | "\n", 276 | "\"\"\"\n", 277 | "Plot the temperature and dew-point temperature in the standard station plot locations. This can be accomplished the same way\n", 278 | "we plotted these variables in the upper-air plots in Lab 2.\n", 279 | "\"\"\"\n", 280 | "\n", 281 | "\"\"\"\n", 282 | "This line plots the sea-level pressure values. In surface plots, the sea-level pressure is in the upper right, and so we tell \n", 283 | "MetPy to plot the sea-level pressure to the northeast ('NE') of the center. We typically only want to plot the last three digits of \n", 284 | "the sea-level pressure value, and so we have to add a formatter to this argument to do so. The lambda command in the formatter is essentially like a function.\n", 285 | "The lambda can take an argument, which in this case is v, and it has an expression, which in this case is the format function. In the format function, we \n", 286 | "take the variable v and remove the decimal places (.0) while keeping the number as a float (f). We multiply by 10 so we don't have the decimal place. \n", 287 | "Once the format function is finished, we keep only the last three digits ([-3:]).\n", 288 | "\"\"\"\n", 289 | "stationplot.plot_parameter('NE', data['air_pressure_at_sea_level'].values,\n", 290 | " formatter=lambda v: format(10 * v, '.0f')[-3:])\n", 291 | "\n", 292 | "\"\"\"\n", 293 | "This line adds the cloud coverage symbol in the middle of our station plot. In our ingested data, the cloud coverage comes as a number. 
The object sky_cover that we imported\n", 294 | "at the start holds what symbol the number represents, and so the plot_symbol matches the number with the symbol and then plots the symbol.\n", 295 | "\n", 296 | "plot_symbol(location, symbol number code, object containing what symbol each number code matches)\n", 297 | "\n", 298 | "\"\"\"\n", 299 | "stationplot.plot_symbol('C', data['cloud_coverage'].values, sky_cover)\n", 300 | "\n", 301 | "\"\"\"\n", 302 | "This line plots the current weather symbol to the left of center (W). Once again, this comes in as a number that we must match to a symbol to plot. At the start, we \n", 303 | "imported the current_weather object which has the symbols that match the codes, allowing us to again use the plot_symbol function to plot the current weather.\n", 304 | "\n", 305 | "\"\"\"\n", 306 | "stationplot.plot_symbol('W', data['current_wx1_symbol'].values, current_weather)\n", 307 | "\n", 308 | "\"\"\"\n", 309 | "Plot the wind barbs for each station plot. The u-component wind variable name is eastward_wind and the v-component variable name is northward_wind.\n", 310 | "\"\"\"\n", 311 | "\n", 312 | "\n", 313 | "\n", 314 | "\"\"\"\n", 315 | "Create an appropriate title for your map. Be sure it includes the vertical level and data valid time.\n", 316 | "\"\"\"\n", 317 | "plot_title=\n", 318 | " \n", 319 | "\"\"\"\n", 320 | "Here, I take our title and tell maptlotlib to add it to the plot. I also tell matplotlib to plot the title so that it is in bold text (weight=\"bold\").\n", 321 | "\"\"\"\n", 322 | "plt.title(plot_title, weight=\"bold\")\n", 323 | "\n", 324 | "\n", 325 | "plt.show()" 326 | ] 327 | }, 328 | { 329 | "cell_type": "markdown", 330 | "id": "be6b079d-4b33-4bea-a6dd-cb0ed84f46ed", 331 | "metadata": {}, 332 | "source": [ 333 | "
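For reference, here is a rough sketch of how the blanks in the cell above might be filled in. These fragments rely on the objects created in that cell (stationplot, data, proj, obs_time); the geographic styling, colors, and title wording are assumptions, not the required answer, and the figure size interprets 2000x1200 as pixels.

```python
# figure sized 2000x1200 pixels at 300 dpi
fig = plt.figure(figsize=(2000/300, 1200/300), dpi=300)

# axes using the Lambert Conformal projection defined earlier
ax = fig.add_subplot(1, 1, 1, projection=proj)

# geographic references: coastlines, national borders, and state lines
ax.add_feature(cfeature.COASTLINE.with_scale("50m"), linewidth=0.5, edgecolor="black")
ax.add_feature(cfeature.BORDERS.with_scale("50m"), linewidth=0.5, edgecolor="black")
ax.add_feature(cfeature.STATES.with_scale("50m"), linewidth=0.25, edgecolor="gray")

# temperature (NW) and dew point (SW) in the standard station-plot positions
stationplot.plot_parameter("NW", data["air_temperature"].values, color="red")
stationplot.plot_parameter("SW", data["dew_point_temperature"].values, color="darkgreen")

# wind barbs from the u- and v-wind components
stationplot.plot_barb(data["eastward_wind"].values, data["northward_wind"].values)

# title with the level (surface) and the valid time
plot_title = f"Surface Station Plot valid {obs_time:%H%M} UTC {obs_time:%d %b %Y}"
```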

\n", 334 | "### You have now completed the tutorial part of Lab 3. Be sure upload a fully rendered version of this Jupyter Notebook to your GitHub repository." 335 | ] 336 | } 337 | ], 338 | "metadata": { 339 | "kernelspec": { 340 | "display_name": "Python 3 (ipykernel)", 341 | "language": "python", 342 | "name": "python3" 343 | }, 344 | "language_info": { 345 | "codemirror_mode": { 346 | "name": "ipython", 347 | "version": 3 348 | }, 349 | "file_extension": ".py", 350 | "mimetype": "text/x-python", 351 | "name": "python", 352 | "nbconvert_exporter": "python", 353 | "pygments_lexer": "ipython3", 354 | "version": "3.7.12" 355 | } 356 | }, 357 | "nbformat": 4, 358 | "nbformat_minor": 5 359 | } 360 | -------------------------------------------------------------------------------- /Lab 4/Lab 4.docx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/clarkevanswx/SynopticMet1/a6032312c07cb8a42daaea6bbb0f574d0d5c7b85/Lab 4/Lab 4.docx -------------------------------------------------------------------------------- /Lab 4/Lab_4_PT_2.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "29c9fd6d-b76d-4f97-80d9-3d7d40990ae7", 6 | "metadata": {}, 7 | "source": [ 8 | "## Lab 4 Part II: Plotting Satellite Data\n", 9 | "

\n", 10 | "In this part of the tutorial, we create visible satellite imagery to compare with our model-derived upper-air contour plot from Part I.\n", 11 | "
\n", 12 | "### Module Documentation\n", 13 | "1. The datetime function from the datetime module: https://docs.python.org/3/library/datetime.html\n", 14 | "2. Cartopy Feature: https://scitools.org.uk/cartopy/docs/latest/matplotlib/feature_interface.html\n", 15 | "3. Matplotlib: https://matplotlib.org/\n", 16 | "4. Matplotlib Normalize: https://matplotlib.org/stable/api/_as_gen/matplotlib.colors.Normalize.html\n", 17 | "\n", 18 | "\n", 19 | "\n", 20 | "If you have any questions about the code below, feel free to reach out to me at mpvossen@uwm.edu. I am always willing to further explain the code.

\n", 21 | "\n", 22 | "---\n", 23 | "\n", 24 | "
\n", 25 | "1. As usual, we start by importing the modules we need for our Python code." 26 | ] 27 | }, 28 | { 29 | "cell_type": "code", 30 | "execution_count": null, 31 | "id": "82d63adc-bc01-4fe9-acd2-3f80cc8e5bb3", 32 | "metadata": {}, 33 | "outputs": [], 34 | "source": [ 35 | "#this time we want to import everything from the metpy module\n", 36 | "import metpy\n", 37 | "#from the plotting portion of metpy import the add_timestamp function\n", 38 | "from metpy.plots import add_timestamp\n", 39 | "\n", 40 | "#import the crs part of the cartopy module. This is for geographic parts of the data such as map projections.\n", 41 | "import cartopy.crs as ccrs\n", 42 | "\n", 43 | "#import the feature part of the cartopy module. This is for things such as state lines.\n", 44 | "import cartopy.feature as cfeature\n", 45 | "\n", 46 | "#import the pyplot part of the matplotlib module. This is matplotlib's main Python plotting module.\n", 47 | "import matplotlib.pyplot as plt\n", 48 | "\n", 49 | "#import the datetime feature of the datetime module. This is for working with dates and times.\n", 50 | "from datetime import datetime\n", 51 | "\n", 52 | "#from the matplotlib module, in the colors part, import the Normalize function. This is for adjusting the colors we use to plot the satellite data.\n", 53 | "from matplotlib.colors import Normalize\n", 54 | "\n", 55 | "#import the module xarray and save it to xr.\n", 56 | "import xarray as xr\n", 57 | "\n", 58 | "#add this line so that the JupyterHub plots the maps within the Jupyter Notebook\n", 59 | "%matplotlib inline" 60 | ] 61 | }, 62 | { 63 | "cell_type": "markdown", 64 | "id": "448f6ee3-d34d-4f46-acdc-442b19a3daff", 65 | "metadata": {}, 66 | "source": [ 67 | "

\n", 68 | "2. Once again, I have downloaded the needed data for you. This time, however, the data is in netCDF format, another common atmospheric data format. Whereas most model datasets are provided in GRIB format, most other atmospheric datasets are provided in netCDF format. One advantage of netCDF is that it is what is known as self-describing, meaning that everything you need (or that Python needs) to interpret the data is provided in the file - you don't need to look up something from elsewhere!\n", 69 | "\n", 70 | "We can use xarray to open the netCDF file, except we now need to use the netCDF parsing module netcdf4 for our parsing engine. I open the satellite data in xarray for you in the cell below. Notice that the data looks exactly like it did back in Lab 1. This is because the TDSCatalog function we used in Lab 1 put the data in xarray already for us. " 71 | ] 72 | }, 73 | { 74 | "cell_type": "code", 75 | "execution_count": null, 76 | "id": "f05e0462-63b4-4557-8dab-f2d4df2d2a02", 77 | "metadata": {}, 78 | "outputs": [], 79 | "source": [ 80 | "#Specifying the data's location.\n", 81 | "data_location = \"/srv/data/shared_notebooks/Synoptic1-AtmSci360/Data/Lab_4/\"\n", 82 | "\n", 83 | "#Open the satellite data with xarray. \n", 84 | "sat_data = xr.open_dataset(f\"{data_location}040721_18_sat.nc\", engine=\"netcdf4\")\n", 85 | "sat_data" 86 | ] 87 | }, 88 | { 89 | "cell_type": "markdown", 90 | "id": "e11e57ed-bc7d-43a8-9b53-051cdd0e93e1", 91 | "metadata": {}, 92 | "source": [ 93 | "

\n", 94 | "3. Since the data are in the same format as in Lab 1 Part III, use the code block below to parse out the satellite data, get the projection, set up the colormap, and plot the visible satellite data. Be sure to have appropriate the appropriate number of labels and geographical references in your satellite image. Finally, add your name to the plot using a left-justified title." 95 | ] 96 | }, 97 | { 98 | "cell_type": "code", 99 | "execution_count": null, 100 | "id": "a5dc3638-07d8-4607-8ed9-29ea0d8acc96", 101 | "metadata": {}, 102 | "outputs": [], 103 | "source": [] 104 | }, 105 | { 106 | "cell_type": "markdown", 107 | "id": "2ed22858-bbe2-4f0b-94e1-8a7d91c63715", 108 | "metadata": {}, 109 | "source": [ 110 | "### You have now completed the tutorial part of Lab 4. Be sure upload a fully rendered version of this Jupyter Notebook to your GitHub repository." 111 | ] 112 | } 113 | ], 114 | "metadata": { 115 | "kernelspec": { 116 | "display_name": "Python 3 (ipykernel)", 117 | "language": "python", 118 | "name": "python3" 119 | }, 120 | "language_info": { 121 | "codemirror_mode": { 122 | "name": "ipython", 123 | "version": 3 124 | }, 125 | "file_extension": ".py", 126 | "mimetype": "text/x-python", 127 | "name": "python", 128 | "nbconvert_exporter": "python", 129 | "pygments_lexer": "ipython3", 130 | "version": "3.7.12" 131 | } 132 | }, 133 | "nbformat": 4, 134 | "nbformat_minor": 5 135 | } 136 | -------------------------------------------------------------------------------- /Lab 5/Lab 5.docx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/clarkevanswx/SynopticMet1/a6032312c07cb8a42daaea6bbb0f574d0d5c7b85/Lab 5/Lab 5.docx -------------------------------------------------------------------------------- /Lab 5/Lab_5_PT_1.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "5deee5c6-bc09-4af4-bcc4-8fccc82e0986", 6 | "metadata": { 7 | "tags": [] 8 | }, 9 | "source": [ 10 | "## Lab 5 Part I: Plotting Contoured Surface Data\n", 11 | "

\n", 12 | "\n", 13 | "In this week's tutorial, we are going to learn how to grid METAR data so that we can have Python objectively isopleth the observations while also generating surface station plots. \n", 14 | "
\n", 15 | "\n", 16 | "### Module Documentation\n", 17 | "1. Matplotlib Pyplot: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.html\n", 18 | "2. Caropy crs: https://scitools.org.uk/cartopy/docs/latest/reference/crs.html\n", 19 | "3. Cartopy Feature: https://scitools.org.uk/cartopy/docs/latest/matplotlib/feature_interface.html\n", 20 | "4. Matplotlib Colors: https://matplotlib.org/stable/gallery/color/named_colors.html\n", 21 | "5. Matplotlib Contour: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.contour.html\n", 22 | "6. Scipy Gaussian Filter: https://docs.scipy.org/doc/scipy/reference/generated/scipy.ndimage.gaussian_filter.html\n", 23 | "7. Scipy Griddata function: https://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.griddata.html\n", 24 | "\n", 25 | "\n", 26 | "If you have any questions about the code below, feel free to reach out to me at mpvossen@uwm.edu. I am always willing to further explain the code.

\n", 27 | "\n", 28 | "---\n", 29 | "\n", 30 | "
\n", 31 | "1. As usual, we start by importing the modules we need for our Python code." 32 | ] 33 | }, 34 | { 35 | "cell_type": "code", 36 | "execution_count": null, 37 | "id": "9d0778ac-d8ad-43a7-bef6-cf6d5e8a9cb9", 38 | "metadata": {}, 39 | "outputs": [], 40 | "source": [ 41 | "#from the data reading capabilities of metpy (metpy.io) import the METAR reading capability (parse_metar_file)\n", 42 | "from metpy.io import parse_metar_file\n", 43 | "\n", 44 | "#from the dates and time code(datetime), import the date and time reading capabilities (datetime)\n", 45 | "from datetime import datetime\n", 46 | "\n", 47 | "#import the module to download files from the internet\n", 48 | "import requests\n", 49 | "\n", 50 | "#from the interpolate sub module of the scipy module, import the griddata function. This is what we use to create gridded data.\n", 51 | "from scipy.interpolate import griddata\n", 52 | "\n", 53 | "#from metpy's plotting abilities (metpy.plots) import the abilities to create a station plot (StationPlot) and the sky cover symbols (sky_cover)\n", 54 | "from metpy.plots import StationPlot, sky_cover\n", 55 | "\n", 56 | "#import the numpy module and save it to the variable np\n", 57 | "import numpy as np\n", 58 | "\n", 59 | "#import the cartopy (cartopy) module's coordinate reference system (.crs) and save it to the variable crs\n", 60 | "import cartopy.crs as crs\n", 61 | "\n", 62 | "#import the cartopy (cartopy) module's ability to plot geographic data (.feature) and save it to the variable cfeature \n", 63 | "import cartopy.feature as cfeature\n", 64 | "\n", 65 | "#import the pyplot submodule from the matplotlib module and save it to the variable plt\n", 66 | "import matplotlib.pyplot as plt\n", 67 | "\n", 68 | "#from the metpy calculations module, import the reduce_point_density function\n", 69 | "from metpy.calc import reduce_point_density\n", 70 | "\n", 71 | "#from the ndimage submodule within the scipy module, import the gaussian_filter function\n", 72 | "from scipy.ndimage import gaussian_filter\n", 73 | "\n", 74 | "#from metpy's units capabilites, import the units class\n", 75 | "from metpy.units import units\n", 76 | "\n", 77 | "#import the module sys\n", 78 | "import sys\n", 79 | "\n", 80 | "#this line declares that we would like to look for modules outside the default location. Here I specify the extra modules directory \n", 81 | "#within the shared_notebook space for synoptic I.\n", 82 | "sys.path.insert(0, '/srv/data/shared_notebooks/Synoptic1-AtmSci360/Extra_Modules/')\n", 83 | "\n", 84 | "#from the iowa_metar module (found in the special path above), import the download_alldata function\n", 85 | "from iowa_metar import download_alldata" 86 | ] 87 | }, 88 | { 89 | "cell_type": "markdown", 90 | "id": "f19d8308-9061-4692-985c-d8062e9439cf", 91 | "metadata": {}, 92 | "source": [ 93 | "

\n", 94 | "2. In the cell below define a datetime for August 30, 2021 at 1200 UTC." 95 | ] 96 | }, 97 | { 98 | "cell_type": "code", 99 | "execution_count": null, 100 | "id": "69f86a3d-d8eb-435b-9d61-a47b26ee25c9", 101 | "metadata": {}, 102 | "outputs": [], 103 | "source": [ 104 | "file_time = " 105 | ] 106 | }, 107 | { 108 | "cell_type": "markdown", 109 | "id": "08905507-59a3-4859-9410-00a8257d5bcc", 110 | "metadata": {}, 111 | "source": [ 112 | "

\n", 113 | "3. In any Python code we create, it is a best practice to avoid repeatedly coding the same thing multiple times. One way we can avoid making our code repetitive is by putting the code that we will need to call multiple times into a function. In this lab, we are going to create multiple surface plots for different times, so we should place the data processing and data plotting code in a function. I've started to create a data processing function for you in the code block below. Follow the instructions in the code and complete the data processing function." 114 | ] 115 | }, 116 | { 117 | "cell_type": "code", 118 | "execution_count": null, 119 | "id": "20eadfcb-37fc-4c2a-a70b-16b5d0fdbdf6", 120 | "metadata": {}, 121 | "outputs": [], 122 | "source": [ 123 | "\"\"\"\n", 124 | "Below I define a function to process METAR data. This function downloads, thins, grids, and does a unit conversion on the METAR data. This function requires the following variables:\n", 125 | "\n", 126 | "INPUT:\n", 127 | " time : DATETIME\n", 128 | " The time for which you would like surface data.\n", 129 | " \n", 130 | "OUTPUT:\n", 131 | " metar_data : PANDAS DATAFRAME\n", 132 | " Dataframe containing processed METAR data\n", 133 | " new_pressure : NUMPY ARRAY\n", 134 | " Gridded sea-level pressure data\n", 135 | " new_lat : NUMPY ARRAY\n", 136 | " Latitude values corresponding with the gridded sea-level pressure data\n", 137 | " new_lon : NUMPY ARRAY\n", 138 | " Longitude values corresponding with the gridded sea-level pressure data\n", 139 | " proj : CRS OBJECT\n", 140 | " The projection to use for your map\n", 141 | "\n", 142 | "\"\"\"\n", 143 | "\n", 144 | "\n", 145 | "def process_surface_data(time):\n", 146 | " \"\"\"\n", 147 | " Using the iowa_metar module's download_alldata function, we download the METAR observations for the time we have specified.\n", 148 | " The resulting output will be a StringIO object (like we worked with in Lab 3) that contains the raw METAR data. \n", 149 | " \"\"\"\n", 150 | " data_file = download_alldata(time)\n", 151 | " \n", 152 | " \n", 153 | " \"\"\"\n", 154 | " As in Lab 3, setup metpy's METAR parsing function to parse out the data we retrieved from Iowa State (data_file).\n", 155 | " \"\"\"\n", 156 | " metar_data = \n", 157 | " \n", 158 | " \n", 159 | " \"\"\"\n", 160 | " Again as in Lab 3, we need to drop any missing values that are in the latitude, longitude, and air_pressure_at_sea_level variables.\n", 161 | " For the gridding process to work these missing values must be removed.\n", 162 | " \"\"\"\n", 163 | " metar_data = metar_data.dropna(how='any', subset=['latitude', 'longitude', 'air_pressure_at_sea_level'])\n", 164 | " \n", 165 | " \"\"\"\n", 166 | " Setup a Lambert Conformal Conic Projection centered at 35°N and 95°W. Have the cone of the Lambert Conformal Conic projection intersect the Earth at 27.5°N and 42.5°N.\n", 167 | " \"\"\"\n", 168 | " proj = \n", 169 | " \n", 170 | " \"\"\"\n", 171 | " Now that we have specified the projection, let's convert the points from the latitude and longitude coordinate system to a coordinate system that works with the projection\n", 172 | " we just specified. 
\n", 173 | " \"\"\"\n", 174 | " point_locs = proj.transform_points(crs.PlateCarree(), metar_data['longitude'].values,metar_data['latitude'].values)\n", 175 | " \n", 176 | " \"\"\"\n", 177 | " Before we reduce the observation density like we did in Lab 3, we need to grid the data.\n", 178 | " \n", 179 | " The first step to gridding is to define the latitude and longitude grids for each point in our gridded data.\n", 180 | " In the code below, I define our grid to have longitudes ranging from 130°W to 60°W with a resolution of 0.1° (~9 km) and latitudes ranging from\n", 181 | " 22°N to 65°N at a resolution of 0.1° (11 km). Note that the np.arange function includes an extra increment to the end value since\n", 182 | " Python does not include the ending value when creating the list.\n", 183 | " \n", 184 | " np.arange(start, stop, increment)\n", 185 | " \n", 186 | " \"\"\"\n", 187 | " new_lon = np.arange(-130,-59.9,0.1)\n", 188 | " new_lat = np.arange(22,65.1,0.1)\n", 189 | " \n", 190 | " \n", 191 | " \"\"\"\n", 192 | " We now need to retrieve the latitude and longitude data from our METAR stations. \n", 193 | " From the METAR data, retrieve the latitude and longitude values and same them to separate variables.\n", 194 | " \"\"\"\n", 195 | " metar_latitude =\n", 196 | " metar_longitude = \n", 197 | " \n", 198 | " \n", 199 | " \"\"\"\n", 200 | " Since we want to grid the sea-level pressure, we need to pull the sea-level pressure values.\n", 201 | " \"\"\"\n", 202 | " metar_slp = metar_data['air_pressure_at_sea_level'].values\n", 203 | " \n", 204 | " \n", 205 | " \"\"\"\n", 206 | " We are now ready to grid our data using scipy's griddata function. \n", 207 | " In the function below the first argument ((metar_longitude, metar_latitude)) is the METAR station latitude and longitude values.\n", 208 | " The second argument (metar_slp) is the sea-level pressure for each METAR station. \n", 209 | " The third argument ((new_lon[None,:], new_lat[:,None])) is the grid's latitude and longitude values. The indexing done to the grid\n", 210 | " latitude and longitude values is to take the new longitude and latitude values and arrange them so they match their dimension. \n", 211 | " In this sense, new_lat[:,None] arranges the latitude values in a vertical list with the largest values at the top and new_lon[None, :]\n", 212 | " arranges the longitude values in a horizontal list with the largest values on the right side.\n", 213 | " The fourth argument (method='linear') is the method used to grid the data. Here linear is used as a simple method to interpolate the\n", 214 | " value at each grid. You can also use the \"nearest\" option or the \"cubic\" option.\n", 215 | " \"\"\"\n", 216 | " new_pressure = griddata((metar_longitude, metar_latitude), metar_slp, (new_lon[None,:], new_lat[:,None]), method='linear')\n", 217 | " \n", 218 | " \"\"\"\n", 219 | " As in Lab 4's Part I, we can apply a Gaussian filter to smooth the potentially noisy sea-level pressure data. Sea-level pressure observations\n", 220 | " tend to have a lot of small-scale variability, so that plotting the raw data without smoothing will create a plot with a lot of jagged lines.\n", 221 | " Raw data requires more smoothing than GFS model-derived analysis data, so in this case we set sigma to 8 to apply a longer/larger smoothing scale.\n", 222 | " \"\"\"\n", 223 | " new_pressure = gaussian_filter(new_pressure, 8)\n", 224 | " \n", 225 | " \"\"\"\n", 226 | " Now that the data are gridded, we can thin our point observations. 
In this case a spacing radius of \n", 227 | " 200 km is needed.\n", 228 | " \"\"\"\n", 229 | " metar_data = metar_data[reduce_point_density(point_locs, 200000.)]\n", 230 | " \n", 231 | " \n", 232 | " \"\"\"\n", 233 | " The METAR data are still in the standard METAR units. Convert the dew point, air temperature, and wind to the appropriate imperial units.\n", 234 | " \"\"\"\n", 235 | " \n", 236 | " \n", 237 | " \n", 238 | " \"\"\"\n", 239 | " Finally, we return the processed data.\n", 240 | " \"\"\"\n", 241 | " return metar_data, new_pressure, new_lat, new_lon, proj\n", 242 | " \n", 243 | " \n", 244 | " " 245 | ] 246 | }, 247 | { 248 | "cell_type": "markdown", 249 | "id": "6b36d4dd-a220-4cb9-bdca-28184bc81612", 250 | "metadata": {}, 251 | "source": [ 252 | "

\n", 253 | "4. Call the function we just created, using the datetime object you created in step 2." 254 | ] 255 | }, 256 | { 257 | "cell_type": "code", 258 | "execution_count": null, 259 | "id": "3fd2f7c7-136e-47c0-b7b1-af14254d7c38", 260 | "metadata": {}, 261 | "outputs": [], 262 | "source": [ 263 | "metar_data, new_pressure, new_lat, new_lon, proj =" 264 | ] 265 | }, 266 | { 267 | "cell_type": "markdown", 268 | "id": "7cec9b3b-37db-4ab3-b5f3-b7827efb6d51", 269 | "metadata": {}, 270 | "source": [ 271 | "

\n", 272 | "5. Since we are making multiple plots of the same map, lets also put our map code into a function. Fill out the function where you are asked to do so in the comments below." 273 | ] 274 | }, 275 | { 276 | "cell_type": "code", 277 | "execution_count": null, 278 | "id": "0a42d3b5-8cde-45b8-b795-dbdc83a04ce0", 279 | "metadata": {}, 280 | "outputs": [], 281 | "source": [ 282 | "\"\"\"\n", 283 | "Below I define a function to plot surface data in station-model format and sea-level pressure contours\n", 284 | "from the gridded data obtained above.\n", 285 | "\n", 286 | "INPUT:\n", 287 | "\n", 288 | " raw_data : PANDAS DATAFRAME\n", 289 | " The dataframe containing the METAR data\n", 290 | " date : DATETIME\n", 291 | " The time for which the plot is valid\n", 292 | " pressure_grid : NUMPY ARRAY\n", 293 | " The gridded pressure data\n", 294 | " lat_grid : NUMPY ARRAY\n", 295 | " Latitude values corresponding with the gridded sea-level pressure data\n", 296 | " lon_grid : NUMPY ARRAY\n", 297 | " Longitude values corresponding with the gridded sea-level pressure data\n", 298 | " proj : CRS OBJECT\n", 299 | " The projection you are going to use for your map\n", 300 | "\n", 301 | "\"\"\"\n", 302 | "\n", 303 | "def plot_surface(raw_data, date, pressure_grid, lat_grid, lon_grid, proj):\n", 304 | "\n", 305 | " \n", 306 | " \"\"\"\n", 307 | " Create a figure with a size of 1000px x 1500px and a resolution of 300 dots per inch,\n", 308 | " then set up an axes object (named ax) using the projection we previously defined for our map.\n", 309 | " \"\"\"\n", 310 | "\n", 311 | " \n", 312 | " \n", 313 | " \"\"\"\n", 314 | " Add the appropriate amount of geographic data. Be sure you follow \"good map\" suggestions with the geographic data styling.\n", 315 | " \"\"\"\n", 316 | "\n", 317 | " \n", 318 | " \n", 319 | " \n", 320 | " \"\"\"\n", 321 | " Limit the map to areas between 125°-65°W longitude and 23°-60°N latitude.\n", 322 | " \"\"\"\n", 323 | "\n", 324 | " \n", 325 | " \n", 326 | " \"\"\"\n", 327 | " Setup the station-model plots.\n", 328 | " \"\"\"\n", 329 | "\n", 330 | " \n", 331 | " \"\"\"\n", 332 | " Add temperature, dewpoint, wind, sea-level pressure, and sky coverage to the station plot in the appropriate locations with appropriate formatting.\n", 333 | " Make sure that the data displayed in your station plot is clear and easy to read.\n", 334 | " \"\"\"\n", 335 | "\n", 336 | "\n", 337 | " \"\"\"\n", 338 | " Next, add sea-level pressure contours using the gridded pressure data we created earlier. Add styling information to the partially completed\n", 339 | " contour statement below to make the contours easier to read.\n", 340 | " \"\"\"\n", 341 | " cont_p = plt.contour(lon_grid, lat_grid, pressure_grid, np.arange(0,2000,4), transform=crs.PlateCarree())\n", 342 | " \n", 343 | " \n", 344 | " \"\"\"\n", 345 | " Next, we need to label our contours. 
Add any necessary styling information to the partially completed clabel statement.\n", 346 | " \"\"\"\n", 347 | " ax.clabel(cont_p, cont_p.levels, fmt=lambda v: format(v, '.0f'))\n", 348 | " \n", 349 | " \n", 350 | " \"\"\"\n", 351 | " Let's add symbols denoting minima (lows) and maxima (highs) in the gridded sea-level pressure data.\n", 352 | " The function below takes our gridded data, find the minimum and maximum points, and then plots\n", 353 | " the appropriate symbol with the value.\n", 354 | " \n", 355 | " The first argument is our axes variable.\n", 356 | " The second and third arguments are our longitude and latitude variables, respectively.\n", 357 | " The fourth argument is our gridded sea-level pressure data.\n", 358 | " The fifth argument says if we are looking for lows (min) or highs (max).\n", 359 | " The sixth argument says how much of a spatial buffer to use when searching for highs and lows. The function will be more sensitive to smaller-\n", 360 | " scale highs and lows (thus leading to more highs and lows being plotted) when this value is smaller. The opposite is true when this value is larger.\n", 361 | " You will have to experiment with different values to find one that displays the appropriate amount of highs and lows (i.e., reflecting synoptic-scale\n", 362 | " features) rather than every high and low point in the data.\n", 363 | " The seventh argument is the size of the high or low symbols.\n", 364 | " The final argument is the data's coordinate system.\n", 365 | " \"\"\"\n", 366 | " plot_maxmin_points(ax, lon_grid, lat_grid, pressure_grid, \"min\", 100, textsize=20, transform=crs.PlateCarree())\n", 367 | " plot_maxmin_points(ax, lon_grid, lat_grid, pressure_grid, \"max\", 100, textsize=20, transform=crs.PlateCarree())\n", 368 | " \n", 369 | " \n", 370 | " \"\"\"\n", 371 | " Finally, add an appropriate title for the map that shows what is plotted and the time at which the map is valid.\n", 372 | " \"\"\"\n", 373 | " \n", 374 | " \n", 375 | " \n" 376 | ] 377 | }, 378 | { 379 | "cell_type": "markdown", 380 | "id": "2696522b-219c-48e8-8111-f7edc69a2bb1", 381 | "metadata": {}, 382 | "source": [ 383 | "

\n", 384 | "6. Call the plotting function." 385 | ] 386 | }, 387 | { 388 | "cell_type": "code", 389 | "execution_count": null, 390 | "id": "0a2e15e2-c8d1-4100-b3df-45cc125e12eb", 391 | "metadata": {}, 392 | "outputs": [], 393 | "source": [] 394 | }, 395 | { 396 | "cell_type": "markdown", 397 | "id": "0d933f05-cf38-4849-9cda-0b9a0bfac25e", 398 | "metadata": {}, 399 | "source": [ 400 | "

\n", 401 | "7. Reuse the functions you have created to create another plot for July 28th, 2022 at 1200 UTC. Don't forget to first create a new datetime object for this time!" 402 | ] 403 | }, 404 | { 405 | "cell_type": "raw", 406 | "id": "712cea35-97bb-4683-887c-10f1b5d19194", 407 | "metadata": {}, 408 | "source": [] 409 | }, 410 | { 411 | "cell_type": "markdown", 412 | "id": "aea8d2b0-e875-4a7e-be4c-c3e28c496309", 413 | "metadata": {}, 414 | "source": [ 415 | "### You have now completed Part I of the Python portion of the lab. Be sure to submit the fully rendered Jupyter Notebook on GitHub when you are finished." 416 | ] 417 | } 418 | ], 419 | "metadata": { 420 | "kernelspec": { 421 | "display_name": "Python 3 (ipykernel)", 422 | "language": "python", 423 | "name": "python3" 424 | }, 425 | "language_info": { 426 | "codemirror_mode": { 427 | "name": "ipython", 428 | "version": 3 429 | }, 430 | "file_extension": ".py", 431 | "mimetype": "text/x-python", 432 | "name": "python", 433 | "nbconvert_exporter": "python", 434 | "pygments_lexer": "ipython3", 435 | "version": "3.7.12" 436 | } 437 | }, 438 | "nbformat": 4, 439 | "nbformat_minor": 5 440 | } 441 | -------------------------------------------------------------------------------- /Lab 5/Lab_5_PT_2.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "5deee5c6-bc09-4af4-bcc4-8fccc82e0986", 6 | "metadata": {}, 7 | "source": [ 8 | "## Lab 5 Part II: Plotting Contoured Upper-Air Data\n", 9 | "

\n", 10 | "\n", 11 | "In this part of the tutorial, we are going to plot station data (observations) at 500 hPa and GFS model-derived 500 hPa geopotential height isopleths.\n", 12 | "
\n", 13 | "### Module Documentation\n", 14 | "\n", 15 | "1. Siphon IAStateUpperAir: https://unidata.github.io/siphon/latest/api/simplewebservice.html#module-siphon.simplewebservice.iastate\n", 16 | "2. Xarray Dataset: https://docs.xarray.dev/en/stable/generated/xarray.Dataset.html\n", 17 | "3. Matplotlib Pyplot: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.html\n", 18 | "4. Caropy crs: https://scitools.org.uk/cartopy/docs/latest/reference/crs.html\n", 19 | "5. Cartopy Feature: https://scitools.org.uk/cartopy/docs/latest/matplotlib/feature_interface.html\n", 20 | "6. Matplotlib Colors: https://matplotlib.org/stable/gallery/color/named_colors.html\n", 21 | "7. Matplotlib Contour: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.contour.html\n", 22 | "8. Matplotlib Barbs: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.barbs.html\n", 23 | "9. Scipy Gaussian Filter: https://docs.scipy.org/doc/scipy/reference/generated/scipy.ndimage.gaussian_filter.html\n", 24 | "\n", 25 | "\n", 26 | "\n", 27 | "If you have any questions about the code below, feel free to reach out to me at mpvossen@uwm.edu. I am always willing to further explain the code.

\n", 28 | "\n", 29 | "---\n", 30 | "\n", 31 | "
\n", 32 | "1. As usual, we start by importing the modules we need for our Python code." 33 | ] 34 | }, 35 | { 36 | "cell_type": "code", 37 | "execution_count": null, 38 | "id": "9d0778ac-d8ad-43a7-bef6-cf6d5e8a9cb9", 39 | "metadata": {}, 40 | "outputs": [], 41 | "source": [ 42 | "#from the dates and time code(datetime), import the date and time reading capabilities (datetime)\n", 43 | "from datetime import datetime\n", 44 | "\n", 45 | "#from the metpy (metpy) module's ability to read and write files (.io), import the function to add latitude and longitude data for upper-air and surface station data (import add_station_lat_lon)\n", 46 | "from metpy.io import add_station_lat_lon\n", 47 | "\n", 48 | "#from metpy's plotting abilities (metpy.plots), import the abilities to create a station plot (StationPlot) and the sky cover symbols (sky_cover).\n", 49 | "from metpy.plots import StationPlot, sky_cover\n", 50 | "\n", 51 | "#import the module numpy and save it to np\n", 52 | "import numpy as np\n", 53 | "\n", 54 | "#import the cartopy (cartopy) module's coordinate reference system (.crs) and save it to the variable crs\n", 55 | "import cartopy.crs as crs\n", 56 | "\n", 57 | "#import the cartopy (cartopy) module's ability to plot geographic data (.feature) and save it to the variable cfeature \n", 58 | "import cartopy.feature as cfeature\n", 59 | "\n", 60 | "#import the pyplot submodule from the matplotlib module\n", 61 | "import matplotlib.pyplot as plt\n", 62 | "\n", 63 | "#from the scipy module's ndimage submodule, import the function gaussian_filter\n", 64 | "from scipy.ndimage import gaussian_filter\n", 65 | "\n", 66 | "#import the module xarray and save it to xr\n", 67 | "import xarray as xr\n", 68 | "\n", 69 | "#from the siphon module's (from siphon) web data downloader (.simplewebservice) specifically for Iowa State (.iastate) data, import the upper air downloader (import IAStateUpperAir)\n", 70 | "from siphon.simplewebservice.iastate import IAStateUpperAir\n", 71 | "\n", 72 | "#import the sys module to tap into the underlying Linux operating system\n", 73 | "import sys\n", 74 | "\n", 75 | "#One of our modules is not in the standard location, so we need to define the location of the Extra_Modules directory where there are some homemade modules.\n", 76 | "sys.path.insert(0, '/srv/data/shared_notebooks/Synoptic1-AtmSci360/Extra_Modules/')\n", 77 | "\n", 78 | "#from the module min_max (available through the module path defined above), import the function plot_maxmin_points\n", 79 | "from min_max import plot_maxmin_points" 80 | ] 81 | }, 82 | { 83 | "cell_type": "markdown", 84 | "id": "1804491f-e6bf-498b-ad49-b553d443f7e8", 85 | "metadata": {}, 86 | "source": [ 87 | "

\n", 88 | "2. We need to set up two variables, one for the time at which we want data and the other for the isobaric level at which we want data (in hPa). In the section below, define a datetime object for August 30th, 2021 1200 UTC and a variable containing the desired isobaric level (here, 500 hPa)." 89 | ] 90 | }, 91 | { 92 | "cell_type": "code", 93 | "execution_count": null, 94 | "id": "20eadfcb-37fc-4c2a-a70b-16b5d0fdbdf6", 95 | "metadata": {}, 96 | "outputs": [], 97 | "source": [ 98 | "map_time = \n", 99 | "level = " 100 | ] 101 | }, 102 | { 103 | "cell_type": "markdown", 104 | "id": "a26e97da-bc57-4c41-950a-e0f673826570", 105 | "metadata": {}, 106 | "source": [ 107 | "

\n", 108 | "3. We again process the data in a function since we will be creating multiple plots. Watch out in the comments for areas that you need to fill in or complete." 109 | ] 110 | }, 111 | { 112 | "cell_type": "code", 113 | "execution_count": null, 114 | "id": "30529a5e-5e13-4f55-aef0-f561b847f313", 115 | "metadata": {}, 116 | "outputs": [], 117 | "source": [ 118 | "\"\"\"\n", 119 | "Below I define a function to retrieve and process upper-air data. This function downloads rawinsonde data, opens the corresponding\n", 120 | "GFS analysis data, retains only the desired pressure level, and removes missing values from the rawinsonde data.\n", 121 | "\n", 122 | "\n", 123 | "INPUT:\n", 124 | " level : INTEGER\n", 125 | " The level in hPa at which you want upper-air data.\n", 126 | " time : DATETIME\n", 127 | " The time at which you would like upper-air data.\n", 128 | " \n", 129 | "OUTPUT:\n", 130 | " obs_data : PANDAS DATAFRAME\n", 131 | " Dataframe containing your processed rawinsonde data\n", 132 | " leveled_data : XARRAY DATASET\n", 133 | " The xarray containing your processed GFS analysis data\n", 134 | "\n", 135 | "\"\"\"\n", 136 | "\n", 137 | "def process_upper_air_data(level, time):\n", 138 | " \"\"\"\n", 139 | " Specify the location of the upper-air data on the JupyterHub.\n", 140 | " \"\"\"\n", 141 | " lab_data_loc = \"/data/AtmSci360/Lab_5/\"\n", 142 | " \n", 143 | " \"\"\"\n", 144 | " Open the GFS data using xarray. Since the data are once again GRIB-formatted data, we can use xarray the same way we did in Lab 4.\n", 145 | " \"\"\"\n", 146 | " model_data = xr.open_dataset(f\"{lab_data_loc}{time:%m%d%y_%H}_gfs.grib2\", engine='cfgrib', filter_by_keys={'typeOfLevel': 'isobaricInhPa'})\n", 147 | " \n", 148 | " \"\"\"\n", 149 | " We only want data at a single isobaric level. Limit the data in the xarray to only this specified level.\n", 150 | " \"\"\"\n", 151 | " leveled_data =\n", 152 | " \n", 153 | " \"\"\"\n", 154 | " We also need upper-air observations, so let's download rawinsonde data from Iowa State for the single specified level, as we did in Lab 2.\n", 155 | " \"\"\"\n", 156 | " obs_data = IAStateUpperAir.request_all_data(map_time, level)\n", 157 | " \n", 158 | " \"\"\"\n", 159 | " Like in Lab 2, the downloaded Iowa State rawinsonde data do not have latitude and longitude information for each station. \n", 160 | " The line below adds the latitude and longitude to each station using MetPy's add_stations_lat_lon function.\n", 161 | " \"\"\"\n", 162 | " obs_data = add_station_lat_lon(obs_data)\n", 163 | " \n", 164 | " \n", 165 | " \"\"\"\n", 166 | " The MetPy add_station_lat_lon does not have every station, however, and so some stations are missing their latitude and \n", 167 | " longitude values. The command below drops all stations that do not have a latitude and/or longitude value.\n", 168 | " \"\"\"\n", 169 | " obs_data = obs_data.dropna(how='any', subset=['latitude', 'longitude'])\n", 170 | " \n", 171 | " \n", 172 | " \"\"\"\n", 173 | " This line returns the observation data and model data after the function ends.\n", 174 | " \"\"\"\n", 175 | " return obs_data, leveled_data" 176 | ] 177 | }, 178 | { 179 | "cell_type": "markdown", 180 | "id": "f33389dc-2f95-4b85-b234-00737f2569aa", 181 | "metadata": {}, 182 | "source": [ 183 | "

\n", 184 | "4. In the cell below, call the function we created to process data using the level and time you specified earlier in this lab." 185 | ] 186 | }, 187 | { 188 | "cell_type": "code", 189 | "execution_count": null, 190 | "id": "3fd2f7c7-136e-47c0-b7b1-af14254d7c38", 191 | "metadata": {}, 192 | "outputs": [], 193 | "source": [] 194 | }, 195 | { 196 | "cell_type": "markdown", 197 | "id": "81c969bd-fe40-4a79-84c4-e20d3631181a", 198 | "metadata": {}, 199 | "source": [ 200 | "

\n", 201 | "5. Since we are making multiple plots, we also need to create a plot function. Again, watch out in the comments for areas that you need to fill in or complete." 202 | ] 203 | }, 204 | { 205 | "cell_type": "code", 206 | "execution_count": null, 207 | "id": "0a42d3b5-8cde-45b8-b795-dbdc83a04ce0", 208 | "metadata": {}, 209 | "outputs": [], 210 | "source": [ 211 | "\"\"\"\n", 212 | "Below I define a function to plot upper-air data. This function creates station-model plots for the observations and\n", 213 | "uses GFS data to draw geopotential height isopleths.\n", 214 | "\n", 215 | "\n", 216 | "INPUT:\n", 217 | " obs_data : PANDAS DATAFRAME\n", 218 | " The dataframe containg your rawinsonde data\n", 219 | " model_data : XARRAY DATASET\n", 220 | " The xarray dataset containing your GFS analysis.\n", 221 | " date : DATETIME\n", 222 | " The time at which your upper-air data are valid.\n", 223 | " level : INTEGER\n", 224 | " The level in hPa at which you want upper-air data.\n", 225 | " \n", 226 | "\"\"\"\n", 227 | "\n", 228 | "\n", 229 | "def plot_upper_air(obs_data, model_data, date, level):\n", 230 | " \n", 231 | " \"\"\"\n", 232 | " Setup a Lambert Conformal Conic Projection centered at 35°N and 95°W. Have the cone of the Lambert Conformal Conic projection intersect the Earth at 27.5°N and 42.5°N.\n", 233 | " \"\"\"\n", 234 | " proj = \n", 235 | " \n", 236 | " \n", 237 | " \n", 238 | " \"\"\"\n", 239 | " Create a figure with a size of 1000px x 1500px and a resolution of 300 dots per inch,\n", 240 | " then set up an axes object (named ax) using the projection we previously defined for our map.\n", 241 | " \"\"\"\n", 242 | "\n", 243 | " \n", 244 | " \n", 245 | " \"\"\"\n", 246 | " Add the appropriate amount of geographic data. Be sure you follow \"good map\" suggestions with the geographic data styling.\n", 247 | " \"\"\"\n", 248 | "\n", 249 | " \n", 250 | " \n", 251 | " \"\"\"\n", 252 | " Limit the map to between 125°-65°W longitude and 23°-60°N latitude.\n", 253 | " \"\"\"\n", 254 | "\n", 255 | " \n", 256 | " \"\"\"\n", 257 | " Setup the station plots for the upper-air observations.\n", 258 | " \"\"\"\n", 259 | "\n", 260 | " \n", 261 | " \"\"\"\n", 262 | " Add temperature, dewpoint, wind, and geopotential height (in decameters) in the appropriate locations with appropriate formatting. \n", 263 | " Make sure that your data displayed in your station plot are clear and easy to read.\n", 264 | " \"\"\"\n", 265 | "\n", 266 | "\n", 267 | " \n", 268 | "\n", 269 | " \n", 270 | " \"\"\"\n", 271 | " Like in Lab 2, we need to make the center sky cover symbol an open circle. MetPy's sky_cover function produces this symbol when the sky cover value\n", 272 | " is zero. Here, I create a list of zeros, one for each station, and then I pass it into the plot_symbol function to get\n", 273 | " the open circle center symbol.\n", 274 | " \"\"\"\n", 275 | " num_of_stations = len(obs_data[\"latitude\"].values)\n", 276 | "\n", 277 | " zeros = np.zeros(num_of_stations, dtype=int)\n", 278 | " \n", 279 | " stationplot.plot_symbol('C', zeros, sky_cover)\n", 280 | " \n", 281 | "\n", 282 | "\n", 283 | " \n", 284 | " \"\"\"\n", 285 | " Switching to the GFS analysis, first we want to smooth the geopotential height data to make them easier to analyze. \n", 286 | " Setup the Gaussian filter for the geopotential heights ('gh'). 
Choose an appropriate smoothing value for these data.\n", 287 | "    \"\"\"\n", 288 | "    smooth_heights = \n", 289 | "    \n", 290 | "    \n", 291 | "    \"\"\"\n", 292 | "    We can now isopleth the smoothed geopotential height data. In the statement below, add styling information to make the contours easier to read.\n", 293 | "    \"\"\"\n", 294 | "    cont_p = plt.contour(model_data[\"longitude\"].values, model_data[\"latitude\"].values, smooth_heights, np.arange(0,10000,60), transform=crs.PlateCarree())\n", 295 | "    \n", 296 | "    \n", 297 | "    \"\"\"\n", 298 | "    Every contour plot needs labels for each contour, so we add contour labels below. Add styling information to make the contour labels easier to read.\n", 299 | "    \"\"\"\n", 300 | "    ax.clabel(cont_p, cont_p.levels, fmt=lambda v: format(v, '.0f'))\n", 301 | "    \n", 302 | "\n", 303 | "    \"\"\"\n", 304 | "    Let's add symbols denoting minima (lows) and maxima (highs) in the gridded geopotential height data.\n", 305 | "    The function below takes our gridded data, finds the minimum and maximum points, and then plots\n", 306 | "    the appropriate symbol with the value.\n", 307 | "    \n", 308 | "    The first argument is our axes variable.\n", 309 | "    The second and third arguments are our longitude and latitude variables, respectively.\n", 310 | "    The fourth argument is our gridded geopotential height data.\n", 311 | "    The fifth argument says if we are looking for lows (min) or highs (max).\n", 312 | "    The sixth argument says how much of a spatial buffer to use when searching for highs and lows. The function will be more sensitive to smaller-\n", 313 | "    scale highs and lows (thus leading to more highs and lows being plotted) when this value is smaller. The opposite is true when this value is larger.\n", 314 | "    You will have to experiment with different values to find one that displays the appropriate number of highs and lows (i.e., reflecting synoptic-scale\n", 315 | "    features) rather than every high and low point in the data.\n", 316 | "    The seventh argument is the size of the high or low symbols.\n", 317 | "    The final argument is the data's coordinate system.\n", 318 | "    \"\"\" \n", 319 | "    plot_maxmin_points(ax, model_data[\"longitude\"].values, model_data[\"latitude\"].values, smooth_heights, \"min\", 65, textsize=20, transform=crs.PlateCarree())\n", 320 | "    plot_maxmin_points(ax, model_data[\"longitude\"].values, model_data[\"latitude\"].values, smooth_heights, \"max\", 65, textsize=20, transform=crs.PlateCarree())\n", 321 | "    \n", 322 | "    \n", 323 | "    \"\"\"\n", 324 | "    Finally, add an appropriate title for the map that shows what is plotted and the time at which the map is valid.\n", 325 | "    \"\"\"\n", 326 | "    \n", 327 | "\n" 328 | ] 329 | }, 330 | { 331 | "cell_type": "markdown", 332 | "id": "9e9b1172-48f8-49cc-b2a5-2ddfaea70f26", 333 | "metadata": {}, 334 | "source": [ 335 | "
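The map-setup blanks in `plot_upper_air` follow the same Cartopy pattern used throughout these labs. The sketch below is one possibility rather than the required styling: the figure size is one reading of "1000px x 1500px" (matplotlib's `figsize` is in inches, i.e. pixels divided by dpi), the 50m Natural Earth features and gray linework are illustrative choices, and the `sigma` value for the Gaussian smoother is a tunable assumption.

```python
import cartopy.crs as crs
import cartopy.feature as cfeature
import matplotlib.pyplot as plt

# Lambert conformal conic projection centered at 35N, 95W, with the cone
# intersecting the Earth at 27.5N and 42.5N
proj = crs.LambertConformal(central_longitude=-95, central_latitude=35,
                            standard_parallels=(27.5, 42.5))

# figsize is in inches, so pixels / dpi at the requested 300 dpi
fig = plt.figure(figsize=(1500 / 300, 1000 / 300), dpi=300)
ax = plt.subplot(1, 1, 1, projection=proj)

# thin, gray geographic context so the station plots and contours stand out
ax.add_feature(cfeature.COASTLINE.with_scale("50m"), linewidth=0.5, edgecolor="gray")
ax.add_feature(cfeature.STATES.with_scale("50m"), linewidth=0.25, edgecolor="gray")

# limit the map to 125-65W and 23-60N
ax.set_extent([-125, -65, 23, 60], crs=crs.PlateCarree())

# inside the function, the smoothed height field could then be built with, e.g.
# smooth_heights = gaussian_filter(model_data["gh"].values, sigma=2)
```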

\n", 336 | "6. Finally, re-run the two functions we created to create a plot valid at 1200 UTC on July 28th, 2022." 337 | ] 338 | }, 339 | { 340 | "cell_type": "code", 341 | "execution_count": null, 342 | "id": "0a2e15e2-c8d1-4100-b3df-45cc125e12eb", 343 | "metadata": {}, 344 | "outputs": [], 345 | "source": [] 346 | }, 347 | { 348 | "cell_type": "markdown", 349 | "id": "ae52ad7a-84e1-4540-b7dc-6c24150d2b21", 350 | "metadata": {}, 351 | "source": [ 352 | "### You have now completed the Python portion of the lab. Be sure to submit the fully rendered Jupyter Notebook on GitHub when you are finished." 353 | ] 354 | } 355 | ], 356 | "metadata": { 357 | "kernelspec": { 358 | "display_name": "Python 3 (ipykernel)", 359 | "language": "python", 360 | "name": "python3" 361 | }, 362 | "language_info": { 363 | "codemirror_mode": { 364 | "name": "ipython", 365 | "version": 3 366 | }, 367 | "file_extension": ".py", 368 | "mimetype": "text/x-python", 369 | "name": "python", 370 | "nbconvert_exporter": "python", 371 | "pygments_lexer": "ipython3", 372 | "version": "3.7.12" 373 | } 374 | }, 375 | "nbformat": 4, 376 | "nbformat_minor": 5 377 | } 378 | -------------------------------------------------------------------------------- /Lab 6/Lab 6.docx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/clarkevanswx/SynopticMet1/a6032312c07cb8a42daaea6bbb0f574d0d5c7b85/Lab 6/Lab 6.docx -------------------------------------------------------------------------------- /Lab 6/Lab_6_PT_1.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "8793bef3-73e1-4667-a64d-ef6f2d7b2d4f", 6 | "metadata": {}, 7 | "source": [ 8 | "## Lab 6 Part I: Plotting Contoured Upper-Air Data at 500 hPa\n", 9 | "

\n", 10 | "\n", 11 | "In this tutorial, we are going to plot station data (observations) at 500 hPa and GFS model-derived 500 hPa geopotential height isopleths so that we can estimate the geostrophic wind speed at three specified locations. This plot will be similar to what we made in Part II of Lab 5.\n", 12 | "
\n", 13 | "### Module Documentation\n", 14 | "\n", 15 | "1. Siphon IAStateUpperAir: https://unidata.github.io/siphon/latest/api/simplewebservice.html#module-siphon.simplewebservice.iastate\n", 16 | "2. Xarray Dataset: https://docs.xarray.dev/en/stable/generated/xarray.Dataset.html\n", 17 | "3. Matplotlib Pyplot: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.html\n", 18 | "4. Caropy crs: https://scitools.org.uk/cartopy/docs/latest/reference/crs.html\n", 19 | "5. Cartopy Feature: https://scitools.org.uk/cartopy/docs/latest/matplotlib/feature_interface.html\n", 20 | "6. Matplotlib Colors: https://matplotlib.org/stable/gallery/color/named_colors.html\n", 21 | "7. Matplotlib Contour: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.contour.html\n", 22 | "8. Matplotlib Barbs: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.barbs.html\n", 23 | "9. Scipy Gaussian Filter: https://docs.scipy.org/doc/scipy/reference/generated/scipy.ndimage.gaussian_filter.html\n", 24 | "\n", 25 | "\n", 26 | "\n", 27 | "If you have any questions about the code below, feel free to reach out to me at mpvossen@uwm.edu. I am always willing to further explain the code.

\n", 28 | "\n", 29 | "---\n", 30 | "\n", 31 | "
\n", 32 | "1. As usual, we start by importing the modules we need for our Python code." 33 | ] 34 | }, 35 | { 36 | "cell_type": "code", 37 | "execution_count": null, 38 | "id": "9d427218-fd73-4914-866f-4e93c34b581a", 39 | "metadata": {}, 40 | "outputs": [], 41 | "source": [ 42 | "#from the dates and time code(datetime), import the date and time reading capabilities (datetime)\n", 43 | "from datetime import datetime\n", 44 | "\n", 45 | "#from the metpy (metpy) module's ability to read and write files (.io), import the function to add latitude and longitude data for upper-air and surface station data (import add_station_lat_lon)\n", 46 | "from metpy.io import add_station_lat_lon\n", 47 | "\n", 48 | "#from metpy's plotting abilities (metpy.plots), import the abilities to create a station plot (StationPlot) and the sky cover symbols (sky_cover)\n", 49 | "from metpy.plots import StationPlot, sky_cover\n", 50 | "\n", 51 | "#import the module numpy and save it to np\n", 52 | "import numpy as np\n", 53 | "\n", 54 | "#import the cartopy (cartopy) module's coordinate reference system (.crs) and save it to the variable crs\n", 55 | "import cartopy.crs as crs\n", 56 | "\n", 57 | "#import the cartopy (cartopy) module's ability to plot geographic data (.feature) and save it to the variable cfeature \n", 58 | "import cartopy.feature as cfeature\n", 59 | "\n", 60 | "#import the pyplot submodule from the matplotlib module\n", 61 | "import matplotlib.pyplot as plt\n", 62 | "\n", 63 | "#from the scipy module's ndimage submodule, import the function gaussian_filter\n", 64 | "from scipy.ndimage import gaussian_filter\n", 65 | "\n", 66 | "#import the module xarray and save it to xr\n", 67 | "import xarray as xr\n", 68 | "\n", 69 | "#from the siphon module's (from siphon) web data downloader (.simplewebservice) specifically for Iowa State (.iastate) data, import the upper air downloader (import IAStateUpperAir)\n", 70 | "from siphon.simplewebservice.iastate import IAStateUpperAir\n", 71 | "\n", 72 | "#from the metpy submodule units import the units function\n", 73 | "from metpy.units import units\n", 74 | "\n", 75 | "#import the calculation submodule form the metpy module and save it to the variable calc\n", 76 | "import metpy.calc as calc" 77 | ] 78 | }, 79 | { 80 | "cell_type": "markdown", 81 | "id": "804b5088-0de7-48cc-a73e-76aa0f45cddc", 82 | "metadata": {}, 83 | "source": [ 84 | "

\n", 85 | "2. For this part of the lab we are going to be looking at 1200 UTC March 12, 2022 at 500 hPa. Create variables for this time and for the desired isobaric level (in hPa)." 86 | ] 87 | }, 88 | { 89 | "cell_type": "code", 90 | "execution_count": null, 91 | "id": "955034a0-cbe5-41e4-bf70-46ca6538f915", 92 | "metadata": {}, 93 | "outputs": [], 94 | "source": [] 95 | }, 96 | { 97 | "cell_type": "markdown", 98 | "id": "e2d6a251-3b61-4eab-a808-fb592b57aa4a", 99 | "metadata": {}, 100 | "source": [ 101 | "

\n", 102 | "3. Since it is good habit, we are going to once again process our data in a function. In the function below we will download the upper-air observations and GFS data. The function below will also process the GFS data so that it is ready to plot. Watch out in the comments for areas that you need to fill in." 103 | ] 104 | }, 105 | { 106 | "cell_type": "code", 107 | "execution_count": null, 108 | "id": "f5374562-274e-4316-88ea-4007af1f8626", 109 | "metadata": {}, 110 | "outputs": [], 111 | "source": [ 112 | "\"\"\"\n", 113 | "Below I define a function to retrieve and process upper-air data. This function downloads rawinsonde data, opens the corresponding\n", 114 | "GFS analysis data, retains only the desired pressure level, and removes missing values from the rawinsonde data.\n", 115 | "\n", 116 | "\n", 117 | "INPUT:\n", 118 | " level : INTEGER\n", 119 | " The level in hPa at which you want upper-air data.\n", 120 | " time : DATETIME\n", 121 | " The time at which you would like upper-air data.\n", 122 | " \n", 123 | "OUTPUT:\n", 124 | " obs_data : PANDAS DATAFRAME\n", 125 | " Dataframe containing your processed rawinsonde data\n", 126 | " leveled_data : XARRAY DATASET\n", 127 | " The xarray containing your processed GFS analysis data\n", 128 | "\n", 129 | "\"\"\"\n", 130 | "\n", 131 | "def process_upper_air_data(level, time):\n", 132 | " \"\"\"\n", 133 | " Specify the location of the upper-air data on the JupyterHub.\n", 134 | " \"\"\"\n", 135 | " lab_data_loc = \"/data/AtmSci360/Lab_6/\"\n", 136 | " \n", 137 | " \"\"\"\n", 138 | " Open the GFS data using xarray. Since the data are once again GRIB-formatted data, we can use xarray the same way we did in Lab 4. Be sure to add filter_by_keys={'typeOfLevel': 'isobaricInhPa'}.\n", 139 | " \"\"\"\n", 140 | " model_data =\n", 141 | " \n", 142 | " \"\"\"\n", 143 | " We only want data at a single isobaric level. Limit the data in the xarray to only this specified level.\n", 144 | " \"\"\"\n", 145 | " leveled_data = \n", 146 | " \n", 147 | " \"\"\"\n", 148 | " We also need upper-air observations, so let's download rawinsonde data from Iowa State for the single specified level, as we did in Lab 2.\n", 149 | " \"\"\"\n", 150 | " obs_data =\n", 151 | " \n", 152 | " \"\"\"\n", 153 | " Add the latitude and longitude for each station.\n", 154 | " \"\"\"\n", 155 | " obs_data = \n", 156 | " \n", 157 | " \"\"\"\n", 158 | " Here I set the index value to the station name. This allows us to do obs_data[\"variable\"].loc[\"KGRB\"] to get Green Bay's data.\n", 159 | " \"\"\"\n", 160 | " obs_data = obs_data.set_index(\"station\")\n", 161 | " \n", 162 | " \"\"\"\n", 163 | " The MetPy add_station_lat_lon does not have every station, however, and so some stations are missing their latitude and \n", 164 | " longitude values. In the line below, add the command to remove stations with missing latitude and longitude values.\n", 165 | " \"\"\"\n", 166 | " obs_data = \n", 167 | " \n", 168 | " \"\"\"\n", 169 | " This line returns the observation data and model data after the function ends.\n", 170 | " \"\"\"\n", 171 | " return obs_data, leveled_data\n" 172 | ] 173 | }, 174 | { 175 | "cell_type": "markdown", 176 | "id": "54053936-3a5c-4456-a02b-1716658ce907", 177 | "metadata": {}, 178 | "source": [ 179 | "

\n", 180 | "4. In the cell below, run the function we created and save the output to two appropriately named variables." 181 | ] 182 | }, 183 | { 184 | "cell_type": "code", 185 | "execution_count": null, 186 | "id": "8b771c72-b053-4a38-9dc6-d7bd35510875", 187 | "metadata": {}, 188 | "outputs": [], 189 | "source": [] 190 | }, 191 | { 192 | "cell_type": "markdown", 193 | "id": "e1939206-6708-4a83-aaa6-5dcc97a354a5", 194 | "metadata": {}, 195 | "source": [ 196 | "

\n", 197 | "5. Now that the data are processed, we are ready to create our plot in the function below. Watch out in the comments for areas you need to fill in." 198 | ] 199 | }, 200 | { 201 | "cell_type": "code", 202 | "execution_count": null, 203 | "id": "36546d8c-7b4a-4c2e-9b4e-578e49f52722", 204 | "metadata": {}, 205 | "outputs": [], 206 | "source": [ 207 | "\"\"\"\n", 208 | "Below, I define a function to plot upper-air data. This function creates station-model plots for the observations and\n", 209 | "uses GFS data to draw geopotential height isopleths.\n", 210 | "\n", 211 | "\n", 212 | "INPUT:\n", 213 | " obs_data : PANDAS DATAFRAME\n", 214 | " The dataframe containing your rawinsonde data\n", 215 | " model_data : XARRAY DATASET\n", 216 | " The xarray dataset containing your GFS analysis.\n", 217 | " date : DATETIME\n", 218 | " The time at which your upper-air data are valid.\n", 219 | " level : INTEGER\n", 220 | " The level in hPa at which you want upper-air data.\n", 221 | " \n", 222 | "\"\"\"\n", 223 | "\n", 224 | "def plot_obs_upper_air(obs_data, model_data, date, level):\n", 225 | " \n", 226 | "\n", 227 | " \"\"\"\n", 228 | " Setup a Lambert Conic Conformal Projection centered at 35°N and 95°W. Have the cone of the Lambert Conic Conformal projection intersect the Earth at 27.5°N and 42.5°N.\n", 229 | " \"\"\"\n", 230 | " \n", 231 | " \n", 232 | " \"\"\"\n", 233 | " Create a figure with a size of 1150px x 800px and a resolution of 300 dots per inch,\n", 234 | " then set up an axes object (named ax) using the projection we previously defined for our map.\n", 235 | " \"\"\"\n", 236 | "\n", 237 | " \n", 238 | " \"\"\"\n", 239 | " Add an appropriate amount of geographic data. Be sure you follow \"good map\" suggestions with the geographic data styling.\n", 240 | " \"\"\"\n", 241 | "\n", 242 | " \n", 243 | " \"\"\"\n", 244 | " Limit the map to between 125°-65°W longitude and 23°-60°N latitude.\n", 245 | " \"\"\"\n", 246 | "\n", 247 | " \n", 248 | " \"\"\"\n", 249 | " Setup the station plots for the upper-air observations.\n", 250 | " \"\"\"\n", 251 | "\n", 252 | " \n", 253 | " \"\"\"\n", 254 | " Add temperature, dewpoint, wind, and geopotential height (in decameters) in the appropriate locations with appropriate formatting. \n", 255 | " Make sure that your data displayed in your station plot are clear and easy to read.\n", 256 | " \"\"\"\n", 257 | "\n", 258 | " \n", 259 | " \"\"\"\n", 260 | " Like in Lab 2, we need to make the center sky cover symbol an open circle. MetPy's sky_cover function produces this symbol when the sky cover value\n", 261 | " is zero. Here, I create a list of zeros, one for each station, and then I pass it into the plot_symbol function to get\n", 262 | " the open circle center symbol.\n", 263 | " \"\"\"\n", 264 | " num_of_stations = len(obs_data[\"latitude\"].values)\n", 265 | "\n", 266 | " zeros = np.zeros(num_of_stations, dtype=int)\n", 267 | " \n", 268 | " stationplot.plot_symbol('C', zeros, sky_cover)\n", 269 | "\n", 270 | " \"\"\"\n", 271 | " Switching to the GFS analysis, first we want to smooth the geopotential height data to make them easier to analyze. \n", 272 | " Setup the Gaussian filter for the geopotential heights ('gh'). Choose an appropriate smoothing value for these data.\n", 273 | " \"\"\"\n", 274 | " smooth_heights = \n", 275 | " \n", 276 | "\n", 277 | " \"\"\"\n", 278 | " We can now isopleth the smoothed geopotential height data. 
In the line below, add the command to isopleth the smoothed \n", 279 | " geopotential heights and add styling information to make the contours easier to read.\n", 280 | " \"\"\"\n", 281 | " cont_p = \n", 282 | " \n", 283 | " \n", 284 | " \"\"\"\n", 285 | " Every contour plot needs labels for each contour, so we add contour labels below. Add styling information to make the contour labels easier to read.\n", 286 | " \"\"\"\n", 287 | " ax.clabel(cont_p, cont_p.levels, fmt=lambda v: format(v, '.0f'))\n", 288 | " \n", 289 | " \n", 290 | " \n", 291 | " \"\"\"\n", 292 | " Here I add latitude and longitude gridlines for reference.\n", 293 | " Argument one determines if the labels (e.g. 45°N) appear within the plot.\n", 294 | " Argument two determines if the labels are decimal degrees or degrees minutes seconds. True is degreees mintues seconds\n", 295 | " and false is decimal degrees.\n", 296 | " Argument three determines if the longitude value labels are plotted on the line inside the plot or outside. True is\n", 297 | " inside the plot and false is outside the plot.\n", 298 | " Argument four determines if the latitude value labels are plotted on the line inside the plot or outside. True is\n", 299 | " inside the plot and false is outside the plot.\n", 300 | " The last three arguments are normal matplotlib styling arguments.\n", 301 | " \n", 302 | " \"\"\"\n", 303 | " ax.gridlines(draw_labels=True, dms=True, x_inline=False, y_inline=False, color=\"k\", linestyle=\"--\", alpha=0.5)\n", 304 | " \n", 305 | " \n", 306 | " \"\"\"\n", 307 | " You do not need to worry about the next three lines. These lines add an A, B, and C to specific locations that I reference in a question on the lab.\n", 308 | " \"\"\"\n", 309 | " ax.text(obs_data[\"longitude\"].loc[\"KSHV\"],obs_data[\"latitude\"].loc[\"KSHV\"]+0.5, \"A\", weight='bold', size=10, transform=crs.PlateCarree())\n", 310 | " ax.text(obs_data[\"longitude\"].loc[\"KDDC\"],obs_data[\"latitude\"].loc[\"KDDC\"]+0.5, \"B\", weight='bold', size=10, transform=crs.PlateCarree())\n", 311 | " ax.text(obs_data[\"longitude\"].loc[\"KOTX\"],obs_data[\"latitude\"].loc[\"KOTX\"]+0.5, \"C\", weight='bold', size=10, transform=crs.PlateCarree())\n", 312 | " \n", 313 | " \"\"\"\n", 314 | " Finally, add an appropriate title for the map that shows what is plotted and the time at which the map is valid.\n", 315 | " \"\"\"\n", 316 | " \n", 317 | " \n" 318 | ] 319 | }, 320 | { 321 | "cell_type": "markdown", 322 | "id": "578e58b1-a772-4183-8c64-3e88c427bfaa", 323 | "metadata": {}, 324 | "source": [ 325 | "
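For the `cont_p =` blank in the plotting function above, the contour call itself is the same one the Lab 5 version of this function provides; only the styling is left open. One possible version, with the color, line width, and 60 m interval as illustrative choices:

```python
# Isopleth the smoothed heights every 60 m; styling values are illustrative only
cont_p = plt.contour(model_data["longitude"].values, model_data["latitude"].values,
                     smooth_heights, np.arange(0, 10000, 60),
                     colors="black", linewidths=1.0, transform=crs.PlateCarree())
```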

\n", 326 | "6. Finally, call your plotting function." 327 | ] 328 | }, 329 | { 330 | "cell_type": "code", 331 | "execution_count": null, 332 | "id": "fa946cbb-748e-4233-b123-462405477e11", 333 | "metadata": {}, 334 | "outputs": [], 335 | "source": [] 336 | }, 337 | { 338 | "cell_type": "markdown", 339 | "id": "b293271c-6827-479f-a230-26390a92f3ea", 340 | "metadata": {}, 341 | "source": [ 342 | "### You have now completed Part I of the Python portion of the lab. Be sure to submit the fully rendered Jupyter Notebook on GitHub when you are finished." 343 | ] 344 | } 345 | ], 346 | "metadata": { 347 | "kernelspec": { 348 | "display_name": "Python 3 (ipykernel)", 349 | "language": "python", 350 | "name": "python3" 351 | }, 352 | "language_info": { 353 | "codemirror_mode": { 354 | "name": "ipython", 355 | "version": 3 356 | }, 357 | "file_extension": ".py", 358 | "mimetype": "text/x-python", 359 | "name": "python", 360 | "nbconvert_exporter": "python", 361 | "pygments_lexer": "ipython3", 362 | "version": "3.7.12" 363 | } 364 | }, 365 | "nbformat": 4, 366 | "nbformat_minor": 5 367 | } 368 | -------------------------------------------------------------------------------- /Lab 7/Lab 7.docx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/clarkevanswx/SynopticMet1/a6032312c07cb8a42daaea6bbb0f574d0d5c7b85/Lab 7/Lab 7.docx -------------------------------------------------------------------------------- /Lab 7/Lab_7_PT_1.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "3e3cdb32-bcaa-4356-8dd1-7e103d465b3e", 6 | "metadata": {}, 7 | "source": [ 8 | "## Lab 7 Part I: Plotting Skew-T, ln-p diagrams\n", 9 | "

\n", 10 | "In this tutorial, we are going to create Skew-T, ln-p diagrams at specific locations using observation data to aid in inferring horizontal temperature advection (from thermal-wind principles). Luckily, MetPy has functionality to make plotting Skew-T, ln-p diagrams easy rather than us having to build these diagrams (particularly their skewed axes) from scratch.\n", 11 | "
\n", 12 | "### Module Documentation\n", 13 | "\n", 14 | "1. Matplotlib Pyplot: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.html\n", 15 | "2. Siphon Wyoming Upper Air: https://unidata.github.io/siphon/latest/api/simplewebservice.html#siphon.simplewebservice.wyoming.WyomingUpperAir.request_data\n", 16 | "3. MetPy Skew T: https://unidata.github.io/MetPy/latest/api/generated/metpy.plots.SkewT.html\n", 17 | "\n", 18 | "\n", 19 | "\n", 20 | "\n", 21 | "If you have any questions about the code below, feel free to reach out to me at mpvossen@uwm.edu. I am always willing to further explain the code.

\n", 22 | "\n", 23 | "---\n", 24 | "\n", 25 | "
\n", 26 | "1. As usual, we start by importing the modules we need for our Python code." 27 | ] 28 | }, 29 | { 30 | "cell_type": "code", 31 | "execution_count": null, 32 | "id": "80c2e492-0d91-4e68-afb6-ee0bbb86fe0c", 33 | "metadata": {}, 34 | "outputs": [], 35 | "source": [ 36 | "#from python's date and time module (from datetime) import the ability to work with date and times (import datetime)\n", 37 | "from datetime import datetime\n", 38 | "\n", 39 | "#using the module siphon and its ability to retrieve files from online (.simplewebservice) specifically for the University of Wyoming (.wyoming), \n", 40 | "#import the ability to download from the University of Wyoming's upper-air data archive\n", 41 | "from siphon.simplewebservice.wyoming import WyomingUpperAir\n", 42 | "\n", 43 | "#from metpy's plotting abilities import the SkewT plotting class\n", 44 | "from metpy.plots import SkewT\n", 45 | "\n", 46 | "#import the plotting abilities of the module matplotlib (import matplotlib.pyplot) and save it to plt\n", 47 | "import matplotlib.pyplot as plt\n", 48 | "\n", 49 | "#from the metpy's units feature (metpy.units) import the ability to assign and convert units (units)\n", 50 | "from metpy.units import units" 51 | ] 52 | }, 53 | { 54 | "cell_type": "markdown", 55 | "id": "88de5db7-6322-4e3f-a09b-3d007ac913ed", 56 | "metadata": {}, 57 | "source": [ 58 | "

\n", 59 | "2. Like we have done in past labs, we first need to create variables to define our desired data. For rawinsonde observations, we need two variables: the time of the observations (in datetime format) and the second is the observation site (as a string). In the code block below, define the time variable as sounding_date and set it to 0000 UTC September 9, 2014 and define the site variable as station and set it to DVN." 60 | ] 61 | }, 62 | { 63 | "cell_type": "code", 64 | "execution_count": null, 65 | "id": "af32ce46-361e-4611-ab2b-15e1746b3456", 66 | "metadata": {}, 67 | "outputs": [], 68 | "source": [] 69 | }, 70 | { 71 | "cell_type": "markdown", 72 | "id": "99f47963-cb24-4432-8ead-17f38ac1bf1c", 73 | "metadata": {}, 74 | "source": [ 75 | "

\n", 76 | "3. In the code section below, we take the time and station variables you defined above and pass them into the siphon WyomingUpperAir.request_data function to retrieve the observation for the desired time and location." 77 | ] 78 | }, 79 | { 80 | "cell_type": "code", 81 | "execution_count": null, 82 | "id": "3f4cb031-8786-4904-8e1d-548e61daacab", 83 | "metadata": {}, 84 | "outputs": [], 85 | "source": [ 86 | "upper_air_data = WyomingUpperAir.request_data(sounding_date, station)\n", 87 | "\n", 88 | "upper_air_data" 89 | ] 90 | }, 91 | { 92 | "cell_type": "markdown", 93 | "id": "57a47890-d1a1-456d-b8ca-c79f10942579", 94 | "metadata": {}, 95 | "source": [ 96 | "

\n", 97 | "4. We are now ready to plot. Like in previous labs, we will once again use a function for our plot since we will be plotting three different Skew-T, ln-p diagrams. Watch out in the comments for areas you need to fill in." 98 | ] 99 | }, 100 | { 101 | "cell_type": "code", 102 | "execution_count": null, 103 | "id": "bcc8a9b3-3c57-422a-a6d7-298d2c0003cc", 104 | "metadata": {}, 105 | "outputs": [], 106 | "source": [ 107 | "\"\"\"\n", 108 | "Below a function is defined to plot a Skew-T, ln-p diagram using observation data. This plot contains\n", 109 | "temperature and dewpoint traces.\n", 110 | "\n", 111 | "\n", 112 | "INPUT:\n", 113 | " data : PANDAS DATAFRAME\n", 114 | " The rawinsonde observation data\n", 115 | " station : STRING\n", 116 | " The rawinsonde station that the data is for.\n", 117 | " valid_time : DATETIME\n", 118 | " The time of the rawinsonde observation.\n", 119 | "\"\"\"\n", 120 | "\n", 121 | "\n", 122 | "def plot_skewT(data, station, valid_time):\n", 123 | " \"\"\"\n", 124 | " For the Skew-T, ln-p plots, MetPy requires all inputs into the plot to have units.\n", 125 | " For the variables below add units to the variable using MetPy's units capabilities.\n", 126 | " \"\"\"\n", 127 | " p = data['pressure'].values \n", 128 | " T = data['temperature'].values\n", 129 | " Td = data['dewpoint'].values\n", 130 | " u = data['u_wind'].values \n", 131 | " v = data['v_wind'].values\n", 132 | " \n", 133 | " \"\"\"\n", 134 | " Define a matplotlib figure that is 900px x 900px with a resolution of 300 dots per inch and save it to the variable fig.\n", 135 | " \"\"\"\n", 136 | "\n", 137 | " \n", 138 | " \n", 139 | " \"\"\"\n", 140 | " Here I initialize our Skew-T, ln-p diagram similarly to how we defined our axes in previous labs.\n", 141 | " \n", 142 | " The first argument is your figure variable so MetPy knows where to create the Skew-T, ln-p axes.\n", 143 | " The second argument is how to rotate your temperature axis. The standard Skew-T, ln-p diagram uses a 45° skew.\n", 144 | " \"\"\"\n", 145 | " skew = SkewT(fig, rotation=45)\n", 146 | " \n", 147 | " \n", 148 | " \"\"\"\n", 149 | " Like in previous labs, we want to limit the plot's extent. However, in the case of Skew-T, ln-p diagrams,\n", 150 | " limiting the plot's extent is handled different from before. For the Skew-T, ln-p diagram, we must have\n", 151 | " separate statements for the x and y axes with the minimum value in the first argument and the maximum value in the second\n", 152 | " argument. In the two lines of code below I set the extent of the plot for you.\n", 153 | " \"\"\"\n", 154 | " skew.ax.set_ylim(1000, 100)\n", 155 | " skew.ax.set_xlim(-40, 40)\n", 156 | " \n", 157 | " \n", 158 | " \"\"\"\n", 159 | " We are ready to now start drawing our temperature and dewpoint traces. In the two lines below I have \n", 160 | " blank plot statements for you to fill out: one for the temperature trace and the other for the \n", 161 | " dewpoint trace.\n", 162 | " \n", 163 | " The Skew-T, ln-p plotting function requires 3 arguments in this order:\n", 164 | " First is the height variable. In most cases (including this case), this is pressure.\n", 165 | " Second is the variable (e.g., temperature or dewpoint) you wish to plot.\n", 166 | " Third is the color of the line you are plotting. 
This uses the standard matplotlib colors.\n", 167 | " \n", 168 | " You can add more of the standard matplotlib styling arguments such as linewidth or linestyle by adding additional arguments.\n", 169 | " \"\"\"\n", 170 | " skew.plot()\n", 171 | " skew.plot()\n", 172 | " \n", 173 | " \n", 174 | " \"\"\"\n", 175 | " We need to thin out the wind data since there are too many observations to legibly plot.\n", 176 | " In the code below, set up a slice for the wind barbs (like in Labs 5 and 6) and have it skip\n", 177 | " every 3 observations.\n", 178 | " \"\"\"\n", 179 | " wind_slice = \n", 180 | " \n", 181 | " \n", 182 | " \"\"\"\n", 183 | " In the line below, I plot the thinned wind barbs. The wind barb arguments are set up similar to the plot arguments.\n", 184 | " \n", 185 | " The first argument is your vertical level, typically pressure.\n", 186 | " The second and third arguments are your u- and v-wind components, respectively.\n", 187 | " The fourth argument is the barbs' length. Larger values result in longer barbs.\n", 188 | " The final argument is the x location relative to the skew-T, ln-p diagram's axis (where 1 is the far right edge).\n", 189 | " Higher values will move your wind barbs to the right and smaller values will move your wind barbs to the left.\n", 190 | " \n", 191 | " You can also add more of the standard matplotlib wind barb styling arguments by adding additional arguments.\n", 192 | " \"\"\"\n", 193 | " skew.plot_barbs(p[wind_slice], u[wind_slice], v[wind_slice], length=5, xloc=1.05)\n", 194 | "\n", 195 | "\n", 196 | " \"\"\"\n", 197 | " Since we are not plotting geographic data, we need to provide labels for the x and y axes.\n", 198 | " In the two lines below I label these axes appropriately.\n", 199 | " \"\"\"\n", 200 | " skew.ax.set_xlabel(\"Temperature ($\\degree$C)\")\n", 201 | " skew.ax.set_ylabel(\"Pressure (hPa)\")\n", 202 | " \n", 203 | " \n", 204 | " \"\"\"\n", 205 | " Finally, like all plots we have created this semester, this plot needs a title. Create an appropriate title for this plot.\n", 206 | " \"\"\"\n", 207 | " \n", 208 | " \n", 209 | "\n" 210 | ] 211 | }, 212 | { 213 | "cell_type": "markdown", 214 | "id": "8234b848-c018-4067-8a55-f2404ee5db22", 215 | "metadata": {}, 216 | "source": [ 217 | "
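Two pieces of the `plot_skewT` function above deserve a sketch: attaching units and thinning the wind barbs. The units shown are the ones the Wyoming archive normally reports (hPa, °C, knots); siphon records them in the dataframe's `units` attribute, so they can be checked there if in doubt. The every-third-observation slice matches the instruction in the comment.

```python
# Attach units so MetPy's SkewT methods receive unit-aware arrays
p = data["pressure"].values * units.hPa
T = data["temperature"].values * units.degC
Td = data["dewpoint"].values * units.degC
u = data["u_wind"].values * units.knots
v = data["v_wind"].values * units.knots

# keep every third level so the wind barbs stay legible
wind_slice = slice(None, None, 3)
```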

\n", 218 | "5. Call the plotting function you just created." 219 | ] 220 | }, 221 | { 222 | "cell_type": "code", 223 | "execution_count": null, 224 | "id": "9a13e55c-bad0-41e3-8a93-d2886879dc8a", 225 | "metadata": {}, 226 | "outputs": [], 227 | "source": [] 228 | }, 229 | { 230 | "cell_type": "markdown", 231 | "id": "af39306b-f94a-4ed1-af52-ad4d0e37efe8", 232 | "metadata": {}, 233 | "source": [ 234 | "

\n", 235 | "6. Re-run your plotting function for site OAX at 0000 UTC on December 15, 2008." 236 | ] 237 | }, 238 | { 239 | "cell_type": "code", 240 | "execution_count": null, 241 | "id": "16e955e9-24c8-4798-8cf1-744fd99d0f20", 242 | "metadata": {}, 243 | "outputs": [], 244 | "source": [] 245 | }, 246 | { 247 | "cell_type": "markdown", 248 | "id": "9afea654-815d-4aae-9e1d-9fe13f171f58", 249 | "metadata": {}, 250 | "source": [ 251 | "

\n", 252 | "7. Re-run your plotting function for the site RAP at 0000 UTC on February 19, 2022." 253 | ] 254 | }, 255 | { 256 | "cell_type": "code", 257 | "execution_count": null, 258 | "id": "acfcb7ca-44d0-4e95-8d3a-86132d0f7e5a", 259 | "metadata": {}, 260 | "outputs": [], 261 | "source": [] 262 | }, 263 | { 264 | "cell_type": "markdown", 265 | "id": "3ef724af-8c2e-4f19-848b-9218dca03c25", 266 | "metadata": {}, 267 | "source": [ 268 | "### You have now completed Part I of the Python portion of the lab. Be sure to submit the fully rendered Jupyter Notebook on GitHub when you are finished." 269 | ] 270 | } 271 | ], 272 | "metadata": { 273 | "kernelspec": { 274 | "display_name": "Python 3 (ipykernel)", 275 | "language": "python", 276 | "name": "python3" 277 | }, 278 | "language_info": { 279 | "codemirror_mode": { 280 | "name": "ipython", 281 | "version": 3 282 | }, 283 | "file_extension": ".py", 284 | "mimetype": "text/x-python", 285 | "name": "python", 286 | "nbconvert_exporter": "python", 287 | "pygments_lexer": "ipython3", 288 | "version": "3.7.12" 289 | } 290 | }, 291 | "nbformat": 4, 292 | "nbformat_minor": 5 293 | } 294 | -------------------------------------------------------------------------------- /Lab 7/Lab_7_PT_2.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "82421a79-341c-4955-a7b6-3f27aeb1571d", 6 | "metadata": {}, 7 | "source": [ 8 | "## Lab 7 Part II: Plotting Gridded Data to Evaluate Horizontal Temperature Advection\n", 9 | "

\n", 10 | "In this part of the tutorial, we are going to contour GFS-analyzed 850 hPa temperature and geopotential height data to help assess horizontal temperature advection. \n", 11 | "
\n", 12 | "### Module Documentation\n", 13 | "1. Xarray Dataset: https://docs.xarray.dev/en/stable/generated/xarray.Dataset.html\n", 14 | "2. Xarray with MetPy: https://unidata.github.io/MetPy/latest/tutorials/xarray_tutorial.html\n", 15 | "3. Matplotlib Pyplot: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.html\n", 16 | "4. Caropy crs: https://scitools.org.uk/cartopy/docs/latest/reference/crs.html\n", 17 | "5. Cartopy Feature: https://scitools.org.uk/cartopy/docs/latest/matplotlib/feature_interface.html\n", 18 | "6. Matplotlib Colors: https://matplotlib.org/stable/gallery/color/named_colors.html\n", 19 | "7. Matplotlib Contour: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.contour.html\n", 20 | "8. Matplotlib Barbs: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.barbs.html\n", 21 | "9. Scipy Gaussian Filter: https://docs.scipy.org/doc/scipy/reference/generated/scipy.ndimage.gaussian_filter.html\n", 22 | "\n", 23 | "\n", 24 | "\n", 25 | "If you have any questions about the code below, feel free to reach out to me at mpvossen@uwm.edu. I am always willing to further explain the code.

\n", 26 | "\n", 27 | "---\n", 28 | "\n", 29 | "
\n", 30 | "\n", 31 | "1. As usual, we start by importing the modules we need for our Python code." 32 | ] 33 | }, 34 | { 35 | "cell_type": "code", 36 | "execution_count": null, 37 | "id": "44435d91-c36c-4d43-a367-149d53585868", 38 | "metadata": {}, 39 | "outputs": [], 40 | "source": [ 41 | "#from the dates and time code(datetime), import the date and time reading capabilities (datetime)\n", 42 | "from datetime import datetime\n", 43 | "\n", 44 | "#import the module numpy and save it to np\n", 45 | "import numpy as np\n", 46 | "\n", 47 | "#import the cartopy (cartopy) module's coordinate reference system (.crs) and save it to the variable crs\n", 48 | "import cartopy.crs as crs\n", 49 | "\n", 50 | "#import the cartopy (cartopy) module's ability to plot geographic data (.feature) and save it to the variable cfeature \n", 51 | "import cartopy.feature as cfeature\n", 52 | "\n", 53 | "#import the pyplot submodule from the matplotlib module\n", 54 | "import matplotlib.pyplot as plt\n", 55 | "\n", 56 | "#from the scipy module's ndimage submodule, import the function gaussian_filter\n", 57 | "from scipy.ndimage import gaussian_filter\n", 58 | "\n", 59 | "#import the module xarray and save it to xr\n", 60 | "import xarray as xr\n", 61 | "\n", 62 | "#from the metpy submodule units import the units function\n", 63 | "from metpy.units import units\n", 64 | "\n", 65 | "#import the calculation submodule form the metpy module and save it to the variable calc\n", 66 | "import metpy.calc as calc\n", 67 | "\n", 68 | "#import the pandas module and save it to the variable pd\n", 69 | "import pandas as pd" 70 | ] 71 | }, 72 | { 73 | "cell_type": "markdown", 74 | "id": "3981123d-a2c9-4821-bae4-37f978f65322", 75 | "metadata": {}, 76 | "source": [ 77 | "

\n", 78 | "2. We need to define variables named time (as a datetime object) and level (as a number), representing the time and isobaric level of interest. Define these variables for 850 hPa at 0000 UTC on February 19, 2022." 79 | ] 80 | }, 81 | { 82 | "cell_type": "code", 83 | "execution_count": null, 84 | "id": "bd1e4a57-1696-4c09-b9df-25248ee65902", 85 | "metadata": {}, 86 | "outputs": [], 87 | "source": [] 88 | }, 89 | { 90 | "cell_type": "markdown", 91 | "id": "8004f5da-b93a-4c76-8079-1beed8375724", 92 | "metadata": {}, 93 | "source": [ 94 | "

\n", 95 | "3. Like in previous labs, we are going to follow good coding practices and put our data processing in a function. Watch out in the comments for areas you need to fill in." 96 | ] 97 | }, 98 | { 99 | "cell_type": "code", 100 | "execution_count": null, 101 | "id": "86a87d1b-39c4-48f5-84ae-84946ec91fb1", 102 | "metadata": {}, 103 | "outputs": [], 104 | "source": [ 105 | "\"\"\"\n", 106 | "Below I define a function to retrieve and process upper-air data. This function opens the GFS analysis data,\n", 107 | "retains only data from the desired pressure level, and converts temperature from Kelvin to Celsius.\n", 108 | "\n", 109 | "\n", 110 | "INPUT:\n", 111 | " level : INTEGER\n", 112 | " The level in hPa at which you want upper-air data.\n", 113 | " time : DATETIME\n", 114 | " The time at which you would like upper-air data.\n", 115 | " \n", 116 | "OUTPUT:\n", 117 | "\n", 118 | " leveled_data : XARRAY DATASET\n", 119 | " The xarray containing your processed GFS analysis data\n", 120 | "\n", 121 | "\"\"\"\n", 122 | "\n", 123 | "def process_upper_air_data(level, time):\n", 124 | " \"\"\"\n", 125 | " Specify the location of the upper-air data on the JupyterHub.\n", 126 | " \"\"\"\n", 127 | " lab_data_loc = \"/data/AtmSci360/Lab_7/\"\n", 128 | " \n", 129 | " \"\"\"\n", 130 | " Open the GFS data using xarray. Since the data are once again GRIB-formatted data, we can use xarray the same way we did in Lab 4.\n", 131 | " The data file name is still formatted %m%d%y_%H.grib2. Only open the variables with a vertical coordinate of isobaricInhPa.\n", 132 | " \"\"\"\n", 133 | " model_data = \n", 134 | " \n", 135 | " \"\"\"\n", 136 | " We only want data at a single isobaric level. Limit the data in the xarray to only the level we have specified.\n", 137 | " \"\"\"\n", 138 | " \n", 139 | " \n", 140 | " \"\"\"\n", 141 | " Our temperature (t) is in Kelvin. To make this easier for to use, convert temperature to Celsius and save it back to the xarray.\n", 142 | " \"\"\"\n", 143 | " \n", 144 | " \n", 145 | " \"\"\"\n", 146 | " Finally, have the function return the processed data.\n", 147 | " \"\"\"\n" 148 | ] 149 | }, 150 | { 151 | "cell_type": "markdown", 152 | "id": "2791cac3-7510-41ff-91c7-6c910c1c83df", 153 | "metadata": {}, 154 | "source": [ 155 | "

\n", 156 | "4. Call your data processing function to retrieve the data we need for this lab. Save the retrieved data to an appropriately named variable (e.g., model_data)." 157 | ] 158 | }, 159 | { 160 | "cell_type": "code", 161 | "execution_count": null, 162 | "id": "7e7cee2b-7c1f-4ee9-92c0-138ddb95f57f", 163 | "metadata": {}, 164 | "outputs": [], 165 | "source": [] 166 | }, 167 | { 168 | "cell_type": "markdown", 169 | "id": "d2701760-71a0-4990-aeea-dcb89b0693f3", 170 | "metadata": {}, 171 | "source": [ 172 | "

\n", 173 | "5. Now that our data are processed, we are ready to start plotting. We once again put our plotting code into a function. Watch out in the comments below for areas you need to fill in." 174 | ] 175 | }, 176 | { 177 | "cell_type": "code", 178 | "execution_count": null, 179 | "id": "41059d16-695f-40a2-8b65-48a1eaa41aac", 180 | "metadata": {}, 181 | "outputs": [], 182 | "source": [ 183 | "\"\"\"\n", 184 | "Below a function is defined to plot upper-air geopotential heights (line contour) and temperature (line contour). \n", 185 | "\n", 186 | "\n", 187 | "INPUT:\n", 188 | " model_data : XARRAY DATASET\n", 189 | " The GFS analysis data\n", 190 | " level : INTEGER\n", 191 | " The level that the plot is valid for.\n", 192 | " data : DATETIME\n", 193 | " The date and time that the plot is valid for.\n", 194 | "\n", 195 | "\"\"\"\n", 196 | "\n", 197 | "\n", 198 | "def gfs_temp_upper_air_plot(model_data, level, date):\n", 199 | " \"\"\"\n", 200 | " Setup a Lambert Conic Conformal Projection centered at 35°N and 95°W. Have the cone of the Lambert Conic Conformal projection intersect the Earth at 27.5°N and 42.5°N.\n", 201 | " \"\"\"\n", 202 | " \n", 203 | " \n", 204 | " \"\"\"\n", 205 | " Create a figure with a size of 1150px x 800px and a resolution of 300 dots per inch,\n", 206 | " then set up an axes object (named ax) using the projection we previously defined for our map.\n", 207 | " \"\"\"\n", 208 | "\n", 209 | " \n", 210 | " \"\"\"\n", 211 | " Add an appropriate amount of geographic data. Be sure you follow \"good map\" suggestions with the geographic data styling.\n", 212 | " \"\"\"\n", 213 | " \n", 214 | " \n", 215 | " \"\"\"\n", 216 | " Limit the map to between 110°-75°W longitude and 30°-60°N latitude.\n", 217 | " \"\"\"\n", 218 | "\n", 219 | " \n", 220 | " \"\"\" \n", 221 | " Set up a Gaussian filter for the geopotential heights ('gh'). Choose an appropriate smoothing value for these data.\n", 222 | " \"\"\"\n", 223 | " \n", 224 | " \n", 225 | " \"\"\"\n", 226 | " Contour the smoothed geopotential heights. Be sure to style your contours so they are easy to read and make sure they have labels.\n", 227 | " \"\"\"\n", 228 | " \n", 229 | " \n", 230 | " \"\"\"\n", 231 | " Set up a Gaussian filter for the temperature ('t'). Choose an appropriate smoothing value for these data.\n", 232 | " \"\"\"\n", 233 | " \n", 234 | " \n", 235 | " \"\"\"\n", 236 | " Contour the smoothed temperatures. Be sure to style your contours so they are easy to read \n", 237 | " (including having them be distinct from the geopotential height contours) and make sure they have labels.\n", 238 | " \"\"\"\n", 239 | " \n", 240 | " \n", 241 | " \"\"\"\n", 242 | " Add an appropriate title to your plot.\n", 243 | " \"\"\"\n", 244 | " \n", 245 | "\n" 246 | ] 247 | }, 248 | { 249 | "cell_type": "markdown", 250 | "id": "4e9baddb-846a-4800-9c21-4a4488b42797", 251 | "metadata": {}, 252 | "source": [ 253 | "

\n", 254 | "6. Call the plotting function that you just created for the GFS model data, isobaric level, and date we previously retrieved." 255 | ] 256 | }, 257 | { 258 | "cell_type": "code", 259 | "execution_count": null, 260 | "id": "9b0f56f5-e725-4d6d-b23a-6d38aec5a42f", 261 | "metadata": {}, 262 | "outputs": [], 263 | "source": [] 264 | }, 265 | { 266 | "cell_type": "markdown", 267 | "id": "5fd8ea30-d95b-4b01-a8fa-9b7b18d0324b", 268 | "metadata": {}, 269 | "source": [ 270 | "### You have now completed the Python portion of the lab. Be sure to submit the fully rendered Jupyter Notebook on GitHub when you are finished." 271 | ] 272 | } 273 | ], 274 | "metadata": { 275 | "kernelspec": { 276 | "display_name": "Python 3 (ipykernel)", 277 | "language": "python", 278 | "name": "python3" 279 | }, 280 | "language_info": { 281 | "codemirror_mode": { 282 | "name": "ipython", 283 | "version": 3 284 | }, 285 | "file_extension": ".py", 286 | "mimetype": "text/x-python", 287 | "name": "python", 288 | "nbconvert_exporter": "python", 289 | "pygments_lexer": "ipython3", 290 | "version": "3.7.12" 291 | } 292 | }, 293 | "nbformat": 4, 294 | "nbformat_minor": 5 295 | } 296 | -------------------------------------------------------------------------------- /Lab 8/Lab 8.docx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/clarkevanswx/SynopticMet1/a6032312c07cb8a42daaea6bbb0f574d0d5c7b85/Lab 8/Lab 8.docx -------------------------------------------------------------------------------- /Lab 8/Lab_8_PT_2.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "ec3db39f-ab32-434f-861f-b9ebfc64acce", 6 | "metadata": {}, 7 | "source": [ 8 | "## Lab 8 Part II: Plotting Contoured Upper-Air Data\n", 9 | "

\n", 10 | "\n", 11 | "In this part of the tutorial, we are going to plot GFS model-derived isohypses, isotherms, and wind barbs to aid in analyzing fronts at different levels.\n", 12 | "
\n", 13 | "### Module Documentation\n", 14 | "\n", 15 | "1. Xarray Dataset: https://docs.xarray.dev/en/stable/generated/xarray.Dataset.html\n", 16 | "2. Matplotlib Pyplot: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.html\n", 17 | "3. Caropy crs: https://scitools.org.uk/cartopy/docs/latest/reference/crs.html\n", 18 | "4. Cartopy Feature: https://scitools.org.uk/cartopy/docs/latest/matplotlib/feature_interface.html\n", 19 | "5. Matplotlib Colors: https://matplotlib.org/stable/gallery/color/named_colors.html\n", 20 | "6. Matplotlib Contour: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.contour.html\n", 21 | "7. Matplotlib Barbs: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.barbs.html\n", 22 | "8. Scipy Gaussian Filter: https://docs.scipy.org/doc/scipy/reference/generated/scipy.ndimage.gaussian_filter.html\n", 23 | "9. Xarray with MetPy: https://unidata.github.io/MetPy/latest/tutorials/xarray_tutorial.html\n", 24 | "\n", 25 | "\n", 26 | "\n", 27 | "If you have any questions about the code below, feel free to reach out to me at mpvossen@uwm.edu. I am always willing to further explain the code.

\n", 28 | "\n", 29 | "---\n", 30 | "\n", 31 | "
\n", 32 | "1. As usual, we start by importing the modules we need for our Python code." 33 | ] 34 | }, 35 | { 36 | "cell_type": "code", 37 | "execution_count": null, 38 | "id": "44435d91-c36c-4d43-a367-149d53585868", 39 | "metadata": {}, 40 | "outputs": [], 41 | "source": [ 42 | "#from the dates and time code(datetime), import the date and time reading capabilities (datetime)\n", 43 | "from datetime import datetime\n", 44 | "\n", 45 | "#import the module numpy and save it to np\n", 46 | "import numpy as np\n", 47 | "\n", 48 | "#import the cartopy (cartopy) module's coordinate reference system (.crs) and save it to the variable crs\n", 49 | "import cartopy.crs as crs\n", 50 | "\n", 51 | "#import the cartopy (cartopy) module's ability to plot geographic data (.feature) and save it to the variable cfeature \n", 52 | "import cartopy.feature as cfeature\n", 53 | "\n", 54 | "#import the pyplot submodule from the matplotlib module\n", 55 | "import matplotlib.pyplot as plt\n", 56 | "\n", 57 | "#from the scipy module's ndimage submodule, import the function gaussian_filter\n", 58 | "from scipy.ndimage import gaussian_filter\n", 59 | "\n", 60 | "#import the module xarray and save it to xr\n", 61 | "import xarray as xr\n", 62 | "\n", 63 | "#from the metpy submodule units import the units function\n", 64 | "from metpy.units import units\n", 65 | "\n", 66 | "#import the calculation submodule form the metpy module and save it to the variable calc\n", 67 | "import metpy.calc as calc\n", 68 | "\n", 69 | "#import the pandas module and save it to the variable pd\n", 70 | "import pandas as pd" 71 | ] 72 | }, 73 | { 74 | "cell_type": "markdown", 75 | "id": "921ffe88-9632-4993-9ac0-1eb1f05c73e8", 76 | "metadata": {}, 77 | "source": [ 78 | "

\n", 79 | "2. As with other labs, we will create a function to process our data. In the function below, I create a function for you that opens GFS analysis data and converts the units of the temperature data to degrees Celsius and wind data to knots. " 80 | ] 81 | }, 82 | { 83 | "cell_type": "code", 84 | "execution_count": null, 85 | "id": "86a87d1b-39c4-48f5-84ae-84946ec91fb1", 86 | "metadata": {}, 87 | "outputs": [], 88 | "source": [ 89 | "\"\"\"\n", 90 | "Below, I define a function to retrieve and process upper-air data. This function opens the GFS analysis data\n", 91 | "and converts the temperature units from Kelvin to Celsius and wind units from m/s to knots.\n", 92 | "\n", 93 | "\n", 94 | "INPUT:\n", 95 | " time : DATETIME\n", 96 | " The time at which you would like upper-air data.\n", 97 | " \n", 98 | "OUTPUT:\n", 99 | "\n", 100 | " model_data : XARRAY DATASET\n", 101 | " The xarray containing your processed GFS analysis data\n", 102 | "\n", 103 | "\"\"\"\n", 104 | "\n", 105 | "def process_upper_air_data(time):\n", 106 | " \"\"\"\n", 107 | " Specify the location of the upper-air data on the JupyterHub.\n", 108 | " \"\"\"\n", 109 | " lab_data_loc = \"/data/AtmSci360/Lab_8/\"\n", 110 | " \n", 111 | " \"\"\"\n", 112 | " Open the GFS data using xarray. Since the data are once again GRIB-formatted data, we can use xarray the same way we did in Lab 4.\n", 113 | " The data file name is still formatted %m%d%y_%H.grib2. Only open the variables with a vertical coordinate of isobaricInhPa.\n", 114 | " \"\"\"\n", 115 | " model_data = xr.open_dataset(f\"{lab_data_loc}{time:%m%d%y_%H}_gfs.grib2\", engine='cfgrib', filter_by_keys={'typeOfLevel': 'isobaricInhPa'})\n", 116 | " \n", 117 | " \n", 118 | " \"\"\"\n", 119 | " Our temperature (t) is in Kelvin and our wind variables (u and v) are in m/s. Here, we convert temperature to degrees Celsius\n", 120 | " and the wind variables to m/s, saving them back to the xarray as we do so.\n", 121 | " \"\"\"\n", 122 | " model_data['t'] = model_data[\"t\"].metpy.convert_units('degC')\n", 123 | " model_data['u'] = model_data[\"u\"].metpy.convert_units('kt')\n", 124 | " model_data['v'] = model_data[\"v\"].metpy.convert_units('kt')\n", 125 | " \n", 126 | "\n", 127 | " \"\"\"\n", 128 | " Finally, have the function return the processed data.\n", 129 | " \"\"\"\n", 130 | " return model_data\n" 131 | ] 132 | }, 133 | { 134 | "cell_type": "markdown", 135 | "id": "58416131-4963-4b89-8755-c01f8542acf3", 136 | "metadata": {}, 137 | "source": [ 138 | "

\n", 139 | "3. Call the data processing function to get data for January 5th, 2022 at 0000 UTC." 140 | ] 141 | }, 142 | { 143 | "cell_type": "code", 144 | "execution_count": null, 145 | "id": "e2e1e2de-7e44-4297-9bd7-5ec944e3c716", 146 | "metadata": {}, 147 | "outputs": [], 148 | "source": [] 149 | }, 150 | { 151 | "cell_type": "markdown", 152 | "id": "4d5f07fe-0006-4361-a619-6b2cfd9dccc4", 153 | "metadata": {}, 154 | "source": [ 155 | "

\n", 156 | "4. We are ready to create our plotting function. Follow the comments in the code block below to create a plot with isohypses, isotherms, and wind barbs at a user-specified level." 157 | ] 158 | }, 159 | { 160 | "cell_type": "code", 161 | "execution_count": null, 162 | "id": "41059d16-695f-40a2-8b65-48a1eaa41aac", 163 | "metadata": {}, 164 | "outputs": [], 165 | "source": [ 166 | "\"\"\"\n", 167 | "Below a function is defined to plot upper-air geopotential height (line contour), temperature (line contour), and wind (barbs).\n", 168 | "\n", 169 | "\n", 170 | "INPUT:\n", 171 | " model_data : XARRAY DATASET\n", 172 | " The GFS analysis data\n", 173 | " level : INTEGER\n", 174 | " The level that the plot is valid for.\n", 175 | " data : DATETIME\n", 176 | " The date and time that the plot is valid for.\n", 177 | "\n", 178 | "\"\"\"\n", 179 | "\n", 180 | "\n", 181 | "def gfs_temp_upper_air_plot(model_data, level, date):\n", 182 | " \n", 183 | " \"\"\"\n", 184 | " Since we are plotting multiple levels, we need to limit the data to the level we want to plot.\n", 185 | " When the data processing script was created it was designed to pull the data for all levels so we can\n", 186 | " create plots at multiple levels without having to download data each time.\n", 187 | " \n", 188 | " In the line below, like in data processing functions we have created in previous labs, parse the data\n", 189 | " so that only the level that the user specified in the GFS analysis data is retained.\n", 190 | " \"\"\"\n", 191 | " model_data = \n", 192 | " \n", 193 | " \n", 194 | " \"\"\"\n", 195 | " Setup a Lambert Conic Conformal Projection centered at 35°N and 95°W. Have the cone of the Lambert Conic Conformal projection intersect the Earth at 27.5°N and 42.5°N.\n", 196 | " \"\"\"\n", 197 | " proj = \n", 198 | " \n", 199 | " \n", 200 | " \"\"\"\n", 201 | " Create a figure with a size of 1150px x 800px and a resolution of 300 dots per inch,\n", 202 | " then set up an axes object (named ax) using the projection we previously defined for our map.\n", 203 | " \"\"\"\n", 204 | " \n", 205 | "\n", 206 | " \"\"\"\n", 207 | " Add an appropriate amount of geographic data. Be sure you follow \"good map\" suggestions with the geographic data styling.\n", 208 | " \"\"\"\n", 209 | "\n", 210 | " \n", 211 | " \"\"\"\n", 212 | " Limit the map to between 105°-75°W longitude and 32°-55°N latitude.\n", 213 | " \"\"\"\n", 214 | " \n", 215 | " \n", 216 | " \"\"\" \n", 217 | " Set up a Gaussian filter for the geopotential heights ('gh'). Choose an appropriate smoothing value for these data.\n", 218 | " \"\"\"\n", 219 | " smooth_heights = \n", 220 | "\n", 221 | " \"\"\"\n", 222 | " Contour the smoothed geopotential heights. Be sure to style your contours so they are easy to read and make sure they have labels.\n", 223 | " \"\"\"\n", 224 | " \n", 225 | " \"\"\"\n", 226 | " Set up a Gaussian filter for the temperature ('t'). Choose an appropriate smoothing value for these data.\n", 227 | " \"\"\"\n", 228 | " smooth_temp = \n", 229 | " \n", 230 | " \"\"\"\n", 231 | " Contour the smoothed temperatures. Be sure to style your contours so they are easy to read \n", 232 | " (including having them be distinct from the geopotential height contours) and make sure they have labels.\n", 233 | " \"\"\"\n", 234 | "\n", 235 | " \n", 236 | " \n", 237 | " \n", 238 | " \n", 239 | " \"\"\"\n", 240 | " Now we are ready to add our wind barbs. 
To start, create the x- and y-wind slices for the wind\n", 241 | " data, defined so that every 8 points in the x and y directions are skipped.\n", 242 | " \"\"\"\n", 243 | " wind_slice_x = \n", 244 | " wind_slice_y = \n", 245 | " \n", 246 | " \"\"\"\n", 247 | " Add styling information to the barb plotting argument below to make the wind barbs easier to read.\n", 248 | " \"\"\"\n", 249 | " ax.barbs(model_data[\"longitude\"][wind_slice_x].values, model_data[\"latitude\"][wind_slice_y].values,\n", 250 | " model_data[\"u\"][wind_slice_x, wind_slice_y].values,\n", 251 | " model_data[\"v\"][wind_slice_x, wind_slice_y].values,\n", 252 | " transform=crs.PlateCarree())\n", 253 | " \n", 254 | " \n", 255 | " \"\"\"\n", 256 | " Add an appropriate title to your plot.\n", 257 | " \"\"\" \n", 258 | " \n", 259 | " \n", 260 | " \n", 261 | " plt.show()\n" 262 | ] 263 | }, 264 | { 265 | "cell_type": "markdown", 266 | "id": "eae507aa-c126-4eba-8d01-22d893fd746b", 267 | "metadata": {}, 268 | "source": [ 269 | "

\n", 270 | "5. Run your plotting function for 850 hPa, 700 hPa, and 500 hPa with the data we downloaded earlier with our data processing script." 271 | ] 272 | }, 273 | { 274 | "cell_type": "code", 275 | "execution_count": null, 276 | "id": "d135a527-1513-4fef-ba7c-12e44731ef31", 277 | "metadata": {}, 278 | "outputs": [], 279 | "source": [] 280 | }, 281 | { 282 | "cell_type": "markdown", 283 | "id": "ddf3aad6-3ff9-40f5-bc82-77a6a09cc61e", 284 | "metadata": {}, 285 | "source": [ 286 | "

\n", 287 | "6. We need one extra timestep of data to complete the lab. Re-run your data processing script for January 5th, 2022 at 1200 UTC, then re-run your plotting script only at 500 hPa." 288 | ] 289 | }, 290 | { 291 | "cell_type": "code", 292 | "execution_count": null, 293 | "id": "8c497dec-d676-4e3b-92a6-3a6cb596010d", 294 | "metadata": {}, 295 | "outputs": [], 296 | "source": [] 297 | }, 298 | { 299 | "cell_type": "markdown", 300 | "id": "6888d7e1-ffd4-4269-976b-923177feaf12", 301 | "metadata": {}, 302 | "source": [ 303 | "### You have now completed Part II of the Python portion of the lab. Be sure to submit the fully rendered Jupyter Notebook on GitHub when you are finished." 304 | ] 305 | } 306 | ], 307 | "metadata": { 308 | "kernelspec": { 309 | "display_name": "Python 3 (ipykernel)", 310 | "language": "python", 311 | "name": "python3" 312 | }, 313 | "language_info": { 314 | "codemirror_mode": { 315 | "name": "ipython", 316 | "version": 3 317 | }, 318 | "file_extension": ".py", 319 | "mimetype": "text/x-python", 320 | "name": "python", 321 | "nbconvert_exporter": "python", 322 | "pygments_lexer": "ipython3", 323 | "version": "3.7.12" 324 | } 325 | }, 326 | "nbformat": 4, 327 | "nbformat_minor": 5 328 | } 329 | -------------------------------------------------------------------------------- /Lab 8/Lab_8_PT_3.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "id": "e0d7b2d5-f11b-4f10-bedb-27c91ecfca19", 6 | "metadata": {}, 7 | "source": [ 8 | "## Lab 8 Part III: Plotting Contoured Upper-Air Data\n", 9 | "

\n", 10 | "\n", 11 | "In this part of the tutorial, we plot GFS model-derived isohypses, color-filled isotachs, and wind barbs at 300 hPa. Most of the code in this notebook has been completed, but watch out for a few areas where you are asked to complete the code.\n", 12 | "
\n", 13 | "### Module Documentation\n", 14 | "\n", 15 | "1. Xarray Dataset: https://docs.xarray.dev/en/stable/generated/xarray.Dataset.html\n", 16 | "2. Matplotlib Pyplot: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.html\n", 17 | "3. Cartopy crs: https://scitools.org.uk/cartopy/docs/latest/reference/crs.html\n", 18 | "4. Cartopy Feature: https://scitools.org.uk/cartopy/docs/latest/matplotlib/feature_interface.html\n", 19 | "5. Matplotlib Colors: https://matplotlib.org/stable/gallery/color/named_colors.html\n", 20 | "6. Matplotlib Contour: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.contour.html\n", 21 | "7. Matplotlib Barbs: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.barbs.html\n", 22 | "8. Scipy Gaussian Filter: https://docs.scipy.org/doc/scipy/reference/generated/scipy.ndimage.gaussian_filter.html\n", 23 | "9. Xarray with MetPy: https://unidata.github.io/MetPy/latest/tutorials/xarray_tutorial.html\n", 24 | "\n", 25 | "\n", 26 | "\n", 27 | "If you have any questions about the code below, feel free to reach out to me at mpvossen@uwm.edu. I am always willing to further explain the code.

\n", 28 | "\n", 29 | "---\n", 30 | "\n", 31 | "
\n", 32 | "1. As usual, we start by importing the modules we need for our Python code." 33 | ] 34 | }, 35 | { 36 | "cell_type": "code", 37 | "execution_count": null, 38 | "id": "44435d91-c36c-4d43-a367-149d53585868", 39 | "metadata": {}, 40 | "outputs": [], 41 | "source": [ 42 | "#from the dates and time code(datetime), import the date and time reading capabilities (datetime)\n", 43 | "from datetime import datetime\n", 44 | "\n", 45 | "#import the module numpy and save it to np\n", 46 | "import numpy as np\n", 47 | "\n", 48 | "#import the cartopy (cartopy) module's coordinate reference system (.crs) and save it to the variable crs\n", 49 | "import cartopy.crs as crs\n", 50 | "\n", 51 | "#import the cartopy (cartopy) module's ability to plot geographic data (.feature) and save it to the variable cfeature \n", 52 | "import cartopy.feature as cfeature\n", 53 | "\n", 54 | "#import the pyplot submodule from the matplotlib module\n", 55 | "import matplotlib.pyplot as plt\n", 56 | "\n", 57 | "#from the scipy module's ndimage submodule, import the function gaussian_filter\n", 58 | "from scipy.ndimage import gaussian_filter\n", 59 | "\n", 60 | "#import the module xarray and save it to xr\n", 61 | "import xarray as xr\n", 62 | "\n", 63 | "#from the metpy submodule units import the units function\n", 64 | "from metpy.units import units\n", 65 | "\n", 66 | "#import the calculation submodule form the metpy module and save it to the variable calc\n", 67 | "import metpy.calc as calc\n", 68 | "\n", 69 | "#import the pandas module and save it to the variable pd\n", 70 | "import pandas as pd" 71 | ] 72 | }, 73 | { 74 | "cell_type": "markdown", 75 | "id": "c6cd4a3e-04f0-4866-b71a-5aaeebb637fb", 76 | "metadata": {}, 77 | "source": [ 78 | "

\n", 79 | "2. The code block below establishes the data processing function. This function opens GFS analysis data, retains a single pressure level, converts the data's units, and computes the wind speed." 80 | ] 81 | }, 82 | { 83 | "cell_type": "code", 84 | "execution_count": null, 85 | "id": "86a87d1b-39c4-48f5-84ae-84946ec91fb1", 86 | "metadata": {}, 87 | "outputs": [], 88 | "source": [ 89 | "\"\"\"\n", 90 | "This function retrieves and processes upper-air data. It opens GFS analysis data,\n", 91 | "retains only data from a desired pressure level, converts temperature and wind units, and\n", 92 | "computes wind speed.\n", 93 | "\n", 94 | "\n", 95 | "INPUT:\n", 96 | "    level : INTEGER\n", 97 | "        The level in hPa at which you want upper-air data.\n", 98 | "    time : DATETIME\n", 99 | "        The time at which you would like upper-air data.\n", 100 | "        \n", 101 | "OUTPUT:\n", 102 | "\n", 103 | "    model_data : XARRAY DATASET\n", 104 | "        The xarray containing your processed GFS analysis data\n", 105 | "\n", 106 | "\"\"\"\n", 107 | "\n", 108 | "def process_upper_air_data(time, level):\n", 109 | "    \"\"\"\n", 110 | "    Specify the location of the upper-air data on the JupyterHub.\n", 111 | "    \"\"\"\n", 112 | "    lab_data_loc = \"/data/AtmSci360/Lab_8/\"\n", 113 | "    \n", 114 | "    \"\"\"\n", 115 | "    Open the GFS data using xarray.  Since the data are once again GRIB-formatted data, we can use xarray the same way we did in Lab 4.\n", 116 | "    The data file name is still formatted %m%d%y_%H.grib2.  Only open the variables with a vertical coordinate of isobaricInhPa.\n", 117 | "    \"\"\"\n", 118 | "    model_data = xr.open_dataset(f\"{lab_data_loc}{time:%m%d%y_%H}_gfs.grib2\", engine='cfgrib', filter_by_keys={'typeOfLevel': 'isobaricInhPa'})\n", 119 | "    model_data = model_data.sel(isobaricInhPa = level)\n", 120 | "    \n", 121 | "    \"\"\"\n", 122 | "    Our temperature (t) is in Kelvin and our winds are in m/s.  To make these easier to use, \n", 123 | "    convert temperature to Celsius, the wind components to knots, and save each variable back to the xarray.\n", 124 | "    \"\"\"\n", 125 | "    model_data['t'] = model_data[\"t\"].metpy.convert_units('degC')\n", 126 | "    model_data['u'] = model_data[\"u\"].metpy.convert_units('kt')\n", 127 | "    model_data['v'] = model_data[\"v\"].metpy.convert_units('kt')\n", 128 | "    \n", 129 | "    \n", 130 | "    \"\"\"\n", 131 | "    We want color-filled isotachs, requiring us to compute the wind speed.\n", 132 | "    \"\"\"\n", 133 | "    wind_mag = ((model_data[\"u\"] ** 2) + (model_data[\"v\"] ** 2)) ** 0.5\n", 134 | "    \n", 135 | "    \"\"\"\n", 136 | "    To keep everything in one place, we can add the computed wind speed variable to the xarray.\n", 137 | "    \"\"\"\n", 138 | "    model_data = model_data.assign(wind_mag=wind_mag)\n", 139 | "    \n", 140 | "    \n", 141 | "    \"\"\"\n", 142 | "    Finally, have the function return the processed data.\n", 143 | "    \"\"\"\n", 144 | "    return model_data\n", 145 | "\n", 146 | "\n", 147 | "\n", 148 | "#model = process_upper_air_data(time, level)   #example call only; time and level are not defined in this cell, so call the function in the cell provided in step 3 instead" 149 | ] 150 | }, 151 | { 152 | "cell_type": "markdown", 153 | "id": "c2c5f4bf-5b94-4307-a266-e877695eb6c5", 154 | "metadata": {}, 155 | "source": [ 156 | "<br />

\n", 157 | "3. Run the data processing function to get data at 300 hPa on January 5th, 2022 at 0000 UTC." 158 | ] 159 | }, 160 | { 161 | "cell_type": "code", 162 | "execution_count": null, 163 | "id": "ec3488da-0a59-48c6-8cff-ec0f77d77588", 164 | "metadata": {}, 165 | "outputs": [], 166 | "source": [] 167 | }, 168 | { 169 | "cell_type": "markdown", 170 | "id": "ae342941-059e-4a44-b43f-eef28d0a8116", 171 | "metadata": {}, 172 | "source": [ 173 | "

\n", 174 | "4. Now that we have the data we need, we are ready to plot. In the function below, we create a plot of isohypses, color-filled isotachs, and wind barbs at 300 hPa. Watch out in the comments for areas you need to fill in." 175 | ] 176 | }, 177 | { 178 | "cell_type": "code", 179 | "execution_count": null, 180 | "id": "41059d16-695f-40a2-8b65-48a1eaa41aac", 181 | "metadata": {}, 182 | "outputs": [], 183 | "source": [ 184 | "\"\"\"\n", 185 | "Below, a function is defined to plot upper-air geopotential heights (line contour), wind magnitude (color filled contour), and wind (barbs) at 300 hPa. \n", 186 | "\n", 187 | "\n", 188 | "INPUT:\n", 189 | " model_data : XARRAY DATASET\n", 190 | " The GFS analysis data.\n", 191 | " level : INTEGER\n", 192 | " The level for which the plot is valid.\n", 193 | " data : DATETIME\n", 194 | " The date and time for which the plot is valid.\n", 195 | "\n", 196 | "\"\"\"\n", 197 | "\n", 198 | "\n", 199 | "def gfs_temp_upper_air_plot(model_data, date):\n", 200 | " \n", 201 | " \n", 202 | " \"\"\"\n", 203 | " Setup a Lambert Conic Conformal Projection centered at 35°N and 95°W. Have the cone of the Lambert Conic Conformal projection intersect the Earth at 27.5°N and 42.5°N.\n", 204 | " \"\"\"\n", 205 | " proj = crs.LambertConformal(central_longitude=-95, central_latitude=35, standard_parallels=[27.5,42.5])\n", 206 | " \n", 207 | " \"\"\"\n", 208 | " Create a figure with a size of 1150px x 800px and a resolution of 300 dots per inch,\n", 209 | " then set up an axes object (named ax) using the projection we previously defined for our map.\n", 210 | " \"\"\"\n", 211 | " fig = plt.figure(figsize=(11.5,8),dpi=300)\n", 212 | " ax = plt.subplot(projection = proj)\n", 213 | " \n", 214 | " \"\"\"\n", 215 | " Add geographic data to our map.\n", 216 | " \"\"\"\n", 217 | " ax.add_feature(cfeature.LAND.with_scale('50m'), edgecolor = \"black\", facecolor='none', linewidth=0.5, zorder=3)\n", 218 | " ax.add_feature(cfeature.BORDERS.with_scale('50m'), edgecolor = \"black\", facecolor='none', linewidth=0.5, zorder=3)\n", 219 | " ax.add_feature(cfeature.STATES.with_scale('50m'), edgecolor = \"grey\", linestyle=\":\", facecolor='none', linewidth=0.5, zorder=2)\n", 220 | " \n", 221 | " \"\"\"\n", 222 | " Limit the map to between 125°-65°W longitude and 25°-60°N latitude.\n", 223 | " \"\"\"\n", 224 | " ax.set_extent([-125,-65,25,60], crs=crs.PlateCarree())\n", 225 | " \n", 226 | " \"\"\" \n", 227 | " Set up a Gaussian filter for the geopotential heights ('gh').\n", 228 | " \"\"\"\n", 229 | " smooth_heights = gaussian_filter(model_data['gh'].values, 2)\n", 230 | " \n", 231 | " \n", 232 | " \"\"\"\n", 233 | " Contour and label the smoothed geopotential heights. \n", 234 | " \"\"\"\n", 235 | " cont_p = plt.contour(model_data[\"longitude\"].values, model_data[\"latitude\"].values, smooth_heights, np.arange(0,10000,60), colors='k', linewidths=0.85, transform=crs.PlateCarree())\n", 236 | " ax.clabel(cont_p, cont_p.levels, inline=True, fmt=lambda v: format(v, '.0f'), fontsize=6)\n", 237 | " \n", 238 | " \"\"\"\n", 239 | " Create a Gaussian filter to smooth the wind-speed data.\n", 240 | " \"\"\"\n", 241 | " smooth_wind = gaussian_filter(model_data['wind_mag'].values, 2)\n", 242 | " \n", 243 | " \"\"\"\n", 244 | " Use color-filled isopleths for the wind speed. 
\n", 245 | " \"\"\"\n", 246 | " cont_w = plt.contourf(model_data[\"longitude\"].values, model_data[\"latitude\"].values, smooth_wind, np.arange(50,210,25), cmap=\"twilight\", transform=crs.PlateCarree(), extend=\"max\", alpha=0.6)\n", 247 | " \n", 248 | " \"\"\"\n", 249 | " Add a color bar with a label to show that the color-filled contours represent wind speed in knots.\n", 250 | " \"\"\"\n", 251 | " cbar = plt.colorbar(cont_w)\n", 252 | " cbar.set_label(\"Wind Speed (kt)\", size=12, weight=\"bold\")\n", 253 | " \n", 254 | " \"\"\"\n", 255 | " Set up the wind slice to skip every 12 data points in each direction.\n", 256 | " \"\"\"\n", 257 | " wind_slice_x = slice(None, None, 12)\n", 258 | " wind_slice_y = slice(None, None, 12)\n", 259 | " \n", 260 | " \n", 261 | " \"\"\"\n", 262 | " Plot wind barbs using the wind slice we create before.\n", 263 | " \"\"\"\n", 264 | " ax.barbs(model_data[\"longitude\"][wind_slice_x].values, model_data[\"latitude\"][wind_slice_y].values,\n", 265 | " model_data[\"u\"][wind_slice_x, wind_slice_y].values,\n", 266 | " model_data[\"v\"][wind_slice_x, wind_slice_y].values,\n", 267 | " color='black',transform=crs.PlateCarree(),linewidth=0.6, length=4.25)\n", 268 | " \n", 269 | " \n", 270 | " \"\"\"\n", 271 | " Add an appropriate title to your plot.\n", 272 | " \"\"\" \n", 273 | " \n", 274 | " \n", 275 | " \n", 276 | " \n", 277 | " \n", 278 | " plt.show()\n" 279 | ] 280 | }, 281 | { 282 | "cell_type": "markdown", 283 | "id": "3ccd39cc-d3c9-40ef-a6f6-d9100d241c21", 284 | "metadata": {}, 285 | "source": [ 286 | "

\n", 287 | "5. Call the plotting function using the data we retrieved before." 288 | ] 289 | }, 290 | { 291 | "cell_type": "code", 292 | "execution_count": null, 293 | "id": "9b0f56f5-e725-4d6d-b23a-6d38aec5a42f", 294 | "metadata": {}, 295 | "outputs": [], 296 | "source": [] 297 | }, 298 | { 299 | "cell_type": "markdown", 300 | "id": "d5b144e6-820b-45b0-a6d2-cfcf7e105c42", 301 | "metadata": {}, 302 | "source": [ 303 | "### You have now completed the Python portion of the lab. Be sure to submit the fully rendered Jupyter Notebook on GitHub when you are finished." 304 | ] 305 | } 306 | ], 307 | "metadata": { 308 | "kernelspec": { 309 | "display_name": "Python 3 (ipykernel)", 310 | "language": "python", 311 | "name": "python3" 312 | }, 313 | "language_info": { 314 | "codemirror_mode": { 315 | "name": "ipython", 316 | "version": 3 317 | }, 318 | "file_extension": ".py", 319 | "mimetype": "text/x-python", 320 | "name": "python", 321 | "nbconvert_exporter": "python", 322 | "pygments_lexer": "ipython3", 323 | "version": "3.7.12" 324 | } 325 | }, 326 | "nbformat": 4, 327 | "nbformat_minor": 5 328 | } 329 | -------------------------------------------------------------------------------- /Lab 9/Lab 9.docx: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/clarkevanswx/SynopticMet1/a6032312c07cb8a42daaea6bbb0f574d0d5c7b85/Lab 9/Lab 9.docx -------------------------------------------------------------------------------- /Modules/iowa_metar.py: -------------------------------------------------------------------------------- 1 | """ 2 | Script that scrapes data from the IEM ASOS download service 3 | 4 | Origanally created by Iowa State University 5 | 6 | Modified by: Michael Vossen 7 | University of Wisconsin Milwaukee 8 | 9 | """ 10 | from __future__ import print_function 11 | import json 12 | import time 13 | import datetime 14 | from io import StringIO 15 | import pandas as pd 16 | 17 | try: 18 | from urllib.request import urlopen 19 | except ImportError: 20 | from urllib2 import urlopen 21 | 22 | # Number of attempts to download data 23 | MAX_ATTEMPTS = 6 24 | # HTTPS here can be problematic for installs that don't have Lets Encrypt CA 25 | SERVICE = "http://mesonet.agron.iastate.edu/cgi-bin/request/asos.py?" 26 | 27 | 28 | def download_data(uri): 29 | """Fetch the data from the IEM 30 | The IEM download service has some protections in place to keep the number 31 | of inbound requests in check. This function implements an exponential 32 | backoff to keep individual downloads from erroring. 
33 | Args: 34 | uri (string): URL to fetch 35 | Returns: 36 | string data 37 | """ 38 | attempt = 0 39 | while attempt < MAX_ATTEMPTS: 40 | try: 41 | data = urlopen(uri, timeout=300).read().decode("utf-8") 42 | if data is not None and not data.startswith("ERROR"): 43 | return data 44 | except Exception as exp: 45 | print("download_data(%s) failed with %s" % (uri, exp)) 46 | time.sleep(5) 47 | attempt += 1 48 | 49 | print("Exhausted attempts to download, returning empty data") 50 | return "" 51 | 52 | 53 | 54 | def download_alldata(ob_time): 55 | """ 56 | A method to download one hour of data for the entire United States 57 | """ 58 | # timestamps in UTC to request data for 59 | startts = ob_time 60 | 61 | interval = datetime.timedelta(hours=1) 62 | 63 | service = SERVICE + "data=metar&format=onlycomma&latlon=no&" 64 | 65 | now = startts 66 | 67 | thisurl = service 68 | thisurl += now.strftime("year1=%Y&month1=%m&day1=%d&hour1=%H&") 69 | thisurl += (now + interval).strftime("year2=%Y&month2=%m&day2=%d&hour2=%H&") 70 | thisurl += "&direct=no&report_type=3" 71 | 72 | 73 | print("Downloading: %s" % (now,)) 74 | data = StringIO(download_data(thisurl)) 75 | df = pd.read_csv(data) 76 | final_data = df["metar"] 77 | for count, value in enumerate(final_data): 78 | if value.find("METAR ") != -1: 79 | 80 | final_data.iloc[count] = value.replace("METAR ", "") 81 | data_file = StringIO() 82 | final_data.to_csv(data_file, sep="\n", index=False, header=False) 83 | return data_file 84 | 85 | 86 | 87 | def download_alldata_save(ob_time): 88 | """ 89 | A method to download one hour of data for the entire United States 90 | """ 91 | # timestamps in UTC to request data for 92 | startts = ob_time 93 | 94 | interval = datetime.timedelta(hours=1) 95 | 96 | service = SERVICE + "data=metar&format=onlycomma&latlon=no&" 97 | 98 | now = startts 99 | 100 | thisurl = service 101 | thisurl += now.strftime("year1=%Y&month1=%m&day1=%d&hour1=%H&") 102 | thisurl += (now + interval).strftime("year2=%Y&month2=%m&day2=%d&hour2=%H&") 103 | thisurl += "&direct=no&report_type=3" 104 | 105 | 106 | print("Downloading: %s" % (now,)) 107 | data = StringIO(download_data(thisurl)) 108 | df = pd.read_csv(data) 109 | final_data = df["metar"] 110 | for count, value in enumerate(final_data): 111 | if value.find("METAR ") != -1: 112 | 113 | final_data.iloc[count] = value.replace("METAR ", "") 114 | 115 | final_data.to_csv(f"{ob_time:%Y%m%d_%H}.txt", sep="\n", index=False, header=False) 116 | 117 | 118 | 119 | 120 | if __name__ == "__main__": 121 | date = datetime.datetime(2022,9,14,0) 122 | download_alldata_save(date) 123 | -------------------------------------------------------------------------------- /Modules/min_max.py: -------------------------------------------------------------------------------- 1 | from scipy.ndimage.filters import maximum_filter, minimum_filter 2 | import numpy as np 3 | 4 | 5 | def plot_maxmin_points(ax, lon, lat, data, extrema, nsize, textsize=8, 6 | plotValue=True, transform=None): 7 | """ 8 | This function will find and plot relative maximum and minimum for a 2D grid. The function 9 | can be used to plot an H for maximum values (e.g., High pressure) and an L for minimum 10 | values (e.g., low pressue). It is best to used filetered data to obtain a synoptic scale 11 | max/min value. The symbol text can be set to a string value and optionally the color of the 12 | symbol and any plotted value can be set with the parameter color. 
13 | 14 | Parameters 15 | ---------- 16 | lon : 2D array 17 | Plotting longitude values 18 | lat : 2D array 19 | Plotting latitude values 20 | data : 2D array 21 | Data that you wish to plot the max/min symbol placement 22 | extrema : str 23 | Either a value of max for Maximum Values or min for Minimum Values 24 | nsize : int 25 | Size of the grid box to filter the max and min values to plot a reasonable number 26 | symbol : str 27 | Text to be placed at location of max/min value 28 | color : str 29 | Name of matplotlib colorname to plot the symbol (and numerical value, if plotted) 30 | plot_value : Boolean (True/False) 31 | Whether to plot the numeric value of max/min point 32 | 33 | Return 34 | ------ 35 | The max/min symbol will be plotted on the current axes within the bounding frame 36 | (e.g., clip_on=True) 37 | """ 38 | 39 | 40 | if (extrema == 'max'): 41 | data_ext = maximum_filter(data, nsize, mode='nearest') 42 | symbol = "H" 43 | color = "b" 44 | elif (extrema == 'min'): 45 | data_ext = minimum_filter(data, nsize, mode='nearest') 46 | symbol="L" 47 | color="r" 48 | else: 49 | raise ValueError('Value for hilo must be either max or min') 50 | 51 | if lon.ndim == 1: 52 | lon, lat = np.meshgrid(lon, lat) 53 | 54 | mxx, mxy = np.where(data_ext == data) 55 | 56 | for i in range(len(mxy)): 57 | ax.text(lon[mxx[i], mxy[i]], lat[mxx[i], mxy[i]], symbol, color=color, size=textsize, 58 | clip_on=True, horizontalalignment='center', verticalalignment='center', 59 | transform=transform) 60 | ax.text(lon[mxx[i], mxy[i]], lat[mxx[i], mxy[i]], 61 | '\n' + str(np.int(data[mxx[i], mxy[i]])), 62 | color=color, size=int(textsize/4), clip_on=True, fontweight='bold', 63 | horizontalalignment='center', verticalalignment='top', transform=transform) 64 | 65 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 |

First-Semester Synoptic Meteorology Labs

2 |

Release 1a, June 2023
Clark Evans and Michael Vossen

3 | 4 | This repository contains a set of twelve labs designed for a first-semester undergraduate course in synoptic meteorology, focusing on introductory concepts, atmospheric balances, kinematic properties, atmospheric stability, and fronts, jets, and cyclones. Each lab is currently provided in Microsoft Word format, and each is supported by one or more Jupyter Notebooks. The Jupyter Notebooks introduce students to accessing and plotting meteorological data (surface and upper-air observations, skew-T ln-p diagrams, and model data) using modern Python tools. 5 | 6 | A set of ten labs designed for a second-semester undergraduate course in synoptic meteorology, focusing on quasi-geostrophic theory, isentropic analysis principles, and isentropic potential vorticity, is also available on Github. Just as their scientific concepts build on this first-semester course's concepts, so too do their Python concepts. Thus, we recommend beginning the second-semester labs only after completing these labs. 7 | 8 | If you are hosting a JupyterHub, or if your students have Jupyter Notebook installed locally (e.g., as part of an Anaconda Python distribution), then nbgitpuller can be used to generate a link that downloads this repository into a working Notebook environment. 9 | 10 | The Jupyter Notebooks cover the following Python-related topics: 11 | 25 | 26 | Planned future additions/updates include: 27 | 32 | 33 | The primary packages used by these Jupyter Notebooks are cartopy, MetPy, siphon, and xarray. The JupyterHub on which these notebooks were first deployed runs Python 3.7.12, cartopy 0.20.0, MetPy 1.2.0, siphon 0.9, and xarray 0.20.2. It is likely that these notebooks will work with newer versions of each package with few, if any, changes. 34 | 35 | Comments, questions, etc.: evans36-at-uwm-dot-edu. 
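As an illustration of the nbgitpuller workflow mentioned above, a link of the following general form will pull this repository into a user's workspace. The hub domain below is a placeholder, and the branch name and JupyterLab urlpath are assumptions; substitute the values appropriate for your installation (nbgitpuller's link generator can assemble the URL interactively).

https://your-hub.example.edu/hub/user-redirect/git-pull?repo=https://github.com/clarkevanswx/SynopticMet1&branch=main&urlpath=lab/tree/SynopticMet1

Here, repo points at this repository (as inferred from the repository's raw-file links), branch selects the branch to pull, and urlpath controls which interface opens once the pull completes.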
36 | -------------------------------------------------------------------------------- /Scripts/Download_GFS.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 9, 6 | "id": "d2487cc0-7663-46bb-99df-f41fb8b30739", 7 | "metadata": {}, 8 | "outputs": [], 9 | "source": [ 10 | "import s3fs\n", 11 | "import os\n", 12 | "from datetime import datetime\n", 13 | "\n", 14 | "time = datetime(2022,1,5,12)\n", 15 | "\n", 16 | "\n", 17 | "\n", 18 | "lab_number = \"10\"\n", 19 | "\n", 20 | "lab_location = f\"/data/AtmSci360/Lab_{lab_number}/\"\n", 21 | "\n", 22 | "os.makedirs(lab_location, exist_ok=\"True\")\n", 23 | "aws = s3fs.S3FileSystem(anon=True)\n", 24 | "product = \"pgrb2\"\n", 25 | "resolution = \"0p25\"\n", 26 | "aws_file = s3fs.S3File(aws, f'noaa-gfs-bdp-pds/gfs.{time:%Y%m%d}/{time:%H}/atmos/gfs.t{time:%H}z.{product}.{resolution}.f000')\n", 27 | "file_name = f\"{lab_location}{time:%m%d%y_%H}_gfs.grib2\"\n", 28 | "\n", 29 | "file_data = aws_file.readlines()\n", 30 | "file = open(file_name, 'wb')\n", 31 | "file.writelines(file_data)\n", 32 | "file.close()\n", 33 | "aws_file.close()" 34 | ] 35 | }, 36 | { 37 | "cell_type": "code", 38 | "execution_count": null, 39 | "id": "8e009377-d401-47e0-a2a5-eef2c39024b6", 40 | "metadata": {}, 41 | "outputs": [], 42 | "source": [] 43 | }, 44 | { 45 | "cell_type": "code", 46 | "execution_count": null, 47 | "id": "c9dc596e-c9cc-4559-b707-3b0f00548381", 48 | "metadata": {}, 49 | "outputs": [], 50 | "source": [] 51 | } 52 | ], 53 | "metadata": { 54 | "kernelspec": { 55 | "display_name": "Python 3 (ipykernel)", 56 | "language": "python", 57 | "name": "python3" 58 | }, 59 | "language_info": { 60 | "codemirror_mode": { 61 | "name": "ipython", 62 | "version": 3 63 | }, 64 | "file_extension": ".py", 65 | "mimetype": "text/x-python", 66 | "name": "python", 67 | "nbconvert_exporter": "python", 68 | "pygments_lexer": "ipython3", 69 | "version": "3.7.12" 70 | } 71 | }, 72 | "nbformat": 4, 73 | "nbformat_minor": 5 74 | } 75 | -------------------------------------------------------------------------------- /Scripts/Download_Sat.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 5, 6 | "id": "d2487cc0-7663-46bb-99df-f41fb8b30739", 7 | "metadata": {}, 8 | "outputs": [], 9 | "source": [ 10 | "import s3fs\n", 11 | "import os\n", 12 | "from datetime import datetime\n", 13 | "\n", 14 | "time = datetime(2022,9,23,18)\n", 15 | "\n", 16 | "channel = \"02\"\n", 17 | "\n", 18 | "lab_number = \"10\"\n", 19 | "\n", 20 | "lab_location = f\"/data/AtmSci360/Lab_{lab_number}/\"\n", 21 | "\n", 22 | "os.makedirs(lab_location, exist_ok=\"True\")\n", 23 | "aws = s3fs.S3FileSystem(anon=True)\n", 24 | "files = aws.ls(f'noaa-goes16/ABI-L1b-RadC/{time:%Y}/{time:%j}/{time:%H}/')\n", 25 | "\n", 26 | "channel_files = []\n", 27 | "\n", 28 | "for file in files:\n", 29 | " if file.find(f'M6C{channel}') != -1:\n", 30 | " channel_files.append(file)\n", 31 | "final_file = channel_files[0]\n", 32 | "aws_file = s3fs.S3File(aws, final_file)\n", 33 | "file_name = f\"{lab_location}{time:%m%d%y_%H}_goes16.nc\"\n", 34 | "\n", 35 | "file_data = aws_file.readlines()\n", 36 | "file = open(file_name, 'wb')\n", 37 | "file.writelines(file_data)\n", 38 | "file.close()\n", 39 | "aws_file.close()" 40 | ] 41 | }, 42 | { 43 | "cell_type": "code", 44 | "execution_count": null, 45 | "id": 
"8e009377-d401-47e0-a2a5-eef2c39024b6", 46 | "metadata": {}, 47 | "outputs": [], 48 | "source": [] 49 | }, 50 | { 51 | "cell_type": "code", 52 | "execution_count": null, 53 | "id": "c9dc596e-c9cc-4559-b707-3b0f00548381", 54 | "metadata": {}, 55 | "outputs": [], 56 | "source": [] 57 | } 58 | ], 59 | "metadata": { 60 | "kernelspec": { 61 | "display_name": "Python 3 (ipykernel)", 62 | "language": "python", 63 | "name": "python3" 64 | }, 65 | "language_info": { 66 | "codemirror_mode": { 67 | "name": "ipython", 68 | "version": 3 69 | }, 70 | "file_extension": ".py", 71 | "mimetype": "text/x-python", 72 | "name": "python", 73 | "nbconvert_exporter": "python", 74 | "pygments_lexer": "ipython3", 75 | "version": "3.7.12" 76 | } 77 | }, 78 | "nbformat": 4, 79 | "nbformat_minor": 5 80 | } 81 | --------------------------------------------------------------------------------