├── .gitignore
├── README.md
├── tutorial1
│   ├── 1_3_basic_sine_curve_example.ipynb
│   ├── 1_3_basic_wind_chill_example.ipynb
│   ├── E-AI_Talks_Basics_01_Intro_Environment_First_AI_Example_v2.pdf
│   └── E-AI_Talks_Basics_01_Intro_Environment_First_AI_Example_v2.pptx
├── tutorial2
│   ├── 2_1_gnn_example_1d.ipynb
│   ├── 2_2_encoder_decoder_2d_example.ipynb
│   ├── 2_3_data_assimilation_example.ipynb
│   ├── E-AI_Talks_Basics_02_Dynamics_EnDecoder_Data_Assimilation_v3.pdf
│   └── E-AI_Talks_Basics_02_Dynamics_EnDecoder_Data_Assimilation_v3.pptx
├── tutorial3
│   ├── 3_1_ollama_example_01.ipynb
│   ├── 3_2_transformer_example.ipynb
│   ├── 3_2_transformer_example_01.ipynb
│   ├── 3_3_RAG_example_0.ipynb
│   ├── E-AI_Talks_Basics_03_LLM_Transformer_RAG.pdf
│   └── E-AI_Talks_Basics_03_LLM_Transformer_RAG.pptx
├── tutorial4
│   ├── 4-1#git_demo_store#hooks#post-receive
│   ├── 4-2_provision.eccodes.sh
│   ├── 4_1_1_Mlflow.ipynb
│   ├── 4_1_2_mlflow_server_via_ngrok.ipynb
│   ├── 4_1_3_MLFlow_Application.ipynb
│   ├── E-AI_Talks_Basics_04_MLOps_final.pdf
│   └── E-AI_Talks_Basics_04_MLOps_final_static.pptx
├── tutorial5
│   ├── 1_3_basic_wind_chill_example_with_logging.py
│   ├── E-AI_Talks_Basics_05_MLflow_all.pdf
│   ├── E-AI_Talks_Basics_05_MLflow_all.pptx
│   ├── auth_config.ini
│   ├── mlflow_setup.py
│   └── screen_mlflow.sh
└── tutorial6
    ├── .github
    │   └── workflows
    │       └── some-name.yml
    ├── .gitlab-ci.yml
    ├── .pre-commit-config.yaml
    ├── E-AI_Talks_Basics_06_CICD_final.pdf
    ├── E-AI_Talks_Basics_06_CICD_final.pptx
    ├── hello world.py
    ├── test_example.py
    └── test_pytorch.py
/.gitignore:
--------------------------------------------------------------------------------
1 | .ipynb_checkpoints
2 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # tutorials
2 |
3 | EUMETNET E-AI is a programme on "Artificial Intelligence and Machine Learning for Weather, Climate and Environmental Applications".
4 |
5 | We are a community of weather services in Europe, with many partners from academia, research institutes and industry.
6 |
7 | Targeted tutorials that help our scientists learn about AI techniques and methods are being developed by many of our institutions. EUMETNET collects these tutorials, contributes to some of them, and makes them accessible to our community.
8 |
9 | ## [Tutorial E-AI Basics 1: Intro, Environment, First Example](tutorial1/)
10 | - 1.1 Basic Ideas of AI Techniques
11 | - 1.2 Work Environment
12 | - 1.3 First Example for AI - hands-on
13 |
14 | ## [Tutorial E-AI Basics 2: Dynamics, Downscaling, Data Assimilation Examples](tutorial2/)
15 | - 2.1 Dynamic Prediction by a Graph NN
16 | - 2.2 Data Recovery/Denoising via Encoder-Decoder
17 | - 2.3 AI for Data Assimilation
18 |
19 | ## [Tutorial E-AI Basics 3: LLM Use, Transformer Example, RAG](tutorial3/)
20 | - 3.1 Intro to LLM Use and APIs
21 | - 3.2 Transformer for Language and Images
22 | - 3.3 LLM Retrieval Augmented Generation (RAG)
23 |
24 | ## [Tutorial E-AI Basics 4: "MLOps" - Machine Learning Operations](tutorial4/)
25 | - 4.1 Overview
26 | - 4.2 MLOps in relation to traditional Weather forecasting
27 | - 4.3 Road to MLOps
28 |
29 | ## [Tutorial E-AI Basics 5: MLflow - an open-source platform for managing the machine learning lifecycle](tutorial5/)
30 | - 5.1 Overview - User perspective
31 | - 5.2 Logging to MLflow as a ML software developer
32 | - 5.3 Running MLflow server as a user and as a service
33 |
34 | ## [Tutorial E-AI Basics 6: CI/CD - Continuous Integration and Continuous Deployment of ML codes](tutorial6/)
35 | - 6.1 Overview – What can CI/CD do for you?
36 | - 6.2 Basic tests with Pytest
37 | - 6.3 Setting up a runner
--------------------------------------------------------------------------------
/tutorial1/E-AI_Talks_Basics_01_Intro_Environment_First_AI_Example_v2.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/eumetnet-e-ai/tutorials/8dedbddb429c60fd32cd1024cdc76c9d321ea651/tutorial1/E-AI_Talks_Basics_01_Intro_Environment_First_AI_Example_v2.pdf
--------------------------------------------------------------------------------
/tutorial1/E-AI_Talks_Basics_01_Intro_Environment_First_AI_Example_v2.pptx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/eumetnet-e-ai/tutorials/8dedbddb429c60fd32cd1024cdc76c9d321ea651/tutorial1/E-AI_Talks_Basics_01_Intro_Environment_First_AI_Example_v2.pptx
--------------------------------------------------------------------------------
/tutorial2/E-AI_Talks_Basics_02_Dynamics_EnDecoder_Data_Assimilation_v3.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/eumetnet-e-ai/tutorials/8dedbddb429c60fd32cd1024cdc76c9d321ea651/tutorial2/E-AI_Talks_Basics_02_Dynamics_EnDecoder_Data_Assimilation_v3.pdf
--------------------------------------------------------------------------------
/tutorial2/E-AI_Talks_Basics_02_Dynamics_EnDecoder_Data_Assimilation_v3.pptx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/eumetnet-e-ai/tutorials/8dedbddb429c60fd32cd1024cdc76c9d321ea651/tutorial2/E-AI_Talks_Basics_02_Dynamics_EnDecoder_Data_Assimilation_v3.pptx
--------------------------------------------------------------------------------
/tutorial3/3_1_ollama_example_01.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "id": "c8334f49-aaf1-4468-8d8e-16573c277115",
7 | "metadata": {},
8 | "outputs": [],
9 | "source": [
10 | "import ollama"
11 | ]
12 | },
13 | {
14 | "cell_type": "code",
15 | "execution_count": 2,
16 | "id": "2d001f67-8599-434c-94ad-7bd364520c3f",
17 | "metadata": {},
18 | "outputs": [
19 | {
20 | "name": "stdout",
21 | "output_type": "stream",
22 | "text": [
23 | " Why was the equal sign so humble?\n",
24 | "\n",
25 | "Because it knew it wasn't less than or greater than anyone else! (I know, math jokes can be cheesy, but I hope this one made you smile!)\n"
26 | ]
27 | }
28 | ],
29 | "source": [
30 | "response = ollama.chat(model='mistral',messages=[{'role': 'user', 'content': \n",
31 | " 'tell me a joke involving mathematics'}])\n",
32 | "print(response['message']['content'])"
33 | ]
34 | },
35 | {
36 | "cell_type": "code",
37 | "execution_count": 3,
38 | "id": "410cbb56-2a57-457e-a583-bf7a5f642e85",
39 | "metadata": {},
40 | "outputs": [
41 | {
42 | "name": "stdout",
43 | "output_type": "stream",
44 | "text": [
45 | "\n",
46 | "Why did the meteorologist break up with his girlfriend?\n",
47 | "\n",
48 | "She was always clouding his judgment!\n"
49 | ]
50 | }
51 | ],
52 | "source": [
53 | "response = ollama.chat(model='llama2', \n",
54 | " messages=[{'role': 'user', 'content': 'tell me a joke involving meteorology'}])\n",
55 | "print(response['message']['content'])"
56 | ]
57 | },
58 | {
59 | "cell_type": "code",
60 | "execution_count": 4,
61 | "id": "9f1958dd-eacb-40b3-8d9d-aea8d8a9396b",
62 | "metadata": {},
63 | "outputs": [
64 | {
65 | "name": "stdout",
66 | "output_type": "stream",
67 | "text": [
68 | "\n",
69 | "Sure, here's another one:\n",
70 | "\n",
71 | "Why did the mathematician break up with his girlfriend?\n",
72 | "\n",
73 | "Because she couldn't solve their problems!\n"
74 | ]
75 | }
76 | ],
77 | "source": [
78 | "text2=\"Tell me another joke on mathematicians\"\n",
79 | "response = ollama.chat(model='llama2',messages=[{'role': 'user', 'content': text2}])\n",
80 | "print(response['message']['content'])"
81 | ]
82 | },
83 | {
84 | "cell_type": "code",
85 | "execution_count": 5,
86 | "id": "268c9347-6c60-4bb1-92cb-16d7e4e30f0e",
87 | "metadata": {},
88 | "outputs": [
89 | {
90 | "data": {
91 | "text/html": [
92 | "Response from ollama:"
93 | ],
94 | "text/plain": [
95 | ""
96 | ]
97 | },
98 | "metadata": {},
99 | "output_type": "display_data"
100 | },
101 | {
102 | "name": "stdout",
103 | "output_type": "stream",
104 | "text": [
105 | "{'model': 'llama2', 'created_at': '2024-09-21T11:16:32.247858642Z', 'message': {'role': 'assistant', 'content': \"\\nI'm just an AI, I don't have real-time access to current weather conditions. However, I can suggest some ways for you to find out the current weather in Paris:\\n\\n1. Check online weather websites: Websites such as AccuWeather, Weather.com, or the French meteorological service (Météo-France) provide up-to-date weather forecasts and conditions for cities around the world, including Paris.\\n2. Use a weather app: There are many weather apps available for smartphones and other devices that can provide you with real-time weather information in Paris. Some popular weather apps include Dark Sky, Weather Underground, and The Weather Channel.\\n3. Contact the local tourist office: The Paris Tourist Office (Office du Tourisme de Paris) or your hotel's front desk can provide you with information on the current weather conditions in Paris.\\n4. Watch local news: If you have access to a TV or computer, you can watch local news channels or check their website for weather updates in Paris.\\n\\nRemember, the weather in Paris can be unpredictable, so it's always a good idea to pack layers and be prepared for any conditions.\"}, 'done': True, 'total_duration': 52238710873, 'load_duration': 1566638, 'prompt_eval_count': 15, 'prompt_eval_duration': 2101520000, 'eval_count': 261, 'eval_duration': 50004740000}\n"
106 | ]
107 | },
108 | {
109 | "data": {
110 | "text/html": [
111 | "Ollama:"
112 | ],
113 | "text/plain": [
114 | ""
115 | ]
116 | },
117 | "metadata": {},
118 | "output_type": "display_data"
119 | },
120 | {
121 | "name": "stdout",
122 | "output_type": "stream",
123 | "text": [
124 | "\n",
125 | "I'm just an AI, I don't have real-time access to current weather conditions. However, I can suggest some ways for you to find out the current weather in Paris:\n",
126 | "\n",
127 | "1. Check online weather websites: Websites such as AccuWeather, Weather.com, or the French meteorological service (Météo-France) provide up-to-date weather forecasts and conditions for cities around the world, including Paris.\n",
128 | "2. Use a weather app: There are many weather apps available for smartphones and other devices that can provide you with real-time weather information in Paris. Some popular weather apps include Dark Sky, Weather Underground, and The Weather Channel.\n",
129 | "3. Contact the local tourist office: The Paris Tourist Office (Office du Tourisme de Paris) or your hotel's front desk can provide you with information on the current weather conditions in Paris.\n",
130 | "4. Watch local news: If you have access to a TV or computer, you can watch local news channels or check their website for weather updates in Paris.\n",
131 | "\n",
132 | "Remember, the weather in Paris can be unpredictable, so it's always a good idea to pack layers and be prepared for any conditions.\n"
133 | ]
134 | },
135 | {
136 | "data": {
137 | "text/html": [
138 | "Response from ollama:"
139 | ],
140 | "text/plain": [
141 | ""
142 | ]
143 | },
144 | "metadata": {},
145 | "output_type": "display_data"
146 | },
147 | {
148 | "name": "stdout",
149 | "output_type": "stream",
150 | "text": [
151 | "{'model': 'llama2', 'created_at': '2024-09-21T11:17:29.451397296Z', 'message': {'role': 'assistant', 'content': \"\\nIt's always a good idea to check the weather forecast before heading out, especially in Paris where the weather can be unpredictable. While it's impossible for me to provide you with the exact weather conditions at the moment, I can suggest some general tips on when to bring an umbrella in Paris:\\n\\n1. Spring and Autumn: These are the most likely seasons to experience rain in Paris, so it's a good idea to pack an umbrella during these months (March to May and September to November).\\n2. Summer: While it's less likely to rain in the summer months (June to August), it can still happen, especially in the late afternoon or evening. Bringing an umbrella during this time is a good precautionary measure.\\n3. Winter: Paris can experience occasional snow and rain during the winter months (December to February). While the chances of rain are lower than in other seasons, it's still possible, so it's best to bring an umbrella just in case.\\n\\nRemember, the weather in Paris can be unpredictable, so it's always better to be prepared with a lightweight and compact umbrella that you can easily carry with you.\"}, 'done': True, 'total_duration': 57166794119, 'load_duration': 2490003, 'prompt_eval_count': 27, 'prompt_eval_duration': 3836843000, 'eval_count': 267, 'eval_duration': 53195411000}\n"
152 | ]
153 | },
154 | {
155 | "data": {
156 | "text/html": [
157 | "Ollama:"
158 | ],
159 | "text/plain": [
160 | ""
161 | ]
162 | },
163 | "metadata": {},
164 | "output_type": "display_data"
165 | },
166 | {
167 | "name": "stdout",
168 | "output_type": "stream",
169 | "text": [
170 | "\n",
171 | "It's always a good idea to check the weather forecast before heading out, especially in Paris where the weather can be unpredictable. While it's impossible for me to provide you with the exact weather conditions at the moment, I can suggest some general tips on when to bring an umbrella in Paris:\n",
172 | "\n",
173 | "1. Spring and Autumn: These are the most likely seasons to experience rain in Paris, so it's a good idea to pack an umbrella during these months (March to May and September to November).\n",
174 | "2. Summer: While it's less likely to rain in the summer months (June to August), it can still happen, especially in the late afternoon or evening. Bringing an umbrella during this time is a good precautionary measure.\n",
175 | "3. Winter: Paris can experience occasional snow and rain during the winter months (December to February). While the chances of rain are lower than in other seasons, it's still possible, so it's best to bring an umbrella just in case.\n",
176 | "\n",
177 | "Remember, the weather in Paris can be unpredictable, so it's always better to be prepared with a lightweight and compact umbrella that you can easily carry with you.\n"
178 | ]
179 | }
180 | ],
181 | "source": [
182 | "import ollama\n",
183 | "from IPython.display import display, HTML\n",
184 | "\n",
185 | "# Initialize an empty list to keep track of the conversation\n",
186 | "conversation_history = []\n",
187 | "\n",
188 | "# Function to ask a question and get a response, maintaining context\n",
189 | "def ask_ollama(question, conversation_history):\n",
190 | " # Append the new question to the conversation history\n",
191 | " conversation_history.append({\"role\": \"user\", \"content\": question})\n",
192 | "\n",
193 | " # Send the entire conversation history to ollama\n",
194 | " response = ollama.chat(model='llama2', messages=conversation_history) # Pass the history directly\n",
195 | "\n",
196 | " # Print the response to understand its structure\n",
197 | " display(HTML(\"Response from ollama:\"))\n",
198 | " print(response)\n",
199 | "\n",
200 | " # Extract content from the response\n",
201 | " content = response['message']['content']\n",
202 | "\n",
203 | " # Append ollama's response to the conversation history\n",
204 | " conversation_history.append({\"role\": \"assistant\", \"content\": content})\n",
205 | "\n",
206 | " return content\n",
207 | "\n",
208 | "# Example usage\n",
209 | "question1 = \"What's the weather like today in Paris?\"\n",
210 | "response1 = ask_ollama(question1, conversation_history)\n",
211 | "display(HTML(\"Ollama:\"))\n",
212 | "print(response1)\n",
213 | "\n",
214 | "question2 = \"Should I bring an umbrella?\"\n",
215 | "response2 = ask_ollama(question2, conversation_history)\n",
216 | "display(HTML(\"Ollama:\"))\n",
217 | "print(response2)\n"
218 | ]
219 | },
220 | {
221 | "cell_type": "code",
222 | "execution_count": 6,
223 | "id": "078908d8-64ea-46e1-9d8e-95d6d7e37211",
224 | "metadata": {},
225 | "outputs": [
226 | {
227 | "name": "stdout",
228 | "output_type": "stream",
229 | "text": [
230 | "[{'role': 'user', 'content': \"What's the weather like today in Paris?\"}, {'role': 'assistant', 'content': \"\\nI'm just an AI, I don't have real-time access to current weather conditions. However, I can suggest some ways for you to find out the current weather in Paris:\\n\\n1. Check online weather websites: Websites such as AccuWeather, Weather.com, or the French meteorological service (Météo-France) provide up-to-date weather forecasts and conditions for cities around the world, including Paris.\\n2. Use a weather app: There are many weather apps available for smartphones and other devices that can provide you with real-time weather information in Paris. Some popular weather apps include Dark Sky, Weather Underground, and The Weather Channel.\\n3. Contact the local tourist office: The Paris Tourist Office (Office du Tourisme de Paris) or your hotel's front desk can provide you with information on the current weather conditions in Paris.\\n4. Watch local news: If you have access to a TV or computer, you can watch local news channels or check their website for weather updates in Paris.\\n\\nRemember, the weather in Paris can be unpredictable, so it's always a good idea to pack layers and be prepared for any conditions.\"}, {'role': 'user', 'content': 'Should I bring an umbrella?'}, {'role': 'assistant', 'content': \"\\nIt's always a good idea to check the weather forecast before heading out, especially in Paris where the weather can be unpredictable. While it's impossible for me to provide you with the exact weather conditions at the moment, I can suggest some general tips on when to bring an umbrella in Paris:\\n\\n1. Spring and Autumn: These are the most likely seasons to experience rain in Paris, so it's a good idea to pack an umbrella during these months (March to May and September to November).\\n2. Summer: While it's less likely to rain in the summer months (June to August), it can still happen, especially in the late afternoon or evening. Bringing an umbrella during this time is a good precautionary measure.\\n3. Winter: Paris can experience occasional snow and rain during the winter months (December to February). While the chances of rain are lower than in other seasons, it's still possible, so it's best to bring an umbrella just in case.\\n\\nRemember, the weather in Paris can be unpredictable, so it's always better to be prepared with a lightweight and compact umbrella that you can easily carry with you.\"}]\n"
231 | ]
232 | }
233 | ],
234 | "source": [
235 | "print(conversation_history)"
236 | ]
237 | },
238 | {
239 | "cell_type": "code",
240 | "execution_count": 7,
241 | "id": "bbb3eb4e-1eb2-47e8-83a2-988a83c8a1b0",
242 | "metadata": {},
243 | "outputs": [
244 | {
245 | "data": {
246 | "text/html": [
247 | "Response from ollama:"
248 | ],
249 | "text/plain": [
250 | ""
251 | ]
252 | },
253 | "metadata": {},
254 | "output_type": "display_data"
255 | },
256 | {
257 | "name": "stdout",
258 | "output_type": "stream",
259 | "text": [
260 | "{'model': 'llama2', 'created_at': '2024-09-21T11:15:39.997800641Z', 'message': {'role': 'assistant', 'content': \"\\nSure, here's another one:\\n\\nWhy did the mathematician break up with his girlfriend?\\n\\nBecause she couldn't solve their problems!\"}, 'done': True, 'total_duration': 9132486946, 'load_duration': 2658324, 'prompt_eval_count': 14, 'prompt_eval_duration': 2013336000, 'eval_count': 38, 'eval_duration': 6986761000}\n"
261 | ]
262 | }
263 | ],
264 | "source": [
265 | "from IPython.display import display, HTML\n",
266 | "display(HTML(\"Response from ollama:\"))\n",
267 | "print(response)\n"
268 | ]
269 | }
270 | ],
271 | "metadata": {
272 | "kernelspec": {
273 | "display_name": "Python 3 (ipykernel)",
274 | "language": "python",
275 | "name": "python3"
276 | },
277 | "language_info": {
278 | "codemirror_mode": {
279 | "name": "ipython",
280 | "version": 3
281 | },
282 | "file_extension": ".py",
283 | "mimetype": "text/x-python",
284 | "name": "python",
285 | "nbconvert_exporter": "python",
286 | "pygments_lexer": "ipython3",
287 | "version": "3.10.12"
288 | }
289 | },
290 | "nbformat": 4,
291 | "nbformat_minor": 5
292 | }
293 |
--------------------------------------------------------------------------------
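
Note on the notebook above: it sends the full conversation history with every `ollama.chat` call and prints each answer only once it is complete. Below is a minimal sketch of a streaming variant. It assumes a locally running Ollama server with the `llama2` model pulled, and that the installed `ollama` Python client supports the `stream=True` option of `ollama.chat`; the helper name `ask_ollama_streaming` is illustrative and not part of the repository.

import ollama

conversation_history = []

def ask_ollama_streaming(question, conversation_history, model='llama2'):
    """Send the running conversation to the local Ollama server and print the reply as it streams in."""
    conversation_history.append({'role': 'user', 'content': question})
    # stream=True returns an iterator of partial message chunks instead of one final dict
    answer = ''
    for chunk in ollama.chat(model=model, messages=conversation_history, stream=True):
        piece = chunk['message']['content']
        answer += piece
        print(piece, end='', flush=True)
    print()
    # keep the history in the same format as the non-streaming version above
    conversation_history.append({'role': 'assistant', 'content': answer})
    return answer

# Example usage
ask_ollama_streaming("What's the weather like today in Paris?", conversation_history)

Accumulating the streamed chunks before appending to `conversation_history` keeps the history format identical to the non-streaming notebook cells.
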
/tutorial3/3_2_transformer_example.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 6,
6 | "metadata": {
7 | "executionInfo": {
8 | "elapsed": 2,
9 | "status": "ok",
10 | "timestamp": 1724522549414,
11 | "user": {
12 | "displayName": "Roland Potthast",
13 | "userId": "09141136587533247770"
14 | },
15 | "user_tz": -120
16 | },
17 | "id": "L3rPyneK69Zy"
18 | },
19 | "outputs": [],
20 | "source": [
21 | "import torch\n",
22 | "import torch.nn as nn\n",
23 | "import torch.optim as optim\n",
24 | "from torch.utils.data import Dataset, DataLoader\n",
25 | "import numpy as np\n",
26 | "import re\n"
27 | ]
28 | },
29 | {
30 | "cell_type": "code",
31 | "execution_count": 61,
32 | "metadata": {
33 | "colab": {
34 | "base_uri": "https://localhost:8080/"
35 | },
36 | "executionInfo": {
37 | "elapsed": 237,
38 | "status": "ok",
39 | "timestamp": 1724523434987,
40 | "user": {
41 | "displayName": "Roland Potthast",
42 | "userId": "09141136587533247770"
43 | },
44 | "user_tz": -120
45 | },
46 | "id": "3S3MIiE967Vz",
47 | "outputId": "b9fb14d4-1064-4d07-e60e-1add74b1a4fa"
48 | },
49 | "outputs": [],
50 | "source": [
51 | "# Example Dataset\n",
52 | "sentences = [\n",
53 | " \"The sky is clear, and the sun is shining brightly.\",\n",
54 | " \"Tomorrow's forecast predicts a chance of thunderstorms.\",\n",
55 | " \"The temperature is expected to drop below freezing tonight.\",\n",
56 | " \"The weather is perfect for a day at the beach.\",\n",
57 | " \"Strong winds are causing power outages across the region.\",\n",
58 | " \"A hurricane is approaching the coastline, and residents are advised to evacuate.\",\n",
59 | " \"There is a severe weather warning in effect until midnight.\",\n",
60 | " \"The sunset painted the sky with hues of orange and pink.\",\n",
61 | " \"The heatwave has broken temperature records this year.\",\n",
62 | " \"It's a cloudy day with a chance of light showers in the afternoon.\",\n",
63 | " \"The weather has been unpredictable lately, changing from sunny to rainy within hours.\",\n",
64 | " \"The spring blossoms are early this year due to mild weather.\",\n",
65 | " \"People are enjoying outdoor concerts as the nights get warmer.\",\n",
66 | " \"A warm breeze carried the scent of blooming flowers through the air.\",\n",
67 | " \"A heat advisory has been issued for the upcoming days.\",\n",
68 | " \"The local weather station reported record high temperatures today.\",\n",
69 | " \"A cool breeze is a welcome relief from the afternoon sun.\",\n",
70 | " \"Unexpected weather changes have become a common theme this year.\",\n",
71 | " \"The windchill factor makes it feel much colder outside.\",\n",
72 | "]\n",
73 | "\n",
74 | "# Build vocabulary mapping words to IDs\n",
75 | "def build_vocab(sentences):\n",
76 | " vocab = {\"\": 0, \"\": 1}\n",
77 | " index = 2\n",
78 | " for sentence in sentences:\n",
79 | " for word in sentence.lower().split():\n",
80 | " if word not in vocab:\n",
81 | " vocab[word] = index\n",
82 | " index += 1\n",
83 | " return vocab\n",
84 | "\n",
85 | "vocab = build_vocab(sentences)\n",
86 | "vocab_size = len(vocab)\n",
87 | "padding_idx = vocab[\"\"]"
88 | ]
89 | },
90 | {
91 | "cell_type": "code",
92 | "execution_count": null,
93 | "metadata": {},
94 | "outputs": [],
95 | "source": [
96 | "# Tokenization function\n",
97 | "def tokenize_sentence(sentence, vocab):\n",
98 | " return [vocab.get(word.lower(), vocab[\"\"]) for word in sentence.split()]\n",
99 | "\n",
100 | "# Padding function\n",
101 | "def pad_sequence(seq, max_len, pad_value=0):\n",
102 | " return seq + [pad_value] * (max_len - len(seq)) if len(seq) < max_len else seq[:max_len]\n",
103 | "\n",
104 | "# Dataset class\n",
105 | "class TextDataset(Dataset):\n",
106 | " def __init__(self, sentences, vocab, max_len):\n",
107 | " self.max_len = max_len\n",
108 | " self.vocab = vocab\n",
109 | " self.data = [tokenize_sentence(sentence, vocab) for sentence in sentences]\n",
110 | " \n",
111 | " def __len__(self):\n",
112 | " return len(self.data)\n",
113 | " \n",
114 | " def __getitem__(self, idx):\n",
115 | " seq = self.data[idx]\n",
116 | " x = seq[:-1] # Input sequence\n",
117 | " y = seq[1:] # Target sequence (shifted by one)\n",
118 | " x_padded = pad_sequence(x, self.max_len)\n",
119 | " y_padded = pad_sequence(y, self.max_len)\n",
120 | " return torch.tensor(x_padded, dtype=torch.long), torch.tensor(y_padded, dtype=torch.long)"
121 | ]
122 | },
123 | {
124 | "cell_type": "code",
125 | "execution_count": null,
126 | "metadata": {},
127 | "outputs": [],
128 | "source": [
129 | "# Transformer model components\n",
130 | "class PositionalEncoding(nn.Module):\n",
131 | " def __init__(self, d_model, max_len):\n",
132 | " super(PositionalEncoding, self).__init__()\n",
133 | " pe = torch.zeros(max_len, d_model)\n",
134 | " position = torch.arange(0, max_len).unsqueeze(1).float()\n",
135 | " div_term = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model))\n",
136 | " pe[:, 0::2] = torch.sin(position * div_term) # Even indices\n",
137 | " pe[:, 1::2] = torch.cos(position * div_term) # Odd indices\n",
138 | " self.register_buffer('pe', pe.unsqueeze(0))\n",
139 | "\n",
140 | " def forward(self, x):\n",
141 | " x = x + self.pe[:, :x.size(1)].to(x.device)\n",
142 | " return x"
143 | ]
144 | },
145 | {
146 | "cell_type": "code",
147 | "execution_count": null,
148 | "metadata": {},
149 | "outputs": [],
150 | "source": [
151 | "class TransformerModel(nn.Module):\n",
152 | " def __init__(self, vocab_size, d_model, nhead, num_layers, dim_feedforward, max_len, padding_idx):\n",
153 | " super(TransformerModel, self).__init__()\n",
154 | " self.embedding = nn.Embedding(vocab_size, d_model, padding_idx=padding_idx)\n",
155 | " self.pos_encoder = PositionalEncoding(d_model, max_len)\n",
156 | " encoder_layer = nn.TransformerEncoderLayer(d_model, nhead, dim_feedforward)\n",
157 | " self.transformer_encoder = nn.TransformerEncoder(encoder_layer, num_layers)\n",
158 | " self.fc_out = nn.Linear(d_model, vocab_size)\n",
159 | " self.d_model = d_model\n",
160 | "\n",
161 | " def forward(self, src):\n",
162 | " src_mask = self.generate_square_subsequent_mask(src.size(1)).to(src.device)\n",
163 | " src_pad_mask = (src == padding_idx).to(src.device)\n",
164 | " src = self.embedding(src) * math.sqrt(self.d_model)\n",
165 | " src = self.pos_encoder(src)\n",
166 | " output = self.transformer_encoder(src.transpose(0, 1), mask=src_mask, src_key_padding_mask=src_pad_mask)\n",
167 | " output = self.fc_out(output)\n",
168 | " return output.transpose(0, 1)\n",
169 | "\n",
170 | " def generate_square_subsequent_mask(self, sz):\n",
171 | " mask = torch.triu(torch.ones(sz, sz) * float('-inf'), diagonal=1)\n",
172 | " return mask"
173 | ]
174 | },
175 | {
176 | "cell_type": "code",
177 | "execution_count": 85,
178 | "metadata": {},
179 | "outputs": [
180 | {
181 | "name": "stdout",
182 | "output_type": "stream",
183 | "text": [
184 | "Epoch [5/200], Loss: 4.2328\n",
185 | "Epoch [10/200], Loss: 3.4469\n",
186 | "Epoch [15/200], Loss: 2.7120\n",
187 | "Epoch [20/200], Loss: 2.0709\n",
188 | "Epoch [25/200], Loss: 1.5228\n",
189 | "Epoch [30/200], Loss: 1.1387\n",
190 | "Epoch [35/200], Loss: 0.9051\n",
191 | "Epoch [40/200], Loss: 0.6911\n",
192 | "Epoch [45/200], Loss: 0.5742\n",
193 | "Epoch [50/200], Loss: 0.4863\n",
194 | "Epoch [55/200], Loss: 0.4248\n",
195 | "Epoch [60/200], Loss: 0.3807\n",
196 | "Epoch [65/200], Loss: 0.3357\n",
197 | "Epoch [70/200], Loss: 0.2951\n",
198 | "Epoch [75/200], Loss: 0.2873\n",
199 | "Epoch [80/200], Loss: 0.2627\n",
200 | "Epoch [85/200], Loss: 0.2357\n",
201 | "Epoch [90/200], Loss: 0.2160\n",
202 | "Epoch [95/200], Loss: 0.2161\n",
203 | "Epoch [100/200], Loss: 0.1970\n",
204 | "Epoch [105/200], Loss: 0.2107\n",
205 | "Epoch [110/200], Loss: 0.1925\n",
206 | "Epoch [115/200], Loss: 0.1997\n",
207 | "Epoch [120/200], Loss: 0.1923\n",
208 | "Epoch [125/200], Loss: 0.1806\n",
209 | "Epoch [130/200], Loss: 0.1879\n",
210 | "Epoch [135/200], Loss: 0.2028\n",
211 | "Epoch [140/200], Loss: 0.1772\n",
212 | "Epoch [145/200], Loss: 0.1941\n",
213 | "Epoch [150/200], Loss: 0.1680\n",
214 | "Epoch [155/200], Loss: 0.1955\n",
215 | "Epoch [160/200], Loss: 0.1798\n",
216 | "Epoch [165/200], Loss: 0.1839\n",
217 | "Epoch [170/200], Loss: 0.1731\n",
218 | "Epoch [175/200], Loss: 0.1766\n",
219 | "Epoch [180/200], Loss: 0.1682\n",
220 | "Epoch [185/200], Loss: 0.1688\n",
221 | "Epoch [190/200], Loss: 0.1836\n",
222 | "Epoch [195/200], Loss: 0.1751\n",
223 | "Epoch [200/200], Loss: 0.1599\n"
224 | ]
225 | }
226 | ],
227 | "source": [
228 | "# Hyperparameters\n",
229 | "max_len = 15\n",
230 | "batch_size = 2\n",
231 | "d_model = 64\n",
232 | "nhead = 4\n",
233 | "num_layers = 2\n",
234 | "dim_feedforward = 128\n",
235 | "num_epochs = 200\n",
236 | "\n",
237 | "# Dataset and DataLoader\n",
238 | "dataset = TextDataset(sentences, vocab, max_len)\n",
239 | "dataloader = DataLoader(dataset, batch_size=batch_size, shuffle=True)\n",
240 | "\n",
241 | "# Initialize model, criterion, and optimizer\n",
242 | "model = TransformerModel(vocab_size, d_model, nhead, num_layers, dim_feedforward, max_len, padding_idx)\n",
243 | "criterion = nn.CrossEntropyLoss(ignore_index=padding_idx)\n",
244 | "optimizer = optim.Adam(model.parameters(), lr=0.0005)\n",
245 | "\n",
246 | "# Training loop\n",
247 | "for epoch in range(1, num_epochs + 1):\n",
248 | " model.train()\n",
249 | " total_loss = 0\n",
250 | " for x_batch, y_batch in dataloader:\n",
251 | " optimizer.zero_grad()\n",
252 | " output = model(x_batch)\n",
253 | " output = output.reshape(-1, vocab_size)\n",
254 | " y_batch = y_batch.view(-1)\n",
255 | " loss = criterion(output, y_batch)\n",
256 | " loss.backward()\n",
257 | " optimizer.step()\n",
258 | " total_loss += loss.item()\n",
259 | " avg_loss = total_loss / len(dataloader)\n",
260 | " if (epoch%5==0):\n",
261 | " print(f\"Epoch [{epoch}/{num_epochs}], Loss: {avg_loss:.4f}\")"
262 | ]
263 | },
264 | {
265 | "cell_type": "code",
266 | "execution_count": 90,
267 | "metadata": {},
268 | "outputs": [
269 | {
270 | "name": "stdout",
271 | "output_type": "stream",
272 | "text": [
273 | "\n",
274 | "Generated Text:\n",
275 | "The weather is perfect for a day at the beach.\n",
276 | "\n"
277 | ]
278 | }
279 | ],
280 | "source": [
281 | "# Text generation function\n",
282 | "def generate_text(model, vocab, start_text, max_len):\n",
283 | " model.eval()\n",
284 | " words = start_text.lower().split()\n",
285 | " input_ids = [vocab.get(word, vocab[\"\"]) for word in words]\n",
286 | " generated = words.copy()\n",
287 | " generated[0]=generated[0].capitalize()\n",
288 | " input_seq = torch.tensor([pad_sequence(input_ids, max_len)], dtype=torch.long)\n",
289 | " with torch.no_grad():\n",
290 | " for _ in range(max_len - len(input_ids)):\n",
291 | " output = model(input_seq)\n",
292 | " next_token_logits = output[0, len(generated) - 1, :]\n",
293 | " next_token_id = torch.argmax(next_token_logits).item()\n",
294 | " next_word = [word for word, idx in vocab.items() if idx == next_token_id][0]\n",
295 | " generated.append(next_word)\n",
296 | " input_seq[0, len(generated) - 1] = next_token_id\n",
297 | " if next_token_id == vocab[\"\"] or next_token_id == vocab[\"\"] or any([s in next_word for s in {'.', '!', '?'}]):\n",
298 | " break\n",
299 | " return ' '.join(generated)\n",
300 | "\n",
301 | "# Generate text\n",
302 | "start_text = \"The weather\"\n",
303 | "words=start_text.lower().split()\n",
304 | "generated_text = generate_text(model, vocab, start_text, max_len)\n",
305 | "print(\"\\nGenerated Text:\")\n",
306 | "print(generated_text+\"\\n\")"
307 | ]
308 | }
309 | ],
310 | "metadata": {
311 | "colab": {
312 | "authorship_tag": "ABX9TyOqzrZ2Ox1pYCh9SvUgAsLy",
313 | "provenance": []
314 | },
315 | "kernelspec": {
316 | "display_name": "Python 3 (ipykernel)",
317 | "language": "python",
318 | "name": "python3"
319 | },
320 | "language_info": {
321 | "codemirror_mode": {
322 | "name": "ipython",
323 | "version": 3
324 | },
325 | "file_extension": ".py",
326 | "mimetype": "text/x-python",
327 | "name": "python",
328 | "nbconvert_exporter": "python",
329 | "pygments_lexer": "ipython3",
330 | "version": "3.11.6"
331 | }
332 | },
333 | "nbformat": 4,
334 | "nbformat_minor": 4
335 | }
336 |
--------------------------------------------------------------------------------
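
Note on the notebook above: its generation cell always picks the most probable next word with `torch.argmax`, so the model reproduces a memorised training sentence. A common alternative is to sample from the predicted distribution; the sketch below shows one way to do this with plain PyTorch. The function name `sample_next_token` and the `temperature`/`top_k` parameters are illustrative additions, not part of the notebook, and the snippet assumes `next_token_logits` is the slice `output[0, position, :]` produced by the `TransformerModel` above.

import torch

def sample_next_token(next_token_logits, temperature=0.8, top_k=5):
    """Sample the next token id from the model's output distribution.

    next_token_logits: 1-D tensor of length vocab_size (raw logits).
    temperature < 1 sharpens the distribution (closer to greedy);
    top_k restricts sampling to the k most likely tokens.
    """
    logits = next_token_logits / temperature
    if top_k is not None:
        k = min(top_k, logits.size(-1))
        top_values, top_indices = torch.topk(logits, k)
        probs = torch.softmax(top_values, dim=-1)
        choice = torch.multinomial(probs, num_samples=1)  # index into the top-k set
        return top_indices[choice].item()
    probs = torch.softmax(logits, dim=-1)
    return torch.multinomial(probs, num_samples=1).item()

Replacing the argmax line in `generate_text` with `next_token_id = sample_next_token(next_token_logits)` makes repeated runs produce varied continuations of the same prompt.
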
/tutorial3/3_2_transformer_example_01.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {
6 | "id": "rELGSFIO9r-8"
7 | },
8 | "source": []
9 | },
10 | {
11 | "cell_type": "code",
12 | "execution_count": 1,
13 | "metadata": {
14 | "executionInfo": {
15 | "elapsed": 2,
16 | "status": "ok",
17 | "timestamp": 1724522549414,
18 | "user": {
19 | "displayName": "Roland Potthast",
20 | "userId": "09141136587533247770"
21 | },
22 | "user_tz": -120
23 | },
24 | "id": "L3rPyneK69Zy"
25 | },
26 | "outputs": [],
27 | "source": [
28 | "import torch\n",
29 | "import torch.nn as nn\n",
30 | "import torch.optim as optim\n",
31 | "from torch.utils.data import Dataset, DataLoader"
32 | ]
33 | },
34 | {
35 | "cell_type": "code",
36 | "execution_count": 2,
37 | "metadata": {
38 | "colab": {
39 | "base_uri": "https://localhost:8080/"
40 | },
41 | "executionInfo": {
42 | "elapsed": 237,
43 | "status": "ok",
44 | "timestamp": 1724523434987,
45 | "user": {
46 | "displayName": "Roland Potthast",
47 | "userId": "09141136587533247770"
48 | },
49 | "user_tz": -120
50 | },
51 | "id": "3S3MIiE967Vz",
52 | "outputId": "b9fb14d4-1064-4d07-e60e-1add74b1a4fa"
53 | },
54 | "outputs": [
55 | {
56 | "name": "stdout",
57 | "output_type": "stream",
58 | "text": [
59 | "vocab_size = 56\n"
60 | ]
61 | }
62 | ],
63 | "source": [
64 | "# Define the Vocabulary\n",
65 | "vocab = {\n",
66 | " 0: \"\",\n",
67 | " 1: \"I\", 2: \"am\", 3: \"you\", 4: \"is\", 5: \"we\", 6: \"are\", 7: \"a\", 8: \"an\", 9: \"the\",\n",
68 | " 10: \"simple\", 11: \"example\", 12: \"with\", 13: \"and\", 14: \"but\", 15: \"or\",\n",
69 | " 16: \"not\", 17: \"only\", 18: \"also\", 19: \"how\", 20: \"what\", 21: \"why\", 22: \"can\",\n",
70 | " 23: \"must\", 24: \"should\", 25: \"want\", 26: \"has\", 27: \"have\", 28: \"had\",\n",
71 | " 29: \"to\", 30: \"home\", 31: \"play\", 32: \"in\", 33: \"garden\", 34: \"weather\",\n",
72 | " 35: \"nice\", 36: \"drives\", 37: \"Berlin\", 38: \"reads\", 39: \"book\", 40: \"she\",\n",
73 | " 41: \"he\", 42: \"go\", 43: \"hungry\", 44: \"tired\", 45: \"happy\", 46: \"sad\",\n",
74 | " 47: \"it\", 48: \"good\", 49: \"this\", 50: \"bad\", 51: \"eat\", 52: \"drink\", 53: \"come\",\n",
75 | " 54: \"they\", 55: \"was\"\n",
76 | "}\n",
77 | "vocab_size = len(vocab) # or set it explicitly to the highest index in your vocab dictionary\n",
78 | "print(\"vocab_size = \", vocab_size)\n",
79 | "\n",
80 | "# Example Dataset\n",
81 | "sentences = [\n",
82 | " \"I am hungry\",\n",
83 | " \"you are tired\",\n",
84 | " \"we are happy\",\n",
85 | " \"they are sad\",\n",
86 | " \"it is simple\",\n",
87 | " \"the weather is nice\",\n",
88 | " \"this is bad\",\n",
89 | " \"this was good\",\n",
90 | " \"we want to eat\",\n",
91 | " \"they want to drink\",\n",
92 | " \"you can come\",\n",
93 | " \"we go home\",\n",
94 | " \"they play in the garden\",\n",
95 | " \"the weather is nice\",\n",
96 | " \"he drives to Berlin\",\n",
97 | " \"she reads a book\"\n",
98 | "]"
99 | ]
100 | },
101 | {
102 | "cell_type": "code",
103 | "execution_count": 3,
104 | "metadata": {
105 | "executionInfo": {
106 | "elapsed": 1,
107 | "status": "ok",
108 | "timestamp": 1724523435712,
109 | "user": {
110 | "displayName": "Roland Potthast",
111 | "userId": "09141136587533247770"
112 | },
113 | "user_tz": -120
114 | },
115 | "id": "kyNbO4gzcz2N"
116 | },
117 | "outputs": [],
118 | "source": [
119 | "import torch\n",
120 | "import torch.nn as nn\n",
121 | "import torch.nn.functional as F\n",
122 | "import math\n",
123 | "\n",
124 | "# Define the Positional Encoding\n",
125 | "class PositionalEncoding(nn.Module):\n",
126 | " def __init__(self, d_model, max_len=5000):\n",
127 | " super(PositionalEncoding, self).__init__()\n",
128 | " pe = torch.zeros(max_len, d_model)\n",
129 | " position = torch.arange(0, max_len, dtype=torch.float).unsqueeze(1)\n",
130 | " div_term = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model))\n",
131 | " pe[:, 0::2] = torch.sin(position * div_term)\n",
132 | " pe[:, 1::2] = torch.cos(position * div_term)\n",
133 | " pe = pe.unsqueeze(0).transpose(0, 1)\n",
134 | " self.register_buffer('pe', pe)\n",
135 | "\n",
136 | " def forward(self, x):\n",
137 | " return x + self.pe[:x.size(0), :]\n",
138 | "\n",
139 | "# Define the Self-Attention layer\n",
140 | "class SelfAttention(nn.Module):\n",
141 | " def __init__(self, d_model, num_heads):\n",
142 | " super(SelfAttention, self).__init__()\n",
143 | " assert d_model % num_heads == 0, \"d_model must be divisible by num_heads\"\n",
144 | "\n",
145 | " self.d_k = d_model // num_heads\n",
146 | " self.num_heads = num_heads\n",
147 | "\n",
148 | " self.q_linear = nn.Linear(d_model, d_model)\n",
149 | " self.k_linear = nn.Linear(d_model, d_model)\n",
150 | " self.v_linear = nn.Linear(d_model, d_model)\n",
151 | " self.out_linear = nn.Linear(d_model, d_model)\n",
152 | "\n",
153 | " def forward(self, x):\n",
154 | " batch_size = x.size(0)\n",
155 | "\n",
156 | " # Linear transformation and splitting into heads\n",
157 | " q = self.q_linear(x).view(batch_size, -1, self.num_heads, self.d_k).transpose(1, 2)\n",
158 | " k = self.k_linear(x).view(batch_size, -1, self.num_heads, self.d_k).transpose(1, 2)\n",
159 | " v = self.v_linear(x).view(batch_size, -1, self.num_heads, self.d_k).transpose(1, 2)\n",
160 | "\n",
161 | " # Compute attention\n",
162 | " scores = torch.matmul(q, k.transpose(-2, -1)) / math.sqrt(self.d_k)\n",
163 | " attention = F.softmax(scores, dim=-1)\n",
164 | "\n",
165 | " # Apply attention to the values\n",
166 | " x = torch.matmul(attention, v).transpose(1, 2).contiguous().view(batch_size, -1, self.num_heads * self.d_k)\n",
167 | "\n",
168 | " # Linear transformation of the output\n",
169 | " return self.out_linear(x)\n",
170 | "\n",
171 | "# Define the Feedforward network\n",
172 | "class FeedForward(nn.Module):\n",
173 | " def __init__(self, d_model, d_ff=2048):\n",
174 | " super(FeedForward, self).__init__()\n",
175 | " self.linear1 = nn.Linear(d_model, d_ff)\n",
176 | " self.linear2 = nn.Linear(d_ff, d_model)\n",
177 | "\n",
178 | " def forward(self, x):\n",
179 | " return self.linear2(F.relu(self.linear1(x)))\n",
180 | "\n",
181 | "# Define the Transformer Block\n",
182 | "class TransformerBlock(nn.Module):\n",
183 | " def __init__(self, d_model, num_heads, d_ff):\n",
184 | " super(TransformerBlock, self).__init__()\n",
185 | " self.attention = SelfAttention(d_model, num_heads)\n",
186 | " self.norm1 = nn.LayerNorm(d_model)\n",
187 | " self.norm2 = nn.LayerNorm(d_model)\n",
188 | " self.ff = FeedForward(d_model, d_ff)\n",
189 | "\n",
190 | " def forward(self, x):\n",
191 | " # Self-Attention + Residual Connection + Normalization\n",
192 | " attention_out = self.attention(x)\n",
193 | " x = self.norm1(x + attention_out)\n",
194 | "\n",
195 | " # Feedforward + Residual Connection + Normalization\n",
196 | " ff_out = self.ff(x)\n",
197 | " x = self.norm2(x + ff_out)\n",
198 | "\n",
199 | " return x\n",
200 | "\n",
201 | "# Define the Transformer\n",
202 | "class SimpleTransformer(nn.Module):\n",
203 | " def __init__(self, d_model, num_heads, num_layers, vocab_size, max_len, d_ff=2048):\n",
204 | " super(SimpleTransformer, self).__init__()\n",
205 | " self.embedding = nn.Embedding(vocab_size, d_model)\n",
206 | " self.positional_encoding = PositionalEncoding(d_model, max_len)\n",
207 | " self.layers = nn.ModuleList([TransformerBlock(d_model, num_heads, d_ff) for _ in range(num_layers)])\n",
208 | " self.fc_out = nn.Linear(d_model, vocab_size)\n",
209 | "\n",
210 | " def forward(self, x):\n",
211 | " # Embedding + Positional Encoding\n",
212 | " x = self.embedding(x)\n",
213 | " x = self.positional_encoding(x)\n",
214 | "\n",
215 | " # Pass through the Transformer layers\n",
216 | " for layer in self.layers:\n",
217 | " x = layer(x)\n",
218 | "\n",
219 | " # Output layer\n",
220 | " return self.fc_out(x)\n"
221 | ]
222 | },
223 | {
224 | "cell_type": "code",
225 | "execution_count": 4,
226 | "metadata": {
227 | "executionInfo": {
228 | "elapsed": 1,
229 | "status": "ok",
230 | "timestamp": 1724523436074,
231 | "user": {
232 | "displayName": "Roland Potthast",
233 | "userId": "09141136587533247770"
234 | },
235 | "user_tz": -120
236 | },
237 | "id": "EkZ3iw0mULFn"
238 | },
239 | "outputs": [],
240 | "source": [
241 | "import torch\n",
242 | "import torch.nn as nn\n",
243 | "import torch.optim as optim\n",
244 | "from torch.utils.data import Dataset, DataLoader\n",
245 | "\n",
246 | "# Function for tokenization\n",
247 | "def tokenize_sentence(sentence, vocab):\n",
248 | " return [key for word in sentence.split() for key, value in vocab.items() if value == word]\n",
249 | "\n",
250 | "# Adjusted padding function\n",
251 | "def pad_sequence(seq, max_len, pad_value=0):\n",
252 | " if len(seq) < max_len:\n",
253 | " return seq + [pad_value] * (max_len - len(seq))\n",
254 | " else:\n",
255 | " return seq[:max_len]\n",
256 | "\n",
257 | "class SimpleDataset(Dataset):\n",
258 | " def __init__(self, sentences, vocab, max_len):\n",
259 | " self.sentences = sentences\n",
260 | " self.vocab = vocab\n",
261 | " self.max_len = max_len\n",
262 | " self.data = [tokenize_sentence(sentence, vocab) for sentence in sentences]\n",
263 | "\n",
264 | " def __len__(self):\n",
265 | " return len(self.data)\n",
266 | "\n",
267 | " def __getitem__(self, idx):\n",
268 | " # Get the tokenized and padded sentence\n",
269 | " sequence = self.data[idx]\n",
270 | "\n",
271 | " # Prepare x (all tokens except the last one)\n",
272 | " x = sequence[:-1]\n",
273 | "\n",
274 | " # Prepare y (all tokens except the first one)\n",
275 | " y = sequence\n",
276 | "\n",
277 | " # Ensure both x and y are padded to the same length\n",
278 | " x_padded = pad_sequence(x, self.max_len)\n",
279 | " y_padded = pad_sequence(y, self.max_len)\n",
280 | "\n",
281 | " return torch.tensor(x_padded), torch.tensor(y_padded)\n",
282 | "\n",
283 | "# Dataset and DataLoader Setup\n",
284 | "max_len = 6 # Maximum sequence length\n",
285 | "dataset = SimpleDataset(sentences, vocab, max_len)\n",
286 | "dataloader = DataLoader(dataset, batch_size=6, shuffle=True)\n",
287 | "\n",
288 | "# Model, loss function, and optimizer\n",
289 | "vocab_size = len(vocab) # Adjust to the size of the vocabulary\n",
290 | "d_model = 32 # Smaller model dimension\n",
291 | "num_heads = 2 # Fewer heads in multi-head attention\n",
292 | "num_layers = 2 # Number of Transformer layers\n",
293 | "model = SimpleTransformer(d_model, num_heads, num_layers, vocab_size, max_len)\n",
294 | "\n",
295 | "# Initialize weights\n",
296 | "def initialize_weights(m):\n",
297 | " if isinstance(m, nn.Linear):\n",
298 | " nn.init.xavier_uniform_(m.weight)\n",
299 | " if m.bias is not None:\n",
300 | " nn.init.zeros_(m.bias)\n",
301 | "\n",
302 | "model.apply(initialize_weights)\n",
303 | "\n",
304 | "criterion = nn.CrossEntropyLoss(ignore_index=0) # Ignore padding index\n",
305 | "optimizer = optim.Adam(model.parameters(), lr=0.001) # Reduced learning rate\n"
306 | ]
307 | },
308 | {
309 | "cell_type": "code",
310 | "execution_count": 5,
311 | "metadata": {
312 | "colab": {
313 | "base_uri": "https://localhost:8080/"
314 | },
315 | "executionInfo": {
316 | "elapsed": 4261,
317 | "status": "ok",
318 | "timestamp": 1724523441177,
319 | "user": {
320 | "displayName": "Roland Potthast",
321 | "userId": "09141136587533247770"
322 | },
323 | "user_tz": -120
324 | },
325 | "id": "LZTj5uo34NB_",
326 | "outputId": "a4d5cee5-1273-4c03-e687-17243e884e5e"
327 | },
328 | "outputs": [
329 | {
330 | "name": "stdout",
331 | "output_type": "stream",
332 | "text": [
333 | "Epoch 1/101, Loss: 4.26396385828654\n",
334 | "Epoch 101/101, Loss: 0.025235851605733235\n"
335 | ]
336 | }
337 | ],
338 | "source": [
339 | "# Training loop\n",
340 | "num_epochs = 101 # Fewer epochs\n",
341 | "# Initialize a list to store the loss values\n",
342 | "loss_history = []\n",
343 | "\n",
344 | "n = 0 # Initialize counter\n",
345 | "for epoch in range(num_epochs):\n",
346 | " model.train()\n",
347 | " epoch_loss = 0\n",
348 | "\n",
349 | " for x, y in dataloader:\n",
350 | " optimizer.zero_grad()\n",
351 | " output = model(x)\n",
352 | " loss = criterion(output.view(-1, vocab_size), y.view(-1))\n",
353 | "\n",
354 | " if torch.isnan(loss):\n",
355 | " # NaN detected, stopping training.\n",
356 | " break\n",
357 | "\n",
358 | " loss.backward()\n",
359 | " torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)\n",
360 | " optimizer.step()\n",
361 | " loss_history.append(loss.item())\n",
362 | "\n",
363 | " epoch_loss += loss.item()\n",
364 | "\n",
365 | " if n % 100 == 0:\n",
366 | " print(f\"Epoch {epoch+1}/{num_epochs}, Loss: {epoch_loss/len(dataloader)}\")\n",
367 | " n += 1\n"
368 | ]
369 | },
370 | {
371 | "cell_type": "code",
372 | "execution_count": 6,
373 | "metadata": {
374 | "colab": {
375 | "base_uri": "https://localhost:8080/",
376 | "height": 490
377 | },
378 | "executionInfo": {
379 | "elapsed": 671,
380 | "status": "ok",
381 | "timestamp": 1724523443800,
382 | "user": {
383 | "displayName": "Roland Potthast",
384 | "userId": "09141136587533247770"
385 | },
386 | "user_tz": -120
387 | },
388 | "id": "BWmX5Q7-XMWn",
389 | "outputId": "eca74eda-b7ed-4471-c535-c084e4a9cf9a"
390 | },
391 | "outputs": [
392 | {
393 | "name": "stdout",
394 | "output_type": "stream",
395 | "text": [
396 | "number of steps with loss recorded: 303\n"
397 | ]
398 | },
399 | {
400 | "data": {
401 | "image/png": "iVBORw0KGgoAAAANSUhEUgAAAioAAAHHCAYAAACRAnNyAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjkuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8hTgPZAAAACXBIWXMAAA9hAAAPYQGoP6dpAABMuUlEQVR4nO3dd3hUddo+8PtMzSSTzKT3ECD0EjpGiiBIWQuIBcuriLu6Ku4rP9su66rorqK49r421NcCuCKrqyhFQBBClx5aIIH0OpM29fv7I5khQxIYksmcSXJ/rivXMmfOTJ45G8jt8y1HEkIIEBEREQUghdwFEBEREbWEQYWIiIgCFoMKERERBSwGFSIiIgpYDCpEREQUsBhUiIiIKGAxqBAREVHAYlAhIiKigMWgQkRERAGLQYWI6AIWLlwISZLkLoOoS2JQIQpwS5YsgSRJ2LFjh9yleGXz5s249tprERsbC61Wi9TUVPzxj39ETk6O3KV5SE1NhSRJF/xasmSJ3KUSdWkS7/VDFNiWLFmCuXPnYvv27RgxYoTc5ZzX66+/jgceeAA9evTAHXfcgfj4eBw6dAjvv/8+AOD777/HpZdeKnOV9b755htUVVW5H3///ff44osv8PLLLyMqKsp9/NJLL0VKSgrsdjuCgoLkKJWoS1PJXQARdQ6bN2/G/PnzMXbsWKxatQrBwcHu5+69916MGTMG119/PQ4cOIDw8HC/1VVdXY2QkJAmx2fOnOnxuKCgAF988QVmzpyJ1NTUJuerVPznkkgOHPoh6iR2796N6dOnIywsDHq9HpMmTcLWrVs9zrHZbHjqqafQq1cvBAUFITIyEmPHjsXq1avd5xQUFGDu3LlISkqCVqtFfHw8ZsyYgZMnT573+//973+HJEn4+OOPPUIKAPTs2ROLFy9Gfn4+3n33XQDAP//5T0iShFOnTjV5rwULFkCj0aC8vNx9LDMzE9OmTYPBYEBwcDAuu+wybN682eN1rrkkBw8exC233ILw8HCMHTvWq+t3Ps3NUZEkCffffz+WL1+O/v37Q6fTISMjA/v27QMAvPvuu0hLS0NQUBAmTJjQ7PXz5jMRdXUMKkSdwIEDBzBu3Dj89ttvePTRR/H4448jOzsbEyZMQGZmpvu8hQsX4qmnnsLEiRPxxhtv4LHHHkNKSgp27drlPue6667DihUrMHfuXLz11lv43//9X5jN5vPOMampqcHatWsxbtw4dO/evdlzZs+eDa1Wi++++w4AcOONN0KSJCxbtqzJucuWLcOUKVPcnZd169Zh/PjxMJlMePLJJ/Hss8+ioqICl19+ObZt29bk9TfccANqamrw7LPP4q677vLuIrbCL7/8goceeghz5szBwoULcejQIVx11VV488038dprr+G+++7DI488gi1btuDOO+/0eO3FfiaiLksQUUD76KOPBACxffv2Fs+ZOXOm0Gg04vjx4+5jeXl5IjQ0VIwfP959LD09XVx55ZUtvk95ebkAIF544YWLqnHPnj0CgHjggQfOe97gwYNFRESE+3FGRoYYPny4xznbtm0TAMQnn3wihBDC6XSKXr16ialTpwqn0+k+r6amRnTv3l1cccUV7mNPPvmkACBuvvnmi6pfCCFeeOEFAUBkZ2c3ec71vo0BEFqt1uP8d999VwAQcXFxwmQyuY8vWLDA470v5jMRdXXsqBB1cA6HAz/99BNmzpyJHj16uI/Hx8fjlltuwaZNm2AymQAARqMRBw4cwNGjR5t9L51OB41Gg/Xr13sMu1yI2WwGAISGhp73vNDQUHctQH2XZefOnTh+/Lj72NKlS6HVajFjxgwAwJ49e3D06FHccsstKC0tRUlJCUpKSlBdXY1JkyZh48aNcDqdHt/nnnvu8br2tpg0aZLHfJbRo0cDqO9KNb4WruMnTpwA0LrPRNRVMagQdXDFxcWoqalBnz59mjzXr18/OJ1O5ObmAgCefvppVFRUoHfv3hg0aBAeeeQR7N27132+VqvF888/jx9++AGxsbEYP348Fi9ejIKCgvPW4Pql7AosLTGbzR6/wG+44QYoFAosXboUACCEwPLly91zbQC4Q9WcOXMQHR3t8fX+++/DYrGgsrLS4/u0NPzkaykpKR6PDQYDACA5ObnZ467w15rPRNRVcRo7URcyfvx4HD9+HCtXrsRPP/2E999/Hy+//DLeeecd/OEPfwAAzJ8/H1dffTW++eYb/Pjjj3j88cexaNEirFu3DkOHDm32fdPS0qBSqTxCz7ksFguysrI8llgnJCRg3LhxWLZsGf76179i69atyMnJwfPPP+8+x9VZeOGFFzBkyJBm31uv13s81ul0Xl2PtlIqlRd1XDTsBtGaz0TUVTGoEHVw0dHRCA4ORlZWVpPnDh8+DIVC4fFf+BEREZg7dy7mzp2LqqoqjB8/HgsXLnQHFaB+lc5DDz2Ehx56CEePHsWQIUPw4osv4v/+7/+arSEkJAQTJ07EunXrcOrUKXTr1q3JOcuWLYPFYsFVV13lcXz27Nm47777kJWVhaVLlyI4OBhXX321Ry0AEBYWhsmTJ1/cxQlQnfEzEbUXDv0QdXBKpRJTpkzBypUrPZbAFhYW4vPPP8fYsWPdwyilpaUer9Xr9UhLS4PFYgFQv3qnrq7O45yePXsiNDTUfU5L/va3v0EIgTvuuAO1tbUez2VnZ+PRRx9FfHw8/vjHP3o8d91110GpVOKLL77A8uXLcdVVV3nsezJ8+HD07NkT//znPz02aHMpLi4+b12BqDN+JqL2wo4KUQfx4YcfYtWqVU2OP/DAA/jHP/6B1atXY+zYsbjvvvugUqnw7rvvwmKxYPHixe5z+/fvjwkTJmD48OGIiIjAjh078NVXX+H+++8HABw5cgSTJk3CjTfeiP79+0OlUmHFihUoLCzETTfddN76xo8fj3/+85948MEHMXjwYPfOtIcPH8Z7770Hp9OJ77//vslmbzExMZg4cSJeeuklmM1mzJ492+N5hUKB999/H9OnT8eAAQMwd+5cJCYm4syZM/j5558RFhaGb7/9trWXVRad8TMRtRuZVx0R0QW4lie39JWbmyuEEGLXrl1i6tSpQq/Xi+DgYDFx4kTx66+/erzXP/7xDzFq1ChhNBqFTqcTffv2Fc8884ywWq1CCCFKSkrEvHnzRN++fUVISIgwGAxi9OjRYtmyZV7Xu3HjRjFjxgwRFRUl1Gq1SElJEXfddZc4efJki6957733BAARGhoqamtrmz1n9+7dYtasWSIyMlJotVrRrVs3ceONN4q1a9e6z3EtIy4uLva6XpfWLE+eN2+ex7Hs7Oxml3f//PPPAoBYvnz5RX8moq6O9/ohIiKigMU5KkRERBSwGFSIiIgoYDGoEBERUcBiUCEiIqKAxaBCREREAYtBhYiIiAJWh97wzel0Ii8vD6GhoZAkSe5yiIiIyAtCCJjNZiQkJEChOH/PpEMHlby8vCZ3KSUiIqKOITc3F0lJSec9p0MHFdft4nNzc933MiEiIqLAZjKZkJyc7P4
9fj4dOqi4hnvCwsIYVIiIiDoYb6ZtcDItERERBSwGFSIiIgpYDCpEREQUsBhUiIiIKGAxqBAREVHAYlAhIiKigMWgQkRERAGLQYWIiIgCFoMKERERBSwGFSIiIgpYDCpEREQUsBhUiIiIKGAxqHih1uqQuwQiIqIuiUHlAg7mmZD+1E9YvOqw3KUQERF1OQwqF7D3dAWsDid251TIXQoREVGXw6ByAVUWOwDAYufwDxERkb8xqFxAtaU+oFgdTpkrISIi6noYVC6gxtrQUbExqBAREfkbg8oFuIZ+2FEhIiLyPwaVC6h2BRU7gwoREZG/MahcQFXDHBULgwoREZHfMaichxCCHRUiIiIZqeQuIBD957c8PPb1PozuEXF2Mi2XJxMREfkdOyrN0CgVMFvsKK+xuSfT2hwCTqeQuTIiIqKuhUGlGcZgNQCgvMbq3kcF4MofIiIif2NQaUZ4sAYAUFFjc89RATihloiIyN84R6UZro5KRY3V4zgn1BIREfkXg0ozDLr6oHLulBROqCUiIvIvDv00I0ithE6tbHKcHRUiIiL/YlBpQXjD8E9jnKNCRETkXwwqLTA0TKhtjB0VIiIi/2JQaYFR17SjwuXJRERE/sWg0oLwkGaGfmwMKkRERP7EoNICg66ZoR8HV/0QERH5E4NKC5qdTMuOChERkV8xqLTA2ExQ4RwVIiIi/2JQaYGxmVU/XJ5MRETkXwwqLWhu1Q+DChERkX8xqLQgPIT7qBAREcmNQaUFzXdUuOqHiIjInxhUWtB4jopSIQFgR4WIiMjfGFRaYGjUUQlvCC0MKkRERP7FoNICjUoBvVYFAIho2KWWk2mJiIj8K2CCynPPPQdJkjB//ny5S3FzbaMfGaIFwI4KERGRvwVEUNm+fTveffddDB48WO5SPPxpYi9ck56A0T0iAHAyLRERkb/JHlSqqqpw66234r333kN4eLjc5Xi4cWQyXrt5qHsIiB0VIiIi/5I9qMybNw9XXnklJk+efMFzLRYLTCaTx5c/aFT1l4lzVIiIiPxLJec3//LLL7Fr1y5s377dq/MXLVqEp556qp2rakrbEFTYUSEiIvIv2Toqubm5eOCBB/DZZ58hKCjIq9csWLAAlZWV7q/c3Nx2rrKeq6PCmxISERH5l2wdlZ07d6KoqAjDhg1zH3M4HNi4cSPeeOMNWCwWKJVKj9dotVpotVp/lwpNQx0WG4MKERGRP8kWVCZNmoR9+/Z5HJs7dy769u2LP//5z01CipxcQz8WdlSIiIj8SragEhoaioEDB3ocCwkJQWRkZJPjcnNPprVxeTIREZE/yb7qpyPQco4KERGRLGRd9XOu9evXy11CszRc9UNERCQLdlS8oFU1TKZlUCEiIvIrBhUvsKNCREQkDwYVL7hX/fBeP0RERH7FoOKFxjvTCiFkroaIiKjrYFDxgmvoxykAu5NBhYiIyF8YVLzgmkwLcJ4KERGRPzGoeMHVUQEYVIiIiPyJQcULSoUElUICwCXKRERE/sSg4iUuUSYiIvI/BhUvabhEmYiIyO8YVLx0di8VdlSIiIj8hUHFS2e30WdHhYiIyF8YVLwUrKkPKtUWBhUiIiJ/YVDxkl5bf6Ppaotd5kqIiIi6DgYVL4U0BJUqBhUiIiK/YVDxEjsqRERE/seg4qUQbcMcFSvnqBAREfkLg4qXOPRDRETkfwwqXuLQDxERkf8xqHiJHRUiIiL/Y1Dxkiuo1HAfFSIiIr9hUPGS3j2Zlh0VIiIif2FQ8VKIhkM/RERE/sag4qUQTqYlIiLyOwYVL50NKpyjQkRE5C8MKl5yzVHh0A8REZH/MKh4qfHQjxBC5mqIiIi6BgYVL7mCit0pkFtWi72nK+QtiIiIqAtgUPGSa9UPANz83lbMeHMzcstqZKyIiIio82NQ8ZJSIUGnrp+ncqaiFkIAJ0urZa6KiIioc2NQuQiu4R+XsmqrTJUQERF1DQwqF8G18selosYmUyVERERdA4PKRWBHhYiIyL8YVC7CuUGlooZBhYiIqD0xqFwE/bkdFQ79EBERtSsGlYvAjgoREZF/MahchHMn03KOChERUftiULkIjTd9A4ByBhUiIqJ2xaByEc4d+innHBUiIqJ2xaByEVyTaV3/W2tz4B/fHcTsd7fAYnfIWRoREVGnxKByEVwdld6xeqgUEgDg/U3ZyMwuw9YTZXKWRkRE1CkxqFyES3pEIClch5lDE2EM1ng8F6JRtvAqIiIiai0GlYvQI1qPTX++HLdnpCIiRO3xnMMpZKqKiIio82JQaaXwczoqNgeDChERka8xqLRSk6DidMpUCRERUefFoNJK4SHnBBU7gwoREZGvMai0Uniw5xwVDv0QERH5HoNKK0Wc01Gxc+iHiIjI5xhUWqlPXKjHYyuHfoiIiHyOQaWVxvWKxtqHLsNlvaMBcOiHiIioPTCotEHPaL17O30O/RAREfkeg0obqZT1W+lz6IeIiMj3GFTaSK2sv4R27kxLRETkcwwqbaRu6KhwHxUiIiLfY1BpI1dHxeZgUCEiIvI1BpU2cgcVDv0QERH5HINKG6k49ENERNRuGFTaSMOhHyIionbDoNJGHPohIiJqPwwqbcShHyIiovbDoNJGGu6jQkRE1G4YVNpIpWjYmZZzVIiIiHyOQaWN1KqGOSoc+iEiIvI5BpU24hb6RERE7YdBpY3cW+hz6IeIiMjnZA0qb7/9NgYPHoywsDCEhYUhIyMDP/zwg5wlXTRXR4V3TyYiIvI9WYNKUlISnnvuOezcuRM7duzA5ZdfjhkzZuDAgQNylnVROPRDRETUflRyfvOrr77a4/EzzzyDt99+G1u3bsWAAQNkquricOiHiIio/cgaVBpzOBxYvnw5qqurkZGR0ew5FosFFovF/dhkMvmrvBZx6IeIiKj9yD6Zdt++fdDr9dBqtbjnnnuwYsUK9O/fv9lzFy1aBIPB4P5KTk72c7VNqRQc+iEiImovsgeVPn36YM+ePcjMzMS9996LOXPm4ODBg82eu2DBAlRWVrq/cnNz/VxtUxoVh36IiIjai+xDPxqNBmlpaQCA4cOHY/v27Xj11Vfx7rvvNjlXq9VCq9X6u8Tzck+mdbCjQkRE5Guyd1TO5XQ6PeahBDrX0A+30CciIvI9WTsqCxYswPTp05GSkgKz2YzPP/8c69evx48//ihnWReFQz9ERETtR9agUlRUhNtvvx35+fkwGAwYPHgwfvzxR1xxxRVylnVROPRDRETUfmQNKh988IGc394nVEoO/RAREbWXgJuj0tFwwzciIqL2w6DSRpqGjooQgIN7qRAREfkUg0obuYZ+AHZViIiIfI1BpY1cQz8AgwoREZGvMai0kVrRuKPCoR8iIiJfYlBpI4VCglJxdkJtTmkNXl97FNkl1TJXRkRE1PHJvoV+Z6BWSnA4BV5bexSfZeYAAI4XV+GVm4bKXBkREVHHxo6KD7iGf1whBQDKamxylUNERNRpMKj4gFrV9DJa7Q4ZKiEiIupcGFR8oPHKH5c6G1cAERERtRWDig+oFE0vY52NHRUiIq
[... remaining base64 PNG data omitted: rendered "Loss Over Time" plot produced by the cell below (x-axis: Iterations, y-axis: Loss) ...]",
402 | "text/plain": [
403 | ""
404 | ]
405 | },
406 | "metadata": {},
407 | "output_type": "display_data"
408 | }
409 | ],
410 | "source": [
411 | "import matplotlib.pyplot as plt\n",
412 | "import numpy as np\n",
413 | "\n",
414 | "# Print the shape of the loss history array\n",
415 | "print(\"number of steps with loss recorded:\", np.shape(loss_history)[0])\n",
416 | "\n",
417 | "# Plot the loss history to visualize how the loss changes over time\n",
418 | "plt.plot(loss_history)\n",
419 | "plt.xlabel('Iterations') # X-axis label indicating the number of iterations (batches)\n",
420 | "plt.ylabel('Loss') # Y-axis label indicating the loss value\n",
421 | "plt.title('Loss Over Time') # Title of the plot\n",
422 | "plt.show() # Display the plot\n"
423 | ]
424 | },
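The curve plotted above can be noisy when the loss is recorded once per batch. As an optional reading aid, the raw values can be overlaid with a running mean; the sketch below only assumes the notebook's `loss_history` from the training cell, and the window size of 20 is an arbitrary illustrative choice, not a value taken from the notebook.

```python
import numpy as np
import matplotlib.pyplot as plt

# Overlay the raw per-iteration loss with a simple moving average for readability.
window = 20  # illustrative choice
loss = np.asarray(loss_history, dtype=float)
if len(loss) >= window:
    smoothed = np.convolve(loss, np.ones(window) / window, mode="valid")
    offset = len(loss) - len(smoothed)  # align the smoothed curve with the right edge
else:
    smoothed, offset = loss, 0          # too few points to smooth

plt.plot(loss, alpha=0.3, label="raw loss")
plt.plot(np.arange(len(smoothed)) + offset, smoothed, label=f"moving average (window={window})")
plt.xlabel("Iterations")
plt.ylabel("Loss")
plt.title("Loss Over Time")
plt.legend()
plt.show()
```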
425 | {
426 | "cell_type": "code",
427 | "execution_count": 7,
428 | "metadata": {
429 | "colab": {
430 | "base_uri": "https://localhost:8080/",
431 | "height": 1000
432 | },
433 | "executionInfo": {
434 | "elapsed": 580,
435 | "status": "ok",
436 | "timestamp": 1724524491669,
437 | "user": {
438 | "displayName": "Roland Potthast",
439 | "userId": "09141136587533247770"
440 | },
441 | "user_tz": -120
442 | },
443 | "id": "3ggr248-unsS",
444 | "outputId": "0aeaf569-b9d9-44cc-f636-880dbd2a85f4"
445 | },
446 | "outputs": [
447 | {
448 | "name": "stdout",
449 | "output_type": "stream",
450 | "text": [
451 | "3\n",
452 | "test_sentence: \t \t \t \t \t I am hungry\n",
453 | "test input : tensor([[1, 2, 0, 0, 0, 0]]) : I am \n",
454 | "test_output : tensor([[ 1, 2, 43, 43, 43, 43]]) : I am hungry\n"
455 | ]
456 | },
457 | {
458 | "data": {
459 | "text/html": [
460 | "Result: True"
461 | ],
462 | "text/plain": [
463 | ""
464 | ]
465 | },
466 | "metadata": {},
467 | "output_type": "display_data"
468 | },
469 | {
470 | "name": "stdout",
471 | "output_type": "stream",
472 | "text": [
473 | "3\n",
474 | "test_sentence: \t \t \t \t \t you are tired\n",
475 | "test input : tensor([[3, 6, 0, 0, 0, 0]]) : you are \n",
476 | "test_output : tensor([[ 3, 6, 44, 44, 44, 44]]) : you are tired\n"
477 | ]
478 | },
479 | {
480 | "data": {
481 | "text/html": [
482 | "Result: True"
483 | ],
484 | "text/plain": [
485 | ""
486 | ]
487 | },
488 | "metadata": {},
489 | "output_type": "display_data"
490 | },
491 | {
492 | "name": "stdout",
493 | "output_type": "stream",
494 | "text": [
495 | "3\n",
496 | "test_sentence: \t \t \t \t \t we are happy\n",
497 | "test input : tensor([[5, 6, 0, 0, 0, 0]]) : we are \n",
498 | "test_output : tensor([[ 5, 6, 45, 45, 45, 45]]) : we are happy\n"
499 | ]
500 | },
501 | {
502 | "data": {
503 | "text/html": [
504 | "Result: True"
505 | ],
506 | "text/plain": [
507 | ""
508 | ]
509 | },
510 | "metadata": {},
511 | "output_type": "display_data"
512 | },
513 | {
514 | "name": "stdout",
515 | "output_type": "stream",
516 | "text": [
517 | "3\n",
518 | "test_sentence: \t \t \t \t \t they are sad\n",
519 | "test input : tensor([[54, 6, 0, 0, 0, 0]]) : they are \n",
520 | "test_output : tensor([[54, 6, 46, 46, 46, 46]]) : they are sad\n"
521 | ]
522 | },
523 | {
524 | "data": {
525 | "text/html": [
526 | "Result: True"
527 | ],
528 | "text/plain": [
529 | ""
530 | ]
531 | },
532 | "metadata": {},
533 | "output_type": "display_data"
534 | },
535 | {
536 | "name": "stdout",
537 | "output_type": "stream",
538 | "text": [
539 | "3\n",
540 | "test_sentence: \t \t \t \t \t it is simple\n",
541 | "test input : tensor([[47, 4, 0, 0, 0, 0]]) : it is \n",
542 | "test_output : tensor([[47, 4, 10, 10, 10, 10]]) : it is simple\n"
543 | ]
544 | },
545 | {
546 | "data": {
547 | "text/html": [
548 | "Result: True"
549 | ],
550 | "text/plain": [
551 | ""
552 | ]
553 | },
554 | "metadata": {},
555 | "output_type": "display_data"
556 | },
557 | {
558 | "name": "stdout",
559 | "output_type": "stream",
560 | "text": [
561 | "4\n",
562 | "test_sentence: \t \t \t \t \t the weather is nice\n",
563 | "test input : tensor([[ 9, 34, 4, 0, 0, 0]]) : the weather is \n",
564 | "test_output : tensor([[ 9, 34, 4, 35, 35, 35]]) : the weather is nice\n"
565 | ]
566 | },
567 | {
568 | "data": {
569 | "text/html": [
570 | "Result: True"
571 | ],
572 | "text/plain": [
573 | ""
574 | ]
575 | },
576 | "metadata": {},
577 | "output_type": "display_data"
578 | },
579 | {
580 | "name": "stdout",
581 | "output_type": "stream",
582 | "text": [
583 | "3\n",
584 | "test_sentence: \t \t \t \t \t this is bad\n",
585 | "test input : tensor([[49, 4, 0, 0, 0, 0]]) : this is \n",
586 | "test_output : tensor([[49, 4, 50, 50, 50, 50]]) : this is bad\n"
587 | ]
588 | },
589 | {
590 | "data": {
591 | "text/html": [
592 | "Result: True"
593 | ],
594 | "text/plain": [
595 | ""
596 | ]
597 | },
598 | "metadata": {},
599 | "output_type": "display_data"
600 | },
601 | {
602 | "name": "stdout",
603 | "output_type": "stream",
604 | "text": [
605 | "3\n",
606 | "test_sentence: \t \t \t \t \t this was good\n",
607 | "test input : tensor([[49, 55, 0, 0, 0, 0]]) : this was \n",
608 | "test_output : tensor([[49, 55, 48, 48, 48, 48]]) : this was good\n"
609 | ]
610 | },
611 | {
612 | "data": {
613 | "text/html": [
614 | "Result: True"
615 | ],
616 | "text/plain": [
617 | ""
618 | ]
619 | },
620 | "metadata": {},
621 | "output_type": "display_data"
622 | },
623 | {
624 | "name": "stdout",
625 | "output_type": "stream",
626 | "text": [
627 | "4\n",
628 | "test_sentence: \t \t \t \t \t we want to eat\n",
629 | "test input : tensor([[ 5, 25, 29, 0, 0, 0]]) : we want to \n",
630 | "test_output : tensor([[ 5, 25, 29, 51, 51, 51]]) : we want to eat\n"
631 | ]
632 | },
633 | {
634 | "data": {
635 | "text/html": [
636 | "Result: True"
637 | ],
638 | "text/plain": [
639 | ""
640 | ]
641 | },
642 | "metadata": {},
643 | "output_type": "display_data"
644 | },
645 | {
646 | "name": "stdout",
647 | "output_type": "stream",
648 | "text": [
649 | "4\n",
650 | "test_sentence: \t \t \t \t \t they want to drink\n",
651 | "test input : tensor([[54, 25, 29, 0, 0, 0]]) : they want to \n",
652 | "test_output : tensor([[54, 25, 29, 52, 52, 52]]) : they want to drink\n"
653 | ]
654 | },
655 | {
656 | "data": {
657 | "text/html": [
658 | "Result: True"
659 | ],
660 | "text/plain": [
661 | ""
662 | ]
663 | },
664 | "metadata": {},
665 | "output_type": "display_data"
666 | },
667 | {
668 | "name": "stdout",
669 | "output_type": "stream",
670 | "text": [
671 | "3\n",
672 | "test_sentence: \t \t \t \t \t you can come\n",
673 | "test input : tensor([[ 3, 22, 0, 0, 0, 0]]) : you can \n",
674 | "test_output : tensor([[ 3, 22, 53, 53, 53, 53]]) : you can come\n"
675 | ]
676 | },
677 | {
678 | "data": {
679 | "text/html": [
680 | "Result: True"
681 | ],
682 | "text/plain": [
683 | ""
684 | ]
685 | },
686 | "metadata": {},
687 | "output_type": "display_data"
688 | },
689 | {
690 | "name": "stdout",
691 | "output_type": "stream",
692 | "text": [
693 | "3\n",
694 | "test_sentence: \t \t \t \t \t we go home\n",
695 | "test input : tensor([[ 5, 42, 0, 0, 0, 0]]) : we go \n",
696 | "test_output : tensor([[ 5, 42, 30, 30, 30, 30]]) : we go home\n"
697 | ]
698 | },
699 | {
700 | "data": {
701 | "text/html": [
702 | "Result: True"
703 | ],
704 | "text/plain": [
705 | ""
706 | ]
707 | },
708 | "metadata": {},
709 | "output_type": "display_data"
710 | },
711 | {
712 | "name": "stdout",
713 | "output_type": "stream",
714 | "text": [
715 | "5\n",
716 | "test_sentence: \t \t \t \t \t they play in the garden\n",
717 | "test input : tensor([[54, 31, 32, 9, 0, 0]]) : they play in the \n",
718 | "test_output : tensor([[54, 31, 32, 9, 33, 33]]) : they play in the garden\n"
719 | ]
720 | },
721 | {
722 | "data": {
723 | "text/html": [
724 | "Result: True"
725 | ],
726 | "text/plain": [
727 | ""
728 | ]
729 | },
730 | "metadata": {},
731 | "output_type": "display_data"
732 | },
733 | {
734 | "name": "stdout",
735 | "output_type": "stream",
736 | "text": [
737 | "4\n",
738 | "test_sentence: \t \t \t \t \t the weather is nice\n",
739 | "test input : tensor([[ 9, 34, 4, 0, 0, 0]]) : the weather is \n",
740 | "test_output : tensor([[ 9, 34, 4, 35, 35, 35]]) : the weather is nice\n"
741 | ]
742 | },
743 | {
744 | "data": {
745 | "text/html": [
746 | "Result: True"
747 | ],
748 | "text/plain": [
749 | ""
750 | ]
751 | },
752 | "metadata": {},
753 | "output_type": "display_data"
754 | },
755 | {
756 | "name": "stdout",
757 | "output_type": "stream",
758 | "text": [
759 | "4\n",
760 | "test_sentence: \t \t \t \t \t he drives to Berlin\n",
761 | "test input : tensor([[41, 36, 29, 0, 0, 0]]) : he drives to \n",
762 | "test_output : tensor([[41, 36, 29, 37, 37, 37]]) : he drives to Berlin\n"
763 | ]
764 | },
765 | {
766 | "data": {
767 | "text/html": [
768 | "Result: True"
769 | ],
770 | "text/plain": [
771 | ""
772 | ]
773 | },
774 | "metadata": {},
775 | "output_type": "display_data"
776 | },
777 | {
778 | "name": "stdout",
779 | "output_type": "stream",
780 | "text": [
781 | "4\n",
782 | "test_sentence: \t \t \t \t \t she reads a book\n",
783 | "test input : tensor([[40, 38, 7, 0, 0, 0]]) : she reads a \n",
784 | "test_output : tensor([[40, 38, 7, 39, 39, 39]]) : she reads a book\n"
785 | ]
786 | },
787 | {
788 | "data": {
789 | "text/html": [
790 | "Result: True"
791 | ],
792 | "text/plain": [
793 | ""
794 | ]
795 | },
796 | "metadata": {},
797 | "output_type": "display_data"
798 | }
799 | ],
800 | "source": [
801 | "# ----------------------------------------------------------------------------\n",
802 | "# Testing the model\n",
803 | "# ----------------------------------------------------------------------------\n",
804 | "from IPython.display import HTML, display\n",
805 | "\n",
806 | "# Function to display colored text\n",
807 | "def color_text(text, color):\n",
808 | " display(HTML(f\"<span style='color:{color}'>{text}</span>\"))\n",
809 | "\n",
810 | "model.eval()\n",
811 | "for words in sentences:\n",
812 | " test_sentence = words\n",
813 | " test_tokens = tokenize_sentence(test_sentence, vocab)\n",
814 | " mylen = len(test_tokens)\n",
815 | " print(mylen)\n",
816 | " test_input = torch.tensor(pad_sequence(test_tokens[:-1], max_len))\n",
817 | " test_input = test_input.unsqueeze(0) # Add batch dimension\n",
818 | " output = model(test_input)\n",
819 | " predicted_ids = torch.argmax(output[:mylen], dim=-1)\n",
820 | " #print(\"predicted_ids: \", predicted_ids[:,:mylen])\n",
821 | " predicted_ids2 = predicted_ids[:,:mylen]\n",
822 | " decoded_input = [vocab[id.item()] for id in test_input.squeeze()]\n",
823 | " decoded_output = [vocab[id.item()] for id in predicted_ids2.squeeze()]\n",
824 | " print(\"test_sentence: \\t \\t \\t \\t \\t\", test_sentence)\n",
825 | " print(\"test input :\", test_input, \": \", \" \".join(decoded_input))\n",
826 | " print(\"test_output :\", predicted_ids, \":\", \" \".join(decoded_output))\n",
827 | " success = (\" \".join(decoded_output) == test_sentence)\n",
828 | " result = \"Result: \" + str(success)\n",
829 | " color_text(result,\"green\")"
830 | ]
831 | },
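The loop above always feeds a known training sentence with its last word removed. The same greedy argmax decoding can be packaged as a small helper for trying prompts by hand; this sketch reuses the notebook's `model`, `vocab`, `tokenize_sentence`, `pad_sequence` and `max_len` exactly as they appear in the cell above, and is an illustration rather than part of the original notebook (it also assumes the prompt is shorter than `max_len`).

```python
import torch

def predict_next_word(prompt):
    """Greedy one-step completion: return the model's most likely word after the prompt."""
    tokens = tokenize_sentence(prompt, vocab)                        # words -> vocabulary indices
    x = torch.tensor(pad_sequence(tokens, max_len)).unsqueeze(0)     # pad and add a batch dimension
    model.eval()
    with torch.no_grad():
        logits = model(x)                                            # shape: (1, max_len, vocab_size)
    next_id = torch.argmax(logits[0, len(tokens)], dim=-1).item()    # first padded position after the prompt
    return vocab[next_id]

# Based on the test output above, a training prompt should be completed correctly, e.g.:
# predict_next_word("the weather is")   # -> "nice"
```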
832 | {
833 | "cell_type": "code",
834 | "execution_count": 8,
835 | "metadata": {
836 | "colab": {
837 | "base_uri": "https://localhost:8080/"
838 | },
839 | "executionInfo": {
840 | "elapsed": 249,
841 | "status": "ok",
842 | "timestamp": 1724523453596,
843 | "user": {
844 | "displayName": "Roland Potthast",
845 | "userId": "09141136587533247770"
846 | },
847 | "user_tz": -120
848 | },
849 | "id": "lSqebIN62wQS",
850 | "outputId": "08249938-2836-4714-a22e-2c89fc04898c"
851 | },
852 | "outputs": [
853 | {
854 | "name": "stdout",
855 | "output_type": "stream",
856 | "text": [
857 | "['I am hungry', 'you are tired', 'we are happy', 'they are sad', 'it is simple', 'the weather is nice', 'this is bad', 'this was good', 'we want to eat', 'they want to drink', 'you can come', 'we go home', 'they play in the garden', 'the weather is nice', 'he drives to Berlin', 'she reads a book']\n",
858 | "(tensor([1, 2, 0, 0, 0, 0]), tensor([ 1, 2, 43, 0, 0, 0]))\n",
859 | "(tensor([3, 6, 0, 0, 0, 0]), tensor([ 3, 6, 44, 0, 0, 0]))\n",
860 | "(tensor([5, 6, 0, 0, 0, 0]), tensor([ 5, 6, 45, 0, 0, 0]))\n",
861 | "(tensor([54, 6, 0, 0, 0, 0]), tensor([54, 6, 46, 0, 0, 0]))\n",
862 | "(tensor([47, 4, 0, 0, 0, 0]), tensor([47, 4, 10, 0, 0, 0]))\n",
863 | "(tensor([ 9, 34, 4, 0, 0, 0]), tensor([ 9, 34, 4, 35, 0, 0]))\n",
864 | "(tensor([49, 4, 0, 0, 0, 0]), tensor([49, 4, 50, 0, 0, 0]))\n",
865 | "(tensor([49, 55, 0, 0, 0, 0]), tensor([49, 55, 48, 0, 0, 0]))\n",
866 | "(tensor([ 5, 25, 29, 0, 0, 0]), tensor([ 5, 25, 29, 51, 0, 0]))\n",
867 | "(tensor([54, 25, 29, 0, 0, 0]), tensor([54, 25, 29, 52, 0, 0]))\n",
868 | "(tensor([ 3, 22, 0, 0, 0, 0]), tensor([ 3, 22, 53, 0, 0, 0]))\n",
869 | "(tensor([ 5, 42, 0, 0, 0, 0]), tensor([ 5, 42, 30, 0, 0, 0]))\n",
870 | "(tensor([54, 31, 32, 9, 0, 0]), tensor([54, 31, 32, 9, 33, 0]))\n",
871 | "(tensor([ 9, 34, 4, 0, 0, 0]), tensor([ 9, 34, 4, 35, 0, 0]))\n",
872 | "(tensor([41, 36, 29, 0, 0, 0]), tensor([41, 36, 29, 37, 0, 0]))\n",
873 | "(tensor([40, 38, 7, 0, 0, 0]), tensor([40, 38, 7, 39, 0, 0]))\n",
874 | "-------------------------------------------\n",
875 | "x= tensor([1, 2, 0, 0, 0, 0])\n",
876 | "y= tensor([ 1, 2, 43, 0, 0, 0])\n",
877 | "Data entry 0:\n",
878 | "x = I am \n",
879 | "y = I am hungry \n",
880 | " I am hungry\n",
881 | "\n",
882 | "x= tensor([3, 6, 0, 0, 0, 0])\n",
883 | "y= tensor([ 3, 6, 44, 0, 0, 0])\n",
884 | "Data entry 1:\n",
885 | "x = you are \n",
886 | "y = you are tired \n",
887 | " you are tired\n",
888 | "\n",
889 | "x= tensor([5, 6, 0, 0, 0, 0])\n",
890 | "y= tensor([ 5, 6, 45, 0, 0, 0])\n",
891 | "Data entry 2:\n",
892 | "x = we are \n",
893 | "y = we are happy \n",
894 | " we are happy\n",
895 | "\n",
896 | "x= tensor([54, 6, 0, 0, 0, 0])\n",
897 | "y= tensor([54, 6, 46, 0, 0, 0])\n",
898 | "Data entry 3:\n",
899 | "x = they are \n",
900 | "y = they are sad \n",
901 | " they are sad\n",
902 | "\n",
903 | "x= tensor([47, 4, 0, 0, 0, 0])\n",
904 | "y= tensor([47, 4, 10, 0, 0, 0])\n",
905 | "Data entry 4:\n",
906 | "x = it is \n",
907 | "y = it is simple \n",
908 | " it is simple\n",
909 | "\n",
910 | "x= tensor([ 9, 34, 4, 0, 0, 0])\n",
911 | "y= tensor([ 9, 34, 4, 35, 0, 0])\n",
912 | "Data entry 5:\n",
913 | "x = the weather is \n",
914 | "y = the weather is nice \n",
915 | " the weather is nice\n",
916 | "\n",
917 | "x= tensor([49, 4, 0, 0, 0, 0])\n",
918 | "y= tensor([49, 4, 50, 0, 0, 0])\n",
919 | "Data entry 6:\n",
920 | "x = this is \n",
921 | "y = this is bad \n",
922 | " this is bad\n",
923 | "\n",
924 | "x= tensor([49, 55, 0, 0, 0, 0])\n",
925 | "y= tensor([49, 55, 48, 0, 0, 0])\n",
926 | "Data entry 7:\n",
927 | "x = this was \n",
928 | "y = this was good \n",
929 | " this was good\n",
930 | "\n",
931 | "x= tensor([ 5, 25, 29, 0, 0, 0])\n",
932 | "y= tensor([ 5, 25, 29, 51, 0, 0])\n",
933 | "Data entry 8:\n",
934 | "x = we want to \n",
935 | "y = we want to eat \n",
936 | " we want to eat\n",
937 | "\n",
938 | "x= tensor([54, 25, 29, 0, 0, 0])\n",
939 | "y= tensor([54, 25, 29, 52, 0, 0])\n",
940 | "Data entry 9:\n",
941 | "x = they want to \n",
942 | "y = they want to drink \n",
943 | " they want to drink\n",
944 | "\n",
945 | "x= tensor([ 3, 22, 0, 0, 0, 0])\n",
946 | "y= tensor([ 3, 22, 53, 0, 0, 0])\n",
947 | "Data entry 10:\n",
948 | "x = you can \n",
949 | "y = you can come \n",
950 | " you can come\n",
951 | "\n",
952 | "x= tensor([ 5, 42, 0, 0, 0, 0])\n",
953 | "y= tensor([ 5, 42, 30, 0, 0, 0])\n",
954 | "Data entry 11:\n",
955 | "x = we go \n",
956 | "y = we go home \n",
957 | " we go home\n",
958 | "\n",
959 | "x= tensor([54, 31, 32, 9, 0, 0])\n",
960 | "y= tensor([54, 31, 32, 9, 33, 0])\n",
961 | "Data entry 12:\n",
962 | "x = they play in the \n",
963 | "y = they play in the garden \n",
964 | " they play in the garden\n",
965 | "\n",
966 | "x= tensor([ 9, 34, 4, 0, 0, 0])\n",
967 | "y= tensor([ 9, 34, 4, 35, 0, 0])\n",
968 | "Data entry 13:\n",
969 | "x = the weather is \n",
970 | "y = the weather is nice \n",
971 | " the weather is nice\n",
972 | "\n",
973 | "x= tensor([41, 36, 29, 0, 0, 0])\n",
974 | "y= tensor([41, 36, 29, 37, 0, 0])\n",
975 | "Data entry 14:\n",
976 | "x = he drives to \n",
977 | "y = he drives to Berlin \n",
978 | " he drives to Berlin\n",
979 | "\n",
980 | "x= tensor([40, 38, 7, 0, 0, 0])\n",
981 | "y= tensor([40, 38, 7, 39, 0, 0])\n",
982 | "Data entry 15:\n",
983 | "x = she reads a \n",
984 | "y = she reads a book \n",
985 | " she reads a book\n",
986 | "\n"
987 | ]
988 | }
989 | ],
990 | "source": [
991 | "# Create the dataset\n",
992 | "dataset = SimpleDataset(sentences, vocab, max_len)\n",
993 | "print(sentences)\n",
994 | "for entry in dataset:\n",
995 | " print(entry)\n",
996 | "print(\"-------------------------------------------\")\n",
997 | "# Iterate over the dataset and print the data entries\n",
998 | "for i in range(len(dataset)):\n",
999 | " x, y = dataset[i]\n",
1000 | " print(\"x=\",x)\n",
1001 | " print(\"y=\",y)\n",
1002 | "\n",
1003 | " # Decode x\n",
1004 | " decoded_x = [vocab[token.item()] for token in x]\n",
1005 | " print(f\"Data entry {i}:\")\n",
1006 | " print(f\"x = {' '.join(decoded_x)}\")\n",
1007 | "\n",
1008 | " # Decode y\n",
1009 | " decoded_y = [vocab[token.item()] for token in y]\n",
1010 | " print(f\"y = {' '.join(decoded_y)}\")\n",
1011 | " print(\" \", sentences[i])\n",
1012 | " print()"
1013 | ]
1014 | },
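The printout above suggests the convention used by `SimpleDataset`: `x` is the padded sentence with its last word removed (the prompt) and `y` is the full padded sentence (the target). The check below makes that relationship explicit; it assumes the notebook's `dataset`, `sentences`, `vocab`, `tokenize_sentence`, `pad_sequence` and `max_len`, and is a consistency check added for illustration, not part of the notebook.

```python
import torch

# Check that every dataset entry follows the (prompt, full sentence) convention seen in the printout.
for i, sentence in enumerate(sentences):
    tokens = tokenize_sentence(sentence, vocab)
    expected_x = torch.tensor(pad_sequence(tokens[:-1], max_len))  # sentence without its last word
    expected_y = torch.tensor(pad_sequence(tokens, max_len))       # full padded sentence
    x, y = dataset[i]
    assert torch.equal(x, expected_x), f"unexpected x at index {i}"
    assert torch.equal(y, expected_y), f"unexpected y at index {i}"
print("All dataset entries pair a truncated prompt with the full padded sentence.")
```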
1015 | {
1016 | "cell_type": "code",
1017 | "execution_count": 9,
1018 | "metadata": {
1019 | "colab": {
1020 | "base_uri": "https://localhost:8080/"
1021 | },
1022 | "executionInfo": {
1023 | "elapsed": 234,
1024 | "status": "ok",
1025 | "timestamp": 1724523456945,
1026 | "user": {
1027 | "displayName": "Roland Potthast",
1028 | "userId": "09141136587533247770"
1029 | },
1030 | "user_tz": -120
1031 | },
1032 | "id": "LuEjfplMWLQ_",
1033 | "outputId": "919a8c74-411a-4a0a-b78d-4746f8117ae3"
1034 | },
1035 | "outputs": [
1036 | {
1037 | "name": "stdout",
1038 | "output_type": "stream",
1039 | "text": [
1040 | "0 ) x=\n",
1041 | "\t it is \n",
1042 | "\t they play in the \n",
1043 | "\t they want to \n",
1044 | "\t the weather is \n",
1045 | "\t I am \n",
1046 | "\t the weather is \n",
1047 | " y=\n",
1048 | "\t it is simple \n",
1049 | "\t they play in the garden \n",
1050 | "\t they want to drink \n",
1051 | "\t the weather is nice \n",
1052 | "\t I am hungry \n",
1053 | "\t the weather is nice \n",
1054 | "1 ) x=\n",
1055 | "\t he drives to \n",
1056 | "\t this is \n",
1057 | "\t you are \n",
1058 | "\t we want to \n",
1059 | "\t we are \n",
1060 | "\t we go \n",
1061 | " y=\n",
1062 | "\t he drives to Berlin \n",
1063 | "\t this is bad \n",
1064 | "\t you are tired \n",
1065 | "\t we want to eat \n",
1066 | "\t we are happy \n",
1067 | "\t we go home \n",
1068 | "2 ) x=\n",
1069 | "\t this was \n",
1070 | "\t you can \n",
1071 | "\t they are \n",
1072 | "\t she reads a \n",
1073 | " y=\n",
1074 | "\t this was good \n",
1075 | "\t you can come \n",
1076 | "\t they are sad \n",
1077 | "\t she reads a book \n"
1078 | ]
1079 | }
1080 | ],
1081 | "source": [
1082 | "n = 0\n",
1083 | "for x, y in dataloader:\n",
1084 | " print(n, \") x=\")\n",
1085 | " for seq in x: # Iterate over each sequence in the batch\n",
1086 | " decoded_x = [vocab[token.item()] for token in seq.squeeze()] # Decode the sequence\n",
1087 | " print(\"\\t\", \" \".join(decoded_x)) # Join decoded words into a single string\n",
1088 | "\n",
1089 | " print(\" y=\")\n",
1090 | " for seq in y: # Iterate over each target sequence in the batch\n",
1091 | " decoded_y = [vocab[token.item()] for token in seq.squeeze()] # Decode the sequence\n",
1092 | " print(\"\\t\", \" \".join(decoded_y)) # Join decoded words into a single string\n",
1093 | "\n",
1094 | " n += 1"
1095 | ]
1096 | },
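The batches printed above contain 6, 6 and 4 sentences in an order that differs from the dataset, which is what a shuffled DataLoader with a batch size of 6 would produce for the 16 sentences. The notebook builds `dataloader` in an earlier cell; the snippet below is only an illustration of a construction consistent with that output, not the notebook's own definition.

```python
from torch.utils.data import DataLoader

# Illustrative construction consistent with the printed batches (6 + 6 + 4 items, shuffled order).
example_loader = DataLoader(dataset, batch_size=6, shuffle=True)
for xb, yb in example_loader:
    print(xb.shape, yb.shape)  # full batches: torch.Size([6, 6]) with max_len = 6; last batch: torch.Size([4, 6])
```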
1097 | {
1098 | "cell_type": "code",
1099 | "execution_count": 10,
1100 | "metadata": {
1101 | "colab": {
1102 | "base_uri": "https://localhost:8080/"
1103 | },
1104 | "executionInfo": {
1105 | "elapsed": 252,
1106 | "status": "ok",
1107 | "timestamp": 1724523459979,
1108 | "user": {
1109 | "displayName": "Roland Potthast",
1110 | "userId": "09141136587533247770"
1111 | },
1112 | "user_tz": -120
1113 | },
1114 | "id": "FFxf9X0Yz2tA",
1115 | "outputId": "3e69c262-6e3e-44d9-8730-41d5c1119ab9"
1116 | },
1117 | "outputs": [
1118 | {
1119 | "name": "stdout",
1120 | "output_type": "stream",
1121 | "text": [
1122 | "Vocabulary:\n",
1123 | "1: I 2: am 3: you 4: is 5: we 6: are 7: a 8: an 9: the 10: simple 11: example 12: with 13: and 14: but 15: or 16: not 17: only 18: also 19: how 20: what \n",
1124 | "21: why 22: can 23: must 24: should 25: want 26: has 27: have 28: had 29: to 30: home 31: play 32: in 33: garden 34: weather 35: nice 36: drives 37: Berlin 38: reads 39: book 40: she \n",
1125 | "41: he 42: go 43: hungry 44: tired 45: happy 46: sad 47: it 48: good 49: this 50: bad 51: eat 52: drink 53: come 54: they 55: was \n",
1126 | "\n",
1127 | "Testing tokenization, padding, and decoding:\n",
1128 | "Original Sentence: I am hungry\n",
1129 | "Tokenized: [1, 2, 43]\n",
1130 | "Padded: [1, 2, 43, 0, 0, 0]\n",
1131 | "Decoded: I am hungry \n",
1132 | "---\n",
1133 | "Original Sentence: you are tired\n",
1134 | "Tokenized: [3, 6, 44]\n",
1135 | "Padded: [3, 6, 44, 0, 0, 0]\n",
1136 | "Decoded: you are tired \n",
1137 | "---\n",
1138 | "Original Sentence: we are happy\n",
1139 | "Tokenized: [5, 6, 45]\n",
1140 | "Padded: [5, 6, 45, 0, 0, 0]\n",
1141 | "Decoded: we are happy \n",
1142 | "---\n",
1143 | "Original Sentence: they are sad\n",
1144 | "Tokenized: [54, 6, 46]\n",
1145 | "Padded: [54, 6, 46, 0, 0, 0]\n",
1146 | "Decoded: they are sad \n",
1147 | "---\n",
1148 | "Original Sentence: it is simple\n",
1149 | "Tokenized: [47, 4, 10]\n",
1150 | "Padded: [47, 4, 10, 0, 0, 0]\n",
1151 | "Decoded: it is simple \n",
1152 | "---\n",
1153 | "Original Sentence: the weather is nice\n",
1154 | "Tokenized: [9, 34, 4, 35]\n",
1155 | "Padded: [9, 34, 4, 35, 0, 0]\n",
1156 | "Decoded: the weather is nice \n",
1157 | "---\n",
1158 | "Original Sentence: this is bad\n",
1159 | "Tokenized: [49, 4, 50]\n",
1160 | "Padded: [49, 4, 50, 0, 0, 0]\n",
1161 | "Decoded: this is bad \n",
1162 | "---\n",
1163 | "Original Sentence: this was good\n",
1164 | "Tokenized: [49, 55, 48]\n",
1165 | "Padded: [49, 55, 48, 0, 0, 0]\n",
1166 | "Decoded: this was good \n",
1167 | "---\n",
1168 | "Original Sentence: we want to eat\n",
1169 | "Tokenized: [5, 25, 29, 51]\n",
1170 | "Padded: [5, 25, 29, 51, 0, 0]\n",
1171 | "Decoded: we want to eat \n",
1172 | "---\n",
1173 | "Original Sentence: they want to drink\n",
1174 | "Tokenized: [54, 25, 29, 52]\n",
1175 | "Padded: [54, 25, 29, 52, 0, 0]\n",
1176 | "Decoded: they want to drink \n",
1177 | "---\n",
1178 | "Original Sentence: you can come\n",
1179 | "Tokenized: [3, 22, 53]\n",
1180 | "Padded: [3, 22, 53, 0, 0, 0]\n",
1181 | "Decoded: you can come \n",
1182 | "---\n",
1183 | "Original Sentence: we go home\n",
1184 | "Tokenized: [5, 42, 30]\n",
1185 | "Padded: [5, 42, 30, 0, 0, 0]\n",
1186 | "Decoded: we go home \n",
1187 | "---\n",
1188 | "Original Sentence: they play in the garden\n",
1189 | "Tokenized: [54, 31, 32, 9, 33]\n",
1190 | "Padded: [54, 31, 32, 9, 33, 0]\n",
1191 | "Decoded: they play in the garden \n",
1192 | "---\n",
1193 | "Original Sentence: the weather is nice\n",
1194 | "Tokenized: [9, 34, 4, 35]\n",
1195 | "Padded: [9, 34, 4, 35, 0, 0]\n",
1196 | "Decoded: the weather is nice \n",
1197 | "---\n",
1198 | "Original Sentence: he drives to Berlin\n",
1199 | "Tokenized: [41, 36, 29, 37]\n",
1200 | "Padded: [41, 36, 29, 37, 0, 0]\n",
1201 | "Decoded: he drives to Berlin \n",
1202 | "---\n",
1203 | "Original Sentence: she reads a book\n",
1204 | "Tokenized: [40, 38, 7, 39]\n",
1205 | "Padded: [40, 38, 7, 39, 0, 0]\n",
1206 | "Decoded: she reads a book \n",
1207 | "---\n",
1208 | "Final Data:\n",
1209 | "Sequence 1 \t: [1, 2, 43, 0, 0, 0]\n",
1210 | "\tDecoded : I am hungry \n",
1211 | "\tOriginal: I am hungry\n",
1212 | "Sequence 2 \t: [3, 6, 44, 0, 0, 0]\n",
1213 | "\tDecoded : you are tired \n",
1214 | "\tOriginal: you are tired\n",
1215 | "Sequence 3 \t: [5, 6, 45, 0, 0, 0]\n",
1216 | "\tDecoded : we are happy \n",
1217 | "\tOriginal: we are happy\n",
1218 | "Sequence 4 \t: [54, 6, 46, 0, 0, 0]\n",
1219 | "\tDecoded : they are sad \n",
1220 | "\tOriginal: they are sad\n",
1221 | "Sequence 5 \t: [47, 4, 10, 0, 0, 0]\n",
1222 | "\tDecoded : it is simple \n",
1223 | "\tOriginal: it is simple\n",
1224 | "Sequence 6 \t: [9, 34, 4, 35, 0, 0]\n",
1225 | "\tDecoded : the weather is nice \n",
1226 | "\tOriginal: the weather is nice\n",
1227 | "Sequence 7 \t: [49, 4, 50, 0, 0, 0]\n",
1228 | "\tDecoded : this is bad \n",
1229 | "\tOriginal: this is bad\n",
1230 | "Sequence 8 \t: [49, 55, 48, 0, 0, 0]\n",
1231 | "\tDecoded : this was good \n",
1232 | "\tOriginal: this was good\n",
1233 | "Sequence 9 \t: [5, 25, 29, 51, 0, 0]\n",
1234 | "\tDecoded : we want to eat \n",
1235 | "\tOriginal: we want to eat\n",
1236 | "Sequence 10 \t: [54, 25, 29, 52, 0, 0]\n",
1237 | "\tDecoded : they want to drink \n",
1238 | "\tOriginal: they want to drink\n",
1239 | "Sequence 11 \t: [3, 22, 53, 0, 0, 0]\n",
1240 | "\tDecoded : you can come \n",
1241 | "\tOriginal: you can come\n",
1242 | "Sequence 12 \t: [5, 42, 30, 0, 0, 0]\n",
1243 | "\tDecoded : we go home \n",
1244 | "\tOriginal: we go home\n",
1245 | "Sequence 13 \t: [54, 31, 32, 9, 33, 0]\n",
1246 | "\tDecoded : they play in the garden \n",
1247 | "\tOriginal: they play in the garden\n",
1248 | "Sequence 14 \t: [9, 34, 4, 35, 0, 0]\n",
1249 | "\tDecoded : the weather is nice \n",
1250 | "\tOriginal: the weather is nice\n",
1251 | "Sequence 15 \t: [41, 36, 29, 37, 0, 0]\n",
1252 | "\tDecoded : he drives to Berlin \n",
1253 | "\tOriginal: he drives to Berlin\n",
1254 | "Sequence 16 \t: [40, 38, 7, 39, 0, 0]\n",
1255 | "\tDecoded : she reads a book \n",
1256 | "\tOriginal: she reads a book\n"
1257 | ]
1258 | }
1259 | ],
1260 | "source": [
1261 | "# ------------------------------------------------------------------------------\n",
1262 | "# Testing tokenization, padding, and decoding\n",
1263 | "# ------------------------------------------------------------------------------\n",
1264 | "\n",
1265 | "# Print vocabulary with indices\n",
1266 | "print(\"Vocabulary:\")\n",
1267 | "for jj in range(1, len(vocab)): # Assuming vocab starts from 1\n",
1268 | " print(f\"{jj}: {vocab[jj]}\", end=\" \")\n",
1269 | " if jj % 20 == 0:\n",
1270 | " print()\n",
1271 | "print(\"\\n\")\n",
1272 | "\n",
1273 | "# Tokenize and pad all sentences\n",
1274 | "max_len = 6\n",
1275 | "mydata = [pad_sequence(tokenize_sentence(sentence, vocab), max_len=max_len) for sentence in sentences]\n",
1276 | "\n",
1277 | "# Test tokenization, padding, and decoding for each sentence\n",
1278 | "print(\"Testing tokenization, padding, and decoding:\")\n",
1279 | "for sentence in sentences:\n",
1280 | " print(f\"Original Sentence: {sentence}\")\n",
1281 | "\n",
1282 | " # Tokenization\n",
1283 | " tokenized = tokenize_sentence(sentence, vocab)\n",
1284 | " print(f\"Tokenized: {tokenized}\")\n",
1285 | "\n",
1286 | " # Padding\n",
1287 | " padded = pad_sequence(tokenized, max_len=max_len)\n",
1288 | " print(f\"Padded: {padded}\")\n",
1289 | "\n",
1290 | " # Decoding\n",
1291 | " decoded = [vocab[id] for id in padded]\n",
1292 | " print(f\"Decoded: {' '.join(decoded)}\")\n",
1293 | " print(\"---\")\n",
1294 | "\n",
1295 | "# Iterate over tokenized and padded sequences\n",
1296 | "print(\"Final Data:\")\n",
1297 | "for i, seq in enumerate(mydata):\n",
1298 | " seq_word = [vocab[jj] for jj in seq] # Decode the sequence\n",
1299 | " print(f\"Sequence {i+1} \\t: {seq}\")\n",
1300 | " print(f\"\\tDecoded : {' '.join(seq_word)}\")\n",
1301 | " print(f\"\\tOriginal: {sentences[i]}\")\n"
1302 | ]
1303 | },
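The cell above exercises the notebook's `tokenize_sentence` and `pad_sequence` helpers on the full sentence list. For readers who want to experiment with the round trip in isolation, the snippet below is a self-contained miniature that only mimics the behaviour visible in the printout (index 0 acts as the padding token and decodes to an empty string); it does not reproduce the notebook's actual helper definitions.

```python
# Miniature, self-contained version of the tokenize -> pad -> decode round trip shown above.
toy_vocab = ["", "I", "am", "hungry"]                  # index 0 is the padding token, as in the notebook
word_to_id = {word: idx for idx, word in enumerate(toy_vocab)}

def toy_tokenize(sentence):
    return [word_to_id[w] for w in sentence.split()]

def toy_pad(tokens, max_len):
    return tokens + [0] * (max_len - len(tokens))       # right-pad with zeros up to max_len

tokens = toy_tokenize("I am hungry")                    # -> [1, 2, 3]
padded = toy_pad(tokens, 6)                             # -> [1, 2, 3, 0, 0, 0]
decoded = " ".join(toy_vocab[i] for i in padded)        # padding decodes to empty words
print(tokens, padded, repr(decoded))
```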
1304 | {
1305 | "cell_type": "code",
1306 | "execution_count": 11,
1307 | "metadata": {
1308 | "executionInfo": {
1309 | "elapsed": 1,
1310 | "status": "aborted",
1311 | "timestamp": 1724522516517,
1312 | "user": {
1313 | "displayName": "Roland Potthast",
1314 | "userId": "09141136587533247770"
1315 | },
1316 | "user_tz": -120
1317 | },
1318 | "id": "F2G0eUC24-GM"
1319 | },
1320 | "outputs": [
1321 | {
1322 | "data": {
1323 | "image/png": "[... base64 PNG data omitted (matplotlib figure output) ...
1fvty2z2q1avny5XZV2dxkZGRo69atqlSpkiQpLCxMQUFBdn2mpKRo3bp1ee7T3ZiKDAAAAACZshZ1cuV4jkpISFBcXJwaN26sJk2aaMKECUpLS1PPnj0lST169FDlypVtC1CNGjVKd9xxh2rUqKHk5GS98cYb+uuvv9SnTx9Jl1dMHjRokP7zn//olltuUVhYmF555RUFBwerc+fOTnuvhYnEFgAAAABM5JFHHtHx48c1bNgwJSUlqUGDBlqyZIlt8acDBw7Iw+P/J+f+888/6tu3r5KSklS2bFk1atRIP/30kyIiImxtXnjhBaWlpalfv35KTk7WXXfdpSVLlsjHx8fl7y8/SGwBAAAAIJPV8JDVcN0Vm/kda8CAARowYECOz61cudLu8fjx4zV+/Phc+7NYLBo1apRGjRqVr3jcjWtsAQAAAACmRsUWAAAAADKZ4RpbZEfFFgAAAABgaiS2AAAAAABTYyoyAAAAAGSyyiKrXDgV2YVjFWVUbAEAAAAApkbFFgAAAAAysXiUOVGxBQAAAACYGoktAAAAAMDUmIoMAAAAAJmYimxOVGwBAAAAAKZGxRYAAAAAMlGxNScqtgAAAAAAU6NiCwAAAACZqNiaExVbAAAAAICpkdgCAAAAAEyNqcgAAAAAkMmQZJXrpgcbLhupaKNiCwAAAAAwNSq2AAAAAJCJxaPMiYotAAAAAMDUSGwBAAAAAKbGVGQAAAAAyMRUZHOiYgsAAAAAMDUqtgAAAACQiYqtOVGxBQAAAACYGhVbAAAAAMhExdacqNgCAAAAAEyNxBYAAAAAYGpMRQYAAACATIZhkeHC6cGuHKsoo2ILAAAAADA1KrYAAAAAkMkqi6xy4eJRLhyrKKNiCwAAAAAwNRJbAAAAAICpMRUZAAAAADJxH1tzomILAAAAADA1KrYAAAAAkInb/ZgTFVsAAAAAgKlRsQUAAACATFxja05UbAEAAAAApnbDJLavvfaaLBaLBg0a5O5QAAAAAAAmckNMRd6wYYPeffddRUZGujsUAAAAADcxFo8yJ7dXbFNTU9W9e3e9//77Klu2rLvDAQAAAACYjNsT2/79+6tDhw6KiYm5btv09HSlpKTYbQAAAADgLEbm4lGu2qjYOodbpyLPnj1bmzZt0oYNG/LUPjExUSNHjsy2f2eCrzxK+Dg7PN0S/4vT+7xS2xIXCq3vwTElCq3vzRec/1lnaVNlR6H1LUlbz1QptL5jK/1eaH2fOluy0PoubSncHwMXzxde/8UtnoXWt3Gp8H7v51HYv1O0Fm73hcViuDsCAABgVm6r2B48eFDPPPOMZs6cKR+fvCVKQ4cO1enTp23bwYMHCzlKAAAAAMCNzm0V240bN+rYsWO67bbbbPsyMjL0ww8/6J133lF6ero8Pe2rMd7e3vL29nZ1qAAAAABuEoYkw4WziJiw5BxuS2xbtWqlrVu32u3r2bOnatWqpRdffDFbUgsAAAAAQE7cltiWKVNGdevWtdtXqlQplStXLtt+AAAAAHAFqyyyyHULOlldOFZR5vZVkQEAAAAAKAi3rop8tZUrV7o7BAAAAAA3McPFt+Dhdj/OQcUWAAAAAGBqJLYAAAAAAFO7oaYiAwAAAIA7WQ2LLC6cHmxlKrJTULEFAAAAAJgaFVsAAAAAyGQYlzdXjoeCo2ILAAAAADA1ElsAAAAAgKkxFRkAAAAAMnEfW3OiYgsAAAAAMDUqtgAAAACQiYqtOVGxBQAAAACYGoktAAAAAMDUmIoMAAAAAJmshkUWF04PtjIV2Smo2AIAAAAATI2KLQAAAABkMozLmyvHQ8FRsQUAAAAAmBqJLQAAAABkulyxtbhwy1+ckyZNUmhoqHx8fBQVFaX169dfs+3777+vZs2aqWzZsipbtqxiYmKytY+Pj5fFYrHb2rZtm7/g3IDEFgAAAABMZM6cOUpISNDw4cO1adMm1a9fX7GxsTp27FiO7VeuXKlHH31U33//vdauXauqVauqTZs2OnTokF27tm3b6siRI7bt008/dcXbcQoSWwAAAAAwkXHjxqlv377q2bOnIiIiNHXqVJUsWVLTpk3Lsf3MmTP11FNPqUGDBqpVq5Y++OADWa1WLV++3K6dt7e3goKCbFvZsmVd8XacgsQWAAAAADK5dhry5U2SUlJS7Lb09PQc47tw4YI2btyomJgY2z4PDw/FxMRo7dq1eXqPZ8+e1cWLFxUQEGC3f+XKlQoMDFTNmjX15JNP6uTJk/n8FF2PxBYAAAAA3Kxq1ary8/OzbYmJiTm2O3HihDIyMlSxYkW7/RUrVlRSUlKexnrxxRcVHBxslxy3bdtWH3/8sZYvX66xY8dq1apVateunTIyMvL/plyI2/0AAAAAQCYjc3PleJJ08OBB+fr62vZ7e3sXynivvfaaZs+erZUrV8rHx8e2v2vXrra/16tXT5GRkQoPD9fKlSvVqlWrQonFmajYAgAAAICb+fr62m3XSmzLly8vT09PHT161G7/0aNHFRQUlOsY//3vf/Xaa6/p22+/VWRkZK5tq1evrvLly2v37t2OvRE3IbEFAAAAAJPw8vJSo0aN7BZ+yloIKjo6+pqve/311zV69GgtWbJEjRs3vu44f//9t06ePKlKlSo5Je7CxlRkAAAAAMh05YJOrhrPUQkJCYqLi1Pjxo3VpEkTTZgwQWlpaerZs6ckqUePHqpcubLtOt2xY8dq2LBhmjVrlkJDQ23X4pYuXVqlS5dWamqqRo4cqS5duigoKEh79uzRCy+8oBo1aig2NtZ5b7YQkdgCAAAAgIk88sgjOn78uIYNG6akpCQ1aNBAS5YssS0odeDAAXl4/P/k3ClTpujChQt68MEH7foZPny4RowYIU9PT/3222+aMWOGkpOTFRwcrDZt2mj06NGFdq2vs+Ursf3f//6nqVOnat++fVq7dq1CQkI0YcIEhYWFqVOnTs6OEQAAAABcw12rRzlowIABGjBgQI7PrVy50u7x/v37c+2rRIkSWrp0af4CuUE4fI3tlClTlJCQoPbt2ys5Odm2/LO/v78mTJjg7PgAAAAAAMiVw4ntxIkT9f777+vf//63PD09bfsbN26srVu3OjU4AAAAAHCpzGtsXbXJhdfzFmUOJ7b79u1Tw4YNs+339vZWWlqaU4ICAAAAACCvHE5sw8LCtGXLlmz7lyxZotq1azsjJgAAAAAA8szhxaMSEhLUv39/nT9/XoZhaP369fr000+VmJioDz74oDBiBAAAAACXMIzLmyvHQ8E5nNj26dNHJUqU0Msvv6yzZ8+qW7duCg4O1ltvvaWuXbsWRowAAAAAAFxTvm730717d3Xv3l1nz55VamqqAgMDnR0XAAAAALicbVEnF46HgnM4sd23b58uXbqkW265RSVLllTJkiUlSbt27VLx4sUVGhrq7BgBAAAAALgmhxePio+P108//ZRt/7p16xQfH++MmAAAAAAAyDOHE9vNmzeradOm2fbfcccdOa6WDAAAAACmkXVvWVduKDCHE1uLxaIzZ85k23/69GllZGQ4JSgAAAAAA
PLK4cS2efPmSkxMtEtiMzIylJiYqLvuusupwQEAAACAK2Xd7seVGwrO4cWjxo4dq+bNm6tmzZpq1qyZJGn16tVKSUnRihUrnB4gAAAAAAC5cbhiGxERod9++00PP/ywjh07pjNnzqhHjx7asWOH6tatWxgxAgAAAIBrGG7YUGD5uo9tcHCwxowZ4+xYAAAAAABwWL4S2+TkZK1fv17Hjh2T1Wq1e65Hjx5OCQwAAAAAgLxwOLFdtGiRunfvrtTUVPn6+spi+f/lqS0WC4ktAAAAANMyDIsMF96Cx5VjFWUOX2P77LPPqlevXkpNTVVycrL++ecf23bq1KnCiBEAAAAAgGtyOLE9dOiQBg4cqJIlSxZGPAAAAADgXiwcVehWr16txx57TNHR0Tp06JAk6X//+5/WrFmTr/4cTmxjY2P1yy+/5GswAAAAAMDNbd68eYqNjVWJEiW0efNmpaenS5JOnz6d70WKHb7GtkOHDnr++ef1xx9/qF69eipevLjd8/fdd1++AgEAAAAAFH3/+c9/NHXqVPXo0UOzZ8+27W/atKn+85//5KtPhxPbvn37SpJGjRqV7TmLxaKMjIx8BQIAAAAA7sbiUYVv586dat68ebb9fn5+Sk5OzlefDk9Ftlqt19xIagEAAAAAuQkKCtLu3buz7V+zZo2qV6+erz4dTmyvdP78+YK8HAAAAABuLK5cOOomXUCqb9++euaZZ7Ru3TpZLBYdPnxYM2fO1HPPPacnn3wyX306PBU5IyNDY8aM0dSpU3X06FH9+eefql69ul555RWFhoaqd+/e+QoEAAAAAFD0DRkyRFarVa1atdLZs2fVvHlzeXt767nnntPTTz+drz4drti++uqrmj59ul5//XV5eXnZ9tetW1cffPBBvoIAAAAAgBuDxQ3bzcVisejf//63Tp06pW3btunnn3/W8ePHNXr06Hz36XBi+/HHH+u9995T9+7d5enpadtfv3597dixI9+BAAAAAABuHl5eXoqIiFCTJk1UunTpAvXl8FTkQ4cOqUaNGtn2W61WXbx4sUDBAAAAAACKtrvvvlsWy7Ur1StWrHC4T4cT24iICK1evVohISF2++fOnauGDRs6HAAAAAAA3DBcvaDTTbh4VIMGDeweX7x4UVu2bNG2bdsUFxeXrz4dTmyHDRumuLg4HTp0SFarVfPnz9fOnTv18ccf66uvvspXEAAAAACAm8P48eNz3D9ixAilpqbmq0+Hr7Ht1KmTFi1apO+++06lSpXSsGHDtH37di1atEitW7fOVxAAAAAAcEPgdj9u89hjj2natGn5eq3DFVtJatasmZYtW5avAQEAAAAAuNratWvl4+OTr9fmK7EFAAAAACA/HnjgAbvHhmHoyJEj+uWXX/TKK6/kq0+HE1sPD49cV7DKyMjIVyAAAAAA4HaG5fLmyvFuMn5+fnaPPTw8VLNmTY0aNUpt2rTJV58OJ7YLFiywe3zx4kVt3rxZM2bM0MiRI/MVBAAAAADg5vDRRx85vU+HE9tOnTpl2/fggw+qTp06mjNnjnr37u2UwAAAAADA1Qzj8ubK8VBwTrvG9o477lC/fv2c1R0AAAAAoIgoW7Zsrpe0XunUqVMO9++UxPbcuXN6++23VblyZWd0BwAAAADu4epb8NwkFdsJEyYUav8OJ7ZXZ9qGYejMmTMqWbKkPvnkE6cGBwAAAAAwv7i4uELt3+HEdvz48XaJrYeHhypUqKCoqCiVLVvWqcEBAAAAAIqu8+fP68KFC3b7fH19He7H4cQ2Pj7e4UEAAAAAwBS43U+hS0tL04svvqjPPvtMJ0+ezPZ8fm4h63Bi+9tvv+W5bWRkpKPdAwAAAACKsBdeeEHff/+9pkyZoscff1yTJk3SoUOH9O677+q1117LV58OJ7YNGjS47mpWhmHIYrHkK9MGAAAAAHexGJc3V453s1m0aJE+/vhjtWzZUj179lSzZs1Uo0YNhYSEaObMmerevbvDfXo4+oL58+crLCxMkydP1ubNm7V582ZNnjxZ4eHhmjdvnvbu3at9+/Zp7969DgcDAAAAACjaTp06perVq0u6fD1t1u197rrrLv3www/56tPhiu2YMWP09ttvq3379rZ9kZGRqlq1ql555RVt3LgxX4EAAAAAAIq+6tWra9++fapWrZpq1aqlzz77TE2aNNGiRYvk7++frz4drthu3bpVYWFh2faHhYXpjz/+yFcQAAAAAHBDMNyw3WR69uypX3/9VZI0ZMgQTZo0ST4+Pho8eLCef/75fPXpcMW2du3aSkxM1AcffCAvLy9J0oULF5SYmKjatWvnKwgAAAAAQNH23HPPqU+fPho8eLBtX0xMjHbs2KGNGzeqRo0a+V6A2OHEdurUqerYsaOqVKliG/S3336TxWLRokWL8hUEAAAAANwQuN1Pofniiy80fvx4RUVFqU+fPnrkkUdUqlQphYSEKCQkpEB9OzwVuUmTJtq7d6/+85//KDIyUpGRkXr11Ve1d+9eNWnSpEDBAAAAAACKpl27dun777/XrbfeqmeeeUZBQUHq1auXfvrppwL37XDFVpJKlSqlfv36FXhwAAAAALihuPq615vsGtvmzZurefPmmjRpkubMmaOPPvpId911l2rWrKnevXvr8ccfV8WKFR3u1+GKrST973//01133aXg4GD99ddfkqTx48friy++yE93AAAAAICbSKlSpdSrVy+tXr1af/75px544AElJiaqWrVq+erP4cR2ypQpSkhIULt27fTPP/8oIyNDklS2bFlNmDAhX0EAAAAAAG4+aWlpWr16tVatWqV//vnHdn9bRzmc2E6cOFHvv/++/v3vf6tYsf+fydy4cWNt3bo1X0EAAAAAwA2B2/24xJo1a9SrVy9VqlRJAwcO1K233qrVq1dr+/bt+erP4Wts9+3bp4YNG2bb7+3trbS0tHwFAQAAAAAo2o4cOaIZM2Zo+vTp+vPPP3XHHXdo3Lhx6tq1q0qXLl2gvh1ObMPCwrRly5ZsyzEvWbKE+9gCAAAAMDcWjyo0VatWVbly5fT444+rd+/eTs0fHU5sExIS1L9/f50/f16GYWj9+vX69NNPlZiYqA8++MBpgQEAAAAAio7PPvtM9913n90lrc7icI99+vRRiRIl9PLLL+vs2bPq1q2bgoOD9dZbb6lr165ODxAAAAAAYH4PPPBAofWdr1S5e/fu6t69u86ePavU1FQFBgY6Oy4AAAAAcD3Dcnlz5XgosHzdxzZLyZIltX37di1evFj//POPs2ICAAAAACDP8lyxHTt2rFJTUzV69GhJkmEYateunb799ltJUmBgoJYvX646deoUTqQAAAAAUMgsxuXNleOh4PJcsZ0zZ47q1q1rezx37lz98MMPWr16tU6cOKHGjRtr5MiRhRIkAAAAAADXkueK7b59+xQZGWl7/M033+jBBx9U06ZNJUkvv/yyHnroIedHCAAAAAAoMu6//35ZLNmvLbZYLPLx8VGNGjXUrVs31axZM8995rlie+nSJXl7e9ser127VnfeeaftcXBwsE6c
OJHngQEAAADghmO4YbvJ+Pn5acWKFdq0aZMsFossFos2b96sFStW6NKlS5ozZ47q16+vH3/8Mc995jmxDQ8P1w8//CBJOnDggP788081b97c9vzff/+tcuXKOfB2AAAAAAA3m6CgIHXr1k179+7VvHnzNG/ePO3Zs0ePPfaYwsPDtX37dsXFxenFF1/Mc595norcv39/DRgwQKtXr9bPP/+s6OhoRURE2J5fsWKFGjZs6Ng7AgAAAADcVD788EP9+OOP8vD4/zqrh4eHnn76ad15550aM2aMBgwYoGbNmuW5zzxXbPv27au3335bp06dUvPmzTVv3jy75w8fPqxevXrleWBJmjJliiIjI+Xr6ytfX19FR0dr8eLFDvUBAAAAADCPS5cuaceOHdn279ixQxkZGZIkHx+fHK/DvRaH7mPbq1cvLViwQFOmTFFQUJDdc5MnT9b999/vSHeqUqWKXnvtNW3cuFG//PKL7rnnHnXq1Em///67Q/0AAAAAgDNY9P+3/HHJls84J02apNDQUPn4+CgqKkrr16/Ptf3nn3+uWrVqycfHR/Xq1dM333xj97xhGBo2bJgqVaqkEiVKKCYmRrt27cpndLl7/PHH1bt3b40fP15r1qzRmjVrNH78ePXu3Vs9evSQJK1atcqhW8k6lNg6W8eOHdW+fXvdcsstuvXWW/Xqq6+qdOnS+vnnn90ZFgAAAADcsObMmaOEhAQNHz5cmzZtUv369RUbG6tjx47l2P6nn37So48+qt69e2vz5s3q3LmzOnfurG3bttnavP7663r77bc1depUrVu3TqVKlVJsbKzOnz/v9PjHjx+vQYMG6fXXX1fz5s3VvHlzvf766xo8eLDGjRsnSWrTpo1mz56d5z7dmtheKSMjQ7Nnz1ZaWpqio6NzbJOenq6UlBS7DQAAAABuJuPGjVPfvn3Vs2dPRUREaOrUqSpZsqSmTZuWY/u33npLbdu21fPPP6/atWtr9OjRuu222/TOO+9IulytnTBhgl5++WV16tRJkZGR+vjjj3X48GEtXLjQ6fF7enrq3//+t44cOaLk5GQlJyfryJEjeumll+Tp6SlJqlatmqpUqZLnPvO8eFRh2bp1q6Kjo3X+/HmVLl1aCxYssFuU6kqJiYkaOXJktv1LWkxWmTLOz9Ef7P680/u80oFLawqt7xot9hVa34tTIq/fKJ86+P1aaH1L0pA/Hyi0voOqFN5a7ampJQqtb29L4f4YsF7wLLS+PQrxd3OWS/mdGOR+lgyTxm7i2x1YTBy7aRkm/Z4DuPEZFtf+jMkc6+qinbe3t93tVrNcuHBBGzdu1NChQ237PDw8FBMTo7Vr1+Y4xNq1a5WQkGC3LzY21pa07tu3T0lJSYqJibE97+fnp6ioKK1du1Zdu3bN11vLC19fX6f04/aKbc2aNbVlyxatW7dOTz75pOLi4vTHH3/k2Hbo0KE6ffq0bTt48KCLowUAAAAA56tatar8/PxsW2JiYo7tTpw4oYyMDFWsWNFuf8WKFZWUlJTja5KSknJtn/WnI30WxNGjR/X4448rODhYxYoVk6enp92WH26v2Hp5ealGjRqSpEaNGmnDhg1666239O6772Zre63fWgAAAACAUxhy7SyizLEOHjxoV70synlPfHy8Dhw4oFdeeUWVKlVyaPXja3E4sU1LS9Nrr72m5cuX69ixY7JarXbP7927t0ABWa1WpaenF6gPAAAAADCTrFugXk/58uXl6empo0eP2u0/evRotjvXZAkKCsq1fdafR48eVaVKlezaNGjQwJG3kSdr1qzR6tWrndq3w4ltnz59tGrVKj3++OMFzq6HDh2qdu3aqVq1ajpz5oxmzZqllStXaunSpfnuEwAAAACKKi8vLzVq1EjLly9X586dJV0uDi5fvlwDBgzI8TXR0dFavny5Bg0aZNu3bNky26K9YWFhCgoK0vLly23JZkpKiu1yUWerWrWqDMO5ZXGHE9vFixfr66+/VtOmTQs8+LFjx9SjRw8dOXJEfn5+ioyM1NKlS9W6desC9w0AAAAADnPTVGRHJCQkKC4uTo0bN1aTJk00YcIEpaWlqWfPnpKkHj16qHLlyrbrdJ955hm1aNFCb775pjp06KDZs2frl19+0XvvvSdJslgsGjRokP7zn//olltuUVhYmF555RUFBwfbkmdnmjBhgoYMGaJ3331XoaGhTunT4cS2bNmyCggIcMrgH374oVP6AQAAAICbxSOPPKLjx49r2LBhSkpKUoMGDbRkyRLb4k8HDhyQh8f/rxN85513atasWXr55Zf10ksv6ZZbbtHChQtVt25dW5sXXnhBaWlp6tevn5KTk3XXXXdpyZIl8vHxKZT4z549q/DwcJUsWVLFixe3e/7UqVMO9+lwYjt69GgNGzZMM2bMUMmSJR0eEAAAAABuVBbDtbdxy+9YAwYMuObU45UrV2bb99BDD+mhhx66dhwWi0aNGqVRo0blLyAHTJgwwel9OpzYvvnmm9qzZ48qVqyo0NDQbNn1pk2bnBYcAAAAAKBoiYuLc3qfDie2hTHHGgAAAABuCCa4xtaMUlJSbKs+p6Sk5No2L6tDX83hxHb48OEODwIAAAAAuHmVLVtWR44cUWBgoPz9/XO8u45hGLJYLMrIyHC4f4cTWwAAAAAAHLFixQrbIsTff/+90/vPU2IbEBCgP//8U+XLl1fZsmVzvXdtflawAgAAAIAbAlORC0WLFi1y/Luz5CmxHT9+vMqUKSOpcFawAgAAAAAUXb/99lue20ZGRjrcf54S2ytXrSqMFawAAAAA4EZgltv9mE2DBg1ksVhs19HmJj/X2HpcvwkAAAAAAPm3b98+7d27V/v27dO8efMUFhamyZMna/Pmzdq8ebMmT56s8PBwzZs3L1/9s3gUAAAAAKBQhYSE2P7+0EMP6e2331b79u1t+yIjI1W1alW98sor+brFLIktAAAAAGQxLJc3V453k9m6davCwsKy7Q8LC9Mff/yRrz6ZigwAAAAAcJnatWsrMTFRFy5csO27cOGCEhMTVbt27Xz1ScUWAAAAALJwu59CN3XqVHXs2FFVqlSxrYD822+/yWKxaNGiRfnq0+HENi0tTa+99pqWL1+uY8eOyWq12j2/d+/efAUCAAAAACj6mjRpor1792rmzJnasWOHJOmRRx5Rt27dVKpUqXz16XBi26dPH61atUqPP/64KlWqdN2lmgEAAADALLjdj2uUKlVK/fr1c1p/Die2ixcv1tdff62mTZs6LQgAAAAAwM1jz549mjBhgrZv3y5JqlOnjgYOHKjw8PB89efw4lFly5ZVQEBAvgYDAAAAANzcli5dqoiICK1fv16RkZGKjIzUzz//rDp16mjZsmX56tPhiu3o0aM1bNgwzZgxQyVLlszXoAAAAABwQ2LxqEI3ZMgQDR48WK+99lq2/S+++KJat27tcJ8OJ7Zvvvmm9uzZo4oVKyo0NFTFixe3e37Tpk0OBwEAAAAAuDls375dn332Wbb9vXr10oQJE/LVp8OJbefOnfM1EAAAAADc8Fy
8eNTNWLGtUKGCtmzZoltuucVu/5YtWxQYGJivPh1ObIcPH56vgQAAAAAA6Nu3r/r166e9e/fqzjvvlCT9+OOPGjt2rBISEvLVp8OJbZaNGzfarWDVsGHD/HYFAAAAALhJvPLKKypTpozefPNNDR06VJIUHBysESNGaODAgfnq0+HE9tixY+ratatWrlwpf39/SVJycrLuvvtuzZ49WxUqVMhXIAAAAADgdiweVegsFosGDx6swYMH68yZM5KkMmXKFKhPh2/38/TTT+vMmTP6/fffderUKZ06dUrbtm1TSkpKvrNrAAAAAMDNYd++fdq1a5ekywltVlK7a9cu7d+/P199OpzYLlmyRJMnT1bt2rVt+yIiIjRp0iQtXrw4X0EAAAAAwA3BcMN2k4mPj9dPP/2Ubf+6desUHx+frz4dTmytVmu2W/xIUvHixWW1WvMVBAAAAADg5rB582Y1bdo02/477rhDW7ZsyVefDie299xzj5555hkdPnzYtu/QoUMaPHiwWrVqla8gAAAAAOBGYDFcv91sLBaL7draK50+fVoZGRn56tPhxPadd95RSkqKQkNDFR4ervDwcIWFhSklJUUTJ07MVxAAAAAAgJtD8+bNlZiYaJfEZmRkKDExUXfddVe++nR4VeSqVatq06ZN+u6777Rjxw5JUu3atRUTE5OvAAAAAAAAN4+xY8eqefPmqlmzppo1ayZJWr16tVJSUrRixYp89Zmv+9haLBa1bt1arVu3ztegAAAAAICbU0REhH777Te98847+vXXX1WiRAn16NFDAwYMUEBAQL76zFNi+/bbb6tfv37y8fHR22+/nWtbbvkDAAAAAMhNcHCwxowZ47T+8pTYjh8/Xt27d5ePj4/Gjx9/zXYWi4XEFgAAAIB5ufoWPDfh4lGSlJycrPXr1+vYsWPZ7q7To0cPh/vLU2K7b9++HP8OAAAAAIAjFi1apO7duys1NVW+vr6yWCy25ywWS74SW4dXRR41apTOnj2bbf+5c+c0atQohwMAAAAAANw8nn32WfXq1UupqalKTk7WP//8Y9tOnTqVrz4dTmxHjhyp1NTUbPvPnj2rkSNH5isIAAAAALgRcB/bwnfo0CENHDhQJUuWdFqfDie2hmHYlYqz/Prrr/lewQoAAAAAcHOIjY3VL7/84tQ+83y7n7Jly8pischisejWW2+1S24zMjKUmpqqJ554wqnBAQAAAIDL3YRVVFfq0KGDnn/+ef3xxx+qV6+eihcvbvf8fffd53CfeU5sJ0yYIMMw1KtXL40cOVJ+fn6257y8vBQaGqro6GiHAwAAAAAA3Dz69u0rSTmu0WSxWJSRkeFwn3lObOPi4iRJYWFhuvPOO7Nl1QAAAABgetzup9BdfXsfZ8hTYpuSkiJfX19JUsOGDXXu3DmdO3cux7ZZ7QAAAAAAcIU8LR5VtmxZHTt2TJLk7++vsmXLZtuy9gMAAAAAcLX27dvr9OnTtsevvfaakpOTbY9PnjypiIiIfPWdp4rtihUrbCsef//99/kaCAAAAABudK6+Bc/NdLufpUuXKj093fZ4zJgxevjhh+Xv7y9JunTpknbu3JmvvvOU2LZo0SLHvwMAAAAAkBeGYeT6uCAcvo/tkiVLtGbNGtvjSZMmqUGDBurWrZv++ecfpwUGAAAAAC5nuGFDgTmc2D7//PNKSUmRJG3dulUJCQlq37699u3bp4SEBKcHCAAAAAAwP4vFIovFkm2fM+T5dj9Z9u3bZ7ugd968eerYsaPGjBmjTZs2qX379k4JCgAAAABQtBiGofj4eHl7e0uSzp8/ryeeeEKlSpWSJLvrbx3lcGLr5eWls2fPSpK+++479ejRQ5IUEBBgq+QCAAAAgBmxeFThiYuLs3v82GOPZWuTlV86yuHE9q677lJCQoKaNm2q9evXa86cOZKkP//8U1WqVMlXEAAAAACAou2jjz4qtL4dvsb2nXfeUbFixTR37lxNmTJFlStXliQtXrxYbdu2dXqAAAAAAOAyLB5lSg5XbKtVq6avvvoq2/7x48c7JSAAAAAAABzhcGIrSRkZGVq4cKG2b98uSapTp47uu+8+eXp6OjU4AAAAAHApV1dRqdg6hcOJ7e7du9W+fXsdOnRINWvWlCQlJiaqatWq+vrrrxUeHu70IAEAAAAAuBaHr7EdOHCgwsPDdfDgQW3atEmbNm3SgQMHFBYWpoEDBxZGjAAAAAAAXJPDFdtVq1bp559/VkBAgG1fuXLl9Nprr6lp06ZODQ4AAAAAXInb/ZiTwxVbb29vnTlzJtv+1NRUeXl5OSUoAAAAAADyyuHE9t5771W/fv20bt06GYYhwzD0888/64knntB9991XGDECAAAAgGtwux9TcjixffvttxUeHq7o6Gj5+PjIx8dHTZs2VY0aNfTWW28VRowAAAAAAFyTw9fY+vv764svvtDu3bttt/upXbu2atSo4fTgAAAAAAC4njwntlarVW+88Ya+/PJLXbhwQa1atdLw4cNVokSJwowPAAAAAFyH+9iaUp6nIr/66qt66aWXVLp0aVWuXFlvvfWW+vfvX5ixAQAAAABwXXlObD/++GNNnjxZS5cu1cKFC7Vo0SLNnDlTVqu1MOMDAAAAAJfJut2PKzcUXJ4T2wMHDqh9+/a2xzExMbJYLDp8+HChBAYAAAAAQF7kObG9dOmSfHx87PYVL15cFy9edHpQAAAAAADkVZ4XjzIMQ/Hx8fL29rbtO3/+vJ544gmVKlXKtm/+/PnOjRAAAAAAXIXFo0wpz4ltXFxctn2PPfaYU4MBAAAAAMBReU5sP/roo8KMAwAAAADcztULOrF4lHPk+RpbAAAAAABuRHmu2AIAAABAkcc1tqZExRYAAAAAiqhTp06pe/fu8vX1lb+/v3r37q3U1NRc2z/99NOqWbOmSpQooWrVqmngwIE6ffq0XTuLxZJtmz17dmG/nWuiYgsAAAAARVT37t115MgRLVu2TBcvXlTPnj3Vr18/zZo1K8f2hw8f1uHDh/Xf//5XERER+uuvv/TEE0/o8OHDmjt3rl3bjz76SG3btrU99vf3L8y3kisSWwAAAADIUoSmIm/fvl1LlizRhg0b1LhxY0nSxIkT1b59e/33v/9VcHBwttfUrVtX8+bNsz0ODw/Xq6++qscee0yXLl1SsWL/n0L6+/srKCio8N6AA5iKDAAAAABulpKSYrelp6cXuM+1a9fK39/fltRKUkxMjDw8PLRu3bo893P69Gn5+vraJbWS1L9/f5UvX15NmjTRtGnTZBjuu2CYxBYAAAAAMlncsElS1apV5efnZ9sSExML/F6SkpIUGBhot69YsWIKCAhQUlJSnvo4ceKERo8erX79+tntHzVqlD777DMtW7ZMXbp00VNPPaWJEycWOOb8YioyAAAAALjZwYMH5evra3vs7e19zbZDhgzR2LFjc+1v+/btBY4pJSVFHTp0UEREhEaMGGH33CuvvGL7e8OGDZWWlqY33nhDAwcOLPC4+UFiCwAAAABu5uvra5fY5ubZZ59VfHx8rm2qV6+uoK
AgHTt2zG7/pUuXdOrUqeteG3vmzBm1bdtWZcqU0YIFC1S8ePFc20dFRWn06NFKT0/PNSkvLCS2AAAAAJDFBItHVahQQRUqVLhuu+joaCUnJ2vjxo1q1KiRJGnFihWyWq2Kioq65utSUlIUGxsrb29vffnll/Lx8bnuWFu2bFHZsmXdktRKJLYAAAAAUCTVrl1bbdu2Vd++fTV16lRdvHhRAwYMUNeuXW0rIh86dEitWrXSxx9/rCZNmiglJUVt2rTR2bNn9cknn9gWs5IuJ9Senp5atGiRjh49qjvuuEM+Pj5atmyZxowZo+eee85t75XEFgAAAAAyWYzLmyvHK0wzZ87UgAED1KpVK3l4eKhLly56++23bc9fvHhRO3fu1NmzZyVJmzZtsq2YXKNGDbu+9u3bp9DQUBUvXlyTJk3S4MGDZRiGatSooXHjxqlv376F+2ZyQWILAAAAAEVUQECAZs2adc3nQ0ND7W7T07Jly+vetqdt27Zq27at02J0BhJbAAAAAMhigmtskR33sQUAAAAAmBqJLQAAAADA1JiKDAAAAABXYnqw6VCxBQAAAACYGhVbAAAAAMhU1G73c7OgYgsAAAAAMDUSWwAAAACAqTEVGQAAAACycB9bU6JiCwAAAAAwNSq2AAAAAJCJxaPMiYotAAAAAMDU3JrYJiYm6vbbb1eZMmUUGBiozp07a+fOne4MCQAAAMDNzHDDhgJza2K7atUq9e/fXz///LOWLVumixcvqk2bNkpLS3NnWAAAAAAAE3HrNbZLliyxezx9+nQFBgZq48aNat68uZuiAgAAAACYyQ21eNTp06clSQEBATk+n56ervT0dNvjlJQUl8QFAAAA4ObA4lHmdMMktlarVYMGDVLTpk1Vt27dHNskJiZq5MiR2fYfzfBWWobzZ1U3HLTZ6X1eaUxSm0Lre3TowkLru8+2HoXW95AG6wqtb0k6esK30Pr2tXgXWt+X0grvVC1uKdwfA5Z0c65RZ7lkcXcI+WfSfyD5h90NDBN/zwEAuMIN8z/O/v37a9u2bZo9e/Y12wwdOlSnT5+2bQcPHnRhhAAAAACKPBaPMqUbomI7YMAAffXVV/rhhx9UpUqVa7bz9vaWt3fhVcUAAAAAAObj1sTWMAw9/fTTWrBggVauXKmwsDB3hgMAAAAAMCG3Jrb9+/fXrFmz9MUXX6hMmTJKSkqSJPn5+alEiRLuDA0AAADAzcjV04OZiuwUbr3GdsqUKTp9+rRatmypSpUq2bY5c+a4MywAAAAAgIm4fSoyAAAAANwouN2POd0wqyIDAAAAAJAfN8SqyAAAAABwQ+AaW1OiYgsAAAAAMDUSWwAAAACAqTEVGQAAAAAyWQxDFhcucuvKsYoyKrYAAAAAAFOjYgsAAAAAWVg8ypSo2AIAAAAATI3EFgAAAABgakxFBgAAAIBMFuPy5srxUHBUbAEAAAAApkbFFgAAAACysHiUKVGxBQAAAACYGhVbAAAAAMjENbbmRMUWAAAAAGBqJLYAAAAAAFNjKjIAAAAAZGHxKFOiYgsAAAAAMDUqtgAAAACQicWjzImKLQAAAADA1EhsAQAAAACmxlRkAAAAAMjC4lGmRMUWAAAAAGBqVGwBAAAA4Aos6GQ+VGwBAAAAAKZGxRYAAAAAshjG5c2V46HAqNgCAAAAAEyNxBYAAAAAYGpMRQYAAACATBbDtYtHsVCVc1CxBQAAAACYGhVbAAAAAMhiZG6uHA8FRsUWAAAAAGBqJLYAAAAAAFNjKjIAAAAAZLJYL2+uHA8FR8UWAAAAAGBqVGwBAAAAIAuLR5kSFVsAAAAAgKmR2AIAAAAATI2pyAAAAACQyWJc3lw5HgqOii0AAAAAwNSo2AIAAABAFsO4vLlyPBQYFVsAAAAAgKlRsQUAAACATFxja05UbAEAAACgiDp16pS6d+8uX19f+fv7q3fv3kpNTc31NS1btpTFYrHbnnjiCbs2Bw4cUIcOHVSyZEkFBgbq+eef16VLlwrzreSKii0AAAAAFFHdu3fXkSNHtGzZMl28eFE9e/ZUv379NGvWrFxf17dvX40aNcr2uGTJkra/Z2RkqEOHDgoKCtJPP/2kI0eOqEePHipevLjGjBlTaO8lNyS2AAAAAJDFyNxcOV4h2b59u5YsWaINGzaocePGkqSJEyeqffv2+u9//6vg4OBrvrZkyZIKCgrK8blvv/1Wf/zxh7777jtVrFhRDRo00OjRo/Xiiy9qxIgR8vLyKpT3kxumIgMAAACAm6WkpNht6enpBe5z7dq18vf3tyW1khQTEyMPDw+tW7cu19fOnDlT5cuXV926dTV06FCdPXvWrt969eqpYsWKtn2xsbFKSUnR77//XuC484OKLQAAAABkctfiUVWrVrXbP3z4cI0YMaJAfSclJSkwMNBuX7FixRQQEKCkpKRrvq5bt24KCQlRcHCwfvvtN7344ovauXOn5s+fb+v3yqRWku1xbv0WJhJbAAAAAHCzgwcPytfX1/bY29v7mm2HDBmisWPH5trf9u3b8x1Lv379bH+vV6+eKlWqpFatWmnPnj0KDw/Pd7+FicQWAAAAANzM19fXLrHNzbPPPqv4+Phc21SvXl1BQUE6duyY3f5Lly7p1KlT17x+NidRUVGSpN27dys8PFxBQUFav369XZujR49KkkP9OhOJLQAAAABkMYzLmyvHc1CFChVUoUKF67aLjo5WcnKyNm7cqEaNGkmSVqxYIavVaktW82LLli2SpEqVKtn6ffXVV3Xs2DHbVOdly5bJ19dXERERDr4b52DxKAAAAAAogmrXrq22bduqb9++Wr9+vX788UcNGDBAXbt2ta2IfOjQIdWqVctWgd2zZ49Gjx6tjRs3av/+/fryyy/Vo0cPNW/eXJGRkZKkNm3aKCIiQo8//rh+/fVXLV26VC+//LL69++f6xTqwkRiCwAAAACZshaPcuVWmGbOnKlatWqpVatWat++ve666y699957tucvXryonTt32lY99vLy0nfffac2bdqoVq1aevbZZ9WlSxctWrTI9hpPT0999dVX8vT0VHR0tB577DH16NHD7r63rsZUZAAAAAAoogICAjRr1qxrPh8aGirjiunQVatW1apVq67bb0hIiL755hunxOgMJLYAAAAAkMXI3Fw5HgqMqcgAAAAAAFMjsQUAAAAAmBpTkQEAAAAgkysWdLp6PBQcFVsAAAAAgKlRsQUAAACALFbj8ubK8VBgVGwBAAAAAKZGYgsAAAAAMDWmIgMAAABAFu5ja0pUbAEAAAAApkbFFgAAAAAyWeTi2/24bqgijYotAAAAAMDUqNgCAAAAQBbDuLy5cjwUGBVbAAAAAICpkdgCAAAAAEyNqcgAAAAAkMliuHjxKGYiOwUVWwAAAACAqVGxBQAAAIAsRubmyvFQYFRsAQAAAACmRmILAAAAADA1piIDAAAAQCaLYcjiwnvLunKsooyKLQAAAADA1KjYAgAAAEAWa+bmyvFQYFRsAQAAAACmRsUWAAAAADJxja05UbEFAAAAAJgaiS0AAAAAw
NSYigwAAAAAWYzMzZXjocCo2AIAAAAATI2KLQAAAABkMYzLmyvHQ4FRsQUAAAAAmBqJLQAAAADA1JiKDAAAAACZLMblzZXjoeCo2AIAAAAATI2KLQAAAABkYfEoU6JiCwAAAAAwNSq2AAAAAJDJYr28uXI8FBwVWwAAAACAqbk1sf3hhx/UsWNHBQcHy2KxaOHChe4MBwAAAABgQm5NbNPS0lS/fn1NmjTJnWEAAAAAwGVZi0e5ckOBufUa23bt2qldu3buDAEAAAAAYHKmWjwqPT1d6enptscpKSlujAYAAABAkWNkbq4cDwVmqsQ2MTFRI0eOzLa/51dPyMPHx+nj7X5kqtP7vFKNOU8UWt9TH1lbaH0n7wootL59bytRaH1Lko47/3uSpbil8E4nz1RTnap2PC6Yc406S4bF3SHkm8Vq0tgNk8YtmTt2AACKAFP9j3Po0KE6ffq0bTt48KC7QwIAAAAAuJmpykDe3t7y9vZ2dxgAAAAAiiiLYcjiwgWdXDlWUWaqii0AAAAAAFdza8U2NTVVu3fvtj3et2+ftmzZooCAAFWrVs2NkQEAAAC4Kbn6FjxUbJ3CrYntL7/8orvvvtv2OCEhQZIUFxen6dOnuykqAAAAAICZuDWxbdmypQx+QwEAAADgRmFIsrp4PBQY19gCAAAAAEyNxBYAAAAAYGqmut0PAAAAABQmbvdjTlRsAQAAAACmRsUWAAAAALIYcvHtflw3VFFGxRYAAAAAYGoktgAAAAAAU2MqMgAAAABkMQwXT0VmLrIzULEFAAAAAJgaFVsAAAAAyGKVZHHxeCgwKrYAAAAAAFMjsQUAAAAAmBpTkQEAAAAgk8UwZHHhgk6uHKsoo2ILAAAAADA1KrYAAAAAkIXb/ZgSFVsAAAAAgKlRsQUAAACALFRsTYmKLQAAAADA1EhsAQAAAKCIOnXqlLp37y5fX1/5+/urd+/eSk1NvWb7/fv3y2Kx5Lh9/vnntnY5PT979mxXvKUcMRUZAAAAALIUsanI3bt315EjR7Rs2TJdvHhRPXv2VL9+/TRr1qwc21etWlVHjhyx2/fee+/pjTfeULt27ez2f/TRR2rbtq3tsb+/v9PjzysSWwAAAABws5SUFLvH3t7e8vb2LlCf27dv15IlS7RhwwY1btxYkjRx4kS1b99e//3vfxUcHJztNZ6engoKCrLbt2DBAj388MMqXbq03X5/f/9sbd2FqcgAAAAAkMXqhk2XK6V+fn62LTExscBvZe3atfL397cltZIUExMjDw8PrVu3Lk99bNy4UVu2bFHv3r2zPde/f3+VL19eTZo00bRp02S4cSEsKrYAAAAA4GYHDx6Ur6+v7XFBq7WSlJSUpMDAQLt9xYoVU0BAgJKSkvLUx4cffqjatWvrzjvvtNs/atQo3XPPPSpZsqS+/fZbPfXUU0pNTdXAgQMLHHd+kNgCAAAAgJv5+vraJba5GTJkiMaOHZtrm+3btxc4pnPnzmnWrFl65ZVXsj135b6GDRsqLS1Nb7zxBoktAAAAALibxTBkceGU2vyM9eyzzyo+Pj7XNtWrV1dQUJCOHTtmt//SpUs6depUnq6NnTt3rs6ePasePXpct21UVJRGjx6t9PR0p1SbHUViCwAAAAAmUqFCBVWoUOG67aKjo5WcnKyNGzeqUaNGkqQVK1bIarUqKirquq//8MMPdd999+VprC1btqhs2bJuSWolElsAAAAA+H9F6HY/tWvXVtu2bdW3b19NnTpVFy9e1IABA9S1a1fbisiHDh1Sq1at9PHHH6tJkya21+7evVs//PCDvvnmm2z9Llq0SEePHtUdd9whHx8fLVu2TGPGjNFzzz1XaO/lekhsAQAAAKCImjlzpgYMGKBWrVrJw8NDXbp00dtvv217/uLFi9q5c6fOnj1r97pp06apSpUqatOmTbY+ixcvrkmTJmnw4MEyDEM1atTQuHHj1Ldv30J/P9dCYgsAAAAAWayGZHFhxdZauGMFBARo1qxZ13w+NDQ0x9v0jBkzRmPGjMnxNW3btlXbtm2dFqMzcB9bAAAAAICpkdgCAAAAAEyNqcgAAAAAkKUILR51M6FiCwAAAAAwNSq2AAAAAGDj4oqtqNg6AxVbAAAAAICpkdgCAAAAAEyNqcgAAAAAkIXFo0yJii0AAAAAwNSo2AIAAABAFqshly7oZKVi6wxUbAEAAAAApkbFFgAAAACyGNbLmyvHQ4FRsQUAAAAAmBqJLQAAAADA1JiKDAAAAABZuN2PKVGxBQAAAACYGhVbAAAAAMjC7X5MiYotAAAAAMDUSGwBAAAAAKbGVGQAAAAAyMLiUaZExRYAAAAAYGpUbAEAAAAgiyEXV2xdN1RRRsUWAAAAAGBqVGwBAAAAIAvX2JoSFVsAAAAAgKmR2AIAAAAATI2pyAAAAACQxWqVZHXxeCgoKrYAAAAAAFOjYgsAAAAAWVg8ypSo2AIAAAAATI3EFgAAAABgakxFBgAAAIAsTEU2JSq2AAAAAABTo2ILAAAAAFmshiQXVlGtVGydgYotAAAAAMDUqNgCAAAAQCbDsMowrC4dDwVHxRYAAAAAYGoktgAAAAAAU2MqMgAAAABkMQzXLujE7X6cgootAAAAAMDUqNgCAAAAQBbDxbf7oWLrFFRsAQAAAACmRmILAAAAADA1piIDAAAAQBarVbK48N6y3MfWKajYAgAAAABMjYotAAAAAGRh8ShTomILAAAAADA1KrYAAAAAkMmwWmW48Bpbg2tsnYKKLQAAAADA1EhsAQAAAACmxlRkAAAAAMjC4lGmRMUWAAAAAGBqVGwBAAAAIIvVkCxUbM2Gii0AAAAAwNRIbAEAAAAApsZUZAAAAADIYhiSXHhvWaYiOwUVWwAAAACAqVGxBQAAAIBMhtWQ4cLFowwqtk5BxRYAAAAAYGoktgAAAAAAU2MqMgAAAABkMaxy7eJRLhyrCKNiCwAAAAAwNSq2AAAAAJCJxaPM6Yao2E6aNEmhoaHy8fFRVFSU1q9f7+6QAAAAAAAm4fbEds6cOUpISNDw4cO1adMm1a9fX7GxsTp27Ji7QwMAAABwszGsrt9QYG5PbMeNG6e+ffuqZ8+eioiI0NSpU1WyZElNmzbN3aEBAAAAAEzArdfYXrhwQRs3btTQoUNt+zw8PBQTE6O1a9dma5+enq709HTb49OnT0uSrOfPF0p8KWcyCqXfLIUVt1S4sZs1bsm8sZs1bsm8sZs1bsm8sZs1bsm8sZs1bsm8sZs1bsm8sZs1bsmcsaekXq4+mvm60Uu6KLkw/Eu66LrBijCL4cZv3eHDh1W5cmX99NNPio6Otu1/4YUXtGrVKq1bt86u/YgRIzRy5EhXhwkAAADAAQcPHlSVKlXcHYZDzp8/r7CwMCUlJbl87KCgIO3bt08+Pj4uH7uoMNWqyEOHDlVCQoLtcXJyskJCQnTgwAH5+fm5MTI4Q0pKiqpWraqDBw/K19fX3eHACTimRQvHs2jheBYtHM+ix6zH1DAMnTlzRsHBwe4OxWE+Pj7at2+fLly44PKxvby8SGoL
yK2Jbfny5eXp6amjR4/a7T969KiCgoKytff29pa3t3e2/X5+fqY64ZE7X19fjmcRwzEtWjieRQvHs2jheBY9ZjymZi44+fj4kGCalFsXj/Ly8lKjRo20fPly2z6r1arly5fbTU0GAAAAAOBa3D4VOSEhQXFxcWrcuLGaNGmiCRMmKC0tTT179nR3aAAAAAAAE3B7YvvII4/o+PHjGjZsmJKSktSgQQMtWbJEFStWvO5rvb29NXz48BynJ8N8OJ5FD8e0aOF4Fi0cz6KF41n0cEwBx7h1VWQAAAAAAArKrdfYAgAAAABQUCS2AAAAAABTI7EFAAAAAJgaiS0AAAAAwNRMndhOmjRJoaGh8vHxUVRUlNavX+/ukJAPI0aMkMVisdtq1arl7rCQRz/88IM6duyo4OBgWSwWLVy40O55wzA0bNgwVapUSSVKlFBMTIx27drlnmCRJ9c7pvHx8dnO2bZt27onWOQqMTFRt99+u8qUKaPAwEB17txZO3futGtz/vx59e/fX+XKlVPp0qXVpUsXHT161E0R43ryckxbtmyZ7Rx94okn3BQxcjNlyhRFRkbK19dXvr6+io6O1uLFi23Pc34CeWfaxHbOnDlKSEjQ8OHDtWnTJtWvX1+xsbE6duyYu0NDPtSpU0dHjhyxbWvWrHF3SMijtLQ01a9fX5MmTcrx+ddff11vv/22pk6dqnXr1qlUqVKKjY3V+fPnXRwp8up6x1SS2rZta3fOfvrppy6MEHm1atUq9e/fXz///LOWLVumixcvqk2bNkpLS7O1GTx4sBYtWqTPP/9cq1at0uHDh/XAAw+4MWrkJi/HVJL69u1rd46+/vrrbooYualSpYpee+01bdy4Ub/88ovuuecederUSb///rskzk/AIYZJNWnSxOjfv7/tcUZGhhEcHGwkJia6MSrkx/Dhw4369eu7Oww4gSRjwYIFtsdWq9UICgoy3njjDdu+5ORkw9vb2/j000/dECEcdfUxNQzDiIuLMzp16uSWeFAwx44dMyQZq1atMgzj8vlYvHhx4/PPP7e12b59uyHJWLt2rbvChAOuPqaGYRgtWrQwnnnmGfcFhQIpW7as8cEHH3B+Ag4yZcX2woUL2rhxo2JiYmz7PDw8FBMTo7Vr17oxMuTXrl27FBwcrOrVq6t79+46cOCAu0OCE+zbt09JSUl256qfn5+ioqI4V01u5cqVCgwMVM2aNfXkk0/q5MmT7g4JeXD69GlJUkBAgCRp48aNunjxot05WqtWLVWrVo1z1CSuPqZZZs6cqfLly6tu3boaOnSozp49647w4ICMjAzNnj1baWlpio6O5vwEHFTM3QHkx4kTJ5SRkaGKFSva7a9YsaJ27NjhpqiQX1FRUZo+fbpq1qypI0eOaOTIkWrWrJm2bdumMmXKuDs8FEBSUpIk5XiuZj0H82nbtq0eeOABhYWFac+ePXrppZfUrl07rV27Vp6enu4OD9dgtVo1aNAgNW3aVHXr1pV0+Rz18vKSv7+/XVvOUXPI6ZhKUrdu3RQSEqLg4GD99ttvevHFF7Vz507Nnz/fjdHiWrZu3aro6GidP39epUuX1oIFCxQREaEtW7ZwfgIOMGVii6KlXbt2tr9HRkYqKipKISEh+uyzz9S7d283RgYgJ127drX9vV69eoqMjFR4eLhWrlypVq1auTEy5KZ///7atm0baxgUIdc6pv369bP9vV69eqpUqZJatWqlPXv2KDw83NVh4jpq1qypLVu26PTp05o7d67i4uK0atUqd4cFmI4ppyKXL19enp6e2VaFO3r0qIKCgtwUFZzF399ft956q3bv3u3uUFBAWecj52rRVr16dZUvX55z9gY2YMAAffXVV/r+++9VpUoV2/6goCBduHBBycnJdu05R2981zqmOYmKipIkztEblJeXl2rUqKFGjRopMTFR9evX11tvvcX5CTjIlImtl5eXGjVqpOXLl9v2Wa1WLV++XNHR0W6MDM6QmpqqPXv2qFKlSu4OBQUUFhamoKAgu3M1JSVF69at41wtQv7++2+dPHmSc/YGZBiGBgwYoAULFmjFihUKCwuze75Ro0YqXry43Tm6c+dOHThwgHP0BnW9Y5qTLVu2SBLnqElYrValp6dzfgIOMu1U5ISEBMXFxalx48Zq0qSJJkyYoLS0NPXs2dPdocFBzz33nDp27KiQkBAdPnxYw4cPl6enpx599FF3h4Y8SE1NtasC7Nu3T1u2bFFAQICqVaumQYMG6T//+Y9uueUWhYWF6ZVXXlFwcLA6d+7svqCRq9yOaUBAgEaOHKkuXbooKChIe/bs0QsvvKAaNWooNjbWjVEjJ/3799esWbP0xRdfqEyZMrbr8vz8/FSiRAn5+fmpd+/eSkhIUEBAgHx9ffX0008rOjpad9xxh5ujR06ud0z37NmjWbNmqX379ipXrpx+++03DR48WM2bN1dkZKSbo8fVhg4dqnbt2qlatWo6c+aMZs2apZUrV2rp0qWcn4Cj3L0sc0FMnDjRqFatmuHl5WU0adLE+Pnnn90dEvLhkUceMSpVqmR4eXkZlStXNh555BFj9+7d7g4LefT9998bkrJtcXFxhmFcvuXPK6+8YlSsWNHw9vY2WrVqZezcudO9QSNXuR3Ts2fPGm3atDEqVKhgFC9e3AgJCTH69u1rJCUluTts5CCn4yjJ+Oijj2xtzp07Zzz11FNG2bJljZIlSxr333+/ceTIEfcFjVxd75geOHDAaN68uREQEGB4e3sbNWrUMJ5//nnj9OnT7g0cOerVq5cREhJieHl5GRUqVDBatWplfPvtt7bnOT+BvLMYhmG4MpEGAAAAAMCZTHmNLQAAAAAAWUhsAQAAAACmRmILAAAAADA1ElsAAAAAgKmR2AIAAAAATI3EFgAAAABgaiS2AAAAAABTI7EFAAAAAJgaiS0AmMj+/ftlsVi0ZcsWh1+7fPly1a5dWxkZGc4PLBcWi0ULFy7Mc/v4+Hh17tzZKWM7s6/CFBoaqgkTJji1z65du+rNN990ap8AANyoSGwBII/i4+NlsVhksVhUvHhxVaxYUa1bt9a0adNktVoLZTxnJmUvvPCCXn75ZXl6ekqSpk+fLovFotq1a2dr+/nnn8tisSg0NNRp4zvLypUrbcfBw8NDfn5+atiwoV544QUdOXLEru1bb72l6dOnuydQB2zYsEH9+vVzap8vv/yyXn31VZ0+fdqp/QIAcCMisQUAB7Rt21ZHjhzR/v37tXjxYt1999165plndO+99+rSpUvuDu+a1qxZoz179qhLly52+0uVKqVjx45p7dq1dvs//PBDVatWzZUhOmznzp06fPiwNmzYoBdffFHfffed6tatq61bt9ra+Pn5yd/f331B5lGFChVUsmRJp/ZZt25dhYeH65NPPnFqvwAA3IhIbAHAAd7e3goKClLlypV122236aWXXtIXX3yhxYsX21UGk5OT1adPH1WoUEG+vr6655579Ouvv9qeHzFihBo0aKB3331XVatWVcmSJfXwww/bqmsjRozQjBkz9MUXX9iqkyt
XrrS9fu/evbr77rtVsmRJ1a9fP1tierXZs2erdevW8vHxsdtfrFgxdevWTdOmTbPt+/vvv7Vy5Up169YtWz9TpkxReHi4vLy8VLNmTf3vf/+ze37Xrl1q3ry5fHx8FBERoWXLlmXr4+DBg3r44Yfl7++vgIAAderUSfv37881/pwEBgYqKChIt956q7p27aoff/xRFSpU0JNPPmlrc3XVu2XLlnr66ac1aNAglS1bVhUrVtT777+vtLQ09ezZU2XKlFGNGjW0ePFiu7G2bdumdu3aqXTp0qpYsaIef/xxnThxwq7fgQMH6oUXXlBAQICCgoI0YsQI2/OGYWjEiBGqVq2avL3/r717D4ky6+MA/vVSb1PeSrvormkXNAutprYlDYtWJpFsQtAKMbGSbmQtqwXrH0VL0da2shtkdkHtAiXUKEWpRWazs6WVNWWJmFlaaSIWZald5vf+EfO8Putls9p6p/1+QHCec57fuczA8Jtz5sx/4OXlhaSkJKX8r1uRa2trodfr4eTkBBcXF8TExODRo0dKufX1c+DAAfj6+sLV1RXz58/Hs2fPVP2OjIzE4cOHez23REREtoaJLRHRB5o5cybGjx+PY8eOKdeio6PR2NiIU6dO4cqVK9Bqtfjuu+/Q3Nys1Ll9+zZycnJw/Phx5Ofn4+rVq1ixYgUAIDk5GTExMcoKcX19PYKDg5V7U1NTkZycjGvXrsHPzw8LFizoccXYaDRi8uTJXZYtWrQIOTk5ePHiBYC3W5TDw8MxdOhQVT2DwYDVq1fjhx9+QHl5OZYuXYqEhAQUFRUBACwWC6KiotC3b1+UlJRg165dWLdunSrGq1evMGvWLDg7O8NoNMJkMsHJyQnh4eF4+fLlu0x3tzQaDZYtWwaTyYTGxsZu62VnZ8PDwwOlpaVYtWoVli9fjujoaAQHB6OsrAw6nQ5xcXHKfDx58gQzZ87ExIkTcfnyZeTn5+PRo0eIiYnpFHfAgAEoKSnB1q1bsXHjRiWxP3r0KNLS0pCRkYGqqirk5uYiMDCwy/5ZLBbo9Xo0NzejuLgYp0+fxp07dzBv3jxVverqauTm5uLEiRM4ceIEiouLsWXLFlWdKVOmoLS0FO3t7b2eTyIiIpsiRET0TuLj40Wv13dZNm/ePAkICBAREaPRKC4uLtLW1qaqM2rUKMnIyBARkfXr14uDg4Pcv39fKT916pTY29tLfX19t+3V1NQIANm7d69y7ebNmwJAKioquu27q6ur7N+/X3UtMzNTXF1dRURkwoQJkp2dLRaLRUaNGiV5eXmSlpYmPj4+Sv3g4GBJTExUxYiOjpaIiAgRESkoKBBHR0d58OCBakwAxGAwiIjIgQMHxN/fXywWi1Knvb1dNBqNFBQUdDvujoqKigSAPH78uFOZtb2SkpIuY02fPl2mTZumPH79+rUMGDBA4uLilGv19fUCQC5cuCAiIj/99JPodDpVO3V1dQJAKisru4wrIvLNN9/IunXrRERk+/bt4ufnJy9fvuxyTD4+PpKWliYiIoWFheLg4CC1tbVKufU5Li0tFZG3r5/+/fvL06dPlTopKSny7bffquKazWYBIHfv3u2yXSIioi8FV2yJiD4CEYGdnR0AwGw2o6WlBe7u7nByclL+ampqUF1drdwzfPhwfPXVV8rjqVOnwmKxoLKy8m/bCwoKUv739PQEgB5XKVtbWzttQ+5o0aJFyMzMRHFxMZ4/f46IiIhOdSoqKhASEqK6FhISgoqKCqXc29sbXl5eqjF1ZDabcfv2bTg7OyvzMmjQILS1tanm5n2JCAAoz0VXOs6dg4MD3N3dVaun1pVq63yazWYUFRWpnssxY8YAgKrPHeMCb58Xa4zo6Gi0trZi5MiRSExMhMFg6HaF3TqP3t7eyrWxY8fCzc1NmWvg7fZlZ2fnLtuz0mg0AKCsPhMREX2pHD93B4iIvgQVFRUYMWIEAKClpQWenp6q78RafayDjPr06aP8b03iejqZ2cPDA48fP+62PDY2FmvXrsWGDRsQFxcHR8d/5u2hpaUFkyZNwqFDhzqVDR48+IPjWxO/nk5z7jh3AJRTrjs+Bv43ny0tLYiMjMTPP//cKZb1Q4Xu4lpjeHt7o7KyEmfOnMHp06exYsUKbNu2DcXFxZ3ue1c9tWdl3fr+MeaWiIjo/xkTWyKiD3T27FncuHED33//PQBAq9WioaEBjo6OPSZYtbW1ePjwobLCefHiRdjb28Pf3x8A0Ldv34/2m7MTJ07ErVu3ui0fNGgQ5syZg5ycHOzatavLOgEBATCZTIiPj1eumUwmjB07Vimvq6tDfX29kvBdvHhRFUOr1eLIkSMYMmQIXFxcPnRYKq2trdi9ezdCQ0M/aiKn1Wpx9OhR+Pr6flDCr9FoEBkZicjISKxcuRJjxozBjRs3oNVqVfWs81hXV6es2t66dQtPnjxR5vpdlZeX4+uvv4aHh8d795uIiMgWcCsyEVEvtLe3o6GhAQ8ePEBZWRk2b94MvV6P2bNnY+HChQCAsLAwTJ06FXPnzkVhYSHu3r2LP//8E6mpqbh8+bISq1+/foiPj4fZbIbRaERSUhJiYmIwbNgwAG9XHa9fv47Kyko0NTXh1atX793vWbNm4Y8//uixTlZWFpqampRttn+VkpKCrKwspKeno6qqCr/++iuOHTuG5ORkZdx+fn6qMaWmpqpixMbGwsPDA3q9HkajETU1NTh37hySkpJw//79Xo2psbERDQ0NqKqqwuHDhxESEoKmpiakp6f3Ks7fWblyJZqbm7FgwQJcunQJ1dXVKCgoQEJCwjt/8JCVlYV9+/ahvLwcd+7cwcGDB6HRaODj49OpblhYGAIDAxEbG4uysjKUlpZi4cKFmD59ercHgHXHaDRCp9P16h4iIiJbxMSWiKgX8vPz4enpCV9fX4SHh6OoqAi///478vLy4ODgAODtltCTJ08iNDQUCQkJys/R3Lt3T3XS8OjRoxEVFYWIiAjodDoEBQVh586dSnliYiL8/f0xefJkDB48GCaT6b37HRsbi5s3b/b4/V2NRgN3d/duy+fOnYvffvsNv/zyC8aNG4eMjAxkZmZixowZAAB7e3sYDAa0trZiypQpWLJkCTZt2qSK0b9/f5w/fx7Dhw9HVFQUAgICsHjxYrS1tfV6Bdff3x9eXl6YNGkStmzZgrCwMJSXl/d6VfPveHl5wWQy4c2bN9DpdAgMDMSaNWvg5uYGe/t3ext1c3PDnj17EBISgqCgIJw5cwbHjx/vcr7t7OyQl5eHgQMHIjQ0FGFhYRg5ciSOHDnSq363tbUhNzcXiYmJvbqPiIjIFtmJ9aQNIiL6ZDZs2IDc3Fxcu3btk7WZkpKCp0+fIiMj45O1SZ9Peno6DAYDCgsLP3dXiIiI/nFcsSUi+pdITU2Fj49Pj4dM0ZejT58+2LFjx+fuBhER0SfBw6OIiP4l3Nzc8OOPP37ubtAnsmTJks/dBSIiok+GW5GJiIiIiIjIpnErMhEREREREdk0JrZERERERERk05jYEhERERERkU1jYktEREREREQ2jY
ktERERERER2TQmtkRERERERGTTmNgSERERERGRTWNiS0RERERERDbtv2WceDMAUeiaAAAAAElFTkSuQmCC",
1324 | "text/plain": [
1325 | ""
1326 | ]
1327 | },
1328 | "metadata": {},
1329 | "output_type": "display_data"
1330 | }
1331 | ],
1332 | "source": [
1333 | "# Initialize PositionalEncoding\n",
1334 | "pos_encoding_layer = PositionalEncoding(d_model, max_len)\n",
1335 | "\n",
1336 | "# Extract the positional encodings\n",
1337 | "pos_encoding = pos_encoding_layer.pe.squeeze(1).numpy()\n",
1338 | "\n",
1339 | "# Plot the positional encoding\n",
1340 | "plt.figure(figsize=(12, 8))\n",
1341 | "plt.pcolormesh(pos_encoding, cmap='viridis')\n",
1342 | "plt.xlabel('Depth (Model Dimension)')\n",
1343 | "plt.xlim((0, d_model))\n",
1344 | "plt.ylabel('Position in Sequence')\n",
1345 | "plt.ylim((0, max_len))\n",
1346 | "plt.colorbar(label=\"Encoding Value\")\n",
1347 | "plt.title('Positional Encoding Visualization (PositionalEncoding Class)')\n",
1348 | "plt.show()\n"
1349 | ]
1350 | },
1351 | {
1352 | "cell_type": "code",
1353 | "execution_count": null,
1354 | "metadata": {
1355 | "executionInfo": {
1356 | "elapsed": 1,
1357 | "status": "aborted",
1358 | "timestamp": 1724522516517,
1359 | "user": {
1360 | "displayName": "Roland Potthast",
1361 | "userId": "09141136587533247770"
1362 | },
1363 | "user_tz": -120
1364 | },
1365 | "id": "ZRHPIJYYeW1Y"
1366 | },
1367 | "outputs": [],
1368 | "source": []
1369 | }
1370 | ],
1371 | "metadata": {
1372 | "colab": {
1373 | "authorship_tag": "ABX9TyOqzrZ2Ox1pYCh9SvUgAsLy",
1374 | "provenance": []
1375 | },
1376 | "kernelspec": {
1377 | "display_name": "Python 3 (ipykernel)",
1378 | "language": "python",
1379 | "name": "python3"
1380 | },
1381 | "language_info": {
1382 | "codemirror_mode": {
1383 | "name": "ipython",
1384 | "version": 3
1385 | },
1386 | "file_extension": ".py",
1387 | "mimetype": "text/x-python",
1388 | "name": "python",
1389 | "nbconvert_exporter": "python",
1390 | "pygments_lexer": "ipython3",
1391 | "version": "3.11.7"
1392 | }
1393 | },
1394 | "nbformat": 4,
1395 | "nbformat_minor": 4
1396 | }
1397 |
--------------------------------------------------------------------------------
/tutorial3/3_3_RAG_example_0.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": 1,
6 | "metadata": {
7 | "colab": {
8 | "base_uri": "https://localhost:8080/"
9 | },
10 | "id": "4lfRO8R_oZDt",
11 | "outputId": "d89a3da0-6500-4d06-f8a5-b305cf11de83"
12 | },
13 | "outputs": [
14 | {
15 | "name": "stdout",
16 | "output_type": "stream",
17 | "text": [
18 | "Collecting transformers\n",
19 | " Downloading transformers-4.44.2-py3-none-any.whl.metadata (43 kB)\n",
20 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m43.7/43.7 kB\u001b[0m \u001b[31m3.4 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
21 | "\u001b[?25hCollecting faiss-cpu\n",
22 | " Downloading faiss_cpu-1.8.0.post1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (3.7 kB)\n",
23 | "Requirement already satisfied: numpy in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (1.26.4)\n",
24 | "Requirement already satisfied: torch in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (2.4.0)\n",
25 | "Requirement already satisfied: filelock in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from transformers) (3.15.4)\n",
26 | "Collecting huggingface-hub<1.0,>=0.23.2 (from transformers)\n",
27 | " Downloading huggingface_hub-0.25.0-py3-none-any.whl.metadata (13 kB)\n",
28 | "Requirement already satisfied: packaging>=20.0 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from transformers) (24.1)\n",
29 | "Requirement already satisfied: pyyaml>=5.1 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from transformers) (6.0.2)\n",
30 | "Collecting regex!=2019.12.17 (from transformers)\n",
31 | " Downloading regex-2024.9.11-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (40 kB)\n",
32 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m40.5/40.5 kB\u001b[0m \u001b[31m5.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
33 | "\u001b[?25hRequirement already satisfied: requests in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from transformers) (2.32.3)\n",
34 | "Collecting safetensors>=0.4.1 (from transformers)\n",
35 | " Downloading safetensors-0.4.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (3.8 kB)\n",
36 | "Collecting tokenizers<0.20,>=0.19 (from transformers)\n",
37 | " Downloading tokenizers-0.19.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (6.7 kB)\n",
38 | "Collecting tqdm>=4.27 (from transformers)\n",
39 | " Downloading tqdm-4.66.5-py3-none-any.whl.metadata (57 kB)\n",
40 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m57.6/57.6 kB\u001b[0m \u001b[31m9.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
41 | "\u001b[?25hRequirement already satisfied: typing-extensions>=4.8.0 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from torch) (4.12.2)\n",
42 | "Requirement already satisfied: sympy in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from torch) (1.13.2)\n",
43 | "Requirement already satisfied: networkx in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from torch) (3.3)\n",
44 | "Requirement already satisfied: jinja2 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from torch) (3.1.4)\n",
45 | "Requirement already satisfied: fsspec in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from torch) (2024.6.1)\n",
46 | "Requirement already satisfied: nvidia-cuda-nvrtc-cu12==12.1.105 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from torch) (12.1.105)\n",
47 | "Requirement already satisfied: nvidia-cuda-runtime-cu12==12.1.105 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from torch) (12.1.105)\n",
48 | "Requirement already satisfied: nvidia-cuda-cupti-cu12==12.1.105 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from torch) (12.1.105)\n",
49 | "Requirement already satisfied: nvidia-cudnn-cu12==9.1.0.70 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from torch) (9.1.0.70)\n",
50 | "Requirement already satisfied: nvidia-cublas-cu12==12.1.3.1 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from torch) (12.1.3.1)\n",
51 | "Requirement already satisfied: nvidia-cufft-cu12==11.0.2.54 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from torch) (11.0.2.54)\n",
52 | "Requirement already satisfied: nvidia-curand-cu12==10.3.2.106 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from torch) (10.3.2.106)\n",
53 | "Requirement already satisfied: nvidia-cusolver-cu12==11.4.5.107 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from torch) (11.4.5.107)\n",
54 | "Requirement already satisfied: nvidia-cusparse-cu12==12.1.0.106 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from torch) (12.1.0.106)\n",
55 | "Requirement already satisfied: nvidia-nccl-cu12==2.20.5 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from torch) (2.20.5)\n",
56 | "Requirement already satisfied: nvidia-nvtx-cu12==12.1.105 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from torch) (12.1.105)\n",
57 | "Requirement already satisfied: triton==3.0.0 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from torch) (3.0.0)\n",
58 | "Requirement already satisfied: nvidia-nvjitlink-cu12 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from nvidia-cusolver-cu12==11.4.5.107->torch) (12.6.20)\n",
59 | "Requirement already satisfied: MarkupSafe>=2.0 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from jinja2->torch) (2.1.5)\n",
60 | "Requirement already satisfied: charset-normalizer<4,>=2 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from requests->transformers) (3.3.2)\n",
61 | "Requirement already satisfied: idna<4,>=2.5 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from requests->transformers) (3.8)\n",
62 | "Requirement already satisfied: urllib3<3,>=1.21.1 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from requests->transformers) (2.2.2)\n",
63 | "Requirement already satisfied: certifi>=2017.4.17 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from requests->transformers) (2024.7.4)\n",
64 | "Requirement already satisfied: mpmath<1.4,>=1.1.0 in /media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages (from sympy->torch) (1.3.0)\n",
65 | "Downloading transformers-4.44.2-py3-none-any.whl (9.5 MB)\n",
66 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m9.5/9.5 MB\u001b[0m \u001b[31m83.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m:00:01\u001b[0m00:01\u001b[0m\n",
67 | "\u001b[?25hDownloading faiss_cpu-1.8.0.post1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (27.0 MB)\n",
68 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m27.0/27.0 MB\u001b[0m \u001b[31m63.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m:00:01\u001b[0m00:01\u001b[0m\n",
69 | "\u001b[?25hDownloading huggingface_hub-0.25.0-py3-none-any.whl (436 kB)\n",
70 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m436.4/436.4 kB\u001b[0m \u001b[31m26.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
71 | "\u001b[?25hDownloading regex-2024.9.11-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (792 kB)\n",
72 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m792.8/792.8 kB\u001b[0m \u001b[31m44.2 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
73 | "\u001b[?25hDownloading safetensors-0.4.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (435 kB)\n",
74 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m435.0/435.0 kB\u001b[0m \u001b[31m58.2 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
75 | "\u001b[?25hDownloading tokenizers-0.19.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.6 MB)\n",
76 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m3.6/3.6 MB\u001b[0m \u001b[31m81.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m:00:01\u001b[0m\n",
77 | "\u001b[?25hDownloading tqdm-4.66.5-py3-none-any.whl (78 kB)\n",
78 | "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m78.4/78.4 kB\u001b[0m \u001b[31m14.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
79 | "\u001b[?25hInstalling collected packages: tqdm, safetensors, regex, faiss-cpu, huggingface-hub, tokenizers, transformers\n",
80 | "Successfully installed faiss-cpu-1.8.0.post1 huggingface-hub-0.25.0 regex-2024.9.11 safetensors-0.4.5 tokenizers-0.19.1 tqdm-4.66.5 transformers-4.44.2\n",
81 | "\n",
82 | "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m A new release of pip is available: \u001b[0m\u001b[31;49m24.0\u001b[0m\u001b[39;49m -> \u001b[0m\u001b[32;49m24.2\u001b[0m\n",
83 | "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m To update, run: \u001b[0m\u001b[32;49mpip install --upgrade pip\u001b[0m\n",
84 | "Note: you may need to restart the kernel to use updated packages.\n"
85 | ]
86 | }
87 | ],
88 | "source": [
89 | "%pip install transformers faiss-cpu numpy torch"
90 | ]
91 | },
92 | {
93 | "cell_type": "code",
94 | "execution_count": 1,
95 | "metadata": {
96 | "colab": {
97 | "base_uri": "https://localhost:8080/"
98 | },
99 | "id": "LSw7vBnmoaHJ",
100 | "outputId": "0b422bb4-6686-4972-93eb-f0e5d064bf2c"
101 | },
102 | "outputs": [],
103 | "source": [
104 | "import numpy as np\n",
105 | "import faiss\n",
106 | "import torch\n",
107 | "from transformers import AutoTokenizer, AutoModel"
108 | ]
109 | },
110 | {
111 | "cell_type": "code",
112 | "execution_count": 2,
113 | "metadata": {
114 | "colab": {
115 | "base_uri": "https://localhost:8080/"
116 | },
117 | "id": "LSw7vBnmoaHJ",
118 | "outputId": "0b422bb4-6686-4972-93eb-f0e5d064bf2c"
119 | },
120 | "outputs": [
121 | {
122 | "name": "stderr",
123 | "output_type": "stream",
124 | "text": [
125 | "/media/nas/uwork1/shollbor/pys/lib64/python3.11/site-packages/transformers/tokenization_utils_base.py:1601: FutureWarning: `clean_up_tokenization_spaces` was not set. It will be set to `True` by default. This behavior will be depracted in transformers v4.45, and will be then set to `False` by default. For more details check this issue: https://github.com/huggingface/transformers/issues/31884\n",
126 | " warnings.warn(\n"
127 | ]
128 | }
129 | ],
130 | "source": [
131 | "# Step 1: Load the LLM\n",
132 | "model_name = \"distilbert-base-uncased\" # You can use any compatible model\n",
133 | "tokenizer = AutoTokenizer.from_pretrained(model_name)\n",
134 | "model = AutoModel.from_pretrained(model_name)"
135 | ]
136 | },
137 | {
138 | "cell_type": "code",
139 | "execution_count": 3,
140 | "metadata": {
141 | "colab": {
142 | "base_uri": "https://localhost:8080/"
143 | },
144 | "id": "LSw7vBnmoaHJ",
145 | "outputId": "0b422bb4-6686-4972-93eb-f0e5d064bf2c"
146 | },
147 | "outputs": [],
148 | "source": [
149 | "# Step 2: Prepare some documents for the vector database\n",
150 | "documents = [\n",
151 | " \"The cat sat on the mat.\",\n",
152 | " \"The dog chased the ball.\",\n",
153 | " \"Birds fly in the sky.\",\n",
154 | " \"Fish swim in the ocean.\",\n",
155 | " \"Tables have four legs.\"\n",
156 | "]"
157 | ]
158 | },
159 | {
160 | "cell_type": "code",
161 | "execution_count": 5,
162 | "metadata": {
163 | "colab": {
164 | "base_uri": "https://localhost:8080/"
165 | },
166 | "id": "LSw7vBnmoaHJ",
167 | "outputId": "0b422bb4-6686-4972-93eb-f0e5d064bf2c"
168 | },
169 | "outputs": [],
170 | "source": [
171 | "# Step 3: Encode documents into vectors\n",
172 | "def encode_documents(documents):\n",
173 | " inputs = tokenizer(documents, padding=True, truncation=True, return_tensors=\"pt\")\n",
174 | " with torch.no_grad():\n",
175 | " embeddings = model(**inputs).last_hidden_state.mean(dim=1) # Average pooling\n",
176 | " return embeddings.numpy()\n",
177 | "\n",
178 | "# Create the vector database\n",
179 | "document_vectors = encode_documents(documents)\n",
180 | "dim = document_vectors.shape[1]"
181 | ]
182 | },
183 | {
184 | "cell_type": "code",
185 | "execution_count": 6,
186 | "metadata": {
187 | "colab": {
188 | "base_uri": "https://localhost:8080/"
189 | },
190 | "id": "LSw7vBnmoaHJ",
191 | "outputId": "0b422bb4-6686-4972-93eb-f0e5d064bf2c"
192 | },
193 | "outputs": [],
194 | "source": [
195 | "# Step 4: Build the FAISS index\n",
196 | "index = faiss.IndexFlatL2(dim) # Using L2 distance\n",
197 | "index.add(document_vectors) # Add document vectors to the index"
198 | ]
199 | },
200 | {
201 | "cell_type": "code",
202 | "execution_count": 7,
203 | "metadata": {
204 | "colab": {
205 | "base_uri": "https://localhost:8080/"
206 | },
207 | "id": "LSw7vBnmoaHJ",
208 | "outputId": "0b422bb4-6686-4972-93eb-f0e5d064bf2c"
209 | },
210 | "outputs": [],
211 | "source": [
212 | "# Step 5: Define a function for RAG\n",
213 | "def retrieve_and_generate(query):\n",
214 | " # Encode the query\n",
215 | " query_vector = encode_documents([query])\n",
216 | "\n",
217 | " # Retrieve top-k similar documents\n",
218 | " k = 1 # Number of top results to retrieve\n",
219 | " D, I = index.search(query_vector, k) # D: distances, I: indices\n",
220 | "\n",
221 | " # Get the relevant documents\n",
222 | " relevant_docs = [documents[i] for i in I[0]]\n",
223 | "\n",
224 | " # Simple \"generation\" (for demonstration, just concatenate)\n",
225 | " response = \" \".join(relevant_docs)\n",
226 | " return response"
227 | ]
228 | },
229 | {
230 | "cell_type": "code",
231 | "execution_count": 8,
232 | "metadata": {
233 | "colab": {
234 | "base_uri": "https://localhost:8080/"
235 | },
236 | "id": "LSw7vBnmoaHJ",
237 | "outputId": "0b422bb4-6686-4972-93eb-f0e5d064bf2c"
238 | },
239 | "outputs": [
240 | {
241 | "name": "stdout",
242 | "output_type": "stream",
243 | "text": [
244 | "Response: Fish swim in the ocean.\n"
245 | ]
246 | }
247 | ],
248 | "source": [
249 | "# Step 6: Use the RAG system\n",
250 | "query = \"What do animals do?\"\n",
251 | "response = retrieve_and_generate(query)\n",
252 | "print(\"Response:\", response)"
253 | ]
254 | },
255 | {
256 | "cell_type": "code",
257 | "execution_count": 9,
258 | "metadata": {
259 | "colab": {
260 | "base_uri": "https://localhost:8080/"
261 | },
262 | "id": "BHdRR80won5N",
263 | "outputId": "2c21bb9c-cbf9-4bb8-a4dd-fae086335906"
264 | },
265 | "outputs": [
266 | {
267 | "name": "stdout",
268 | "output_type": "stream",
269 | "text": [
270 | "Response: The dog chased the ball.\n"
271 | ]
272 | }
273 | ],
274 | "source": [
275 | "query = \"What do you know about barking \"\n",
276 | "response = retrieve_and_generate(query)\n",
277 | "print(\"Response:\", response)"
278 | ]
279 | },
280 | {
281 | "cell_type": "code",
282 | "execution_count": null,
283 | "metadata": {
284 | "id": "1iz0oMpZpeU7"
285 | },
286 | "outputs": [],
287 | "source": []
288 | }
289 | ],
290 | "metadata": {
291 | "colab": {
292 | "provenance": []
293 | },
294 | "kernelspec": {
295 | "display_name": "Python 3 (ipykernel)",
296 | "language": "python",
297 | "name": "python3"
298 | },
299 | "language_info": {
300 | "codemirror_mode": {
301 | "name": "ipython",
302 | "version": 3
303 | },
304 | "file_extension": ".py",
305 | "mimetype": "text/x-python",
306 | "name": "python",
307 | "nbconvert_exporter": "python",
308 | "pygments_lexer": "ipython3",
309 | "version": "3.11.9"
310 | }
311 | },
312 | "nbformat": 4,
313 | "nbformat_minor": 4
314 | }
315 |
--------------------------------------------------------------------------------
/tutorial3/E-AI_Talks_Basics_03_LLM_Transformer_RAG.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/eumetnet-e-ai/tutorials/8dedbddb429c60fd32cd1024cdc76c9d321ea651/tutorial3/E-AI_Talks_Basics_03_LLM_Transformer_RAG.pdf
--------------------------------------------------------------------------------
/tutorial3/E-AI_Talks_Basics_03_LLM_Transformer_RAG.pptx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/eumetnet-e-ai/tutorials/8dedbddb429c60fd32cd1024cdc76c9d321ea651/tutorial3/E-AI_Talks_Basics_03_LLM_Transformer_RAG.pptx
--------------------------------------------------------------------------------
/tutorial4/4-1#git_demo_store#hooks#post-receive:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 |
3 | # Simple post-receive hook demo
4 | WORK_TREE="$HOME/e-ai_tutorials/tutorial4/git_demo_work"  # use $HOME instead of ~, which is not expanded inside quotes
5 | GIT_DIR="$(pwd)" # Automatically set to the path of the bare repository
6 |
7 | echo "Post-receive hook triggered. Updating work tree..."
8 | git --work-tree="$WORK_TREE" --git-dir="$GIT_DIR" checkout -f
9 | echo "Work tree updated successfully."
10 |
11 |
12 |
--------------------------------------------------------------------------------
/tutorial4/4-2_provision.eccodes.sh:
--------------------------------------------------------------------------------
1 | set -e
2 |
3 | apt-get update && apt-get install -y \
4 | wget \
5 | python3 \
6 | gcc g++ gfortran \
7 | libc-dev \
8 | python3-dev python3-venv \
9 | git \
10 | cmake \
11 | make \
12 | libaec-dev \
13 | perl \
14 | bzip2 \
15 | && rm -rf /var/lib/apt/lists/*
16 |
17 | wget -q https://confluence.ecmwf.int/download/attachments/45757960/eccodes-2.33.0-Source.tar.gz
18 |
19 | tar xzf eccodes-2.33.0-Source.tar.gz
20 | rm eccodes-2.33.0-Source.tar.gz
21 | cd eccodes-2.33.0-Source && mkdir build
22 | cd build && cmake .. -DCMAKE_INSTALL_MESSAGE=NEVER
23 | make -j$(grep processor /proc/cpuinfo | wc -l)
24 | make install VERBOSE=0
25 | cd ../../ && rm -rf eccodes-2.33.0-Source
26 |
27 | # clean up packages that were used only for this build process
28 | apt-get remove -y \
29 | gcc g++ gfortran \
30 | libc-dev
31 | apt-get autoremove -y
32 | rm -rf /var/lib/apt/lists/*
33 |
34 |
35 | # Optional: Use local definition files
36 | # cd /usr/local/share/eccodes/
37 | # wget -q http://opendata.dwd.de/weather/lib/grib/eccodes_definitions.edzw-2.32.0-1.tar.bz2
38 | # tar xf eccodes_definitions.edzw-2.32.0-1.tar.bz2
39 | # rm eccodes_definitions.edzw-2.32.0-1.tar.bz2
40 | #
41 | # To use these, add the following line to the Dockerfile
42 | # ENV ECCODES_DEFINITION_PATH="/usr/local/share/eccodes/definitions.edzw-2.32.0-1/:/usr/local/share/eccodes/definitions/"
43 |
--------------------------------------------------------------------------------
/tutorial4/4_1_2_mlflow_server_via_ngrok.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": null,
6 | "metadata": {
7 | "colab": {
8 | "base_uri": "https://localhost:8080/"
9 | },
10 | "id": "alzT-_Q7OFyl",
11 | "outputId": "c8bc38c4-961f-46b2-d2d6-6e7b7a17c5f1"
12 | },
13 | "outputs": [],
14 | "source": [
15 | "!pip install pyngrok"
16 | ]
17 | },
18 | {
19 | "cell_type": "code",
20 | "execution_count": null,
21 | "metadata": {
22 | "colab": {
23 | "base_uri": "https://localhost:8080/"
24 | },
25 | "id": "5wr-_JqjOiWy",
26 | "outputId": "9fe28cb4-ffd0-4312-e2f0-239f3513a70c"
27 | },
28 | "outputs": [],
29 | "source": [
30 | "from pyngrok import ngrok\n",
31 | "ngrok.set_auth_token(\"xxx\")"
32 | ]
33 | },
34 | {
35 | "cell_type": "code",
36 | "execution_count": null,
37 | "metadata": {
38 | "colab": {
39 | "base_uri": "https://localhost:8080/"
40 | },
41 | "id": "xcUt_DTOOokx",
42 | "outputId": "31e54521-2580-4d79-fa99-38a8e0fc37b3"
43 | },
44 | "outputs": [],
45 | "source": [
46 | "public_url = ngrok.connect(5000)\n",
47 | "print(\"Public URL:\", public_url)"
48 | ]
49 | },
50 | {
51 | "cell_type": "code",
52 | "execution_count": null,
53 | "metadata": {
54 | "colab": {
55 | "base_uri": "https://localhost:8080/"
56 | },
57 | "id": "zkdF9PGyPBxB",
58 | "outputId": "1115bb3d-e8f7-408e-cec4-550d69e0a185"
59 | },
60 | "outputs": [],
61 | "source": [
62 | "!pip install mlflow"
63 | ]
64 | },
65 | {
66 | "cell_type": "code",
67 | "execution_count": null,
68 | "metadata": {
69 | "id": "sPswTaVtPOMQ"
70 | },
71 | "outputs": [],
72 | "source": [
73 | "import os\n",
74 | "\n",
75 | "backend_store = \"/content/mlflow_backend\"\n",
76 | "artifact_store = \"/content/mlflow_artifacts\"\n",
77 | "\n",
78 | "os.makedirs(backend_store, exist_ok=True)\n",
79 | "os.makedirs(artifact_store, exist_ok=True)"
80 | ]
81 | },
82 | {
83 | "cell_type": "code",
84 | "execution_count": null,
85 | "metadata": {
86 | "colab": {
87 | "base_uri": "https://localhost:8080/"
88 | },
89 | "id": "nbuXvHVNPRpf",
90 | "outputId": "8e379e3f-399c-4f8c-a9f4-0c7a64248a08"
91 | },
92 | "outputs": [],
93 | "source": [
94 | "!mlflow server \\\n",
95 | " --backend-store-uri sqlite:///{backend_store}/mlflow.db \\\n",
96 | " --default-artifact-root {artifact_store} \\\n",
97 | " --host 0.0.0.0 \\\n",
98 | " --port 5000"
99 | ]
100 | },
101 | {
102 | "cell_type": "code",
103 | "execution_count": null,
104 | "metadata": {
105 | "id": "ePvcp9ShOsJ5"
106 | },
107 | "outputs": [],
108 | "source": [
109 | "# Disconnecting public url\n",
110 | "#ngrok.disconnect(public_url)"
111 | ]
112 | }
113 | ],
114 | "metadata": {
115 | "colab": {
116 | "provenance": []
117 | },
118 | "kernelspec": {
119 | "display_name": "Python 3 (ipykernel)",
120 | "language": "python",
121 | "name": "python3"
122 | },
123 | "language_info": {
124 | "codemirror_mode": {
125 | "name": "ipython",
126 | "version": 3
127 | },
128 | "file_extension": ".py",
129 | "mimetype": "text/x-python",
130 | "name": "python",
131 | "nbconvert_exporter": "python",
132 | "pygments_lexer": "ipython3",
133 | "version": "3.11.10"
134 | }
135 | },
136 | "nbformat": 4,
137 | "nbformat_minor": 4
138 | }
139 |
--------------------------------------------------------------------------------
/tutorial4/4_1_3_MLFlow_Application.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "code",
5 | "execution_count": null,
6 | "metadata": {
7 | "colab": {
8 | "base_uri": "https://localhost:8080/"
9 | },
10 | "id": "N54DAuujPdx5",
11 | "outputId": "6442c7f3-5679-41ca-c6ac-d53387c73418"
12 | },
13 | "outputs": [],
14 | "source": [
15 | "!pip install mlflow"
16 | ]
17 | },
18 | {
19 | "cell_type": "code",
20 | "execution_count": null,
21 | "metadata": {
22 | "colab": {
23 | "base_uri": "https://localhost:8080/"
24 | },
25 | "id": "1mGoqsN_PjEf",
26 | "outputId": "8ea062f5-e58f-4765-a652-fe0d82e6fb24"
27 | },
28 | "outputs": [],
29 | "source": [
30 | "import mlflow\n",
31 | "\n",
32 | "mlflow.set_tracking_uri(\"https://d23b-34-105-74-98.ngrok-free.app\") # Replace with your public URL\n",
33 | "mlflow.set_experiment(\"Colab Experiment\")"
34 | ]
35 | },
36 | {
37 | "cell_type": "code",
38 | "execution_count": null,
39 | "metadata": {
40 | "colab": {
41 | "base_uri": "https://localhost:8080/"
42 | },
43 | "id": "x40UwF5_Pjr3",
44 | "outputId": "b0653ff0-db84-4b34-85c1-aa86b6689598"
45 | },
46 | "outputs": [],
47 | "source": [
48 | "with mlflow.start_run(run_name=\"Example Run\"):\n",
49 | " # Log parameters\n",
50 | " mlflow.log_param(\"param1\", 10)\n",
51 | " mlflow.log_param(\"param2\", 20)\n",
52 | "\n",
53 | " # Log metrics\n",
54 | " mlflow.log_metric(\"accuracy\", 0.95)\n",
55 | " mlflow.log_metric(\"loss\", 0.1)\n",
56 | "\n",
57 | " # Log an artifact (e.g., a text file)\n",
58 | " with open(\"output.txt\", \"w\") as f:\n",
59 | " f.write(\"This is an example artifact.\")\n",
60 | " mlflow.log_artifact(\"output.txt\")\n",
61 | "print(\"Run completed and logged to MLflow server.\")\n"
62 | ]
63 | },
64 | {
65 | "cell_type": "code",
66 | "execution_count": null,
67 | "metadata": {
68 | "colab": {
69 | "base_uri": "https://localhost:8080/"
70 | },
71 | "id": "5RsAAbLCPoPO",
72 | "outputId": "07e91428-9645-43e6-96bf-92cb809b1d20"
73 | },
74 | "outputs": [],
75 | "source": [
76 | "# Install required packages\n",
77 | "!pip install mlflow torch torchvision scikit-learn matplotlib\n",
78 | "\n",
79 | "import mlflow\n",
80 | "import torch\n",
81 | "import torch.nn as nn\n",
82 | "import torch.optim as optim\n",
83 | "from sklearn.datasets import load_iris\n",
84 | "from sklearn.model_selection import train_test_split\n",
85 | "from sklearn.preprocessing import OneHotEncoder\n",
86 | "from torch.utils.data import DataLoader, TensorDataset\n",
87 | "\n",
88 | "# Set MLflow Tracking URI (Replace with your ngrok public URL from Notebook 1)\n",
89 | "mlflow.set_tracking_uri(\"https://d23b-34-105-74-98.ngrok-free.app\") # Replace with your public URL\n",
90 | "mlflow.set_experiment(\"Loss Curves Training\")\n",
91 | "\n",
92 | "# Load the Iris dataset\n",
93 | "data = load_iris()\n",
94 | "X = data['data']\n",
95 | "y = data['target']\n",
96 | "\n",
97 | "# One-hot encode the target\n",
98 | "encoder = OneHotEncoder(sparse_output=False)\n",
99 | "y = encoder.fit_transform(y.reshape(-1, 1))\n",
100 | "\n",
101 | "# Split data\n",
102 | "X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=42)\n",
103 | "\n",
104 | "# Convert to PyTorch tensors\n",
105 | "X_train = torch.tensor(X_train, dtype=torch.float32)\n",
106 | "X_val = torch.tensor(X_val, dtype=torch.float32)\n",
107 | "y_train = torch.tensor(y_train, dtype=torch.float32)\n",
108 | "y_val = torch.tensor(y_val, dtype=torch.float32)\n",
109 | "\n",
110 | "# Create DataLoader\n",
111 | "def get_data_loader(X, y, batch_size):\n",
112 | " dataset = TensorDataset(X, y)\n",
113 | " return DataLoader(dataset, batch_size=batch_size, shuffle=True)\n",
114 | "\n",
115 | "# Define a simple PyTorch model\n",
116 | "class SimpleNN(nn.Module):\n",
117 | " def __init__(self, input_dim, hidden_dim, output_dim):\n",
118 | " super(SimpleNN, self).__init__()\n",
119 | " self.fc1 = nn.Linear(input_dim, hidden_dim)\n",
120 | " self.relu = nn.ReLU()\n",
121 | " self.fc2 = nn.Linear(hidden_dim, output_dim)\n",
    122 |     "        # no softmax here: nn.CrossEntropyLoss expects raw logits\n",
123 | "\n",
124 | " def forward(self, x):\n",
125 | " x = self.fc1(x)\n",
126 | " x = self.relu(x)\n",
127 | " x = self.fc2(x)\n",
    128 |     "        return x  # return raw logits\n",
129 | "\n",
130 | "# Train and log metrics to MLflow\n",
131 | "with mlflow.start_run(run_name=\"Loss Curves Example\"):\n",
132 | " # Define model, optimizer, and loss function\n",
133 | " model = SimpleNN(X_train.shape[1], hidden_dim=64, output_dim=y_train.shape[1])\n",
134 | " criterion = nn.CrossEntropyLoss()\n",
135 | " optimizer = optim.Adam(model.parameters(), lr=0.01)\n",
136 | "\n",
137 | " # Create data loaders\n",
138 | " train_loader = get_data_loader(X_train, y_train, batch_size=16)\n",
139 | " val_loader = get_data_loader(X_val, y_val, batch_size=16)\n",
140 | "\n",
141 | " # Training parameters\n",
142 | " epochs = 50\n",
143 | " train_losses = []\n",
144 | " val_losses = []\n",
145 | "\n",
146 | " # Training loop\n",
147 | " for epoch in range(epochs):\n",
148 | " # Training phase\n",
149 | " model.train()\n",
150 | " train_loss = 0.0\n",
151 | " for X_batch, y_batch in train_loader:\n",
152 | " optimizer.zero_grad()\n",
153 | " outputs = model(X_batch)\n",
154 | " loss = criterion(outputs, torch.argmax(y_batch, dim=1))\n",
155 | " loss.backward()\n",
156 | " optimizer.step()\n",
157 | " train_loss += loss.item()\n",
158 | " train_loss /= len(train_loader)\n",
159 | " train_losses.append(train_loss)\n",
160 | "\n",
161 | " # Validation phase\n",
162 | " model.eval()\n",
163 | " val_loss = 0.0\n",
164 | " with torch.no_grad():\n",
165 | " for X_batch, y_batch in val_loader:\n",
166 | " outputs = model(X_batch)\n",
167 | " loss = criterion(outputs, torch.argmax(y_batch, dim=1))\n",
168 | " val_loss += loss.item()\n",
169 | " val_loss /= len(val_loader)\n",
170 | " val_losses.append(val_loss)\n",
171 | "\n",
172 | " # Log metrics to MLflow\n",
173 | " mlflow.log_metric(\"train_loss\", train_loss, step=epoch)\n",
174 | " mlflow.log_metric(\"val_loss\", val_loss, step=epoch)\n",
175 | "\n",
176 | " print(f\"Epoch {epoch + 1}/{epochs} - Train Loss: {train_loss:.4f}, Val Loss: {val_loss:.4f}\")\n",
177 | "\n",
178 | " # Log model parameters\n",
179 | " mlflow.log_param(\"hidden_dim\", 64)\n",
180 | " mlflow.log_param(\"learning_rate\", 0.01)\n",
181 | " mlflow.log_param(\"batch_size\", 16)\n",
182 | " mlflow.log_param(\"epochs\", epochs)\n",
183 | "\n",
184 | " print(\"Training completed and metrics logged to MLflow server.\")\n"
185 | ]
186 | },
187 | {
188 | "cell_type": "code",
189 | "execution_count": null,
190 | "metadata": {
191 | "id": "pbyIKMGBVoms"
192 | },
193 | "outputs": [],
194 | "source": [
    195 |     "myurl = \"https://d23b-34-105-74-98.ngrok-free.app\"  # Replace with your public URL"
196 | ]
197 | },
198 | {
199 | "cell_type": "code",
200 | "execution_count": null,
201 | "metadata": {
202 | "id": "W_-z1_wlYFvN"
203 | },
204 | "outputs": [],
205 | "source": [
206 | "import time"
207 | ]
208 | },
209 | {
210 | "cell_type": "code",
211 | "execution_count": null,
212 | "metadata": {
213 | "colab": {
214 | "base_uri": "https://localhost:8080/"
215 | },
216 | "id": "nIjYEBT-QRb1",
217 | "outputId": "43ff4f31-d876-47bb-f8a4-f7e6171e80f9"
218 | },
219 | "outputs": [],
220 | "source": [
221 | "# Install required packages\n",
222 | "#!pip install mlflow torch torchvision scikit-learn matplotlib\n",
223 | "\n",
224 | "import mlflow\n",
225 | "import torch\n",
226 | "import torch.nn as nn\n",
227 | "import torch.optim as optim\n",
228 | "from sklearn.datasets import load_iris\n",
229 | "from sklearn.model_selection import train_test_split\n",
230 | "from sklearn.preprocessing import OneHotEncoder\n",
231 | "from torch.utils.data import DataLoader, TensorDataset\n",
232 | "\n",
233 | "# Set MLflow Tracking URI (Replace with your ngrok public URL)\n",
234 | "mlflow.set_tracking_uri(myurl) # Replace with your ngrok public URL\n",
235 | "experiment_name = \"Step-by-Step Loss Logging\"\n",
236 | "mlflow.set_experiment(experiment_name)\n",
237 | "\n",
238 | "# Get experiment details and print the link\n",
239 | "experiment = mlflow.get_experiment_by_name(experiment_name)\n",
240 | "experiment_id = experiment.experiment_id\n",
    241 |     "tracking_url = f\"{myurl}/#/experiments/{experiment_id}\"  # link to this experiment in the MLflow UI\n",
242 | "print(f\"MLflow Experiment Tracking URL: {tracking_url}\")\n",
243 | "\n",
244 | "# Load the Iris dataset\n",
245 | "data = load_iris()\n",
246 | "X = data['data']\n",
247 | "y = data['target']\n",
248 | "\n",
249 | "# One-hot encode the target\n",
250 | "encoder = OneHotEncoder(sparse_output=False)\n",
251 | "y = encoder.fit_transform(y.reshape(-1, 1))\n",
252 | "\n",
253 | "# Split data\n",
254 | "X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=42)\n",
255 | "\n",
256 | "# Convert to PyTorch tensors\n",
257 | "X_train = torch.tensor(X_train, dtype=torch.float32)\n",
258 | "X_val = torch.tensor(X_val, dtype=torch.float32)\n",
259 | "y_train = torch.tensor(y_train, dtype=torch.float32)\n",
260 | "y_val = torch.tensor(y_val, dtype=torch.float32)\n",
261 | "\n",
262 | "# Create DataLoader\n",
263 | "def get_data_loader(X, y, batch_size):\n",
264 | " dataset = TensorDataset(X, y)\n",
265 | " return DataLoader(dataset, batch_size=batch_size, shuffle=True)\n",
266 | "\n",
267 | "# Define a simple PyTorch model\n",
268 | "class SimpleNN(nn.Module):\n",
269 | " def __init__(self, input_dim, hidden_dim, output_dim):\n",
270 | " super(SimpleNN, self).__init__()\n",
271 | " self.fc1 = nn.Linear(input_dim, hidden_dim)\n",
272 | " self.relu = nn.ReLU()\n",
273 | " self.fc2 = nn.Linear(hidden_dim, output_dim)\n",
    274 |     "        # no softmax here: nn.CrossEntropyLoss expects raw logits\n",
275 | "\n",
276 | " def forward(self, x):\n",
277 | " x = self.fc1(x)\n",
278 | " x = self.relu(x)\n",
279 | " x = self.fc2(x)\n",
    280 |     "        return x  # return raw logits\n",
281 | "\n",
282 | "time.sleep(5)\n",
283 | "# Train and log metrics to MLflow\n",
284 | "with mlflow.start_run(run_name=\"Interactive Loss Logging\"):\n",
285 | " # Define model, optimizer, and loss function\n",
286 | " model = SimpleNN(X_train.shape[1], hidden_dim=64, output_dim=y_train.shape[1])\n",
287 | " criterion = nn.CrossEntropyLoss()\n",
288 | " optimizer = optim.Adam(model.parameters(), lr=0.01)\n",
289 | "\n",
290 | " # Create data loaders\n",
291 | " train_loader = get_data_loader(X_train, y_train, batch_size=16)\n",
292 | " val_loader = get_data_loader(X_val, y_val, batch_size=16)\n",
293 | "\n",
294 | " # Training parameters\n",
295 | " epochs = 50\n",
296 | "\n",
297 | " # Log initial parameters\n",
298 | " mlflow.log_param(\"hidden_dim\", 64)\n",
299 | " mlflow.log_param(\"learning_rate\", 0.01)\n",
300 | " mlflow.log_param(\"batch_size\", 16)\n",
301 | " mlflow.log_param(\"epochs\", epochs)\n",
302 | "\n",
303 | " # Training loop\n",
304 | " for epoch in range(epochs):\n",
305 | " # Training phase\n",
306 | " model.train()\n",
307 | " train_loss = 0.0\n",
308 | " for X_batch, y_batch in train_loader:\n",
309 | " optimizer.zero_grad()\n",
310 | " outputs = model(X_batch)\n",
311 | " loss = criterion(outputs, torch.argmax(y_batch, dim=1))\n",
312 | " loss.backward()\n",
313 | " optimizer.step()\n",
314 | " train_loss += loss.item()\n",
315 | " train_loss /= len(train_loader)\n",
316 | "\n",
317 | " # Validation phase\n",
318 | " model.eval()\n",
319 | " val_loss = 0.0\n",
320 | " with torch.no_grad():\n",
321 | " for X_batch, y_batch in val_loader:\n",
322 | " outputs = model(X_batch)\n",
323 | " loss = criterion(outputs, torch.argmax(y_batch, dim=1))\n",
324 | " val_loss += loss.item()\n",
325 | " val_loss /= len(val_loader)\n",
326 | "\n",
327 | " # Log metrics to MLflow\n",
328 | " mlflow.log_metric(\"train_loss\", train_loss, step=epoch)\n",
329 | " mlflow.log_metric(\"val_loss\", val_loss, step=epoch)\n",
330 | "\n",
331 | " # Output for real-time updates in Colab\n",
332 | " print(f\"Epoch {epoch + 1}/{epochs} - Train Loss: {train_loss:.4f}, Val Loss: {val_loss:.4f}\")\n",
333 | "\n",
334 | " # Log the model\n",
335 | " mlflow.pytorch.log_model(model, \"model\")\n",
336 | "\n",
337 | "print(\"Training completed. Visit the MLflow Experiment Tracking URL to view metrics in real time.\")\n"
338 | ]
339 | },
340 | {
341 | "cell_type": "code",
342 | "execution_count": null,
343 | "metadata": {
344 | "id": "9S0KEUrVU1Sa"
345 | },
346 | "outputs": [],
347 | "source": []
348 | }
349 | ],
350 | "metadata": {
351 | "colab": {
352 | "provenance": []
353 | },
354 | "kernelspec": {
355 | "display_name": "Python 3 (ipykernel)",
356 | "language": "python",
357 | "name": "python3"
358 | },
359 | "language_info": {
360 | "codemirror_mode": {
361 | "name": "ipython",
362 | "version": 3
363 | },
364 | "file_extension": ".py",
365 | "mimetype": "text/x-python",
366 | "name": "python",
367 | "nbconvert_exporter": "python",
368 | "pygments_lexer": "ipython3",
369 | "version": "3.11.10"
370 | }
371 | },
372 | "nbformat": 4,
373 | "nbformat_minor": 4
374 | }
375 |
--------------------------------------------------------------------------------
/tutorial4/E-AI_Talks_Basics_04_MLOps_final.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/eumetnet-e-ai/tutorials/8dedbddb429c60fd32cd1024cdc76c9d321ea651/tutorial4/E-AI_Talks_Basics_04_MLOps_final.pdf
--------------------------------------------------------------------------------
/tutorial4/E-AI_Talks_Basics_04_MLOps_final_static.pptx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/eumetnet-e-ai/tutorials/8dedbddb429c60fd32cd1024cdc76c9d321ea651/tutorial4/E-AI_Talks_Basics_04_MLOps_final_static.pptx
--------------------------------------------------------------------------------
/tutorial5/1_3_basic_wind_chill_example_with_logging.py:
--------------------------------------------------------------------------------
1 | import torch
2 | import torch.nn as nn
3 | import torch.optim as optim
4 | import numpy as np
5 | import matplotlib.pyplot as plt
6 |
7 | #######
8 |
 9 | # Initialize MLflow
10 | import mlflow
11 | mlflow.set_tracking_uri(uri="http://localhost:5000")
12 | mlflow.set_experiment("Wind Chill Example")
13 |
14 | #######
15 | # Generate data
16 | n_samples = 500
17 |
18 | tt = np.random.uniform(-20, 10, n_samples) # Temperature in Celsius
19 | ff = np.random.uniform(0, 50, n_samples) # Wind speed in km/h
20 |
21 | # Wind Chill Formula
22 | wc = 13.12 + 0.6215 * tt - 11.37 * (ff ** 0.16) + 0.3965 * tt * (ff ** 0.16)
23 |
24 | # Convert to PyTorch tensors
25 | x_train = torch.tensor(np.column_stack((tt, ff)), dtype=torch.float32)
26 | y_train = torch.tensor(wc, dtype=torch.float32).view(-1, 1)
27 |
28 | ##########
29 | # Step 2: Build a Neural Network Model with Hidden Layers
30 | class wind_chill_model(nn.Module):
31 | def __init__(self, hidden_dim):
32 | super(wind_chill_model, self).__init__()
33 | self.fc1 = nn.Linear(2, hidden_dim) # First hidden layer
34 | self.fc2 = nn.Linear(hidden_dim, hidden_dim) # Second hidden layer
35 | self.fc3 = nn.Linear(hidden_dim, 1) # Output layer
36 | self.relu = nn.ReLU() # Activation function
37 |
38 | def forward(self, x):
39 | x = self.relu(self.fc1(x)) # Apply ReLU after the first hidden layer
40 | x = self.relu(self.fc2(x)) # Apply ReLU after the second hidden layer
41 | x = self.fc3(x) # Output layer (no activation for regression)
42 | return x
43 |
44 | hidden_dim = 20
45 | model = wind_chill_model(hidden_dim=hidden_dim)
46 |
47 | # Define the loss function and optimizer
48 | criterion = nn.MSELoss()
49 | optimizer = optim.Adam(model.parameters(), lr=0.0005)
50 |
51 |
52 | #########
53 | # Create a validation data set
54 | n_vsamples=100
55 |
56 | vtt = np.random.uniform(-20, 10, n_vsamples) # Temperature in Celsius
57 | vff = np.random.uniform(0, 50, n_vsamples) # Wind speed in km/h
58 | vwc = 13.12 + 0.6215 * vtt - 11.37 * (vff ** 0.16) + 0.3965 * vtt * (vff ** 0.16)
59 |
60 | x_val = torch.tensor(np.column_stack((vtt, vff)), dtype=torch.float32)
61 | y_val = torch.tensor(vwc, dtype=torch.float32).view(-1, 1)
62 |
63 |
64 | ##########
65 | # Training loop
66 | train_loss = [] # Initialize loss list
67 | validation_loss = [] # validation loss
68 | n_epoch = 10000 # Set number of epochs
69 |
70 | with mlflow.start_run(run_name="logging 01"):
71 | # Log the hyperparameters
72 | mlflow.log_params({
73 | "hidden_dim": hidden_dim,
74 | })
75 |
76 | for epoch in range(n_epoch):
77 | model.train() # Set model to train mode
78 | optimizer.zero_grad() # Clear gradients
79 | y_pred = model(x_train) # Forward pass
80 | loss = criterion(y_pred, y_train) # Compute loss
81 | loss.backward() # Backpropagate error
82 | optimizer.step() # Update weights
83 |
84 | train_loss.append(loss.item()) # Save loss
85 |
 86 |         y_pred = model(x_val)            # predict on the validation dataset
 87 |         vloss = criterion(y_pred, y_val)
88 | validation_loss.append(vloss.item())
89 |
90 | # Print losses every 500 epochs
91 | if (epoch + 1) % 500 == 0:
92 | print(f'Epoch [{epoch + 1}/{n_epoch}], Loss: {loss.item():.4f}, val_loss: {vloss.item():.4f}')
93 |
 94 |         # Log the loss metrics
95 | mlflow.log_metric("loss", loss.item(), step=(epoch+1)*x_train.shape[0])
96 | mlflow.log_metric("val_loss", vloss.item(), step=(epoch+1)*x_train.shape[0])
97 |
98 | ###########
99 | # Loss curve
100 | plt.plot(np.arange(n_epoch),train_loss,label="training loss")
101 | plt.plot(np.arange(n_epoch),validation_loss,label="validation loss")
102 | plt.yscale('log')
103 | plt.xlabel("epoch")
104 | plt.ylabel("loss")
105 | plt.legend()
106 | plt.tight_layout()
107 | mlflow.log_figure(plt.gcf(), "figure.png")
108 |
109 |
110 | ####
111 | from mlflow.models import infer_signature
112 | signature = infer_signature(x_val.numpy(), model(x_val).detach().numpy())
113 | model_info = mlflow.pytorch.log_model(model, "model", signature=signature)
--------------------------------------------------------------------------------
/tutorial5/E-AI_Talks_Basics_05_MLflow_all.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/eumetnet-e-ai/tutorials/8dedbddb429c60fd32cd1024cdc76c9d321ea651/tutorial5/E-AI_Talks_Basics_05_MLflow_all.pdf
--------------------------------------------------------------------------------
/tutorial5/E-AI_Talks_Basics_05_MLflow_all.pptx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/eumetnet-e-ai/tutorials/8dedbddb429c60fd32cd1024cdc76c9d321ea651/tutorial5/E-AI_Talks_Basics_05_MLflow_all.pptx
--------------------------------------------------------------------------------
/tutorial5/auth_config.ini:
--------------------------------------------------------------------------------
1 | [mlflow]
2 | default_permission = READ
3 | database_uri = sqlite:///basic_auth.db
4 | admin_username = admin
5 | admin_password = to-be-changed
6 | authorization_function = mlflow.server.auth:authenticate_request_basic_auth
7 |
--------------------------------------------------------------------------------
/tutorial5/mlflow_setup.py:
--------------------------------------------------------------------------------
1 | #!/bin/env python3
2 | """
3 | MLflow user credentials setup utility
4 |
5 | This program initializes your mlflow configuration and can update your password.
6 | """
7 | #
8 | # ---------------------------------------------------------------
9 | # Copyright (C) 2004-2025, DWD, MPI-M, DKRZ, KIT, ETH, MeteoSwiss
10 | # Contact information: icon-model.org
11 | #
12 | # Author: Marek Jacob (DWD)
13 | #
14 | # SPDX-License-Identifier: BSD-3-Clause
15 | # ---------------------------------------------------------------
16 |
17 | import configparser
18 | from getpass import getpass
19 | import os
20 | import sys
21 | import pathlib
22 |
23 | from mlflow.server import get_app_client
24 |
 25 | # Configure your MLflow server (the last assignment below takes effect)
26 | tracking_uri = "http://mlflow.dwd.de:5000/"
27 | tracking_uri = "http://localhost:5000/"
28 |
29 |
30 | def setup_config(config_file):
31 | """
32 | """
33 | print(f"{config_file} does not exist...")
34 | print(" ... create a new one")
35 |
36 | config_file.parent.mkdir(mode=0o700, parents=True, exist_ok=True)
37 | user = input(f"Please enter your mlflow username for server {tracking_uri}:\n")
38 | password = getpass(f"Please enter your mlflow (initial) password:\n")
39 |
40 | # create empty file
41 | open(config_file, "w").close()
42 |
43 | # set permissions to user read/write only
44 | config_file.chmod(0o600)
45 |
46 | with open(config_file, "a") as f:
47 | f.write("[mlflow]\n")
48 | f.write(f"mlflow_tracking_username = {user}\n")
49 | f.write(f"mlflow_tracking_password = {password}\n")
50 |
51 | try:
52 | print(f" ... testing user {user}")
53 | test_connection(user)
54 | except Exception as e:
55 | print(e)
56 | print("Wrong username or password.")
57 | os.remove(config_file)
58 | print(f" ... deleting {config_file}")
59 | sys.exit(1)
60 |
61 |
62 | def test_connection(user):
63 | auth_client = get_app_client("basic-auth", tracking_uri=tracking_uri)
64 | auth_client.get_user(user)
65 |
66 | def change_password(user, parser, config_file):
67 | password = getpass(f"Please enter a new password for mlflow on {tracking_uri}:\n")
68 | password2 = getpass(f"Please repeat that password:\n")
69 | if password != password2:
70 | print("Error passwords mismatch.")
71 | sys.exit(1)
72 |
73 | auth_client = get_app_client("basic-auth", tracking_uri=tracking_uri)
74 | auth_client.update_user_password(user, password)
75 | parser.set("mlflow", "mlflow_tracking_password", password)
76 |
77 | with open(config_file, 'w') as configfile:
78 | parser.write(configfile)
79 | print(f" ... password updated in {config_file}")
80 |
81 | user = parser.get("mlflow", "mlflow_tracking_username")
82 | try:
83 | test_connection(user)
84 | except Exception:
85 | raise
86 | else:
87 | print(f" ... an successfully tested on {tracking_uri}")
88 |
89 |
90 | def main():
91 | config_file = pathlib.Path.home() / ".mlflow" / "credentials"
92 |
93 | if not config_file.exists():
94 | setup_config(config_file)
95 |
96 | # set permissions to user read/write only
97 | config_file.chmod(0o600)
98 |
99 | parser = configparser.ConfigParser()
100 | assert parser.read(config_file)
101 | user = parser.get("mlflow", "mlflow_tracking_username")
102 |
103 | print(f" ... testing user {user}")
104 | try:
105 | test_connection(user)
106 | except Exception as e:
107 | print(f"Error while trying to access user {user}")
108 | print(e)
109 | sys.exit(1)
110 |
111 | change_password(user, parser, config_file)
112 |
113 | if __name__ == '__main__':
114 | main()
115 |
--------------------------------------------------------------------------------
/tutorial5/screen_mlflow.sh:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env bash
2 | set -e
3 | DIR=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )
4 | cd "$DIR"
5 |
6 | SCREEN_SESSION=mlflow
7 | export MLFLOW_AUTH_CONFIG_PATH="${DIR}/auth_config.ini"
8 |
9 | export OPENBLAS_NUM_THREADS=1
10 |
11 | send_to_screen(){
12 | # Replace occurrences of $ with \$ to prevent variable substitution:
13 | string="${1//$/\\$}"
14 | screen -xr $SCREEN_SESSION -X stuff "$string\r"
15 | }
16 |
17 | # start a detached screen session
18 | screen -dmS $SCREEN_SESSION
19 |
20 | ulimit -Sv unlimited
21 |
22 | send_to_screen "date"
23 | send_to_screen "echo \$PWD"
24 | send_to_screen "echo \$MLFLOW_AUTH_CONFIG_PATH"
25 | send_to_screen "source /hpc/uwork/fe1ai/VenvPy3.11/bin/activate"
26 | send_to_screen "mlflow server --app-name basic-auth --backend-store-uri \"sqlite:///${DIR}/mlflow.db\" --artifacts-destination \"${DIR}/mlflow-artifacts\" --workers 10 --host 0.0.0.0 --port 5000"
27 | echo "Started mlflow in a detached screen session."
28 | echo "Enter \`screen -xr $SCREEN_SESSION\` to attach."
29 | echo "Then press 'ctrl+a d' to detach."
30 |
--------------------------------------------------------------------------------
/tutorial6/.github/workflows/some-name.yml:
--------------------------------------------------------------------------------
 1 | # This workflow installs pytest, runs the tests with a single Python version, and demonstrates a multi-platform test matrix
2 | # For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python
3 |
4 | on: push
5 | name: Test Python with Pytest
6 |
7 | jobs:
8 | build:
9 | runs-on: ubuntu-latest
10 | steps:
11 | - uses: actions/checkout@v4
12 | - name: Set up Python 3.10
13 | uses: actions/setup-python@v3
14 | with:
15 | python-version: "3.10"
16 | - name: Install and run pytest
17 | run: |
18 | python -m pip install pytest
19 | pytest
20 |
21 | my_matrix:
22 | strategy:
23 | fail-fast: false
24 | matrix:
25 | platform: ["ubuntu-latest", "macos-latest"]
26 | python-version: ["3.9", "3.10", "3.11", "3.12"]
27 |
28 | runs-on: ${{ matrix.platform }}
29 |
30 | steps:
31 | - uses: actions/checkout@v3
32 | - name: Set up Python ${{ matrix.python-version }}
33 | uses: actions/setup-python@v5
34 | with:
35 | python-version: ${{ matrix.python-version }}
36 | - name: Test where we are
37 | run: |
38 | echo "${{ matrix.platform }}"
39 | python --version
40 |
--------------------------------------------------------------------------------
/tutorial6/.gitlab-ci.yml:
--------------------------------------------------------------------------------
 1 | # content of .gitlab-ci.yml
2 |
3 | stages:
4 | - test
5 |
6 | pytest:
7 | stage: test
8 | image: python:3.10
9 | script:
10 | - pip install pytest
11 | - pytest
12 |
13 |
--------------------------------------------------------------------------------
/tutorial6/.pre-commit-config.yaml:
--------------------------------------------------------------------------------
1 | repos:
2 | - repo: https://github.com/pre-commit/pre-commit-hooks
3 | rev: v5.0.0
4 | hooks:
5 | - id: end-of-file-fixer
6 | - id: trailing-whitespace
7 | - repo: https://github.com/psf/black
8 | rev: 22.10.0
9 | hooks:
10 | - id: black
11 |
12 |
--------------------------------------------------------------------------------
/tutorial6/E-AI_Talks_Basics_06_CICD_final.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/eumetnet-e-ai/tutorials/8dedbddb429c60fd32cd1024cdc76c9d321ea651/tutorial6/E-AI_Talks_Basics_06_CICD_final.pdf
--------------------------------------------------------------------------------
/tutorial6/E-AI_Talks_Basics_06_CICD_final.pptx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/eumetnet-e-ai/tutorials/8dedbddb429c60fd32cd1024cdc76c9d321ea651/tutorial6/E-AI_Talks_Basics_06_CICD_final.pptx
--------------------------------------------------------------------------------
/tutorial6/hello world.py:
--------------------------------------------------------------------------------
 1 | # This is an example file that can be beautified with Python Black.
2 |
3 | def abc ( ):
4 | a='A'
5 | bb = "B"
6 | ccc="C"
7 | looooooong = [111111111, 222222222,333333333,444444444,555555555,666666666, 777777777]
8 | return ["hello", "world",
9 | "!"]
10 |
11 | print( "Incorrect formatting"
12 | )
13 |
--------------------------------------------------------------------------------
/tutorial6/test_example.py:
--------------------------------------------------------------------------------
1 | # content of test_example.py
2 |
3 | def add(a, b):
4 | return a + b
5 |
6 | def test_answer():
 7 |     assert add(1, 3) == 5  # this assertion fails: add(1, 3) is 4
8 |
9 | #####################################
10 |
11 | def test_answer_correctly():
12 | assert add(1, 3) == 4
13 |
14 | def test_demo_with_message():
15 | val = 5 + 3
16 | assert val % 2 == 0, "even value expected"
17 |
18 | import pytest
19 | def test_zero_division():
20 | with pytest.raises(ZeroDivisionError):
21 | 1 / 0
22 |
23 | #####################################
24 |
25 | import torch
26 | def some_f():
27 | return torch.Tensor([3.14])
28 |
29 | def test_torch():
30 | val = some_f()
31 | torch.testing.assert_close(
32 | actual=val,
33 | expected=torch.Tensor([torch.pi]),
34 | atol=0.002,
35 | rtol=0.0000001,
36 | )
37 |
38 | #####################################
39 |
40 | class TestClass:
41 | def test_one(self):
42 | x = "this"
43 | assert "h" in x
44 |
45 | def test_two(self):
46 | x = "hello"
47 | assert hasattr(x, "check")
48 |
49 | #####################################
50 |
51 | class TestClassDemoInstance:
52 | value = 0
53 | def test_one(self):
54 | self.value = 1
55 | assert self.value == 1
56 |
57 | def test_two(self):
58 | assert self.value == 0
59 |
60 | #####################################
61 |
62 | import pytest
63 |
64 | @pytest.fixture
65 | def simple_data():
66 | return [42]
67 |
68 | def test_simple_data(simple_data):
69 | assert simple_data[0] == 42
70 | assert len(simple_data) == 1
71 |
72 | def test_two(simple_data):
73 | simple_data.append(23)
74 | assert sum(simple_data) == 65
75 |
76 | #####################################
77 |
78 | @pytest.mark.parametrize("n,expected", [(1, 2), (3, 4)])
 79 | class TestClassParametrized:
80 | def test_simple_case(self, n, expected):
81 | assert n + 1 == expected
82 |
83 | def test_weird_simple_case(self, n, expected):
84 | assert (n * 1) + 1 == expected
85 |
86 | #####################################
87 |
88 | import xarray, numpy
89 |
90 | def my_processing(filename):
91 | data = xarray.open_dataset(filename)
92 | # some processing
93 | return data
94 |
 95 | def open_dataset_mock(*args, **kwargs):
96 | return xarray.Dataset({"X": numpy.arange(5)})
97 |
98 | def test_processing(monkeypatch):
99 | monkeypatch.setattr(xarray, "open_dataset", open_dataset_mock)
100 | x = my_processing("no-name.nc")
101 | assert x.X.sum() == 10
--------------------------------------------------------------------------------
/tutorial6/test_pytorch.py:
--------------------------------------------------------------------------------
1 | import pytest, torch
2 |
3 | @pytest.fixture
4 | def x_gpu():
5 | return torch.Tensor([42]).cuda()
6 |
7 | @pytest.mark.gpu
8 | def test_cuda(x_gpu):
9 | assert x_gpu.is_cuda
10 | assert not x_gpu.cpu().is_cuda
11 |
--------------------------------------------------------------------------------