├── PIT-IDM_1_UTE.tar
├── PIT-IDM_2_UTE.tar
├── LICENSE
├── README.md
├── predict.py
├── predict_demo2.csv
├── utils.py
├── model1_test.py
├── model2_test.py
├── multi_attention_forward.py
└── single vehicle trajectories visualization _I80.ipynb

/PIT-IDM_1_UTE.tar:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/qcl-0218/Physics-informed-Transformer-IDM/HEAD/PIT-IDM_1_UTE.tar
--------------------------------------------------------------------------------

/PIT-IDM_2_UTE.tar:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/qcl-0218/Physics-informed-Transformer-IDM/HEAD/PIT-IDM_2_UTE.tar
--------------------------------------------------------------------------------

/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 | 
3 | Copyright (c) 2022 Gengmaosi
4 | 
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 | 
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 | 
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 | 
--------------------------------------------------------------------------------

/README.md:
--------------------------------------------------------------------------------
1 | # Physics-informed-Transformer-IDM
2 | The implementation of the paper "A Physics-Informed Transformer Model for Vehicle Trajectory Prediction on Highways". The paper is currently under review. Because of the confidentiality agreement and the commercial application of this project, the training part of PIT-IDM has been removed. Still, we provide the single-vehicle trajectory visualization code, i.e., 'single vehicle trajectories visualization_UTE.ipynb' and 'single vehicle trajectories visualization_I80.ipynb', as well as the trajectory prediction demo code, i.e., 'predict.py'. 'PIT-IDM_1_UTE.tar' and 'PIT-IDM_2_UTE.tar' are the models pre-trained on the UTE dataset.
3 | 
4 | # Abstract of the paper
5 | Autonomous Vehicles (AVs) have developed remarkably and are anticipated to replace human drivers. In the transition from human-driven vehicles to fully autonomous ones, one crucial task is to predict, in real time, the trajectories of the subject vehicle and its surrounding vehicles. Most existing methods for vehicle trajectory prediction on highways rely either on data-driven models, which lack interpretability and physical constraints, or on physics-based models, which suffer from low prediction accuracy. This paper proposes a Physics-Informed Deep Learning framework that fully leverages the advantages of both data-driven and physics-based models to go beyond the existing ones. We use the Transformer neural network architecture with self-attention as the Physics-Uninformed Neural Network (PUNN) and the Intelligent Driver Model (IDM) as the Physics-Informed Neural Network (PINN), and thereby construct the Physics-Informed Transformer-Intelligent Driver Model (PIT-IDM). Experiments have been conducted on two datasets with different traffic environments, i.e., the Next Generation SIMulation (NGSIM) data from the US and the Ubiquitous Traffic Eyes (UTE) data from China. Compared with three kinds of baselines, the best-performing PIT-IDM reduces longitudinal trajectory prediction errors by 5%-50%, and in some cases by more than 70%. Extensive empirical analyses have been carried out to verify its excellent spatio-temporal transferability and to explore the physics-informed mechanism underlying this deep learning method. The results further validate the efficacy of this Physics-Informed Deep Learning framework in enhancing model accuracy, interpretability, and transferability.
6 | 
7 | # Getting started
8 | The Physics-Informed Transformer-IDM is built on the open-source deep learning framework PyTorch. Please ensure you have already installed it before using our code.
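The demo scripts rely only on a few common packages (versions are not pinned in this repository, so any reasonably recent releases should work). A quick import check before running 'predict.py':

```python
# Dependency check for the demo (our snippet; the package list is taken from the imports in this repo)
import torch          # deep learning framework used by all model files
import numpy as np    # array preprocessing in utils.py and predict.py
import pandas as pd   # reads predict_demo2.csv
print(torch.__version__)
```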
9 | 
10 | # Citation
11 | If you find our work useful for your research, please consider citing the paper (the paper is currently under review).
12 | 
13 | # Reference
14 | The "multi_attention_forward.py" code is borrowed from https://github.com/Majiker/STAR
15 | 
--------------------------------------------------------------------------------

/predict.py:
--------------------------------------------------------------------------------
1 | import torch
2 | import math
3 | import copy
4 | import time
5 | import numpy as np
6 | import pandas as pd
7 | from model1_test import PIT_IDM_1
8 | from model2_test import PIT_IDM_2
9 | #from model1 import PIT_IDM_1
10 | #from model2 import PIT_IDM_2
11 | from utils import TrajectoryData, data_preprocess, step_error, all_step_error, _generate_square_subsequent_mask
12 | from torch.utils.data import sampler, Dataset, DataLoader
13 | 
14 | all_pd_data = pd.read_csv('predict_demo2.csv')  # demo data shipped with the repo
15 | all_pd_data = all_pd_data[['Vehicle_ID', 'Frame_ID', 'y', 'Pre_y', 'v', 'spacing', 'delta_v', 'preceding', 'direction', 'Pre_v']]
16 | 
17 | pro_data = all_pd_data.to_numpy()
18 | # input data preprocessing and scaling
19 | output_length = 30
20 | dt = 0.1
21 | input_x = pro_data[:, [2, 4, 5, 6]]  # y, v, spacing, delta_v
22 | v_pre = pro_data[-1, 9]  # latest speed of the preceding vehicle
23 | y_pre = pro_data[-1, 3]  # latest position of the preceding vehicle
24 | if v_pre != 0:
25 |     fut_v_pre = v_pre*np.ones(output_length)
26 |     fut_y_pre = np.linspace(y_pre, y_pre+v_pre*output_length*dt, output_length+1)[1:]
27 | else:
28 |     fut_v_pre = np.zeros(output_length)
29 |     fut_y_pre = np.linspace(y_pre, y_pre+pro_data[-1, 4]*output_length*dt, output_length+1)[1:]
30 | input_hist = np.array([fut_y_pre, fut_v_pre]).transpose(1, 0)  # assumed constant-velocity future of the leader
31 | max_num = np.max(input_x[:, 0])+185  # offset presumably chosen to match the training-time normalization range
32 | min_num = np.min(input_x[:, 0])
33 | input_x[:, 0] = input_x[:, 0]-min_num
34 | input_hist[:, 0] = input_hist[:, 0]-min_num
35 | input_x = input_x/(max_num-min_num)
36 | input_hist = input_hist/(max_num-min_num)
37 | input_x = torch.tensor(input_x).unsqueeze(0).to(torch.float32)
38 | input_hist = torch.tensor(input_hist).unsqueeze(0).to(torch.float32)
39 | 
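# Note on scaling (our summary of the lines above): the position channel is mapped to
# roughly [0, 1] via y_scaled = (y - min_num) / (max_num - min_num), and predictions are
# mapped back near the end of this script with the inverse transform. A minimal round trip
# (scale_y/unscale_y are illustrative names, not used elsewhere in this repo):
#     scale_y = lambda y: (y - min_num) / (max_num - min_num)
#     unscale_y = lambda z: z * (max_num - min_num) + min_num
#     assert abs(unscale_y(scale_y(150.0)) - 150.0) < 1e-9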
40 | #ninput = 1
41 | #ntoken = 1
42 | #ninp = 14
43 | #nhead = 2
44 | #nhid = 28
45 | #fusion_size = 3
46 | #nlayers = 3
47 | #dropout = 0.1
48 | #output_length = 30
49 | #s_0 = 1.667/(max_num-min_num)
50 | #T = 0.504
51 | #a = 0.430/(max_num-min_num)
52 | #b = 3.216/(max_num-min_num)
53 | #v_d = 16.775/(max_num-min_num)
54 | #dt = 0.1
55 | #lr_PUNN = 0.0005
56 | #lr_PINN = 0.0000001
57 | #epoch_num = 10
58 | #alpha = 0.7
59 | #model_1 = PIT_IDM_1(ninput, ntoken, ninp, nhead, nhid, fusion_size, nlayers, dropout, output_length, s_0, T, a, b, v_d, dt, lr_PUNN, lr_PINN, epoch_num, alpha)
60 | # model prediction with PIT-IDM_1 (kept commented for reference)
61 | #model_location = 'PIT-IDM_1_UTE.tar'
62 | #outputs = model_1.predict(input_x, input_hist, model_location)
63 | #output = outputs.detach().numpy()
64 | #output = output*(max_num-min_num)+min_num
65 | #print(output, output.shape)
66 | 
67 | ninput = 1
68 | ntoken = 1
69 | ninp = 50
70 | nhead = 10
71 | nhid = 50
72 | fusion_size = 3
73 | nlayers = 2
74 | dropout = 0.1
75 | output_length = 30
76 | s_0 = 1.667/(max_num-min_num)
77 | T = 0.504
78 | a = 0.430/(max_num-min_num)
79 | b = 3.216/(max_num-min_num)
80 | v_d = 16.775/(max_num-min_num)
81 | dt = 0.1
82 | lr_PUNN = 0.0005
83 | lr_PINN = 0.0000001
84 | epoch_num = 500
85 | alpha = 0.7
86 | model_2 = PIT_IDM_2(ninput, ntoken, ninp, nhead, nhid, fusion_size, nlayers, dropout, output_length, s_0, T, a, b, v_d, dt, lr_PUNN, lr_PINN, epoch_num, alpha)
87 | model_location = 'PIT-IDM_2_UTE.tar'  # pre-trained UTE checkpoint shipped with this repository
88 | start_of_seq = torch.Tensor([0]).unsqueeze(0).unsqueeze(1).repeat(input_x.shape[0], 1, 1)
89 | dec_input = start_of_seq
90 | for i in range(output_length):  # autoregressive decoding: one new step per iteration
91 |     target_mask = _generate_square_subsequent_mask(dec_input.shape[1])
92 |     outputs = model_2.predict(input_x, dec_input, target_mask, input_hist, model_location)
93 |     dec_input = torch.cat((dec_input, outputs[:, -1:, :]), 1)
94 | output = outputs.detach().numpy()
95 | output = output*(max_num-min_num)+min_num  # map scaled positions back to the original coordinates
96 | print(output, output.shape)
97 | print(max_num,min_num)
98 | 
--------------------------------------------------------------------------------

/predict_demo2.csv:
--------------------------------------------------------------------------------
1 | Vehicle_ID,Frame_ID,preceding,y,Pre_y,v,Pre_v,spacing,direction,delta_v,Lane_ID
2 | 11287,9220,11270,142.9056591,168.9604953,10.53205767,10.58541568,26.05483615,1,0.053358014,2
3 | 11287,9221,11270,143.9600997,170.0196005,10.57226705,10.60631777,26.05950076,1,0.034050727,2
4 | 11287,9222,11270,145.0185779,171.0808319,10.61374887,10.62694841,26.06225406,1,0.013199534,2
5 | 11287,9223,11270,146.0811868,172.1441715,10.65421274,10.65056506,26.06298466,1,-0.003647686,2
6 | 11287,9224,11270,147.1480538,173.2098724,10.69976641,10.67309588,26.06181865,1,-0.026670524,2
7 | 11287,9225,11270,148.2194417,174.2778896,10.74430211,10.69806981,26.05844784,1,-0.046232302,2
8 | 11287,9226,11270,149.2952573,175.348431,10.78985577,10.72358664,26.05317365,1,-0.066269127,2
9 | 11287,9227,11270,150.3757042,176.4214877,10.83642739,10.74964639,26.04578344,1,-0.086781,2
10 | 11287,9228,11270,151.4607908,177.4972948,10.88478044,10.77814924,26.03650397,1,-0.106631199,2
11 | 11287,9229,11270,152.5507801,178.5758977,10.93262451,10.80556627,26.02511754,1,-0.127058242,2
12 | 11287,9230,11270,153.6455957,179.6573686,10.9825045,10.83651222,26.01177287,1,-0.145992278,2
13 | 11287,9231,11270,154.7453649,180.7419248,11.03187551,10.86772963,25.99655987,1,-0.164145879,2
14 | 11287,9232,11270,155.850164,181.8297019,11.08226448,10.90193305,25.9795379,1,-0.180331426,2
15 | 11287,9233,11270,156.96001,182.920881,11.13341691,10.9353221,25.96087097,1,-0.198094809,2
16 | 
11287,9234,11270,158.0749962,184.0154619,11.18711425,10.97142571,25.94046576,1,-0.215688533,2 17 | 11287,9235,11270,159.1953176,185.1137253,11.23928464,11.00834369,25.91840766,1,-0.230940951,2 18 | 11287,9236,11270,160.320898,186.215671,11.29196402,11.04553312,25.89477302,1,-0.246430893,2 19 | 11287,9237,11270,161.4518136,187.3214077,11.34667931,11.08462275,25.86959406,1,-0.262056563,2 20 | 11287,9238,11270,162.5881747,188.4310348,11.40063113,11.12316946,25.84286004,1,-0.277461675,2 21 | 11287,9239,11270,163.7299136,189.5446429,11.45636439,11.16551655,25.8147293,1,-0.290847835,2 22 | 11287,9240,11270,164.8772931,190.6624672,11.5131156,11.20623491,25.78517412,1,-0.306880688,2 23 | 11287,9241,11270,166.0302792,191.784381,11.56808538,11.24912491,25.75410177,1,-0.318960467,2 24 | 11287,9242,11270,167.1888466,192.9106467,11.62305516,11.29310074,25.72180007,1,-0.329954424,2 25 | 11287,9243,11270,168.3528341,194.0413638,11.67853392,11.34006258,25.68852972,1,-0.338471347,2 26 | 11287,9244,11270,169.5225471,195.1767857,11.73553963,11.38648151,25.65423868,1,-0.34905812,2 27 | 11287,9245,11270,170.6979176,196.3168943,11.79407226,11.43425771,25.61897671,1,-0.359814553,2 28 | 11287,9246,11270,171.8791408,197.4618162,11.85133245,11.48257683,25.5826754,1,-0.368755626,2 29 | 11287,9247,11270,173.0660386,198.6115515,11.90757469,11.5311674,25.54551287,1,-0.376407284,2 30 | 11287,9248,11270,174.2586279,199.7662357,11.96585284,11.58111525,25.50760787,1,-0.384737581,2 31 | 11287,9249,11270,175.4570274,200.9258691,12.02234956,11.63133456,25.46884165,1,-0.391014994,2 32 | 11287,9250,11270,176.6610506,202.0906234,12.08011873,11.68345406,25.42957272,1,-0.396664666,2 33 | 11287,9251,11270,177.8708841,203.26058,12.13661545,11.73503065,25.38969591,1,-0.401584801,2 34 | 11287,9252,11270,179.0862989,204.4356848,12.19311217,11.78823597,25.34938594,1,-0.404876202,2 35 | 11287,9253,11270,180.3073627,205.6161186,12.24909991,11.84035546,25.30875591,1,-0.408744446,2 36 | 11287,9254,11270,181.5340078,206.801791,12.30483316,11.89328933,25.2677832,1,-0.411543833,2 37 | 11287,9255,11270,182.7662256,207.9927743,12.36031192,11.94540883,25.22654868,1,-0.414903097,2 38 | 11287,9256,11270,184.0039569,209.1888786,12.41451824,11.99725687,25.18492171,1,-0.417261369,2 39 | 11287,9257,11270,185.2470828,210.3902396,12.46796108,12.048562,25.14315677,1,-0.419399083,2 40 | 11287,9258,11270,186.4956033,211.5966219,12.52318535,12.09932422,25.10101862,1,-0.423861136,2 41 | 11287,9259,11270,187.7495779,212.8081072,12.57586473,12.14954353,25.0585293,1,-0.426321204,2 42 | 11287,9260,11270,189.0088877,214.0246953,12.63032553,12.20084866,25.01580757,1,-0.429476876,2 43 | 11287,9261,11270,190.2734989,215.2462867,12.68224144,12.24998214,24.97278784,1,-0.432259297,2 44 | 11287,9262,11270,191.5433944,216.4728634,12.73542979,12.29965854,24.92946897,1,-0.435771256,2 45 | 11287,9263,11270,192.8185234,217.7042896,12.78556427,12.34743474,24.88576615,1,-0.438129527,2 46 | 11287,9264,11270,194.0986145,218.9405562,12.83671671,12.39521095,24.84194174,1,-0.441505758,2 47 | 11287,9265,11270,195.3838966,220.1815729,12.88736016,12.44325861,24.79767627,1,-0.444101553,2 48 | 11287,9266,11270,196.6742172,221.4273396,12.93927607,12.48967754,24.7531224,1,-0.449598531,2 49 | 11287,9267,11270,197.9695931,222.6777115,12.98788361,12.5352821,24.70811841,1,-0.45260151,2 50 | 11287,9268,11270,199.2700328,223.9326343,13.03903605,12.58115812,24.66260153,1,-0.457877931,2 51 | 11287,9269,11270,200.5754006,225.1921262,13.08917052,12.62594831,24.61672558,1,-0.463222215,2 52 | 
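In this demo CSV, `spacing` and `delta_v` are derived from the paired subject/preceding trajectories: spacing = Pre_y - y and delta_v = Pre_v - v (leader minus subject), consistent with the rows above. A minimal sanity check (our own snippet, not part of the repository):

import pandas as pd

df = pd.read_csv('predict_demo2.csv')
# both identities should hold up to CSV rounding
assert ((df['Pre_y'] - df['y']) - df['spacing']).abs().max() < 1e-6
assert ((df['Pre_v'] - df['v']) - df['delta_v']).abs().max() < 1e-6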
--------------------------------------------------------------------------------

/utils.py:
--------------------------------------------------------------------------------
1 | from torch.utils.data import sampler, Dataset, DataLoader
2 | import numpy as np
3 | import pandas as pd
4 | import torch
5 | import math
6 | import torch.nn as nn
7 | import torch.nn.functional as F
8 | 
9 | class TrajectoryData(Dataset):
10 |     """
11 |     A customized data loader for Traffic.
12 |     """
13 | 
14 |     def __init__(self, matrixs, labels, history_labels):
15 |         """ Initialize the Traffic dataset
16 | 
17 |         Args:
18 |             - matrixs, labels, history_labels: numpy arrays
19 |         """
20 |         self.matrixs = torch.DoubleTensor(matrixs.astype(float))
21 |         self.matrixs = self.matrixs.to(torch.float32)
22 |         self.labels = torch.DoubleTensor(labels.astype(float))
23 |         self.labels = self.labels.to(torch.float32)
24 |         self.history_labels = torch.DoubleTensor(history_labels.astype(float))
25 |         self.history_labels = self.history_labels.to(torch.float32)
26 |         self.len = matrixs.shape[0]
27 | 
28 |     # probably the most important method to customize.
29 |     def __getitem__(self, index):
30 |         """ Get a sample from the dataset
31 |         """
32 |         matrix = self.matrixs[index]
33 |         label = self.labels[index]
34 |         history_label = self.history_labels[index]
35 |         return matrix, label, history_label
36 | 
37 |     def __len__(self):
38 |         """
39 |         Total number of samples in the dataset
40 |         """
41 |         return self.len
42 | 
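# Usage sketch (our illustration; the training code that consumed this Dataset was removed):
# wrap the arrays produced by data_preprocess below and iterate with a DataLoader, e.g.
#     train_set = TrajectoryData(x_train, y_train, train_labels)
#     train_loader = DataLoader(train_set, batch_size=64, shuffle=True)
#     for matrix, label, history_label in train_loader:
#         ...  # matrix: (B, input_length, 4); label: (B, output_length, 1); history_label: (B, output_length, 2)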
43 | def data_preprocess(pro_data, direc, input_length, output_length, frame, dt):
44 |     all_input = []
45 |     for i in range(int(pro_data.shape[0])-2*input_length):
46 |         if pro_data[i,0] == pro_data[i+input_length-1,0] and pro_data[i,7] == pro_data[i+input_length-1,7] and \
47 |            pro_data[i+input_length-1,1]-pro_data[i,1]==(input_length-1)*frame and pro_data[i+input_length-1,0] == pro_data[i+input_length-1+output_length,0] and \
48 |            pro_data[i+input_length-1, 7] == pro_data[i+input_length-1+output_length, 7] and pro_data[i+input_length-1+output_length, 1] \
49 |            -pro_data[i+input_length-1, 1] == output_length*frame:
50 |             no_use = pro_data[i+input_length:i+2*input_length]
51 |             no_use = np.array(no_use)
52 |             the_output = no_use[:, :]
53 |             all_together = np.hstack((pro_data[i:i+input_length][:,:], the_output))
54 |             all_input.append(all_together)
55 | 
56 |     x = []
57 |     y = []
58 |     labels = []
59 |     hist_labels = []
60 |     for i in range(len(all_input)):
61 |         temp = all_input[i]
62 |         begin_pos = temp[0, 2]
63 |         temp_y = temp[:output_length,12]-begin_pos
64 |         temp_y_pre = temp[:output_length,13]-begin_pos
65 |         temp_yv_pre = temp[:output_length,-1]
66 |         temp[:,2] = temp[:,2]-begin_pos
67 |         temp[:,3] = temp[:,3]-begin_pos
68 |         #if temp_y[0] <= 0:
69 |         #continue
70 |         if direc == 1:
71 |             temp[:,2] = temp[:,2]*temp[0,8]
72 |             temp[:,3] = temp[:,3]*temp[0,8]
73 |             temp_y = temp_y*temp[0,8]
74 |             temp_y_pre = temp_y_pre * temp[0, 8]
75 |         yv_pre = temp[-1,9]
76 |         y_pre = temp[-1,3]
77 |         if yv_pre != 0:
78 |             test_yv_pre = yv_pre*np.ones(output_length)
79 |             test_y_pre = np.linspace(y_pre, y_pre+yv_pre*output_length*dt, output_length+1)[1:]
80 |         else:
81 |             test_yv_pre = np.zeros(output_length)
82 |             test_y_pre = np.linspace(y_pre, y_pre+temp[-1,4]*output_length*dt, output_length+1)[1:]
83 |         if temp_y[0] <= temp[-1,2] or temp_y[0]<=0 or temp_y[0]>=temp_y_pre[0]:
84 |             continue
85 |         y.append([temp_y])
86 |         x.append(temp[:,[2,4,5,6]])
87 |         labels.append([temp_y_pre,temp_yv_pre])
88 |         hist_labels.append([test_y_pre, test_yv_pre])
89 | 
90 |     x = np.array(x)
91 |     y = np.array(y)
92 |     labels = np.array(labels)
93 |     hist_labels = np.array(hist_labels)
94 |     # shuffle the samples (disabled)
95 |     #all_num = range(0, len(x), 1)
96 |     #num = random.sample(all_num, len(x))
97 |     #x = x[num]
98 |     #y = y[num]
99 |     #labels= labels[num]
100 |     #hist_labels= hist_labels[num]
101 |     x_train = x[:int(0.8 * len(x)), :, :]
102 |     x_val = x[int(0.8 * len(x)):int(0.9 * len(x)), :, :]
103 |     x_test = x[int(0.9 * len(x)):, :, :]
104 |     y_train = y[:int(0.8 * len(x)), :, :]
105 |     y_val = y[int(0.8 * len(x)):int(0.9 * len(x)), :, :]
106 |     y_test = y[int(0.9 * len(x)):, :, :]
107 |     train_labels = labels[:int(0.8 * len(x)), :, :]
108 |     val_labels = labels[int(0.8 * len(x)):int(0.9 * len(x)), :, :]
109 |     test_labels = labels[int(0.9 * len(x)):, :, :]
110 |     test_pre_labels = hist_labels[int(0.9 * len(x)):, :, :]
111 | 
112 |     y_train = y_train.transpose(0, 2, 1)
113 |     train_labels = train_labels.transpose(0, 2, 1)
114 |     y_test = y_test.transpose(0, 2, 1)
115 |     test_labels = test_labels.transpose(0, 2, 1)
116 |     test_pre_labels = test_pre_labels.transpose(0, 2, 1)
117 |     y_val = y_val.transpose(0, 2, 1)
118 |     val_labels = val_labels.transpose(0, 2, 1)
119 | 
120 |     max_num = max([np.max(x_train[:, :, 0]), np.max(y_train[:, :, 0]), np.max(x_test[:, :, 0]), np.max(y_test[:, :, 0]),
121 |                    np.max(x_val[:, :, 0]), np.max(y_val[:, :, 0])])
122 |     min_num = min([np.min(x_train[:, :, 0]), np.min(y_train[:, :, 0]), np.min(x_test[:, :, 0]), np.min(y_test[:, :, 0]),
123 |                    np.min(x_val[:, :, 0]), np.min(y_val[:, :, 0])])
124 | 
125 |     x_train[:, :, [0]] = x_train[:, :, [0]] - min_num
126 |     y_train[:, :, [0]] = y_train[:, :, [0]] - min_num
127 |     train_labels[:, :, [0]] = train_labels[:, :, [0]] - min_num
128 |     x_test[:, :, [0]] = x_test[:, :, [0]] - min_num
129 |     y_test[:, :, [0]] = y_test[:, :, [0]] - min_num
130 |     test_labels[:, :, [0]] = test_labels[:, :, [0]] - min_num
131 |     test_pre_labels[:, :, [0]] = test_pre_labels[:, :, [0]] - min_num
132 |     x_val[:, :, [0]] = x_val[:, :, [0]] - min_num
133 |     y_val[:, :, [0]] = y_val[:, :, [0]] - min_num
134 |     val_labels[:, :, [0]] = val_labels[:, :, [0]] - min_num
135 | 
136 |     x_train = x_train / (max_num - min_num)
137 |     y_train = y_train / (max_num - min_num)
138 |     train_labels = train_labels / (max_num - min_num)
139 |     x_test = x_test / (max_num - min_num)
140 |     y_test = y_test / (max_num - min_num)
141 |     test_labels = test_labels / (max_num - min_num)
142 |     test_pre_labels = test_pre_labels / (max_num - min_num)
143 |     x_val = x_val / (max_num - min_num)
144 |     y_val = y_val / (max_num - min_num)
145 |     val_labels = val_labels / (max_num - min_num)
146 | 
147 |     return x_train, y_train, train_labels, x_test, y_test, test_labels, test_pre_labels, x_val, y_val, val_labels, max_num, min_num
148 | 
149 | def all_step_error(outputs, labels):
150 |     out = ((outputs - labels) ** 2) ** 0.5  # element-wise absolute error
151 |     lossVal = torch.mean(out[:, :, 0], dim=0)
152 |     err_mean = torch.zeros([6])
153 |     err_final = torch.zeros([6])
154 |     for i in range(6):
155 |         err_mean[i] = torch.mean(lossVal[:5 * (i + 1)])
156 |         err_final[i] = lossVal[5 * (i + 1) - 1]
157 |     return err_mean, err_final
158 | 
159 | 
160 | def step_error(outputs, labels):
161 |     out = ((outputs - labels) ** 2) ** 0.5  # element-wise absolute error
162 |     lossVal = torch.mean(out[:, :, 0], dim=0)
163 |     lossmean = torch.mean(lossVal)
164 |     lossfinal = lossVal[-1]
165 | 
166 |     return lossmean, lossfinal
167 | 
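# Note (our reading of the two helpers above): with dt = 0.1 s and output_length = 30,
# err_mean[i] / err_final[i] in all_step_error are the mean / final displacement errors
# (in the scaled coordinates) over the first 5*(i+1) prediction steps, i.e. horizons of
# 0.5 s, 1.0 s, ..., 3.0 s. Quick shape check with dummy tensors (illustrative values only):
#     outputs, labels = torch.zeros(8, 30, 1), torch.ones(8, 30, 1)
#     err_mean, err_final = all_step_error(outputs, labels)  # both torch.Size([6]), all entries 1.0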
168 | class PositionalEncoding(nn.Module):
169 | 
170 |     def __init__(self, d_model, max_len=5000):
171 |         super(PositionalEncoding, self).__init__()
172 |         pe = torch.zeros(max_len, d_model)
173 |         position = torch.arange(0, max_len, dtype=torch.float).unsqueeze(1)
174 |         div_term = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model))
175 |         pe[:, 0::2] = torch.sin(position * div_term)
176 |         pe[:, 1::2] = torch.cos(position * div_term)
177 |         pe = pe.unsqueeze(0).transpose(0, 1)
178 |         #pe.requires_grad = False
179 |         self.register_buffer('pe', pe)
180 | 
181 |     def forward(self, x):
182 |         return x + self.pe[:x.size(0), :]
183 | 
184 | def _generate_square_subsequent_mask(sz):
185 |     mask = (torch.triu(torch.ones(sz, sz)) == 1).transpose(0, 1)
186 |     mask = mask.float().masked_fill(mask == 0, float('-inf')).masked_fill(mask == 1, float(0.0))
187 |     return mask
--------------------------------------------------------------------------------

/model1_test.py:
--------------------------------------------------------------------------------
1 | import math
2 | import copy
3 | import time
4 | import numpy as np
5 | import pandas as pd
6 | import torch
7 | import torch.nn as nn
8 | import torch.nn.functional as F
9 | from torch.nn import TransformerEncoder, TransformerEncoderLayer
10 | from torch.nn import TransformerDecoder, TransformerDecoderLayer
11 | from utils import step_error
12 | criterion = nn.MSELoss()
13 | 
14 | # IDM model without tensor gradients
15 | def model_IDM(inputs_IDM, his_labels, output_length, s_0, T, a, b, v_d, dt):
16 |     v_pred = torch.zeros((inputs_IDM.shape[0], output_length, 1))
17 |     y_pred = torch.zeros((inputs_IDM.shape[0], output_length, 1))
18 |     acc = torch.zeros((inputs_IDM.shape[0], output_length, 1))
19 |     y = inputs_IDM[:, 0]
20 |     v = inputs_IDM[:, 1]
21 |     s = inputs_IDM[:, 2]
22 |     delta_v = inputs_IDM[:, 3]
23 | 
24 |     s_x = s_0 + torch.max(torch.tensor(0.0), v * T + ((v * delta_v) / (2 * (a * b) ** 0.5)))
25 |     # s_x = torch.tensor(2.5)+ torch.max(torch.tensor(0), v*torch.tensor(1.25)+((v*delta_v)/(2*(torch.tensor(1.75)*torch.tensor(1.25))**0.5)))
26 |     a_f = a * (1 - (v / v_d) ** 4 - (s_x / s) ** 2)
27 |     # a_f = torch.tensor(1.75)*(1-(v/torch.tensor(30))**4-(s_x/s)**2)
28 |     v_pred[:, 0, 0] = v + a_f * dt
29 |     for i in range(len(v_pred)):
30 |         if v_pred[i, 0, 0] <= 0:
31 |             v_pred[i, 0, 0] = 0
32 |     y_pred[:, 0, 0] = y + v_pred[:, 0, 0] * dt
33 |     acc[:, 0, 0] = a_f
34 | 
35 |     for i in range(y_pred.shape[0]):
36 |         for j in range(output_length - 1):
37 |             v = v_pred[i, j, 0]
38 |             delta_v = his_labels[i, j, 1] - v_pred[i, j, 0]
39 |             s = his_labels[i, j, 0] - y_pred[i, j, 0]
40 |             # s_x = self.s_0 + self.T*v - ((v * delta_v)/(2*(self.a*self.b)**0.5))
41 |             # s_x = s_0 + v*T-((v*delta_v)/(2*(a*b)**0.5))
42 |             s_x = s_0 + torch.max(torch.tensor(0.0), v * T + ((v * delta_v) / (2 * (a * b) ** 0.5)))
43 |             # acc_temp = self.a*(1-(v/self.v_d)**4-(s_x/s)**2)
44 |             acc_temp = a * (1 - (v / v_d) ** 4 - (s_x / s) ** 2)
45 |             v2 = v + acc_temp * dt
46 |             if v2 <= 0:
47 |                 v2 = 0
48 |                 acc_temp = (v2 - v) / dt
49 |             y1 = y_pred[i, j, 0]
50 |             y2 = y1 + v2 * dt
51 |             acc[i, j + 1, 0] = acc_temp
52 |             v_pred[i, j + 1, 0] = v2
53 |             y_pred[i, j + 1, 0] = y2
54 | 
55 |     return y_pred
56 | 
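# For reference, the rollout above is the standard Intelligent Driver Model
# (Treiber, Hennecke, and Helbing, 2000):
#     a_IDM  = a * (1 - (v / v_d)**4 - (s_star / s)**2)
#     s_star = s_0 + max(0, v*T + v*dv / (2*sqrt(a*b)))
# where s_0 is the jam distance, T the desired time headway, a the maximum acceleration,
# b the comfortable deceleration, and v_d the desired speed. Note the sign convention:
# the standard formulation takes dv = v - v_lead, while this code base stores
# delta_v = Pre_v - v (see predict_demo2.csv), so comparisons should mind the sign.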
57 | #PINN component (IDM)
58 | class IDMModel(nn.Module):
59 |     def __init__(self, s_0, T, a, b, v_d):
60 |         super(IDMModel, self).__init__()
61 |         self.model_type = 'IDM'
62 |         self.dt = 0.1
63 |         self.s_0 = torch.tensor([1.667], requires_grad=True)
64 |         self.T = torch.tensor([0.504], requires_grad=True)
65 |         self.a = torch.tensor([0.430], requires_grad=True)
66 |         self.b = torch.tensor([3.216], requires_grad=True)
67 |         self.v_d = torch.tensor([16.775], requires_grad=True)
68 | 
69 |         self.s_0 = torch.nn.Parameter(self.s_0)
70 |         self.T = torch.nn.Parameter(self.T)
71 |         self.a = torch.nn.Parameter(self.a)
72 |         self.b = torch.nn.Parameter(self.b)
73 |         self.v_d = torch.nn.Parameter(self.v_d)
74 | 
75 |         self.s_0.data.fill_(s_0)
76 |         self.T.data.fill_(T)
77 |         self.a.data.fill_(a)
78 |         self.b.data.fill_(b)
79 |         self.v_d.data.fill_(v_d)
80 | 
81 |     def forward(self, inputs_IDM, his_labels):
82 |         y = inputs_IDM[:, 0]
83 |         v = inputs_IDM[:, 1]
84 |         s = inputs_IDM[:, 2]
85 |         delta_v = inputs_IDM[:, 3]
86 | 
87 |         s_x = self.s_0 + v * self.T + ((v * delta_v) / (2 * (self.a * self.b) ** 0.5))
88 |         a_f = self.a * (1 - (v / self.v_d) ** 4 - (s_x / s) ** 2)
89 |         v_pred = v + a_f * self.dt
90 |         for i in range(len(v_pred)):
91 |             if v_pred[i] <= 0:
92 |                 v_pred[i] = 0
93 |         output_IDM = y + v_pred * self.dt
94 |         return output_IDM.unsqueeze(1).unsqueeze(2), torch.Tensor(self.s_0.data.cpu().numpy()), torch.Tensor(
95 |             self.T.data.cpu().numpy()), torch.Tensor(self.a.data.cpu().numpy()), torch.Tensor(
96 |             self.b.data.cpu().numpy()), torch.Tensor(self.v_d.data.cpu().numpy())
97 | 
98 | #PUNN component (Transformer)
99 | class TransformerModel(nn.Module):
100 | 
101 |     def __init__(self, ninput, ntoken, ninp, nhead, nhid, fusion_size, nlayers, dropout, output_length, s_0, T, a, b, v_d, dt):
102 |         super(TransformerModel, self).__init__()
103 | 
104 |         self.model_type = 'Transformer'
105 |         self.src_mask = None
106 |         self.embedding_layer = nn.Linear(ninput, ninp)
107 |         self.encoder_layers = TransformerEncoderLayer(ninp, nhead, nhid, dropout)
108 |         self.transformer_encoder = TransformerEncoder(self.encoder_layers, nlayers)
109 |         # self.relu = nn.ReLU()
110 |         # self.leaky_relu = nn.LeakyReLU(0.1)
111 |         # self.sig = nn.Sigmoid()
112 |         self.decoder = nn.Linear(ninp, ntoken)
113 |         self.dropout_in = nn.Dropout(dropout)
114 |         self.fusion_layer_1 = nn.Linear(fusion_size, ntoken)
115 |         self.fusion_layer_2 = nn.Linear(fusion_size, ntoken)
116 |         self.output_length = output_length
117 |         self.s_0 = s_0
118 |         self.T = T
119 |         self.a = a
120 |         self.b = b
121 |         self.v_d = v_d
122 |         self.dt = dt
123 | 
124 |     def _generate_square_subsequent_mask(self, sz):
125 |         mask = (torch.triu(torch.ones(sz, sz)) == 1).transpose(0, 1)
126 |         mask = mask.float().masked_fill(mask == 0, float('-inf')).masked_fill(mask == 1, float(0.0))
127 |         return mask
128 | 
129 |     def init_weights(self):
130 |         initrange = 1
131 |         # self.transformer_encoder.weight.data.uniform_(-initrange, initrange)
132 |         self.embedding_layer.bias.data.zero_()
133 |         self.embedding_layer.weight.data.uniform_(-initrange, initrange)
134 |         self.decoder.bias.data.zero_()
135 |         self.decoder.weight.data.uniform_(-initrange, initrange)
136 | 
137 |     def forward(self, inputs, his_labels):
138 |         # Transformer model output
139 |         src_inputs = inputs[:, :, 0].unsqueeze(2)
140 |         src = src_inputs.transpose(1, 0)
141 |         src = self.embedding_layer(src)
142 |         # pos_src = self.pos_encoder(src)
143 |         if self.src_mask is None or self.src_mask.size(0) != len(src):
144 |             mask = self._generate_square_subsequent_mask(len(src))
145 |             self.src_mask = mask
146 | 
147 |         enc_src = self.transformer_encoder(src, self.src_mask)
148 |         enc_src = enc_src[-1]
149 |         enc_src = enc_src.repeat(self.output_length, 1, 1)
150 |         output = self.decoder(enc_src)
151 |         output = output.transpose(0, 1)
152 |         # history information (constant-velocity extrapolation)
153 |         dv = (src_inputs[:, -1, 0] - src_inputs[:, -(1+self.output_length), 0]) / self.output_length
154 |         hist = torch.zeros(output.shape)
155 |         for i in range(src_inputs.shape[0]):
156 |             hist[i, :, 0] = torch.linspace(src_inputs[i, -1, 0].item(), src_inputs[i, -1, 0].item() + dv[i].item() * self.output_length,
157 |                                            (self.output_length+1))[1:]
158 |         # physics information (IDM rollout)
159 |         output_IDM = model_IDM(inputs[:, -1, :], his_labels[:, :, :], self.output_length, self.s_0, self.T, self.a, self.b, self.v_d, self.dt)
160 |         # fuse the three streams at the output
161 |         fusion = torch.cat([output, hist, output_IDM], axis=2)
162 |         final_output = self.fusion_layer_1(fusion)
163 | 
164 |         return final_output
165 | 
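# Design note (our summary): the final prediction is a learned linear blend of three
# (batch, output_length, 1) position sequences along the last axis, i.e. with fusion_size = 3
#     y_hat = w1 * y_transformer + w2 * y_const_velocity + w3 * y_IDM + bias
# which is exactly what fusion_layer_1 = nn.Linear(fusion_size, ntoken) computes after the
# torch.cat([...], axis=2) above.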
166 | class PIT_IDM_1(nn.Module):
167 |     def __init__(self, ninput, ntoken, ninp, nhead, nhid, fusion_size, nlayers, dropout, output_length, s_0, T, a, b, v_d, dt, lr_PUNN, lr_PINN, epoch_num, alpha):
168 |         super(PIT_IDM_1, self).__init__()
169 |         self.dt = dt
170 |         self.output_length = output_length
171 | 
172 |         self.PUNN = TransformerModel(ninput, ntoken, ninp, nhead, nhid, fusion_size, nlayers, dropout, output_length, s_0, T, a, b, v_d, dt)
173 |         self.PINN = IDMModel(s_0, T, a, b, v_d)
174 | 
175 |         #self.optimizer = torch.optim.Adam(
176 |         #[{'params': self.PUNN.parameters(), 'lr': 0.0005}, {'params': self.PINN.parameters(), 'lr': 0.0000001}])
177 |         self.optimizer = torch.optim.Adam(
178 |             [{'params': self.PUNN.parameters(), 'lr': lr_PUNN}, {'params': self.PINN.parameters(), 'lr': lr_PINN}])
179 |         self.scheduler = torch.optim.lr_scheduler.StepLR(self.optimizer, 15, gamma=0.1)
180 | 
181 |         self.epoches = epoch_num
182 |         self.alpha = alpha
183 | 
184 |     def net_PUNN(self, inputs, his_labels):
185 |         output_trans = self.PUNN(inputs, his_labels)
186 |         return output_trans
187 | 
188 |     def net_PINN(self, inputs_IDM, his_labels):
189 |         output_IDM = self.PINN(inputs_IDM, his_labels)
190 |         return output_IDM
191 | 
192 |     #def train(self, dataloaders, dataset_sizes, max_num, min_num, save_file, dataset_name):
193 |     ## the training part has been deleted because of the confidentiality agreement
194 |     def predict(self, inputs, his_labels, model_location):
195 |         self.PUNN.load_state_dict(torch.load(model_location))
196 |         self.PUNN.eval()
197 |         out = self.net_PUNN(inputs, his_labels)
198 |         return out
199 | 
200 | 
201 | 
--------------------------------------------------------------------------------
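For orientation (mirroring the demo in predict.py): PIT_IDM_1 above predicts the whole horizon in a single encoder pass, while PIT_IDM_2 in the next file decodes autoregressively, so its predict() is called inside a loop that grows the decoder input one step at a time. A usage sketch (checkpoint names are the tar files shipped with this repository; variables and shapes assumed from predict.py):

# one-shot prediction over all 30 steps -> (batch, 30, 1)
outputs_1 = model_1.predict(input_x, input_hist, 'PIT-IDM_1_UTE.tar')

# autoregressive decoding: one new step per call
dec_input = torch.zeros(input_x.shape[0], 1, 1)  # start token
for _ in range(30):
    target_mask = _generate_square_subsequent_mask(dec_input.shape[1])
    step_out = model_2.predict(input_x, dec_input, target_mask, input_hist, 'PIT-IDM_2_UTE.tar')
    dec_input = torch.cat((dec_input, step_out[:, -1:, :]), 1)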
/model2_test.py:
--------------------------------------------------------------------------------
1 | import math
2 | import copy
3 | import time
4 | import torch
5 | import torch.nn as nn
6 | import torch.nn.functional as F
7 | from torch.nn import TransformerEncoder, TransformerEncoderLayer
8 | from torch.nn import TransformerDecoder, TransformerDecoderLayer
9 | from utils import step_error
10 | criterion = nn.MSELoss()
11 | 
12 | class PositionalEncoding(nn.Module):
13 | 
14 |     def __init__(self, d_model, max_len=5000):
15 |         super(PositionalEncoding, self).__init__()
16 |         pe = torch.zeros(max_len, d_model)
17 |         position = torch.arange(0, max_len, dtype=torch.float).unsqueeze(1)
18 |         div_term = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model))
19 |         pe[:, 0::2] = torch.sin(position * div_term)
20 |         pe[:, 1::2] = torch.cos(position * div_term)
21 |         pe = pe.unsqueeze(0).transpose(0, 1)
22 |         #pe.requires_grad = False
23 |         self.register_buffer('pe', pe)
24 | 
25 |     def forward(self, x):
26 |         return x + self.pe[:x.size(0), :]
27 | 
28 | def _generate_square_subsequent_mask(sz):
29 |     mask = (torch.triu(torch.ones(sz, sz)) == 1).transpose(0, 1)
30 |     mask = mask.float().masked_fill(mask == 0, float('-inf')).masked_fill(mask == 1, float(0.0))
31 |     return mask
32 | # IDM model without tensor gradients
33 | def model_IDM(inputs_IDM, his_labels, output_length, s_0, T, a, b, v_d, dt):
34 |     v_pred = torch.zeros((inputs_IDM.shape[0], output_length, 1))
35 |     y_pred = torch.zeros((inputs_IDM.shape[0], output_length, 1))
36 |     acc = torch.zeros((inputs_IDM.shape[0], output_length, 1))
37 |     y = inputs_IDM[:, 0]
38 |     v = inputs_IDM[:, 1]
39 |     s = inputs_IDM[:, 2]
40 |     delta_v = inputs_IDM[:, 3]
41 | 
42 |     s_x = s_0 + torch.max(torch.tensor(0.0), v * T + ((v * delta_v) / (2 * (a * b) ** 0.5)))
43 |     # s_x = torch.tensor(2.5)+ torch.max(torch.tensor(0), v*torch.tensor(1.25)+((v*delta_v)/(2*(torch.tensor(1.75)*torch.tensor(1.25))**0.5)))
44 |     a_f = a * (1 - (v / v_d) ** 4 - (s_x / s) ** 2)
45 |     # a_f = torch.tensor(1.75)*(1-(v/torch.tensor(30))**4-(s_x/s)**2)
46 |     v_pred[:, 0, 0] = v + a_f * dt
47 |     for i in range(len(v_pred)):
48 |         if v_pred[i, 0, 0] <= 0:
49 |             v_pred[i, 0, 0] = 0
50 |     y_pred[:, 0, 0] = y + v_pred[:, 0, 0] * dt
51 |     acc[:, 0, 0] = a_f
52 | 
53 |     for i in range(y_pred.shape[0]):
54 |         for j in range(output_length - 1):
55 |             v = v_pred[i, j, 0]
56 |             delta_v = his_labels[i, j, 1] - v_pred[i, j, 0]
57 |             s = his_labels[i, j, 0] - y_pred[i, j, 0]
58 |             # s_x = self.s_0 + self.T*v - ((v * delta_v)/(2*(self.a*self.b)**0.5))
59 |             # s_x = s_0 + v*T-((v*delta_v)/(2*(a*b)**0.5))
60 |             s_x = s_0 + torch.max(torch.tensor(0.0), v * T + ((v * delta_v) / (2 * (a * b) ** 0.5)))
61 |             # acc_temp = self.a*(1-(v/self.v_d)**4-(s_x/s)**2)
62 |             acc_temp = a * (1 - (v / v_d) ** 4 - (s_x / s) ** 2)
63 |             v2 = v + acc_temp * dt
64 |             if v2 <= 0:
65 |                 v2 = 0
66 |                 acc_temp = (v2 - v) / dt
67 |             y1 = y_pred[i, j, 0]
68 |             y2 = y1 + v2 * dt
69 |             acc[i, j + 1, 0] = acc_temp
70 |             v_pred[i, j + 1, 0] = v2
71 |             y_pred[i, j + 1, 0] = y2
72 | 
73 |     return y_pred
74 | 
75 | #PINN component (IDM)
76 | class IDMModel(nn.Module):
77 |     def __init__(self, s_0, T, a, b, v_d):
78 |         super(IDMModel, self).__init__()
79 |         self.model_type = 'IDM'
80 |         self.dt = 0.1
81 |         self.s_0 = torch.tensor([1.667], requires_grad=True)
82 |         self.T = torch.tensor([0.504], requires_grad=True)
83 |         self.a = torch.tensor([0.430], requires_grad=True)
84 |         self.b = torch.tensor([3.216], requires_grad=True)
85 |         self.v_d = torch.tensor([16.775], requires_grad=True)
86 | 
87 |         self.s_0 = torch.nn.Parameter(self.s_0)
88 |         self.T = torch.nn.Parameter(self.T)
89 |         self.a = torch.nn.Parameter(self.a)
90 |         self.b = torch.nn.Parameter(self.b)
91 |         self.v_d = torch.nn.Parameter(self.v_d)
92 | 
93 |         self.s_0.data.fill_(s_0)
94 |         self.T.data.fill_(T)
95 |         self.a.data.fill_(a)
96 |         self.b.data.fill_(b)
97 |         self.v_d.data.fill_(v_d)
98 | 
99 |     def forward(self, inputs_IDM, his_labels):
100 |         y = inputs_IDM[:, 0]
101 |         v = inputs_IDM[:, 1]
102 |         s = inputs_IDM[:, 2]
103 |         delta_v = inputs_IDM[:, 3]
104 | 
105 |         s_x = self.s_0 + v * self.T + ((v * delta_v) / (2 * (self.a * self.b) ** 0.5))
106 |         a_f = self.a * (1 - (v / self.v_d) ** 4 - (s_x / s) ** 2)
107 |         v_pred = v + a_f * self.dt
108 |         for i in range(len(v_pred)):
109 |             if v_pred[i] <= 0:
110 |                 v_pred[i] = 0
111 |         output_IDM = y + v_pred * self.dt
112 |         return output_IDM.unsqueeze(1).unsqueeze(2), torch.Tensor(self.s_0.data.cpu().numpy()), torch.Tensor(
113 |             self.T.data.cpu().numpy()), torch.Tensor(self.a.data.cpu().numpy()), torch.Tensor(
114 |             self.b.data.cpu().numpy()), torch.Tensor(self.v_d.data.cpu().numpy())
115 | 
116 | #PUNN component (Transformer with encoder and decoder)
117 | class TransformerModel(nn.Module):
118 |     def __init__(self, ninput, ntoken, ninp, nhead, nhid, fusion_size, nlayers, dropout, output_length, s_0, T, a, b, v_d, dt):
119 |         super(TransformerModel, self).__init__()
120 |         self.model_type = 'Transformer'
121 | 
122 |         self.src_mask = None
123 |         self.encoder_pos = 
PositionalEncoding(ninp) 124 | self.encoder_emb = nn.Linear(ninput, ninp) 125 | self.encoder_layer = TransformerEncoderLayer(ninp, nhead, nhid, dropout) 126 | self.encoder = TransformerEncoder(self.encoder_layer, nlayers) 127 | self.decoder_emb = nn.Linear(ninput, ninp) 128 | self.decoder_layer = TransformerDecoderLayer(ninp, nhead, nhid, dropout) 129 | self.decoder = TransformerDecoder(self.decoder_layer, nlayers) 130 | self.output_layer = nn.Linear(ninp, ntoken) 131 | self.fusion_layer = nn.Linear(fusion_size, ntoken) 132 | self.output_length = output_length 133 | self.s_0 = s_0 134 | self.T = T 135 | self.a = a 136 | self.b = b 137 | self.v_d = v_d 138 | self.dt = dt 139 | self.init_weights() 140 | 141 | def init_weights(self): 142 | initrange = 0.1 143 | self.encoder_emb.bias.data.zero_() 144 | self.encoder_emb.weight.data.uniform_(-initrange, initrange) 145 | self.decoder_emb.bias.data.zero_() 146 | self.decoder_emb.weight.data.uniform_(-initrange, initrange) 147 | self.output_layer.bias.data.zero_() 148 | self.output_layer.weight.data.uniform_(-initrange, initrange) 149 | self.fusion_layer.bias.data.zero_() 150 | self.fusion_layer.weight.data.uniform_(-initrange, initrange) 151 | 152 | def forward(self, src_inputs, dec_input, target_mask, his_labels): 153 | inputs = src_inputs[:, :, [0]] 154 | src = self.encoder_pos(self.encoder_emb(inputs.transpose(0, 1))) 155 | memory = self.encoder(src) 156 | inp_decoder = self.decoder_emb(dec_input.transpose(0, 1)) 157 | out_decoder = self.decoder(inp_decoder, memory, target_mask) 158 | output = self.output_layer(out_decoder) 159 | output = output.transpose(0, 1) 160 | dv = (inputs[:, -1, 0] - inputs[:, -2, 0]) / 1 161 | hist = torch.zeros(output.shape) 162 | for i in range(hist.shape[0]): 163 | hist[i, :, 0] = torch.linspace(inputs[i, -1, 0].item(), 164 | inputs[i, -1, 0].item() + dv[i].item() * output.shape[1], 165 | output.shape[1] + 1)[1:] 166 | out_length = output.shape[1] 167 | output_IDM = model_IDM(src_inputs[:, -1, :], his_labels[:, :, :], out_length, self.s_0, self.T, self.a, self.b, self.v_d, self.dt) 168 | fusion = torch.cat([output, hist, output_IDM], axis=2) 169 | final_output = self.fusion_layer(fusion) 170 | return final_output 171 | 172 | class PIT_IDM_2(nn.Module): 173 | def __init__(self, ninput, ntoken, ninp, nhead, nhid, fusion_size, nlayers, dropout, output_length, s_0, T, a, b, v_d, dt, lr_PUNN, lr_PINN, epoch_num, alpha): 174 | super(PIT_IDM_2, self).__init__() 175 | self.dt = dt 176 | self.output_length = output_length 177 | 178 | self.PUNN = TransformerModel(ninput, ntoken, ninp, nhead, nhid, fusion_size, nlayers, dropout, output_length, s_0, T, a, b, v_d, dt) 179 | self.PINN = IDMModel(s_0, T, a, b, v_d) 180 | 181 | #self.optimizer = torch.optim.Adam( 182 | #[{'params': self.PUNN.parameters(), 'lr': 0.0005}, {'params': self.PINN.parameters(), 'lr': 0.0000001}]) 183 | self.optimizer = torch.optim.Adam( 184 | [{'params': self.PUNN.parameters(), 'lr': lr_PUNN}, {'params': self.PINN.parameters(), 'lr': lr_PINN}]) 185 | self.scheduler = torch.optim.lr_scheduler.StepLR(self.optimizer, 15, gamma=0.1) 186 | 187 | self.epoches = epoch_num 188 | self.alpha = alpha 189 | 190 | def net_PUNN(self, src_inputs, dec_input, target_mask, his_labels): 191 | output_trans = self.PUNN(src_inputs, dec_input, target_mask, his_labels) 192 | return output_trans 193 | 194 | def net_PINN(self, inputs_IDM, his_labels): 195 | output_IDM = self.PINN(inputs_IDM, his_labels) 196 | return output_IDM 197 | 198 | #def train(self, dataloaders, 
dataset_sizes, max_num, min_num, save_file, dataset_name):
199 |     ## the training part has been deleted because of the confidentiality agreement
200 | 
201 |     def predict(self, inputs, dec_input, target_mask, his_labels, model_location):
202 |         self.PUNN.load_state_dict(torch.load(model_location))
203 |         self.PUNN.eval()
204 |         out = self.net_PUNN(inputs, dec_input, target_mask, his_labels)
205 |         return out
206 | 
--------------------------------------------------------------------------------

/multi_attention_forward.py:
--------------------------------------------------------------------------------
1 | import torch
2 | from torch.nn.functional import softmax, dropout
3 | 
4 | def linear(input, weight, bias=None):
5 |     # type: (Tensor, Tensor, Optional[Tensor]) -> Tensor
6 |     r"""
7 |     Applies a linear transformation to the incoming data: :math:`y = xA^T + b`.
8 |     Shape:
9 |         - Input: :math:`(N, *, in\_features)` where `*` means any number of
10 |           additional dimensions
11 |         - Weight: :math:`(out\_features, in\_features)`
12 |         - Bias: :math:`(out\_features)`
13 |         - Output: :math:`(N, *, out\_features)`
14 |     """
15 |     if input.dim() == 2 and bias is not None:
16 |         # fused op is marginally faster
17 |         ret = torch.addmm(bias, input, weight.t())
18 |     else:
19 |         output = input.matmul(weight.t())
20 |         if bias is not None:
21 |             output += bias
22 |         ret = output
23 |     return ret
24 | 
25 | 
26 | def multi_head_attention_forward(query,  # type: Tensor
27 |                                  key,  # type: Tensor
28 |                                  value,  # type: Tensor
29 |                                  embed_dim_to_check,  # type: int
30 |                                  num_heads,  # type: int
31 |                                  in_proj_weight,  # type: Tensor
32 |                                  in_proj_bias,  # type: Tensor
33 |                                  bias_k,  # type: Optional[Tensor]
34 |                                  bias_v,  # type: Optional[Tensor]
35 |                                  add_zero_attn,  # type: bool
36 |                                  dropout_p,  # type: float
37 |                                  out_proj_weight,  # type: Tensor
38 |                                  out_proj_bias,  # type: Tensor
39 |                                  training=True,  # type: bool
40 |                                  key_padding_mask=None,  # type: Optional[Tensor]
41 |                                  need_weights=True,  # type: bool
42 |                                  attn_mask=None,  # type: Optional[Tensor]
43 |                                  use_separate_proj_weight=False,  # type: bool
44 |                                  q_proj_weight=None,  # type: Optional[Tensor]
45 |                                  k_proj_weight=None,  # type: Optional[Tensor]
46 |                                  v_proj_weight=None,  # type: Optional[Tensor]
47 |                                  static_k=None,  # type: Optional[Tensor]
48 |                                  static_v=None  # type: Optional[Tensor]
49 |                                  ):
50 |     # type: (...) -> Tuple[Tensor, Optional[Tensor]]
51 |     r"""
52 |     Args:
53 |         query, key, value: map a query and a set of key-value pairs to an output.
54 |             See "Attention Is All You Need" for more details.
55 |         embed_dim_to_check: total dimension of the model.
56 |         num_heads: parallel attention heads.
57 |         in_proj_weight, in_proj_bias: input projection weight and bias.
58 |         bias_k, bias_v: bias of the key and value sequences to be added at dim=0.
59 |         add_zero_attn: add a new batch of zeros to the key and
60 |             value sequences at dim=1.
61 |         dropout_p: probability of an element to be zeroed.
62 |         out_proj_weight, out_proj_bias: the output projection weight and bias.
63 |         training: apply dropout if ``True``.
64 |         key_padding_mask: if provided, specified padding elements in the key will
65 |             be ignored by the attention. This is a binary mask. When the value is True,
66 |             the corresponding value on the attention layer will be filled with -inf.
67 |         need_weights: output attn_output_weights.
68 |         attn_mask: mask that prevents attention to certain positions. This is an additive mask
69 |             (i.e. the values will be added to the attention layer).
70 |         use_separate_proj_weight: the function accepts the proj. 
weights for query, key, 71 | and value in different forms. If false, in_proj_weight will be used, which is 72 | a combination of q_proj_weight, k_proj_weight, v_proj_weight. 73 | q_proj_weight, k_proj_weight, v_proj_weight, in_proj_bias: input projection weight and bias. 74 | static_k, static_v: static key and value used for attention operators. 75 | Shape: 76 | Inputs: 77 | - query: :math:`(L, N, E)` where L is the target sequence length, N is the batch size, E is 78 | the embedding dimension. 79 | - key: :math:`(S, N, E)`, where S is the source sequence length, N is the batch size, E is 80 | the embedding dimension. 81 | - value: :math:`(S, N, E)` where S is the source sequence length, N is the batch size, E is 82 | the embedding dimension. 83 | - key_padding_mask: :math:`(N, S)`, ByteTensor, where N is the batch size, S is the source sequence length. 84 | - attn_mask: :math:`(L, S)` where L is the target sequence length, S is the source sequence length. 85 | - static_k: :math:`(N*num_heads, S, E/num_heads)`, where S is the source sequence length, 86 | N is the batch size, E is the embedding dimension. E/num_heads is the head dimension. 87 | - static_v: :math:`(N*num_heads, S, E/num_heads)`, where S is the source sequence length, 88 | N is the batch size, E is the embedding dimension. E/num_heads is the head dimension. 89 | Outputs: 90 | - attn_output: :math:`(L, N, E)` where L is the target sequence length, N is the batch size, 91 | E is the embedding dimension. 92 | - attn_output_weights: :math:`(N, L, S)` where N is the batch size, 93 | L is the target sequence length, S is the source sequence length. 94 | """ 95 | 96 | tgt_len, bsz, embed_dim = query.size() 97 | assert embed_dim == embed_dim_to_check 98 | assert key.size() == value.size() 99 | 100 | head_dim = embed_dim // num_heads 101 | assert head_dim * num_heads == embed_dim, "embed_dim must be divisible by num_heads" 102 | scaling = float(head_dim) ** -0.5 103 | 104 | if not use_separate_proj_weight: 105 | if torch.equal(query, key) and torch.equal(key, value): 106 | # self-attention 107 | q, k, v = linear(query, in_proj_weight, in_proj_bias).chunk(3, dim=-1) 108 | 109 | elif torch.equal(key, value): 110 | # encoder-decoder attention 111 | # This is inline in_proj function with in_proj_weight and in_proj_bias 112 | _b = in_proj_bias 113 | _start = 0 114 | _end = embed_dim 115 | _w = in_proj_weight[_start:_end, :] 116 | if _b is not None: 117 | _b = _b[_start:_end] 118 | q = linear(query, _w, _b) 119 | 120 | if key is None: 121 | assert value is None 122 | k = None 123 | v = None 124 | else: 125 | 126 | # This is inline in_proj function with in_proj_weight and in_proj_bias 127 | _b = in_proj_bias 128 | _start = embed_dim 129 | _end = None 130 | _w = in_proj_weight[_start:, :] 131 | if _b is not None: 132 | _b = _b[_start:] 133 | k, v = linear(key, _w, _b).chunk(2, dim=-1) 134 | 135 | else: 136 | # This is inline in_proj function with in_proj_weight and in_proj_bias 137 | _b = in_proj_bias 138 | _start = 0 139 | _end = embed_dim 140 | _w = in_proj_weight[_start:_end, :] 141 | if _b is not None: 142 | _b = _b[_start:_end] 143 | q = linear(query, _w, _b) 144 | 145 | # This is inline in_proj function with in_proj_weight and in_proj_bias 146 | _b = in_proj_bias 147 | _start = embed_dim 148 | _end = embed_dim * 2 149 | _w = in_proj_weight[_start:_end, :] 150 | if _b is not None: 151 | _b = _b[_start:_end] 152 | k = linear(key, _w, _b) 153 | 154 | # This is inline in_proj function with in_proj_weight and in_proj_bias 155 | _b = 
in_proj_bias 156 | _start = embed_dim * 2 157 | _end = None 158 | _w = in_proj_weight[_start:, :] 159 | if _b is not None: 160 | _b = _b[_start:] 161 | v = linear(value, _w, _b) 162 | else: 163 | q_proj_weight_non_opt = torch.jit._unwrap_optional(q_proj_weight) 164 | len1, len2 = q_proj_weight_non_opt.size() 165 | assert len1 == embed_dim and len2 == query.size(-1) 166 | 167 | k_proj_weight_non_opt = torch.jit._unwrap_optional(k_proj_weight) 168 | len1, len2 = k_proj_weight_non_opt.size() 169 | assert len1 == embed_dim and len2 == key.size(-1) 170 | 171 | v_proj_weight_non_opt = torch.jit._unwrap_optional(v_proj_weight) 172 | len1, len2 = v_proj_weight_non_opt.size() 173 | assert len1 == embed_dim and len2 == value.size(-1) 174 | 175 | if in_proj_bias is not None: 176 | q = linear(query, q_proj_weight_non_opt, in_proj_bias[0:embed_dim]) 177 | k = linear(key, k_proj_weight_non_opt, in_proj_bias[embed_dim:(embed_dim * 2)]) 178 | v = linear(value, v_proj_weight_non_opt, in_proj_bias[(embed_dim * 2):]) 179 | else: 180 | q = linear(query, q_proj_weight_non_opt, in_proj_bias) 181 | k = linear(key, k_proj_weight_non_opt, in_proj_bias) 182 | v = linear(value, v_proj_weight_non_opt, in_proj_bias) 183 | q = q * scaling 184 | 185 | if bias_k is not None and bias_v is not None: 186 | if static_k is None and static_v is None: 187 | k = torch.cat([k, bias_k.repeat(1, bsz, 1)]) 188 | v = torch.cat([v, bias_v.repeat(1, bsz, 1)]) 189 | if attn_mask is not None: 190 | attn_mask = torch.cat([attn_mask, 191 | torch.zeros((attn_mask.size(0), 1), 192 | dtype=attn_mask.dtype, 193 | device=attn_mask.device)], dim=1) 194 | if key_padding_mask is not None: 195 | key_padding_mask = torch.cat( 196 | [key_padding_mask, torch.zeros((key_padding_mask.size(0), 1), 197 | dtype=key_padding_mask.dtype, 198 | device=key_padding_mask.device)], dim=1) 199 | else: 200 | assert static_k is None, "bias cannot be added to static key." 201 | assert static_v is None, "bias cannot be added to static value." 
202 | else: 203 | assert bias_k is None 204 | assert bias_v is None 205 | 206 | q = q.contiguous().view(tgt_len, bsz * num_heads, head_dim).transpose(0, 1) 207 | if k is not None: 208 | k = k.contiguous().view(-1, bsz * num_heads, head_dim).transpose(0, 1) 209 | if v is not None: 210 | v = v.contiguous().view(-1, bsz * num_heads, head_dim).transpose(0, 1) 211 | 212 | if static_k is not None: 213 | assert static_k.size(0) == bsz * num_heads 214 | assert static_k.size(2) == head_dim 215 | k = static_k 216 | 217 | if static_v is not None: 218 | assert static_v.size(0) == bsz * num_heads 219 | assert static_v.size(2) == head_dim 220 | v = static_v 221 | 222 | src_len = k.size(1) 223 | 224 | if key_padding_mask is not None: 225 | assert key_padding_mask.size(0) == bsz 226 | assert key_padding_mask.size(1) == src_len 227 | 228 | if add_zero_attn: 229 | src_len += 1 230 | k = torch.cat([k, torch.zeros((k.size(0), 1) + k.size()[2:], dtype=k.dtype, device=k.device)], dim=1) 231 | v = torch.cat([v, torch.zeros((v.size(0), 1) + v.size()[2:], dtype=v.dtype, device=v.device)], dim=1) 232 | if attn_mask is not None: 233 | attn_mask = torch.cat([attn_mask, torch.zeros((attn_mask.size(0), 1), 234 | dtype=attn_mask.dtype, 235 | device=attn_mask.device)], dim=1) 236 | if key_padding_mask is not None: 237 | key_padding_mask = torch.cat( 238 | [key_padding_mask, torch.zeros((key_padding_mask.size(0), 1), 239 | dtype=key_padding_mask.dtype, 240 | device=key_padding_mask.device)], dim=1) 241 | 242 | attn_output_weights = torch.bmm(q, k.transpose(1, 2)) 243 | assert list(attn_output_weights.size()) == [bsz * num_heads, tgt_len, src_len] 244 | 245 | if attn_mask is not None: 246 | attn_mask = attn_mask.unsqueeze(0) 247 | attn_output_weights += attn_mask 248 | 249 | if key_padding_mask is not None: 250 | attn_output_weights = attn_output_weights.view(bsz, num_heads, tgt_len, src_len) 251 | attn_output_weights = attn_output_weights.masked_fill( 252 | key_padding_mask.unsqueeze(1).unsqueeze(2), 253 | float('-inf'), 254 | ) 255 | attn_output_weights = attn_output_weights.view(bsz * num_heads, tgt_len, src_len) 256 | 257 | attn_output_weights = softmax( 258 | attn_output_weights, dim=-1) 259 | 260 | attn_output_weights = dropout(attn_output_weights, p=dropout_p, training=training) 261 | 262 | attn_output = torch.bmm(attn_output_weights, v) 263 | # import pdb; pdb.set_trace() 264 | assert list(attn_output.size()) == [bsz * num_heads, tgt_len, head_dim] 265 | attn_output = attn_output.transpose(0, 1).contiguous().view(tgt_len, bsz, embed_dim) 266 | attn_output = linear(attn_output, out_proj_weight, out_proj_bias) 267 | if need_weights: 268 | # average attention weights over heads 269 | attn_output_weights = attn_output_weights.view(bsz, num_heads, tgt_len, src_len) 270 | return attn_output, attn_output_weights.sum(dim=1) / num_heads 271 | else: 272 | return attn_output, None 273 | -------------------------------------------------------------------------------- /single vehicle trajectories visualization _I80.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 49, 6 | "id": "integrated-forty", 7 | "metadata": {}, 8 | "outputs": [], 9 | "source": [ 10 | "import math\n", 11 | "import copy\n", 12 | "import time\n", 13 | "import torch\n", 14 | "import torch.nn as nn\n", 15 | "import torch.nn.functional as F\n", 16 | "from torch.nn import TransformerEncoder, TransformerEncoderLayer\n", 17 | "import scipy.io as sio\n", 
18 | "import matplotlib.pyplot as plt\n", 19 | "import numpy as np\n", 20 | "import pandas as pd\n", 21 | "criterion = nn.MSELoss()\n", 22 | "\n", 23 | "def all_step_error(outputs, labels):\n", 24 | " out = ((outputs-labels)**2)**0.5\n", 25 | " lossVal = torch.mean(out[:,:,0],dim=0)\n", 26 | " err_mean = torch.zeros([6])\n", 27 | " err_final = torch.zeros([6])\n", 28 | " for i in range(6):\n", 29 | " err_mean[i] = torch.mean(lossVal[:5*(i+1)])\n", 30 | " err_final[i] = lossVal[5*(i+1)-1]\n", 31 | " return err_mean, err_final\n", 32 | "\n", 33 | "def step_error(outputs, labels):\n", 34 | " out = ((outputs-labels)**2)**0.5\n", 35 | " lossVal = torch.mean(out[:,:,0],dim=0)\n", 36 | " lossmean = torch.mean(lossVal)\n", 37 | " lossfinal = lossVal[-1]\n", 38 | " \n", 39 | " return lossmean, lossfinal\n", 40 | "\n", 41 | "def model_IDM(inputs_IDM, his_labels):\n", 42 | " v_pred = torch.zeros((inputs_IDM.shape[0], output_length, 1))\n", 43 | " y_pred = torch.zeros((inputs_IDM.shape[0], output_length, 1))\n", 44 | " acc = torch.zeros((inputs_IDM.shape[0], output_length, 1))\n", 45 | " y = inputs_IDM[:,0]\n", 46 | " v = inputs_IDM[:,1]\n", 47 | " s = inputs_IDM[:,2]\n", 48 | " delta_v = inputs_IDM[:,3]\n", 49 | " \n", 50 | " s_x = s_0+ torch.max(torch.tensor(0), v*T+((v*delta_v)/(2*(a*b)**0.5)))\n", 51 | " #s_x = torch.tensor(2.5)+ torch.max(torch.tensor(0), v*torch.tensor(1.25)+((v*delta_v)/(2*(torch.tensor(1.75)*torch.tensor(1.25))**0.5)))\n", 52 | " a_f = a*(1-(v/v_d)**4-(s_x/s)**2)\n", 53 | " #a_f = torch.tensor(1.75)*(1-(v/torch.tensor(30))**4-(s_x/s)**2)\n", 54 | " v_pred[:,0,0] = v+a_f*dt\n", 55 | " for i in range(len(v_pred)):\n", 56 | " if v_pred[i,0,0]<=0:\n", 57 | " v_pred[i,0,0]=0\n", 58 | " y_pred[:,0,0] = y+v_pred[i,0,0]*dt\n", 59 | " acc[:,0,0] = a_f\n", 60 | " \n", 61 | " for i in range(y_pred.shape[0]):\n", 62 | " for j in range(output_length-1):\n", 63 | " v = v_pred[i,j,0]\n", 64 | " delta_v = his_labels[i,j,1]-v_pred[i,j,0]\n", 65 | " s = his_labels[i,j,0]-y_pred[i,j,0]\n", 66 | " #s_x = self.s_0 + self.T*v - ((v * delta_v)/(2*(self.a*self.b)**0.5))\n", 67 | " #s_x = s_0 + v*T-((v*delta_v)/(2*(a*b)**0.5))\n", 68 | " s_x = s_0 + torch.max(torch.tensor(0), v*T+((v*delta_v)/(2*(a*b)**0.5)))\n", 69 | " #acc_temp = self.a*(1-(v/self.v_d)**4-(s_x/s)**2)\n", 70 | " acc_temp = a*(1-(v/v_d)**4-(s_x/s)**2)\n", 71 | " v2 = v + acc_temp * dt\n", 72 | " if v2<=0:\n", 73 | " v2 = 0\n", 74 | " acc_temp = (v2-v)/dt\n", 75 | " y1 = y_pred[i,j,0]\n", 76 | " y2 = y1 + v2 * dt\n", 77 | " acc[i,j+1,0] = acc_temp\n", 78 | " v_pred[i,j+1,0] = v2\n", 79 | " y_pred[i,j+1,0] = y2\n", 80 | " \n", 81 | " return y_pred\n", 82 | "\n", 83 | "class IDMModel(nn.Module):\n", 84 | " def __init__(self, s_0, T, a, b, v_d):\n", 85 | " super(IDMModel, self).__init__()\n", 86 | " self.model_type = 'IDM'\n", 87 | " self.dt=0.1\n", 88 | " self.s_0 = torch.tensor([1.667], requires_grad=True)\n", 89 | " self.T = torch.tensor([0.504], requires_grad=True)\n", 90 | " self.a = torch.tensor([0.430], requires_grad=True)\n", 91 | " self.b = torch.tensor([3.216], requires_grad=True)\n", 92 | " self.v_d = torch.tensor([16.775], requires_grad=True)\n", 93 | " \n", 94 | " self.s_0 = torch.nn.Parameter(self.s_0)\n", 95 | " self.T = torch.nn.Parameter(self.T)\n", 96 | " self.a = torch.nn.Parameter(self.a)\n", 97 | " self.b = torch.nn.Parameter(self.b)\n", 98 | " self.v_d = torch.nn.Parameter(self.v_d)\n", 99 | " \n", 100 | " self.s_0.data.fill_(s_0)\n", 101 | " self.T.data.fill_(T)\n", 102 | " self.a.data.fill_(a)\n", 103 | " 
self.b.data.fill_(b)\n", 104 | " self.v_d.data.fill_(v_d)\n", 105 | " def forward(self, inputs_IDM, his_labels):\n", 106 | " y = inputs_IDM[:,0]\n", 107 | " v = inputs_IDM[:,1]\n", 108 | " s = inputs_IDM[:,2]\n", 109 | " delta_v = inputs_IDM[:,3]\n", 110 | " \n", 111 | " s_x = self.s_0+ v*self.T+((v*delta_v)/(2*(self.a*self.b)**0.5))\n", 112 | " a_f = self.a*(1-(v/self.v_d)**4-(s_x/s)**2)\n", 113 | " v_pred = v+a_f*self.dt\n", 114 | " for i in range(len(v_pred)):\n", 115 | " if v_pred[i]<=0:\n", 116 | " v_pred[i]==0\n", 117 | " output_IDM = y+v_pred*self.dt\n", 118 | " return output_IDM.unsqueeze(1).unsqueeze(2), torch.Tensor(self.s_0.data.cpu().numpy()), torch.Tensor(self.T.data.cpu().numpy()), torch.Tensor(self.a.data.cpu().numpy()), torch.Tensor(self.b.data.cpu().numpy()), torch.Tensor(self.v_d.data.cpu().numpy())\n", 119 | "\n", 120 | "class TransformerModel(nn.Module):\n", 121 | "\n", 122 | " def __init__(self, ntoken, ninp, nhead, nhid, nlayers, dropout, fusion_size, output_length):\n", 123 | " super(TransformerModel, self).__init__()\n", 124 | " \n", 125 | " self.model_type = 'Transformer'\n", 126 | " self.src_mask = None\n", 127 | " self.embedding_layer = nn.Linear(ninput, ninp)\n", 128 | " self.encoder_layers = TransformerEncoderLayer(ninp, nhead, nhid, dropout)\n", 129 | " self.transformer_encoder = TransformerEncoder(self.encoder_layers, nlayers)\n", 130 | " #self.relu = nn.ReLU()\n", 131 | " #self.leaky_relu = nn.LeakyReLU(0.1)\n", 132 | " #self.sig = nn.Sigmoid()\n", 133 | " self.decoder = nn.Linear(ninp, ntokens)\n", 134 | " self.dropout_in = nn.Dropout(dropout)\n", 135 | " self.fusion_layer_1 = nn.Linear(fusion_size, ntokens)\n", 136 | " self.fusion_layer_2 = nn.Linear(fusion_size, ntokens)\n", 137 | " self.out_length = output_length\n", 138 | " \n", 139 | " #self.pos_encoder = PositionalEncoding(ninp, dropout)\n", 140 | " \n", 141 | " #self.init_weights()\n", 142 | "\n", 143 | " def _generate_square_subsequent_mask(self, sz):\n", 144 | " mask = (torch.triu(torch.ones(sz, sz)) == 1).transpose(0, 1)\n", 145 | " mask = mask.float().masked_fill(mask == 0, float('-inf')).masked_fill(mask == 1, float(0.0))\n", 146 | " return mask\n", 147 | "\n", 148 | " def init_weights(self):\n", 149 | " initrange = 1\n", 150 | " #self.transformer_encoder.weight.data.uniform_(-initrange, initrange)\n", 151 | " self.embedding_layer.bias.data.zero_()\n", 152 | " self.embedding_layer.weight.data.uniform_(-initrange, initrange)\n", 153 | " self.decoder.bias.data.zero_()\n", 154 | " self.decoder.weight.data.uniform_(-initrange, initrange)\n", 155 | " def forward(self, inputs, his_labels):\n", 156 | " \n", 157 | " src_inputs = inputs[:,:,0].unsqueeze(2)\n", 158 | " src = src_inputs.transpose(1,0)\n", 159 | " src = self.embedding_layer(src)\n", 160 | " #pos_src = self.pos_encoder(src)\n", 161 | " if self.src_mask is None or self.src_mask.size(0) != len(src):\n", 162 | " mask = self._generate_square_subsequent_mask(len(src))\n", 163 | " self.src_mask = mask\n", 164 | " \n", 165 | " enc_src = self.transformer_encoder(src, self.src_mask)\n", 166 | " enc_src = enc_src[-1]\n", 167 | " enc_src = enc_src.repeat(self.out_length, 1, 1)\n", 168 | " output = self.decoder(enc_src)\n", 169 | " output = output.transpose(0,1)\n", 170 | " \n", 171 | " dv = (src_inputs[:,-1,0]-src_inputs[:,-31,0])/30\n", 172 | " hist = torch.zeros(output.shape)\n", 173 | " for i in range(src_inputs.shape[0]):\n", 174 | " hist[i,:,0] = torch.linspace(src_inputs[i,-1,0].item(), src_inputs[i,-1,0].item()+dv[i].item()*30, 31)[1:]\n", 175 | 
" output_IDM = model_IDM(inputs[:,-1,:], his_labels[:,:,:])\n", 176 | " fusion_1 = torch.cat([output,hist,output_IDM],axis=2)\n", 177 | " final_output = self.fusion_layer_1(fusion_1)\n", 178 | " \n", 179 | " #fusion_2 = torch.cat([fusion_output,hist],axis=1)\n", 180 | " #final_output = self.fusion_layer_2(fusion_2)\n", 181 | " \n", 182 | " return final_output\n" 183 | ] 184 | }, 185 | { 186 | "cell_type": "code", 187 | "execution_count": 50, 188 | "id": "novel-nutrition", 189 | "metadata": {}, 190 | "outputs": [], 191 | "source": [ 192 | "class PINN(nn.Module):\n", 193 | " def __init__(self, s_0, T, a, b, v_d, ntokens, ninp, nhead, nhid, nlayers, dropout, fusion_size):\n", 194 | " super(PINN, self).__init__()\n", 195 | " self.dt = 0.1\n", 196 | " self.output_length = 30\n", 197 | " #self.s_0 = torch.tensor(s_0, requires_grad=True)\n", 198 | " #self.T = torch.tensor(T, requires_grad=True)\n", 199 | " #self.a = torch.tensor(a, requires_grad=True)\n", 200 | " #self.b = torch.tensor(b, requires_grad=True)\n", 201 | " #self.v_d = torch.tensor(v_d, requires_grad=True)\n", 202 | " \n", 203 | " #self.s_0 = torch.nn.Parameter(self.s_0)\n", 204 | " #self.T = torch.nn.Parameter(self.T)\n", 205 | " #self.a = torch.nn.Parameter(self.a)\n", 206 | " #self.b = torch.nn.Parameter(self.b)\n", 207 | " #self.v_d = torch.nn.Parameter(self.v_d)\n", 208 | " \n", 209 | " self.PUNN = TransformerModel(ntokens, ninp, nhead, nhid, nlayers, dropout, fusion_size,self.output_length)\n", 210 | " self.PINN = IDMModel(s_0, T, a, b, v_d)\n", 211 | " \n", 212 | " #self.Transformer.register_parameter('s_0', self.s_0)\n", 213 | " #self.Transformer.register_parameter('T', self.T)\n", 214 | " #self.Transformer.register_parameter('a', self.a)\n", 215 | " #self.Transformer.register_parameter('b', self.b)\n", 216 | " #self.Transformer.register_parameter('v_d', self.v_d)\n", 217 | " \n", 218 | " #self.optimizer = torch.optim.Adam(self.Transformer.parameters(), lr=0.00005)\n", 219 | " self.optimizer = torch.optim.Adam([{'params':self.PUNN.parameters(), 'lr':0.0005},{'params':self.PINN.parameters(),'lr':0.0000001}])\n", 220 | " self.scheduler = torch.optim.lr_scheduler.StepLR(self.optimizer, 15, gamma=0.1)\n", 221 | " #self.optimizer_PUNN = torch.optim.Adam(self.PUNN.parameters(), lr=0.0005)\n", 222 | " #self.scheduler_PUNN = torch.optim.lr_scheduler.StepLR(self.optimizer_PUNN, 15, gamma=0.1)\n", 223 | " #self.optimizer_PINN = torch.optim.Adam(self.PINN.parameters(), lr=0.0005)\n", 224 | " #self.scheduler_PINN = torch.optim.lr_scheduler.StepLR(self.optimizer_PINN, 15, gamma=0.1)\n", 225 | " #self.optimizer = torch.optim.Adam(self.Transformer.parameters())\n", 226 | " self.epoches = 10\n", 227 | " self.alpha = 0.7\n", 228 | " \n", 229 | " def net_PUNN(self, inputs, his_labels):\n", 230 | " output_trans = self.PUNN(inputs, his_labels)\n", 231 | " return output_trans\n", 232 | " \n", 233 | " def net_PINN(self, inputs_IDM, his_labels):\n", 234 | " output_IDM = self.PINN(inputs_IDM, his_labels)\n", 235 | " return output_IDM\n", 236 | " \n", 237 | " def PINN_without_grad(self, inputs_IDM, his_labels, s_0_item, T_item, a_item, b_item, vd_item):\n", 238 | " v_pred = torch.zeros((inputs_IDM.shape[0], self.output_length, 1))\n", 239 | " y_pred = torch.zeros((inputs_IDM.shape[0], self.output_length, 1))\n", 240 | " acc = torch.zeros((inputs_IDM.shape[0], self.output_length, 1))\n", 241 | " y = inputs_IDM[:,0]\n", 242 | " v = inputs_IDM[:,1]\n", 243 | " s = inputs_IDM[:,2]\n", 244 | " delta_v = inputs_IDM[:,3]\n", 245 | " \n", 246 | " #s_0_item = 
torch.Tensor(self.s_0.data.cpu().numpy())\n", 247 | " #T_item = torch.Tensor(self.T.data.cpu().numpy())\n", 248 | " #a_item = torch.Tensor(self.a.data.cpu().numpy())\n", 249 | " #b_item = torch.Tensor(self.b.data.cpu().numpy())\n", 250 | " #vd_item = torch.Tensor(self.v_d.data.cpu().numpy())\n", 251 | " s_x = s_0_item+ torch.max(torch.tensor(0), v*T_item+((v*delta_v)/(2*(a_item*b_item)**0.5)))\n", 252 | " #s_x = torch.tensor(2.5)+ torch.max(torch.tensor(0), v*torch.tensor(1.25)+((v*delta_v)/(2*(torch.tensor(1.75)*torch.tensor(1.25))**0.5)))\n", 253 | " a_f = a_item*(1-(v/vd_item)**4-(s_x/s)**2)\n", 254 | " #a_f = torch.tensor(1.75)*(1-(v/torch.tensor(30))**4-(s_x/s)**2)\n", 255 | " v_pred[:,0,0] = v+a_f*self.dt\n", 256 | " for i in range(len(v_pred)):\n", 257 | " if v_pred[i,0,0]<=0:\n", 258 | " v_pred[i,0,0]=0\n", 259 | " y_pred[:,0,0] = y+v_pred[:,0,0]*self.dt # propagate positions with the clamped first-step speeds\n", 260 | " acc[:,0,0] = a_f\n", 261 | " \n", 262 | " for i in range(y_pred.shape[0]):\n", 263 | " for j in range(self.output_length-1):\n", 264 | " v = v_pred[i,j,0]\n", 265 | " delta_v = his_labels[i,j,1]-v_pred[i,j,0]\n", 266 | " s = his_labels[i,j,0]-y_pred[i,j,0]\n", 267 | " #s_x = self.s_0 + self.T*v - ((v * delta_v)/(2*(self.a*self.b)**0.5))\n", 268 | " s_x = s_0_item+ torch.max(torch.tensor(0),v*T_item+((v*delta_v)/(2*(a_item*b_item)**0.5)))\n", 269 | " #acc_temp = self.a*(1-(v/self.v_d)**4-(s_x/s)**2)\n", 270 | " acc_temp = a_item*(1-(v/vd_item)**4-(s_x/s)**2)\n", 271 | " v2 = v + acc_temp * self.dt\n", 272 | " if v2 <= 0:\n", 273 | " v2 = 0\n", 274 | " acc_temp = (v2-v)/self.dt\n", 275 | " y1 = y_pred[i,j,0]\n", 276 | " y2 = y1 + v2 * self.dt\n", 277 | " acc[i,j+1,0] = acc_temp\n", 278 | " v_pred[i,j+1,0] = v2\n", 279 | " y_pred[i,j+1,0] = y2\n", 280 | " return y_pred\n", 281 | " \n", 282 | " def train(self):\n", 283 | " pass # the training part has been deleted because of the confidentiality agreement\n", 284 | " def predict(self, inputs, his_labels, model_location):\n", 285 | " #self.Transformer = TransformerModel(ntokens, ninp, nhead, nhid, nlayers, dropout, fusion_size)\n", 286 | " self.PUNN.load_state_dict(torch.load(model_location))\n", 287 | " #self.Transformer = model\n", 288 | " self.PUNN.eval()\n", 289 | " out_Trans = self.PUNN(inputs, his_labels)\n", 290 | " #out_IDM = self.net_IDM(inputs[:,-1,:], labels)\n", 291 | " return out_Trans\n" 292 | ] 293 | }, 294 | { 295 | "cell_type": "code", 296 | "execution_count": 51, 297 | "id": "czech-montana", 298 | "metadata": {}, 299 | "outputs": [], 300 | "source": [ 301 | "s_0 = (1.667/0.3048)/(max_num-min_num) # calibrated IDM parameters: meters to feet, then scaled by the normalization range\n", 302 | "T = 0.504\n", 303 | "a = (0.430/0.3048)/(max_num-min_num)\n", 304 | "b = (3.216/0.3048)/(max_num-min_num)\n", 305 | "v_d = (16.775/0.3048)/(max_num-min_num)\n", 306 | "\n", 307 | "ntokens = 1 # the size of outputs\n", 308 | "ninput = 1 # the size of each input step (only the position is embedded)\n", 309 | "ninp = 14 # embedding dimension\n", 310 | "nhid = 28 # the dimension of the feedforward network model in nn.TransformerEncoder\n", 311 | "nlayers = 3 # the number of nn.TransformerEncoderLayer in nn.TransformerEncoder\n", 312 | "nhead = 2 # the number of heads in the multiheadattention models\n", 313 | "fusion_size = 3 # fused channels: PUNN output, constant-speed extrapolation, IDM output\n", 314 | "dropout = 0.1 # the dropout value\n", 315 | "dt = 0.1\n", 316 | "output_length = 30\n", 317 | "\n", 318 | "PIT_IDM = PINN(s_0, T, a, b, v_d, ntokens, ninp, nhead, nhid, nlayers, dropout, fusion_size)\n", 319 | "model_location = r'\PIT-IDM(1)_I80.tar'" 320 | ] 321 | }, 322 | { 323 | "cell_type": "code", 324 | "execution_count": 3, 325 | "id": "elegant-convertible", 326 | "metadata": {}, 327 | 
"outputs": [], 328 | "source": [ 329 | "import pandas as pd\n", 330 | "import numpy as np\n", 331 | "\n", 332 | "lane_keep_data=pd.read_csv(r'\\lane_keep_data_I80.csv')\n", 333 | "\n", 334 | "pro_data_all = lane_keep_data[['Vehicle_ID', 'Frame_ID', 'Local_Y', 'Pre_Local_Y', 'y_v', 'Space_Headway', 'delta_v', 'Preceding', 'Pre_yv']]\n", 335 | "pro_data = pro_data_all.to_numpy()\n", 336 | "\n", 337 | "##Training Data\n", 338 | "all_input=[]\n", 339 | "for i in range(int(pro_data.shape[0]*0.8)-2*50):\n", 340 | " if pro_data[i, 0] == pro_data[i + 49, 0] and pro_data[i, 7] == pro_data[i + 49, 7] and pro_data[i + 49, 1] - pro_data[\n", 341 | " i, 1] == 49 and pro_data[i + 49, 0] == pro_data[i + 98, 0] and pro_data[i + 49, 7] == pro_data[i + 98, 7] and pro_data[i + 98, 1] - \\\n", 342 | " pro_data[i + 49, 1] == 49:\n", 343 | " no_use = pro_data[i + 50:i + 2 * 50]\n", 344 | " no_use = np.array(no_use)\n", 345 | " the_output = no_use[:, :]\n", 346 | " all_together = np.hstack((pro_data[i: i + 50][:, :], the_output))\n", 347 | " all_input.append(all_together)\n", 348 | "\n", 349 | "x_train = []\n", 350 | "y_train = []\n", 351 | "train_labels = []\n", 352 | "for i in range(len(all_input)):\n", 353 | " temp = all_input[i]\n", 354 | " temp_y = temp[:30,11]-temp[0,2]\n", 355 | " temp_y_pre = temp[:30,12]-temp[0,2]\n", 356 | " temp_yv_pre = temp[:30,-1]\n", 357 | " begin_pos = temp[0,2]\n", 358 | " temp[:,2] = temp[:,2] - begin_pos\n", 359 | " temp[:,3] = temp[:,3] - begin_pos\n", 360 | " if temp_y[0]<=temp[-1,2] or temp_y[0]<=0 or temp_y[0]>=temp_y_pre[0]:\n", 361 | " continue\n", 362 | " y_train.append([temp_y])\n", 363 | " train_labels.append([temp_y_pre, temp_yv_pre])\n", 364 | " x_train.append(temp[:,[2,4,5,6]])\n", 365 | "\n", 366 | "##Validation Data\n", 367 | "all_input = []\n", 368 | "for i in range(int(pro_data.shape[0]*0.9), int(pro_data.shape[0])-2*50):\n", 369 | " if pro_data[i, 0] == pro_data[i + 49, 0] and pro_data[i, 7] == pro_data[i + 49, 7] and pro_data[i + 49, 1] - pro_data[\n", 370 | " i, 1] == 49 and pro_data[i + 49, 0] == pro_data[i + 98, 0] and pro_data[i + 49, 7] == pro_data[i + 98, 7] and pro_data[i + 98, 1] - \\\n", 371 | " pro_data[i + 49, 1] == 49:\n", 372 | " no_use = pro_data[i + 50:i + 2 * 50]\n", 373 | " no_use = np.array(no_use)\n", 374 | " the_output = no_use[:, :]\n", 375 | " all_together = np.hstack((pro_data[i: i + 50][:, :], the_output))\n", 376 | " all_input.append(all_together)\n", 377 | "\n", 378 | "x_val = []\n", 379 | "y_val = []\n", 380 | "val_labels = []\n", 381 | "for i in range(len(all_input)):\n", 382 | " temp = all_input[i]\n", 383 | " temp_y = temp[:30,11]-temp[0,2]\n", 384 | " temp_y_pre = temp[:30,12]-temp[0,2]\n", 385 | " temp_yv_pre = temp[:30,-1]\n", 386 | " begin_pos = temp[0,2]\n", 387 | " temp[:,2] = temp[:,2] - begin_pos\n", 388 | " temp[:,3] = temp[:,3] - begin_pos\n", 389 | " if temp_y[0]<=temp[-1,2] or temp_y[0]<=0 or temp_y[0]>=temp_y_pre[0]:\n", 390 | " continue\n", 391 | " y_val.append([temp_y])\n", 392 | " val_labels.append([temp_y_pre, temp_yv_pre])\n", 393 | " x_val.append(temp[:,[2,4,5,6]])\n", 394 | "\n", 395 | "##Testing Data\n", 396 | "all_input = []\n", 397 | "for i in range(int(pro_data.shape[0]*0.8), int(pro_data.shape[0]*0.9)-2*50):\n", 398 | " if pro_data[i, 0] == pro_data[i + 49, 0] and pro_data[i, 7] == pro_data[i + 49, 7] and pro_data[i + 49, 1] - pro_data[\n", 399 | " i, 1] == 49 and pro_data[i + 49, 0] == pro_data[i + 98, 0] and pro_data[i + 49, 7] == pro_data[i + 98, 7] and pro_data[i + 98, 1] - \\\n", 400 | " pro_data[i + 
49, 1] == 49:\n", 401 | " no_use = pro_data[i + 50:i + 2 * 50]\n", 402 | " no_use = np.array(no_use)\n", 403 | " the_output = no_use[:, :]\n", 404 | " all_together = np.hstack((pro_data[i: i + 50][:, :], the_output))\n", 405 | " all_input.append(all_together)\n", 406 | "\n", 407 | "x_test = []\n", 408 | "y_test = []\n", 409 | "test_labels = []\n", 410 | "for i in range(len(all_input)):\n", 411 | " temp = all_input[i]\n", 412 | " temp_y = temp[:30,11]-temp[0,2]\n", 413 | " temp_y_pre = temp[:30,12]-temp[0,2]\n", 414 | " temp_yv_pre = temp[:30,-1]\n", 415 | " begin_pos = temp[0,2]\n", 416 | " temp[:,2] = temp[:,2] - begin_pos\n", 417 | " temp[:,3] = temp[:,3] - begin_pos\n", 418 | " if temp_y[0]<=temp[-1,2] or temp_y[0]<=0 or temp_y[0]>=temp_y_pre[0]:\n", 419 | " continue\n", 420 | " y_test.append([temp_y])\n", 421 | " test_labels.append([temp_y_pre, temp_yv_pre])\n", 422 | " x_test.append(temp[:,[2,4,5,6]])\n", 423 | "\n", 424 | "x_train = np.array(x_train)\n", 425 | "y_train = np.array(y_train)\n", 426 | "train_labels = np.array(train_labels)\n", 427 | "y_train = y_train.transpose(0,2,1)\n", 428 | "train_labels = train_labels.transpose(0,2,1)\n", 429 | "\n", 430 | "x_test = np.array(x_test)\n", 431 | "y_test = np.array(y_test)\n", 432 | "test_labels = np.array(test_labels)\n", 433 | "y_test = y_test.transpose(0,2,1)\n", 434 | "test_labels = test_labels.transpose(0,2,1)\n", 435 | "\n", 436 | "x_val = np.array(x_val)\n", 437 | "y_val = np.array(y_val)\n", 438 | "val_labels = np.array(val_labels)\n", 439 | "y_val = y_val.transpose(0,2,1)\n", 440 | "val_labels = val_labels.transpose(0,2,1)\n", 441 | "\n", 442 | "max_num = max([np.max(x_train[:,:,0]),np.max(y_train[:,:,0]),np.max(x_test[:,:,0]),np.max(y_test[:,:,0]),np.max(x_val[:,:,0]),np.max(y_val[:,:,0])])\n", 443 | "min_num = min([np.min(x_train[:,:,0]),np.min(y_train[:,:,0]),np.min(x_test[:,:,0]),np.min(y_test[:,:,0]),np.min(x_val[:,:,0]),np.min(y_val[:,:,0])])\n", 444 | "\n", 445 | "x_train[:,:,[0]] = x_train[:,:,[0]]-min_num\n", 446 | "y_train[:,:,[0]] = y_train[:,:,[0]]-min_num\n", 447 | "train_labels[:,:,[0]] = train_labels[:,:,[0]]-min_num\n", 448 | "x_test[:,:,[0]] = x_test[:,:,[0]]-min_num\n", 449 | "y_test[:,:,[0]] = y_test[:,:,[0]]-min_num\n", 450 | "test_labels[:,:,[0]] = test_labels[:,:,[0]]-min_num\n", 451 | "x_val[:,:,[0]] = x_val[:,:,[0]]-min_num\n", 452 | "y_val[:,:,[0]] = y_val[:,:,[0]]-min_num\n", 453 | "val_labels[:,:,[0]] = val_labels[:,:,[0]]-min_num\n", 454 | "\n", 455 | "x_train = x_train/(max_num-min_num)\n", 456 | "y_train = y_train/(max_num-min_num)\n", 457 | "train_labels = train_labels/(max_num-min_num)\n", 458 | "x_test = x_test/(max_num-min_num)\n", 459 | "y_test = y_test/(max_num-min_num)\n", 460 | "test_labels = test_labels/(max_num-min_num)\n", 461 | "x_val = x_val/(max_num-min_num)\n", 462 | "y_val = y_val/(max_num-min_num)\n", 463 | "val_labels = val_labels/(max_num-min_num)" 464 | ] 465 | }, 466 | { 467 | "cell_type": "code", 468 | "execution_count": 4, 469 | "id": "fantastic-cowboy", 470 | "metadata": {}, 471 | "outputs": [], 472 | "source": [ 473 | "dt = 0.1\n", 474 | "all_input=[]\n", 475 | "for i in range(int(pro_data.shape[0]*0.8), int(pro_data.shape[0]*0.9)-2*50):\n", 476 | " if pro_data[i, 0] == pro_data[i + 49, 0] and pro_data[i, 7] == pro_data[i + 49, 7] and pro_data[i + 49, 1] - pro_data[\n", 477 | " i, 1] == 49 and pro_data[i + 49, 0] == pro_data[i + 98, 0] and pro_data[i + 49, 7] == pro_data[i + 98, 7] and pro_data[i + 98, 1] - \\\n", 478 | " pro_data[i + 49, 1] == 49:\n", 479 | " no_use = 
pro_data[i + 50:i + 2 * 50]\n", 480 | " no_use = np.array(no_use)\n", 481 | " the_output = no_use[:, :]\n", 482 | " all_together = np.hstack((pro_data[i: i + 50][:, :], the_output))\n", 483 | " all_input.append(all_together)\n", 484 | "\n", 485 | "x_test = []\n", 486 | "y_test = []\n", 487 | "test_labels = []\n", 488 | "test_pre_labels = []\n", 489 | "for i in range(len(all_input)):\n", 490 | " temp = all_input[i]\n", 491 | " temp_y = temp[:30,11]-temp[0,2]\n", 492 | " temp_y_pre = temp[:30,12]-temp[0,2]\n", 493 | " temp_yv_pre = temp[:30,-1]\n", 494 | " begin_pos = temp[0,2]\n", 495 | " temp[:,2] = temp[:,2] - begin_pos\n", 496 | " temp[:,3] = temp[:,3] - begin_pos\n", 497 | " if temp_y[0]<=temp[-1,2] or temp_y[0]<=0 or temp_y[0]>=temp_y_pre[0]:\n", 498 | " continue\n", 499 | " yv_pre = temp[-1, 8]\n", 500 | " y_pre = temp[-1, 3]\n", 501 | " if yv_pre != 0:\n", 502 | " test_yv_pre = yv_pre*np.ones(30)\n", 503 | " test_y_pre = np.linspace(y_pre, y_pre+yv_pre*30*dt,31)[1:]\n", 504 | " else:\n", 505 | " test_yv_pre = np.zeros(30)\n", 506 | " test_y_pre = np.linspace(y_pre, y_pre+temp[-1,4]*30*dt,31)[1:]\n", 507 | " y_test.append([temp_y])\n", 508 | " test_labels.append([temp_y_pre, temp_yv_pre])\n", 509 | " x_test.append(temp[:,[2,4,5,6]])\n", 510 | " test_pre_labels.append([test_y_pre, test_yv_pre])\n", 511 | " \n", 512 | "x_test = np.array(x_test)\n", 513 | "y_test = np.array(y_test)\n", 514 | "test_labels = np.array(test_labels)\n", 515 | "test_pre_labels = np.array(test_pre_labels)\n", 516 | "y_test = y_test.transpose(0,2,1)\n", 517 | "test_labels = test_labels.transpose(0,2,1)\n", 518 | "test_pre_labels = test_pre_labels.transpose(0,2,1)\n", 519 | "\n", 520 | "x_test[:,:,[0]] = x_test[:,:,[0]]-min_num\n", 521 | "y_test[:,:,[0]] = y_test[:,:,[0]]-min_num\n", 522 | "test_labels[:,:,[0]] = test_labels[:,:,[0]]-min_num\n", 523 | "test_pre_labels[:,:,[0]] = test_pre_labels[:,:,[0]]-min_num\n", 524 | "\n", 525 | "x_test = x_test/(max_num-min_num)\n", 526 | "y_test = y_test/(max_num-min_num)\n", 527 | "test_labels = test_labels/(max_num-min_num)\n", 528 | "test_pre_labels = test_pre_labels/(max_num-min_num)" 529 | ] 530 | }, 531 | { 532 | "cell_type": "code", 533 | "execution_count": 5, 534 | "id": "systematic-avatar", 535 | "metadata": {}, 536 | "outputs": [], 537 | "source": [ 538 | "all_input = []\n", 539 | "for i in range(int(pro_data.shape[0]*0.8), int(pro_data.shape[0]*0.9)-2*50):\n", 540 | " if pro_data[i, 0] == pro_data[i + 49, 0] and pro_data[i, 7] == pro_data[i + 49, 7] and pro_data[i + 49, 1] - pro_data[\n", 541 | " i, 1] == 49 and pro_data[i + 49, 0] == pro_data[i + 98, 0] and pro_data[i + 49, 7] == pro_data[i + 98, 7] and pro_data[i + 98, 1] - \\\n", 542 | " pro_data[i + 49, 1] == 49:\n", 543 | " no_use = pro_data[i + 50:i + 2 * 50]\n", 544 | " no_use = np.array(no_use)\n", 545 | " the_output = no_use[:, :]\n", 546 | " all_together = np.hstack((pro_data[i: i + 50][:, :], the_output))\n", 547 | " all_input.append(all_together)\n", 548 | "\n", 549 | "begin_positions = []\n", 550 | "for i in range(len(all_input)):\n", 551 | " temp = all_input[i]\n", 552 | " temp_y = temp[:30,11]-temp[0,2]\n", 553 | " temp_y_pre = temp[:30,12]-temp[0,2]\n", 554 | " temp_yv_pre = temp[:30,-1]\n", 555 | " begin_pos = temp[0,2]\n", 556 | " temp[:,2] = temp[:,2] - begin_pos\n", 557 | " temp[:,3] = temp[:,3] - begin_pos\n", 558 | " if temp_y[0]<=temp[-1,2] or temp_y[0]<=0 or temp_y[0]>=temp_y_pre[0]:\n", 559 | " continue\n", 560 | " begin_positions.append(begin_pos)\n" 561 | ] 562 | }, 563 | { 564 | 
"cell_type": "code", 565 | "execution_count": 6, 566 | "id": "subjective-transaction", 567 | "metadata": {}, 568 | "outputs": [], 569 | "source": [ 570 | "from torch.utils.data import sampler, Dataset, DataLoader\n", 571 | "import torch\n", 572 | "\n", 573 | "class TrajectoryData(Dataset):\n", 574 | " \"\"\"\n", 575 | " A customized data loader for Traffic.\n", 576 | " \"\"\"\n", 577 | " def __init__(self,matrixs,labels,history_labels):\n", 578 | " \"\"\" Intialize the Traffic dataset\n", 579 | " \n", 580 | " Args:\n", 581 | " - data: numpy datatype\n", 582 | " \"\"\"\n", 583 | " self.matrixs = torch.DoubleTensor(matrixs.astype(float))\n", 584 | " self.matrixs = self.matrixs.to(torch.float32)\n", 585 | " self.labels = torch.DoubleTensor(labels.astype(float))\n", 586 | " self.labels = self.labels.to(torch.float32)\n", 587 | " self.history_labels = torch.DoubleTensor(history_labels.astype(float))\n", 588 | " self.history_labels = self.history_labels.to(torch.float32)\n", 589 | " self.len = matrixs.shape[0]\n", 590 | " # probably the most important to customize.\n", 591 | " def __getitem__(self, index):\n", 592 | " \"\"\" Get a sample from the dataset\n", 593 | " \"\"\"\n", 594 | " matrix = self.matrixs[index]\n", 595 | " label = self.labels[index]\n", 596 | " history_label = self.history_labels[index]\n", 597 | " return matrix,label,history_label\n", 598 | "\n", 599 | " def __len__(self):\n", 600 | " \"\"\"\n", 601 | " Total number of samples in the dataset\n", 602 | " \"\"\"\n", 603 | " return self.len\n" 604 | ] 605 | }, 606 | { 607 | "cell_type": "code", 608 | "execution_count": 7, 609 | "id": "given-compact", 610 | "metadata": {}, 611 | "outputs": [], 612 | "source": [ 613 | "trainset = TrajectoryData(x_train, y_train, train_labels)\n", 614 | "trainset_loader = DataLoader(trainset, batch_size=32, shuffle=True, num_workers=0, drop_last=True)\n", 615 | "valset = TrajectoryData(x_val, y_val, val_labels)\n", 616 | "valset_loader = DataLoader(valset, batch_size=32, shuffle=False, num_workers=0, drop_last=True)\n", 617 | "testset = TrajectoryData(x_test, y_test, test_pre_labels)\n", 618 | "testset_loader = DataLoader(testset, batch_size=32, shuffle=False, num_workers=0, drop_last=True)\n", 619 | "\n", 620 | "dataloaders={}\n", 621 | "dataloaders['train']=trainset_loader\n", 622 | "dataloaders['val']=valset_loader\n", 623 | "dataloaders['test']=testset_loader\n", 624 | "dataset_sizes={}\n", 625 | "dataset_sizes['train']=len(x_train)\n", 626 | "dataset_sizes['val']=len(x_val)\n", 627 | "dataset_sizes['test']=len(x_test)" 628 | ] 629 | }, 630 | { 631 | "cell_type": "code", 632 | "execution_count": 10, 633 | "id": "sustainable-mileage", 634 | "metadata": {}, 635 | "outputs": [ 636 | { 637 | "data": { 638 | "text/plain": [ 639 | "" 640 | ] 641 | }, 642 | "execution_count": 10, 643 | "metadata": {}, 644 | "output_type": "execute_result" 645 | } 646 | ], 647 | "source": [ 648 | "import math\n", 649 | "import copy\n", 650 | "import time\n", 651 | "import torch\n", 652 | "import torch.nn as nn\n", 653 | "import torch.nn.functional as F\n", 654 | "from torch.nn import TransformerEncoder, TransformerEncoderLayer\n", 655 | "from torch.nn import TransformerDecoder, TransformerDecoderLayer\n", 656 | "import torch.nn as nn\n", 657 | "import math\n", 658 | "\n", 659 | "def subsequent_mask(size):\n", 660 | " \"\"\"\n", 661 | " Mask out subsequent positions.\n", 662 | " \"\"\"\n", 663 | " attn_shape = (1, size, size)\n", 664 | " mask = np.triu(np.ones(attn_shape), k=1).astype('uint8')\n", 665 | " return 
torch.from_numpy(mask) == 0\n", 666 | "\n", 667 | "def _generate_square_subsequent_mask(sz):\n", 668 | " mask = (torch.triu(torch.ones(sz, sz)) == 1).transpose(0, 1)\n", 669 | " mask = mask.float().masked_fill(mask == 0, float('-inf')).masked_fill(mask == 1, float(0.0))\n", 670 | " return mask\n", 671 | "\n", 672 | "class PositionalEncoding(nn.Module):\n", 673 | "\n", 674 | " def __init__(self, d_model, max_len=5000):\n", 675 | " super(PositionalEncoding, self).__init__() \n", 676 | " pe = torch.zeros(max_len, d_model)\n", 677 | " position = torch.arange(0, max_len, dtype=torch.float).unsqueeze(1)\n", 678 | " div_term = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model))\n", 679 | " pe[:, 0::2] = torch.sin(position * div_term)\n", 680 | " pe[:, 1::2] = torch.cos(position * div_term)\n", 681 | " pe = pe.unsqueeze(0).transpose(0, 1)\n", 682 | " #pe.requires_grad = False\n", 683 | " self.register_buffer('pe', pe)\n", 684 | "\n", 685 | " def forward(self, x):\n", 686 | " return x + self.pe[:x.size(0), :]\n", 687 | " \n", 688 | "class Transformer(nn.Module):\n", 689 | " def __init__(self, ninput, ntoken, ninp, nhead, nhid, nlayers, dropout):\n", 690 | " super(Transformer, self).__init__()\n", 691 | " self.model_type = 'Transformer'\n", 692 | " \n", 693 | " self.src_mask = None\n", 694 | " self.encoder_pos = PositionalEncoding(ninp)\n", 695 | " self.encoder_emb = nn.Linear(ninput, ninp)\n", 696 | " self.encoder_layer = TransformerEncoderLayer(ninp, nhead, nhid, dropout)\n", 697 | " self.encoder = TransformerEncoder(self.encoder_layer, nlayers)\n", 698 | " self.decoder_emb = nn.Linear(ninput, ninp)\n", 699 | " self.decoder_layer = TransformerDecoderLayer(ninp, nhead, nhid, dropout)\n", 700 | " self.decoder = TransformerDecoder(self.decoder_layer, nlayers)\n", 701 | " self.output_layer = nn.Linear(ninp, ntoken)\n", 702 | " self.init_weights()\n", 703 | "\n", 704 | " def init_weights(self):\n", 705 | " initrange = 0.1 \n", 706 | " self.encoder_emb.bias.data.zero_()\n", 707 | " self.encoder_emb.weight.data.uniform_(-initrange, initrange)\n", 708 | " self.decoder_emb.bias.data.zero_()\n", 709 | " self.decoder_emb.weight.data.uniform_(-initrange, initrange)\n", 710 | " self.output_layer.bias.data.zero_()\n", 711 | " self.output_layer.weight.data.uniform_(-initrange, initrange)\n", 712 | "\n", 713 | " def forward(self, inputs, dec_input, target_mask):\n", 714 | " src = self.encoder_pos(self.encoder_emb(inputs.transpose(0,1)))\n", 715 | " memory = self.encoder(src)\n", 716 | " inp_decoder = self.decoder_emb(dec_input.transpose(0,1))\n", 717 | " out_decoder = self.decoder(inp_decoder, memory, target_mask)\n", 718 | " output = self.output_layer(out_decoder)\n", 719 | " return output.transpose(0,1)\n", 720 | "\n", 721 | "ninput = 1\n", 722 | "ntoken = 1\n", 723 | "ninp = 50\n", 724 | "nhead = 10\n", 725 | "nhid = 50\n", 726 | "dropout = 0.1\n", 727 | "nlayers = 2\n", 728 | "Trans_model_location = r'\\Transformer_I80.tar'\n", 729 | "Trans_model = Transformer(ninput, ntoken, ninp, nhead, nhid, nlayers, dropout)\n", 730 | "Trans_model.load_state_dict(torch.load(Trans_model_location))" 731 | ] 732 | }, 733 | { 734 | "cell_type": "code", 735 | "execution_count": 11, 736 | "id": "representative-context", 737 | "metadata": {}, 738 | "outputs": [], 739 | "source": [ 740 | "import scipy.io as sio\n", 741 | "import matplotlib.pyplot as plt\n", 742 | "import numpy as np\n", 743 | "import pandas as pd" 744 | ] 745 | }, 746 | { 747 | "cell_type": "code", 748 | "execution_count": 52, 749 | 
"id": "clear-library", 750 | "metadata": {}, 751 | "outputs": [], 752 | "source": [ 753 | "a_item = (1.317/0.3048)/(max_num-min_num)\n", 754 | "b_item = (1.625/0.3048)/(max_num-min_num)\n", 755 | "v_d_item = (13.853/0.3048)/(max_num-min_num)\n", 756 | "s_0_item = (2.600/0.3048)/(max_num-min_num)\n", 757 | "T_item = 1.279\n", 758 | "dt = 0.1\n", 759 | "output_length = 30\n", 760 | "def model_IDM_test(inputs_IDM, his_labels, output_length):\n", 761 | " v_pred = torch.zeros((inputs_IDM.shape[0], output_length, 1))\n", 762 | " y_pred = torch.zeros((inputs_IDM.shape[0], output_length, 1))\n", 763 | " acc = torch.zeros((inputs_IDM.shape[0], output_length, 1))\n", 764 | " y = inputs_IDM[:,0]\n", 765 | " v = inputs_IDM[:,1]\n", 766 | " s = inputs_IDM[:,2]\n", 767 | " delta_v = inputs_IDM[:,3]\n", 768 | " \n", 769 | " s_x = s_0_item + torch.max(torch.tensor(0), v*T_item+((v*delta_v)/(2*(a_item*b_item)**0.5)))\n", 770 | " #s_x = torch.tensor(2.5)+ torch.max(torch.tensor(0), v*torch.tensor(1.25)+((v*delta_v)/(2*(torch.tensor(1.75)*torch.tensor(1.25))**0.5)))\n", 771 | " a_f = a_item*(1-(v/v_d_item)**4-(s_x/s)**2)\n", 772 | " #a_f = torch.tensor(1.75)*(1-(v/torch.tensor(30))**4-(s_x/s)**2)\n", 773 | " v_pred[:,0,0] = v+a_f*dt\n", 774 | " for i in range(len(v_pred)):\n", 775 | " if v_pred[i,0,0]<=0:\n", 776 | " v_pred[i,0,0]=0\n", 777 | " y_pred[:,0,0] = y+v_pred[i,0,0]*dt\n", 778 | " acc[:,0,0] = a_f\n", 779 | " \n", 780 | " for i in range(y_pred.shape[0]):\n", 781 | " for j in range(output_length-1):\n", 782 | " v = v_pred[i,j,0]\n", 783 | " delta_v = his_labels[i,j,1]-v_pred[i,j,0]\n", 784 | " s = his_labels[i,j,0]-y_pred[i,j,0]\n", 785 | " #s_x = self.s_0 + self.T*v - ((v * delta_v)/(2*(self.a*self.b)**0.5))\n", 786 | " #s_x = s_0 + v*T-((v*delta_v)/(2*(a*b)**0.5))\n", 787 | " s_x = s_0_item + torch.max(torch.tensor(0), v*T_item+((v*delta_v)/(2*(a_item*b_item)**0.5)))\n", 788 | " #acc_temp = self.a*(1-(v/self.v_d)**4-(s_x/s)**2)\n", 789 | " acc_temp = a_item*(1-(v/v_d_item)**4-(s_x/s)**2)\n", 790 | " v2 = v + acc_temp * dt\n", 791 | " if v2<=0:\n", 792 | " v2 = 0\n", 793 | " acc_temp = (v2-v)/dt\n", 794 | " y1 = y_pred[i,j,0]\n", 795 | " y2 = y1 + v2 * dt\n", 796 | " acc[i,j+1,0] = acc_temp\n", 797 | " v_pred[i,j+1,0] = v2\n", 798 | " y_pred[i,j+1,0] = y2\n", 799 | " \n", 800 | " return y_pred\n" 801 | ] 802 | }, 803 | { 804 | "cell_type": "code", 805 | "execution_count": 57, 806 | "id": "continued-queen", 807 | "metadata": {}, 808 | "outputs": [ 809 | { 810 | "data": { 811 | "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAA5QAAAFDCAYAAABFkkH2AAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8vihELAAAACXBIWXMAAAsTAAALEwEAmpwYAACtbUlEQVR4nOzddXgWxxbA4d9ECCEJTvAS3Is7tLiVCi4FihWnxV0CpbhDcQlQ3KW4uwR3D24BAoQQ/eb+sYEbQogLSc77PHng+3Z3drbcy+HszJxRWmuEEEIIIYQQQoiwMovpDgghhBBCCCGEiJ0koRRCCCGEEEIIES6SUAohhBBCCCGECBdJKIUQQgghhBBChIsklEIIIYQQQgghwkUSSiGEEEKEi1JqW0z3QQghRNQL7u97i+jsSGyUMmVK7eDgENPdEEIIEQ1OnTrlqrVOFdP9iC0SJ05crWjRorL/mBBCxH1vvnRAEsoQODg44OzsHNPdEEIIEQ2UUndjug+xSfbs2SVGCiFEPKCUuvGlYzLlVQghhBBCCCFEuEhCKYQQQgghhBAiXCShFEIIIYQQQggRLpJQCiGEEEIIIYQIFynKE04mkwlXV1fc3Nzw8/OL6e6ISJYwYUIyZMiApaVlTHdFCCFiHR8fHx48eICnp2dMd0VEAYmRQoiAJKEMpwcPHqCUwsHBAUtLS5RSMd0lEUm01rx48YIHDx6QOXPmmO6OEELEOg8ePMDOzg4HBweJj3GMxEghRGAy5TWc3r17R/r06UmQIIEEyzhGKUWKFCnkzboQQoSTp6cnKVKkkPgYB0mMFEIEJgllBJiZyX++uEr+ESSEEBEjf4/GXfJnK4QISDIiIYQQsZ/vu5jugRBCCPH10TrKY6QklCJGuLi4oJTC19c3Wu9bvnx55s6dG633FEJEseeHYWM2eLAppnsiRIRJfBRCRBqTH5zqCju/i9KkUhLKOGz58uWUKFECGxsb7O3tKVGiBNOnT0drHdNdC5GDgwO7du2KUBuOjo40bdo0knokhPgq3ZoPuyuApR3YZY/p3ohYQuKjxEch4jw/TzjcCK5PgdTlwdw6ym4lCWUcNX78eP7880969erFkydPePr0KTNnzuTw4cN4e3sHeU1s2v4kut/cCiG+MiZfONUNjrcG+++h2nFIkiumeyViAYmPQog4z/sV7K0G91dDofFQeDyoqEv7JKGMg16/fs3gwYOZPn069erVw87ODqUUhQoVYsmSJVhZWQHQokULOnToQM2aNbGxsWHv3r1cuXKF8uXLkzRpUvLmzcvGjRs/tht4OoyTkxNly5b9+FkpxcyZM8mePTvJkiWjU6dOH9/2+vn50bNnT1KmTEmWLFn477//vtj/Zs2ace/ePX788UdsbW0ZM2bMxylA8+bN45tvvqFixYrs27ePDBkyfHLthze327ZtY8SIEaxYsQJbW1sKFCjw8Zy7d+9SpkwZ7OzsqFq1Kq6urhH7Dy6EiF7ebrDvB7g2CXL8AeW3QoJkMd0rEQtIfJT4KESc9+4+7CwHrkeh9DLI3T3Kbyn7UEaSrl27cvbs2Si9R8GCBZk0aVKI5x09ehQvLy9+/vnnEM9dunQpW7ZsYfPmzbx7945ChQrRqlUrduzYwaFDh/j5559xdnYmZ86coerj5s2bOXnyJG/evKFIkSL8+OOPVK9enTlz5rB582bOnDmDjY0NdevW/WIbixcv5uDBg8ydO5fKlSsDxpoSgP3793PlyhXMzMw4fvz4F9uoXr06/fv35+bNm/z777+fPfPWrVvJmDEjNWrUYNy4cYwaNSpUzyeEiGFvrsH+n+DdHSg+B7K1iekeiVD4WmKkxEeJj0LEaW4XYW918H0L5bdBmorRclsZoYyDXF1dSZkyJRYW/39fULp0aZImTYq1tTUHDhz4+P3PP/9MmTJlMDMz4+zZs7i7u9O3b18SJEhAxYoVqVWrFsuWLQv1vfv27UvSpEn55ptvqFChwsd/QKxcuZKuXbuSMWNGkidPTr9+/cL1bI6OjtjY2GBtHf554C1btiRHjhxYW1vToEGDKP9HjhAikjzaDttLgPdLqLhbkkkRZhIfgyfxUYhY7Ol+2FkWMEHlA9GWTIKMUEaa0IwcRpcUKVLg6uqKr6/vx6B55MgRADJkyIDJZPp4bsaMGT/+/tGjR2TMmPGT/TUzZcrEw4cPQ33vNGnSfPx9okSJcHd3/6TtgO2GR8A2wutLfRRCfKW0Nqa3nukJSfLBdxvA1iGmeyXC4GuJkRIfgyfxUYhY6t4qONIUbLNCha1gE76/R8JLRijjoFKlSmFlZcWGDRtCPDfg5sTp0qXj/v37nwTUe/fukT59egBsbGzw8PD4eOzJkyeh7lPatGm5f//+J+2Gtl9f+j5wf/z8/Hj+/HmIbQghYhE/LzjeBk53h/Q/Q5XDkkyKcJP4GHwbQohY6NoUONQQUhSDKoeiPZkESSjjpKRJkzJkyBA6duzI6tWrcXd3x2QycfbsWd69+/IeNB9KqI8ZMwYfHx/27dvHpk2baNSoEWCsT1m7di0eHh7cvHmTefPmhbpPDRo0YMqUKTx48IBXr16FuCYjderU3L59O9hzcuTIgaenJ//99x8+Pj4MHz4cLy+vT9pwcXH55B8AQohY5P1T2F0Rbs+HvAOh3GqwtI3pXolYTOLj/9uQ+ChELKdNcKYXnPoTMvwMFXaCVfIY6YoklHFU7969mTBhAmPGjMHe3p7UqVPTrl07Ro8eTenSpYO8JkGCBGzcuJGtW7eSMmVKOnbsyKJFi8iVyyjF361bNxIkSEDq1Kn57bff+PXXX0Pdn99//51q1apRoEABChcuTJ06dYI9v1+/fgwfPpykSZMybty4IM9JkiQJ06dPp02bNqRPnx4bG5tPqtrVr18fMKY4FS5cONR9FUJ8BV6ehu1F4dUZKLMCCvwVpSXPRfwh8VHioxCxnp8XHG4CV8ZB9g5QdjVYRN0+kyFRsWET35hUtGhR7ezs/Nn3V65cIXfu3DHQIxFd5M9YiBhydwUcawlWKY31kskLRdutlVKntNZFo+2GsZzEyPhL/oyFiCHer+BAbXi2HwqOgty9IRqmsQcXH6UojxBCiK+DNsH5wXDpb0hVBsquAevUMd0rIYQQ4uvw7h7sqwFvb0DpJeDQJKZ7BEhCKYQQ4mvg8xaONoMHGyBrayj6D5hbxXSvhBBCiK/Dq7Owryb4voMK2yF1hZju0UfRuiBFKVVfKbVRKfVQKeWulDqllGoc4HgCpdRKpdRtpdR7pdRzpdRWpVSRsLQTzP0dlFI6iJ/lUfG8QgghQsH9NuwoBQ83Q5EpUHyOJJNCCCHEB493ws7vQJkblVy/omQSon+EsjtwB+gGuAI1gaVKqZRa66mAOaCBkcAtILH/uXuUUoW01rdD2U5IegKHA3x2jfCTCSGECLune+FgPUBDhW2QpnJM90gIIYT4etxeBMdbQ5I8UP4/SJQh5GuiWXQnlD9qrQMmb3uUUukwEsSpWuv3QMOAFyildgEvgF+ACaFpJxT9uKa1PhbOZxBCCBFRWs
P1f+B0V0ic0yi+Y5ctpnslhBBCfB20NmoKnB8EqStBuTWQIElM9ypI0TrlNVAS+MEZwD6Yy94BnkCCCLYjhBDia+DnBSfawqkukK4mVD0qyaQQQgjxgcnHiJPnB4FDUyi/5atNJuHr2IeyNHA54BfKYKGUSgOMAfyAZWFtJxgLlFJ+SqnHSqkJSqmY27hFCCHik/dPYXdFuDUX8g6A79aDZeKY7pUQQgjxdfB5C/t//H+cLLUIzBOEfF0MitGEUilVCfgZ+CfQoT6AD/AY+A2oqbW+G452AvPyP6c1UAmYBXQAvliU5/nz5xQtWvTjz+zZs0O4hRBCiCC9PAXbi8KrM1BmBRQYDiriYejhQ6hZE65ciYQ+CiGEEDHF4yHsLAdPdhkF6goMj5Y9JiMqxrYNUUo5AEuBDVprp0CHnYBdQFqgI7BZKfWd1vqzEcgQ2vmE1vox0DnAV/uUUk+B6Uqpglrrs4GvSZUqFUFt2hzbOTo6cvPmTf7999+Y7kqY1KhRg0aNGvHbb7/FdFeEEGHhstQoKmBlD1WPQLKCkdLsrl3QpAl4eMDNmyD7rIvIIDFSCBHt3C4Y24J4u8H3/0G6ajHdo1CLkRFKpVRyYCtwD2ga+LjW+onW2llrvQn4EaMoT9+wthNKq/1/LRzO679aTk5O5M+fn0SJEpEmTRo6dOiAm5tbTHcr1BwdHWna9NM/1q1bt0qgFCI2MfnBmT5w5FdIXgyqn4yUZNJkgmHDoGpVSJUKTp6EH3+MeHdF/CExUgjx1Xi8E3aUAW2CKgdjVTIJMZBQKqUSAZsxiuz8oLV+F9z5Wmtf4AKQJSLtBHeLQL/GCePHj6dPnz6MHTuW169fc+zYMe7evUuVKlXw9vaOlj74+vpGy32EEF8pbzdjHciVMZCtPVTcBQkjXjvt+XOoUQOGDIGmTeHECRmZFGEjMVII8dW4tcAYmbR1gKrHIm0GT3SK1oRSKWUBrAKyAzW01s9CcU1CjNHDOxFpJxj1/H89FYE2vipv3rxhyJAhTJ06lerVq2NpaYmDgwMrV67k7t27H6fweHp60rBhQ+zs7ChcuDDnzp372Mbo0aNJnz49dnZ25MyZk927dwNgMpkYNWoUWbNmJUWKFDRo0ICXL18C4OLiglKKefPm8c0331CxYkWqV6/OtGnTPulfgQIFWLt2LQB//vknGTNmJHHixBQpUoSDBw8CsG3bNkaMGMGKFSuwtbWlQIECAJQvX565c+d+7Mvw4cPJlCkT9vb2NG/enNevX3/Sl4ULF/LNN9+QMmVK/v7776j6Ty6ECOzNNdhREp7shGIzoPiMSCkqcOQIFCoE+/fD7NmwcCHY2ERCf0W8ITFSYqQQXwWt4fwQON4KUpeHygfBJmNM9ypconsN5XSgJvAnkFwpVTLAsTNAHaAGsA14xP/XUKbl/3tQhtiO1toLQCl1E9ivtW7t/9kRsAMOA2+A74BewFqt9fkIPVnXrnD2bISaCFHBgjBpUoinHTlyBE9PT+rUqfPJ97a2ttSoUYOdO3eSM2dONmzYwLJly/j333+ZPHkyv/zyC9evX+f27dtMmzaNkydPki5dOlxcXPDz8wNgypQprF+/nv3795MqVSr++OMPOnXqxLJl/y/Cu3//fq5cuYKZmRmrVq1i1qxZdO5sLF29fPkyd+/e5YcffgCgWLFiDB48mCRJkjB58mTq16+Pi4sL1atXp3///sGuYXFycsLJyYm9e/d+DJadO3dm8eLFH885dOgQ165d4/r16xQvXpw6deqQW4YyhIhaD/+DI03AzAoq7Qb77yLcpNYwcSL06QOZMsHRo0ZiKQxKqWwY8awkkA84qLUuH+icpBix9BeM2T0HgS5a65uBzsuDsadzKcANmAsM1Vr7hbuDEiM/3kdipBACP2848TvcWQRZWkDx2WBmGdO9CrfonvJa1f/XycDRQD9pgWtACoyAtx0YiZFYFtVanwlDOx9YAOYBPl8FvgcWAFuAJsBY/1/jDFdXV1KmTImFxefvC9KmTYurq7GNZ5EiRahXrx6WlpZ0794dT09Pjh07hrm5OV5eXly+fBkfHx8cHBzImjUrALNmzeLvv/8mQ4YMWFlZ4ejoyOrVqz+ZuuPo6IiNjQ3W1tbUrl2bs2fPcveuUaR3yZIl1KlTBysrKwCaNm1KihQpsLCwoEePHnh5eXHt2rVQPeeSJUvo3r07WbJkwdbWlpEjR7J8+fJP+jJkyBCsra0pUKAABQoU+OQNsxAikmkNl0YY01xts0J150hJJt3coG5d6NHDWCd56pQkk0HIi/Gi9br/T1BWANUwXsY2wYi3u5VSH/dtUUolwyiKpzGqpw8DegBDo6zn0UxipMRIIWKU9yvYV91IJvMPhRLzY3UyCdE8Qqm1dgjhFBfgh0hoJ8jztNbLCWaLkAgJxVvR6JIyZUpcXV3x9fX9LGA+fvyYlClTApAx4/+H1c3MzMiQIQOPHj2iXLlyTJo0CUdHRy5dukS1atWYMGEC6dKl4+7du9SuXRszs/+/izA3N+fp06cfPwds187Ojh9++IHly5fTp08fli9f/snWK+PHj2fu3Lk8evQIpRRv3rz5GMxD8ujRIzJlyvTxc6ZMmfD19f2kL2nSpPn4+0SJEuHu7h6qtoUQYeT7Do61hHurIFNjKDEXLBJFuNnTp6F+fbh3D8aPh27dgqig7ukJ48ZBhw6QIkWE7xlLbdJabwBQSq0GUgY8qJQqhfEytpLWeo//d8cxlpO0Bcb5n9oesAbqaK3fADv9E05HpdQY/+/CTmLkx88SI4WIx9xvw74fwP0WlFoMmcNbU/TrEqP7UIqoUapUKaysrD6uwfjg3bt3bN26lUqVKgFw//79j8dMJhMPHjwgXbp0ADRp0oRDhw5x9+5dlFL06dMHMALh1q1bcXNz+/jj6elJ+vTpP7alAv1rr3HjxixbtoyjR4/y/v17KlSoAMDBgwcZPXo0K1eu5NWrV7i5uZEkSRK01kG2E9iH4P3BvXv3sLCwIHXq1GH67yWEiCD3O7CjNNxbDQXHQOklEU4mtYYZM6BUKfD2NtZMdu8eRDJ5+LAxXDloEKxbF6F7xmZaa1MIpxQEfIH9Aa55Cpzn0xe5NYDtgRLH5RhJ5veR0tkYJjFSYqQQMcL1GGwvCZ5PocLOOJNMgiSUcVKSJEkYMmQIXbp0Ydu2bfj4+ODi4kL9+vXJkCEDzZo1A+DUqVOsXbsWX19fJk2ahJWVFSVLluTatWvs2bMHLy8vEiZMiLW1Nebmxszh9u3bM2DAgI9B6vnz52zYsCHY/tSsWZO7d+8yePBgGjZs+PHN7du3b7GwsCBVqlT4+voybNgw3rz5/79hUqdOjYuLCyZT0P9Oaty4MRMnTuTOnTu4u7vTv39/GjZsGOQ0JiFEFHm6F7YXg3f3oPwWyNMrwpswv31r7C3ZsSNUqgRnzkDp0kGc1LkzlCtnbEK5bRu0aROh+8ZxCQHfINZBegEBF83lwlge8pHW+h7g4X8s1pMYKTFSiGh3bzXsrgCWdlD1KKSOE+/nPpKEMo7q3bs3I
0aMoGfPniROnJgSJUqQMWNGdu/e/XFtxs8//8yKFStIliwZixcvZu3atVhaWuLl5UXfvn1JmTIladKk4dmzZ4wYMQIwKs799NNPVK1aFTs7O0qWLMnx48eD7YuVlRV16tRh165dNGny/+Wq1apVo0aNGuTIkYNMmTKRMGHCT6YC1a9fH4AUKVJQuPDn24S2atWKZs2a8d1335E5c2YSJkzI1KlTI/zfTggRClrDtamwpwpY2UO1E5CueoSbPX8eihaFlSthxAjYvBlSpgx00pYtkDcvTJ8OXbrApUtQLXbt2RUDbgIJlVL5P3yhlLLGKOCTPMB5yTAK8QT2yv/YJ54/f07RokU//gScrvk1kxgphIgWWsPlsXCoPiQrbGwLkjhnTPcq0qkPUydE0IoWLaqdnZ0/+/7KlStSCS2Okz9jIb7AzxNOdoTbCyD9T1B6MVgmDvm6YGgN8+cbg47JksGyZfB94Be4rq5GtdAlSyBPHpg715gTG4mUUqe01kUjtdFo9mENZcAqr0qpBBgjj0+AlhiVzkcBvwI+Wmtr//N8gJ5a68mB2nwIOGmtBwT8XmJk/CV/xkKEwOQLzp3h5iz4pgGUWgjmCWOkK+/eRXyLreDio4xQCiGECD2Ph7DreyOZzDcYvlsX4WTy3Tto0cKYsVqmjDHF9ZNkUmsjw8yd2xi6HDLEqNYTyclkXKa19gYaAakxEstHQBZgEfA0wKmvgKRBNJGEoEcuhRBCBObzBvbXMpLJPP2gzLIYSSbd3aF1ayhbFry8ou4+MpFeCCFE6Dw/DAfrGhVdy62DjL9EuMkrV6BePeNXR0cYOBDMA272dP++Ub31v/+geHGYNw/y5YvwfeMjrfUJ//0qc2Csp7yllNoMHAtw2lUCrZVUSmUEbAi0tlIIIUQQ3t0zksnXV4yK51lbx0g3jh+HX3+F27ehb98IlzcIloxQCiGECNmNWUZBAQs7qHY8UpLJRYuM9ZKurrBjhzHw+DGZNJmMMq9588LevTBhAhw5IslkBGnDNf9kMjtQGZgX4JStQDWllF2A7xoC7wlQIVYIIUQQXpyE7cXh3V2jUF0MJJO+vjBsmDHjx8cH9u0zahIkSBB195QRSiGEEF/m5w2nusDN2ZC2BpRZCgmSRqhJDw+jls78+cbU1qVLwX83BsO1a/D773DwIFSuDLNnQ+bMEbpnXKeUSgTU9P+YHkislKrn/3mL1tpDKTUIY5TRFcgPDAKWa613BmhqJvAHsFYpNRpjWqwjMCHce1AKIUR8cH8tHGkKCVNDpT2QJE+0d+H2bWjaFI4eNUYnp02DpEmj/r6SUAohhAja+8dwsB64HjHWgHz7F5iZh3xdMK5cgfr14fJlY3rrkCHwcRcDHx8YO9Z4tWptDQsWwG+/Re08nbjDHlgV6LsPnzMDLkAKYBKQErgPjAPGB7xAa/1KKVUJmAZswlg3OREjqRRCCBGY1nBlDJztCylLwXfrIaF9tHdh4ULjZa25ufGitnHj6Lu/JJRCCCE+53ocDtYBbzcouxK+qR/hJhcvhvbtjUpz27ZB1aoBDp46ZVQOOHfOWFQ5dSqkSRPhe8YXWmsXINjMW2vdFegairYuAxUjo19CCBGn+XnDyQ5wez5kagQlF0R78Z2XL6FdO1i9Gr77zlhOkilTtHZB1lAKIYQI5NYC2PUdmFkZGzBHMJn08DAquDZvbqyZPHs2QDLp4QG9e0OJEvDsGaxdC6tWSTIphBDi6+b9CvZVN5LJfIOg9JJoTyZ374Zvv4X162HUKNizJ/qTSZARSiGEEB/4ecPpbnBjOqSpDGWWg1WKCDV59aoxxfXiRRgwwKjk+nGK6759xlrJmzeNjHPs2OhZ7CGEEEJExNubsO8HeHcHSi2CzM2i9faentC/P0ycCDlzwoYNUKRItHbhE5JQCiGEgPdP4FB9eH4IcveCAiPALGIhYskSYxqOtbUxxbVaNf8Dbm7GqOScOZAli/GKtaLMsBRCCBELPDsIB2sbCxcr7gL776L19ufPGwV3Ll6ETp1gzBhIlChau/AZmfIaRzk4OGBtbY2trS2pU6emZcuWuLu7U758eebOncuSJUuwtbXF1tYWa2trzMzMPn62tbX9rL0WLVowcOBAAFxcXFBKfTw3derU1KpVi507d37WhwQJEuDq6vrJ9wULFkQphYuLS5Q9vxAiDFyPw7Yi8PK0MSpZaEyEkkkPD2M5ZNOmULiwMcX1YzK5fj3kyWPsJ9mzJ1y4IMmkiHYSI4UQ4XJ7IeypBAlSGFtoRWMyaTIZO2gVKwbPnxvbM0+bFvPJJEhCGadt2rQJd3d3Tp8+zcmTJxk+fPjHY7/++ivu7u64u7uzdetW0qVL9/Gzu7t7qNp3c3PD3d2dc+fOUaVKFWrXro2Tk9Mn52TOnJlly5Z9/HzhwgXev38fKc8nhIgEt+YFWC95BDI1jFBzly9D8eJGgdYBA4z1HOnTA0+fQoMGULs2pEpl7Lg8duzXEQlFvCQxUggRatoEZ/vBsRaQqpxRX8AuW7Td/v59qFIFevSAGjWMd7E1a4Z8XXSRhDIeSJ8+PTVq1ODixYtR0n6aNGn4888/cXR0pE+fPphMpo/HmjVrxqJFiz5+XrhwIc2bN4+SfgghwsDPG050gONtwL48VHeGZAUi1KSTk1F059kz2L4dhg8HC3NtHMid21jk8fff4OxsnCjEV0BipBAiWL7vjC20Lo+CbG2hwjawSh5tt1++3Ci8c/w4zJ0L69YZ72VDzdMTzpyJsv6BrKGMNF27GtO6olLBgjBpUtivu3//Plu2bKFOnTocPHgwsrv1UZ06dejVqxfXrl0jd+7cAJQsWZLFixdz5coVcuTIwYoVKzh06NDHqUFCiBjwyf6SfeDbvyO0v6S7u7GOY9EiqFDBWDuZNi3g4gJt28LOnVCmjBEJc+WKtMcIyNPXEytzK5TsWflVkhgpMVKIWMnjIez/EdzOQeGJkPPPaNsb+fVrI7YuWWIUQv/3X8gWlkHR589h+nT45x+jz3fvQsKoqUIrI5Rx2C+//ELSpEkpW7Ys33//Pf3794/S+6VLlw6Aly9ffvL9hzewO3fuJFeuXKRPnz5K+yGECIbrMWO95KuzUGYFFBwVoWTywgVjPcfixTBkiJE7prX3M/5lnzcvHD1qBLMDB6IkmfT282bGyRlknZKVXbd3RXr7Iu6SGCmECNYLZ9hezKjo+t1GyNU12pLJffsgf35jdNLREQ4dCkMyefWqURHvm2+Mi4sXNxqysoqy/kbrCKVSqj7QDCgCJAGuAeO01sv8jycA/gWKAmkBd8AZGKi1PhWorTzAVKAU4AbMBYZqrf1C6EMSYBLwC0ZCvRn4Q2v9IiLPFp63olFt/fr1VK5cOczXjRgxghEjRgDQtGlTZs6cGarrHj58CEDy5J9OA2jWrBnfffcdd+7ckak8QsQUreHmbDjVBRJlhGrbIWn+CDU3bx506WLs9LFrl39tnUuXjIo8x48bCzxmzDCCWiTzM/mx9MJShuwbwh23O5TJWIakCZNG+n1E5JAYKTFSiFjl
3mo42hwS2hv1BZLmi5bbenkZ9QcmTDASyMOHjdHJEGkN+/fD+PGwebORPDZvDt26GUtOolh0T3ntDtwBugGuQE1gqVIqpdZ6KmAOaGAkcAtI7H/uHqVUIa31bQClVDJgF3AZ+BnICozHSBBDmieyAsgJtAFMwGhgPVAu0p4yluvfv3+43tSuW7cOe3t7cubM+cn3mTJlInPmzGzZsoV58+ZFVjeFEKHl5wnOnY0CPGlrQJklkCBZuJt78wbat4dly4wiAYsXQ+pk3jB0pLFGMkkSY45O48aR/jZXa836q+sZuHcgl59fplCaQmxpsoXq2arLdFcRLSRGChGHaQ2XRsD5gZCyNHy3zkgqo8GFC8Z2IBcuGDF23DiwsQnhIh8fWLXKSCRPn4aUKY3pQh07gn309BuiP6H8UWsdsD72HqVUOoxEc6rW+j3wSYlBpdQu4AXGiOIE/6/bA9ZAHa31G2CnUiox4KiUGuP/3WeUUqWAasD3WusD/t89BI4rpSprrWW+VDg8ffqUVatWMXToUCZPnoyZ2eczqefNm8erV6+wsbHB19c3BnopRDz17j4crAsvT0LegZDfMUJTXE+fhoYN4fZto+hOv35gduIYVGpjjE7++qux03KYKgaETGvNzts7GbBnAM6PnMmZIicr662kbp66mClZvSG+XhIjhYgl/Dzh+O/g8i84/Aol5oJ51Kw5DMhkMsJm//6QLJkxwPjDDyFc9Pq1sZfzlClGCdicOWHWLGjWzNj8OZpFa0IZKJn84AzGKOOXvAM8gQQBvqsBbA+UOC7HGG38Htj0hbZqAE8/JJP+fTqhlLrjf0wSyjBImjQpWmtsbGwoWrQoq1atonr16kGemzVr1mjunRCCp/vgUAMjSJZbBxl/CXdTWhv7XfXsaeSK+/ZBuULu0H2gEdAyZDA2xYqCOuaH7x1mwJ4B7L+7n0xJMrHg5wU0/bYpFhHYK1OIqCYxUohY5P0TOFAbXhyDb4dD3v7Rsl7y3j347Tcjpv7yC8yeHcL7WBcXmDzZKHLn7g7lyxt1Cn74AYJ4WRVdvoZoXBpj6upHypi3ZA6kBHoAfsCyAKfkAvYEvEZrfU8p5eF/7EsJZS7gahDfX/E/Fmd8aUPkffv2ffZd+fLlefDgQbDtBdw7y8HBAa11uPtgYWERquuFEOGkNVybDGd6gl12I5lMEv6/4l69glatYP16qFXL2GMy5ZmdkL+tEdw6dYKRI8HOLtIeAYxEcuj+oey8vZPUNqmZWmMqvxf+HSuLqCssIOIHiZFCiI9enoEDP4PXCyi3BjLWifJbag1Llxrh088P5s+HFi2CyWFPnDCmta5ebSSODRtC9+5QuHCU9zU0YjShVEpVwhidbBXoUB+MdZQAz4GaWuu7AY4nwyjEE9gr/2NfEtx1WYK64Pnz5xQNsF9a27Ztadu2bTC3EEKIGOTrYUzZubsUMtSGUk5gmTjczR09Co0awePHRpGArs1fonp0h4ULjSk2Bw9C2bKR13/g0L1DOO5zZPed3djb2DO2ylg6FO2ATYKQFpMIIYQQYXBvjVF8xyoFVD0MyQpG+S1fvoQOHWDlSmNHrUWLIEtQWYifH2zaZCSShw4Z9Ql69DCq4WXMGOX9DIsYSyiVUg7AUmCD1top0GEnjOmnaYGOwGal1Hda64AjmUG9vlNf+D6gMF2XKlUqnJ2dQ2hSCCG+Au634UAdcDsPBf6GPH0hnOsLTSYYO9aoNvfNN3D4kKbY3dWQp7MRDQcMgIEDI3VPqwN3DzB0/1D23NmDvY0946qMo33R9pJICiGEiFxaw8XhcGEwpChpFN+xThPlt922zZjx4+oKI0ZA795gHrisgbs7ODkZ5bFv3QIHB+P3rVpF+kygyBIjCaVSKjmwFbgHNA18XGv9BHjif+5W4BLQF/hQT/sVkDSIppMQ9AjkB6+AoGYmJw3hOiGE+Lo92gpHfjWCZPktkC7otVqh8eyZUW18+3aoXx/mDH1Ekn4dYcMGKFIEduyAAgUirev7XfYzdP9Q9rrsJbVNaiZUnUC7ou1IZJko0u4hhBBCAOD7Ho63grvLwaEZlJgd5cV33r0zksfp040tmrdsgYIFA5304IFRrGDWLHBzg1KljOUktWuDxdewSvHLor13SqlEGHs/JgB+0Fq/C+58rbWvUuoCn05JvUqgNY9KqYyADUGvkQx4XVDbg+TC2DpECCFiF22Ci3/DhSGQ9Fv4bi3YBjmDP1R274amTY11kzOnm2hrNhdVspdRmnzsWOjaNVICm9aarTe3MvrwaA7cPUAa2zRMrDaRtkXaSiIphBAiang8ggO/wEtnKDgKcveO8uI7x48bxVdv3jSWPf79d6DJPadOGWVeV6wwpgfVrWvsH1mqVJT2KzJFa0KplLIAVgHZgTJa62ehuCYhUBg4HODrrUAvpZSd1vqt/3cNgffA/mCa2woMUkqV1Vof8m+/KEayujWszyOEEDHK281Y+/FwEzg0heKzwCJ8yZivLzg6GlNwcuaE7XPu8e14/9JzFSoY5ckjoRKlt583yy8uZ+yRsVx8dpEMiTMwsdpE2hVph7Vl9Jc6F0IIEU+8PAX7fwKf1/DdesjwU5TezsfH2F7r778hfXrYs8coygoY6yM3bzYSyf37jamsXbrAH38YU1xjmegeoZwO1AT+BJIrpUoGOHYGqIOxfcc24BH/X0OZlv/vQQkwE/gDWKuUGo2REDoCEwJuJaKUugns11q3BtBaH1VKbQcWKaV6AiaMrUYOyR6UQohYxe2iUeL8nQsUmQo5OoX7Leu9e9CkCRw+DK1amJiSdTI29fuDlZWRSLZuHeE3uG+83jDn1BwmHZ/EgzcPyGefj0W/LKJhvoYkME8QcgNCCCFEeLksh+MtIWFqqHIEkn0bpbe7etUYlXR2NrYFmTzZqKmDu7tRKn3yZGN95DffGEV3Wrf2PyF2iu6Esqr/r5ODOJYZuIaxpnICRkXWx8BxoKjW+tKHE7XWr/wrxE7D2CLEDZiIkVQGZIGx/UhAjfzPnQ+YYUy//SO8DySEENHu7go41sqo3lp5H6QqE+6m1q831vn7+MCSES40WV0XnE4bG2L98w+kSxehrj5++5gpx6cww3kGr71eU96hPLNrzaZ6tuqoaNjjSwghRDymTXBuIFweCanKQbnVkNA+ym5nMhmhs3dvsLGBNWugTh2MN7fDpxovaV+/hpIljSlBdep89esjQyNan0Br7RDCKS7AD6Fs6zJQMaz301q7AS39f4QQIvYw+cCZPnBtopFEll0F1mnD1ZSnJ/TqZaz/L1zIxIriE8g2qC+kTAmrVhlrOCKQ8D1484C/D/zN/LPz8TX5Ujd3XXqV7kWx9MXC3aYQQggRaj5v4EhTY1lI1t+h6DSIwhkx9+9Dy5ZGLYIffoC5cyGNyzFoONHILOH/6yNLlgy+sVgm9qfEQggRH7x/CocbwrP9kKMLFBoX7sB47Zqxt+TZs9CtwQNGnq6O1axLxq7K48dD8uTh7ubzd88ZeWgk009Ox6RNtCrUip6le5ItebZwtymEEEKEydt
bcOAneHPNSCSzd4yy4jtaw7//Gksg/fxgzkxfWiddi6o9EY4dM6aydutmnPDNN1HSh5gWvg3KxFfPwcGBXbt24eTkhLm5Oba2ttja2pI5c2ZatmzJ9evXP57r4uKCUorChQt/0oarqysJEiTAIRYuDhYiTnl+FLYVhhcnoNQiKDolXMmk1rBwobHzx/17JjbV+IcJKzNi5fvO2ApkwYJwJ5Nunm4M2jOIzJMzM/n4ZJrkb8L1LteZWWumJJPiqyMxUog47Mlu2F4M3j+BCjsiVGMgJM+fQ716xlZb+XP5cK79DNoMz4xq1NA4OGWKsR3I2LFxNpkESSjjhVKlSuHu7s7r16/ZtWsX1tbWFClShIsXL35y3rt37z75bunSpWTOnDm6uyuE+EBruP4P7P7e2COr6lHI3CxcTb19axQIaNECiji84GyCEtTa1sXYBuTiRahSJVztvvN+x8iDI8kyOQvDDw6nVo5aXO54mfk/z8chqUO42hQiOkmMFCKO0BquTYW91cA6HVQ/CWmCXR0XIZs2Qf78sHmTiTHFVrLvTBKyjOsIuXIZ+zZfu2aMStraRlkfvhaSUMYj5ubmZM2alenTp/P999/j6Oj4yfFmzZqxcOHCj58XLVpE8+bNo7mXQggAfD2MLUGcO0OaqlDdGZIVCFdTzs5QqBAsW6YZlm8ley7ZkyHFezh61ChZbmMT5ja9fL2YcnwKWadkpf+e/pT5pgxn2p1heb3l5EyZM1z9FCImSYwUIhbz84YTbeHUH5DuB+MFbAT2ZA7OmzfQppWJn36CNO43cPYpQK9LLTFv9RtcugQ7d8JPP4F54LqgcZesoYwsp7rCq7NRe49kBaHIpEhpqk6dOvTr1++T75o2bUq5cuUYNWoU169f5+3bt5QoUYI5c+ZEyj2FEKH09hYcrANuFyD/MMg3AFTY3/+ZTEa+2K+fJk1iD/bbNqDstZ3gOBj69YME4Zk2q1l5aSV9d/fFxc2Fipkrsq7COkpljD0bMIsYIDFSCBFV3j+FQ3Xh+WHIOwC+HRaumBka+7e8o0VTH+69sqMfIxiSciFWQ9sa5dKTJYuSe8YGklDGU+nSpePly5effJchQwZy5szJrl272Lt3r7x5FSImPNxsVKVTZlB+C6SrHq5mnj41prdu2wa1Ux9h7tMfSV4yJ8w9A3nzhqvNo/eP0mNHD44+OEqB1AXY2WwnlbNUDldbQnzNJEYKEUu8cIaDtcHrBZReBg6NouQ27x+8YEC9a0w6XpIsPOZg4UGUHlwZal2OVyORXyIJZWSJpLei0eXhw4ckD6L4RvPmzXFycuLIkSMcOHCAGzduxEDvhIiHTH5wcShc/AuSFYJya8A2fOuzdu6EZs00bi/8mJ6gJ+3fzEFNHgmdOoUr8N15dYe+u/uy8tJK0tqmZf5P82leoDnmZhJERShJjBRCRLY7/8KJ38HKHqochuSFIv8ejx/j3GsFzZbW4KouTcfMWxmzMDU25aZG/r1iMVlDGU+tW7eOcuXKffZ93bp1+e+//8iSJQuZMmWKgZ4JEQ95vYD9tYxkMktLIzCGI5n08YG+faFaNU2KNy6c9C1Eh/JXUJcvwR9/hDmZdPN0o9eOXuT6Jxebr29myPdDuNHlBi0LtZRkUsRpEiOF+IqZ/OBMLzjaDFIUN2oMRHYy6eKCT7vODMk4n5JLOvPWOhU7Zrvwz+0a2JQrHPL18YyMUMYjfn5+3Lt3jwkTJrBv3z6OHj362Tk2Njbs2bOHZPF4HrgQ0erFSThYDzyfQLGZkK1tuMqb37oFTRqbOHHSjLZm85iY0JFEM0cYpV3D2J6Pnw+zTs3CcZ8jL9+/pEXBFvxV4S/SJ04f5n4JEVtIjBQiFvB+BYcawZMdkL0TFJkIZpaR1/61azBiBJf+PUNz7cRpXZhmv7xlyoLkJE0a/j2a4zpJKOOBo0ePYmtri9aalClTUr58eU6ePEnu3LmDPL9o0aLR3EMh4iGt4dYccO4CCdNAlUOQoli4mlqyBDq088P8/TtW0Yp69S1g8ilInTrMbW29sZXuO7pz1fUqFTNXZHzV8RRMUzBc/RIiNpAYKUQs4XYJDvwMHveg+BzI1iby2n70CBwd8Zu7gInmPRmo5pI4mRlrZkOdOnaRd584ShLKOMrFxeXj71u0aBHsuQ4ODmitgzxWuXLlT9oSQkQCXw842RHuLIS01aD0ErBKEeZm3r6Fzu19WbTUgrIcYYl9d76ZM8goVx5Gl59fpvv27my/tZ3sybOzsdFGauWohYqizaCFiEkSI4WIZR5sMArWWdhApX2QqnTktPv6NYwZAxMncsvnG1qkvcqhR1n55ReYNQvs7SPnNnGdJJRCCBGd3t6Cg3XB7RzkG2z8hGM9orMzNP7Fg9sPrRiCIwN/f4bF2F2QJEmY2nH1cMVxnyMznWdiZ2XHxGoT6VisIwnMw76liBBCCBGptMmoL3DBEZIXg+/WQqIMEW/XywtmzIDhw9EvXjCz2Dx6XmyB5TsznJygefNwrT6JtyShFEKI6PJgk1FEQJnB9/9B+pphbsJkggnDPeg/NAFpTK7sS9+Hckvaw/ffh6kdHz8fpp+cjuN+R956vaV90fY4lnckZaKUYe6TEEIIEel83sCRZvBwI2RuDsVngXnCiLVpMsHSpTBwINy9y/1yTWjtM4OdxxJTpQrMmwcZM0ZO9+MTSSiFECKqmXzh/GC4PBKSF4Gyq8HWIczNPHkCv1V/yo5zqanDWub8cYHko+aDtXWo29Bas+XGFnrs6MG1F9eomrUqE6pOIK99+PamFEIIISLd66tw8Bd4exOKTIEcnSM+ZLh7N/ToAefOoQsWYmH9zfw5Ox9+fsZgZbt2MioZXpJQCiFEVPJ8BoebwNPdkPV3KDolXG9Yty55SYvWZrzxSszMjH/Tdl0NVJE6YWrj3JNz9NzZk123d5EjRQ42N95Mzew1ZZ2kEEKIr8eDjcZ6SfOEUHE3pA7bDJzP23sA3bvDqlWQOTNPpq+l3dZf2DhOUa4cLFgAWbNGTtfjK0kohRAiqjw/DIcagPdLKDEPsrYKcxOe7zV9f7rM5F15ya8usKfrcfKO6Q2WoS+T/ujtIwbuGYjTWSeSWSdjUrVJdCzWEUvzSCy1LoQQQkTEJ+sli0K5tWATgfmn3t4waRIMGwZ+fjBsGKuy9KHDnwlwd4fx4+HPP8O8RbMIgiSUQggR2bSGa5PgTG+wyQRVj0KygmFu5vL2+zSp580597x0SbeGMdu+JWH+0JdJf+f9jrFHxjL2yFh8Tb50L9WdAeUGkMxa9tATQgjxFflkveRvUGwGWIR+Ocdn9uyBTp3g6lX46SdeDJlCpzGZWDEYihWDhQvhCzsDiXCQhFIIISKTzxs41hrur4YMv0DJBZAgaZia0L5+zG68h26ry2CjPNjcaSs/TKkNZmahut7P5MfCcwsZuGcgj90f0yBvA0ZWGkmWZFnC/jxCCCFEVPq4XvIWFJkKOTqFfzHjw4fQsycsXw6ZM8OmTWzwq0W7mvDyJfz1F/TtCxaSAUWq0P3rJJIopeorpT
YqpR4qpdyVUqeUUo0DHE+rlBqrlDrnf/y+UmqhUipdoHb2KaX0F35KBXN/hy9cszwqn1sIEU+4XYBtxeDBOig4xpiuE8Zk8sXBy9S1P0D71VUom/Ia509688O0GqFOJnfe2knh2YVpvbE1mZJm4nCrw6yot0KSSSGEEF+fBxtgRwnwegmVdkPOcBbf8fEx5rDmygXr1oGjI68OXaL5ylr88gukSQMnTxrFXSWZjHzRmlAC3QF3oBvwE7AXWKqU6uJ/vAhQG1gG/Aj0AkoAR5RStgHa6QiUCvSzE3AFToaiHz0DXTswQk8lPvP06VO+++477Ozs6NGjR0x3R4iod2cxbC9hjFBW2gN5eoUtKHp5sbf5Agp8l5jNr8owrulZtj0pSNoi6UK+Fjj/9Dw1ltSg6r9Veev1lhX1VnCk1RFKZ4ykzZ/FV00plU0pNcv/hayfUmpfEOekVUotCPBS94xS6tdA57T4wovX9tH2MHGcxEchAJMfnBsEB34BuxxQ/RTYfxe+ts6cMeax9uxpbKF1+TJbiw8hXzFrli6FQYPgxAkoUCBSn0AEEN05+o9aa9cAn/f4jz52B6YCh4BcWmvfDycopU4D14C6wEIArfXlgI0qpRIARYEVAa8NxjWt9bEIPclXzNb2/7m3h4cHVlZWmPuvOJ41axa//vrrly6NNLNnzyZlypS8efNGKkiKuM3PE051hZuzjGBYZjlYpw1TEz4HjjKkzgVGvWhD9sRPObbOg8IVC4bq2gdvHjBo7yAWnl1IkoRJGFtlLF2Kd8HKwirszyJis7xATeAYkCDwQaWUGbARSAH0Bp4A9YB/lVIeWut1gS6pCLwP8Pl2VHQ6ukl8FOIr4PUSjvwKj7dB1tZQdFr49pf09obhw2HkSEiZEtau5U2l2nTvbuwnmTcvbNwIRYpE/iOIT0VrQhkomfzgDPCz/3G3IK65rpTyAOyDabo6kAxjZDPec3d3//h7BwcH5s6dS+XKlT87z9fXF4soGve/e/cuefLkCVewjMp+RUf7Ih5xvwOH6sPLU5C7NxT4G8zC8L8td3dudJzIr4urcZK2tK52n8lrMmJjE/Klrz1fM/rwaCYem4hJm+hRqgf9yvUjuXXy8D+PiM02aa03ACilVgMpAx3PgfHi9Set9Sb/73YrpUoAjYDACeVJrbU7cYzEx5htXwhenYUDdeD9Qyg+G7L9Hr52nJ2hZUu4eBGaN4eJE9l9Jjmt8hu7hPTpA0OHgpW8W40W0T3lNSilgctfOqiU+hZIFNw5GMHwIXAwlPdc4D8l6LFSaoJSKgJlpGKPffv2kSFDBkaPHk2aNGlo2bIlr169olatWqRKlYpkyZJRq1YtHjx48PGa8uXLM2jQIMqUKYOdnR1Vq1bF1dV4L+Dp6UnTpk1JkSIFSZMmpVixYjx9+pQWLVqwcOFCxowZg62tLbt27cLLy4uuXbuSLl060qVLR9euXfHy8vpivxwdHalfvz5NmzbFzs6O/Pnzc/36dUaOHIm9vT0ZM2Zkx44dH/v5+vVrWrduTdq0aUmfPj0DBw7Ez88PACcnJ8qUKUO3bt1Injw5jo6O0fcfXcRdDzbC1sLGpsvfrYdCo8OUTOqt25ifaSiFFnfjplVeVi9+z9xtISeT3n7eTDk+haxTsjLy0Ejq5q7Ltc7XGFt1rCST8ZjW2hTCKR/2iHkd6Hs3IN4Pk0l8lPgoosGdf2FHaTB5Q+UD4UsmPT2hXz8oWdKosrN5M+7/LKTToORUrgzW1nD4MIwaJclkdIrR11BKqUoYo5NBbs7mP0VnMnAD2PGFcxJhrLecrbXWIdzSC/jHv603QHmgD5DVvx/h1nVbV84+ORuRJkJUME1BJlWfFKE2njx5wsuXL7l79y4mkwkPDw9atmzJypUr8fPzo1WrVnTu3Jn169d/vGbp0qVs3bqVjBkzUqNGDcaNG8eoUaNYuHAhr1+/5v79+1hZWXH27Fmsra1xcnICIEOGDAwfPhyAwYMHc+zYMc6ePYtSip9//pnhw4fz119/Bdmv0aNHs2nTJjZs2ICTkxOtWrWiWrVqtGnThocPH+Lk5ES7du24c+cOAL/99hupU6fm5s2bvHv3jlq1apExY0batWsHwPHjx2nUqBHPnj3Dx8cnQv8NRTxn8oVzA+DKGEhWGMqtAtswFLx58YJXHQfQbmVFVjGW8oVes2hDEjKGsNWW1prVl1fTb3c/br26RcXMFRlTeQxF0slcHhEqF4HjwDCl1O/AU6AOUAaoEcT5t5RSKYBbwASt9ayI3Dw2xEiJjxIfRRQx+cDpHnB9Kth/D2VWgHXqsLdz7Bi0agVXrhi/jh/PvrNJafUtuLhAt27w999GUimiV4yNUCqlHIClwAattdMXThuJUTSnmdb6S3/L/QjYEorprlrrx1rrzlrrjVrrfVprR4z1mz8ppQoGdc3z588pWrTox5/Zs2eHdJuvmpmZGUOHDsXKygpra2tSpEhB3bp1SZQoEXZ2dgwYMID9+/d/ck3Lli3JkSMH1tbWNGjQgLNnzwJgaWnJixcvuHnzJubm5hQpUoTEiRMHed8lS5YwePBg7O3tSZUqFUOGDGHx4sVf7BdAuXLlqFatGhYWFtSvX5/nz5/Tt29fLC0tadSoES4uLri5ufH06VO2bt3KpEmTsLGxwd7enm7durF8+f+L96ZLl44uXbpgYWHxsX0hwszjEeyuaCST2dpB1cOhTya1hhUr2J+1FQVW9medWV1G/uXLrpMhJ5Pnnpyj/MLyNFjdAGtLa7Y02cKuZrskmRSh5v/CtQZG3L+OMVI5G2iltd4T4NTHwCCgGUZ8PQ7MVEp1C6rduBQjJT5KfBRR4P0TI25enwo5u0HFnWFPJt+/h169oEwZcHeHbdt4N2UeXQYlpUIFMDeHAwdgwgRJJgPTWnPh6QVWXVoVpfcJ0wilUio/UBxIAyQEXmIEpiNa61dhaCc5sBW4BzT9wjkdMaq8NtZaHw+muUbATa21c2jvH8hqYDpQGDgb+GCqVKlwdg656YiOHEaXVKlSkTDh/xc+e3h40K1bN7Zt28arV8Yf4du3b/Hz8/tYqCBNmjQfz0+UKNHHNSjNmjXj/v37NGrUCDc3N5o2bcrff/+NpaUlgT169IhMmTJ9/JwpUyYePXr0xX4BpE79/79wrK2tSZky5cc+fQh67u7uPHr0CB8fH9Km/X8hFJPJRMYA/0rPGNK/2IUIyZM9cKQx+LhDqX8hcxiKdzx4gE/7Ljj+V5SRrCNbJh+OrDKnWLHgL3vh8YLBewcz89RMkiVMxswfZtKmcBvMzcwj9iziqxFZcTUU9zEDFmMU5WkIPMMo4jNPKfVCa70NQGu9Hdge4NKtSikrYKBSanLgqbVxKUZKfBQikj0/AofqgbcblF4KDo1DvOQz585BkyZw+TK0awdjxnDgbGJafgu3b8Off8KIEZAoUaT3PtbSWuP8yJm1V9ay5soabry8QRKrJPyS6xcszT//OygyhJhQKqWyAB2AX4HUgAljzYUXkBRjfaNJKbUfmItRafWLazn8p6huxqhC9
4PW+l0Q59TFqPraW2u9Ipi2kmC8cR0T0nMEQwf6NU4LXARg/PjxXLt2jePHj5MmTRrOnj1LoUKFCHn2sPEGdsiQIQwZMgQXFxdq1qxJzpw5ad269WfnpkuXjrt375I3b14A7t27R7p0/98OISKV7jJmzIiVlRWurq5fLCYglfREuGkTXBoBF4aAXU6otBeS5AndtSYTzJ7NzZ4zaeIxl5MUpVVLE5OnWGFr++XL/Ex+zD41m4F7B+Lm6UbHoh0ZWmGorJGMIyI7roZSLeAHIIfW+ob/d/uUUhkxYui2YK5dDTQAHIgj1V6DIvFRiEiitTEieboH2GSCqtsg2bdha8NkgsmToW9fSJ4ctm3Do1w1+veHKVMgc2bYt8/YJUSASZs4cv8Iay6vYe3Vtdx7fQ8LMwsqOFSgR6keUZpMQghTXpVSc4FLQEFgGFAISKi1TqW1zqC1tsWovvojcAEjKF1RSpX9QnsWwCogO1BDa/0siHPKA0uAaVrrcSH0vzZgRcSqu9bz//VUBNqItd6+fYu1tTVJkybl5cuXDB06NNTX7t27lwsXLuDn50fixImxtLT8+IY0sMaNGzN8+HCeP3+Oq6srw4YNo2nTIAenwyxt2rRUrVqVHj168ObNG0wmE7du3fpsapIQYeb1AvbVgvOD4JtGUO1E6JPJ69fR5Sswv8MJCnoe5YZdIVatgnnzzYJNJg/cPUCR2UXouKUj36b+lrPtzjK15lRJJuOIyI6rYZAL8AiQTH5wBqOOQGjEixevH0h8FCIcfN8ZW4Kc+hPS1YDqzmFPJh89gurVoXt3qFEDzp/nsG01ChY0csxOneD8eUkmAU49OkXH/zqSfkJ6yi0ox3Tn6Xyb+lsW/LyApz2fsqPZDtoVbUdq23CsWQ2DkNZQemLsC1lFaz1Ta31ea+0X8ASttavWeqvWuiuQCRgMpP9Ce9Mxptj8BSRXSpUM8GOllMoNrAeuAisCHQ8q4DUCzmmtrwR1M6XUTaXUvACfHZVS45VSdZRSlZVSw4CJwFqt9fkQ/lvESV27duX9+/ekTJmSkiVLUr169VBf++TJE+rVq0fixInJnTs333///ReD4MCBAylatCjffvst+fPnp3DhwgwcODCyHoNFixbh7e1Nnjx5SJYsGfXq1ePx48eR1r6Ih54fha2F4OluKDYDSv8LlsFkgh/4+MCoUbzIX576x7rTmvkUK5eQ8xfNqVfvy5c9ePOAxmsa873T97zyfMWq+qvY03wP+VPnj7xnEl+DyI6roXUXSKSUyhno+yKASwjX1gVc/duINyQ+ChFGb67D9pJwd7mxjdZ36yFB0rC1sX49fPstHDoEs2bhsWQdPUalolw5I7zu2QNTpxKq7bXiKpM2sfHaRso7lafonKI4nXWiTMYyLK2zlOe9nrOp8SZaFGwRvS+itdbR9oMRtPQXfhyAFsEcdwrUVkrAB+gbwv2cAnxuBDhjFCPwBm5ivCG2+lIbRYoU0UG5fPlykN+LuEP+jOMpk0nrKxO0Xmqh9frMWr9wDv21p05pXbCg3kVFnS6hq7a0NOkxY7T28/vyJW+93urBewZr6+HWOuHwhHrI3iH6nfe7iD+HCBfAWUdjXIysH4xpsvX8f45ijIJ++JwIsMNICK8AjYHKGC9UNdAxQDtrMKqf18CYJrvY/5wuQd1XYmT8JX/G4hP31mq9MrHWq1No/WhH2K93d9f699+1Bq0LF9b6yhV96JDW2bMbX3XooPXbt5Hf7djE3ctd/3PiH51tSjaNI/qbid/o8UfGa7f3btFy/+DiY7RuG6K1dgjhFCf/n9C05cr/99UK1f201suB5UGfLYSI97zd4FhLeLAeMvwCJReE7u3q+/cwdCheY6cwMOE4xtGRnJlg01IoXDjoS/xMfiw6t4gBewbw2P0xDfI2YHTl0TgkdYi0xxHxij3GkpKAPnzOrLV28d+qayQwHkiMsSVIe4xqrx9cw9jKKyPG/pSXgeZa68UIIURgJl84PxAuj4bkRaHcGrD5JmxtODvDr7/CjRvQpw8efYcxcFgCJk2CTJlg926oWDFKeh8rPHr7iGknpjHTeSavPF9RIn0J/q73N3Vy18EiDPtfR6WwVnktjrFuMT1GNbqAtNa6YWR1TAghotXLU3CwPnjch8ITIeefEJpiFfv3w++/c+WGOU2SX+Xsy29o3x7Gj/9y1bndt3fTY0cPzj09R4n0JVjdYDWlM5aO3OcRsUJkxVWttQtGAhjcOTeB+iGc0x/oH5p7CiHiOc9ncLgxPN0D2dpCkclgHvivsWCYTMZeH/36QZo0sHs3hywr0Kq4kVt27AijRxNs3YG47NSjU0w+PpnlF5fjp/2ok7sO3Ut2p1TGUjHdtc+EOqH034NqPMZmyLcxpowKIUTspjXcmA6nu0PC1FDlIKQsGfJ1r19D797o2bOZkWIQPRIMwdbMnI0b4ccfg77kqutVeu3sxebrm8mUJBPL6i6jYd6GUmUxnpK4KoSItZ4fhcMNwMsVSsyHrC3Ddv2LF/Dbb/Dff1C7Nh5T5jJgfHImTzZGJffsgQoVoqbrXzMfPx/WXFnD1BNTOXL/CLYJbOlUrBN/lPiDzMkyx3T3vigsI5Q9gMlAd/95tEIIEbv5vIHjv8O9lZCuJpRaBFYpQr5u40bo0IGnj020znKJ/27noXp1WLDAeMkamKuHK0P3DWWG8wwSWSZiVKVR/FnyTxJahOFNroiLJK4KIWIXreHaFDjTExJlhCpHIHmhsLVx+DA0agTPnsHUqRwq0ImWFRQ3bxoVXEeNin+jks/ePWP2qdnMcJ7Bo7ePyJY8G5OrT6ZFwRYktkoc090LUVgSSivgPwl6/2cymTAzC6lQroiN5H/m8cCr88aGy+63oeAoyN0LVAj/f376FP74A1auZFOmzrROOoE3Dy2ZPBm6dPl8hqy3nzfTTkxj2P5hvPV+S9vCbRlaYSj2NvZR91wiNonTcVVrLaPvcVQc/Z+sCInPWzjexngJm/4nKOUECZKF/nqTCcaMgYEDwcGBd7uPMWB1Iab8AQ4O8XNU0vmRM1NPTGX5xeV4+3lTLWs15vw4h+rZqmMW0r9JviJhSSidgDrArqjpSuxiY2PDw4cPSZ06NZaWlhI04xCtNS9evCBhQhk9ipO0hltzjD2yEiSDSnvA/ruQr1m8GLp1491bEz2KOTPrZBEKFIC9S8B/P/IAp2s2XNtAr529uPnyJtWyVmN81fHktc8bdPsivnIijsbVhAkT8uLFC1KkSCHxMY6RGBlPuV2CQ3Xh7Y3Qv4QN6PlzaN4ctm2DBg3Y12werX+z5fZt6NwZRo6MP6OS917fY/3V9Sy7uIxjD45hm8CWtoXb0rl4Z3KmDLyzU+wQloSyDzBNKbUL2AO4BTqutdYzIqtjX7sMGTLg6urK3bt38fX1jenuiEiWMGFCMmTIENPdEJHN5y2caAd3l0GaqlB6MSQMYbTQxQXat4ft23H+thW/vpnODWcrevWCv/4CK6tPTz/75Czdt3dnr8tecqfMzZYmW6iRvUaUPZKI1eJsXM2QIQMP
Hjzg+fPnMd0VEQUkRsYzd/41YqelHVTcDanLh+36/fuhSRN48QL3iXPoc701039UZM1qHPouhHe6ccFV16usvbKWdVfX4fzIGYC8qfIyqdokWhRsQZKESWK4hxETloSyIvArxl5WQRXv1UCsDHzhYWZmhr29Pfb2MnVNiFjh1Xk4VB/cbxobLufpG/zbVT8/+Ocf6N8fP8wZVesIjttKkiaNYvfuz6flPHF/wsA9A5l/Zj7JrZMzrcY02hZpi6V5sLsbifgtzsZVS0tLMmf+egtICCFCwc8LTnWFmzMhVTkouwKs04b+epMJRoyAIUMga1Z2D95P65HZuHcPunaFv//+cjX02E5rzenHpz8mkVdcrwBQPH1xRlUaRe3ctcmRIkcM9zLyhCWhnA4cB/4EbmqtfaKmS0IIEYm0hltz4dQf/lNc94Y8xfXyZWjdGo4d4853v9HMYyaHNyekYUOYMQOSBVgy8t7nPROPTWTkoZF4+XrRrWQ3Bn43kGTWYVhXIuIriatCiK+Tu4vxEvalszG9tcAICMueh8+fQ9OmsGMHb+q3ppftdGa3T0COHHDwIJQpE2U9jzFaa84+Ocvyi8tZcWkFd1/fxVyZ873D93Qs1pFfcv1ChsRxc2Q/LAllOqCj1vpKVHVGCCEilc9bONEe7i4N3RRXb2+jvNzff6NtbFnU9jBdlpVCKcW//xozdgIuBzt87zAtN7Tkxssb/JzzZ8ZWGUv2FNmj/rlEXCFxVQjx9Xm4GY42B+0H5dZBxl/Cdv2RI9CgAbi6sv2P//h9XQ0ePlT07AnDhoG1dZT0OsZcc73G8ovLWXZxGddeXMPCzIKqWaviWN6RH3P8SIpEoageH8uFJaHcBRQgDhYPEELEQWGd4nrihDEqefEiL2q3oZ3PVNbMTki5crBokVGB7oP3Pu8ZuGcgE49NJFPSTOxstpPKWSpH+SOJOEfiqhDi62HyhfMD4fJoSFYQyq4Cu2yhv15rmDQJevfGLX1eelQ/y/wpKcmd28gxS5SIqo5Hv3uv77Hi4gqWXVzGmSdnUCi+d/ie7qW6Uzd33XiRRAYUloRyCjBTKWVN0MUD0FpfjqR+CSFE+HxWxTWEKa7v3sHgwUYQTJuW7Y5HaTmrJK6uRtW5Xr3A3Pz/px+9f5QWG1pw/cV1OhTtwOjKo7GzsovyxxJxksRVIcTXweMRHG4Ezw9CtrZQeBJYhGEo8fVraNUK1q5lY4m/aX+3L882m9GvnxFi40JRYG8/b9ZeWcsM5xkcuHsAMNZETqw2kfp56pM+cfoY7mHMCesIJcAwYGigYwqjeIA5QggRU3ze+FdxXR66Ka67d8Pvv8OdO7xv04U+ZmOZ6mhFnjzw339QKMBeze993jN472AmHJtAhsQZ2NVsF5WyVIr6ZxJxmcRVIUTMe7ILDjcB33dQajFkbhq268+ehXr1cHVx589Cl1l6PDfffgubNkORIlHS42j14M0DZjnPYs7pOTx995SsybIyvMJwGuVrRNbkWWO6e1+FsCSU8WyrUSFErPLyNBxqAO9coMBIyNP7y1Nc3dygZ0+YNw+yZ+f0bGeaTizClSvw55/GyGTANR7HHhyj5YaWXHW9Srsi7RhbZayMSorIIHFVCBFzTH5waThcGApJckPZfZAkT+iv19qIo507s8rmNzrZTcXtYgKGDoW+fSFBgijreZTTWrPnzh6mO09nw9UNmLSJH3L8QKdinaiatSpmYdmDMx4IdUKptd4flR0RQohw0RquT4MzPY3RyEr7wL7sl89fvx46doRnz/Dr2Ycxdn8xuKMl9vawYwdUqfL/Uz19PXHc58jYI2NJb5eeHU13UCVrlS82LURYSFwVQsQYz2dwpCk82QkOzaD4DLCwCf31795Bp048WbiNzvY7WfOsHEWKwO4FkD9/1HU7qr32fM2ic4uY7jydq65XSWGdgh6letC+aHsyJ5OtkL4k2IRSKWWrtXYPa6NKKTut9dvwd0sIIULB+xUcaw0P1kG6WlDKCay+sBD+6VPo0gVWrYKCBbkzawfNx+Tj0CGjGN2MGZA8+f9PP/7gOC03tOSK6xV+L/w746qOI7FV4mh5LBF3SVwVQsS4Z4fgcEPwegHF50DW1p+WMA/JlSvoevVZcrkQf1rf5t1ra0aNgh49wCIscx+/Iuefnmf6yen8e/5f3vm8o0T6Eiz8ZSEN8jYgoUUcWAAaxUL6Y7+nlJoOLNBa3wruRKWUFVALYz+tncBfkdNFIYQIgutxo4CAxwMoNB5ydQs6IGoNixcbuyh7eKD/HoGTfS/+aGKBmZlx6Ndf/3+pp68nQ/YOYdzRcaS3S8+2X7dRLVu1aH00EadJXBVCxAxtgstjjEquNpmh2n9GNdew+Pdf7rf9i3Z+/7CVypQqCPPnQ65cUdHhqOXt582ay2uY7jydQ/cOkdAiIY3zNaZjsY4UTVc0prsXq4SUUFbBCGD9lVLngCPARcAV8AKSApmBIsD3wHtgHDAtivorhIjvtIarE+BsX0iUAaochpTFgz737l1o1w62b4cyZXg2egFtx2ZnwwYoXx6cnCBTpv+ffvzBcVpsaMFV16u0KdSGcVXHkSRhkuh4KhF/SFwVQkQ/z+fG3pKPt8E3DaHEbLAMw6yb9+8x/dGV2XMVvc1P45fAmkmToHPnTyuhxwb3Xt9jlvMs5p6Zy7N3z8iaLCvjq46nRcEWJLdOHnID4jPBJpRa61NATaVUdqA5UAloBVgFOO0ecNj/+41aa58o6qsQIr7zdIVjv8GjLZCxDpSYBwmSfn6eyQTTpxtVAQCmTmVTxo60qWOGmxuMH28MWJr5r6kPPCq5vel2qmatGk0PJeITiatCiGj37KAxo8frBRSbAdnahW2K640b3PypO22u9mA/5an0vYnZc8zIkiXquhzZvP282XFrB3NPz2XT9U1oramVoxadinWiStYqUmQngkI101lrfQMY5P+DUioZkBB4obX2jrruCSGEv6f74UgT8HKFotMge8egA+LVq9CmDRw+DNWq8Xb8bLpP+oa5XaBAAWOnkHz5/n96wAquvxf+nbFVxsqopIhyEleFEFFOm+DyKDg/CGyyQNXNkLxQyNcF4LdiFZN+O8MgrxVYJrJkzmRo3dosTPloTPE1+bLnzh6WX1zOuqvrcPN0I1WiVPQp04e2RdrikNQhprsYZ4Rr6azW+lV4rlNK1QeaYUzlSQJcA8ZprZf5H08LdAeqAlmBVxibPffTWj8K0E4LYEEQt+igtZ4ZQh+SAJOAXwAzYDPwh9b6RXieSQgRxT6UNb84DGyzQfkvrPnw8YGxY2HoULCxAScnDmdtTvOfFHfuGIOVjo5g5T8O5OHjgeM+R8YfHS+jkiLGhTeuCiFEkDyf+1dx3RG+Ka5eXlxsOZ7Wyypxgvr8WNmDGU6WpE8fdV2ODH4mPw7cPcCKSytYc2UNrh6u2CWw4+dcP9MobyOqZK1CAvNYvJ/JVyq6azF1B+4A3TDWi9QEliqlUmqtp2IkmrWBucBxIDXgCBxRSuULojJeRYz1JR/cDkUfVgA5gTaACRg
[... remainder of base64-encoded PNG output omitted; the rendered image is the two-panel figure drawn by the source below, comparing the observed history and ground-truth future against the PIT-IDM, IDM, and Transformer predictions, Position (m) versus Time Steps ...]\n",
812 | "text/plain": [
813 | "<Figure size 1080x360 with 2 Axes>"
814 | ]
815 | },
816 | "metadata": {
817 | "needs_background": "light"
818 | },
819 | "output_type": "display_data"
820 | }
821 | ],
822 | "source": [
823 | "fig, (ax1,ax2)=plt.subplots(1,2,figsize=(15,5))\n",
824 | "\n",
825 | "plt.rcParams['xtick.direction'] = 'in'\n",
826 | "plt.rcParams['ytick.direction'] = 'in'\n",
827 | "\n",
828 | "i = 155\n",
829 | "inputs = torch.from_numpy(x_test[300*i+200,:,:]).unsqueeze(0).to(torch.float32)\n",
830 | "hist_labels = torch.from_numpy(test_pre_labels[300*i+200,:,:]).unsqueeze(0).to(torch.float32)\n",
831 | "labels = torch.from_numpy(y_test[300*i+200,:,:]).unsqueeze(0).to(torch.float32)\n",
832 | "inputs_trans = inputs[:,:,[0]]\n",
833 | "start_of_seq = torch.Tensor([0]).unsqueeze(0).unsqueeze(1).repeat(labels.shape[0],1,1)\n",
834 | "dec_input = start_of_seq\n",
835 | "for j in range(output_length):\n",
836 | "    target_mask = _generate_square_subsequent_mask(dec_input.shape[1])\n",
837 | "    output_trans = Trans_model(inputs_trans, dec_input, target_mask)\n",
838 | "    dec_input = torch.cat((dec_input, output_trans[:,-1:,:]),1)\n",
839 | "    outputs_trans = dec_input[:,1:,:]\n",
840 | "outputs = PIT_IDM.predict(inputs, hist_labels, model_location)\n",
841 | "outputs_IDM = model_IDM_test(inputs[:,-1,:], hist_labels[:,:,:],output_length)\n",
842 | "y_label = labels[0,:,0].detach().numpy()*(max_num-min_num)\n",
843 | "y_output = outputs[0,:,0].detach().numpy()*(max_num-min_num)\n",
844 | "y_output_IDM = outputs_IDM[0,:,0].detach().numpy()*(max_num-min_num)\n",
845 | "y_output_trans = outputs_trans[0,:,0].detach().numpy()*(max_num-min_num)\n",
846 | "x_pred = np.arange(1,32)+49\n",
847 | "x_obs = np.arange(1,51)\n",
848 | "y_obs = inputs[0,:,0].detach().numpy()*(max_num-min_num)\n",
849 | "y_label = np.concatenate((np.array([y_obs[-1]]),y_label))\n",
850 | "y_output = np.concatenate((np.array([y_obs[-1]]),y_output))\n",
851 | "y_output_IDM = np.concatenate((np.array([y_obs[-1]]),y_output_IDM))\n",
852 | "y_output_trans = np.concatenate((np.array([y_obs[-1]]),y_output_trans))\n",
853 | "\n",
854 | "ax1.plot(x_obs[40:], (y_obs[40:]+min_num+begin_positions[300*i+200])*0.3048, color='black', label='Observation')\n",
855 | "ax1.plot(x_pred, (y_label+min_num+begin_positions[300*i+200])*0.3048, color='red', label='Ground truth')\n",
856 | "ax1.plot(x_pred, (y_output+min_num+begin_positions[300*i+200])*0.3048, color='blue', label='PIT-IDM')\n",
857 | "ax1.plot(x_pred, (y_output_IDM+min_num+begin_positions[300*i+200])*0.3048, color='orange', label='IDM')\n",
858 | "ax1.plot(x_pred, (y_output_trans+min_num+begin_positions[300*i+200])*0.3048, color='green', label='Transformer')\n",
859 | "ax1.set_xlabel('Time Steps', fontdict={'size':15})\n",
860 | "ax1.set_ylabel('Position (m)', fontdict={'size':15})\n",
861 | "ax1.tick_params(axis='x', labelsize=15)\n",
862 | "ax1.tick_params(axis='y', labelsize=15)\n",
863 | "ax1.legend(prop={'size':12})\n",
864 | "\n",
865 | "i = 150\n",
866 | "\n",
867 | "inputs = torch.from_numpy(x_test[300*i+1,:,:]).unsqueeze(0).to(torch.float32)\n",
868 | "hist_labels = torch.from_numpy(test_pre_labels[300*i+1,:,:]).unsqueeze(0).to(torch.float32)\n",
869 | "labels = torch.from_numpy(y_test[300*i+1,:,:]).unsqueeze(0).to(torch.float32)\n",
870 | "inputs_trans = inputs[:,:,[0]]\n",
871 | "start_of_seq = torch.Tensor([0]).unsqueeze(0).unsqueeze(1).repeat(labels.shape[0],1,1)\n",
872 | "dec_input = start_of_seq\n",
873 | "for j in range(output_length):\n",
874 | "    target_mask = _generate_square_subsequent_mask(dec_input.shape[1])\n",
875 | "    output_trans = Trans_model(inputs_trans, dec_input, target_mask)\n",
876 | "    dec_input = torch.cat((dec_input, output_trans[:,-1:,:]),1)\n",
877 | "    outputs_trans = dec_input[:,1:,:]\n",
878 | "outputs = PIT_IDM.predict(inputs, hist_labels, model_location)\n",
879 | "outputs_IDM = model_IDM_test(inputs[:,-1,:], hist_labels[:,:,:],output_length)\n",
880 | "y_label = labels[0,:,0].detach().numpy()*(max_num-min_num)\n",
881 | "y_output = outputs[0,:,0].detach().numpy()*(max_num-min_num)\n",
882 | "y_output_IDM = outputs_IDM[0,:,0].detach().numpy()*(max_num-min_num)\n",
883 | "y_output_trans = outputs_trans[0,:,0].detach().numpy()*(max_num-min_num)\n",
884 | "x_pred = np.arange(1,32)+49\n",
885 | "x_obs = np.arange(1,51)\n",
886 | "y_obs = inputs[0,:,0].detach().numpy()*(max_num-min_num)\n",
887 | "y_label = np.concatenate((np.array([y_obs[-1]]),y_label))\n",
888 | "y_output = np.concatenate((np.array([y_obs[-1]]),y_output))\n",
889 | "y_output_IDM = np.concatenate((np.array([y_obs[-1]]),y_output_IDM))\n",
890 | "y_output_trans = np.concatenate((np.array([y_obs[-1]]),y_output_trans))\n",
891 | "\n",
892 | "ax2.plot(x_obs[40:], (y_obs[40:]+min_num+begin_positions[300*i+1])*0.3048, color='black', label='Observation')\n",
893 | "ax2.plot(x_pred, (y_label+min_num+begin_positions[300*i+1])*0.3048, color='red', label='Ground truth')\n",
894 | "ax2.plot(x_pred, (y_output+min_num+begin_positions[300*i+1])*0.3048, color='blue', label='PIT-IDM')\n",
895 | "ax2.plot(x_pred, (y_output_IDM+min_num+begin_positions[300*i+1])*0.3048, color='orange', label='IDM')\n",
896 | "ax2.plot(x_pred, (y_output_trans+min_num+begin_positions[300*i+1])*0.3048, color='green', label='Transformer')\n",
897 | "ax2.set_xlabel('Time Steps', fontdict={'size':15})\n",
898 | "ax2.set_ylabel('Position (m)', fontdict={'size':15})\n",
899 | "ax2.tick_params(axis='x', labelsize=15)\n",
900 | "ax2.tick_params(axis='y', labelsize=15)\n",
901 | "ax2.legend(prop={'size':12})\n",
902 | "\n",
903 | "fig.savefig(r'G:\科研\PINN英文论文\一轮修改\single_I80_all.svg',format='svg',dpi=600,bbox_inches = 'tight')"
904 | ]
905 | },
906 | {
907 | "cell_type": "code",
908 | "execution_count": null,
909 | "id": "similar-polish",
910 | "metadata": {},
911 | "outputs": [],
912 | "source": []
913 | }
914 | ],
915 | "metadata": {
916 | "kernelspec": {
917 | "display_name": "pytorch",
918 | "language": "python",
919 | "name": "pytorch"
920 | },
921 | "language_info": {
922 | "codemirror_mode": {
923 | "name": "ipython",
924 | "version": 3
925 | },
926 | "file_extension": ".py",
927 | "mimetype": "text/x-python",
928 | "name": "python",
929 | "nbconvert_exporter": "python",
930 | "pygments_lexer": "ipython3",
931 | "version": "3.7.10"
932 | }
933 | },
934 | "nbformat": 4,
935 | "nbformat_minor": 5
936 | }
937 | 
--------------------------------------------------------------------------------
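
A note on the decoding pattern used in the notebook cell above: the Transformer baseline is rolled out autoregressively. Starting from a zero start-of-sequence token, each iteration builds a square subsequent (causal) mask, runs the model, and appends only the newest predicted step to the decoder input. The sketch below is a minimal, self-contained illustration of that loop; the ToyTransformer module, its hyperparameters, and this particular mask implementation are assumptions made for the example, since the repository's actual Trans_model and _generate_square_subsequent_mask helper are not reproduced here.

import torch
import torch.nn as nn

def _generate_square_subsequent_mask(size):
    # Additive causal mask: 0 on and below the diagonal, -inf strictly above,
    # so step t can only attend to steps <= t (standard PyTorch recipe; the
    # repository's helper of the same name is assumed to behave this way).
    return torch.triu(torch.full((size, size), float('-inf')), diagonal=1)

class ToyTransformer(nn.Module):
    """Illustrative stand-in for Trans_model: 1-D positions in, 1-D positions out."""
    def __init__(self, d_model=16, nhead=2):
        super().__init__()
        self.src_proj = nn.Linear(1, d_model)
        self.tgt_proj = nn.Linear(1, d_model)
        self.core = nn.Transformer(d_model=d_model, nhead=nhead,
                                   num_encoder_layers=1, num_decoder_layers=1,
                                   dim_feedforward=32, batch_first=True)
        self.head = nn.Linear(d_model, 1)

    def forward(self, src, tgt, tgt_mask):
        h = self.core(self.src_proj(src), self.tgt_proj(tgt), tgt_mask=tgt_mask)
        return self.head(h)

output_length = 30                 # 30 future steps, matching the notebook
model = ToyTransformer()
src = torch.randn(1, 50, 1)        # 50 observed (normalized) positions
dec_input = torch.zeros(1, 1, 1)   # start-of-sequence token, as in the cell

with torch.no_grad():
    for _ in range(output_length):
        tgt_mask = _generate_square_subsequent_mask(dec_input.shape[1])
        step = model(src, dec_input, tgt_mask)
        # Append only the newest prediction, then feed the grown sequence back in.
        dec_input = torch.cat((dec_input, step[:, -1:, :]), dim=1)

outputs_trans = dec_input[:, 1:, :]  # drop the start token -> shape (1, 30, 1)
print(outputs_trans.shape)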
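
The chain of scalings applied before plotting also deserves a word: model outputs live in min-max normalized coordinates, so the cell multiplies by (max_num - min_num) to undo the scaling, adds min_num plus the trajectory's absolute starting position from begin_positions, and finally multiplies by 0.3048 to convert feet to meters, consistent with the feet-based coordinates of the NGSIM I-80 data. A tiny worked example, with every number invented purely for illustration:

import numpy as np

# Invented values; the real max_num, min_num and begin_positions come from the
# preprocessing cells earlier in the notebook.
max_num, min_num = 250.0, 40.0        # min-max bounds of the local positions (feet)
begin_position = 120.0                # absolute start of this trajectory (feet)

y_norm = np.array([0.10, 0.12, 0.15])            # model output, normalized
y_rel_ft = y_norm * (max_num - min_num)          # undo min-max scaling (feet)
y_abs_ft = y_rel_ft + min_num + begin_position   # absolute longitudinal position
y_abs_m = y_abs_ft * 0.3048                      # 1 ft = 0.3048 m, the plot's unit
print(y_abs_m)                                   # absolute positions in meters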