├── README.md
├── data
│   ├── atis-2.dev.iob
│   ├── atis-2.dev.w-intent.iob
│   ├── atis-2.train.iob
│   ├── atis-2.train.w-intent.iob
│   ├── atis.test.iob
│   ├── atis.test.w-intent.iob
│   ├── atis.train.iob
│   ├── atis.train.w-intent.iob
│   └── sample.iob
├── program
│   ├── BasicModel.py
│   ├── BasicModel.pyc
│   ├── Encoding.py
│   ├── History.py
│   ├── History.pyc
│   ├── PredefinedEmbedding.py
│   ├── PredefinedEmbedding.pyc
│   ├── SequenceTagger.py
│   ├── wordSlotDataSet.py
│   └── wordSlotDataSet.pyc
├── sample
│   ├── rnn+emb.3.h5
│   ├── rnn+emb_H-50_O-adam_A-tanh_WR-embedding.test.3
│   ├── rnn+emb_H-50_O-adam_A-tanh_WR-embedding.test.3.prob
│   └── tag.list
└── script
    ├── run_joint.sh
    ├── run_sample.sh
    └── run_slot.sh

--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # JointSLU: Joint Semantic Parsing for Spoken/Natural Language Understanding
2 | 
3 | 
4 | *A Keras implementation of the models described in [Hakkani-Tur et al. (2016)](https://www.csie.ntu.edu.tw/~yvchen/doc/IS16_MultiJoint.pdf).*
5 | 
6 | This repository provides various RNN architectures (RNN, GRU, LSTM, etc.) for joint semantic parsing,
7 | where intent prediction and slot filling are performed by a single network model.
8 | 
9 | ## Content
10 | * [Requirements](#requirements)
11 | * [Getting Started](#getting-started)
12 | * [Model Running](#model-running)
13 | * [Contact](#contact)
14 | * [Reference](#reference)
15 | 
16 | ## Requirements
17 | 1. Python
18 | 2. Numpy: `pip install numpy`
19 | 3. Keras with a Theano or TensorFlow backend: `pip install keras`
20 | 4. H5py: `pip install h5py`
21 | 
22 | ## Dataset
23 | 1. Train: word sequences with IOB slot tags and the intent label (data/atis.train.w-intent.iob)
24 | 2. Test: word sequences with IOB slot tags and the intent label (data/atis.test.w-intent.iob)
25 | 
26 | 
27 | ## Getting Started
28 | You can train and test JointSLU with the following commands:
29 | 
30 | ```shell
31 | git clone --recursive https://github.com/yvchen/JointSLU.git
32 | cd JointSLU
33 | ```
34 | You can run a sample tutorial with this command:
35 | ```shell
36 | bash script/run_sample.sh rnn theano 0 | sh
37 | ```
38 | Then you can see the predicted result in `sample/rnn+emb_H-50_O-adam_A-tanh_WR-embedding.test.3`.
39 | 
40 | ## Model Running
41 | The following commands reproduce the experiments described in the paper.
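Each `run_*.sh` script emits the full training command line (assembled from the options defined in `program/SequenceTagger.py`), which is why its output is piped to `sh`. The sketch below is only an illustration of what such a command might look like; the flags shown do exist in `SequenceTagger.py`, but the specific values and the environment variable are assumptions inferred from the sample output name, not the scripts' literal output.

```shell
# Hypothetical sketch only; see script/run_slot.sh and script/run_joint.sh
# for the exact command each experiment generates.
KERAS_BACKEND=theano python program/SequenceTagger.py \
    --arch blstm \
    --train data/atis.train.w-intent.iob \
    --test data/atis.test.w-intent.iob \
    --out sample \
    --mdl_path sample \
    --input_type embedding \
    --hidden_size 50 \
    --sgdtype adam \
    --activation_func tanh
```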
42 | You can run the slot filling only experiment using BLSTM by: 43 | ```shell 44 | bash script/run_slot.sh blstm theano 0 | sh 45 | ``` 46 | You can run the joint frame parsing (intent prediction and slot filling) experiment using BLSTM by: 47 | ```shell 48 | bash script/run_joint.sh blstm theano 0 | sh 49 | ``` 50 | 51 | ## Contact 52 | Yun-Nung (Vivian) Chen, y.v.chen@ieee.org 53 | 54 | ## Reference 55 | 56 | Main papers to be cited 57 | ``` 58 | @Inproceedings{hakkani-tur2016multi, 59 | author = {Hakkani-Tur, Dilek and Tur, Gokhan and Celikyilmaz, Asli and Chen, Yun-Nung and Gao, Jianfeng and Deng, Li and Wang, Ye-Yi}, 60 | title = {Multi-Domain Joint Semantic Frame Parsing using Bi-directional RNN-LSTM}, 61 | booktitle = {Proceedings of Interspeech}, 62 | year = {2016} 63 | } 64 | 65 | 66 | -------------------------------------------------------------------------------- /data/sample.iob: -------------------------------------------------------------------------------- 1 | BOS i want to fly from baltimore to dallas round trip EOS O O O O O O B-fromloc.city_name O B-toloc.city_name B-round_trip I-round_trip atis_flight 2 | BOS round trip fares from baltimore to philadelphia less than 1000 dollars round trip fares from denver to philadelphia less than 1000 dollars round trip fares from pittsburgh to philadelphia less than 1000 dollars EOS O B-round_trip I-round_trip O O B-fromloc.city_name O B-toloc.city_name B-cost_relative O B-fare_amount I-fare_amount B-round_trip I-round_trip O O B-fromloc.city_name O B-toloc.city_name B-cost_relative O B-fare_amount I-fare_amount B-round_trip I-round_trip O O B-fromloc.city_name O B-toloc.city_name B-cost_relative O B-fare_amount I-fare_amount atis_airfare 3 | BOS show me the flights arriving on baltimore on june fourteenth EOS O O O O O O O B-toloc.city_name O B-arrive_date.month_name B-arrive_date.day_number atis_flight 4 | BOS what are the flights which depart from san francisco fly to washington via indianapolis and arrive by 9 pm EOS O O O O O O O O B-fromloc.city_name I-fromloc.city_name O O B-toloc.city_name O B-stoploc.city_name O O B-arrive_time.time_relative B-arrive_time.time I-arrive_time.time atis_flight 5 | BOS which airlines fly from boston to washington dc via other cities EOS O O O O O B-fromloc.city_name O B-toloc.city_name B-toloc.state_code O O O atis_airline 6 | BOS i'm looking for a flight from charlotte to las vegas that stops in st. louis hopefully a dinner flight how can i find that out EOS O O O O O O O B-fromloc.city_name O B-toloc.city_name I-toloc.city_name O O O B-stoploc.city_name I-stoploc.city_name O O B-meal_description O O O O O O O atis_flight 7 | BOS okay and then from pittsburgh i'd like to travel to atlanta on september fourth EOS O O O O O B-fromloc.city_name O O O O O B-toloc.city_name O B-depart_date.month_name B-depart_date.day_number atis_flight 8 | BOS show me all the flights from philadelphia to cincinnati EOS O O O O O O O B-fromloc.city_name O B-toloc.city_name atis_flight 9 | BOS okay i'd like a flight on us air from indianapolis to san diego in the afternoon what's available EOS O O O O O O O B-airline_name I-airline_name O B-fromloc.city_name O B-toloc.city_name I-toloc.city_name O O B-depart_time.period_of_day O O atis_flight 10 | BOS on tuesday what flights leave phoenix to st. 
paul minnesota and leave after noon EOS O O B-depart_date.day_name O O O B-fromloc.city_name O B-toloc.city_name I-toloc.city_name B-toloc.state_name O O B-depart_time.time_relative B-depart_time.time atis_flight 11 | -------------------------------------------------------------------------------- /program/BasicModel.py: -------------------------------------------------------------------------------- 1 | """ 2 | RNN-based slot filling 3 | by V. Chen, D. Hakkani-Tur & G. Tur 4 | """ 5 | 6 | import os, sys, json 7 | import numpy as np 8 | from scipy import io 9 | from wordSlotDataSet import dataSet, readNum 10 | from PredefinedEmbedding import PredefinedEmbedding 11 | from Encoding import encoding 12 | import argparse 13 | from keras.preprocessing import sequence 14 | from keras.models import Sequential, Graph, Model 15 | from keras.layers import Input, merge, Merge, Dense, TimeDistributedDense, Dropout, Activation, RepeatVector, Permute, Reshape, RepeatVector, Flatten 16 | from keras.layers.convolutional import Convolution1D, MaxPooling1D, AveragePooling1D 17 | from keras.layers.embeddings import Embedding 18 | from keras.layers.recurrent import SimpleRNN, GRU, LSTM 19 | from keras.layers.wrappers import TimeDistributed 20 | from keras.optimizers import SGD, RMSprop, Adagrad, Adadelta, Adam, Adamax 21 | from keras.constraints import maxnorm, nonneg 22 | from keras import backend as K 23 | from keras.callbacks import EarlyStopping, ModelCheckpoint 24 | from History import LossHistory 25 | 26 | class KerasModel( object ): 27 | 28 | def __init__(self,argparams): 29 | # PARAMETERS 30 | self.hidden_size = argparams['hidden_size'] # size of hidden layer of neurons 31 | self.learning_rate = argparams['learning_rate'] 32 | self.training_file = argparams['train_data_path'] 33 | self.validation_file = argparams['dev_data_path'] 34 | self.test_file = argparams['test_data_path'] 35 | self.result_path = argparams['result_path'] 36 | self.train_numfile = argparams['train_numfile'] 37 | self.dev_numfile = argparams['dev_numfile'] 38 | self.test_numfile = argparams['test_numfile'] 39 | self.update_f = argparams['sgdtype'] # options: adagrad, rmsprop, vanilla. default: vanilla 40 | self.decay_rate = argparams['decay_rate'] # for rmsprop 41 | self.default = argparams['default_flag'] # True: use defult values for optimizer 42 | self.momentum = argparams['momentum'] # for vanilla update 43 | self.max_epochs = argparams['max_epochs'] 44 | self.activation = argparams['activation_func'] # options: tanh, sigmoid, relu. 
default: relu 45 | self.smooth_eps = argparams['smooth_eps'] # for adagrad and rmsprop 46 | self.batch_size = argparams['batch_size'] 47 | self.input_type = argparams['input_type'] # options: 1hot, embedding, predefined 48 | self.emb_dict = argparams['embedding_file'] 49 | self.embedding_size = argparams['embedding_size'] 50 | self.dropout = argparams['dropout'] 51 | self.dropout_ratio = argparams['dropout_ratio'] 52 | self.iter_per_epoch = argparams['iter_per_epoch'] 53 | self.arch = argparams['arch'] 54 | self.init_type = argparams['init_type'] 55 | self.fancy_forget_bias_init = argparams['forget_bias'] 56 | self.time_length = argparams['time_length'] 57 | self.his_length = argparams['his_length'] 58 | self.mdl_path = argparams['mdl_path'] 59 | self.log = argparams['log'] 60 | self.record_epoch = argparams['record_epoch'] 61 | self.load_weight = argparams['load_weight'] 62 | self.combine_his = argparams['combine_his'] 63 | self.time_decay = argparams['time_decay'] 64 | self.shuffle = argparams['shuffle'] 65 | self.set_batch = argparams['set_batch'] 66 | self.tag_format = argparams['tag_format'] 67 | self.e2e_flag = argparams['e2e_flag'] 68 | self.output_att = argparams['output_att'] 69 | self.sembedding_size = self.embedding_size 70 | self.model_arch = self.arch 71 | if self.validation_file is None: 72 | self.nodev = True 73 | else: 74 | self.nodev = False 75 | if self.input_type == 'embedding': 76 | self.model_arch = self.model_arch + '+emb' 77 | if self.time_decay: 78 | self.model_arch = self.model_arch + '+T' 79 | if self.e2e_flag: 80 | self.model_arch = 'e2e-' + self.model_arch 81 | 82 | def test(self, H, X, data_type, tagDict, pad_data): 83 | # open a dir to store results 84 | if self.default: 85 | target_file = self.result_path + '/' + self.model_arch + '_H-'+str(self.hidden_size)+'_O-'+self.update_f+'_A-'+self.activation+'_WR-'+self.input_type 86 | else: 87 | target_file = self.result_path + '/' + self.model_arch +'-LR-'+str(self.learning_rate)+'_H-'+str(self.hidden_size)+'_O-'+self.update_f+'_A-'+self.activation+'_WR-'+self.input_type 88 | 89 | if 'memn2n' in self.arch or self.arch[0] == 'h': 90 | batch_data = [H, X] 91 | else: 92 | batch_data = X 93 | 94 | # output attention 95 | if self.output_att is not None: 96 | x1 = self.model.inputs[0] 97 | x2 = self.model.inputs[1] 98 | #x = self.model.layers[1].input 99 | y = self.model.get_layer(name='match').output 100 | # y = self.model.layers[9].output 101 | f = K.function([x1, x2, K.learning_phase()], y) 102 | att_mtx = f([batch_data[0], batch_data[1], 0]) 103 | row, col = np.shape(att_mtx) 104 | fo = open(self.output_att, 'wb') 105 | for i in range(0, row): 106 | for j in range(0, col): 107 | fo.write("%e " %att_mtx[i][j]) 108 | fo.write('\n') 109 | fo.close() 110 | sys.stderr.write("Output the attention weights in the file %s.\n" %self.output_att) 111 | exit() 112 | if "predict_classes" in dir(self.model): 113 | prediction = self.model.predict_classes(batch_data) 114 | probability = self.model.predict_proba(batch_data) 115 | else: 116 | probability = self.model.predict(batch_data) 117 | prediction = np.argmax(probability, axis=2) 118 | 119 | # output prediction and probability results 120 | fo = open(target_file+"."+ data_type, "wb") 121 | for i, sent in enumerate(prediction): 122 | for j, tid in enumerate(sent): 123 | if pad_data[i][j] != 0: 124 | if self.tag_format == 'normal': 125 | fo.write(tagDict[tid] + ' ') 126 | elif self.tag_format == 'conlleval': 127 | fo.write(tagDict[tid] + '\n') 128 | fo.write('\n') 129 | fo.close() 
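# Editor's note (added comment): the block below mirrors the tag output above, but writes the full
# softmax distribution instead, i.e. one line of space-separated probabilities per non-padded token,
# with columns following the tag order saved in tag.list, to <target_file>.<data_type>.prob.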
130 | fo = open(target_file+"."+ data_type+'.prob', "wb") 131 | for i, sent in enumerate(probability): 132 | for j, prob in enumerate(sent): 133 | if pad_data[i][j] != 0: 134 | for k, val in enumerate(prob): 135 | fo.write("%e " %val) 136 | fo.write("\n") 137 | fo.close() 138 | 139 | def build( self ): 140 | 141 | # decide main model 142 | if self.input_type == '1hot': 143 | self.embedding_size = self.input_vocab_size 144 | 145 | # set optimizer 146 | opt_func = self.update_f 147 | if not self.default: 148 | if self.update_f == 'sgd': 149 | opt_func = SGD(lr=self.learning_rate, momentum=self.momentum, decay=self.decay_rate) 150 | elif self.update_f == 'rmsprop': 151 | opt_func = RMSprop(lr=self.learning_rate, rho=self.rho, epsilon=self.smooth_eps) 152 | elif self.update_f == 'adagrad': 153 | opt_func = Adagrad(lr=self.learning_rate, epsilon=self.smooth_eps) 154 | elif self.update_f == 'adadelta': 155 | opt_func = Adadelta(lr=self.learning_rate, rho=self.rho, epsilon=self.smooth_eps) 156 | elif self.update_f == 'adam': 157 | opt_func = Adam(lr=self.learning_rate, beta_1=self.beta1, beta_2=self.beta2, epsilon=self.smooth_eps) 158 | elif self.update_f == 'adamax': 159 | opt_func = Adamax(lr=self.learning_rate, beta_1=self.beta1, beta_2=self.beta2, epsilon=self.smooth_eps) 160 | else: 161 | sys.stderr.write("Invalid optimizer.\n") 162 | exit() 163 | 164 | # Vallina RNN (LSTM, SimpleRNN, GRU) 165 | # Bidirectional-RNN (LSTM, SimpleRNN, GRU) 166 | if self.arch == 'lstm' or self.arch == 'rnn' or self.arch == 'gru' or self.arch == 'blstm' or self.arch == 'brnn' or self.arch == 'bgru': 167 | if self.input_type == 'embedding': 168 | raw_current = Input(shape=(self.time_length,), dtype='int32') 169 | current = Embedding(input_dim=self.input_vocab_size, output_dim=self.embedding_size, input_length=self.time_length, mask_zero=True)(raw_current) 170 | else: 171 | current = raw_current = Input(shape=(self.time_length, self.input_vocab_size)) 172 | if 'rnn' in self.arch: 173 | forward = SimpleRNN(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation)(current) 174 | backward = SimpleRNN(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation, go_backwards=True)(current) 175 | elif 'gru' in self.arch: 176 | forward = GRU(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation)(current) 177 | backward = GRU(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation, go_backwards=True)(current) 178 | elif 'lstm' in self.arch: 179 | forward = LSTM(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation)(current) 180 | backward = LSTM(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation, go_backwards=True)(current) 181 | if 'b' in self.arch: 182 | tagger = merge([forward, backward], mode='concat') 183 | else: 184 | tagger = forward 185 | if self.dropout: 186 | tagger = Dropout(self.dropout_ratio)(tagger) 187 | prediction = TimeDistributed(Dense(self.output_vocab_size, activation='softmax'))(tagger) 188 | 189 | self.model = Model(input=raw_current, output=prediction) 190 | self.model.compile(loss='categorical_crossentropy', optimizer=opt_func) 191 | 192 | # 2-Stacked Layered RNN (LSTM, SimpleRNN, GRU) 193 | elif self.arch == '2lstm' or self.arch == '2rnn' or self.arch == '2gru': 194 | self.model = Sequential() 195 | if self.input_type == 'embedding': 196 | self.model.add(Embedding(self.input_vocab_size, 
self.embedding_size, input_length=self.time_length)) 197 | if self.arch == '2lstm': 198 | basic_model = LSTM(self.hidden_size, return_sequences=True, input_shape=(self.time_length, self.embedding_size), init=self.init_type, activation=self.activation) 199 | stack_model = LSTM(self.hidden_size, return_sequences=True, input_shape=(self.time_length, self.hidden_size), init=self.init_type, activation=self.activation) 200 | elif self.arch == '2rnn': 201 | basic_model = SimpleRNN(self.hidden_size, return_sequences=True, input_shape=(self.time_length, self.embedding_size), init=self.init_type, activation=self.activation) 202 | stack_model = SimpleRNN(self.hidden_size, return_sequences=True, input_shape=(self.time_length, self.hidden_size), init=self.init_type, activation=self.activation) 203 | else: 204 | basic_model = GRU(self.hidden_size, return_sequences=True, input_shape=(self.time_length, self.embedding_size), init=self.init_type, activation=self.activation) 205 | stack_model = GRU(self.hidden_size, return_sequences=True, input_shape=(self.time_length, self.hidden_size), init=self.init_type, activation=self.activation) 206 | self.model.add(basic_model) 207 | if self.dropout: 208 | self.model.add(Dropout(self.dropout_ratio)) 209 | self.model.add(stack_model) 210 | if self.dropout: 211 | self.model.add(Dropout(self.dropout_ratio)) 212 | self.model.add(TimeDistributed(Dense(self.output_vocab_size))) 213 | self.model.add(Activation('softmax')) 214 | self.model.compile(loss='categorical_crossentropy', optimizer=opt_func) 215 | 216 | # Encode intent information by feeding all words and then start tagging 217 | elif self.arch == 'irnn' or self.arch == 'igru' or self.arch == 'ilstm' or self.arch == 'ibrnn' or self.arch == 'ibgru' or self.arch == 'iblstm': 218 | if self.input_type == 'embedding': 219 | raw_current = Input(shape=(self.time_length,), dtype='int32') 220 | current = Embedding(input_dim=self.input_vocab_size, output_dim=self.embedding_size, input_length=self.time_length)(raw_current) 221 | else: 222 | current = raw_current = Input(shape=(self.time_length, self.input_vocab_size)) 223 | if 'rnn' in self.arch: 224 | fencoder = SimpleRNN(self.hidden_size, return_sequences=False, init=self.init_type, activation=self.activation)(current) 225 | bencoder = SimpleRNN(self.hidden_size, return_sequences=False, init=self.init_type, activation=self.activation, go_backwards=True)(current) 226 | flabeling = SimpleRNN(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation)(current) 227 | blabeling = SimpleRNN(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation, go_backwards=True)(current) 228 | elif 'gru' in self.arch: 229 | fencoder = GRU(self.hidden_size, return_sequences=False, init=self.init_type, activation=self.activation)(current) 230 | bencoder = GRU(self.hidden_size, return_sequences=False, init=self.init_type, activation=self.activation, go_backwards=True)(current) 231 | flabeling = GRU(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation)(current) 232 | blabeling = GRU(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation, go_backwards=True)(current) 233 | elif 'lstm' in self.arch: 234 | fencoder = LSTM(self.hidden_size, return_sequences=False, init=self.init_type, activation=self.activation)(current) 235 | bencoder = LSTM(self.hidden_size, return_sequences=False, init=self.init_type, activation=self.activation, go_backwards=True)(current) 236 | 
flabeling = LSTM(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation)(current) 237 | blabeling = LSTM(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation, go_backwards=True)(current) 238 | if 'b' in self.arch: 239 | encoder = merge([fencoder, bencoder], mode='concat', concat_axis=-1) 240 | labeling = merge([flabeling, blabeling], mode='concat', concat_axis=-1) 241 | else: 242 | encoder = fencoder 243 | labeling = flabeling 244 | #intent = Dense(self.output_vocab_size, activation='softmax')(encoder) 245 | encoder = RepeatVector(self.time_length)(encoder) 246 | tagger = merge([encoder, labeling], mode='concat', concat_axis=-1) 247 | if self.dropout: 248 | tagger = Dropout(self.dropout_ratio)(tagger) 249 | prediction = TimeDistributed(Dense(self.output_vocab_size, activation='softmax'))(tagger) 250 | 251 | self.model = Model(input=raw_current, output=prediction) 252 | self.model.compile(loss='categorical_crossentropy', optimizer=opt_func) 253 | 254 | # Encode intent information by feeding all words and then start tagging 255 | elif self.arch == 'i-c-rnn' or self.arch == 'i-c-gru' or self.arch == 'i-c-lstm' or self.arch == 'i-c-brnn' or self.arch == 'i-c-bgru' or self.arch == 'i-c-blstm': 256 | if self.input_type == 'embedding': 257 | raw_current = Input(shape=(self.time_length,), dtype='int32') 258 | current = Embedding(input_dim=self.input_vocab_size, output_dim=self.embedding_size, input_length=self.time_length)(raw_current) 259 | else: 260 | current = raw_current = Input(shape=(self.time_length, self.input_vocab_size)) 261 | encoder = Convolution1D(self.embedding_size, 3, border_mode='same', input_shape=(self.time_length, self.embedding_size))(current) 262 | encoder = MaxPooling1D(self.time_length)(encoder) 263 | encoder = Flatten()(encoder) 264 | if 'rnn' in self.arch: 265 | forward = SimpleRNN(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation)(current) 266 | backward = SimpleRNN(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation, go_backwards=True)(current) 267 | elif 'gru' in self.arch: 268 | forward = GRU(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation)(current) 269 | backward = GRU(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation, go_backwards=True)(current) 270 | elif 'lstm' in self.arch: 271 | forward = LSTM(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation)(current) 272 | backward = LSTM(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation, go_backwards=True)(current) 273 | if 'b' in self.arch: 274 | labeling = merge([forward, backward], mode='concat', concat_axis=-1) 275 | else: 276 | labeling = forward 277 | #intent = Dense(self.output_vocab_size, activation='softmax')(encoder) 278 | encoder = RepeatVector(self.time_length)(encoder) 279 | tagger = merge([encoder, labeling], mode='concat', concat_axis=-1) 280 | if self.dropout: 281 | tagger = Dropout(self.dropout_ratio)(tagger) 282 | prediction = TimeDistributed(Dense(self.output_vocab_size, activation='softmax'))(tagger) 283 | 284 | self.model = Model(input=raw_current, output=prediction) 285 | self.model.compile(loss='categorical_crossentropy', optimizer=opt_func) 286 | 287 | 288 | # Encode all history and the current utterance first and then start tagging 289 | elif self.arch == 'hirnn' or self.arch == 'higru' or self.arch == 
'hilstm' or self.arch == 'hibrnn' or self.arch == 'hibgru' or self.arch == 'hiblstm': 290 | if self.input_type == 'embedding': 291 | raw_his = Input(shape=(self.time_length * self.his_length,), dtype='int32') 292 | raw_cur = Input(shape=(self.time_length,), dtype='int32') 293 | his_vec = Embedding(input_dim=self.input_vocab_size, output_dim=self.embedding_size, input_length=(self.time_length * self.his_length))(raw_his) 294 | cur_vec = Embedding(input_dim=self.input_vocab_size, output_dim=self.embedding_size, input_length=self.time_length)(raw_cur) 295 | else: 296 | his_vec = raw_his = Input(shape=(self.time_length * self.his_length, self.input_vocab_size)) 297 | cur_vec = raw_cur = Input(shape=(self.time_length, self.input_vocab_size)) 298 | if 'rnn' in self.arch: 299 | fencoder = SimpleRNN(self.hidden_size, return_sequences=False, init=self.init_type, activation=self.activation)(his_vec) 300 | bencoder = SimpleRNN(self.hidden_size, return_sequences=False, init=self.init_type, activation=self.activation, go_backwards=True)(his_vec) 301 | flabeling = SimpleRNN(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation)(cur_vec) 302 | blabeling = SimpleRNN(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation, go_backwards=True)(cur_vec) 303 | elif 'gru' in self.arch: 304 | fencoder = GRU(self.hidden_size, return_sequences=False, init=self.init_type, activation=self.activation)(his_vec) 305 | bencoder = GRU(self.hidden_size, return_sequences=False, init=self.init_type, activation=self.activation, go_backwards=True)(his_vec) 306 | flabeling = GRU(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation)(cur_vec) 307 | blabeling = GRU(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation, go_backwards=True)(cur_vec) 308 | elif 'lstm' in self.arch: 309 | fencoder = LSTM(self.hidden_size, return_sequences=False, init=self.init_type, activation=self.activation)(his_vec) 310 | bencoder = LSTM(self.hidden_size, return_sequences=False, init=self.init_type, activation=self.activation, go_backwards=True)(his_vec) 311 | flabeling = LSTM(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation)(cur_vec) 312 | blabeling = LSTM(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation, go_backwards=True)(cur_vec) 313 | if 'b' in self.arch: 314 | encoder = merge([fencoder, bencoder], mode='concat', concat_axis=-1) 315 | labeling = merge([flabeling, blabeling], mode='concat', concat_axis=-1) 316 | else: 317 | encoder = fencoder 318 | labeling = flabeling 319 | #intent = Dense(self.output_vocab_size, activation='softmax')(encoder) 320 | encoder = RepeatVector(self.time_length)(encoder) 321 | tagger = merge([encoder, labeling], mode='concat', concat_axis=-1) 322 | if self.dropout: 323 | tagger = Dropout(self.dropout_ratio)(tagger) 324 | prediction = TimeDistributed(Dense(self.output_vocab_size, activation='softmax'))(tagger) 325 | 326 | self.model = Model(input=[raw_his, raw_cur], output=prediction) 327 | self.model.compile(loss='categorical_crossentropy', optimizer=opt_func) 328 | 329 | # Encode all history and the current utterance first and then start tagging 330 | elif self.arch == 'hi-c-rnn' or self.arch == 'hi-c-gru' or self.arch == 'hi-c-lstm' or self.arch == 'hi-c-brnn' or self.arch == 'hi-c-bgru' or self.arch == 'hi-c-blstm': 331 | if self.input_type == 'embedding': 332 | raw_his = 
Input(shape=(self.time_length * self.his_length,), dtype='int32') 333 | raw_cur = Input(shape=(self.time_length,), dtype='int32') 334 | his_vec = Embedding(input_dim=self.input_vocab_size, output_dim=self.embedding_size, input_length=(self.time_length * self.his_length))(raw_his) 335 | cur_vec = Embedding(input_dim=self.input_vocab_size, output_dim=self.embedding_size, input_length=self.time_length)(raw_cur) 336 | else: 337 | his_vec = raw_his = Input(shape=(self.time_length * self.his_length, self.input_vocab_size)) 338 | cur_vec = raw_cur = Input(shape=(self.time_length, self.input_vocab_size)) 339 | if 'rnn' in self.arch: 340 | flabeling = SimpleRNN(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation)(cur_vec) 341 | blabeling = SimpleRNN(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation, go_backwards=True)(cur_vec) 342 | elif 'gru' in self.arch: 343 | flabeling = GRU(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation)(cur_vec) 344 | blabeling = GRU(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation, go_backwards=True)(cur_vec) 345 | elif 'lstm' in self.arch: 346 | flabeling = LSTM(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation)(cur_vec) 347 | blabeling = LSTM(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation, go_backwards=True)(cur_vec) 348 | if 'b' in self.arch: 349 | labeling = merge([flabeling, blabeling], mode='concat', concat_axis=-1) 350 | else: 351 | labeling = flabeling 352 | encoder = Convolution1D(self.embedding_size, 3, border_mode='same', input_shape=(self.time_length * self.his_length, self.embedding_size))(his_vec) 353 | encoder = MaxPooling1D(self.time_length)(encoder) 354 | encoder = Flatten()(encoder) 355 | #intent = Dense(self.output_vocab_size, activation='softmax')(encoder) 356 | encoder = RepeatVector(self.time_length)(encoder) 357 | tagger = merge([encoder, labeling], mode='concat', concat_axis=-1) 358 | if self.dropout: 359 | tagger = Dropout(self.dropout_ratio)(tagger) 360 | prediction = TimeDistributed(Dense(self.output_vocab_size, activation='softmax'))(tagger) 361 | 362 | self.model = Model(input=[raw_his, raw_cur], output=prediction) 363 | self.model.compile(loss='categorical_crossentropy', optimizer=opt_func) 364 | 365 | elif 'amemn2n' in self.arch: 366 | # current: (, time_length, embedding_size) 367 | if self.input_type == 'embedding': 368 | raw_current = Input(shape=(self.time_length,), dtype='int32', name='raw_current') 369 | current = Embedding(input_dim=self.input_vocab_size, output_dim=self.embedding_size, input_length=self.time_length)(raw_current) 370 | else: 371 | current = raw_current = Input(shape=(self.time_length, self.input_vocab_size), name='current') 372 | if 'memn2n-d-' in self.arch: 373 | cur_vec = TimeDistributed(Dense(self.sembedding_size, input_shape=(self.time_length, self.embedding_size)))(current) 374 | cur_vec = AveragePooling1D(self.time_length)(cur_vec) 375 | cur_vec = Flatten()(cur_vec) 376 | elif 'memn2n-c-' in self.arch: 377 | cur_vec = Convolution1D(self.sembedding_size, 3, border_mode='same', input_shape=(self.time_length, self.embedding_size))(current) 378 | cur_vec = MaxPooling1D(self.time_length)(cur_vec) 379 | cur_vec = Flatten()(cur_vec) 380 | elif 'memn2n-r-' in self.arch: 381 | if 'blstm' in self.arch: 382 | fcur_vec = LSTM(self.sembedding_size, return_sequences=False, init=self.init_type, 
activation=self.activation)(current) 383 | bcur_vec = LSTM(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation, go_backwards=True)(current) 384 | cur_vec = merge([fcur_vec, bcur_vec], mode='concat') 385 | elif 'rnn' in self.arch: 386 | cur_vec = SimpleRNN(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation)(current) 387 | elif 'gru' in self.arch: 388 | cur_vec = GRU(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation)(current) 389 | elif 'lstm' in self.arch: 390 | cur_vec = LSTM(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation)(current) 391 | else: 392 | sys.stderr.write("The RNN model is invaliad. (rnn | gru | lstm)\n") 393 | elif 'memn2n-rc-' in self.arch: 394 | if 'rnn' in self.arch: 395 | cur_vec = SimpleRNN(self.sembedding_size, return_sequences=True, init=self.init_type, activation=self.activation)(current) 396 | elif 'gru' in self.arch: 397 | cur_vec = GRU(self.sembedding_size, return_sequences=True, init=self.init_type, activation=self.activation)(current) 398 | elif 'lstm' in self.arch: 399 | cur_vec = LSTM(self.sembedding_size, return_sequences=True, init=self.init_type, activation=self.activation)(current) 400 | else: 401 | sys.stderr.write("The RNN model is invaliad. (rnn | gru | lstm)\n") 402 | cur_vec = Convolution1D(self.sembedding_size, 3, border_mode='same', input_shape=(self.time_length, self.sembedding_size))(cur_vec) 403 | cur_vec = MaxPooling1D(self.time_length)(cur_vec) 404 | cur_vec = Flatten()(cur_vec) 405 | elif 'memn2n-cr-' in self.arch: 406 | cur_vec = Convolution1D(self.sembedding_size, 3, border_mode='same', input_shape=(self.time_length, self.embedding_size))(current) 407 | if 'rnn' in self.arch: 408 | cur_vec = SimpleRNN(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation)(cur_vec) 409 | elif 'gru' in self.arch: 410 | cur_vec = GRU(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation)(cur_vec) 411 | elif 'lstm' in self.arch: 412 | cur_vec = LSTM(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation)(cur_vec) 413 | else: 414 | sys.stderr.write("The RNN model is invaliad. (rnn | gru | lstm)\n") 415 | elif 'memn2n-crp-' in self.arch: 416 | cur_vec = Convolution1D(self.sembedding_size, 3, border_mode='same', input_shape=(self.time_length, self.embedding_size))(current) 417 | if 'rnn' in self.arch: 418 | cur_vec = SimpleRNN(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation)(cur_vec) 419 | elif 'gru' in self.arch: 420 | cur_vec = GRU(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation)(cur_vec) 421 | elif 'lstm' in self.arch: 422 | cur_vec = LSTM(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation)(cur_vec) 423 | else: 424 | sys.stderr.write("The RNN model is invaliad. 
(rnn | gru | lstm)\n") 425 | cur_vec = MaxPooling1D(self.time_length)(cur_vec) 426 | cur_vec = Flatten()(cur_vec) 427 | sent_model = Model(input=raw_current, output=cur_vec) 428 | 429 | # apply the same function for mapping word sequences into sentence vecs 430 | # input_memory: (, his_length, time_length, embedding_size) 431 | if self.input_type == 'embedding': 432 | raw_input_memory = Input(shape=(self.his_length * self.time_length,), dtype='int32', name='input_memory') 433 | input_memory = Reshape((self.his_length, self.time_length))(raw_input_memory) 434 | else: 435 | raw_input_memory = Input(shape=(self.his_length * self.time_length, self.embedding_size), name='input_memory') 436 | input_memory = Reshape((self.his_length, self.time_length, self.embedding_size))(raw_input_memory) 437 | mem_vec = TimeDistributed(sent_model)(input_memory) 438 | 439 | # compute the similarity between sentence embeddings for attention 440 | match = merge([mem_vec, cur_vec], mode='dot', dot_axes=[2, 1]) 441 | match = Activation('softmax', name='match')(match) 442 | 443 | 444 | # encode the history with the current utterance and then feed into each timestep for tagging 445 | his_vec = merge([mem_vec, match], mode='dot', dot_axes=[1, 1]) 446 | encoder = merge([his_vec, cur_vec], mode='sum') 447 | encoder = Dense(self.embedding_size)(encoder) 448 | 449 | # tagging the words in the current sentence 450 | if 'blstm' in self.arch: 451 | forward = LSTM(self.hidden_size, return_sequences=False, init=self.init_type, activation=self.activation)(current) 452 | backward = LSTM(self.hidden_size, return_sequences=False, init=self.init_type, activation=self.activation, go_backwards=True)(current) 453 | labeling = merge([forward, backward], mode='concat', concat_axis=-1) 454 | elif 'rnn' in self.arch: 455 | labeling = SimpleRNN(self.hidden_size, return_sequences=False, init=self.init_type, activation=self.activation)(current) 456 | elif 'gru' in self.arch: 457 | labeling = GRU(self.hidden_size, return_sequences=False, init=self.init_type, activation=self.activation)(current) 458 | elif 'lstm' in self.arch: 459 | labeling = LSTM(self.hidden_size, return_sequences=False, init=self.init_type, activation=self.activation)(current) 460 | tagger = merge([encoder, labeling], mode='concat', concat_axis=-1) 461 | if self.dropout: 462 | tagger = Dropout(self.dropout_ratio)(tagger) 463 | prediction = Dense(self.output_vocab_size, activation='softmax')(tagger) 464 | 465 | self.model = Model(input=[raw_input_memory, raw_current], output=prediction) 466 | self.model.compile(loss='categorical_crossentropy', optimizer=opt_func) 467 | 468 | elif 'memn2n' in self.arch: 469 | # current: (, time_length, embedding_size) 470 | if self.input_type == 'embedding': 471 | raw_current = Input(shape=(self.time_length,), dtype='int32', name='raw_current') 472 | current = Embedding(input_dim=self.input_vocab_size, output_dim=self.embedding_size, input_length=self.time_length)(raw_current) 473 | else: 474 | current = raw_current = Input(shape=(self.time_length, self.input_vocab_size), name='current') 475 | if 'memn2n-d-' in self.arch: 476 | cur_vec = TimeDistributed(Dense(self.sembedding_size, input_shape=(self.time_length, self.embedding_size)))(current) 477 | cur_vec = AveragePooling1D(self.time_length)(cur_vec) 478 | cur_vec = Flatten()(cur_vec) 479 | elif 'memn2n-c-' in self.arch: 480 | cur_vec = Convolution1D(self.sembedding_size, 3, border_mode='same', input_shape=(self.time_length, self.embedding_size))(current) 481 | cur_vec = 
MaxPooling1D(self.time_length)(cur_vec) 482 | cur_vec = Flatten()(cur_vec) 483 | elif 'memn2n-r-' in self.arch: 484 | if 'blstm' in self.arch: 485 | fcur_vec = LSTM(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation)(current) 486 | bcur_vec = LSTM(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation, go_backwards=True)(current) 487 | cur_vec = merge([fcur_vec, bcur_vec], mode='concat') 488 | elif 'rnn' in self.arch: 489 | cur_vec = SimpleRNN(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation)(current) 490 | elif 'gru' in self.arch: 491 | cur_vec = GRU(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation)(current) 492 | elif 'lstm' in self.arch: 493 | cur_vec = LSTM(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation)(current) 494 | else: 495 | sys.stderr.write("The RNN model is invaliad. (rnn | gru | lstm)\n") 496 | elif 'memn2n-rc-' in self.arch: 497 | if 'rnn' in self.arch: 498 | cur_vec = SimpleRNN(self.sembedding_size, return_sequences=True, init=self.init_type, activation=self.activation)(current) 499 | elif 'gru' in self.arch: 500 | cur_vec = GRU(self.sembedding_size, return_sequences=True, init=self.init_type, activation=self.activation)(current) 501 | elif 'lstm' in self.arch: 502 | cur_vec = LSTM(self.sembedding_size, return_sequences=True, init=self.init_type, activation=self.activation)(current) 503 | else: 504 | sys.stderr.write("The RNN model is invaliad. (rnn | gru | lstm)\n") 505 | cur_vec = Convolution1D(self.sembedding_size, 3, border_mode='same', input_shape=(self.time_length, self.sembedding_size))(cur_vec) 506 | cur_vec = MaxPooling1D(self.time_length)(cur_vec) 507 | cur_vec = Flatten()(cur_vec) 508 | elif 'memn2n-cr-' in self.arch: 509 | cur_vec = Convolution1D(self.sembedding_size, 3, border_mode='same', input_shape=(self.time_length, self.embedding_size))(current) 510 | if 'rnn' in self.arch: 511 | cur_vec = SimpleRNN(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation)(cur_vec) 512 | elif 'gru' in self.arch: 513 | cur_vec = GRU(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation)(cur_vec) 514 | elif 'lstm' in self.arch: 515 | cur_vec = LSTM(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation)(cur_vec) 516 | else: 517 | sys.stderr.write("The RNN model is invaliad. (rnn | gru | lstm)\n") 518 | elif 'memn2n-crp-' in self.arch: 519 | cur_vec = Convolution1D(self.sembedding_size, 3, border_mode='same', input_shape=(self.time_length, self.embedding_size))(current) 520 | if 'rnn' in self.arch: 521 | cur_vec = SimpleRNN(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation)(cur_vec) 522 | elif 'gru' in self.arch: 523 | cur_vec = GRU(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation)(cur_vec) 524 | elif 'lstm' in self.arch: 525 | cur_vec = LSTM(self.sembedding_size, return_sequences=False, init=self.init_type, activation=self.activation)(cur_vec) 526 | else: 527 | sys.stderr.write("The RNN model is invaliad. 
(rnn | gru | lstm)\n") 528 | cur_vec = MaxPooling1D(self.time_length)(cur_vec) 529 | cur_vec = Flatten()(cur_vec) 530 | sent_model = Model(input=raw_current, output=cur_vec) 531 | 532 | # apply the same function for mapping word sequences into sentence vecs 533 | # input_memory: (, his_length, time_length, embedding_size) 534 | if self.input_type == 'embedding': 535 | raw_input_memory = Input(shape=(self.his_length * self.time_length,), dtype='int32', name='input_memory') 536 | input_memory = Reshape((self.his_length, self.time_length))(raw_input_memory) 537 | else: 538 | raw_input_memory = Input(shape=(self.his_length * self.time_length, self.embedding_size), name='input_memory') 539 | input_memory = Reshape((self.his_length, self.time_length, self.embedding_size))(raw_input_memory) 540 | mem_vec = TimeDistributed(sent_model)(input_memory) 541 | 542 | # compute the similarity between sentence embeddings for attention 543 | match = merge([mem_vec, cur_vec], mode='dot', dot_axes=[2, 1]) 544 | match = Activation('softmax', name='match')(match) 545 | 546 | 547 | # encode the history with the current utterance and then feed into each timestep for tagging 548 | his_vec = merge([mem_vec, match], mode='dot', dot_axes=[1, 1]) 549 | encoder = merge([his_vec, cur_vec], mode='sum') 550 | encoder = Dense(self.embedding_size)(encoder) 551 | encoder = RepeatVector(self.time_length)(encoder) 552 | 553 | # tagging the words in the current sentence 554 | if 'blstm' in self.arch: 555 | forward = LSTM(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation)(current) 556 | backward = LSTM(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation, go_backwards=True)(current) 557 | labeling = merge([forward, backward], mode='concat', concat_axis=-1) 558 | elif 'rnn' in self.arch: 559 | labeling = SimpleRNN(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation)(current) 560 | elif 'gru' in self.arch: 561 | labeling = GRU(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation)(current) 562 | elif 'lstm' in self.arch: 563 | labeling = LSTM(self.hidden_size, return_sequences=True, init=self.init_type, activation=self.activation)(current) 564 | tagger = merge([encoder, labeling], mode='concat', concat_axis=-1) 565 | if self.dropout: 566 | tagger = Dropout(self.dropout_ratio)(tagger) 567 | prediction = TimeDistributed(Dense(self.output_vocab_size, activation='softmax'))(tagger) 568 | 569 | self.model = Model(input=[raw_input_memory, raw_current], output=prediction) 570 | self.model.compile(loss='categorical_crossentropy', optimizer=opt_func) 571 | 572 | 573 | def train(self, H_train, X_train, y_train, H_dev, X_dev, y_dev, val_ratio=0.0): 574 | # load saved model weights 575 | if self.load_weight is not None: 576 | sys.stderr.write("Load the pretrained weights for the model.\n") 577 | self.model.load_weights(self.load_weight) 578 | else: 579 | # training batch preparation 580 | if 'memn2n' in self.arch or self.arch[0] == 'h': 581 | batch_train = [H_train, X_train] 582 | batch_dev = [H_dev, X_dev] 583 | else: 584 | batch_train = X_train 585 | batch_dev = X_dev 586 | # model training 587 | if not self.nodev: 588 | early_stop = EarlyStopping(monitor='val_loss', patience=10) 589 | train_log = LossHistory() 590 | self.model.fit(batch_train, y_train, batch_size=self.batch_size, nb_epoch=self.max_epochs, verbose=1, validation_data=(batch_dev, y_dev), callbacks=[early_stop, train_log], 
shuffle=self.shuffle) 591 | if self.log is not None: 592 | fo = open(self.log, "wb") 593 | for loss in train_log.losses: 594 | fo.write("%lf\n" %loss) 595 | fo.close() 596 | else: 597 | if self.set_batch: 598 | cur = 0 599 | total = np.sum(np.array(self.trainNum)) 600 | for num in self.trainNum: 601 | print("Current batch is: %d ( %.1f%% )\r" %(cur, float(cur) / float(total) * 100)), 602 | sys.stdout.flush() 603 | single_batch_train = [single_batch[cur:cur+num+1][:] for single_batch in batch_train] 604 | for iter_epoch in range(0, self.max_epochs): 605 | self.model.train_on_batch(single_batch_train, y_train[cur:cur+num+1]) 606 | cur += num + 1 607 | else: 608 | self.model.fit(batch_train, y_train, batch_size=self.batch_size, nb_epoch=self.max_epochs, verbose=1, shuffle=self.shuffle) 609 | 610 | def run(self): 611 | # initialization of vocab 612 | emptyVocab = {} 613 | emptyIndex = list() 614 | trainData = dataSet(self.training_file,'train',emptyVocab,emptyVocab,emptyIndex,emptyIndex) 615 | testData = dataSet(self.test_file, 'test', trainData.getWordVocab(), trainData.getTagVocab(),trainData.getIndex2Word(),trainData.getIndex2Tag()) 616 | if self.train_numfile is not None: 617 | self.trainNum, self.trainTotal = readNum(self.train_numfile) 618 | self.testNum, self.testTotal = readNum(self.test_numfile) 619 | if not self.nodev: 620 | self.devNum, self.devTotal = readNum(self.dev_numfile) 621 | 622 | target_file = self.result_path + '/' + 'tag.list' 623 | fo = open(target_file, "wb") 624 | for tag in trainData.dataSet['id2tag']: 625 | fo.write("%s\n" %tag) 626 | fo.close() 627 | 628 | # preprocessing by padding 0 until maxlen 629 | pad_X_train = sequence.pad_sequences(trainData.dataSet['utterances'], maxlen=self.time_length, dtype='int32', padding='pre') 630 | pad_X_test = sequence.pad_sequences(testData.dataSet['utterances'], maxlen=self.time_length, dtype='int32', padding='pre') 631 | pad_y_train = sequence.pad_sequences(trainData.dataSet['tags'], maxlen=self.time_length, dtype='int32', padding='pre') 632 | num_sample_train, max_len = np.shape(pad_X_train) 633 | num_sample_test, max_len = np.shape(pad_X_test) 634 | 635 | if not self.nodev: 636 | validData = dataSet(self.validation_file, 'val', trainData.getWordVocab(), trainData.getTagVocab(),trainData.getIndex2Word(),trainData.getIndex2Tag()) 637 | pad_X_dev = sequence.pad_sequences(validData.dataSet['utterances'], maxlen=self.time_length, dtype='int32', padding='pre') 638 | pad_y_dev = sequence.pad_sequences(validData.dataSet['tags'], maxlen=self.time_length, dtype='int32', padding='pre') 639 | num_sample_dev, max_len = np.shape(pad_X_dev) 640 | 641 | # encoding input vectors 642 | self.input_vocab_size = trainData.getWordVocabSize() 643 | self.output_vocab_size = trainData.getTagVocabSize() 644 | 645 | # building model architecture 646 | self.build() 647 | 648 | # data generation 649 | sys.stderr.write("Vectorizing the input.\n") 650 | X_train = encoding(pad_X_train, self.input_type, self.time_length, self.input_vocab_size) 651 | X_test = encoding(pad_X_test, self.input_type, self.time_length, self.input_vocab_size) 652 | y_train = encoding(pad_y_train, '1hot', self.time_length, self.output_vocab_size) 653 | 654 | if not self.nodev: 655 | X_dev = encoding(pad_X_dev, self.input_type, self.time_length, self.input_vocab_size) 656 | y_dev = encoding(pad_y_dev, '1hot', self.time_length, self.output_vocab_size) 657 | 658 | if 'memn2n' in self.arch or self.arch[0] == 'h': 659 | # encode history for memory network 660 | pad_H_train = 
sequence.pad_sequences(history_build(trainData, pad_X_train), maxlen=(self.time_length * self.his_length), dtype='int32', padding='pre') 661 | pad_H_test = sequence.pad_sequences(history_build(testData, pad_X_test), maxlen=(self.time_length * self.his_length), dtype='int32', padding='pre') 662 | H_train = encoding(pad_H_train, self.input_type, self.time_length * self.his_length, self.input_vocab_size) 663 | H_test = encoding(pad_H_test, self.input_type, self.time_length * self.his_length, self.input_vocab_size) 664 | if not self.nodev: 665 | pad_H_dev = sequence.pad_sequences(history_build(validData, pad_X_dev), maxlen=(self.time_length * self.his_length), dtype='int32', padding='pre') 666 | H_dev = encoding(pad_H_dev, self.input_type, self.time_length * self.his_length, self.input_vocab_size) 667 | if self.e2e_flag: 668 | H_train = np.array([H_train[num - 1] for num in self.trainTotal]) 669 | X_train = np.array([X_train[num - 1] for num in self.trainTotal]) 670 | y_train = np.array([y_train[num - 1] for num in self.trainTotal]) 671 | pad_X_test = np.array([pad_X_test[num - 1] for num in self.testTotal]) 672 | H_test = np.array([H_test[num - 1] for num in self.testTotal]) 673 | X_test = np.array([X_test[num - 1] for num in self.testTotal]) 674 | if not self.nodev: 675 | pad_X_dev = np.array([pad_X_dev[num - 1] for num in self.devTotal]) 676 | H_dev = np.array([H_dev[num - 1] for num in self.devTotal]) 677 | X_dev = np.array([X_dev[num - 1] for num in self.devTotal]) 678 | y_dev = np.array([y_dev[num - 1] for num in self.devTotal]) 679 | else: 680 | H_train = H_test = H_dev = None 681 | 682 | if self.nodev: 683 | H_dev = X_dev = y_dev = None 684 | 685 | if self.record_epoch != -1 and self.load_weight is None: 686 | total_epochs = self.max_epochs 687 | self.max_epochs = self.record_epoch 688 | for i in range(1, total_epochs / self.record_epoch + 1): 689 | num_iter = i * self.record_epoch 690 | self.train(H_train=H_train, X_train=X_train, y_train=y_train, H_dev=H_dev, X_dev=X_dev, y_dev=y_dev) 691 | if not self.nodev: 692 | self.test(H=H_dev, X=X_dev, data_type='dev.'+str(num_iter),tagDict=trainData.dataSet['id2tag'], pad_data=pad_X_dev) 693 | self.test(H=H_test, X=X_test, data_type='test.'+str(num_iter), tagDict=trainData.dataSet['id2tag'], pad_data=pad_X_test) 694 | # save weights for the current model 695 | whole_path = self.mdl_path + '/' + self.model_arch + '.' 
+ str(num_iter) + '.h5' 696 | sys.stderr.write("Writing model weight to %s...\n" %whole_path) 697 | self.model.save_weights(whole_path, overwrite=True) 698 | else: 699 | self.train(H_train=H_train, X_train=X_train, y_train=y_train, H_dev=H_dev, X_dev=X_dev, y_dev=y_dev) 700 | if not self.nodev: 701 | self.test(H=H_dev, X=X_dev, data_type='dev', tagDict=trainData.dataSet['id2tag'], pad_data=pad_X_dev) 702 | self.test(H=H_test, X=X_test, data_type='test', tagDict=trainData.dataSet['id2tag'], pad_data=pad_X_test) 703 | if self.load_weight is None: 704 | whole_path = self.mdl_path + '/' + self.model_arch + '.final-' + str(self.max_epochs) + '.h5' 705 | sys.stderr.write("Writing model weight to %s...\n" %whole_path) 706 | self.model.save_weights(whole_path, overwrite=True) 707 | -------------------------------------------------------------------------------- /program/BasicModel.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/yvchen/JointSLU/fad79658f343e98c1737fcfe563e76069b62f064/program/BasicModel.pyc -------------------------------------------------------------------------------- /program/Encoding.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | def encoding( data, encode_type, time_length, vocab_size ): 3 | if encode_type == '1hot': 4 | return onehot_encoding(data, time_length, vocab_size) 5 | elif encode_type == 'embedding': 6 | return data 7 | 8 | def onehot_encoding( data, time_length, vocab_size): 9 | X = np.zeros((len(data), time_length, vocab_size), dtype=np.bool) 10 | for i, sent in enumerate(data): 11 | for j, k in enumerate(sent): 12 | X[i, j, k] = 1 13 | return X 14 | 15 | def onehot_sent_encoding( data, vocab_size): 16 | X = np.zeros((len(data), vocab_size), dtype=np.bool) 17 | for i, sent in enumerate(data): 18 | for j, k in enumerate(sent): 19 | X[i, k] = 1 20 | return X 21 | -------------------------------------------------------------------------------- /program/History.py: -------------------------------------------------------------------------------- 1 | from keras.callbacks import Callback 2 | class LossHistory(Callback): 3 | def on_train_begin(self, logs={}): 4 | self.losses = [] 5 | 6 | def on_batch_end(self, batch, logs={}): 7 | self.losses.append(logs.get('loss')) 8 | -------------------------------------------------------------------------------- /program/History.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/yvchen/JointSLU/fad79658f343e98c1737fcfe563e76069b62f064/program/History.pyc -------------------------------------------------------------------------------- /program/PredefinedEmbedding.py: -------------------------------------------------------------------------------- 1 | """ 2 | Embeddings 3 | by D. 
Hakkani-Tur 4 | """ 5 | 6 | 7 | import re 8 | import numpy as np 9 | 10 | """ 11 | class InputVec(object): 12 | def __init__(self, input_data, encode_type, embeddingFile=None): 13 | num_sample, time_length = np.shape(input_data) 14 | if encode_type == '1hot': 15 | self.encoding = np.zeros((num_sample, time_length, )) 16 | for i, sent in enumerate(input_tensor): 17 | for j, k in enumerate(sent): 18 | self.encoding[i][j][k] = 1 19 | # elif encode_type == 'embedding': 20 | elif encode_type == 'pretrained' and embeddingFile != None: 21 | self.inputvec = readEmbeddings(embeddingFile) 22 | 23 | def getEmbeddingDim(self): 24 | if encode_type == '1hot': 25 | elif encode_type == 'pretrained': 26 | return self.inputvec['embeddingSize'] 27 | """ 28 | 29 | class PredefinedEmbedding(object): 30 | """ 31 | dictionary of embeddings 32 | """ 33 | 34 | def __init__(self,embeddingFile): 35 | 36 | self.embeddings = readEmbeddings(embeddingFile) 37 | 38 | def getEmbeddingDim(self): 39 | return self.embeddings['embeddingSize'] 40 | 41 | def getWordEmbedding(self,word): 42 | if word == "BOS": 43 | return self.embeddings['embeddings'][""] 44 | elif word == "EOS": 45 | return self.embeddings['embeddings'][""] 46 | elif word in self.embeddings['embeddings']: 47 | return self.embeddings['embeddings'][word] 48 | else: 49 | return np.zeros((self.embeddings['embeddingSize'],1)) 50 | 51 | 52 | def readEmbeddings(embeddingFile): 53 | 54 | # read the word embeddings 55 | # each line has one word and a vector of embeddings listed as a sequence of real valued numbers 56 | 57 | wordEmbeddings = {} 58 | first = True 59 | p=re.compile('\s+') 60 | 61 | for line in open(embeddingFile, 'r'): 62 | d=p.split(line.strip()) 63 | if (first): 64 | first = False 65 | size = len(d) -1 66 | else: 67 | if (size != len(d) -1): 68 | print "Problem with embedding file, not all vectors are the same length\n" 69 | exit() 70 | currentWord = d[0] 71 | wordEmbeddings[currentWord] = np.zeros((size,1)) 72 | for i in range(1,len(d)): 73 | wordEmbeddings[currentWord][i-1] = float(d[i]) 74 | embeddings={'embeddings':wordEmbeddings, 'embeddingSize': size} 75 | return embeddings 76 | -------------------------------------------------------------------------------- /program/PredefinedEmbedding.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/yvchen/JointSLU/fad79658f343e98c1737fcfe563e76069b62f064/program/PredefinedEmbedding.pyc -------------------------------------------------------------------------------- /program/SequenceTagger.py: -------------------------------------------------------------------------------- 1 | ''' 2 | ''' 3 | 4 | import argparse 5 | from BasicModel import KerasModel 6 | 7 | if __name__ == "__main__": 8 | parser = argparse.ArgumentParser() 9 | 10 | parser.add_argument('--arch', dest='arch', type=str, default='lstm', help='architecture to use') 11 | parser.add_argument('--train', dest='train_data_path', type=str, help='path to datasets') 12 | parser.add_argument('--dev', dest='dev_data_path', type=str, help='path to datasets') 13 | parser.add_argument('--test', dest='test_data_path', type=str, help='path to datasets') 14 | parser.add_argument('--out', dest='result_path', type=str, help='path to datasets') 15 | parser.add_argument('--trainnum', dest='train_numfile', type=str, help='path to train num file') 16 | parser.add_argument('--devnum', dest='dev_numfile', type=str, help='path to dev num file') 17 | parser.add_argument('--testnum', dest='test_numfile', 
type=str, help='path to test num file') 18 | 19 | # network parameters 20 | parser.add_argument('--sgdtype', dest='sgdtype', type=str, default='adam', help='SGD type: sgd/rmsprop/adagrad/adadelta/adam/adamax') 21 | parser.add_argument('--momentum', dest='momentum', type=float, default=0.1, help='momentum for vanilla SGD') 22 | parser.add_argument('--decay_rate', dest='decay_rate', type=float, default=0.0, help='decay rate for sgd') 23 | parser.add_argument('--rho', dest='rho', type=float, default=0.9, help='rho for rmsprop/adadelta') 24 | parser.add_argument('--beta1', dest='beta1', type=float, default=0.9, help='beta1 for adam/adamax') 25 | parser.add_argument('--beta2', dest='beta2', type=float, default=0.999, help='beta2 for adam/adamax') 26 | parser.add_argument('-l', '--learning_rate', dest='learning_rate', type=float, default=1e-2, help='learning rate') 27 | parser.add_argument('-b', '--batch_size', dest='batch_size', type=int, default=10, help='batch size') 28 | parser.add_argument('-m', '--max_epochs', dest='max_epochs', type=int, default=300, help='maximum number of epochs to train the model') 29 | parser.add_argument('-a', '--output_att', dest='output_att', type=str, help='whether output the attention distribution') 30 | parser.add_argument('--iter_per_epoch', dest='iter_per_epoch', type=int, default=100, help='number of iterations per epoch') 31 | parser.add_argument('--hidden_size', dest='hidden_size', type=int, default=50, help='size of hidden layer') 32 | parser.add_argument('--embedding_size', dest='embedding_size', type=int, default=100, help='dimension of embeddings') 33 | parser.add_argument('--activation_func', dest='activation_func', type=str, default='tanh', help='activation function for hidden units: sigmoid, tanh, relu') 34 | parser.add_argument('--input_type', dest='input_type', type=str, default='1hot', help='input type, could be: 1hot/embedding/predefined') 35 | parser.add_argument('--embedding_file', dest='embedding_file', type = str, help='path to the embedding file') 36 | parser.add_argument('--init', dest='init_type', type = str, default='glorot_uniform', help='weight initialization function: glorot_uniform/glorot_normal/he_uniform/he_normal') 37 | parser.add_argument('--forget_bias', dest='forget_bias', type = float, default=1.0, help='LSTM parameter to set forget bias values') 38 | 39 | # regularization 40 | parser.add_argument('--smooth_eps', dest='smooth_eps', type=float, default=1e-8, help='epsilon smoothing for rmsprop/adagrad/adadelta/adam/adamax') 41 | parser.add_argument('--dropout', dest='dropout', type = bool, default=False, help='True/False for performing dropout') 42 | parser.add_argument('--dropout_ratio', dest='dropout_ratio', type = float, default=0.5, help='ratio of weights to drop') 43 | parser.add_argument('--time_length', dest='time_length', type = int, default=60, help='the number of timestamps in given sequences. 
Short utterances will be padded to this length') 44 | parser.add_argument('--his_length', dest='his_length', type = int, default=20, help='the number of history turns considered for making prediction') 45 | parser.add_argument('--mdl_path', dest='mdl_path', type = str, help='the directory for storing tmp models') 46 | parser.add_argument('--default', dest='default_flag', type = bool, default=True, help='whether to use the default values for optimizers') 47 | parser.add_argument('--log', dest='log', type = str, default=None, help='the log file output') 48 | parser.add_argument('--record_epoch', dest='record_epoch', type = int, default=-1, help='predict once every this many epochs') 49 | parser.add_argument('--load_weight', dest='load_weight', help='the weight file for initialization') 50 | parser.add_argument('--combine_his', dest='combine_his', type = bool, default=False, help='whether to combine the history and the current turn for prediction') 51 | parser.add_argument('--time_decay', dest='time_decay', type = bool, default=False, help='whether to add another matrix for modeling time decay') 52 | parser.add_argument('--shuffle', dest='shuffle', type = bool, default=True, help='whether to shuffle the data') 53 | parser.add_argument('--set_batch', dest='set_batch', type = bool, default=False, help='whether to set the batch size') 54 | parser.add_argument('--tag_format', dest='tag_format', type = str, default='conlleval', help='tag output format: conlleval/normal (default is for conlleval usage; normal outputs one tag sequence per line, one sentence per line)') 55 | parser.add_argument('--e2e', dest='e2e_flag', type = bool, default=False, help='whether to use only the last turn (end-to-end training)') 56 | 57 | args = parser.parse_args() 58 | argparams = vars(args) 59 | 60 | KerasModel(argparams).run() 61 | 62 | -------------------------------------------------------------------------------- /program/wordSlotDataSet.py: -------------------------------------------------------------------------------- 1 | """ 2 | RNN for slot filling 3 | dataSet Object 4 | by D. Hakkani-Tur 5 | modified by V. 
Chen 6 | """ 7 | import re 8 | import numpy as np 9 | 10 | class dataSet(object): 11 | """ 12 | utterances with slot tags 13 | """ 14 | 15 | def __init__(self,dataFile,toggle,wordDictionary,tagDictionary,id2word,id2tag): 16 | if toggle == 'train': 17 | self.dataSet = readData(dataFile) 18 | if toggle == 'val': 19 | self.dataSet = readTest(dataFile,wordDictionary,tagDictionary,id2word,id2tag) 20 | if toggle == 'test': 21 | self.dataSet = readTest(dataFile,wordDictionary,tagDictionary,id2word,id2tag) 22 | 23 | def getNum(self,numFile): 24 | return readNum(numFile) 25 | 26 | def getWordVocabSize(self): 27 | return self.dataSet['wordVocabSize'] 28 | 29 | def getTagVocabSize(self): 30 | return self.dataSet['tagVocabSize'] 31 | 32 | def getNoExamples(self): 33 | return self.dataSet['uttCount'] 34 | 35 | def getExampleUtterance(self,index): 36 | return self.dataSet['utterances'][index] 37 | 38 | def getExampleTags(self,index): 39 | return self.dataSet['tags'][index] 40 | 41 | def getWordVocab(self): 42 | return self.dataSet['word2id'] 43 | 44 | def getTagVocab(self): 45 | return self.dataSet['tag2id'] 46 | 47 | def getIndex2Word(self): 48 | return self.dataSet['id2word'] 49 | 50 | def getIndex2Tag(self): 51 | return self.dataSet['id2tag'] 52 | 53 | def getTagAtIndex(self,index): 54 | return self.dataSet['id2tag'][index] 55 | 56 | def getWordAtIndex(self,index): 57 | return self.dataSet['id2word'][index] 58 | 59 | def getSample(self,batchSize): 60 | inputs={} 61 | targets={} 62 | indices = np.random.randint(0,self.dataSet['uttCount'],size=batchSize) 63 | for i in xrange(batchSize): 64 | inputs[i] = self.dataSet['utterances'][indices[i]] 65 | targets[i] = self.dataSet['tags'][indices[i]] 66 | return inputs,targets 67 | 68 | """ 69 | def encodeInput(self, encode_type, time_length): 70 | from keras.preprocessing import sequence 71 | # preprocessing by padding 0 until maxlen 72 | pad_X = sequence.pad_sequences(trainData.dataSet['utterances'], maxlen=self.time_length, dtype='int32') 73 | pad_y = sequence.pad_sequences(trainData.dataSet['tags'], maxlen=self.time_length, dtype='int32') 74 | num_sample, max_len = np.shape(pad_X) 75 | 76 | if encode_type == '1hot': 77 | self.dataSet['utterances'] 78 | """ 79 | 80 | def readHisData(dataFile): 81 | 82 | # read the data sets 83 | # each line has one utterance that contains tab separated utterance words and corresponding IOB tags 84 | history = list() 85 | utterances = list() 86 | tags = list() 87 | 88 | # reserving index 0 for padding 89 | # reserving index 1 for unknown word and tokens 90 | word_vocab_index = 2 91 | tag_vocab_index = 2 92 | word2id = {'': 0, '': 1} 93 | tag2id = {'': 0, '': 1} 94 | id2word = ['', ''] 95 | id2tag = ['', ''] 96 | 97 | utt_count = 0 98 | for line in open(dataFile, 'r'): 99 | d = line.split('\t') 100 | his = d[0].strip() 101 | utt = d[1].strip() 102 | t = d[2].strip() 103 | print 'his: %s, utt: %s, tags: %s' % (his, utt, t) 104 | 105 | temp_his = list() 106 | temp_utt = list() 107 | temp_tags = list() 108 | if his != '': 109 | myhis = his.split() 110 | mywords = utt.split(' ') 111 | mytags = t.split(' ') 112 | # now add the words and tags to word and tag dictionaries 113 | # also save the word and tag sequence in training data sets 114 | for i in xrange(len(mywords)): 115 | if mywords[i] not in word2id: 116 | word2id[mywords[i]] = word_vocab_index 117 | id2word.append(mywords[i]) 118 | word_vocab_index += 1 119 | if mytags[i] not in tag2id: 120 | tag2id[mytags[i]] = tag_vocab_index 121 | id2tag.append(mytags[i]) 122 | 
tag_vocab_index += 1 123 | temp_utt.append(word2id[mywords[i]]) 124 | temp_tags.append(tag2id[mytags[i]]) 125 | if his != '': 126 | for i in xrange(len(myhis)): 127 | temp_his.append(word2id[myhis[i]]) 128 | utt_count += 1 129 | history.append(temp_his) 130 | utterances.append(temp_utt) 131 | tags.append(temp_tags) 132 | 133 | data = {'history': history, 'utterances': utterances, 'tags': tags, 'uttCount': utt_count, 'id2word':id2word, 'id2tag':id2tag, 'wordVocabSize' : word_vocab_index, 'tagVocabSize': tag_vocab_index, 'word2id': word2id, 'tag2id':tag2id} 134 | return data 135 | 136 | def readData(dataFile): 137 | 138 | # read the data sets 139 | # each line has one utterance that contains tab separated utterance words and corresponding IOB tags 140 | # if the input is multiturn session data, the flag following the IOB tags is 1 (session start) or 0 (not session start) 141 | 142 | utterances = list() 143 | tags = list() 144 | starts = list() 145 | startid = list() 146 | 147 | # reserving index 0 for padding 148 | # reserving index 1 for unknown word and tokens 149 | word_vocab_index = 2 150 | tag_vocab_index = 2 151 | word2id = {'<pad>': 0, '<unk>': 1} # padding/unknown placeholder names assumed 152 | tag2id = {'<pad>': 0, '<unk>': 1} 153 | id2word = ['<pad>', '<unk>'] 154 | id2tag = ['<pad>', '<unk>'] 155 | 156 | utt_count = 0 157 | temp_startid = 0 158 | for line in open(dataFile, 'r'): 159 | d=line.split('\t') 160 | utt = d[0].strip() 161 | t = d[1].strip() 162 | if len(d) > 2: 163 | start = np.bool(int(d[2].strip())) 164 | starts.append(start) 165 | if start: 166 | temp_startid = utt_count 167 | startid.append(temp_startid) 168 | #print 'utt: %s, tags: %s' % (utt,t) 169 | 170 | temp_utt = list() 171 | temp_tags = list() 172 | mywords = utt.split() 173 | mytags = t.split() 174 | if len(mywords) != len(mytags): 175 | print mywords 176 | print mytags 177 | # now add the words and tags to word and tag dictionaries 178 | # also save the word and tag sequence in training data sets 179 | for i in xrange(len(mywords)): 180 | if mywords[i] not in word2id: 181 | word2id[mywords[i]] = word_vocab_index 182 | id2word.append(mywords[i]) 183 | word_vocab_index += 1 184 | if mytags[i] not in tag2id: 185 | tag2id[mytags[i]] = tag_vocab_index 186 | id2tag.append(mytags[i]) 187 | tag_vocab_index += 1 188 | temp_utt.append(word2id[mywords[i]]) 189 | temp_tags.append(tag2id[mytags[i]]) 190 | utt_count += 1 191 | utterances.append(temp_utt) 192 | tags.append(temp_tags) 193 | 194 | data = {'start': starts, 'startid': startid, 'utterances': utterances, 'tags': tags, 'uttCount': utt_count, 'id2word':id2word, 'id2tag':id2tag, 'wordVocabSize' : word_vocab_index, 'tagVocabSize': tag_vocab_index, 'word2id': word2id, 'tag2id':tag2id} 195 | return data 196 | 197 | def readTest(testFile,word2id,tag2id,id2word,id2tag): 198 | 199 | utterances = list() 200 | tags = list() 201 | starts = list() 202 | startid = list() 203 | 204 | utt_count = 0 205 | temp_startid = 0 206 | for line in open(testFile, 'r'): 207 | d=line.split('\t') 208 | utt = d[0].strip() 209 | t = d[1].strip() 210 | if len(d) > 2: 211 | start = np.bool(int(d[2].strip())) 212 | starts.append(start) 213 | if start: 214 | temp_startid = utt_count 215 | startid.append(temp_startid) 216 | #print 'utt: %s, tags: %s' % (utt,t) 217 | 218 | temp_utt = list() 219 | temp_tags = list() 220 | mywords = utt.split() 221 | mytags = t.split() 222 | # now add the words and tags to word and tag dictionaries 223 | # also save the word and tag sequence in training data sets 224 | for i in xrange(len(mywords)): 225 | if mywords[i] not in word2id: 226 | 
temp_utt.append(1) #i.e. append unknown word 227 | else: 228 | temp_utt.append(word2id[mywords[i]]) 229 | if mytags[i] not in tag2id: 230 | temp_tags.append(1) 231 | else: 232 | temp_tags.append(tag2id[mytags[i]]) 233 | utt_count += 1 234 | utterances.append(temp_utt) 235 | tags.append(temp_tags) 236 | wordVocabSize = len(word2id) 237 | 238 | data = {'start': starts, 'startid': startid, 'utterances': utterances, 'tags': tags, 'uttCount': utt_count, 'wordVocabSize' : wordVocabSize, 'id2word':id2word, 'id2tag': id2tag} 239 | return data 240 | 241 | def readNum(numFile): 242 | 243 | numList = map(int, file(numFile).read().strip().split()) 244 | totalList = list() 245 | cur = 0 246 | for num in numList: 247 | cur += num + 1 248 | totalList.append(cur) 249 | return numList, totalList 250 | -------------------------------------------------------------------------------- /program/wordSlotDataSet.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/yvchen/JointSLU/fad79658f343e98c1737fcfe563e76069b62f064/program/wordSlotDataSet.pyc -------------------------------------------------------------------------------- /sample/rnn+emb.3.h5: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/yvchen/JointSLU/fad79658f343e98c1737fcfe563e76069b62f064/sample/rnn+emb.3.h5 -------------------------------------------------------------------------------- /sample/rnn+emb_H-50_O-adam_A-tanh_WR-embedding.test.3: -------------------------------------------------------------------------------- 1 | B-arrive_date.month_name 2 | B-meal_description 3 | I-toloc.city_name 4 | B-depart_date.day_name 5 | B-depart_time.time 6 | O 7 | B-round_trip 8 | B-fromloc.city_name 9 | B-depart_date.day_name 10 | B-depart_time.time_relative 11 | O 12 | I-round_trip 13 | 14 | B-arrive_date.month_name 15 | B-fare_amount 16 | B-fare_amount 17 | B-depart_time.time_relative 18 | 19 | atis_flight 20 | B-arrive_time.time_relative 21 | B-arrive_time.time 22 | B-depart_date.day_name 23 | B-depart_time.time 24 | atis_airfare 25 | 26 | B-fromloc.city_name 27 | B-arrive_time.time 28 | O 29 | atis_flight 30 | 31 | I-fromloc.city_name 32 | B-arrive_time.time 33 | B-depart_date.day_number 34 | 35 | B-fare_amount 36 | I-fare_amount 37 | B-round_trip 38 | B-depart_date.day_number 39 | B-toloc.state_code 40 | O 41 | B-depart_time.time 42 | B-toloc.state_name 43 | I-fromloc.city_name 44 | B-cost_relative 45 | 46 | B-fare_amount 47 | I-fare_amount 48 | B-round_trip 49 | 50 | B-arrive_date.month_name 51 | B-meal_description 52 | I-toloc.city_name 53 | atis_airline 54 | 55 | atis_airline 56 | B-arrive_date.month_name 57 | B-arrive_time.time_relative 58 | 59 | B-depart_date.day_name 60 | I-arrive_time.time 61 | O 62 | 63 | B-arrive_date.month_name 64 | atis_airline 65 | B-depart_time.period_of_day 66 | O 67 | I-fare_amount 68 | B-fromloc.city_name 69 | B-arrive_date.day_number 70 | B-cost_relative 71 | B-depart_date.day_name 72 | B-depart_date.day_number 73 | B-airline_name 74 | O 75 | B-depart_date.day_number 76 | I-stoploc.city_name 77 | O 78 | atis_flight 79 | B-airline_name 80 | B-fromloc.city_name 81 | B-arrive_time.time 82 | B-depart_time.time_relative 83 | atis_flight 84 | 85 | B-arrive_date.month_name 86 | I-stoploc.city_name 87 | I-toloc.city_name 88 | B-arrive_date.day_number 89 | O 90 | 91 | B-depart_date.day_number 92 | I-fare_amount 93 | B-arrive_time.time 94 | I-stoploc.city_name 95 | B-airline_name 96 | O 97 | 
B-arrive_time.time 98 | 99 | B-arrive_date.month_name 100 | B-fare_amount 101 | B-fromloc.city_name 102 | B-arrive_date.day_number 103 | B-depart_time.time 104 | B-depart_time.period_of_day 105 | I-stoploc.city_name 106 | B-depart_date.day_name 107 | O 108 | B-toloc.city_name 109 | B-airline_name 110 | B-depart_date.day_name 111 | 112 | B-depart_date.day_number 113 | B-meal_description 114 | B-depart_date.month_name 115 | atis_flight 116 | atis_flight 117 | B-toloc.city_name 118 | B-airline_name 119 | B-airline_name 120 | atis_flight 121 | I-round_trip 122 | atis_flight 123 | atis_flight 124 | B-airline_name 125 | atis_flight 126 | 127 | B-arrive_date.month_name 128 | B-round_trip 129 | B-stoploc.city_name 130 | atis_airline 131 | O 132 | B-arrive_date.month_name 133 | B-arrive_date.day_number 134 | B-stoploc.city_name 135 | B-fromloc.city_name 136 | I-arrive_time.time 137 | B-arrive_time.time_relative 138 | O 139 | B-depart_date.day_name 140 | B-arrive_time.time 141 | B-toloc.city_name 142 | B-arrive_time.time 143 | 144 | B-arrive_date.month_name 145 | B-meal_description 146 | I-toloc.city_name 147 | atis_airline 148 | O 149 | B-depart_date.day_number 150 | O 151 | I-round_trip 152 | atis_flight 153 | B-depart_time.time_relative 154 | B-arrive_time.time 155 | 156 | B-arrive_date.month_name 157 | B-round_trip 158 | B-stoploc.city_name 159 | atis_airline 160 | B-depart_time.period_of_day 161 | I-arrive_time.time 162 | B-depart_date.day_number 163 | B-depart_date.day_name 164 | 165 | B-fromloc.city_name 166 | B-arrive_time.time_relative 167 | B-arrive_time.time 168 | B-depart_date.day_name 169 | B-depart_date.month_name 170 | 171 | B-airline_name 172 | B-airline_name 173 | B-depart_date.day_name 174 | B-arrive_date.day_number 175 | atis_flight 176 | 177 | B-arrive_date.month_name 178 | B-fare_amount 179 | I-toloc.city_name 180 | B-depart_time.period_of_day 181 | B-depart_date.day_name 182 | O 183 | B-fromloc.city_name 184 | B-depart_date.day_name 185 | I-fare_amount 186 | B-depart_date.day_name 187 | O 188 | O 189 | B-depart_date.day_name 190 | B-depart_date.day_number 191 | B-arrive_time.time_relative 192 | atis_flight 193 | 194 | -------------------------------------------------------------------------------- /sample/rnn+emb_H-50_O-adam_A-tanh_WR-embedding.test.3.prob: -------------------------------------------------------------------------------- 1 | 3.004134e-02 3.119398e-02 3.085091e-02 3.091323e-02 2.984214e-02 3.033330e-02 3.032658e-02 2.886154e-02 3.024286e-02 3.063466e-02 2.987448e-02 3.107107e-02 3.226705e-02 2.991162e-02 3.196911e-02 3.042180e-02 2.959559e-02 3.054700e-02 3.063351e-02 2.949713e-02 3.039691e-02 3.044371e-02 3.114286e-02 3.048025e-02 2.819052e-02 2.936811e-02 3.053750e-02 2.858530e-02 2.973474e-02 2.992309e-02 3.041556e-02 2.974731e-02 3.200532e-02 2 | 2.822747e-02 2.878842e-02 3.078257e-02 3.081240e-02 2.985657e-02 2.941195e-02 3.178100e-02 2.793164e-02 2.889900e-02 3.245956e-02 3.051860e-02 2.756307e-02 3.038573e-02 3.229151e-02 3.005164e-02 2.978977e-02 3.171821e-02 3.176499e-02 3.064537e-02 3.093446e-02 3.217635e-02 3.057218e-02 3.290017e-02 3.376536e-02 2.932679e-02 3.153845e-02 3.025015e-02 2.950498e-02 2.890016e-02 2.975089e-02 3.107599e-02 2.790898e-02 2.771564e-02 3 | 2.793294e-02 3.254715e-02 3.042792e-02 3.293279e-02 2.829232e-02 3.262677e-02 2.994653e-02 2.828243e-02 3.007885e-02 3.263188e-02 3.059287e-02 2.764908e-02 3.072548e-02 2.845796e-02 2.877599e-02 3.238026e-02 3.226335e-02 3.171226e-02 2.788896e-02 3.151865e-02 2.713785e-02 3.383352e-02 
3.146333e-02 3.174935e-02 2.927863e-02 3.127377e-02 3.130918e-02 2.759310e-02 3.215668e-02 3.207940e-02 2.840343e-02 2.719625e-02 2.886106e-02 4 | 3.470575e-02 3.297131e-02 2.936060e-02 3.028559e-02 2.992747e-02 2.452375e-02 3.252875e-02 3.180924e-02 3.341967e-02 2.618930e-02 2.713006e-02 3.037574e-02 2.849086e-02 3.185276e-02 2.679055e-02 3.025892e-02 3.073436e-02 2.947139e-02 2.657760e-02 2.989597e-02 3.022162e-02 3.032712e-02 3.583748e-02 3.245580e-02 2.928627e-02 2.756197e-02 3.193345e-02 3.077061e-02 3.240412e-02 3.802304e-02 2.770867e-02 3.231928e-02 2.385092e-02 5 | 3.010423e-02 3.116522e-02 3.393754e-02 2.719094e-02 2.784757e-02 2.780046e-02 2.823272e-02 3.144001e-02 3.202245e-02 2.613546e-02 3.172666e-02 2.840193e-02 2.974718e-02 3.353348e-02 2.688164e-02 3.029637e-02 3.312880e-02 2.360081e-02 2.737040e-02 2.934091e-02 2.662244e-02 2.878422e-02 2.468628e-02 2.670618e-02 3.391378e-02 3.523425e-02 3.442551e-02 3.470252e-02 3.090366e-02 3.151172e-02 3.258260e-02 3.401289e-02 3.600914e-02 6 | 3.062391e-02 3.436812e-02 3.837900e-02 3.508039e-02 2.765183e-02 2.808514e-02 2.453276e-02 3.651453e-02 3.334988e-02 2.617079e-02 2.669869e-02 3.821559e-02 3.505602e-02 2.466871e-02 2.809891e-02 2.708342e-02 3.529320e-02 3.079382e-02 2.885969e-02 2.672824e-02 2.827003e-02 2.743560e-02 2.528726e-02 3.188113e-02 3.152018e-02 2.673686e-02 3.214202e-02 2.849918e-02 2.909733e-02 3.397540e-02 3.018185e-02 2.630784e-02 3.241266e-02 7 | 3.567399e-02 2.277907e-02 3.173864e-02 2.995050e-02 2.939554e-02 3.861246e-02 3.070864e-02 2.832275e-02 2.706954e-02 2.749453e-02 2.736670e-02 2.957810e-02 2.905190e-02 3.622874e-02 3.079311e-02 3.350758e-02 2.619273e-02 2.654182e-02 3.159868e-02 2.393453e-02 2.983101e-02 3.320497e-02 3.034232e-02 3.139070e-02 2.964174e-02 3.753864e-02 3.311104e-02 3.473377e-02 3.392381e-02 2.753413e-02 2.914367e-02 2.897602e-02 2.408865e-02 8 | 2.226453e-02 3.660938e-02 2.533905e-02 4.404858e-02 2.412734e-02 2.815391e-02 2.897572e-02 3.210710e-02 3.589643e-02 2.272747e-02 3.123020e-02 2.916636e-02 2.658687e-02 2.421835e-02 2.980051e-02 3.292460e-02 3.742493e-02 3.588267e-02 2.435134e-02 3.174156e-02 2.730253e-02 2.908306e-02 2.947070e-02 3.046222e-02 3.184114e-02 3.184676e-02 2.733613e-02 3.129302e-02 3.126733e-02 3.855357e-02 2.679335e-02 3.103903e-02 3.013427e-02 9 | 3.064071e-02 3.537375e-02 3.095928e-02 3.007151e-02 3.678863e-02 2.594905e-02 3.015155e-02 2.325170e-02 2.752603e-02 2.956673e-02 3.290481e-02 2.469984e-02 2.348173e-02 2.542614e-02 2.761496e-02 3.358619e-02 3.166854e-02 3.478942e-02 2.862461e-02 3.107385e-02 2.961136e-02 3.323683e-02 3.708296e-02 2.842836e-02 3.567589e-02 2.874865e-02 3.609512e-02 2.489546e-02 3.553065e-02 3.926067e-02 3.195270e-02 2.442980e-02 2.090253e-02 10 | 3.005594e-02 2.871219e-02 3.553245e-02 2.597787e-02 3.784054e-02 2.479445e-02 3.053750e-02 3.459884e-02 2.450966e-02 3.251637e-02 3.467564e-02 2.933917e-02 3.288698e-02 3.027609e-02 2.471439e-02 2.521437e-02 3.214680e-02 2.714053e-02 3.390747e-02 2.308057e-02 2.217374e-02 2.795676e-02 3.095306e-02 1.937376e-02 2.881427e-02 3.288879e-02 3.982382e-02 3.106307e-02 3.143143e-02 3.698106e-02 1.899988e-02 4.824043e-02 3.284210e-02 11 | 2.959301e-02 3.184488e-02 4.578681e-02 2.544341e-02 3.108581e-02 2.352527e-02 3.850216e-02 4.394765e-02 3.368938e-02 3.690537e-02 3.452187e-02 3.154786e-02 4.392479e-02 2.450547e-02 2.403665e-02 2.810545e-02 3.281866e-02 2.137456e-02 3.381738e-02 1.818073e-02 1.745615e-02 2.661384e-02 2.451665e-02 3.504731e-02 3.692036e-02 2.716027e-02 3.703839e-02 2.459477e-02 
2.078404e-02 3.922814e-02 2.346303e-02 2.285342e-02 3.116651e-02 12 | 4.773801e-02 2.568872e-02 3.899479e-02 3.939065e-02 2.151597e-02 3.722727e-02 4.985582e-02 3.154624e-02 2.945591e-02 3.346162e-02 2.085542e-02 3.244642e-02 2.371929e-02 3.276433e-02 2.176217e-02 3.194561e-02 3.096820e-02 2.405750e-02 1.934969e-02 2.676851e-02 2.141184e-02 2.662156e-02 3.096448e-02 4.121345e-02 3.519575e-02 2.709081e-02 4.037342e-02 3.066447e-02 3.045685e-02 2.452559e-02 1.853991e-02 3.304016e-02 2.038951e-02 13 | 3.004134e-02 3.119398e-02 3.085091e-02 3.091323e-02 2.984214e-02 3.033330e-02 3.032658e-02 2.886154e-02 3.024286e-02 3.063466e-02 2.987448e-02 3.107107e-02 3.226705e-02 2.991162e-02 3.196911e-02 3.042180e-02 2.959559e-02 3.054700e-02 3.063351e-02 2.949713e-02 3.039691e-02 3.044371e-02 3.114286e-02 3.048025e-02 2.819052e-02 2.936811e-02 3.053750e-02 2.858530e-02 2.973474e-02 2.992309e-02 3.041556e-02 2.974731e-02 3.200532e-02 14 | 3.018645e-02 2.927823e-02 2.866958e-02 3.109285e-02 2.915013e-02 3.544775e-02 3.033483e-02 2.454647e-02 2.756177e-02 3.591517e-02 3.024909e-02 2.982915e-02 3.309145e-02 3.150993e-02 2.996077e-02 3.261216e-02 3.053172e-02 2.901699e-02 3.063589e-02 3.114127e-02 3.233673e-02 2.973992e-02 3.108113e-02 3.233423e-02 3.160774e-02 3.044147e-02 2.920744e-02 2.834997e-02 3.118951e-02 2.619551e-02 2.995985e-02 2.861116e-02 2.818371e-02 15 | 2.763357e-02 3.192290e-02 3.122844e-02 3.482782e-02 3.072981e-02 2.972450e-02 2.708668e-02 2.776238e-02 3.030816e-02 3.510488e-02 3.029198e-02 3.077336e-02 3.127876e-02 3.194928e-02 2.885432e-02 2.604220e-02 2.812142e-02 3.134410e-02 3.111204e-02 2.991384e-02 2.897592e-02 3.296590e-02 3.143790e-02 3.071203e-02 2.833597e-02 3.025285e-02 3.241801e-02 3.212407e-02 2.721396e-02 3.247559e-02 2.888748e-02 2.890337e-02 2.928649e-02 16 | 3.232053e-02 3.152054e-02 3.242011e-02 2.818424e-02 2.612495e-02 2.775914e-02 3.321584e-02 2.970892e-02 2.928287e-02 2.918803e-02 2.482261e-02 2.911771e-02 3.224100e-02 2.889117e-02 3.201513e-02 3.191312e-02 3.309780e-02 3.103746e-02 2.978562e-02 2.918302e-02 3.133425e-02 2.977359e-02 3.221104e-02 3.136820e-02 2.974200e-02 2.889973e-02 3.273079e-02 2.983302e-02 3.439672e-02 3.247597e-02 2.524005e-02 3.445910e-02 2.570573e-02 17 | 2.945203e-02 3.609781e-02 3.539301e-02 3.270465e-02 2.493419e-02 3.089695e-02 2.921865e-02 3.203126e-02 3.210758e-02 2.897872e-02 3.019956e-02 2.879376e-02 3.417580e-02 3.350793e-02 2.466018e-02 2.509847e-02 3.229048e-02 3.356022e-02 2.987020e-02 2.773705e-02 3.078576e-02 2.682303e-02 2.716128e-02 3.242959e-02 3.455111e-02 3.072347e-02 3.126188e-02 2.730485e-02 2.454335e-02 2.927003e-02 3.285282e-02 2.883544e-02 3.174895e-02 18 | 3.188291e-02 3.136250e-02 3.268651e-02 3.605185e-02 2.939405e-02 2.884343e-02 3.218096e-02 3.728912e-02 2.862108e-02 2.750778e-02 2.468254e-02 3.377786e-02 3.378565e-02 3.186986e-02 2.799804e-02 3.081229e-02 3.250704e-02 2.643798e-02 2.737689e-02 2.903240e-02 2.773321e-02 3.095905e-02 2.954095e-02 3.237967e-02 2.658928e-02 2.861664e-02 3.077137e-02 2.587968e-02 2.750740e-02 3.310256e-02 3.071090e-02 2.700530e-02 3.510327e-02 19 | 2.928629e-02 2.616492e-02 2.656315e-02 3.368356e-02 2.801104e-02 3.026148e-02 2.963878e-02 2.808819e-02 2.986597e-02 2.890225e-02 2.948990e-02 2.934233e-02 3.681086e-02 3.558590e-02 3.301920e-02 3.149031e-02 3.769539e-02 3.050625e-02 3.163633e-02 3.386411e-02 3.007520e-02 3.211689e-02 2.867749e-02 3.304697e-02 2.635963e-02 3.077003e-02 2.858448e-02 3.315317e-02 3.126281e-02 2.412865e-02 3.075331e-02 2.590151e-02 2.526362e-02 20 | 3.085114e-02 
3.040451e-02 2.505699e-02 3.266353e-02 2.625522e-02 3.096136e-02 2.658797e-02 2.995024e-02 3.326749e-02 2.451755e-02 2.987655e-02 3.015956e-02 2.494258e-02 2.907085e-02 2.836429e-02 2.529946e-02 3.730674e-02 4.283984e-02 3.143530e-02 2.762953e-02 3.008954e-02 3.297088e-02 3.644277e-02 2.727904e-02 3.124342e-02 2.699939e-02 3.289797e-02 2.922970e-02 3.694997e-02 3.726943e-02 2.624201e-02 3.188713e-02 2.305800e-02 21 | 2.702519e-02 3.372271e-02 3.013022e-02 2.323939e-02 3.438343e-02 2.700485e-02 2.958747e-02 3.990421e-02 3.828875e-02 2.062028e-02 2.931320e-02 2.729828e-02 2.647102e-02 2.206876e-02 2.680775e-02 3.497563e-02 2.670966e-02 3.008127e-02 2.879764e-02 2.557054e-02 2.951841e-02 2.676214e-02 3.176837e-02 2.932320e-02 3.511247e-02 3.355500e-02 2.971837e-02 3.367459e-02 3.853564e-02 4.313561e-02 3.210308e-02 2.690118e-02 2.789176e-02 22 | 2.929034e-02 3.170678e-02 3.703961e-02 3.483669e-02 3.021355e-02 2.764300e-02 2.721770e-02 2.866741e-02 2.857077e-02 3.757941e-02 3.705399e-02 3.133824e-02 2.796909e-02 3.195440e-02 2.387480e-02 3.768298e-02 3.165017e-02 1.961302e-02 2.803648e-02 3.140935e-02 2.593338e-02 2.827461e-02 2.246607e-02 2.721267e-02 2.992169e-02 3.569164e-02 3.058930e-02 3.438922e-02 2.128751e-02 2.687143e-02 3.106736e-02 3.422759e-02 3.871981e-02 23 | 4.001233e-02 2.668072e-02 3.639411e-02 2.960177e-02 4.060642e-02 3.105116e-02 2.980979e-02 3.581090e-02 1.984328e-02 3.843430e-02 2.938767e-02 4.487614e-02 3.695551e-02 2.341416e-02 2.616708e-02 3.232115e-02 2.810388e-02 2.818171e-02 3.787712e-02 2.441446e-02 2.099989e-02 2.963927e-02 2.670326e-02 2.735705e-02 3.060124e-02 2.160960e-02 2.275994e-02 3.161561e-02 2.571846e-02 3.924236e-02 2.722414e-02 2.690056e-02 2.968498e-02 24 | 4.687287e-02 2.356219e-02 3.594452e-02 2.321290e-02 2.579049e-02 4.590384e-02 4.135592e-02 2.980591e-02 3.078842e-02 2.729423e-02 3.621961e-02 2.466890e-02 2.726299e-02 2.899334e-02 3.075258e-02 3.476509e-02 2.424853e-02 2.715723e-02 2.590313e-02 2.543095e-02 2.494142e-02 3.181229e-02 3.058025e-02 2.681694e-02 3.093723e-02 3.274187e-02 3.857065e-02 4.005246e-02 3.189214e-02 2.354548e-02 2.128391e-02 2.832261e-02 2.256916e-02 25 | 2.194461e-02 4.124736e-02 2.435588e-02 4.438284e-02 3.115010e-02 3.768636e-02 3.294131e-02 2.039938e-02 4.033086e-02 3.359599e-02 2.540575e-02 3.547614e-02 3.313397e-02 2.846542e-02 2.755948e-02 2.980168e-02 2.993598e-02 2.483184e-02 2.001365e-02 3.293496e-02 4.228871e-02 2.451741e-02 3.299789e-02 3.389364e-02 2.563312e-02 3.596175e-02 2.515643e-02 2.753444e-02 2.358898e-02 3.211554e-02 2.180485e-02 2.949364e-02 2.942004e-02 26 | 2.762218e-02 3.111372e-02 2.831301e-02 3.624086e-02 3.813234e-02 2.701299e-02 3.736423e-02 2.804727e-02 2.434576e-02 3.832078e-02 2.186616e-02 3.055990e-02 2.382838e-02 3.473296e-02 2.148812e-02 1.890521e-02 2.862326e-02 3.978999e-02 2.060360e-02 3.622079e-02 2.742896e-02 2.216468e-02 3.929709e-02 3.282902e-02 3.478480e-02 3.898400e-02 3.690739e-02 2.891959e-02 2.684144e-02 3.890672e-02 2.718708e-02 2.621563e-02 2.640206e-02 27 | 2.162767e-02 2.448085e-02 5.170893e-02 1.677379e-02 3.034162e-02 1.748680e-02 4.072605e-02 3.294906e-02 2.295104e-02 2.562233e-02 2.181284e-02 2.123519e-02 3.078662e-02 4.082559e-02 3.743562e-02 2.086847e-02 4.060141e-02 3.268153e-02 4.160117e-02 2.989939e-02 1.893869e-02 4.006459e-02 3.378570e-02 2.657725e-02 2.384832e-02 3.152379e-02 3.531948e-02 3.055557e-02 3.111256e-02 3.428749e-02 2.048599e-02 4.174454e-02 2.934006e-02 28 | 4.634825e-02 3.050811e-02 3.316239e-02 2.734362e-02 2.343151e-02 2.806578e-02 3.425855e-02 
5.171246e-02 3.278581e-02 2.629998e-02 2.438738e-02 2.445674e-02 4.189255e-02 2.910341e-02 1.788927e-02 1.741002e-02 3.045228e-02 3.652563e-02 3.254743e-02 2.004405e-02 2.682751e-02 2.008283e-02 3.136205e-02 4.906934e-02 4.984668e-02 2.114299e-02 3.860452e-02 1.588921e-02 3.731396e-02 2.712015e-02 2.503797e-02 2.987715e-02 1.920041e-02 29 | 4.912116e-02 3.328518e-02 2.636600e-02 4.619856e-02 2.528439e-02 3.808101e-02 2.486891e-02 3.629754e-02 4.292456e-02 2.280750e-02 2.560245e-02 3.742089e-02 3.079371e-02 3.729020e-02 1.891075e-02 2.775151e-02 1.845511e-02 2.207018e-02 1.624242e-02 2.784104e-02 2.085061e-02 1.911631e-02 2.168416e-02 2.885042e-02 4.368662e-02 3.578527e-02 4.293229e-02 1.913957e-02 2.825729e-02 2.588827e-02 3.351733e-02 2.759172e-02 4.508711e-02 30 | 3.037782e-02 2.532321e-02 2.113753e-02 5.055937e-02 3.480752e-02 2.544366e-02 2.308302e-02 3.062758e-02 3.192441e-02 2.079120e-02 1.828351e-02 3.582903e-02 3.727162e-02 4.004978e-02 5.159304e-02 2.181093e-02 4.556130e-02 2.555982e-02 2.646527e-02 3.918684e-02 3.907140e-02 2.554721e-02 2.564438e-02 1.960625e-02 2.471851e-02 2.108739e-02 2.783746e-02 2.479988e-02 2.339404e-02 2.266478e-02 4.042095e-02 3.254750e-02 3.697384e-02 31 | 1.727880e-02 1.657210e-02 2.611855e-02 2.277201e-02 3.373438e-02 2.183002e-02 2.402849e-02 2.696115e-02 2.984914e-02 2.902023e-02 3.020047e-02 3.383697e-02 3.135763e-02 2.843996e-02 5.216906e-02 1.847618e-02 3.063646e-02 6.628335e-02 4.044438e-02 2.350748e-02 3.285947e-02 2.697525e-02 4.272687e-02 3.251084e-02 2.160418e-02 2.678977e-02 4.036543e-02 3.551482e-02 4.929612e-02 2.007638e-02 3.202597e-02 2.423434e-02 1.150374e-02 32 | 1.891123e-02 3.066058e-02 2.349040e-02 1.618382e-02 3.180505e-02 2.987960e-02 2.394860e-02 4.091803e-02 5.154884e-02 1.962266e-02 2.445928e-02 2.667489e-02 1.758710e-02 1.366637e-02 2.645482e-02 3.180812e-02 2.359734e-02 3.912097e-02 2.050859e-02 1.781905e-02 2.962157e-02 3.257982e-02 3.340350e-02 2.467619e-02 4.068827e-02 5.735523e-02 3.725545e-02 3.273973e-02 5.381325e-02 5.474475e-02 2.341099e-02 2.818734e-02 2.285853e-02 33 | 2.160672e-02 5.084448e-02 3.738050e-02 4.296282e-02 3.386750e-02 1.663742e-02 2.262830e-02 3.059659e-02 2.721931e-02 4.227635e-02 2.797109e-02 2.573515e-02 3.123043e-02 2.416324e-02 2.537723e-02 4.837451e-02 2.041978e-02 2.612107e-02 2.152517e-02 2.996137e-02 2.694140e-02 2.899979e-02 2.200829e-02 2.783546e-02 3.954752e-02 2.460130e-02 2.453155e-02 2.858471e-02 2.060392e-02 3.089275e-02 3.686258e-02 4.147594e-02 4.021576e-02 34 | 4.022866e-02 2.943283e-02 3.712118e-02 3.216286e-02 2.815995e-02 1.472494e-02 3.380162e-02 4.392273e-02 1.098588e-02 5.675025e-02 4.105548e-02 2.477677e-02 2.925545e-02 3.414729e-02 2.555933e-02 4.758980e-02 2.777582e-02 1.751467e-02 3.023151e-02 2.632653e-02 1.301556e-02 4.576737e-02 2.839786e-02 2.872253e-02 2.192223e-02 2.454143e-02 2.563197e-02 3.032305e-02 1.411061e-02 2.674464e-02 4.244883e-02 3.093003e-02 3.592034e-02 35 | 4.758390e-02 2.808748e-02 2.943400e-02 1.734520e-02 3.118216e-02 3.853993e-02 3.524241e-02 3.660214e-02 2.144705e-02 2.932422e-02 5.780961e-02 3.835200e-02 4.329935e-02 2.191984e-02 3.341656e-02 3.575474e-02 2.549395e-02 2.689305e-02 3.468677e-02 2.869618e-02 1.909441e-02 4.249540e-02 2.086086e-02 2.259583e-02 2.101000e-02 1.551777e-02 3.127016e-02 3.870513e-02 2.330220e-02 2.875725e-02 2.334264e-02 2.057177e-02 3.136593e-02 36 | 3.775497e-02 2.948022e-02 2.921804e-02 3.027292e-02 2.428601e-02 6.224381e-02 3.351454e-02 1.759991e-02 3.799369e-02 3.736756e-02 2.307969e-02 3.419995e-02 3.384221e-02 
3.284943e-02 2.315690e-02 3.162923e-02 2.278221e-02 3.509182e-02 1.334564e-02 3.137209e-02 5.661825e-02 2.227828e-02 2.781279e-02 2.953660e-02 2.924650e-02 3.181913e-02 3.933879e-02 2.969239e-02 2.434096e-02 1.895173e-02 1.964203e-02 2.574676e-02 2.389492e-02 37 | 1.585626e-02 3.832551e-02 1.829554e-02 3.715382e-02 2.834496e-02 3.688532e-02 4.174383e-02 3.157874e-02 4.285241e-02 2.921046e-02 1.815275e-02 2.599771e-02 2.424609e-02 4.025086e-02 2.072527e-02 2.636459e-02 2.468792e-02 2.604771e-02 1.742942e-02 3.345484e-02 2.945389e-02 2.442680e-02 4.956239e-02 4.397409e-02 2.229228e-02 5.103492e-02 1.845866e-02 3.344288e-02 2.548143e-02 3.104283e-02 3.579512e-02 2.460479e-02 3.282591e-02 38 | 1.607407e-02 2.497154e-02 3.335917e-02 1.874482e-02 2.594315e-02 2.230553e-02 3.918767e-02 2.786226e-02 2.095372e-02 2.627379e-02 1.909019e-02 2.334428e-02 1.855161e-02 3.996518e-02 2.838048e-02 2.208599e-02 3.667561e-02 3.595819e-02 4.198651e-02 4.745903e-02 3.738758e-02 3.587349e-02 3.655360e-02 3.326850e-02 3.123354e-02 2.774125e-02 2.958417e-02 4.648591e-02 3.300425e-02 3.707493e-02 2.813902e-02 3.749363e-02 1.698730e-02 39 | 2.576811e-02 3.158446e-02 6.274466e-02 1.776018e-02 2.539100e-02 3.121154e-02 2.428637e-02 3.618658e-02 2.133521e-02 2.129634e-02 1.699775e-02 2.727086e-02 4.645193e-02 4.030643e-02 3.135370e-02 2.032660e-02 2.873796e-02 4.056865e-02 4.145598e-02 1.928837e-02 2.377206e-02 3.129129e-02 2.360886e-02 3.243913e-02 3.531227e-02 2.591062e-02 2.988383e-02 2.573519e-02 4.557812e-02 3.012028e-02 2.400586e-02 3.953389e-02 2.248592e-02 40 | 6.235188e-02 3.037025e-02 2.394300e-02 3.733540e-02 2.285632e-02 3.579649e-02 3.235520e-02 5.647637e-02 3.917430e-02 1.409172e-02 2.408120e-02 2.958765e-02 4.151288e-02 3.230520e-02 1.551952e-02 1.894939e-02 2.744443e-02 2.385731e-02 3.022514e-02 2.677616e-02 2.689555e-02 1.533941e-02 1.830665e-02 3.360064e-02 4.764276e-02 1.879687e-02 1.782857e-02 1.389762e-02 2.832686e-02 1.916177e-02 3.821053e-02 3.378854e-02 6.319440e-02 41 | 2.659892e-02 2.360025e-02 1.840827e-02 5.978526e-02 2.892378e-02 5.672064e-02 2.481680e-02 2.203439e-02 3.308366e-02 2.331251e-02 2.865695e-02 2.897970e-02 3.052743e-02 3.290488e-02 3.546243e-02 3.249085e-02 3.285148e-02 1.867616e-02 2.945223e-02 5.677499e-02 3.652447e-02 2.982997e-02 2.064356e-02 1.652624e-02 3.005360e-02 2.100502e-02 2.907126e-02 1.342600e-02 2.127503e-02 1.169555e-02 6.138149e-02 2.111077e-02 4.339549e-02 42 | 2.101767e-02 2.155050e-02 1.635581e-02 4.244146e-02 4.679413e-02 3.467686e-02 2.647595e-02 2.580500e-02 3.235250e-02 3.750219e-02 2.322129e-02 3.420227e-02 3.480739e-02 2.354092e-02 5.339630e-02 2.756959e-02 3.220699e-02 2.655398e-02 3.480186e-02 3.519355e-02 3.468856e-02 3.581430e-02 4.927013e-02 2.375660e-02 1.162332e-02 2.765301e-02 3.327675e-02 2.566635e-02 3.673144e-02 2.441552e-02 1.879978e-02 3.127887e-02 1.655914e-02 43 | 1.701812e-02 1.987979e-02 1.964715e-02 1.021276e-02 2.750976e-02 1.954063e-02 2.697478e-02 2.729442e-02 5.572468e-02 1.945943e-02 3.018270e-02 3.794732e-02 2.401305e-02 2.081198e-02 4.167736e-02 2.437441e-02 2.861684e-02 4.199947e-02 4.348573e-02 2.059186e-02 3.616784e-02 3.599356e-02 3.396970e-02 2.595139e-02 3.174070e-02 3.632733e-02 3.255470e-02 5.228552e-02 4.467082e-02 4.882663e-02 1.730669e-02 3.328688e-02 1.395597e-02 44 | 2.282225e-02 5.364570e-02 4.922246e-02 2.407004e-02 2.506665e-02 1.901655e-02 2.642203e-02 2.766432e-02 3.940686e-02 3.036031e-02 2.486819e-02 2.903922e-02 2.006847e-02 1.346664e-02 1.669539e-02 4.016634e-02 3.350943e-02 2.021608e-02 1.846963e-02 
2.267607e-02 4.374678e-02 2.256788e-02 1.974674e-02 3.369814e-02 5.302842e-02 3.052200e-02 2.278080e-02 3.032940e-02 2.488422e-02 3.825167e-02 3.936876e-02 3.669406e-02 4.750852e-02 45 | 4.934695e-02 4.631552e-02 4.084899e-02 3.663292e-02 4.394937e-02 1.563858e-02 2.181814e-02 2.738731e-02 1.039489e-02 6.249004e-02 3.485661e-02 2.849957e-02 2.749227e-02 2.307969e-02 2.164051e-02 4.337513e-02 1.748590e-02 1.629155e-02 3.799037e-02 2.802492e-02 1.838702e-02 3.568363e-02 2.365007e-02 3.666364e-02 1.630557e-02 2.384254e-02 2.018796e-02 2.479092e-02 1.421103e-02 2.557191e-02 5.096109e-02 1.785431e-02 5.833104e-02 46 | 5.616021e-02 1.456377e-02 1.913245e-02 2.314518e-02 2.260360e-02 4.337950e-02 2.484645e-02 1.914794e-02 1.161368e-02 4.345192e-02 7.272174e-02 2.976154e-02 3.096373e-02 3.236099e-02 4.686906e-02 2.964364e-02 2.559528e-02 2.338960e-02 3.886578e-02 3.329054e-02 2.289963e-02 3.522021e-02 2.840085e-02 1.852222e-02 1.963134e-02 3.467862e-02 3.953812e-02 4.847578e-02 1.988762e-02 2.016532e-02 2.254862e-02 2.249089e-02 2.603422e-02 47 | 3.125184e-02 2.830666e-02 3.823224e-02 3.578204e-02 2.080829e-02 5.785022e-02 3.433022e-02 2.030757e-02 4.273864e-02 3.518197e-02 3.149353e-02 3.082225e-02 3.607417e-02 1.973031e-02 2.630945e-02 3.430487e-02 2.456302e-02 5.169009e-02 1.673147e-02 3.346379e-02 3.000174e-02 1.722240e-02 3.619989e-02 3.752923e-02 2.417797e-02 2.003140e-02 3.279641e-02 2.751822e-02 1.862280e-02 3.330079e-02 1.845513e-02 2.614279e-02 2.802864e-02 48 | 3.004134e-02 3.119398e-02 3.085091e-02 3.091323e-02 2.984214e-02 3.033330e-02 3.032658e-02 2.886154e-02 3.024286e-02 3.063466e-02 2.987448e-02 3.107107e-02 3.226705e-02 2.991162e-02 3.196911e-02 3.042180e-02 2.959559e-02 3.054700e-02 3.063351e-02 2.949713e-02 3.039691e-02 3.044371e-02 3.114286e-02 3.048025e-02 2.819052e-02 2.936811e-02 3.053750e-02 2.858530e-02 2.973474e-02 2.992309e-02 3.041556e-02 2.974731e-02 3.200532e-02 49 | 2.871053e-02 2.924470e-02 2.930849e-02 3.153962e-02 2.805488e-02 3.033383e-02 3.342623e-02 3.028443e-02 3.106664e-02 3.218884e-02 3.216029e-02 2.904452e-02 3.076288e-02 3.092094e-02 3.033504e-02 3.033621e-02 3.163766e-02 3.057523e-02 3.174340e-02 2.891628e-02 3.234330e-02 2.839275e-02 3.069116e-02 3.391330e-02 2.894550e-02 3.177516e-02 3.052361e-02 2.869605e-02 2.809470e-02 2.856377e-02 2.944339e-02 2.875680e-02 2.926984e-02 50 | 2.839502e-02 3.183948e-02 3.154563e-02 3.159309e-02 2.868487e-02 2.926593e-02 2.981498e-02 2.693304e-02 2.992951e-02 3.214935e-02 2.912963e-02 3.383420e-02 3.098219e-02 3.206198e-02 3.103289e-02 3.110184e-02 2.941988e-02 3.042205e-02 2.964166e-02 2.919167e-02 2.973337e-02 3.415262e-02 3.016049e-02 3.326547e-02 2.995504e-02 2.954678e-02 3.074851e-02 2.813905e-02 3.108641e-02 3.238882e-02 2.929128e-02 2.654641e-02 2.801681e-02 51 | 3.273794e-02 2.890844e-02 3.034232e-02 3.023416e-02 2.903325e-02 3.210270e-02 2.895669e-02 2.986335e-02 2.784623e-02 2.955043e-02 2.546348e-02 2.763059e-02 3.247997e-02 3.147051e-02 2.889644e-02 2.822335e-02 3.008223e-02 2.994040e-02 2.961716e-02 2.894417e-02 3.335924e-02 2.928750e-02 3.109134e-02 3.013667e-02 3.150956e-02 3.177168e-02 3.227231e-02 3.300716e-02 3.299774e-02 3.067396e-02 3.066459e-02 3.278973e-02 2.811467e-02 52 | 2.598969e-02 3.592102e-02 3.580574e-02 2.984766e-02 2.616662e-02 2.754703e-02 2.981507e-02 3.271754e-02 3.186956e-02 2.492958e-02 3.075938e-02 2.784025e-02 3.101307e-02 2.992468e-02 3.126318e-02 3.070878e-02 3.301640e-02 3.320439e-02 3.151543e-02 3.017991e-02 2.695728e-02 2.838375e-02 3.032358e-02 3.119033e-02 2.820341e-02 
2.911579e-02 3.181282e-02 2.927946e-02 2.834606e-02 3.316208e-02 3.240814e-02 2.945141e-02 3.133093e-02 53 | 3.223367e-02 3.128063e-02 3.166748e-02 3.461849e-02 2.924590e-02 3.439530e-02 3.086153e-02 3.047349e-02 2.988498e-02 2.983838e-02 2.666307e-02 3.320829e-02 2.919004e-02 2.810899e-02 2.554317e-02 2.917983e-02 3.311753e-02 2.994955e-02 2.693743e-02 3.225978e-02 3.544435e-02 2.761055e-02 2.920157e-02 3.088176e-02 3.269698e-02 3.198535e-02 3.175851e-02 2.692307e-02 2.914647e-02 2.883358e-02 3.147005e-02 2.646659e-02 2.892365e-02 54 | 2.648748e-02 2.761327e-02 3.520131e-02 3.200027e-02 2.607406e-02 3.299111e-02 3.015064e-02 3.377707e-02 2.746401e-02 3.012897e-02 2.653332e-02 3.361532e-02 4.026319e-02 3.481979e-02 3.235872e-02 2.979149e-02 2.874519e-02 3.238951e-02 3.048719e-02 2.563975e-02 2.281189e-02 3.452273e-02 2.767844e-02 2.600193e-02 2.635305e-02 3.317495e-02 3.301983e-02 3.098264e-02 3.006880e-02 3.102015e-02 2.575281e-02 3.111252e-02 3.096859e-02 55 | 2.657967e-02 2.738216e-02 3.167849e-02 2.721888e-02 2.500674e-02 2.797003e-02 3.618582e-02 3.256671e-02 3.193377e-02 2.591138e-02 2.686493e-02 2.826027e-02 3.404257e-02 2.992919e-02 2.643165e-02 3.374954e-02 4.085137e-02 2.670148e-02 3.124238e-02 2.671910e-02 2.788678e-02 3.316609e-02 3.514298e-02 3.833491e-02 3.317172e-02 2.899422e-02 2.682444e-02 3.351775e-02 3.026002e-02 3.302059e-02 2.861454e-02 2.858986e-02 2.525010e-02 56 | 3.360681e-02 4.053175e-02 2.933508e-02 3.654182e-02 2.680189e-02 3.267756e-02 2.852295e-02 2.729983e-02 3.337889e-02 2.734170e-02 3.018166e-02 3.173970e-02 2.889311e-02 2.569534e-02 2.355457e-02 3.417137e-02 3.334586e-02 3.016909e-02 2.574342e-02 2.813026e-02 2.749220e-02 3.538387e-02 2.685595e-02 3.505570e-02 3.193557e-02 2.652502e-02 3.332852e-02 2.970986e-02 3.535070e-02 2.903152e-02 2.870947e-02 2.546504e-02 2.749394e-02 57 | 3.057775e-02 3.483967e-02 2.812658e-02 3.675995e-02 3.159223e-02 2.517034e-02 2.516648e-02 2.686214e-02 2.595758e-02 3.137676e-02 3.180722e-02 2.890658e-02 2.983684e-02 3.093295e-02 2.917615e-02 2.811252e-02 3.405187e-02 3.007175e-02 3.357470e-02 2.569444e-02 3.510190e-02 3.147725e-02 2.975567e-02 2.264161e-02 2.613463e-02 3.429308e-02 3.329515e-02 2.670890e-02 2.512898e-02 4.748602e-02 2.488745e-02 3.400844e-02 3.048645e-02 58 | 2.822962e-02 2.564732e-02 3.423639e-02 2.376743e-02 3.039256e-02 2.969740e-02 3.640401e-02 3.606891e-02 2.431969e-02 2.484550e-02 3.551844e-02 2.582654e-02 3.153616e-02 2.914041e-02 3.270436e-02 3.211799e-02 3.924358e-02 2.961093e-02 4.034430e-02 2.891416e-02 2.025862e-02 2.494203e-02 2.854778e-02 2.858362e-02 3.039186e-02 3.079515e-02 3.721655e-02 2.895212e-02 3.034090e-02 3.812799e-02 2.679125e-02 2.599320e-02 3.049321e-02 59 | 3.397901e-02 2.962212e-02 4.795554e-02 3.816809e-02 3.171732e-02 3.349530e-02 3.701771e-02 3.051849e-02 3.813245e-02 3.497681e-02 2.987760e-02 3.537003e-02 2.328079e-02 2.388804e-02 1.617648e-02 2.399280e-02 3.325745e-02 2.694265e-02 2.569068e-02 3.225883e-02 2.963337e-02 2.211843e-02 3.371489e-02 3.537962e-02 3.229194e-02 2.900208e-02 3.353196e-02 3.313641e-02 2.275364e-02 3.012687e-02 2.272786e-02 2.622263e-02 2.304212e-02 60 | 3.004134e-02 3.119398e-02 3.085091e-02 3.091323e-02 2.984214e-02 3.033330e-02 3.032658e-02 2.886154e-02 3.024286e-02 3.063466e-02 2.987448e-02 3.107107e-02 3.226705e-02 2.991162e-02 3.196911e-02 3.042180e-02 2.959559e-02 3.054700e-02 3.063351e-02 2.949713e-02 3.039691e-02 3.044371e-02 3.114286e-02 3.048025e-02 2.819052e-02 2.936811e-02 3.053750e-02 2.858530e-02 2.973474e-02 2.992309e-02 3.041556e-02 
2.974731e-02 3.200532e-02 61 | 2.977731e-02 2.886226e-02 2.791992e-02 3.208046e-02 2.947209e-02 3.098234e-02 2.948610e-02 2.723129e-02 3.013649e-02 3.334088e-02 3.012824e-02 2.885517e-02 3.106648e-02 3.197315e-02 2.928467e-02 3.125177e-02 3.056177e-02 2.952654e-02 2.929679e-02 3.183695e-02 3.347115e-02 3.024124e-02 3.154963e-02 3.215455e-02 2.998182e-02 3.321787e-02 3.062390e-02 2.915514e-02 2.982903e-02 2.924227e-02 3.081282e-02 2.840297e-02 2.824699e-02 62 | 2.873900e-02 3.107043e-02 2.956240e-02 3.002914e-02 2.872878e-02 3.029898e-02 2.652148e-02 2.584418e-02 3.015218e-02 2.980809e-02 3.142757e-02 3.074698e-02 2.994578e-02 3.054414e-02 3.129027e-02 3.035436e-02 3.152347e-02 3.266351e-02 2.894277e-02 3.122672e-02 2.821919e-02 3.131351e-02 2.901906e-02 2.812848e-02 3.138365e-02 3.094932e-02 3.246535e-02 3.134579e-02 3.507769e-02 3.330567e-02 2.874846e-02 3.077417e-02 2.984947e-02 63 | 3.067695e-02 2.926269e-02 3.581980e-02 2.729941e-02 2.919994e-02 2.753162e-02 2.891537e-02 3.020072e-02 2.801972e-02 2.988776e-02 2.673970e-02 2.566752e-02 2.966436e-02 3.011222e-02 2.948678e-02 2.912710e-02 3.165070e-02 3.104774e-02 3.412312e-02 2.642858e-02 3.307384e-02 2.927962e-02 3.425935e-02 3.117618e-02 3.059408e-02 3.087862e-02 3.348306e-02 3.146178e-02 3.209310e-02 3.279207e-02 2.959661e-02 3.387557e-02 2.657434e-02 64 | 2.717016e-02 3.187152e-02 3.138461e-02 2.950835e-02 2.531324e-02 3.023738e-02 3.125119e-02 3.472609e-02 3.068764e-02 2.743194e-02 3.792156e-02 2.688672e-02 3.074521e-02 2.940698e-02 2.843507e-02 2.997055e-02 3.171847e-02 3.290918e-02 3.691945e-02 2.656664e-02 2.645133e-02 2.806317e-02 2.849060e-02 3.176601e-02 3.180068e-02 3.153491e-02 2.833126e-02 2.909472e-02 3.273548e-02 2.974998e-02 2.989291e-02 2.816074e-02 3.286630e-02 65 | 3.283557e-02 3.121123e-02 3.248948e-02 4.120284e-02 3.054526e-02 3.291282e-02 3.056566e-02 3.025464e-02 2.729407e-02 3.147893e-02 3.015393e-02 3.274910e-02 2.855152e-02 3.098628e-02 2.659444e-02 3.023457e-02 3.432797e-02 3.311365e-02 3.190451e-02 2.771082e-02 3.000299e-02 2.705066e-02 2.791091e-02 3.012031e-02 3.478666e-02 2.608966e-02 2.721740e-02 2.373058e-02 2.670934e-02 3.383705e-02 2.878474e-02 2.671007e-02 2.993228e-02 66 | 3.473162e-02 2.698721e-02 3.114052e-02 3.421311e-02 3.535390e-02 3.179418e-02 3.358389e-02 2.916169e-02 2.528383e-02 2.869687e-02 2.952461e-02 2.986031e-02 3.527246e-02 4.049442e-02 3.523333e-02 3.121645e-02 2.978850e-02 3.105561e-02 3.472974e-02 2.709110e-02 2.470151e-02 3.198513e-02 3.043734e-02 2.769856e-02 2.554883e-02 2.705403e-02 2.734394e-02 2.837045e-02 3.374780e-02 2.787614e-02 2.576705e-02 2.708742e-02 2.716844e-02 67 | 2.755702e-02 2.877032e-02 2.560913e-02 3.658555e-02 2.535517e-02 2.673520e-02 3.184338e-02 3.126853e-02 4.332742e-02 2.641315e-02 3.377796e-02 2.887568e-02 3.201223e-02 3.081468e-02 2.977094e-02 2.615449e-02 4.210678e-02 2.997456e-02 2.902972e-02 2.983158e-02 3.148678e-02 2.761073e-02 2.924080e-02 3.234274e-02 3.202846e-02 3.173734e-02 3.200531e-02 2.930418e-02 3.019146e-02 2.955894e-02 2.534883e-02 2.882775e-02 2.450326e-02 68 | 3.337729e-02 3.630005e-02 3.302596e-02 3.462268e-02 3.344198e-02 2.799714e-02 2.937143e-02 2.779314e-02 3.576429e-02 2.851846e-02 2.349502e-02 3.253772e-02 3.122849e-02 2.793035e-02 2.818720e-02 3.082608e-02 3.074184e-02 2.812172e-02 2.522459e-02 2.670767e-02 2.783690e-02 3.013551e-02 3.021729e-02 2.972093e-02 3.306282e-02 2.723453e-02 3.527701e-02 2.235514e-02 3.678245e-02 4.190771e-02 2.550202e-02 2.901749e-02 2.573711e-02 69 | 3.443472e-02 2.866038e-02 3.708524e-02 3.708379e-02 
2.999736e-02 2.086170e-02 2.323204e-02 2.940547e-02 2.915690e-02 3.321429e-02 2.796691e-02 3.086662e-02 3.452014e-02 3.361003e-02 2.475526e-02 2.118413e-02 3.072724e-02 2.761967e-02 2.945493e-02 2.957930e-02 2.718620e-02 2.803853e-02 2.681302e-02 2.347490e-02 3.201007e-02 4.058034e-02 3.676846e-02 3.472947e-02 2.737984e-02 3.550624e-02 2.383696e-02 3.327926e-02 3.698057e-02 70 | 3.377468e-02 3.341810e-02 4.030357e-02 3.521587e-02 2.390410e-02 2.260777e-02 2.841073e-02 3.658341e-02 2.934604e-02 2.828478e-02 2.969423e-02 3.109437e-02 3.890989e-02 2.823376e-02 3.209720e-02 2.913743e-02 3.353970e-02 2.350142e-02 2.911906e-02 2.410606e-02 1.940200e-02 3.198614e-02 2.961129e-02 3.269072e-02 2.942668e-02 2.850499e-02 4.168960e-02 2.569480e-02 2.501058e-02 4.052792e-02 2.992991e-02 2.542311e-02 2.882009e-02 71 | 4.349646e-02 2.972582e-02 4.349925e-02 3.598770e-02 2.926964e-02 2.546207e-02 3.059242e-02 2.516774e-02 2.736856e-02 3.372046e-02 2.793078e-02 4.091462e-02 3.016441e-02 2.389285e-02 1.951974e-02 3.063061e-02 2.426882e-02 3.476242e-02 2.052733e-02 2.906447e-02 2.938721e-02 2.166630e-02 2.845600e-02 4.291688e-02 3.388124e-02 2.873738e-02 3.956675e-02 3.546362e-02 2.548533e-02 3.203216e-02 2.402622e-02 2.669142e-02 2.572338e-02 72 | 3.362400e-02 3.860085e-02 3.312375e-02 3.780926e-02 2.717517e-02 4.208337e-02 2.553661e-02 2.625141e-02 3.052490e-02 3.232969e-02 3.034812e-02 2.793659e-02 3.386334e-02 2.721034e-02 2.531944e-02 2.718352e-02 2.457389e-02 3.401104e-02 1.837615e-02 2.243741e-02 2.607514e-02 3.423148e-02 3.489636e-02 2.159118e-02 3.379761e-02 4.294664e-02 3.893685e-02 2.489812e-02 2.678272e-02 3.863538e-02 2.703465e-02 2.162261e-02 3.023241e-02 73 | 2.233459e-02 3.055096e-02 3.305923e-02 3.096210e-02 3.068402e-02 2.073546e-02 3.711830e-02 3.251237e-02 2.735149e-02 3.159315e-02 2.783665e-02 2.591441e-02 3.937734e-02 2.796994e-02 2.366027e-02 2.789629e-02 3.391949e-02 3.612270e-02 2.571134e-02 2.591415e-02 3.351949e-02 2.198822e-02 4.227774e-02 3.757811e-02 2.838750e-02 3.095851e-02 3.769227e-02 3.025519e-02 2.867527e-02 3.946863e-02 2.450135e-02 3.194848e-02 2.152503e-02 74 | 2.285910e-02 3.623389e-02 5.277396e-02 2.340642e-02 2.788138e-02 2.671199e-02 3.519811e-02 3.242207e-02 2.408691e-02 3.112787e-02 2.060760e-02 2.596762e-02 2.900078e-02 3.636920e-02 1.969521e-02 3.099524e-02 3.999023e-02 2.903283e-02 2.239095e-02 2.022650e-02 2.031916e-02 3.705289e-02 3.713486e-02 3.281229e-02 2.946637e-02 4.130446e-02 5.089360e-02 2.909613e-02 3.448470e-02 3.476553e-02 2.178775e-02 2.712132e-02 1.678304e-02 75 | 3.366490e-02 4.418406e-02 3.812345e-02 2.869829e-02 2.250065e-02 1.983933e-02 3.367353e-02 4.703327e-02 3.347857e-02 2.374917e-02 2.110348e-02 3.332767e-02 4.285628e-02 2.757440e-02 2.041613e-02 2.210070e-02 3.761486e-02 3.520066e-02 2.484282e-02 1.467766e-02 1.999498e-02 2.778175e-02 2.435677e-02 3.103751e-02 4.121816e-02 2.569976e-02 3.606714e-02 2.441023e-02 4.079735e-02 4.183761e-02 1.812417e-02 3.856940e-02 2.544528e-02 76 | 2.987808e-02 2.937032e-02 4.572017e-02 2.871099e-02 1.762138e-02 2.645736e-02 4.045605e-02 4.747531e-02 2.470224e-02 2.243585e-02 1.961176e-02 2.443607e-02 3.071864e-02 4.970659e-02 2.485905e-02 3.162277e-02 3.015268e-02 2.445937e-02 2.358265e-02 1.571252e-02 1.510913e-02 2.726017e-02 2.618091e-02 3.416169e-02 4.755614e-02 2.998823e-02 6.146822e-02 2.101944e-02 2.690977e-02 3.601107e-02 3.401009e-02 3.203373e-02 2.060148e-02 77 | 2.547374e-02 3.308173e-02 3.150493e-02 4.749280e-02 3.035249e-02 3.399793e-02 3.647814e-02 4.184719e-02 2.978694e-02 2.469838e-02 
2.442828e-02 3.314342e-02 3.672632e-02 3.720918e-02 2.007238e-02 2.297871e-02 3.355104e-02 1.814580e-02 2.114903e-02 3.709893e-02 1.955914e-02 2.947569e-02 2.388784e-02 3.151692e-02 2.644419e-02 2.896383e-02 3.615439e-02 3.425021e-02 2.612016e-02 2.760285e-02 3.250704e-02 2.252566e-02 4.177471e-02 78 | 2.430006e-02 2.730570e-02 2.212746e-02 2.665882e-02 3.371208e-02 3.802495e-02 2.633821e-02 2.853054e-02 3.638140e-02 2.178260e-02 1.779297e-02 4.057428e-02 3.660741e-02 3.245040e-02 3.583269e-02 2.381378e-02 3.330638e-02 5.430618e-02 2.685780e-02 2.738776e-02 2.847027e-02 2.523164e-02 3.358061e-02 3.050900e-02 3.208102e-02 2.278894e-02 4.545615e-02 3.353367e-02 3.192232e-02 3.278940e-02 2.991687e-02 2.505532e-02 1.457339e-02 79 | 1.884398e-02 2.159756e-02 2.684647e-02 1.869745e-02 3.443676e-02 4.341093e-02 3.716777e-02 4.520481e-02 3.944678e-02 1.724195e-02 1.960580e-02 2.375277e-02 2.456945e-02 3.394177e-02 2.408625e-02 1.756180e-02 2.901807e-02 3.767814e-02 2.907296e-02 2.054902e-02 3.670420e-02 2.127493e-02 3.417406e-02 2.760172e-02 4.550337e-02 3.518325e-02 3.203743e-02 4.107690e-02 3.461481e-02 3.752544e-02 2.444031e-02 4.927271e-02 1.786041e-02 80 | 1.286300e-02 3.973183e-02 4.028470e-02 3.439001e-02 2.815191e-02 3.417129e-02 3.564652e-02 8.558726e-02 5.151928e-02 1.681146e-02 1.389643e-02 2.282394e-02 2.692225e-02 1.449343e-02 1.868835e-02 2.152963e-02 3.147703e-02 2.738003e-02 2.967888e-02 3.433961e-02 3.462855e-02 2.270555e-02 2.743362e-02 2.764243e-02 3.446596e-02 2.530672e-02 2.585005e-02 2.747276e-02 2.312962e-02 4.008126e-02 3.110747e-02 3.226938e-02 2.751972e-02 81 | 3.004134e-02 3.119398e-02 3.085091e-02 3.091323e-02 2.984214e-02 3.033330e-02 3.032658e-02 2.886154e-02 3.024286e-02 3.063466e-02 2.987448e-02 3.107107e-02 3.226705e-02 2.991162e-02 3.196911e-02 3.042180e-02 2.959559e-02 3.054700e-02 3.063351e-02 2.949713e-02 3.039691e-02 3.044371e-02 3.114286e-02 3.048025e-02 2.819052e-02 2.936811e-02 3.053750e-02 2.858530e-02 2.973474e-02 2.992309e-02 3.041556e-02 2.974731e-02 3.200532e-02 82 | 2.979474e-02 2.896291e-02 3.042269e-02 3.147914e-02 2.949982e-02 3.100962e-02 3.240804e-02 2.826008e-02 2.913002e-02 3.234047e-02 3.122204e-02 2.938054e-02 2.960125e-02 3.081059e-02 2.889084e-02 3.006152e-02 3.049544e-02 3.154336e-02 3.203816e-02 2.916829e-02 3.130754e-02 2.955969e-02 3.275828e-02 3.229646e-02 3.075252e-02 2.951432e-02 2.843545e-02 2.928371e-02 2.950415e-02 3.018197e-02 3.040929e-02 2.980758e-02 2.966952e-02 83 | 2.881509e-02 3.256080e-02 2.951373e-02 3.240299e-02 3.140992e-02 3.095776e-02 2.936394e-02 3.002536e-02 3.206978e-02 2.997821e-02 3.057930e-02 2.846937e-02 3.069569e-02 2.945891e-02 2.945470e-02 3.044077e-02 3.099795e-02 2.960505e-02 2.980764e-02 3.086328e-02 2.928293e-02 3.298540e-02 3.227762e-02 2.950918e-02 2.884775e-02 3.049192e-02 2.959356e-02 2.921593e-02 3.159325e-02 3.262001e-02 3.050479e-02 2.630975e-02 2.929765e-02 84 | 3.350132e-02 3.011519e-02 3.070106e-02 3.363774e-02 3.037709e-02 2.638066e-02 2.902433e-02 2.818367e-02 2.967051e-02 3.044433e-02 2.982073e-02 2.955301e-02 3.083559e-02 3.617260e-02 3.074185e-02 2.895487e-02 3.184653e-02 2.681481e-02 2.897591e-02 3.096875e-02 2.929089e-02 2.977364e-02 2.908614e-02 2.862965e-02 2.803993e-02 3.079573e-02 3.470884e-02 2.990113e-02 3.128550e-02 3.307302e-02 2.633412e-02 3.445933e-02 2.790151e-02 85 | 2.959591e-02 3.104737e-02 3.744042e-02 2.971140e-02 2.720261e-02 2.756565e-02 2.704394e-02 3.416221e-02 3.396080e-02 2.531412e-02 2.590426e-02 3.436641e-02 3.334117e-02 2.850354e-02 2.915802e-02 2.902165e-02 
[... remaining probability rows omitted: each numbered line of this .prob output is a whitespace-separated probability distribution over the output tag set ...]
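Each of the probability rows above pairs naturally with sample/tag.list (reproduced just below): taking the argmax over a row and looking up the corresponding tag name recovers a predicted label. The following sketch is a minimal, unofficial reader and is not part of the repository; the column-to-tag alignment (column i of a row mapping to line i+1 of tag.list) and the hard-coded file paths are assumptions made for illustration.

```python
# Hypothetical helper, NOT part of JointSLU: map each row of the sample .prob
# output to its highest-scoring tag, using sample/tag.list for the tag names.
# Assumed alignment: column i of a probability row <-> line i+1 of tag.list.

def load_tags(path="sample/tag.list"):
    with open(path) as f:
        # Keep blank entries so indices stay aligned with probability columns.
        return [line.rstrip("\n") for line in f]

def predict_tags(prob_path, tags):
    predictions = []
    with open(prob_path) as f:
        for line in f:
            values = [float(v) for v in line.split()]
            if not values:  # tolerate empty lines
                continue
            best = max(range(len(values)), key=lambda i: values[i])
            predictions.append(tags[best] if best < len(tags) else str(best))
    return predictions

if __name__ == "__main__":
    tags = load_tags()
    prob_file = "sample/rnn+emb_H-50_O-adam_A-tanh_WR-embedding.test.3.prob"
    print(predict_tags(prob_file, tags)[:20])
```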
--------------------------------------------------------------------------------
/sample/tag.list:
--------------------------------------------------------------------------------
1 | 
2 | 
3 | O
4 | B-fromloc.city_name
5 | B-toloc.city_name
6 | B-round_trip
7 | I-round_trip
8 | atis_flight
9 | B-cost_relative
10 | B-fare_amount
11 | I-fare_amount
12 | atis_airfare
13 | B-arrive_date.month_name
14 | B-arrive_date.day_number
15 | I-fromloc.city_name
16 | B-stoploc.city_name
17 | B-arrive_time.time_relative
18 | B-arrive_time.time
19 | I-arrive_time.time
20 | B-toloc.state_code
21 | atis_airline
22 | I-toloc.city_name
23 | I-stoploc.city_name
24 | B-meal_description
25 | B-depart_date.month_name
26 | B-depart_date.day_number
27 | B-airline_name
28 | I-airline_name
29 | B-depart_time.period_of_day
30 | B-depart_date.day_name
31 | B-toloc.state_name
32 | B-depart_time.time_relative
33 | B-depart_time.time
34 | 
--------------------------------------------------------------------------------
/script/run_joint.sh:
--------------------------------------------------------------------------------
1 | DATADIR=data
2 | PROG=program/SequenceTagger.py
3 | OPTFUNC=adam
4 | ITER_EPOCH=300
5 | MAX_ITER=10
6 | NUM_RECORD=10
7 | if [ ! $1 ]; then
8 |   echo "Usage: $0 <model_arch> <backend: theano/tensorflow> <gpu_id>"
9 | else
10 |   if [ ! $2 ]; then
11 |     echo "No backend specified: defaulting to CPU for Theano, or to automatic device selection for TensorFlow."
12 |     GPUSET=''
13 |   elif [ "$2" == 'theano' ]; then
14 |     GPUSET="THEANO_FLAGS=device=gpu$3,floatX=float32"
15 |   fi
16 |   MDL=$1
17 |   TRAIN=$DATADIR/atis-2.train.w-intent.iob
18 |   DEV=$DATADIR/atis-2.dev.w-intent.iob
19 |   TEST=$DATADIR/atis.test.w-intent.iob
20 |   TLEN=48
21 |   VDIM=100
22 |   for DROPOUT in 0.50 0.25
23 |   do
24 |     for HDIM in 150 100 50
25 |     do
26 |       if [ ! -d experiment/res.w-intent/"$HDIM"-"$DROPOUT" ]; then
27 |         mkdir -p experiment/res.w-intent/"$HDIM"-"$DROPOUT"
28 |       fi
29 |       if [ ! -d experiment/mdl.w-intent/"$HDIM"-"$DROPOUT" ]; then
30 |         mkdir -p experiment/mdl.w-intent/"$HDIM"-"$DROPOUT"
31 |       fi
32 |       RES_PATH=experiment/res.w-intent/"$HDIM"-"$DROPOUT"
33 |       MDL_PATH=experiment/mdl.w-intent/"$HDIM"-"$DROPOUT"
34 |       CMD="$GPUSET python $PROG --train $TRAIN --dev $DEV --test $TEST --sgdtype $OPTFUNC --arch $MDL --iter_per_epoch $ITER_EPOCH --out $RES_PATH -m $MAX_ITER --mdl_path $MDL_PATH --record_epoch $NUM_RECORD --dropout True --dropout_ratio $DROPOUT --hidden_size $HDIM --time_length $TLEN --embedding_size $VDIM --input_type embedding"
35 |       echo $CMD
36 |     done
37 |   done
38 | fi
39 | 
--------------------------------------------------------------------------------
/script/run_sample.sh:
--------------------------------------------------------------------------------
1 | DATADIR=data
2 | PROG=program/SequenceTagger.py
3 | OPTFUNC=adam
4 | ITER_EPOCH=1
5 | MAX_ITER=3
6 | NUM_RECORD=3
7 | if [ ! $1 ]; then
8 |   echo "Usage: $0 <model_arch> <backend: theano/tensorflow> <gpu_id>"
9 | else
10 |   if [ ! $2 ]; then
11 |     echo "No backend specified: defaulting to CPU for Theano, or to automatic device selection for TensorFlow."
12 |     GPUSET=''
13 |   elif [ "$2" == 'theano' ]; then
14 |     GPUSET="THEANO_FLAGS=device=gpu$3,floatX=float32"
15 |   fi
16 |   MDL=$1
17 |   TRAIN=$DATADIR/sample.iob
18 |   TEST=$DATADIR/sample.iob
19 |   TLEN=48
20 |   VDIM=100
21 |   for DROPOUT in 0.50
22 |   do
23 |     for HDIM in 50
24 |     do
25 |       if [ ! -d sample ]; then
26 |         mkdir -p sample
27 |       fi
28 |       RES_PATH=sample
29 |       MDL_PATH=sample
30 |       CMD="$GPUSET python $PROG --train $TRAIN --test $TEST --sgdtype $OPTFUNC --arch $MDL --iter_per_epoch $ITER_EPOCH --out $RES_PATH -m $MAX_ITER --mdl_path $MDL_PATH --record_epoch $NUM_RECORD --dropout True --dropout_ratio $DROPOUT --hidden_size $HDIM --time_length $TLEN --embedding_size $VDIM --input_type embedding"
31 |       echo $CMD
32 |     done
33 |   done
34 | fi
35 | 
--------------------------------------------------------------------------------
/script/run_slot.sh:
--------------------------------------------------------------------------------
1 | DATADIR=data
2 | PROG=program/SequenceTagger.py
3 | OPTFUNC=adam
4 | ITER_EPOCH=300
5 | MAX_ITER=10
6 | NUM_RECORD=10
7 | if [ ! $1 ]; then
8 |   echo "Usage: $0 <model_arch> <backend: theano/tensorflow> <gpu_id>"
9 | else
10 |   if [ ! $2 ]; then
11 |     echo "No backend specified: defaulting to CPU for Theano, or to automatic device selection for TensorFlow."
12 |     GPUSET=''
13 |   elif [ "$2" == 'theano' ]; then
14 |     GPUSET="THEANO_FLAGS=device=gpu$3,floatX=float32"
15 |   fi
16 |   MDL=$1
17 |   TRAIN=$DATADIR/atis-2.train.iob
18 |   DEV=$DATADIR/atis-2.dev.iob
19 |   TEST=$DATADIR/atis.test.iob
20 |   TLEN=48
21 |   VDIM=100
22 |   for DROPOUT in 0.50 0.25
23 |   do
24 |     for HDIM in 150 100 50
25 |     do
26 |       if [ ! -d experiment/res/"$HDIM"-"$DROPOUT" ]; then
27 |         mkdir -p experiment/res/"$HDIM"-"$DROPOUT"
28 |       fi
29 |       if [ ! -d experiment/mdl/"$HDIM"-"$DROPOUT" ]; then
30 |         mkdir -p experiment/mdl/"$HDIM"-"$DROPOUT"
31 |       fi
32 |       RES_PATH=experiment/res/"$HDIM"-"$DROPOUT"
33 |       MDL_PATH=experiment/mdl/"$HDIM"-"$DROPOUT"
34 |       CMD="$GPUSET python $PROG --train $TRAIN --dev $DEV --test $TEST --sgdtype $OPTFUNC --arch $MDL --iter_per_epoch $ITER_EPOCH --out $RES_PATH -m $MAX_ITER --mdl_path $MDL_PATH --record_epoch $NUM_RECORD --dropout True --dropout_ratio $DROPOUT --hidden_size $HDIM --time_length $TLEN --embedding_size $VDIM --input_type embedding"
35 |       echo $CMD
36 |     done
37 |   done
38 | fi
39 | 
--------------------------------------------------------------------------------
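Note that all three run_*.sh scripts above only echo one training command per (dropout, hidden-size) configuration; nothing is executed until that output is piped into a shell. The sketch below is a hypothetical convenience wrapper, not part of the repository, that captures the echoed commands and runs them one at a time from Python; the `blstm theano 0` arguments are just one possible invocation.

```python
# Hypothetical wrapper, NOT part of JointSLU: execute the commands printed by
# script/run_slot.sh one configuration at a time instead of piping them to sh.
import subprocess

def run_sweep(script="script/run_slot.sh", args=("blstm", "theano", "0")):
    # The script itself only echoes command lines (and creates output dirs).
    output = subprocess.check_output(["bash", script] + list(args))
    for line in output.decode("utf-8").splitlines():
        cmd = line.strip()
        if not cmd or cmd.startswith("Usage"):
            continue  # skip usage/help messages
        print(">>> " + cmd)
        subprocess.call(cmd, shell=True)  # THEANO_FLAGS=... prefix works under sh

if __name__ == "__main__":
    run_sweep()
```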