├── README.md
├── augment.py
├── datasets.py
├── environment.yaml
├── infer3d.py
├── loss.py
├── models.py
├── train2d.py
├── train3d.py
├── trainPose.py
├── trainer.py
├── utils.py
├── v1
│   ├── A.ipynb
│   ├── calibration.py
│   ├── distribution.py
│   ├── triangulation.py
│   └── utils.py
└── visualization.ipynb

/README.md:
--------------------------------------------------------------------------------
# Probabilistic Triangulation V2

Code of the ICCV 2023 paper: [Probabilistic Triangulation for Uncalibrated Multi-View 3D Human Pose Estimation](https://arxiv.org/abs/2309.04756)

Abstract: 3D human pose estimation has been a long-standing challenge in computer vision and graphics, where multi-view methods have made significant progress but are limited by tedious calibration processes. Existing multi-view methods are restricted to fixed camera poses and therefore lack generalization ability. This paper presents a novel Probabilistic Triangulation module that can be embedded in a calibrated 3D human pose estimation method, generalizing it to uncalibrated scenes. The key idea is to model the camera pose with a probability distribution and to iteratively update this distribution from 2D features instead of using a fixed camera pose. Specifically, we maintain a camera pose distribution and iteratively update it by computing the posterior probability of the camera pose through Monte Carlo sampling. This way, gradients can be back-propagated directly from the 3D pose estimation to the 2D heatmap, enabling end-to-end training. Extensive experiments on Human3.6M and CMU Panoptic demonstrate that our method outperforms other uncalibrated methods and achieves results comparable to state-of-the-art calibrated methods. Our method thus achieves a trade-off between estimation accuracy and generalizability.

## Version update
1. Accelerated the model by replacing the backbone with MobileOne;
2. Changed the sampling logic to speed up multi-view fusion;
3. The model can now run inference in real time on iPhone.

## Getting started

### 1. Dataset

Download and preprocess the dataset by following the instructions in [h36m-fetch](https://github.com/anibali/h36m-fetch) and [learnable triangulation](https://github.com/karfly/learnable-triangulation-pytorch).

The directory structure after completing all processing:

```
human3.6m
├── extra
│   ├── bboxes-Human36M-GT.npy
│   ├── human36m-multiview-labels-GTbboxes.npy
│   └── una-dinosauria-data
└── processed
    ├── S1
    ├── S11
    ├── S5
    ├── S6
    ├── S7
    ├── S8
    └── S9
```

### 2. Quick Start

Create the environment with conda (a manually installed newer version of PyTorch also works):

```
conda env create -f environment.yaml
```

Run inference with the [pretrained model](https://drive.google.com/file/d/11baGjN9-iC6AzORrPSLJ_Oyk6kCLiEH6/view?usp=drive_link):

```
python infer3d.py
```

This produces the following metrics, where x3d/l2 is the MPJPE (mm):
```
loss 4.177016958594322
loss/hm 2.6981436171952415
loss/x3d 5.655890337684575
x2d/l1 16.779179317109726
x2d/l2 13.22389983604936
x3d/l1 38.431529241449695
x3d/l2 26.103624186095068
```

Train the 3D estimator, which by default initializes from the pretrained 2D backbone:

```
python train3d.py
```

Train the 2D backbone:

```
python train2d.py
```

### 3. Some training suggestions

1. During training we found that the accuracy of the 2D pose estimate strongly affects the final results: the MPJPE reaches 26mm with 384x384 inputs versus 34mm with 256x256 inputs.

2. When the model's parameter count is small, the uniform background of Human3.6M encourages overfitting (probably because the model uses color as a keypoint feature). We added color and brightness data augmentation during training to counter this, but that alone cannot fully handle in-the-wild scenes; pre-training the model on a broader dataset would solve the problem.

3. Human3.6M contains many near-duplicate frames, so evaluating on frames sampled at an interval is a quick way to validate training results.

4. The voxel-based multi-view fusion approach provides a rich physical prior but is a bottleneck for acceleration. In the new version of the code, we use ray directions plus sampled image features as inputs, which greatly speeds up fusion (see the sketch after this list).

5. In training, the fusion part is pre-trained on generated data and fine-tuned in subsequent training, which achieves better generalization.
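For reference, below is a minimal NumPy sketch of the direction features from suggestion 4, mirroring the computation in `datasets.py` and `models.py`; the helper name `keypoints_to_ray_dirs` is ours, for illustration only:

```python
import numpy as np

def keypoints_to_ray_dirs(x2d, K, conf):
    """x2d: (V,J,2) pixel keypoints, K: (V,3,3) intrinsics, conf: (V,J,1) confidences.
    Returns (V,J,4): per-view ray directions K^-1 @ [u, v, 1]^T plus confidence."""
    V, J, _ = x2d.shape
    homo = np.concatenate([x2d, np.ones((V, J, 1))], axis=-1)         # (V,J,3)
    # (V,1,3,3) @ (V,J,3,1) -> (V,J,3,1) -> (V,J,3)
    xdir = (np.linalg.inv(K)[:, None] @ homo[..., None]).squeeze(-1)
    return np.concatenate([xdir, conf], axis=-1).astype(np.float32)
```

Because the fusion network consumes only these per-view, per-keypoint 4D features, no extrinsic calibration is required at its input.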
## Citation

If you find this project useful for your research, please consider citing:

```
@inproceedings{hu2023pose,
  title={Probabilistic Triangulation for Uncalibrated Multi-View 3D Human Pose Estimation},
  author={Boyuan Jiang and Lei Hu and Shihong Xia},
  booktitle={IEEE International Conference on Computer Vision},
  year={2023},
  publisher={IEEE}
}
```

--------------------------------------------------------------------------------
/augment.py:
--------------------------------------------------------------------------------
from utils import eulid_to_homo
import numpy as np
import cv2

class Compose():
    def __init__(self, transforms):
        self.transforms = transforms
    def __call__(self, sample):
        for t in self.transforms:
            sample = t(sample)
        return sample

class Crop():
    """Randomly scaled and shifted square crop around the person box."""
    def __init__(self, scaleRange, moveRange):
        self.scaleRange = scaleRange
        self.moveRange = moveRange

    def __call__(self, sample):
        H, W, C = sample['image'].shape
        scale = np.random.uniform(self.scaleRange[0], self.scaleRange[1])
        moveX = np.random.uniform(self.moveRange[0], self.moveRange[1])
        moveY = np.random.uniform(self.moveRange[0], self.moveRange[1])
        B = sample['box']
        width = B[2] - B[0]
        height = B[3] - B[1]
        cx = int((B[0] + B[2])/2 + moveX * width)
        cy = int((B[1] + B[3])/2 + moveY * height)

        side = int(max(width, height) * scale)
        A = np.asarray([
            cx - side//2,
            cy - side//2,
            cx + side - side//2,
            cy + side - side//2
        ])
        # Zero-pad where the crop window falls outside the image.
        Aclip = np.clip(A, [0, 0, 0, 0], [W, H, W, H])
        img = np.zeros((side, side, C))
        img[(Aclip[1]-A[1]):(Aclip[3]-A[1]), (Aclip[0]-A[0]):(Aclip[2]-A[0])
            ] = sample['image'][Aclip[1]: Aclip[3], Aclip[0]: Aclip[2]]

        sample['image'] = img
        sample['camera'].update_after_crop(A)
        del sample['box']
        return sample

class Resize():
    def __init__(self, image_size, interpolation=cv2.INTER_CUBIC):
        self.image_size = (image_size, image_size) if isinstance(
            image_size, (int, float)) else image_size
        self.interpolation = interpolation

    def __call__(self, sample):
        sample['camera'].update_after_resize(sample['image'].shape[:2], self.image_size)
        # Fix: use the configured interpolation. It was stored but ignored
        # before (cv2.INTER_CUBIC was hard-coded while the default was
        # INTER_NEAREST); the default now matches the old runtime behavior.
        sample['image'] = cv2.resize(
            sample['image'], self.image_size, interpolation=self.interpolation)
        return sample

class NormSkeleton():
    def __init__(self, root_id=6):
        self.root_id = root_id

    def __call__(self, sample):
        # World -> camera coordinates, then project with K to get 2D keypoints.
        x3d = eulid_to_homo(sample['x3d']) @ sample['camera'].extrinsics().T

        x2d = x3d @ sample['camera'].K.T
        x2d[:,:2] /= x2d[:,2:]

        # Root-relative 3D pose.
        sample['x3d'] = (x3d-x3d[self.root_id]).astype(np.float32)
        sample['x2d'] = x2d[:,:2].astype(np.float32)
        sample['K'] = sample['camera'].K
        sample['R'] = sample['camera'].R
        sample['t'] = sample['camera'].t
        del sample['camera']
        return sample

class NormImage():
    def __init__(self):
        pass

    def __call__(self, sample):
        sample['image'] = np.clip(sample['image']/255., 0., 1.).transpose(2,0,1).astype(np.float32)
        return sample


class RandomBrightness(object):
    def __init__(self, delta=32):
        assert delta >= 0.0
        assert delta <= 255.0
        self.delta = delta
    def __call__(self, sample):
        if np.random.randint(2):
            # Additive jitter; out-of-range values are clipped later in NormImage.
            delta = np.random.uniform(-self.delta, self.delta)
            sample['image'] += delta
        return sample

class RandomContrast(object):
    def __init__(self, lower=0.5, upper=1.5):
        self.lower = lower
        self.upper = upper
        assert self.upper >= self.lower, "contrast upper must be >= lower."
        assert self.lower >= 0, "contrast lower must be non-negative."
    def __call__(self, sample):
        if np.random.randint(2):
            alpha = np.random.uniform(self.lower, self.upper)
            sample['image'] *= alpha
        return sample


class RandomSaturation(object):
    def __init__(self, lower=0.5, upper=1.5):
        self.lower = lower
        self.upper = upper
        assert self.upper >= self.lower, "saturation upper must be >= lower."
        assert self.lower >= 0, "saturation lower must be non-negative."
    def __call__(self, sample):
        # Assumes the image is in HSV; channel 1 is saturation.
        if np.random.randint(2):
            sample['image'][:, :, 1] *= np.random.uniform(self.lower, self.upper)
        return sample

class RandomHue(object):
    def __init__(self, delta=18.0):
        assert delta >= 0.0 and delta <= 360.0
        self.delta = delta
    def __call__(self, sample):
        # Assumes an HSV float image, where OpenCV stores hue in [0, 360).
        if np.random.randint(2):
            sample['image'][:, :, 0] += np.random.uniform(-self.delta, self.delta)
            sample['image'][:, :, 0][sample['image'][:, :, 0] > 360.0] -= 360.0
            sample['image'][:, :, 0][sample['image'][:, :, 0] < 0.0] += 360.0
        return sample

class SwapChannels(object):
    # Note: unlike the other transforms, this operates on a bare image array,
    # and it is unused by PhotometricDistort below, which permutes channels inline.
    def __init__(self, swaps):
        self.swaps = swaps
    def __call__(self, image):
        image = image[:, :, self.swaps]
        return image

class RandomLightingNoise(object):
    def __init__(self):
        self.perms = ((0, 1, 2), (0, 2, 1),
                      (1, 0, 2), (1, 2, 0),
                      (2, 0, 1), (2, 1, 0))
    def __call__(self, sample):
        if np.random.randint(2):
            swap = self.perms[np.random.randint(len(self.perms))]
            sample['image'] = sample['image'][:, :, swap]
        return sample

class ConvertColor(object):
    def __init__(self, current='BGR', transform='HSV'):
        self.transform = transform
        self.current = current
    def __call__(self, sample):
        if self.current == 'BGR' and self.transform == 'HSV':
            sample['image'] = cv2.cvtColor(sample['image'], cv2.COLOR_BGR2HSV)
        elif self.current == 'HSV' and self.transform == 'BGR':
            sample['image'] = cv2.cvtColor(sample['image'], cv2.COLOR_HSV2BGR)
        else:
            raise NotImplementedError
        return sample

class PhotometricDistort():
    def __init__(self):
        self.distort = Compose([
            RandomBrightness(),
            RandomContrast(),
            ConvertColor(transform='HSV'),
            RandomSaturation(),
            RandomHue(),
            ConvertColor(current='HSV', transform='BGR'),
            RandomLightingNoise(),
        ])
    def __call__(self, sample):
        # Work in float32 [0, 255] for the photometric ops; NormImage rescales later.
        sample['image'] = np.clip(sample['image'], 0, 255).astype(np.float32)
        return self.distort(sample)


class GenHeatmap():
    def __init__(self, num_keypoints, image_size=256, heatmap_size=64):
        self.num_keypoints = num_keypoints
        self.image_size = image_size
        self.heatmap_size = heatmap_size


    def __call__(self, sample):
        hm = np.zeros((self.num_keypoints, self.heatmap_size, self.heatmap_size), dtype=np.float32)
        reg = np.zeros((self.num_keypoints, 2), dtype=np.float32)
        ind = np.zeros((self.num_keypoints), dtype=np.int64)
        mask = np.zeros((self.num_keypoints), dtype=np.uint8)

        for i, x2d in enumerate(sample['x2d']):
            # Map the keypoint from image to heatmap coordinates.
            ct = x2d * 
self.heatmap_size/self.image_size 193 | ct_int = (ct + 0.5).astype(np.int32) 194 | if ct_int[0] < self.heatmap_size and ct_int[1] < self.heatmap_size and ct_int[0] >= 0 and ct_int[1] >= 0: 195 | radius = 2 196 | self.draw_gaussian(hm[i], ct, radius) 197 | ind[i] = ct_int[1] * self.heatmap_size + ct_int[0] 198 | reg[i] = ct - ct_int 199 | mask[i] = 1 200 | 201 | sample['hm'] = hm 202 | sample['reg'] = reg 203 | sample['mask'] = mask 204 | sample['ind'] = ind 205 | return sample 206 | 207 | def gaussian2D(self, shape, sigma=1): 208 | m, n = [(ss - 1.) / 2. for ss in shape] 209 | y, x = np.ogrid[-m:m+1, -n:n+1] 210 | 211 | h = np.exp(-(x * x + y * y) / (2 * sigma * sigma)) 212 | h[h < np.finfo(h.dtype).eps * h.max()] = 0 213 | return h 214 | 215 | def draw_gaussian(self, heatmap, center, radius): 216 | diameter = 2 * radius + 1 217 | gaussian = self.gaussian2D((diameter, diameter), sigma=diameter / 6) 218 | 219 | x, y = int(center[0]), int(center[1]) 220 | 221 | height, width = heatmap.shape[0:2] 222 | 223 | left, right = min(x, radius), min(width - x, radius + 1) 224 | top, bottom = min(y, radius), min(height - y, radius + 1) 225 | 226 | masked_heatmap = heatmap[y - top:y + bottom, x - left:x + right] 227 | masked_gaussian = gaussian[radius - top:radius + 228 | bottom, radius - left:radius + right] 229 | if min(masked_gaussian.shape) > 0 and min(masked_heatmap.shape) > 0: # TODO debug 230 | np.maximum(masked_heatmap, masked_gaussian, out=masked_heatmap) 231 | return heatmap 232 | 233 | 234 | 235 | 236 | 237 | 238 | -------------------------------------------------------------------------------- /datasets.py: -------------------------------------------------------------------------------- 1 | import torch 2 | from torch.utils.data import Dataset, DataLoader 3 | import cv2 4 | from utils import Camera, eulid_to_homo 5 | import numpy as np 6 | from augment import * 7 | import os 8 | from collections import defaultdict 9 | import random 10 | 11 | class Human36M(Dataset): 12 | def __init__(self, cfg, is_train): 13 | super().__init__() 14 | self.cfg = cfg 15 | self.is_train = is_train 16 | self.labels = np.load(cfg['labels_path'], allow_pickle=True).item() 17 | # n_cameras = len(self.labels['camera_names']) 18 | 19 | train_subjects = ['S1', 'S5', 'S6', 'S7', 'S8'] 20 | test_subjects = ['S9','S11'] 21 | train_subjects = list(self.labels['subject_names'].index(x) for x in train_subjects) 22 | test_subjects = list(self.labels['subject_names'].index(x) for x in test_subjects) 23 | 24 | if is_train: 25 | mask = np.isin(self.labels['table']['subject_idx'], train_subjects, assume_unique=True) 26 | else: 27 | mask = np.isin(self.labels['table']['subject_idx'], test_subjects, assume_unique=True) 28 | 29 | 30 | self.labels['table'] = self.labels['table'][mask] 31 | 32 | self.augment = Compose([ 33 | Crop(cfg['scaleRange'],cfg['moveRange']), 34 | Resize(cfg['image_size']), 35 | PhotometricDistort(), 36 | NormSkeleton(), 37 | NormImage(), 38 | GenHeatmap(cfg['num_keypoints'],cfg['image_size'],cfg['heatmap_size']), 39 | ]) 40 | 41 | def __len__(self): 42 | if self.is_train: 43 | return (len(self.labels['table'])*len(self.labels['camera_names']))//self.cfg['data_skip_train'] 44 | else: 45 | return len(self.labels['table'])*len(self.labels['camera_names'])//self.cfg['data_skip_test'] 46 | 47 | def __getitem__(self, index): 48 | if self.is_train: 49 | index = index * self.cfg['data_skip_train'] + np.random.randint(self.cfg['data_skip_train']) 50 | else: 51 | index = index * self.cfg['data_skip_test'] + 
np.random.randint(self.cfg['data_skip_test']) 52 | 53 | camera_idx = index % len(self.labels['camera_names']) 54 | idx = index // len(self.labels['camera_names']) 55 | shot = self.labels['table'][idx] 56 | subject = self.labels['subject_names'][shot['subject_idx']] 57 | action = self.labels['action_names'][shot['action_idx']] 58 | frame_idx = shot['frame_idx'] 59 | camera_name = self.labels['camera_names'][camera_idx] 60 | 61 | 62 | 63 | image = cv2.imread(os.path.join( 64 | self.cfg['root_path'], subject, action, 'imageSequence' + '-undistorted', 65 | camera_name, 'img_%06d.jpg' % (frame_idx+1))) 66 | 67 | box = shot['bbox_by_camera_tlbr'][camera_idx][[1,0,3,2]] # TLBR to LTRB 68 | 69 | # x3d = np.pad( 70 | # shot['keypoints'][:self.cfg['num_keypts']], 71 | # ((0,0), (0,1)), 'constant', constant_values=1.0) 72 | x3d = np.asarray(shot['keypoints'][:self.cfg['num_keypoints']]) 73 | 74 | shot_camera = self.labels['cameras'][shot['subject_idx'], camera_idx] 75 | camera = Camera(shot_camera['R'],shot_camera['t'],shot_camera['K']) 76 | 77 | sample = self.augment({'image':image, 'box': box, 'x3d': x3d, 'camera': camera}) 78 | 79 | if self.cfg['use_tag']: 80 | sample['tag'] = { 81 | 'subject': shot['subject_idx'], 82 | 'action': shot['action_idx'], 83 | 'camera': camera_idx, 84 | 'frame': frame_idx, 85 | } 86 | 87 | return sample 88 | 89 | class MultiPose(Dataset): 90 | def __init__(self, cfg, is_train): 91 | super().__init__() 92 | self.cfg = cfg 93 | self.is_train = is_train 94 | self.labels = np.load(cfg['labels_path'], allow_pickle=True).item() 95 | # n_cameras = len(self.labels['camera_names']) 96 | 97 | train_subjects = ['S1', 'S5', 'S6', 'S7', 'S8'] 98 | test_subjects = ['S9', 'S11'] 99 | train_subjects = list(self.labels['subject_names'].index(x) for x in train_subjects) 100 | test_subjects = list(self.labels['subject_names'].index(x) for x in test_subjects) 101 | 102 | if is_train: 103 | mask = np.isin(self.labels['table']['subject_idx'], train_subjects, assume_unique=True) 104 | else: 105 | mask = np.isin(self.labels['table']['subject_idx'], test_subjects, assume_unique=True) 106 | 107 | self.labels['table'] = self.labels['table'][mask] 108 | 109 | self.augment = Compose([ 110 | NormSkeleton(), 111 | ]) 112 | 113 | def __len__(self): 114 | return len(self.labels['table']) 115 | 116 | def __getitem__(self, index): 117 | shot = self.labels['table'][index] 118 | subject = self.labels['subject_names'][shot['subject_idx']] 119 | action = self.labels['action_names'][shot['action_idx']] 120 | frame_idx = shot['frame_idx'] 121 | 122 | view_list = [] 123 | for camera_idx, camera_name in enumerate(self.labels['camera_names']): 124 | x3d = np.asarray(shot['keypoints'][:self.cfg['num_keypoints']]) 125 | shot_camera = self.labels['cameras'][shot['subject_idx'], camera_idx] 126 | camera = Camera(shot_camera['R'],shot_camera['t'],shot_camera['K']) 127 | view_list.append(self.augment({'x3d': x3d, 'camera': camera})) 128 | 129 | random.shuffle(view_list) 130 | sample = defaultdict(list) 131 | for view in view_list: 132 | for key in view: 133 | sample[key].append(view[key]) 134 | 135 | for key in sample: 136 | sample[key] = np.asarray(sample[key]) 137 | 138 | V,J,_ = sample['x2d'].shape 139 | # (V,1,3,3) @ (V,J,3,1) -> (V,J,3) -> (V,J,4) 140 | if self.is_train: 141 | rands = np.random.randn(V,J,2)*2 142 | x2d = sample['x2d'] + rands 143 | conf = np.exp(-np.sqrt((rands**2).sum(-1)))[...,None] 144 | else: 145 | x2d = sample['x2d'] 146 | conf = np.ones((V,J,1)) 147 | # (V,1,3,3) @ (V,J,3,1) -> (V,J,3,1) 148 
| xdir = (np.linalg.inv(sample['K'])[:,None] @ eulid_to_homo(x2d)[...,None]).squeeze(-1) 149 | xdir = np.concatenate([xdir, conf], axis=-1).astype(np.float32) 150 | sample['xdir'] = xdir 151 | 152 | return sample 153 | 154 | 155 | 156 | 157 | class MultiHuman36M(Dataset): 158 | def __init__(self, cfg, is_train): 159 | super().__init__() 160 | self.cfg = cfg 161 | self.is_train = is_train 162 | self.labels = np.load(cfg['labels_path'], allow_pickle=True).item() 163 | # n_cameras = len(self.labels['camera_names']) 164 | 165 | train_subjects = ['S1', 'S5', 'S6', 'S7', 'S8'] 166 | test_subjects = ['S9', 'S11'] 167 | train_subjects = list(self.labels['subject_names'].index(x) for x in train_subjects) 168 | test_subjects = list(self.labels['subject_names'].index(x) for x in test_subjects) 169 | 170 | if is_train: 171 | mask = np.isin(self.labels['table']['subject_idx'], train_subjects, assume_unique=True) 172 | else: 173 | mask = np.isin(self.labels['table']['subject_idx'], test_subjects, assume_unique=True) 174 | 175 | 176 | self.labels['table'] = self.labels['table'][mask] 177 | 178 | self.augment = Compose([ 179 | Crop(cfg['scaleRange'],cfg['moveRange']), 180 | Resize(cfg['image_size']), 181 | PhotometricDistort(), 182 | NormSkeleton(), 183 | NormImage(), 184 | GenHeatmap(cfg['num_keypoints'],cfg['image_size'],cfg['heatmap_size']), 185 | ]) 186 | 187 | def __len__(self): 188 | if self.is_train: 189 | return len(self.labels['table']) // self.cfg['data_skip_train'] 190 | else: 191 | return len(self.labels['table']) // self.cfg['data_skip_test'] 192 | 193 | def __getitem__(self, index): 194 | if self.is_train: 195 | index = index * self.cfg['data_skip_train'] + np.random.randint(self.cfg['data_skip_train']) 196 | else: 197 | index = index * self.cfg['data_skip_test'] + np.random.randint(self.cfg['data_skip_test']) 198 | 199 | shot = self.labels['table'][index] 200 | subject = self.labels['subject_names'][shot['subject_idx']] 201 | action = self.labels['action_names'][shot['action_idx']] 202 | frame_idx = shot['frame_idx'] 203 | 204 | view_list = [] 205 | for camera_idx, camera_name in enumerate(self.labels['camera_names']): 206 | 207 | image = cv2.imread(os.path.join( 208 | self.cfg['root_path'], subject, action, 'imageSequence' + '-undistorted', 209 | camera_name, 'img_%06d.jpg' % (frame_idx+1))) 210 | 211 | box = shot['bbox_by_camera_tlbr'][camera_idx][[1,0,3,2]] # TLBR to LTRB 212 | 213 | # x3d = np.pad( 214 | # shot['keypoints'][:self.cfg['num_keypts']], 215 | # ((0,0), (0,1)), 'constant', constant_values=1.0) 216 | x3d = np.asarray(shot['keypoints'][:self.cfg['num_keypoints']]) 217 | 218 | shot_camera = self.labels['cameras'][shot['subject_idx'], camera_idx] 219 | camera = Camera(shot_camera['R'],shot_camera['t'],shot_camera['K']) 220 | 221 | view_list.append(self.augment({'image':image, 'box': box, 'x3d': x3d, 'camera': camera})) 222 | 223 | # random.shuffle(view_list) 224 | sample = defaultdict(list) 225 | for view in view_list: 226 | for key in view: 227 | sample[key].append(view[key]) 228 | 229 | for key in sample: 230 | sample[key] = np.asarray(sample[key]) 231 | 232 | return sample -------------------------------------------------------------------------------- /environment.yaml: -------------------------------------------------------------------------------- 1 | name: algo 2 | channels: 3 | - pytorch 4 | - nvidia 5 | - conda-forge 6 | - defaults 7 | dependencies: 8 | - _libgcc_mutex=0.1=conda_forge 9 | - _openmp_mutex=4.5=2_gnu 10 | - asttokens=2.4.1=pyhd8ed1ab_0 11 | - 
backcall=0.2.0=pyh9f0ad1d_0 12 | - blas=1.0=mkl 13 | - brotli-python=1.0.9=py38h6a678d5_7 14 | - bzip2=1.0.8=h7b6447c_0 15 | - ca-certificates=2024.2.2=hbcca054_0 16 | - certifi=2024.2.2=pyhd8ed1ab_0 17 | - charset-normalizer=2.0.4=pyhd3eb1b0_0 18 | - comm=0.2.1=pyhd8ed1ab_0 19 | - cuda-cudart=11.8.89=0 20 | - cuda-cupti=11.8.87=0 21 | - cuda-libraries=11.8.0=0 22 | - cuda-nvrtc=11.8.89=0 23 | - cuda-nvtx=11.8.86=0 24 | - cuda-runtime=11.8.0=0 25 | - debugpy=1.6.7=py38h6a678d5_0 26 | - decorator=5.1.1=pyhd8ed1ab_0 27 | - executing=2.0.1=pyhd8ed1ab_0 28 | - ffmpeg=4.3=hf484d3e_0 29 | - filelock=3.13.1=py38h06a4308_0 30 | - freetype=2.12.1=h4a9f257_0 31 | - gmp=6.2.1=h295c915_3 32 | - gmpy2=2.1.2=py38heeb90bb_0 33 | - gnutls=3.6.15=he1e5248_0 34 | - idna=3.4=py38h06a4308_0 35 | - importlib-metadata=7.0.1=pyha770c72_0 36 | - importlib_metadata=7.0.1=hd8ed1ab_0 37 | - intel-openmp=2023.1.0=hdb19cb5_46306 38 | - ipykernel=6.29.2=pyhd33586a_0 39 | - ipython=8.12.2=pyh41d4057_0 40 | - jedi=0.19.1=pyhd8ed1ab_0 41 | - jinja2=3.1.3=py38h06a4308_0 42 | - jpeg=9e=h5eee18b_1 43 | - jupyter_client=8.6.0=pyhd8ed1ab_0 44 | - jupyter_core=5.7.1=py38h578d9bd_0 45 | - lame=3.100=h7b6447c_0 46 | - lcms2=2.12=h3be6417_0 47 | - ld_impl_linux-64=2.38=h1181459_1 48 | - lerc=3.0=h295c915_0 49 | - libcublas=11.11.3.6=0 50 | - libcufft=10.9.0.58=0 51 | - libcufile=1.8.1.2=0 52 | - libcurand=10.3.4.107=0 53 | - libcusolver=11.4.1.48=0 54 | - libcusparse=11.7.5.86=0 55 | - libdeflate=1.17=h5eee18b_1 56 | - libffi=3.4.4=h6a678d5_0 57 | - libgcc-ng=13.2.0=h807b86a_5 58 | - libgomp=13.2.0=h807b86a_5 59 | - libiconv=1.16=h7f8727e_2 60 | - libidn2=2.3.4=h5eee18b_0 61 | - libjpeg-turbo=2.0.0=h9bf148f_0 62 | - libnpp=11.8.0.86=0 63 | - libnvjpeg=11.9.0.86=0 64 | - libpng=1.6.39=h5eee18b_0 65 | - libsodium=1.0.18=h36c2ea0_1 66 | - libstdcxx-ng=11.2.0=h1234567_1 67 | - libtasn1=4.19.0=h5eee18b_0 68 | - libtiff=4.5.1=h6a678d5_0 69 | - libunistring=0.9.10=h27cfd23_0 70 | - libwebp-base=1.3.2=h5eee18b_0 71 | - llvm-openmp=14.0.6=h9e868ea_0 72 | - lz4-c=1.9.4=h6a678d5_0 73 | - markupsafe=2.1.3=py38h5eee18b_0 74 | - matplotlib-inline=0.1.6=pyhd8ed1ab_0 75 | - mkl=2023.1.0=h213fc3f_46344 76 | - mkl-service=2.4.0=py38h5eee18b_1 77 | - mkl_fft=1.3.8=py38h5eee18b_0 78 | - mkl_random=1.2.4=py38hdb19cb5_0 79 | - mpc=1.1.0=h10f8cd9_1 80 | - mpfr=4.0.2=hb69a4c5_1 81 | - mpmath=1.3.0=py38h06a4308_0 82 | - ncurses=6.4=h6a678d5_0 83 | - nest-asyncio=1.6.0=pyhd8ed1ab_0 84 | - nettle=3.7.3=hbbd107a_1 85 | - networkx=3.1=py38h06a4308_0 86 | - numpy=1.24.3=py38hf6e8229_1 87 | - numpy-base=1.24.3=py38h060ed82_1 88 | - openh264=2.1.1=h4ff587b_0 89 | - openjpeg=2.4.0=h3ad879b_0 90 | - openssl=3.2.1=hd590300_0 91 | - packaging=23.2=pyhd8ed1ab_0 92 | - parso=0.8.3=pyhd8ed1ab_0 93 | - pexpect=4.9.0=pyhd8ed1ab_0 94 | - pickleshare=0.7.5=py_1003 95 | - pillow=10.2.0=py38h5eee18b_0 96 | - pip=23.3.1=py38h06a4308_0 97 | - platformdirs=4.2.0=pyhd8ed1ab_0 98 | - prompt-toolkit=3.0.42=pyha770c72_0 99 | - prompt_toolkit=3.0.42=hd8ed1ab_0 100 | - psutil=5.9.8=py38h01eb140_0 101 | - ptyprocess=0.7.0=pyhd3deb0d_0 102 | - pure_eval=0.2.2=pyhd8ed1ab_0 103 | - pygments=2.17.2=pyhd8ed1ab_0 104 | - pysocks=1.7.1=py38h06a4308_0 105 | - python=3.8.18=h955ad1f_0 106 | - python-dateutil=2.8.2=pyhd8ed1ab_0 107 | - python_abi=3.8=2_cp38 108 | - pytorch=2.2.0=py3.8_cuda11.8_cudnn8.7.0_0 109 | - pytorch-cuda=11.8=h7e8668a_5 110 | - pytorch-mutex=1.0=cuda 111 | - pyyaml=6.0.1=py38h5eee18b_0 112 | - pyzmq=25.1.2=py38h6a678d5_0 113 | - readline=8.2=h5eee18b_0 114 | - 
requests=2.31.0=py38h06a4308_1 115 | - setuptools=68.2.2=py38h06a4308_0 116 | - six=1.16.0=pyh6c4a22f_0 117 | - sqlite=3.41.2=h5eee18b_0 118 | - stack_data=0.6.2=pyhd8ed1ab_0 119 | - sympy=1.12=py38h06a4308_0 120 | - tbb=2021.8.0=hdb19cb5_0 121 | - tk=8.6.12=h1ccaba5_0 122 | - torchaudio=2.2.0=py38_cu118 123 | - torchtriton=2.2.0=py38 124 | - torchvision=0.17.0=py38_cu118 125 | - tornado=6.3.3=py38h01eb140_1 126 | - traitlets=5.14.1=pyhd8ed1ab_0 127 | - typing_extensions=4.9.0=py38h06a4308_1 128 | - urllib3=2.1.0=py38h06a4308_1 129 | - wcwidth=0.2.13=pyhd8ed1ab_0 130 | - wheel=0.41.2=py38h06a4308_0 131 | - xz=5.4.5=h5eee18b_0 132 | - yaml=0.2.5=h7b6447c_0 133 | - zeromq=4.3.5=h6a678d5_0 134 | - zipp=3.17.0=pyhd8ed1ab_0 135 | - zlib=1.2.13=h5eee18b_0 136 | - zstd=1.5.5=hc292b87_0 137 | - pip: 138 | - absl-py==2.1.0 139 | - cachetools==5.3.2 140 | - contourpy==1.1.1 141 | - cycler==0.12.1 142 | - fonttools==4.49.0 143 | - google-auth==2.28.0 144 | - google-auth-oauthlib==1.0.0 145 | - grpcio==1.60.1 146 | - importlib-resources==6.1.1 147 | - kiwisolver==1.4.5 148 | - markdown==3.5.2 149 | - matplotlib==3.7.5 150 | - oauthlib==3.2.2 151 | - opencv-python==4.9.0.80 152 | - pandas==2.0.3 153 | - protobuf==4.25.3 154 | - pyasn1==0.5.1 155 | - pyasn1-modules==0.3.0 156 | - pyparsing==3.1.1 157 | - pytz==2024.1 158 | - requests-oauthlib==1.3.1 159 | - rsa==4.9 160 | - seaborn==0.13.2 161 | - tensorboard==2.14.0 162 | - tensorboard-data-server==0.7.2 163 | - tensorboardx==2.6.2.2 164 | - torchsummary==1.5.1 165 | - tqdm==4.66.2 166 | - tzdata==2024.1 167 | - werkzeug==3.0.1 168 | prefix: ~/.conda/envs/algo 169 | -------------------------------------------------------------------------------- /infer3d.py: -------------------------------------------------------------------------------- 1 | import os 2 | import numpy as np 3 | import torch 4 | from torch.utils.data import Dataset, DataLoader 5 | import cv2 6 | import matplotlib.pyplot as plt 7 | from utils import * 8 | from trainer import AverageMeter 9 | from datasets import MultiHuman36M 10 | from models import ProbTri 11 | from loss import Net3d 12 | from tqdm import tqdm 13 | 14 | cfg = { 15 | 'root_path': '/data/human36m/processed', 16 | 'labels_path': '/data/human36m/extra/human36m-multiview-labels-GTbboxes.npy', 17 | 'lr':1e-5, 18 | 'num_epoch':300, 19 | 'batch_size_train': 32, 20 | 'batch_size_test': 8, 21 | 'num_workers':8, 22 | 'num_keypoints': 17, 23 | 'num_views': 4, 24 | 'scaleRange': [1.1,1.2], 25 | 'moveRange': [-0.1,0.1], 26 | 'image_size':384, 27 | 'heatmap_size': 96, 28 | # 'backbone_path': 'checkpoints/backbone.pth', 29 | # 'fusion_path': 'checkpoints/fusion.pth', 30 | 'model_path': 'checkpoints/pretrain.pth', 31 | 'device':'cuda', 32 | 'save_dir': '/logs/pose3d', 33 | 'use_tag':False, 34 | 'data_skip_train':8, 35 | 'data_skip_test':1, 36 | } 37 | 38 | test_db = MultiHuman36M(cfg, is_train=False) 39 | test_loader = DataLoader( 40 | test_db, 41 | batch_size=cfg['batch_size_test'], 42 | shuffle=False, 43 | num_workers = cfg['num_workers'], 44 | pin_memory = True, 45 | drop_last=True, 46 | ) 47 | 48 | # trainer 49 | model = ProbTri(cfg).cuda() 50 | if 'backbone_path' in cfg: 51 | pretrain_dict = torch.load(cfg['backbone_path']) 52 | missing, unexpected = model.backbone.load_state_dict(pretrain_dict,strict=False) 53 | print('load backbone model, missing length', len(missing), 'unexpected', len(unexpected) , '\n') 54 | 55 | if 'fusion_path' in cfg: 56 | pretrain_dict = torch.load(cfg['fusion_path']) 57 | missing, unexpected = 
model.fusion.load_state_dict(pretrain_dict,strict=False) 58 | print('load fusion model, missing length', len(missing), 'unexpected', len(unexpected) , '\n') 59 | 60 | if 'model_path' in cfg: 61 | pretrain_dict = torch.load(cfg['model_path']) 62 | missing, unexpected = model.load_state_dict(pretrain_dict,strict=False) 63 | print('missing length', len(missing), 'unexpected', len(unexpected) , '\n') 64 | 65 | net = Net3d(cfg, model) 66 | net.eval() 67 | torch.set_grad_enabled(False) 68 | 69 | 70 | avg_loss_stats = {} 71 | for iter_id, batch in tqdm(enumerate(test_loader)): 72 | for key in batch: 73 | if isinstance(batch[key], torch.Tensor): 74 | batch[key] = batch[key].to(device = cfg['device'], non_blocking=True) 75 | 76 | output, loss, loss_stats = net(batch) 77 | if 'loss' not in avg_loss_stats: 78 | for key in loss_stats: 79 | avg_loss_stats[key] = AverageMeter() 80 | 81 | for key in loss_stats: 82 | avg_loss_stats[key].update(loss_stats[key].item(), test_loader.batch_size) 83 | 84 | for k, v in avg_loss_stats.items(): 85 | print(k, v.avg) -------------------------------------------------------------------------------- /loss.py: -------------------------------------------------------------------------------- 1 | import torch 2 | import torch.nn as nn 3 | import torch.nn.functional as F 4 | import numpy as np 5 | 6 | class WingLoss(nn.Module): 7 | def __init__(self): 8 | super().__init__() 9 | self.omega = 10 10 | self.sigma = 2 11 | self.c = self.omega - self.omega * np.log(1+self.omega/self.sigma) 12 | 13 | def forward(self, pred, target): 14 | """ 15 | Args: 16 | pred : (B,N,3) 17 | target: (B,N,3) 18 | """ 19 | l1, loss = self.calloss(pred, target) 20 | l2 = torch.sqrt(((pred - target) ** 2).sum(-1)).mean(-1) 21 | 22 | loss_stats = { 23 | 'loss': loss.mean().cpu().detach(), 24 | 'diff/l1' : l1.mean().cpu().detach(), 25 | 'diff/l2': l2.mean().cpu().detach(), 26 | } 27 | 28 | return loss.mean(), loss_stats 29 | 30 | def calloss(self, x,t): 31 | diff = torch.abs(x - t).sum(-1) 32 | is_small = (diff < self.omega).float() 33 | small_loss = self.omega * torch.log(1+diff/self.sigma) 34 | big_loss = diff - self.c 35 | loss = (small_loss * is_small + big_loss * (1-is_small)) * 0.1 36 | return diff, loss 37 | 38 | class FocalLoss(nn.Module): 39 | """ 40 | Args: 41 | pred (B, c, h, w) 42 | target (B, c, h, w) 43 | """ 44 | 45 | def __init__(self): 46 | super(FocalLoss, self).__init__() 47 | self.epsilon = 1e-8 48 | 49 | def forward(self, pred, target): 50 | pred = pred.clamp(min=self.epsilon, max=1-self.epsilon) 51 | yeq1_index = target.ge(0.9).float() 52 | other_index = target.lt(0.9).float() 53 | 54 | yeq1_loss = (yeq1_index * torch.log(pred) * torch.pow(1-pred,2)).sum() 55 | other_loss = (other_index * torch.log(1 - pred) * torch.pow(pred, 2) * torch.pow(1 - target, 4)).sum() 56 | num_yeq1 = yeq1_index.float().sum() 57 | 58 | if num_yeq1 == 0: 59 | loss = - other_loss 60 | else: 61 | loss = - (yeq1_loss + other_loss) / num_yeq1 62 | 63 | return loss 64 | 65 | def _tranpose_and_gather_feat(feat, ind): 66 | """ 67 | Args: 68 | feat (B,C*2,H,W) 69 | ind (B,C) 70 | Returns: 71 | feat (B,C,2) 72 | """ 73 | # (B,C*2,H,W) -> (B,C,2,H*W) 74 | feat = feat.view(feat.size(0), feat.size(1)//2, 2, -1) 75 | return torch.gather(feat, 3, ind.unsqueeze(-1).unsqueeze(-1).expand(-1, -1, 2, -1)).squeeze(3) 76 | 77 | class RegL1Loss(nn.Module): 78 | """ 79 | Args: 80 | output (B, dim, h, w) 81 | mask (B, max_obj) 82 | ind (B, max_obj) 83 | target (B, max_obj, dim) 84 | Temp: 85 | pred (B, max_obj, dim) 86 | """ 87 
| 88 | def __init__(self): 89 | super(RegL1Loss, self).__init__() 90 | 91 | def forward(self, output, mask, ind, target): 92 | pred = _tranpose_and_gather_feat(output, ind) 93 | 94 | mask = mask.unsqueeze(2).expand_as(pred).float() 95 | loss = F.l1_loss(pred * mask, target * mask, reduction='sum') 96 | loss = loss / (mask.sum() + 1e-4) 97 | return loss 98 | 99 | class DetLoss(nn.Module): 100 | def __init__(self,cfg): 101 | super(DetLoss, self).__init__() 102 | self.cfg=cfg 103 | self.hm_crit = FocalLoss() 104 | self.reg_crit = RegL1Loss() 105 | 106 | def forward(self, output, batch): 107 | 108 | hm_loss = self.hm_crit(output['hm'], batch['hm']) 109 | reg_loss = self.reg_crit(output['reg'],batch['mask'],batch['ind'], batch['reg']) 110 | loss = hm_loss + reg_loss 111 | 112 | B,C,H,W = output['hm'].shape 113 | max_val, max_idx = torch.max(output['hm'].view(B, C, -1), dim=2) 114 | # reg = _tranpose_and_gather_feat(output['reg'], max_idx) 115 | reg = torch.gather(output['reg'].view(B,C,2,H*W), 3, max_idx.unsqueeze(-1).unsqueeze(-1).expand(-1,-1,2,-1)).squeeze(3) 116 | 117 | x = (torch.stack([max_idx%W, max_idx//W],dim=-1) + reg) * self.cfg['image_size'] / self.cfg['heatmap_size'] 118 | l1 = torch.abs(x - batch['x2d']).sum(-1).mean(-1) 119 | l2 = torch.sqrt(((x - batch['x2d']) ** 2).sum(-1)).mean(-1) 120 | 121 | 122 | loss_stats = { 123 | 'loss': loss.mean().cpu().detach(), 124 | 'loss/hm' : hm_loss.mean().cpu().detach(), 125 | 'loss/reg': reg_loss.mean().cpu().detach(), 126 | 'diff/l1': l1.mean().cpu().detach(), 127 | 'diff/l2': l2.mean().cpu().detach(), 128 | } 129 | output['pred_x2d'] = torch.cat([x,max_val[...,None]],dim=-1) 130 | return loss.mean(), loss_stats 131 | 132 | class CocktailLoss(nn.Module): 133 | def __init__(self): 134 | super().__init__() 135 | self.hm_crit = FocalLoss() 136 | # self.reg_crit = RegL1Loss() 137 | 138 | self.omega = 25 139 | self.sigma = 5 140 | self.c = self.omega - self.omega * np.log(1+self.omega/self.sigma) 141 | 142 | 143 | def forward(self, output, batch): 144 | 145 | # 2d 146 | hm_loss, reg_loss = 0., 0. 
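        # Added note: hm_loss accumulates the per-view focal losses below and is
        # averaged by dividing by V; reg_loss stays at zero because the offset
        # regression term (reg_crit) is currently commented out.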
147 | B,V = batch['hm'].shape[:2] 148 | # print(output['hm'].shape, batch['hm'].shape, output['reg'].shape, batch['mask'].shape, batch['ind'].shape, batch['reg'].shape) 149 | for i in range(V): 150 | hm_loss += self.hm_crit(output['hm'][:,i], batch['hm'][:,i]) / V 151 | # reg_loss += self.reg_crit(output['reg'][:,i],batch['mask'][:,i],batch['ind'][:,i], batch['reg'][:,i]) / V 152 | 153 | # 3d 154 | pred = output['x3d'] 155 | target = batch['x3d'][:,0] 156 | x3d_l1, x3d_loss = self.calloss(pred, target) 157 | x3d_l2 = torch.sqrt(((pred - target) ** 2).sum(-1)).mean(-1) 158 | 159 | x2d_l1 = torch.abs(output['x2d'] - batch['x2d']).sum(-1).mean(-1) 160 | x2d_l2 = torch.sqrt(((output['x2d'] - batch['x2d']) ** 2).sum(-1)).mean(-1) 161 | 162 | loss = (x3d_loss + hm_loss)*0.5 163 | 164 | loss_stats = { 165 | 'loss': loss.mean().cpu().detach(), 166 | 'loss/hm' : hm_loss.mean().cpu().detach(), 167 | # 'loss/reg': reg_loss.mean().cpu().detach(), 168 | 'loss/x3d': x3d_loss.mean().cpu().detach(), 169 | 'x2d/l1': x2d_l1.mean().cpu().detach(), 170 | 'x2d/l2': x2d_l2.mean().cpu().detach(), 171 | 'x3d/l1' : x3d_l1.mean().cpu().detach(), 172 | 'x3d/l2': x3d_l2.mean().cpu().detach(), 173 | } 174 | 175 | return loss.mean(), loss_stats 176 | 177 | def calloss(self, x,t): 178 | diff = torch.abs(x - t).sum(-1) 179 | is_small = (diff < self.omega).float() 180 | small_loss = self.omega * torch.log(1+diff/self.sigma) 181 | big_loss = diff - self.c 182 | loss = (small_loss * is_small + big_loss * (1-is_small)) * 0.1 183 | return diff, loss 184 | 185 | 186 | 187 | 188 | class Net2d(nn.Module): 189 | def __init__(self, cfg, model): 190 | super().__init__() 191 | self.model = model.to(cfg['device']) 192 | self.loss = DetLoss(cfg).to(cfg['device']) 193 | 194 | def forward(self, batch): 195 | out_hm, out_reg, _ = self.model(batch['image']) 196 | output = {'hm':out_hm,'reg':out_reg} 197 | loss, loss_stats = self.loss(output, batch) 198 | return output, loss, loss_stats 199 | 200 | 201 | class Net3d(nn.Module): 202 | def __init__(self, cfg, model): 203 | super().__init__() 204 | self.model = model.to(cfg['device']) 205 | self.loss = CocktailLoss().to(cfg['device']) 206 | 207 | def forward(self, batch): 208 | out_x3d, out_hm, out_reg, out_x2d = self.model(batch['image'],batch['K']) 209 | output = {'x3d':out_x3d,'hm':out_hm,'reg':out_reg,'x2d':out_x2d} 210 | loss, loss_stats = self.loss(output, batch) 211 | return output, loss, loss_stats 212 | 213 | class NetPose(nn.Module): 214 | def __init__(self, cfg, model): 215 | super().__init__() 216 | self.model = model.to(cfg['device']) 217 | self.loss = WingLoss().to(cfg['device']) 218 | 219 | def forward(self, batch): 220 | out = self.model(batch['xdir']) 221 | loss, loss_stats = self.loss(out, batch['x3d'][:,0]) 222 | return out, loss, loss_stats 223 | 224 | 225 | class NetTri(nn.Module): 226 | def __init__(self, cfg, model): 227 | super().__init__() 228 | self.model = model.to(cfg['device']) 229 | self.loss = WingLoss().to(cfg['device']) 230 | 231 | def forward(self, batch): 232 | out = self.model(batch['xdir'], batch['xfeat']) 233 | loss, loss_stats = self.loss(out, batch['x3d'][:,0]) 234 | return out, loss, loss_stats 235 | 236 | # class Net(nn.Module): 237 | # def __init__(self, cfg, model): 238 | # super().__init__() 239 | # self.model = model.to(cfg['device']) 240 | # self.loss = WingLoss().to(cfg['device']) 241 | 242 | # def forward(self, batch): 243 | # lds_pred,_ = self.model(batch['image']) 244 | # lds_pred = lds_pred.squeeze() 245 | # loss, loss_stats = 
self.loss(lds_pred, batch['x2d']) 246 | # return lds_pred, loss, loss_stats 247 | 248 | 249 | -------------------------------------------------------------------------------- /models.py: -------------------------------------------------------------------------------- 1 | # 2 | # For licensing see accompanying LICENSE file. 3 | # Copyright (C) 2022 Apple Inc. All Rights Reserved. 4 | # 5 | from typing import Optional, List, Tuple 6 | 7 | import copy 8 | import torch 9 | import torch.nn as nn 10 | import torch.nn.functional as F 11 | from utils import eulid_to_homo 12 | 13 | 14 | __all__ = ['MobileOne', 'mobileone', 'reparameterize_model'] 15 | 16 | 17 | class SEBlock(nn.Module): 18 | """ Squeeze and Excite module. 19 | 20 | Pytorch implementation of `Squeeze-and-Excitation Networks` - 21 | https://arxiv.org/pdf/1709.01507.pdf 22 | """ 23 | 24 | def __init__(self, 25 | in_channels: int, 26 | rd_ratio: float = 0.0625) -> None: 27 | """ Construct a Squeeze and Excite Module. 28 | 29 | :param in_channels: Number of input channels. 30 | :param rd_ratio: Input channel reduction ratio. 31 | """ 32 | super(SEBlock, self).__init__() 33 | self.reduce = nn.Conv2d(in_channels=in_channels, 34 | out_channels=int(in_channels * rd_ratio), 35 | kernel_size=1, 36 | stride=1, 37 | bias=True) 38 | self.expand = nn.Conv2d(in_channels=int(in_channels * rd_ratio), 39 | out_channels=in_channels, 40 | kernel_size=1, 41 | stride=1, 42 | bias=True) 43 | 44 | def forward(self, inputs: torch.Tensor) -> torch.Tensor: 45 | """ Apply forward pass. """ 46 | b, c, h, w = inputs.size() 47 | x = F.avg_pool2d(inputs, kernel_size=[h, w]) 48 | x = self.reduce(x) 49 | x = F.relu(x) 50 | x = self.expand(x) 51 | x = torch.sigmoid(x) 52 | x = x.view(-1, c, 1, 1) 53 | return inputs * x 54 | 55 | 56 | class MobileOneBlock(nn.Module): 57 | """ MobileOne building block. 58 | 59 | This block has a multi-branched architecture at train-time 60 | and plain-CNN style architecture at inference time 61 | For more details, please refer to our paper: 62 | `An Improved One millisecond Mobile Backbone` - 63 | https://arxiv.org/pdf/2206.04040.pdf 64 | """ 65 | def __init__(self, 66 | in_channels: int, 67 | out_channels: int, 68 | kernel_size: int, 69 | stride: int = 1, 70 | padding: int = 0, 71 | dilation: int = 1, 72 | groups: int = 1, 73 | inference_mode: bool = False, 74 | use_se: bool = False, 75 | num_conv_branches: int = 1) -> None: 76 | """ Construct a MobileOneBlock module. 77 | 78 | :param in_channels: Number of channels in the input. 79 | :param out_channels: Number of channels produced by the block. 80 | :param kernel_size: Size of the convolution kernel. 81 | :param stride: Stride size. 82 | :param padding: Zero-padding size. 83 | :param dilation: Kernel dilation factor. 84 | :param groups: Group number. 85 | :param inference_mode: If True, instantiates model in inference mode. 86 | :param use_se: Whether to use SE-ReLU activations. 87 | :param num_conv_branches: Number of linear conv branches. 
88 | """ 89 | super(MobileOneBlock, self).__init__() 90 | self.inference_mode = inference_mode 91 | self.groups = groups 92 | self.stride = stride 93 | self.kernel_size = kernel_size 94 | self.in_channels = in_channels 95 | self.out_channels = out_channels 96 | self.num_conv_branches = num_conv_branches 97 | 98 | # Check if SE-ReLU is requested 99 | if use_se: 100 | self.se = SEBlock(out_channels) 101 | else: 102 | self.se = nn.Identity() 103 | self.activation = nn.ReLU() 104 | 105 | if inference_mode: 106 | self.reparam_conv = nn.Conv2d(in_channels=in_channels, 107 | out_channels=out_channels, 108 | kernel_size=kernel_size, 109 | stride=stride, 110 | padding=padding, 111 | dilation=dilation, 112 | groups=groups, 113 | bias=True) 114 | else: 115 | # Re-parameterizable skip connection 116 | self.rbr_skip = nn.BatchNorm2d(num_features=in_channels) \ 117 | if out_channels == in_channels and stride == 1 else None 118 | 119 | # Re-parameterizable conv branches 120 | rbr_conv = list() 121 | for _ in range(self.num_conv_branches): 122 | rbr_conv.append(self._conv_bn(kernel_size=kernel_size, 123 | padding=padding)) 124 | self.rbr_conv = nn.ModuleList(rbr_conv) 125 | 126 | # Re-parameterizable scale branch 127 | self.rbr_scale = None 128 | if kernel_size > 1: 129 | self.rbr_scale = self._conv_bn(kernel_size=1, 130 | padding=0) 131 | 132 | def forward(self, x: torch.Tensor) -> torch.Tensor: 133 | """ Apply forward pass. """ 134 | # Inference mode forward pass. 135 | if self.inference_mode: 136 | return self.activation(self.se(self.reparam_conv(x))) 137 | 138 | # Multi-branched train-time forward pass. 139 | # Skip branch output 140 | identity_out = 0 141 | if self.rbr_skip is not None: 142 | identity_out = self.rbr_skip(x) 143 | 144 | # Scale branch output 145 | scale_out = 0 146 | if self.rbr_scale is not None: 147 | scale_out = self.rbr_scale(x) 148 | 149 | # Other branches 150 | out = scale_out + identity_out 151 | for ix in range(self.num_conv_branches): 152 | out += self.rbr_conv[ix](x) 153 | 154 | return self.activation(self.se(out)) 155 | 156 | def reparameterize(self): 157 | """ Following works like `RepVGG: Making VGG-style ConvNets Great Again` - 158 | https://arxiv.org/pdf/2101.03697.pdf. We re-parameterize multi-branched 159 | architecture used at training time to obtain a plain CNN-like structure 160 | for inference. 161 | """ 162 | if self.inference_mode: 163 | return 164 | kernel, bias = self._get_kernel_bias() 165 | self.reparam_conv = nn.Conv2d(in_channels=self.rbr_conv[0].conv.in_channels, 166 | out_channels=self.rbr_conv[0].conv.out_channels, 167 | kernel_size=self.rbr_conv[0].conv.kernel_size, 168 | stride=self.rbr_conv[0].conv.stride, 169 | padding=self.rbr_conv[0].conv.padding, 170 | dilation=self.rbr_conv[0].conv.dilation, 171 | groups=self.rbr_conv[0].conv.groups, 172 | bias=True) 173 | self.reparam_conv.weight.data = kernel 174 | self.reparam_conv.bias.data = bias 175 | 176 | # Delete un-used branches 177 | for para in self.parameters(): 178 | para.detach_() 179 | self.__delattr__('rbr_conv') 180 | self.__delattr__('rbr_scale') 181 | if hasattr(self, 'rbr_skip'): 182 | self.__delattr__('rbr_skip') 183 | 184 | self.inference_mode = True 185 | 186 | def _get_kernel_bias(self) -> Tuple[torch.Tensor, torch.Tensor]: 187 | """ Method to obtain re-parameterized kernel and bias. 188 | Reference: https://github.com/DingXiaoH/RepVGG/blob/main/repvgg.py#L83 189 | 190 | :return: Tuple of (kernel, bias) after fusing branches. 
191 | """ 192 | # get weights and bias of scale branch 193 | kernel_scale = 0 194 | bias_scale = 0 195 | if self.rbr_scale is not None: 196 | kernel_scale, bias_scale = self._fuse_bn_tensor(self.rbr_scale) 197 | # Pad scale branch kernel to match conv branch kernel size. 198 | pad = self.kernel_size // 2 199 | kernel_scale = torch.nn.functional.pad(kernel_scale, 200 | [pad, pad, pad, pad]) 201 | 202 | # get weights and bias of skip branch 203 | kernel_identity = 0 204 | bias_identity = 0 205 | if self.rbr_skip is not None: 206 | kernel_identity, bias_identity = self._fuse_bn_tensor(self.rbr_skip) 207 | 208 | # get weights and bias of conv branches 209 | kernel_conv = 0 210 | bias_conv = 0 211 | for ix in range(self.num_conv_branches): 212 | _kernel, _bias = self._fuse_bn_tensor(self.rbr_conv[ix]) 213 | kernel_conv += _kernel 214 | bias_conv += _bias 215 | 216 | kernel_final = kernel_conv + kernel_scale + kernel_identity 217 | bias_final = bias_conv + bias_scale + bias_identity 218 | return kernel_final, bias_final 219 | 220 | def _fuse_bn_tensor(self, branch) -> Tuple[torch.Tensor, torch.Tensor]: 221 | """ Method to fuse batchnorm layer with preceeding conv layer. 222 | Reference: https://github.com/DingXiaoH/RepVGG/blob/main/repvgg.py#L95 223 | 224 | :param branch: 225 | :return: Tuple of (kernel, bias) after fusing batchnorm. 226 | """ 227 | if isinstance(branch, nn.Sequential): 228 | kernel = branch.conv.weight 229 | running_mean = branch.bn.running_mean 230 | running_var = branch.bn.running_var 231 | gamma = branch.bn.weight 232 | beta = branch.bn.bias 233 | eps = branch.bn.eps 234 | else: 235 | assert isinstance(branch, nn.BatchNorm2d) 236 | if not hasattr(self, 'id_tensor'): 237 | input_dim = self.in_channels // self.groups 238 | kernel_value = torch.zeros((self.in_channels, 239 | input_dim, 240 | self.kernel_size, 241 | self.kernel_size), 242 | dtype=branch.weight.dtype, 243 | device=branch.weight.device) 244 | for i in range(self.in_channels): 245 | kernel_value[i, i % input_dim, 246 | self.kernel_size // 2, 247 | self.kernel_size // 2] = 1 248 | self.id_tensor = kernel_value 249 | kernel = self.id_tensor 250 | running_mean = branch.running_mean 251 | running_var = branch.running_var 252 | gamma = branch.weight 253 | beta = branch.bias 254 | eps = branch.eps 255 | std = (running_var + eps).sqrt() 256 | t = (gamma / std).reshape(-1, 1, 1, 1) 257 | return kernel * t, beta - running_mean * gamma / std 258 | 259 | def _conv_bn(self, 260 | kernel_size: int, 261 | padding: int) -> nn.Sequential: 262 | """ Helper method to construct conv-batchnorm layers. 263 | 264 | :param kernel_size: Size of the convolution kernel. 265 | :param padding: Zero-padding size. 266 | :return: Conv-BN module. 
267 | """ 268 | mod_list = nn.Sequential() 269 | mod_list.add_module('conv', nn.Conv2d(in_channels=self.in_channels, 270 | out_channels=self.out_channels, 271 | kernel_size=kernel_size, 272 | stride=self.stride, 273 | padding=padding, 274 | groups=self.groups, 275 | bias=False)) 276 | mod_list.add_module('bn', nn.BatchNorm2d(num_features=self.out_channels)) 277 | return mod_list 278 | 279 | 280 | class MobileConv(nn.Module): 281 | def __init__(self, in_channels, out_channels, stride=1, inference_mode = False): 282 | super().__init__() 283 | self.convs = nn.Sequential( 284 | MobileOneBlock(in_channels=in_channels, 285 | out_channels=in_channels, 286 | kernel_size=3, 287 | stride=stride, 288 | padding=1, 289 | groups=1, 290 | inference_mode=inference_mode, 291 | use_se=False, 292 | num_conv_branches=1), 293 | MobileOneBlock(in_channels=in_channels, 294 | out_channels=out_channels, 295 | kernel_size=1, 296 | stride=1, 297 | padding=0, 298 | groups=1, 299 | inference_mode=inference_mode, 300 | use_se=False, 301 | num_conv_branches=1), 302 | ) 303 | def __call__(self, x): 304 | return self.convs(x) 305 | 306 | 307 | class UpHead(nn.Module): 308 | def __init__(self,width_multipliers,inference_mode): 309 | super().__init__() 310 | self.proj4 = nn.Sequential( 311 | MobileConv(int(512 * width_multipliers[3]), int(256 * width_multipliers[2]), stride=1, inference_mode=inference_mode), 312 | nn.UpsamplingBilinear2d(scale_factor=2) 313 | ) 314 | self.proj3 = nn.Sequential( 315 | MobileConv(int(256 * width_multipliers[2]), int(128 * width_multipliers[1]), stride=1, inference_mode=inference_mode), 316 | nn.UpsamplingBilinear2d(scale_factor=2) 317 | ) 318 | self.proj2 = nn.Sequential( 319 | MobileConv(int(128 * width_multipliers[1]), int(64 * width_multipliers[0]), stride=1, inference_mode=inference_mode), 320 | nn.UpsamplingBilinear2d(scale_factor=2) 321 | # MobileConv(256, 128, stride=1, inference_mode=inference_mode), 322 | ) 323 | self.proj1 = nn.Sequential( 324 | MobileConv(int(64 * width_multipliers[0]), 256, stride=1, inference_mode=inference_mode), 325 | # nn.UpsamplingBilinear2d(scale_factor=2) 326 | MobileConv(256, 128, stride=1, inference_mode=inference_mode), 327 | ) 328 | # self.proj0 = nn.Sequential( 329 | # MobileConv( min(64,int(64 * width_multipliers[0])), 64, stride=1, inference_mode=inference_mode), 330 | # ) 331 | 332 | def forward(self, x1, x2, x3, x4): 333 | elt3 = x3 + self.proj4(x4) 334 | elt2 = x2 + self.proj3(elt3) 335 | elt1 = x1 + self.proj2(elt2) 336 | out = self.proj1(elt1) 337 | # elt0 = x0 + self.proj1(elt1) 338 | # out = self.proj0(elt0) 339 | return out 340 | 341 | 342 | class Pose2d(nn.Module): 343 | def __init__(self, 344 | num_blocks_per_stage: List[int] = [2, 8, 10, 1], 345 | num_classes: int = 1000, 346 | width_multipliers: Optional[List[float]] = None, 347 | inference_mode: bool = False, 348 | use_se: bool = False, 349 | num_conv_branches: int = 1) -> None: 350 | """ 351 | :param num_blocks_per_stage: List of number of blocks per stage. 352 | :param num_classes: Number of classes in the dataset. 353 | :param width_multipliers: List of width multiplier for blocks in a stage. 354 | :param inference_mode: If True, instantiates model in inference mode. 355 | :param use_se: Whether to use SE-ReLU activations. 356 | :param num_conv_branches: Number of linear conv branches. 
357 | """ 358 | super().__init__() 359 | 360 | assert len(width_multipliers) == 4 361 | self.inference_mode = inference_mode 362 | self.in_planes = min(64, int(64 * width_multipliers[0])) 363 | self.use_se = use_se 364 | self.num_conv_branches = num_conv_branches 365 | 366 | # Build stages 367 | self.stage0 = MobileOneBlock(in_channels=3, out_channels=self.in_planes, 368 | kernel_size=3, stride=2, padding=1, 369 | inference_mode=self.inference_mode) 370 | self.cur_layer_idx = 1 371 | self.stage1 = self._make_stage(int(64 * width_multipliers[0]), num_blocks_per_stage[0], 372 | num_se_blocks=0) 373 | self.stage2 = self._make_stage(int(128 * width_multipliers[1]), num_blocks_per_stage[1], 374 | num_se_blocks=0) 375 | self.stage3 = self._make_stage(int(256 * width_multipliers[2]), num_blocks_per_stage[2], 376 | num_se_blocks=int(num_blocks_per_stage[2] // 2) if use_se else 0) 377 | 378 | self.stage4 = self._make_stage(int(512 * width_multipliers[3]), num_blocks_per_stage[3], 379 | num_se_blocks=num_blocks_per_stage[3] if use_se else 0) 380 | # self.neck = nn.Sequential( 381 | # MobileConv(int(512 * width_multipliers[3]), 512, stride=1, inference_mode=inference_mode), 382 | # MobileConv(512, 256, stride=1, inference_mode=inference_mode), 383 | # # nn.UpsamplingBilinear2d(scale_factor=2), 384 | # MobileConv(256, 256, stride=2, inference_mode=inference_mode), 385 | # MobileConv(256, 128, stride=1, inference_mode=inference_mode), 386 | # MobileConv(128, 128, stride=2, inference_mode=inference_mode), 387 | # MobileConv(128, 64, stride=1, inference_mode=inference_mode), 388 | # ) 389 | # self.gap = nn.AdaptiveAvgPool2d(output_size=1) 390 | # self.linear = nn.Linear(64, num_classes*2) 391 | 392 | self.upstage = UpHead(width_multipliers, inference_mode) 393 | self.hm_head = nn.Sequential( 394 | MobileConv( 128, num_classes, stride=1, inference_mode=inference_mode), 395 | nn.Softmax2d(), 396 | ) 397 | self.reg_head = nn.Sequential( 398 | MobileConv( 128, num_classes*2, stride=1, inference_mode=inference_mode), 399 | ) 400 | self.freeze() 401 | 402 | def _make_stage(self, 403 | planes: int, 404 | num_blocks: int, 405 | num_se_blocks: int) -> nn.Sequential: 406 | """ Build a stage of MobileOne model. 407 | 408 | :param planes: Number of output channels. 409 | :param num_blocks: Number of blocks in this stage. 410 | :param num_se_blocks: Number of SE blocks in this stage. 411 | :return: A stage of MobileOne model. 
412 | """ 413 | # Get strides for all layers 414 | strides = [2] + [1]*(num_blocks-1) 415 | blocks = [] 416 | for ix, stride in enumerate(strides): 417 | use_se = False 418 | if num_se_blocks > num_blocks: 419 | raise ValueError("Number of SE blocks cannot " 420 | "exceed number of layers.") 421 | if ix >= (num_blocks - num_se_blocks): 422 | use_se = True 423 | 424 | # Depthwise conv 425 | blocks.append(MobileOneBlock(in_channels=self.in_planes, 426 | out_channels=self.in_planes, 427 | kernel_size=3, 428 | stride=stride, 429 | padding=1, 430 | groups=self.in_planes, 431 | inference_mode=self.inference_mode, 432 | use_se=use_se, 433 | num_conv_branches=self.num_conv_branches)) 434 | # Pointwise conv 435 | blocks.append(MobileOneBlock(in_channels=self.in_planes, 436 | out_channels=planes, 437 | kernel_size=1, 438 | stride=1, 439 | padding=0, 440 | groups=1, 441 | inference_mode=self.inference_mode, 442 | use_se=use_se, 443 | num_conv_branches=self.num_conv_branches)) 444 | self.in_planes = planes 445 | self.cur_layer_idx += 1 446 | return nn.Sequential(*blocks) 447 | 448 | def freeze(self): 449 | for layer in [self.stage0, self.stage1, self.stage2, self.stage3, self.stage4]: 450 | for param in layer.parameters(): 451 | param.requires_grad = False 452 | 453 | for layer in [self.upstage, self.hm_head,self.reg_head]: 454 | for param in layer.parameters(): 455 | param.requires_grad = True 456 | layer.train() 457 | 458 | 459 | def forward(self, x: torch.Tensor) -> torch.Tensor: 460 | """ Apply forward pass. """ 461 | x0 = self.stage0(x) 462 | x1 = self.stage1(x0) 463 | x2 = self.stage2(x1) 464 | x3 = self.stage3(x2) 465 | x4 = self.stage4(x3) 466 | feature = self.upstage(x1,x2,x3,x4) 467 | # feature = self.neck(x) 468 | # out = self.gap(feature) 469 | # out = out.view(out.size(0), -1) 470 | # out = self.linear(out) 471 | return self.hm_head(feature), self.reg_head(feature), feature 472 | 473 | 474 | 475 | def pose2d_model(num_classes: int = 1000, inference_mode: bool = False) -> nn.Module: 476 | return Pose2d(num_classes=num_classes, inference_mode=inference_mode, 477 | width_multipliers = (3.0, 3.5, 3.5, 4.0),use_se=True) 478 | 479 | 480 | def reparameterize_model(model: torch.nn.Module) -> nn.Module: 481 | """ Method returns a model where a multi-branched structure 482 | used in training is re-parameterized into a single branch 483 | for inference. 484 | 485 | :param model: MobileOne model in train mode. 486 | :return: MobileOne model in inference mode. 
487 | """ 488 | # Avoid editing original graph 489 | model = copy.deepcopy(model) 490 | for module in model.modules(): 491 | if hasattr(module, 'reparameterize'): 492 | module.reparameterize() 493 | return model 494 | 495 | 496 | class Fusion(nn.Module): 497 | def __init__(self,cfg): 498 | super().__init__() 499 | self.cfg = cfg 500 | self.embed = nn.Sequential( 501 | nn.Linear(4,1024), 502 | nn.ReLU(), 503 | nn.Linear(1024,512), 504 | nn.ReLU(), 505 | nn.Linear(512,256), 506 | nn.ReLU(), 507 | nn.Linear(256,128), 508 | ) 509 | 510 | self.pose0 = nn.Sequential( 511 | nn.Linear(cfg['num_keypoints']*256, 1024), 512 | nn.ReLU(), 513 | nn.Linear(1024,512), 514 | nn.ReLU(), 515 | nn.Linear(512,256), 516 | nn.ReLU(), 517 | # nn.Dropout(p=0.5), 518 | nn.Linear(256,256), 519 | ) 520 | self.pose1 = nn.Sequential( 521 | nn.Linear(cfg['num_views']*256, 1024), 522 | nn.ReLU(), 523 | nn.Linear(1024,512), 524 | nn.ReLU(), 525 | nn.Linear(512,cfg['num_keypoints']*3), 526 | ) 527 | 528 | def forward(self, xdir, xfeat=None): 529 | B,V,J,_ = xdir.shape 530 | xdir = self.embed(xdir) 531 | 532 | if xfeat == None: 533 | xfeat = torch.zeros_like(xdir) 534 | # (B,V,J,256) -> (B,V,J*256) 535 | x = torch.cat([xdir,xfeat], dim=-1).view(B,V,J*256) 536 | x = self.pose0(x) # (B,V,256) 537 | x = x.view(B,-1) 538 | out = self.pose1(x).view(B,-1,3) 539 | return out 540 | 541 | 542 | 543 | class ProbTri(nn.Module): 544 | def __init__(self, cfg): 545 | super().__init__() 546 | self.cfg = cfg 547 | self.backbone = pose2d_model(num_classes=cfg['num_keypoints']) 548 | self.fusion = Fusion(cfg) 549 | self.freeze() 550 | 551 | def freeze(self): 552 | self.backbone.freeze() 553 | # for layer in [self.backbone]: 554 | # for param in layer.parameters(): 555 | # param.requires_grad = False 556 | 557 | for layer in [self.fusion]: 558 | for param in layer.parameters(): 559 | param.requires_grad = True 560 | layer.train() 561 | 562 | def forward(self, image, K): 563 | """ 564 | Args: 565 | image (B,V,3,H,W) 566 | K (B,V,3,3) 567 | """ 568 | B,V,_,image_h,image_w = image.shape 569 | image = image.view(-1,3,image_h,image_w) 570 | K = K.view(-1,3,3) 571 | out_hm,out_reg,feature = self.backbone(image) 572 | 573 | # BV,J,H,W 574 | heatmap = out_hm 575 | BV, J, H,W = heatmap.shape 576 | reg = out_reg.view(BV,J,2,H*W) 577 | 578 | # ind (BV,J) 579 | max_val, ind = torch.max(heatmap.view(BV,J, -1), dim=2) 580 | # ind_xy (BV,J,2) 581 | ind_y = ind // W 582 | ind_x = ind % W 583 | ind_xy = torch.stack([ind_x, ind_y], dim=-1) 584 | 585 | # ind (BV,J,1,1) -> ind (BV,J,2,1) / reg (BV,J,2,HW) / reg_val (BV,J,2) 586 | reg_val = torch.gather(reg, 3, ind.unsqueeze(-1).unsqueeze(-1).expand(-1,-1,2,-1)).squeeze(3) 587 | 588 | # coord (BV,J,2) 589 | coord = ind_xy.float() + reg_val 590 | out_x2d = coord.clone()* image_w / W 591 | 592 | # (BV,1,3,3)@(BV,J,3,1) -> (BV,J,3,1) 593 | xdir = (torch.inverse(K)[:,None] @ eulid_to_homo(coord * image_w / W)[...,None]).squeeze(-1) 594 | xdir = torch.cat([xdir, max_val[...,None]], dim = -1).view(B,V,J,4) 595 | 596 | coord[:, :, 0] = (coord[:, :, 0] / (W - 1)) * 2 - 1 597 | coord[:, :, 1] = (coord[:, :, 1] / (H - 1)) * 2 - 1 598 | coord = coord.unsqueeze(1) 599 | xfeat = F.grid_sample(feature, coord, mode='bilinear', align_corners=True) 600 | xfeat = xfeat.squeeze(2).transpose(1, 2).view(B,V,J,128) 601 | 602 | out_x3d = self.fusion(xdir, xfeat) 603 | # out_x3d = self.fusion(xdir) 604 | 605 | return out_x3d, out_hm.view(B, V, J, H,W), out_reg.view(B,V,J,2,H,W), out_x2d.view(B,V,J,2) 606 | 607 | 608 | 609 | 610 | 611 | 
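if __name__ == '__main__':
    # Minimal smoke-test sketch (ours, not part of the original file): fuse the
    # train-time MobileOne branches into plain convolutions with
    # reparameterize_model() before deployment, as mentioned in the README.
    # The cfg keys below are the ones ProbTri/Fusion actually read; the identity
    # intrinsics are placeholders just to exercise the forward pass.
    cfg = {'num_keypoints': 17, 'num_views': 4}
    model = ProbTri(cfg).eval()
    deploy = reparameterize_model(model)        # single-branch convs, same outputs
    images = torch.randn(1, 4, 3, 384, 384)     # (B, V, 3, H, W)
    K = torch.eye(3).repeat(1, 4, 1, 1)         # (B, V, 3, 3) placeholder intrinsics
    with torch.no_grad():
        x3d, hm, reg, x2d = deploy(images, K)
    print(x3d.shape, hm.shape, x2d.shape)       # expect (1,17,3) (1,4,17,96,96) (1,4,17,2)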
612 | 613 | -------------------------------------------------------------------------------- /train2d.py: -------------------------------------------------------------------------------- 1 | import os 2 | import numpy as np 3 | import torch 4 | from torch.utils.data import Dataset, DataLoader 5 | import cv2 6 | import matplotlib.pyplot as plt 7 | from utils import * 8 | from trainer import Trainer 9 | from datasets import Human36M 10 | from models import pose2d_model 11 | from loss import Net2d 12 | os.environ['CUDA_VISIBLE_DEVICES'] = '0,1,2,3' 13 | 14 | cfg = { 15 | 'root_path': '/data/human36m/processed', 16 | 'labels_path': '/data/human36m/extra/human36m-multiview-labels-GTbboxes.npy', 17 | 'lr':1e-4, 18 | 'num_epoch':300, 19 | 'batch_size_train': 128, 20 | 'batch_size_test': 32, 21 | 'num_workers':32, 22 | 'num_keypoints': 17, 23 | 'scaleRange': [1.1,1.2], 24 | 'moveRange': [-0.1,0.1], 25 | 'image_size':384, 26 | 'heatmap_size': 96, 27 | # 'model_path': 'checkpoints/mobileone_s4_unfused.pth.tar', 28 | 'model_path': '/logs/pose2d_384/model_12.pth', 29 | # 'model_path': 'checkpoints/backbone_pretrain.pth', 30 | 'device':'cuda', 31 | 'save_dir': '/logs/pose2d_384', 32 | 'use_tag':False, 33 | 'data_skip_train':16, 34 | 'data_skip_test':16, 35 | } 36 | 37 | train_db = Human36M(cfg, is_train=True) 38 | test_db = Human36M(cfg, is_train=False) 39 | train_loader = DataLoader( 40 | train_db, 41 | batch_size=cfg['batch_size_train'], 42 | shuffle=True, 43 | num_workers = cfg['num_workers'], 44 | pin_memory = True, 45 | drop_last=True, 46 | ) 47 | test_loader = DataLoader( 48 | test_db, 49 | batch_size=cfg['batch_size_test'], 50 | shuffle=False, 51 | num_workers = cfg['num_workers'], 52 | pin_memory = True, 53 | drop_last=True, 54 | ) 55 | 56 | 57 | # trainer 58 | model = pose2d_model(num_classes=17) 59 | 60 | if 'model_path' in cfg: 61 | pretrain_dict = torch.load(cfg['model_path']) 62 | missing, unexpected = model.load_state_dict(pretrain_dict,strict=False) 63 | print('missing length', len(missing), 'unexpected', len(unexpected) , '\n') 64 | model = torch.nn.DataParallel(model, device_ids=[0,1,2,3]).cuda() 65 | 66 | net = Net2d(cfg, model) 67 | trainer = Trainer(cfg, net) 68 | trainer.run(train_loader, test_loader) 69 | -------------------------------------------------------------------------------- /train3d.py: -------------------------------------------------------------------------------- 1 | import os 2 | import numpy as np 3 | import torch 4 | from torch.utils.data import Dataset, DataLoader 5 | import cv2 6 | import matplotlib.pyplot as plt 7 | from utils import * 8 | from trainer import Trainer 9 | from datasets import MultiHuman36M 10 | from models import ProbTri 11 | from loss import Net3d 12 | os.environ['CUDA_VISIBLE_DEVICES'] = '0,1,2,3' 13 | 14 | 15 | cfg = { 16 | 'root_path': '/data/human36m/processed', 17 | 'labels_path': '/data/human36m/extra/human36m-multiview-labels-GTbboxes.npy', 18 | 'lr':1e-5, 19 | 'num_epoch':300, 20 | 'batch_size_train': 32, 21 | 'batch_size_test': 8, 22 | 'num_workers':8, 23 | 'num_keypoints': 17, 24 | 'num_views': 4, 25 | 'scaleRange': [1.1,1.2], 26 | 'moveRange': [-0.1,0.1], 27 | 'image_size':384, 28 | 'heatmap_size': 96, 29 | # 'backbone_path': 'checkpoints/backbone.pth', 30 | # 'fusion_path': 'checkpoints/fusion.pth', 31 | 'model_path': 'checkpoints/pretrain.pth', 32 | 'device':'cuda', 33 | 'save_dir': '/logs/pose3d', 34 | 'use_tag':False, 35 | 'data_skip_train':8, 36 | 'data_skip_test':4, 37 | } 38 | 39 | train_db = MultiHuman36M(cfg, is_train=True) 
40 | test_db = MultiHuman36M(cfg, is_train=False) 41 | train_loader = DataLoader( 42 | train_db, 43 | batch_size=cfg['batch_size_train'], 44 | shuffle=True, 45 | num_workers = cfg['num_workers'], 46 | pin_memory = True, 47 | drop_last=True, 48 | ) 49 | test_loader = DataLoader( 50 | test_db, 51 | batch_size=cfg['batch_size_test'], 52 | shuffle=False, 53 | num_workers = cfg['num_workers'], 54 | pin_memory = True, 55 | drop_last=True, 56 | ) 57 | 58 | 59 | # trainer 60 | model = ProbTri(cfg) 61 | if 'backbone_path' in cfg: 62 | pretrain_dict = torch.load(cfg['backbone_path']) 63 | missing, unexpected = model.backbone.load_state_dict(pretrain_dict,strict=False) 64 | print('load backbone model, missing length', len(missing), 'unexpected', len(unexpected) , '\n') 65 | 66 | if 'fusion_path' in cfg: 67 | pretrain_dict = torch.load(cfg['fusion_path']) 68 | missing, unexpected = model.fusion.load_state_dict(pretrain_dict,strict=False) 69 | print('load fusion model, missing length', len(missing), 'unexpected', len(unexpected) , '\n') 70 | 71 | if 'model_path' in cfg: 72 | pretrain_dict = torch.load(cfg['model_path']) 73 | missing, unexpected = model.load_state_dict(pretrain_dict,strict=False) 74 | print('missing length', len(missing), 'unexpected', len(unexpected) , '\n') 75 | model = torch.nn.DataParallel(model, device_ids=[0,1,2,3]).cuda() 76 | 77 | net = Net3d(cfg, model) 78 | trainer = Trainer(cfg, net) 79 | trainer.run(train_loader, test_loader) 80 | -------------------------------------------------------------------------------- /trainPose.py: -------------------------------------------------------------------------------- 1 | import torch 2 | import torch.nn as nn 3 | from torch.utils.data import Dataset, DataLoader 4 | import os 5 | import numpy as np 6 | from loss import NetPose 7 | from trainer import Trainer 8 | from models import Fusion 9 | from datasets import MultiPose 10 | 11 | # 1e-3, wing loss with an enlarged range, batch size 128 12 | # 2e-4, further reduce the loss, batch size 256 13 | 14 | cfg = { 15 | 'root_path': '/data/human36m/processed', 16 | 'labels_path': '/data/human36m/extra/human36m-multiview-labels-GTbboxes.npy', 17 | 'lr':2e-4, 18 | 'num_epoch':300, 19 | 'batch_size_train': 256, 20 | 'batch_size_test': 64, 21 | 'num_workers':8, 22 | 'num_keypoints': 17, 23 | 'num_views':4, 24 | 'image_size':384, 25 | 'heatmap_size':96, 26 | 'device':'cuda', 27 | 'model_path': '/logs/tri/model_125.pth', 28 | 'save_dir': '/logs/tri', 29 | } 30 | 31 | train_db = MultiPose(cfg, is_train=True) 32 | test_db = MultiPose(cfg, is_train=False) 33 | train_loader = DataLoader( 34 | train_db, 35 | batch_size=cfg['batch_size_train'], 36 | shuffle=True, 37 | num_workers = cfg['num_workers'], 38 | pin_memory = True, 39 | drop_last=True, 40 | ) 41 | test_loader = DataLoader( 42 | test_db, 43 | batch_size=cfg['batch_size_test'], 44 | shuffle=False, 45 | num_workers = cfg['num_workers'], 46 | pin_memory = True, 47 | drop_last=True, 48 | ) 49 | 50 | 51 | 52 | model = Fusion(cfg) 53 | 54 | if 'model_path' in cfg: 55 | pretrain_dict = torch.load(cfg['model_path']) 56 | missing, unexpected = model.load_state_dict(pretrain_dict,strict=False) 57 | print('missing length', len(missing), 'unexpected', len(unexpected) , '\n') 58 | # model = torch.nn.DataParallel(model, device_ids=[0,1,2]).cuda() 59 | model = model.cuda() 60 | 61 | net = NetPose(cfg, model) 62 | trainer = Trainer(cfg, net) 63 | trainer.run(train_loader, test_loader) -------------------------------------------------------------------------------- /trainer.py:
-------------------------------------------------------------------------------- 1 | import torch 2 | from tqdm import tqdm 3 | import tensorboardX 4 | import time 5 | import os 6 | 7 | # EMA 8 | class EMA(): 9 | def __init__(self, model, decay): 10 | self.model = model 11 | self.decay = decay 12 | self.shadow = {} 13 | self.backup = {} 14 | 15 | def register(self): 16 | for name, param in self.model.named_parameters(): 17 | if param.requires_grad: 18 | self.shadow[name] = param.data.clone() 19 | 20 | def update(self): 21 | for name, param in self.model.named_parameters(): 22 | if param.requires_grad: 23 | assert name in self.shadow 24 | new_average = (1.0 - self.decay) * param.data + self.decay * self.shadow[name] 25 | self.shadow[name] = new_average.clone() 26 | 27 | def apply_shadow(self): 28 | for name, param in self.model.named_parameters(): 29 | if param.requires_grad: 30 | assert name in self.shadow 31 | self.backup[name] = param.data 32 | param.data = self.shadow[name] 33 | 34 | def restore(self): 35 | for name, param in self.model.named_parameters(): 36 | if param.requires_grad: 37 | assert name in self.backup 38 | param.data = self.backup[name] 39 | self.backup = {} 40 | 41 | 42 | # AverageMeter 43 | class AverageMeter(): 44 | """Computes and stores the average and current value""" 45 | def __init__(self): 46 | self.reset() 47 | 48 | def reset(self): 49 | self.val = 0 50 | self.avg = 0 51 | self.sum = 0 52 | self.count = 0 53 | 54 | def update(self, val, n=1): 55 | self.val = val 56 | self.sum += val * n 57 | self.count += n 58 | if self.count > 0: 59 | self.avg = self.sum / self.count 60 | 61 | # Logger 62 | class Logger(): 63 | def __init__(self, cfg, is_train): 64 | if not os.path.exists(cfg['save_dir']): 65 | os.makedirs(cfg['save_dir']) 66 | 67 | time_str = time.strftime('%Y-%m-%d-%H-%M') 68 | if is_train: 69 | save_path = os.path.join(cfg['save_dir'],'trainlogs') 70 | else: 71 | save_path = os.path.join(cfg['save_dir'],'testlogs') 72 | 73 | if not os.path.exists(save_path): 74 | os.makedirs(save_path) 75 | 76 | log_dir = os.path.join(save_path, 'logs_{}'.format(time_str)) 77 | print(log_dir) 78 | self.writer = tensorboardX.SummaryWriter(log_dir = log_dir) 79 | 80 | def scalar_summay(self, tag, value, step): 81 | self.writer.add_scalar(tag, value, step) 82 | 83 | 84 | # Trainer 85 | class Trainer(): 86 | def __init__(self, cfg, net): 87 | self.cfg = cfg 88 | self.net = net 89 | self.optimizer = torch.optim.Adam(filter(lambda p: p.requires_grad, net.parameters()), lr=cfg['lr']) 90 | 91 | # for state in self.optimizer.state.values(): 92 | # for k, v in state.items(): 93 | # if isinstance(v, torch.Tensor): 94 | # state[k] = v.to(device=cfg['device'], non_blocking=True) 95 | 96 | self.ema = EMA(self.net, 0.999) 97 | self.ema.register() 98 | 99 | self.train_logger = Logger(cfg, is_train=True) 100 | self.test_logger = Logger(cfg, is_train=False) 101 | 102 | def run(self, train_loader, test_loader): 103 | start_epoch = 0 104 | for epoch in range(start_epoch + 1, self.cfg['num_epoch']+1): 105 | # train 106 | log_dict = self.run_epoch(train_loader, is_train=True) 107 | for k, v in log_dict.items(): 108 | print('train epoch=', epoch, k, v.avg) 109 | self.train_logger.scalar_summay(k, v.avg, epoch) 110 | 111 | # test 112 | log_dict = self.run_epoch(test_loader, is_train=False) 113 | for k, v in log_dict.items(): 114 | print('test epoch=', epoch, k, v.avg) 115 | self.test_logger.scalar_summay(k, v.avg, epoch) 116 | 117 | if hasattr(self.net.model, 'module'): 118 | model_state_dict = 
self.net.model.module.state_dict() 119 | else: 120 | model_state_dict = self.net.model.state_dict() 121 | 122 | save_path = os.path.join(self.cfg['save_dir'], f'model_{epoch}.pth') 123 | 124 | torch.save(model_state_dict, save_path) 125 | 126 | 127 | def run_epoch(self, data_loader, is_train=True): 128 | if is_train: 129 | self.net.train() 130 | torch.set_grad_enabled(True) 131 | 132 | else: 133 | self.net.eval() 134 | torch.set_grad_enabled(False) 135 | 136 | avg_loss_stats = {key: AverageMeter() for key in ['time/data', 'time/infer']} 137 | 138 | t0 = time.time() 139 | for iter_id, batch in tqdm(enumerate(data_loader)): 140 | for key in batch: 141 | if isinstance(batch[key], torch.Tensor): 142 | batch[key] = batch[key].to(device = self.cfg['device'], non_blocking=True) 143 | 144 | t1 = time.time() 145 | avg_loss_stats['time/data'].update(t1-t0) 146 | t0 = time.time() 147 | 148 | output, loss, loss_stats = self.net(batch) 149 | 150 | t1 = time.time() 151 | avg_loss_stats['time/infer'].update(t1-t0) 152 | t0 = time.time() 153 | 154 | if is_train: 155 | self.optimizer.zero_grad() 156 | loss.backward() 157 | self.optimizer.step() 158 | self.ema.update() 159 | 160 | if 'loss' not in avg_loss_stats: 161 | for key in loss_stats: 162 | avg_loss_stats[key] = AverageMeter() 163 | 164 | for key in loss_stats: 165 | avg_loss_stats[key].update(loss_stats[key].item(), data_loader.batch_size) 166 | 167 | return avg_loss_stats 168 | -------------------------------------------------------------------------------- /utils.py: -------------------------------------------------------------------------------- 1 | import cv2 2 | import torch 3 | import numpy as np 4 | 5 | class Camera(): 6 | def __init__(self,R,t,K): 7 | self.R = np.asarray(R).copy() 8 | self.t = np.asarray(t).copy() 9 | self.K = np.asarray(K).copy() 10 | 11 | def update_after_crop(self, bbox): 12 | left, upper, right, lower = bbox 13 | 14 | cx, cy = self.K[0, 2], self.K[1, 2] 15 | 16 | new_cx = cx - left 17 | new_cy = cy - upper 18 | 19 | self.K[0, 2], self.K[1, 2] = new_cx, new_cy 20 | 21 | def update_after_resize(self, image_shape, new_image_shape): 22 | height, width = image_shape 23 | new_height, new_width = new_image_shape 24 | 25 | fx, fy, cx, cy = self.K[0, 0], self.K[1, 1], self.K[0, 2], self.K[1, 2] 26 | 27 | new_fx = fx * (new_width / width) 28 | new_fy = fy * (new_height / height) 29 | new_cx = cx * (new_width / width) 30 | new_cy = cy * (new_height / height) 31 | 32 | self.K[0, 0], self.K[1, 1], self.K[0, 2], self.K[1, 2] = new_fx, new_fy, new_cx, new_cy 33 | 34 | def projection(self): 35 | return self.K @ self.extrinsics() 36 | 37 | def extrinsics(self): 38 | return np.hstack([self.R, self.t]) 39 | 40 | def eulid_to_homo(points): 41 | """ 42 | points: (...,N,M) 43 | return: (...,N,M+1) 44 | """ 45 | if isinstance(points, np.ndarray): 46 | return np.concatenate([points, np.ones((*points.shape[:-1],1))], axis=-1) 47 | elif torch.is_tensor(points): 48 | return torch.cat([points, torch.ones((*points.shape[:-1],1),dtype=points.dtype,device=points.device)],dim=-1) 49 | else: 50 | raise TypeError("Works Only with numpy arrays and Pytorch tensors") 51 | 52 | def homo_to_eulid(points): 53 | """ 54 | points: (...,N,M+1) 55 | return: (...,N,M) 56 | """ 57 | if isinstance(points, np.ndarray): 58 | return points[...,:-1] / points[...,-1,None] 59 | elif torch.is_tensor(points): 60 | return points[...,:-1] / points[...,-1,None] 61 | else: 62 | raise TypeError("Works Only with numpy arrays and Pytorch tensors") 63 | 64 | 65 | 
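As a worked example of the intrinsics bookkeeping in `Camera` above, cropping a bounding box out of the frame and resizing it to the network input shifts and scales `K` accordingly. A minimal sketch with made-up numbers, assuming `utils.py` is importable:

```python
# Sketch: keep K consistent with a crop-then-resize preprocessing step.
import numpy as np
from utils import Camera, eulid_to_homo

K = np.array([[1100.0, 0.0, 960.0],
              [0.0, 1100.0, 540.0],
              [0.0, 0.0, 1.0]])
cam = Camera(R=np.eye(3), t=np.zeros((3, 1)), K=K)

cam.update_after_crop((800, 380, 1120, 700))     # (left, upper, right, lower): cx,cy -> 160,160
cam.update_after_resize((320, 320), (384, 384))  # (h, w) -> new (h, w): scale 384/320 = 1.2
print(cam.K[0])  # [1320., 0., 192.]  fx and cx both scaled by 1.2

# eulid_to_homo appends a 1 along the last dimension:
print(eulid_to_homo(np.zeros((17, 2))).shape)  # (17, 3)
```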
-------------------------------------------------------------------------------- /v1/A.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 4, 6 | "metadata": {}, 7 | "outputs": [], 8 | "source": [ 9 | "import os\n", 10 | "import numpy as np\n", 11 | "from collections import defaultdict\n", 12 | "from tqdm import tqdm\n", 13 | "from utils import *\n", 14 | "\n", 15 | "import math\n", 16 | "\n", 17 | "from triangulation import ProbabilisticTriangulation,cal_mpjpe_batch\n", 18 | "from calibration import CalibrationBatch\n", 19 | "from torch.utils.data import Dataset, DataLoader\n", 20 | "\n", 21 | "from dataset import H36M, H36MPred\n", 22 | "import torch\n", 23 | "import torch.nn as nn\n", 24 | "import numpy as np\n", 25 | "import cv2\n", 26 | "from utils import *\n", 27 | "import matplotlib.pyplot as plt\n", 28 | "from tqdm import tqdm" 29 | ] 30 | }, 31 | { 32 | "cell_type": "code", 33 | "execution_count": 10, 34 | "metadata": {}, 35 | "outputs": [], 36 | "source": [ 37 | "cfg = {\n", 38 | " \"nB\":32,\n", 39 | " \"nV\":4,\n", 40 | " \"M\":32,\n", 41 | " \"isDistr\": False,\n", 42 | " \"cube_min\": [-0.16, -0.22, 0],\n", 43 | " \"cube_max\" : [0.22, 0.28, 0.21],\n", 44 | " \"cube_shape\": [64,64,32],\n", 45 | "}" 46 | ] 47 | }, 48 | { 49 | "cell_type": "code", 50 | "execution_count": null, 51 | "metadata": {}, 52 | "outputs": [ 53 | { 54 | "name": "stderr", 55 | "output_type": "stream", 56 | "text": [ 57 | "69it [43:29, 37.76s/it]" 58 | ] 59 | } 60 | ], 61 | "source": [ 62 | "h36m = H36MPred()\n", 63 | "h36mloader = DataLoader(h36m, batch_size = cfg[\"nB\"], shuffle = True)\n", 64 | "db = []\n", 65 | "for iter_i, batch in tqdm(enumerate(h36mloader)):\n", 66 | " calibr = CalibrationBatch(cfg,batch['pose_2d_pred'],batch['confidence'])\n", 67 | " weights_log = calibr.monte_carlo()\n", 68 | " R,t = calibr.prob_tri.getbuffer_Rt()\n", 69 | " for j in range(cfg[\"nB\"]):\n", 70 | " db.append({\n", 71 | " 'pose_3d' : batch['pose_3d'][j],\n", 72 | " 'pose_2d_pred': batch['pose_2d_pred'][j],\n", 73 | " 'Rpred': R.quan[:,j],\n", 74 | " 'tpred': t.vector[:,j],\n", 75 | " 'Rgt': batch['Rgt'][j],\n", 76 | " 'tgt': batch['tgt'][j],\n", 77 | " })" 78 | ] 79 | }, 80 | { 81 | "cell_type": "code", 82 | "execution_count": null, 83 | "metadata": {}, 84 | "outputs": [], 85 | "source": [ 86 | "np.save('after_monte.npy', np.asarray(db))" 87 | ] 88 | }, 89 | { 90 | "cell_type": "code", 91 | "execution_count": 7, 92 | "metadata": {}, 93 | "outputs": [ 94 | { 95 | "data": { 96 | "text/plain": [ 97 | "[<matplotlib.lines.Line2D object reprs; tag contents stripped in this export>]" 129 | ] 130 | }, 131 | "execution_count": 7, 132 | "metadata": {}, 133 | "output_type": "execute_result" 134 | }, 135 | { 136 | "data": { 137 | "image/png":
"iVBORw0KGgoAAAANSUhEUgAAAjUAAAGdCAYAAADqsoKGAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAADUjklEQVR4nOydd5wb1dm2r5lR1/Zdbe/uFXfjhum2waYEEyBvKCEJCUle4IUkwBecEAIhgVBCCAQIEMAktNC7KTbYBowbuG7vfVe7klZdM/P9MdrmAu6mnIuffpJGZ86cGa2ZW8/9nPNIuq7rCAQCgUAgEHzNkY/2AAQCgUAgEAgOBULUCAQCgUAg+EYgRI1AIBAIBIJvBELUCAQCgUAg+EYgRI1AIBAIBIJvBELUCAQCgUAg+EYgRI1AIBAIBIJvBELUCAQCgUAg+EZgOtoDOJJomkZzczOJiYlIknS0hyMQCAQCgWAf0HUdn89Hbm4usrz3eMy3StQ0NzdTUFBwtIchEAgEAoHgAGhoaCA/P3+vn3+rRE1iYiJgXJSkpKSjPBqBQCAQCAT7gtfrpaCgoP8+vje+VaKmz3JKSkoSokYgEAgEgq8ZX5Y6IhKFBQKBQCAQfCMQokYgEAgEAsE3AiFqBAKBQCAQfCMQokYgEAgEAsE3AiFqBAKBQCAQfCMQokYgEAgEAsE3AiFqBAKBQCAQfCMQokYgEAgEAsE3AiFqBAKBQCAQfCMQokYgEAgEAsE3AiFqBAKBQCAQfCMQokYgEAgEAsE3gm9VQUuB4OtEW3cDO+rWoGmxPXyq7+WdvmvDodv0vWz/wt73vlXX99bHgW3f7dO99L/ndjqg7dJCQ0If1I8+5LO+bQPnoQ0aS7w/faB/XddB0/uPqevawPF1+tvruta/TUcf0kf/WHQ9vkkbtEmPP+LXNv6soaPpOrqmo2O01/r20wftp2lImgqqBqqKpGugGQ9dB12TUJHQJAAJXZeGXEtdkpD6rsCgwoGaLoE00GbQDoOutjRo08B2fZf6gzpSf99D9mPX/aSh+xgj7n+963H2+HrXsfSdBPHzlnTj9cBJx9v0be97PfD739g2qA3xayb1bR10rZBAl4zvSwIdGXTQkI3949dfl+JD6f8y9P5xyNrAdyTpA5/0b9SJj6CvjR4fGaAPnHHfH5mkx9tIxjPxM5DQjff6wBWT9EFXQ9eR+/796BISmtGXNLCPpBv/3tIThnPp5b/kaCBEzSGgrboSXdeRFSX+MKGYjGdZUZBNJhRFQVIUFMWEJMtfWmlU8O1D03Q21W5jR/XLyMGVZDmrjtpY9vWv85v2V6wDKiZiKKgoaCjx16ZBrwc/TLu8jz8kE6ok7+FzEyr7u30P/R9AX7okY4uFKPI0UuJrxB4LIWOILxkNWdeRdR0JHVnXkHRt4H3/5xoSxJ8Hfd7fhyHyZF1D7u/HuBn2vYeBtnL/jbOv/aDjx9tJu/YH/ceW4/v13ZAHv5cH9SsNOn7fjVlmUHt9UJu+Y6L1bxu8v8SubQe/N7YBQ/vfbR+Qpb3/qPi6c0f4R0ft2ELUHAJevO0mervd+7WPrJiQTXGRoygocTEkx8XQgAjafbtsMiHLSr9gkuNiSTYpxvb4tiHtBwkuWVFQBrWRFXkPbQf2GdJ2b2OWlcN0dY2bfUzTUTWdmKahaRDTtPh7fchzX5tdtw2814ip8W16fLs69PM99TvwGUYfg7erg/r60mNo/e8jEZVQOIZLbmBS6mbGZG8mLbmRHBlwGucecheiRp17vjB7/X/i7lJjr013/Rn9BX0M7ktDRpWU+M1Tjt/0ZeNGPni7pAxsR0GLPxvvB9pqkjwgJPo/M54H+h28XUFD6n89+DNtl34HPo/33Tc+SdllTMaN/+uONRqiuKeRUm89Jf5GSoKNlESbKVGbydHd3+ib6WHnKKt4LR7Z0YbIKQkt/tCR0SX6pZQ26PMBiUf/ay1+Qv39Sbu2H3QMqU+WDdpf6tt/sAQFu9V2dC4QQtQcEhLS0pFkBU2NoakqmqqiqjG0mIqm7sk6IN42RozwER7t4UFHQpdlkGR0SUaXjJuFLslo8ZuF1vdg4LUa/4fQfzNC6r8hxga93nU/ba/bJLT4jbS///7P45/ttr8Uf9130zO2c4ijabIO6ZpEpioxOrGR0dmbyR25CVti28B11GR6O0bR3DaLKu8xNJoSicoSqmzoD63/tYQm68bYZQY9G220/jbGs7Ev6LJhPfR9hgS6bISi9fhrTZb63xt9SfFrZLzW5a//jX9/kDQNWVONX/mahqKpSJqG0rdd01B0FUVTkTUVRTfaGNsGfaZrg7YNvJaJ96tr8TbaQJu+48T6+lSQJQcx3YxdD5Co9ZCqd+HSO8jROymgnRy69i5cJPDqDhp1F37dDjDkxsUuN7K+G5hhU+3pwZD3hjOxp89A77eqBh+PIW339n7wdnb5fHCfxmuGvCZuB+k6xr9paagN1be/JIEU/7ciYVhkxjYZXZKQJQlJlpCQjWdZRpZkFFlBNslIssl4r5iQzSZk2YRZMWEyWVHMZswmMybFhMlsxWw2o1hsWOLPJqsFs8WK2WzBZLGimG3IJjOK2YysmFEU41lWTCBJHL6fkIeGOUfx2ELUHAI2TLqYnkB0yK90Te/7Fa+hqhqaGkNVVdAMoaPHVDRNRVeNZ1QVPe6Fa1osHvrVBsK0g14r/aFfDWUPn+9pn4HwsCEjlEGfS7puvB/0+dB9djl+fy7CABK64eWjHvkv4DChI4Esgxz/BS8rSLICsgJyPNgsyYCMpsvouoSmGQ9dk5GRUZBRUFAkiYRMNylFLZiLOvEmOOnCRTXH0Km7aAsW0hnLoduUiifThpb99RMOshobcqOXh4gB4wYuxUXBkM93EQy7fRZ/bdINoWDSVExaDEXXMGsxlPhnJn1gu9HW+Hvtey3puvHvBT2eCxC3CIyEAWQdZOPOZ/zXl2MhGVEcVTKhSX3WkkJMNqFignh0R9J1LLqORQKLJGNVFCxmCzaLGYvFgs1ixWqzYrXbsTmcmCxmesMRWto68VTUQFcT5qgHs8lKzJKM15xE1KKhWPw4pR7SpW6ypU7y6SBH6kLZVbgM0uA+3U6jnkmrnk4nSfjMCSRmFjB56kmUTJjDWJvjCP1VCARHFiFqDgGrKzpp9+1PxEWOP8wDmyRAiT++BLNi/GowyRKKLGFSZONZjm9XpP73iiz3t+t7mOShnysymGR56GfxPhTJuLHLivErRZElZAkUdBRJwxT3rk3xAKaC4cUrexBXfUJN0tT4axXi29A0JN0QfWga6MavYb3vvdYn+tT+10ZETEWLxYZEyXaNlA1sjw1q3/eZ1t9mVyT0+HjUPUadB2/r+9pUWabXkYQvIRlvUgLhHAfBLCe+pGTcSjpdZBCUnLt3tOsmTSXR78Pp92JWY4PEwNCbvqLFr/cePpMHRxP03cXGbhGHeD9mLWaIBz2GWVdR4s+mPhGhG23MWgyTrmJWo5i1PsGtD0QZVA1Fi4sFTUPS+pIYjRBRf9RMjgsF2URMMROVzURMFq
KKlYjJSthsIWKyErNYiJlNRM1mVNPg/3Xt5d9Tf7Im9MVLo3v5N2VVFCwmE1azGavZYogPqxWbw4HNbsfmdGJ1OIxtNhtWq7X/0ffeZDL158pFI2G62tqprqmjZusWOndUobhbsWu9SApothRClhS6rElErSpmk4/k7AipBMmWGsmngzypE5O0+w+IPvy6jUbdRSvpdCnJRJIyyR0+hbHTTiM9q5AxssyYve4tEHwzEaLmEPD/ThtDJKbtWTgoA8Jid8EhxcUEQ8TH4M/kXQSHLH/TUjOPDmpMw+cO4esyHp6OAN7OAN4uP94uP0FPGGMmjIauq8YzGkEzeJ0mvA4Fj1PB6zTjcZjxOix4nRZ67Wb0ffiOnOEISUE/iYEuUoKtZPa2kdfbRZrfTUpvN4khH7om4ZUcaJI8yBwgnsQYT4zUjRkHsqbHBcSAmFBUDUVVUVQdJaZhjmnIUQ1TTEeJGJaGHANJ1ZFUkLW9pwzogKooRM3mPTwSiJrNxMwmguYB4bGnR8xs3ssRDgyzyWSIC5ttN7HxRQJk8Huz2Yy8H5ZaLBrF19lBQ2MzVVUVdO3ciVbXiLWnG5scQjKBanYStafis6bQ5kgiluTAkeIiFTMu3ORLFRTQQb7U8YXCJaBbacJFq5ROtzUdOauEotFzKB05E2d6HqMkiVGH4kIKBN8QJH3vczK/cXi9XpKTk/F4PCQlJR3t4QgOI6qq4e8O4+0K4e0M9osXb5fxurcnvFv2bEwGr0PG45DxOmS8CQq9iQoeh0y3VcJjl4mavlywmPQo6XSSTicZmpecXjPZnjSioQx8eiXpkecZHfmcWcEAWbtEiNwRJ11uJ7FWM1qTguzbc07Wfl0LWSZmMhG1mImazHt8jtlsRG02YlYbUYslLkJMRBWFqCwTkfqmtx4aZFneJ8HxRe+tViuKcmizC9RYlF53F52t7dTU1dJUUU6gtgZzayf2Xg9WUwzZbCJmthGxJBF0JNPpTKHNloAsh0jCS5regws3OVIXhVIbBVIHZmnvtmxQt9CEi3YlHV9CDtb80eSWTqF42HQsKbmHPLdLIPg6sq/37wOK1Nx3333cfvvttLS0MG7cOO6++27mzZu31/Z///vfuffee6mtraWwsJDf/OY3XHTRRUPa3H333dx///3U19eTkZHB0qVLufXWW7HZjCxqn8/HsmXLeOGFF2hvb2fy5Mn89a9/Zfr06QdyCoKvOZqm4+8J9wsWb1cIX2fQeI6LFl0bUC064LdKeJ0yHqeMx2XFl6AQSDEbosUCXvO+3TzsYY2kSJRMuslTakm3lpMut5ERFzJpQQVT11TqwjPYohWjqq+TGH2fY4OdHBMKDzZJiGkS3p5Egu0OgpU60d6+m7SORIyI2UzUakF1JhBLSCDmcBCz2w0BYrMSNVvi4sQQIBFZNkQIENF1IpqGeoh/txyIANn1vfkQR2z2BTUWw9/txtPRTlNTC3W1VfRUV0JzKzZ3N45gAItZR7dYiZlthC0Oos4kwglJNAyfQJdixaKHScZDOoZwGS7VcaLURoHUjuULhEtIN9MiZdBuziSQnI89dzi5JdPIL5mCPTmP4ZLE8CN4LQSCbyr7LWqefvpprrrqKu677z7mzJnDAw88wKJFi9i+fTuFhYW7tb///vu5/vrreeihh5g+fTrr1q3jxz/+MampqSxZsgSAJ598kuuuu45HHnmE2bNnU15eziWXXALAXXfdBcCPfvQjtm7dyhNPPEFubi7Lly/n5JNPZvv27eTl5R3EJRB8FdE0nYAnvJtY6Y+0uMNog0RLRBkUZUlR8OTa8DoV/MkKXodCtwWi+6BZTDGd5IBGUkDFHtUACasG2TGdCWEP00zrMKetI5C60/Br4kiBTEI9Y+nomUllpx+HupYRtte4WuvdLRoTCJgJNtnwNVvp8KfgtyUQcDgIFDkIpmcQysgg4LDTq+tE95Dr8yUXznjsAbPZfNCCxGKxfCXXWNJUld5uN76uTrra2qirr6O1poZYYz2WTjdOnxd7OIRiMaNbLEQtdjSLHZMzAX+Ki/bcUrrMDnp1M2Y9TDJe0ukmEzfjpTKK48LFKu09chbWTbTILrosmYRS83HkjCSvdBqZhZOwJeZSIsuUHMFrIhB8G9lv+2nmzJlMmTKF+++/v3/bmDFjOOuss7j11lt3az979mzmzJnD7bff3r/tqquuYv369axevRqAX/ziF+zYsYN33323v80111zDunXr+PDDDwkGgyQmJvLSSy9x+umn97eZNGkSixcv5uabb96nsQv76auDrukEfBG8nSF8XcHdxIvPHUJTjT9NTYJem2TksfQJF6fx8CWa8Ngl/PtgC6HrJAZ1kgIayQE1Ll407GGdqGJEcvwOhVRZYlhMZ5JHY1S4lYhrA77M9YRSquLLfhrI/kxirbk0bdeQurrIyOykONHNKD045NdCTJXp7E2hIZRNtVpImzWTgMNByG7fp2ulKMpBWTQ2mw2LxXLIrZojhaap+Lu78XV14u3soKmpmcaGWvz19Sjtbdh7PCQGAlhVDclsRrNYiFnsBG0O/E4n/oQE/AkJeC12fLqVXt0SFy4eMvRuMiU3+VIHxVIrRVI7Vmlv6cQQwUSblEGXLZNIagGO3JHklUwjNe8YSMozZssJBIJDzmGxnyKRCBs2bOC6664bsv3UU09l7dq1e9wnHA73W0h92O121q1bRzQaxWw2M3fuXJYvX866deuYMWMG1dXVvP7661x88cUAxGLGrJU99dMnjPZ27HB4YFaS1+vdn9MVHAS6rhP0RY3ISudAhKXfKuoKocaMiELILA3ksThkPNkynlKHIVoSFDw2qX+Rpy/CpmokB1USfTpJfq1ftCQHNJL9GokhDY9dpj1FoTXVRFuKQrvLxPCwznR3mMndEQpaTUSdLfRmrsc3agNNSXVDjhFzp9FRbqa33k4OYQqStzMzsZOk5Ej8xI2nDj2VSqmECoqpV/KIJZsgefcxm0wmkpOT9/pITEzEYrEc1HfxVUbTVAI9Pfjcnfi6OulsbaOxqRF3Qx16SwvW7m4Sev04olEsMQ1MJnSrjTSLDZvTid+ZiKc0m2ZnAkGbjQBmQ7hoFkx6hBQ8ZNBNllTDFK2dIqmNYrkV2xcIlygK7XI6blsW0fQCnLmjySuZRkLWeCzJ+RTICgVH8BoJBIJ9Z79ETWdnJ6qqkpWVNWR7VlYWra2te9xnwYIF/POf/+Sss85iypQpbNiwgUceeYRoNEpnZyc5OTmcf/75dHR0MHfuXHRdJxaLcfnll/eLp8TERGbNmsUf/vAHxowZQ1ZWFv/5z3/45JNPGDFixF7He+utt/L73/9+f05RsI/ouk7IH41HWkKDxIsRefF1hYhFNVQJfPZ4HkufcMmX8Yx0GK+dCqF9yGVR0MkiSko4hMMfw+7TcPQopHgUkgOGZWSLDkRRogq0Jyu0pZjYXmChLUVBd/qZFgowtUfnzG4TOWUWkqMmdHTCifX4ctdTl7mBSELzoPMEX0cKXa3ZyF0OSlQ3x5mayM9uja+taRDBRA2FVFBMJ
cX0SMmg6zhNJjKTkkjJziYlNbVfrCQlJZGcnIzD4fhK2jmHAl3TCHg9+Do78Lk78XR20NrUQktzA5HmJsxdXTh9PhyRKPZIDEtMBZOFNIsFh91pRFlSs2nLdxJwOgg6HKi6hF+34NWt+HQrZj1Maly45KgNhmiRWilW2rBLkb2OLYZMu5xBjyOLWFo+ifljyS6cij1rHObkAvJkBWFqCwRfPw4oUXjX/wnrur7X/zEvW7aM1tZWjj32WHRdJysri0suuYTbbrutPxy+cuVKbrnlFu677z5mzpxJZWUlV155JTk5OSxbtgyAJ554gksvvZS8vDwURWHKlCl873vfY+PGjXsd5/XXX8/VV1/d/97r9VJQIH5j7Qu6rhMOxPoFS5946bOKvF0homGVoGUgyuJxxp9LZLzjnHgcCr32fZs1kyqp5OkBsmMeEiMelN4Iuk9G8TlwehNI67TgGJIUEy8IRwyfTaIx3UR7ikw0KYojLUpmmkSJzcKCsJPsHitabQBLqwmLmkCEGD4pgDtlO02ZG9Ezt2Gye/p71jSZnp5sPJ05pHRpDI82cQobSVACQ9YR6tBTqQ/n0hTMoF1zYE8IYXUFmTYxhVHHXEJaWv7X1vL5MnRdJ+j14OvqjD866Gprp7mlEW9LE3J7Bw6PNy5YotijMSxRlSSTBafVSsAZFy056XTHXwftdpAkorqMT7fi1W34dCPikkoPrmg3OVInxVIbxVIbRXIrTmnv60OpyHQoaXgcWagZhSTljSUjfwq2zLGYkgvJVUzkHsFrJhAIDj/7lVMTiURwOBw8++yznH322f3br7zySjZv3syqVav2um80GqWtrY2cnBwefPBBrr32Wnp6epBlmXnz5nHssccOybtZvnw5l112Gb29vUPWkPD7/Xi9XnJycjjvvPPo7e3ltdde26fxi5yaoYSDMUOkDIq2DBYvgYg6ZIqzp1+4KP3b9mWKswWdPClMnuYjN9JFRqgdS9CNHuklqDoJhjKJhLMgkEKSz4zLq6LsIddVk6ArUcafAlK6RFKOleyiVEpysyhx2slVIVbroauihdaaNnrdPQQI4ZdC9EphemU/ckodqa5a0tMbsFqD/X2rqkK3O5dYVwrZXQFGxOrJp3XIcvMxVaarJ5E6v50qWxp6iZ2UwjqsGSDJZnJzz6Ok+GdYrVm7D/5rhGEdevsFS2+XEWVpa2mms60Fra0VW08P9rARYRksWjCZidiNXJaAM57TMki06LKMrkMIEz7dZogXrU+4eHBJ3eTF81tKpFaKpDYSpNBex6oi06mk4U3IRMsoIiV3HKl5U7C4RkNKIShHfpaVQCA49ByWnBqLxcLUqVNZsWLFEFGzYsUKzjzzzC/c12w2k5+fD8BTTz3F4sWL+8VKIBDYbfErRVHQdZ1dNZfT6cTpdNLd3c1bb73Fbbfdtj+n8K0iEor157DsulaLpytEtzYgWgyRouBJkPFkyXgdifht+5b06JJi5BEgL9ZDfqiNHH89ib0tEPUTkG10WVKot+bSrhdSE8qivbeQrB6VrG6VrOCu6sWY7RM1QzjdhDnbQWpBEvlFSYwsTCZNVvF6PHg8HnpaunB/tp2Wd9dQ1uvFpwYIEhlYQc4MkqSSmtpCekY9+ekNmM0DlkQsaiLUmk5yo05pZxe59i1YLfHZLfE+fL0mysM2PrLZ+KyogPGnn8w4cwOZnncxIkUyOdlnUVJyBXb7Vz8KaNiGvYYl1NVJbzyXpautjfa2FgLtbVjcbhyhMLZIDHs0LloiMYpjKsUmM0FHXKgkJON1OmlJGCpaADQd/LqlX7j4dAtyLEoaHjJxUyC3GzaR1EqxqY1EKbjXMatIuE2peBOyIKOY5NyxJOdOwZwxGiWlkCyTha+3jBQIBIeK/bafrr76ai688EKmTZvGrFmzePDBB6mvr+enP/0pYFg+TU1NPP744wCUl5ezbt06Zs6cSXd3N3feeSdbt27lscce6+9zyZIl3HnnnUyePLnfflq2bBlnnHFGf/j+rbfeQtd1Ro0aRWVlJb/61a8YNWoUP/jBDw7FdfhaEg2rQ6Y5D55B1NUTok1X+/NW+oVLiownV8HrSERVvjzKYpd08qUweaqXvEgXeYEm8nzVZPdUomghQrKNZlsmNfY86mx5fGjPpTF5LImylayemCFeelRGeGKMi4EhBIb+8taSzdhzHKTnOshymUhMUolgROQ8nkY8nR52VHn4xOMltpcCoUC/EDFJUTJSGknNqCUlowWTedC06ICMc6eOqylClurBmdJqlO6JJ/FGVZnymIUP7XZecdnoHpXGwuKFLMifzUm979PS+gR6wBhDpmsRpaVX4XR+NVYY0XWdsN8fT7rtwNdpiBZPZwcdba142tvA3YUjGDIiLNFYv3DJikQpjMTQrNaB6EpyKt1OJ43x9wGHY0gxy5guxQVLXLioFuSoSjoeXFInRXItJVIrxXIrxVIbSVJgr2PXkOg2peBLzEJylZCQM5ak7MmYM0ahpBbhMllxHYmLKBAIvtbst6g577zz6Orq4qabbqKlpYXx48fz+uuvU1RUBEBLSwv19fX97VVV5Y477qCsrAyz2cwJJ5zA2rVrKS4u7m9zww03IEkSN9xwA01NTbhcLpYsWcItt9zS38bj8XD99dfT2NhIWloa55xzDrfccstRWcTrSBGLqPjcA7OF+qItPV0hmv0h2jRtIPk2nojrccl4ixSC1sQv7V9GJ1tWydX95EV7yAu1kuevJ99TQZ63ioxINx5zAnX2PGptudTa8/jUnsdzGYtoyM8mKplIDOr94iW7QeX4nhhpvjDSnqqPmyTs6WYS0yTMCSqaxU8QDz5/D10eD43b9/5rvR8dHFhw6jacmhVZVdHDvUTVVpT0MlLyWknM9yOZByJ8Sg8kVmlkdobJloOYrdqQmUjtcgLv2GTeSbKxyWYFxcy8vHlcNWwJszLH09L4KI1V/0uvZkR50tPnU1p6NUmJ4798vIeQcMA/JIfF19WFr6uD7vY23B3tRNyd2ALBuCU0IFoSozEyIzGs0RgRm21AtKQ46RhkEwUcDrRdcoDCuoJPtxmJuZqVXtWGrKmk0UOW1EGJXGtEW2TDLkr+EuHSY06mNzEbOaMYR/Y4krInYXKNQk4tJt1kJf1wX0SBQPCNRpRJOIqoUS0uWoZGWtp6QjQGIrTp6iBraFAyrl3ep/pCSZJm5LKoXnIjneTHoyx53TvJCzaTHekkKNuotedSa8uj1p5nvLbnUWvLo8WaEa9CDbKqk+FTye5WyexRyY4LGUdkz38+ik1HccbQLAFC9NAb6yQq+/deXCiOWTaRINlxRiwk6Facuo2E+MOsyQSDHbhDjXSGGumlHldeF0mlPszFkSFJvCY3pNbGyPaEyJBDDL5cqsnGztRcXpICvGeVaYsXR5zomsiS0iUsLF5Igkmhrv6fNDQ8iqr6AUhJnk7psGtITTkyq1i3VVfy4dNP0NXagr+rE3MwGI+sRPuFiy06IGAsMZVQn2hJcA7JZ9mbaNF14tOgbfg0I+ri1+1IukYKXeRK7UOi
LcVSK6lS7xeOu8ecjD8pGzmjBHvWOBKzj0HJGAmpxWDet7V5BAKBYDD7ev8WouYwoqoavbtEWrq7gjT6wjQEw7Tp2m6JuF6HTNjy5bksJnSypRj5BMiPuckLtpLrryfPU0Gep5K8cDtJqh8d6DSnUmOPi5Z4xMV45OM2734dbGHNiLz0qOR5VPI9Kok9scEL6A5CRzUHiSq9xEy9xMx+YqZedGX3dUAkSTKmMiclk2hz4lStOAImbN1g98ok6DYsmJDiyqc35qEz1EBnqJHOYCOar5k0k5+kYQEso0OoRVFjElQca4dOWnOUvFCIpHBkiH4Kpw/n85QsnlI7eY8AsfiMrLyEPJYMW8Li0sUUJRWhqkEaGh+nru4BYjFjRlRi4jiGlf6StLR5R2z6dW17Ny///AccU9vQP+VZ1vV+0bJrEm7fY1fRAqDqEr2D8lv8uhO/bkPXVFKlTvL7pkHHoy3FUitpXyJcPOYkAknZSBkl2LPGk5B9DEr6CEO4WByH6aoIBIJvK4e19pNgAF3XaS7viSffBmnpDlHvC9EUjtI+KNLicSh4HTK+YgkkBfji//GnYERZ8jUP+ZFO8gIN5Hmrye8pIzfQRGbEjYKhMlRkmqyZ1NlzqbHnsTZ7AXWOQmqcRdRZXPhl694GT5o3xuieGMN7omR0x7B7NOTgnm/cmhQjZvITM/eixp9jJn//Krt2u5305GSSkzOHrseSkIQ9oGBuV4nV9RKu86IHh+bGaLqGJ9JOXV8UprcOq7eTpECE1MQIOSN9hE9WiRbp8XM2sHdClkcn292DMzgod8aaRLhoNhuS0ngs3MTa3hrQGkCCREsSC4sXsmTYEia5JiFJEpoWpqHxcWpr7yMS6QDA4RjOsNL/w+VacMTEjCcY5eY31lG+4n2uaGmjYcRoehMS+oWMatrzP9moLhsWkWqjFycBzU5AsxHTY6RI7RRLbYyWqiiW2+JRl1YypC9ejNJrTiSYnIOUUYrNNRZn9kRDuKSVkGxx7mktQYFAIDiqCFFzCDh/UyVuuyFeYgUShg+y9/VJLHo8yqIHKFC7yAu1kt9bR56nnDxPBbnhdhza0JyUsGSm3pZDjT2Pja4TqE0aTk1CMXXWLOrlZKLSnqM7iqqSEuhlWDTCsJ4omT0qTo+GySsjBcxI+q7jNG7eqhLsFzAxkx/dGiQxzUZaSp9Yydtt9du+lW+1QJRwnZdIrZfwFi+RxgbCMX1Ilk1Mi9IVbqYz1Eh3oJ6Yu4bEXg/OkMawQABzZpjgJI3QZJ1Yrk5/3ECHJK+ZTHeQrHYvtvCg8FHWBKLDjmd9YirLfeWsaf0EtcsQOibZxLy8eSwZtoTj8o/DqhhCT9NitLS+SE3NPYRCTQDYbAWUllxBdvaZSNKRWWcmFFX519pq7n53B05/lKu63qV8/DG0ZWcbpx2fBu3VbPhIIKA7CGo2/JqFkB4lQzJmE42VKo1oi2LMKnJJni88rs+cQCg5B9JLsWWOw5k9ETltGKSVkmRN4OibtAKBQLDvCFFzkEiShDvNTNegfOU0TSOXMAW6l4JoO/lBI8qS172DvEAjGdGePaaW+BQHtfY83s2cT23KGGoTSqi1ZlNrSqUZK/que+k6jkiI1HAPKeEQ+VqYzEiYlN4I9h4dxSchBS2Yogkoqh0JMzA0sVpHI2byIznCWJN1EjLMpOY6SMtIIzm5pF+wOByO3abdG0PQUbvDhLf14K/1EK71EmvbPVk0pPrpCDXSFWwg6KlBcdfiDERICkYoDAXRJZ1okU5wtobvGA01c9A11iVS/GYym924usJY+1YOtibD2OPRhp/M5qQMnm9dyzv17+Bv8ffvOzhPJsWWMmjcGu0db1JdfReBQDUAFksmJcW/IDf3XGT5yJQm0DSdlz5r4k9vbKPDE2NK2MppKU+TapJ5M2McjZEUoloyASLkxIXLBKkivmpuKyWmVjKlni88Rq/ZSSglF1JLsGWNw5k1ESndEC6J1kS+PKVcIBAIvh4IUXOw6Dp/7XwYm7ee/J5ycsIdWPU915Xpy29Znx0XLYklRo6LKY1a7HRpQ6MC5liUhHCQBI+H0eEAaZEQWbEwSeEgtlAA/AGUiA1TzIkpmhB/TkXW9zwjTDKrhnBxmUnLdZBVnEJucQYpqcmY9mJr7HYOmk60xU+k1kO4zku4xoPm2/18vZEuOkON+HpribqrsHS3kBCMURwMoMTTuHRZJzJcxzNJIzRVRk0csKRkXSatRyWzrZcMdwRzLC5ksifC8JNhxClUJqTxSu0bvFb1OG2Btv59d82TGTJ+XaerayVV1XfS27sdAJMpheLin5Kf930U5cglsn5Y0cGtb+xge7OPvJjMDyJgHvYaH/TMYE1hKWPVav5oup9h5maype4v7CtgdhBMzoW0UmxZ43Bkju8XLgm2ZBKO0DkJBALB0UQkCh8K7p4APcY0dtWcSHPGeOpSx1GTWGrMJjKlUys5qY0p+ONuiaxpOCIhEkIBEsNBQ7yEg6RGgqSEQ9hCAeTowEJxkmbCFHViiiVgijpRYgmYYg4k9mA7SeBMNfULl+ySFDLyE3Em7yW35gvQIiqRep8hYmq9hOs8EB36J6PpKu5wK92BBsKeavSOCmw+L6mBINZd1pXRTTqRqQlE5znwF7pRTQPTuBVNIqMzhKszTLo7gkkDbMkw7EQYfgoMP4lOk4k3at7glapX2OHe0b9voiWRBcULWFK6hMmZk/eYA9PdvY6q6r/g8WwwjqckUFhwKYWFl2IyHbl4xdYmD39+cycfVnTi0GB6WEKzBthpjdERNcYhofGW5TpGyo39+wXNDoLJOYZwyRyPPXNcv3DBnnLExi8QCARHGpEofITQdZ1l8x6lRjVRp5qoD6tENA1rLEpCKEBCb5CEcCeJoXpmxYVLcjiILRxCYg96UgdZtWGKJWGKOrFqSZhiCcYSu3vAYlNIz08gIz+RjIIEMvITSMtxYrIcWC6I6osQrvUSqfMSrHITaw0i7TLMiBbGHWyk11dLrKsSxV1Loq+X7Mjua9PELFa0sSOwnpBOsLQbj3kHqtYNGJEHU0zH1RkmszNMancURQdyjoE5p8CIUyBvGkE9yvv17/PKJ7/no+aPUPUvzpPZFa/3c6qq78Tt/hAAWbaSn38hRYU/wWJJO6DrdCA0dge44+1yXtjUBDoUqKAjscquA3aIGunjIyUPi80rGCk3oloSUL7/PGSMwO5IQ0yIFggEgr0jRM0hoHx7NZLfz6hwkKnhAAmhIGZN/dL9FMlMktWFQ0rDFHOiB6yEvaDt2b0iMd1GRr4hXDIKEsnITyAx3XbAM3N0XSfWGSRS6yVY3U2o0g2+ofO2JSAQ89LTW0eopxqpqxKru5EEvx/nLv1pkkSXqwBpzDhy54zHPCFCj7wZd/eHaFp8FWENLGEVV1eEzM4IKZ4osjUFhp0Kx50Cw06CxCw0XePT1k955aMbjTyZ6KA8mYyJLB62mIXFC0m1pe71/Hp7y6muuYuOjreNc5FM8fpMPz+i9Zl6AhHufa+Sx9bWEtV0bBrEJJ0GkwTxTKm
pKCzERHpgHZ+nhThTfx8AZfYVUDjziI1VIBAIvs4IUXOQSJLE+OZatODuybFOpzM+UyiZBGsyplgCmt9C2Au9HTG8nSHQIYLx6EMxyaTlOvsjLxn5iaTnJ2C1H9zXpcc0Is29hGs89O5sI9YYQI4OFUS6ruMNt+P31RBzV2Hpqsbe2USipu2WUNrqSKUlpxTT+AkUz5nOmDnDSQl/REfHW9S5/4zePWA92UIqmR1hXF0Rkr0xpJxJMPEUw1bKmwqKcW5VPVW8suE/vFr96m55MotLF7O4dDHFycVfeJ7BYD3VNX+ltfUljEwmiezssygtuQK7vfDAL+B+YsxoquXedyvojah9QyEkA0jkAadhZQFm6p3beUt+mWN6ipiAG5fUjWZPRT728iM2XoFAIPi6I0TNIWD+sTPRdZ3k5GQSE5KQIlZCPdDTEqSzsZfO8l66eqPsLl/Anmjuj7pk5CeQnp9AapYDWdm3YpJfhBaMEa734i/rIFDRAZ0qsj7Qr4yEqkXxBhoJ99QgdVdjaylDDnh2EzBes4Oy1AJqXcXYJoyndO4M5k4fySxrNx0db9Pe/hDrPvsUBllqDn+MzM4ImZ1hErQkpGGnwaR4NCZhoJJPZ7DzgPNkBhMKt1Jb+3eam59B1w1B5XItoLTkKhISRh74hdxPVE3nqXX13P5WGT3BQWE3Caw6nCDJLMHGRBQqbQ3cn/0KNHdw8kYzFdNcnK+/DhLIc64Em5hULRAIBPuKEDUHiaZqpMRK6Wz0Ud7Yi7ulHi22e66MJEFKtnPAPooLmANJ3t0bsZ4QgUo33q1NRBt6MfmV/tV548UOiMQC+L01xLqrsXRWILdWYNLVIX8IEdlEZUoe5SmFlKUWoI0ey/hpYzl+dBYXFKYQDdfT0fEWDTt/wzb/9iFjSPTF4jkyUZypE428mJNPgbwpIA/k+QRjQSNPpvqVoXkykom5+XM5Y9gZX5gnM5hIxE1d3T9obFqOFl/fJy1tHsNKryYpaeLBXNL9QtM0Hlpdw33vV+EJDvUQh8dkzpRNLJSt2JFoM3dxu+slUiZnc9r2STRvXUl3fjGT2E6a5EF3upBmXHbExi4QCATfBISoOUgkWWL967WE/AM3sSHJu/kJZBQcXPLunjCmVvfS/Xk9/vIOpA4Vc2xgbRVz/KsNhjoJ9VQju6tRWraj+1oxMfDFa0g0JGZRllpAeWohZamFuDMLmD06m/kjXfxwpAtXohW/v5z2tsfY+NFL9EYaBg1EJ9kbI7MzjKvXgb3wFJhmzFTCmTFkzJqusb51Pa9Uv8KKuhUHlCczmFjMR339w9Q3PNJfnyk5eRrDSq8hNXXG/l/UA6SpJ8h971fy3IZGwrGBnCQHKseGLXxXtjLebFxxn+znwYw3qC2p4Dfz7yDFY2H5368iSzVRk53B9/RXQQJp7tVg2TVrSSAQCARfhBA1B4kkSYyfn4ck0T8D6WCSd/eGHlXx7myh+7N6InU+LD4zpvhCepb4Csa6rhLsbULtrkFuL0PvKEcPe/u/ZB3oTUxlW3IB25LyKUstpDIln4DZxsT8ZOaPdHHpKBfH5KegyBI+3xbaGx6ksuNNgpp74Jx1ndSeKK7OCC7zKKwli2D6KZA7eUg0po+qnipeqXqF12peo9Xf2r99f/JkBqOqQRobH6e27kFisR4AEhPGUTrsatLT5h+RkgbBiMqb21pY/lE9G+qHriFTaNeZ3aNyhpRModW4HhEpykupK3kv+y0um3YJvx9zNwDP3HM9plgMT24xU9hGiuRDT8xBmnbpYT8HgUAg+KYhRM0hYOYZpYe8z3B3Lx3rKvGXt0N7DHvEgSwp8SiLUTdKVcOEPbXQVQFtZajdNaAaOTsaoNsddI6YwOaEPD4yZ1OeWkCX3ajYk+owM2+Ei0tGuThupIuMBCu6rtLT9THVm56g3bOGsDSQ/CxrOmndEVxeC660+ZhHLIKFJ+4Wjeljr3ky5kQWlOx7nsxgNC1Cc/Mz1NTeO6g+0zBKS/+PTNcCpL2UijhU6LrOhrpuntvQyCufNeOPDJ3hNjnTwbSuNuZ0Z1NilVHi5/Zu0ic8k/kycwvt/GfuCyQ4cgEo/3g1jTu2kqbbaMpK5yL9JSNKM+8aMNsO67kIBALBNxEhar4CaKpK1446uj+rI1Lnxew14YxX3XH2rUwiQTTsQe2ugvYy1K5KNG8T6HG7w2xGGTmC1txhbLTn8IaaRrklDT1+o5ckOCY/he+PdHH8KBcT49EYTYvQ3fgqOzb9m47wVqJytP94iqqT3hUhU88nPft0TLNPg5zJsIdyCQChWIj3G97n5aqX95gns6R0CfML5u9TnsxgdF2ltfVFqmvuIRQyFqOz2fIpLbmCrKwzkeXD+2fc3BPk+Y2N/HdjEzWd/t0+n1mYwvGSh6IyjdHWXCw2Q8xscuzk0cznyXbV8scJlzJxxNX9taSikTCrlj+KrGn4s3KYxuckSn705AKkKRcd1vMRCASCbypC1BwFervctG8so3dHO3pbFGckAZvixIEJBwOLwUV7W9A7ylHdVahdFeiBrv7PLEVFWOYuoi1vOOts2bzSm8AO96DF70yQ7rQwf6SL+aNczBvhIs1p5NyoYS/ubXfT3vYqnXodMSWe2CyDKarh8si4bMeQVnwuypwF4Nj7AnWHOk9mMEZ9preorr6bQKDSOG+LK16f6buHtT5TMKLy9vZWntvQyOrKTvrW3TZWljEYm5PID4anIK1sYoKUgDMuZmqsTTyc+QLB9C2c73Jy8pRHSEudNaT/Da++iLejjQRzEl2ZKczTnzeiNPN/DaZDlzwuEAgE3yaEqDnMRCNh2nZW4P6slnCtF7NHIUXOwCJbSOsTMAroWgy1uxa1qwK1qxLVXQVRw/5R0tJwzpiIbeIEektG8ZE5i/eagqyt6iLQ3meBhJElmFyYyvx4NGZ8bjKybNxoY107af38ITq6P6DT3IWmSH1TojBHNDJDybhSjiN13PeRc6ftNRrTxxflyZxeejpLSpfsV57MYHRdp8u9iurqO/H5tgHx+kxFl5Gff9Fhq8+k6zob63t4bkMDr37Wgi88sM6O3SwTjGroQH6qnf+bU4qyppLh7/eQYjIqK3WaunnM9Qrb0j/ijNQws3NnMW7cnVgtQy26XncX6158FknXiaRmMIPNOKUgpJbAMRcclnMTCASCbwNC1BxCdE3D3dxE69Yyene0ordGcEaTSDa7SJOSgeT+K65F/GjuSkPAdFWi9tSBFkOy27GNG4v99POxT5yANGYcm0JWVpZ3sqq8napVfqCu/5iuRKsRjRnpYt6IDFIc8ehFLEy08lVaa/9Ne3ATbmcEXZbABiBhC4OLQjKzTiN51CVIThdfRlewizdq3uDlqpd3y5M5tfhUlgwz8mTkg8ht6e75lKqqv+DxrAdAUZzx+kw/PGz1mVo8QZ7f2MR/NzRSPcheciVaMSsSzT0hglGNFIeZn88fxvh2P+YXa8k2WcEEATnI0+lvsyL9fU5K6+WXTo0RpVdQXPyzfrtpMB/+5zGi4RDmpCx6XYnM0dcbIa
Djrwdlz+UwBAKBQPDlCFFzkMQiETY8+Tyh2h4sPQqppmySzSkkU2TcqOIaQ/N3DAgYdyWarxVkCevIkSSeNAXbxIuxT5yIddgw6j1h3izrYFV5Bx+t2UYwOpCQqsgSUwtTmT/KEDJjc5L6ozF01xHe8jwdza/QTjU9STK6ScJYSU/CETHjso4ns+gCEovOQlK+fIp5X57MK1WvsLZ57SHLk9kVr3cL1dV30uX+AIjXZ8r7PkVFP8FiST+ovvdEKKry9vY2w16q6ECLe0p2s8K8ERl4ghE+qTFmNVlNMj+YU8LSjBQ6n99JniYjmRRiqLya+gHPpL/FpJQefpUSIsXqYtz4u3azm/poqSxj+wfvoeug250cy0bsUhgyRsGEpYf8PAUCgeDbhBA1B4mETPbOLBQpj76cXl3X0DwNg0RMFXqoB3NeHraJE7BPuBD7xAnYxo5FdjgIRVU+qu5iVVkHK1/+kNquoSUXspKsHD8yk/mjXMwZnkGyPf5rPhaGmvcJVrxIu/s9OuxePMlmSAYwBEuCmkhm0ixcw3+IM23qPs02+qI8mQkZE1gybMkB58nsSq+/gurqu+noeNO4npKJ3NzvUlz8c2zW7IPufzC6rrOpoad/9pIvNGAvzShJY+G4bCrafTy7vpGYpiNJsHRKPj+bWkj3c+WYutrJlxSQ4IPEDfwr82WyLRF+lt2Gy6yTljqHsXuwmwYf//1/PQiAll1CKM3JLH1jPEpz3R6nwwsEAoFg3xGi5iCR0dA6dqJLyoCI6a5GdtqwT5hA4rwZ2Cb8EPvEiZjSjYiDruvUdPpZubGNleUdfFLdNWTRNpMsMa04lfkjMzl+lIvR2YkDYqS7FrasoLfmFTqCm+hIlfElmiAbiK9bk0wWrswFuEouwuEs2edzqe6p5pXqV3it+jVa/C392/vyZBaXLqYked/7+yKCwYZB9Zk0QCI760xKSq7A4Sg6JMfoo9UT4oVNTTy3oYGqjgGBlpdi55wpeZw2MYe3t7Vxx9tl/dO0Txjl4tcnj8C0shXtgS1kShJIElvtlfwz63nCSoDvpgUoSe0AZEpLrqK4+PI92k197Fy9kpaKMrDZMUky09iATYpA1ngYe9YhPWeBQCD4NiJEzUEiWSzI6ifooRDO6ROxT7wU+8QJmAsLh0RFApEYq3a0sTJuK9W7h0ZjcpNtzB+VyfyRLuYMTyfRFo/GRENQ9R565Qp8DW/SYW6jPcNCwGWi39vSIdUyDFf+Uly5Z+xXhKMvT+aV6lfY3jVQ8uBQ5skMJhxuo6a/PpMxfdzlOjVen2nUITkGGPbSiri99OEge8lmljltfA5Lp+YzrSiVZzc2cuHD6+jwGTPHJuYnc/3C0ZQ0BPHdvxW7BkgSDZZWHsl8kZ32Wr4bGcOYojVIpggWi4vx4+4mNfXYLxxPNBTig3//CwBvzjCcNo1j9U1GlOaE//elidkCgUAg+HKEqDkEFP7r0d1sHV3XqWz3sbKsg5VlHayrcRNRB6IxZkViRklafKZSJiMyEwb6cFfDZ++gV76Np3Mt7anQkWEhNEyB+MJ7EjJpzslkFiwlI+NkLJa9T7velSOVJzOYSMRNXf0DNDY+cdjqM+m6zmeNHp5d38ArnzXjHWQvTS9O5dypBSyakE2C1cTb29tYdM+H/ZGbwjQHv1owivkWK+3/riAWiGIH3IqX5a5XeTflE87qns+FSRLRkveN8X+J3TSYdS//l153F5Irh4RwhDm2zVikqLEK86jTDsn5CwQCwbcdIWoOAX1ipDccY21lJyvLO1hV1kFTT3BIu/xUO8ePcjF/ZCazh6XjtMYvfzQIle9C5Qq0irfp0Rpoz7DQkW4lkjMwfVmWLKSnziMzezEZGSfs12ygL8uTWVy6mIUlC0mz7bs42hdiMR/1DY9SX/8wqtoLQHLy1Hh9ppmH5Bht3j57qZHK9t7+7bnJNs6Zms85U/IpzjDqKG2oc/PH13eyoc5IAk5zWrjixOGcW5JBx7MVeFv82ICgFOa59BU8n/4u09UJ/K35MtTxTxJJaMawm678UrupD29nO+tf/i8AjZnF5GrdTNc3x6M0NxgrIwoEAoHgoBGi5iAJRVUeW1vLyrIO1te5iaoDFbotJpmZg6Ixw1zOgWhMVxVsfgcqVqDWrcadqNKRYaFjlIWYObm/D5PsJMN1Mq7MBaSnHbffa7QcyTyZwahqiMamJ6ire4Bo1BAQCQljGVZ6Nenpxx90faZQVOXdHe08t6GBVeVD7aWF47I5d1oBs0rT+2eGVXX0ctubO3lrW1t/ux/NLeXHU/IJvl2P+5XNmAAVlTdT1vJkxmvYbXZuabiCPGcjbdP/iq5EsFgy43bTvguyD578F7FoBHnkeDI9PcxL/BSzpELBTKPwp0AgEAgOCULUHCRmReYfq6roDhj5IUXpDo6Pr+J7bGk6DsvgaIwhYqhcQcxTQ1eqmfYMK13T7aimgZwKsykVV+apZLoWkJo6a79Xzj3SeTKDMeozPRuvz9QOgMNRGq/PtPCg6jPpus7njR6e29DIy5814wkOVEafVpTK0qn5nDYxhyTbwFov7b4Qd79TwdOfNqBqOrIE500v4MrjhmHd0IHnrk3Imo4ErE34jH9lvkSLtZOfSxdz8o6xdIxeTmveGgDSUucybtwdWPbBbuqjcec2ytZ+AJLEjqQcxih1TNW3GFGaE0WURiAQCA4lQtQcJIoscfnxwzArMsePyqQkbnOg60Y0pnKFIWZqVxMlTGe6hfYcC+6xaWjywA3Nas3G5VpApmsBKSnT9snWGEwoFmJlw0pernp59zyZvLksHraY4wuOP6R5MoMx6jO9FK/P1ACAzZZHSckVZGeddVD1mdp9IV6M20vlbQP2Uk6yje9MyeOcKfmUuhKG7NMbjvHgB9X888NqAvEZTSePyeLaU0eSU++n+97PiIVUZGCnrYaHs15gq72SGYlTubfmt+jBJhpm/mHAbiq9iuKiy/dLlOmaxsrHHgJAmjmfopZm5jnWYZJUKJ4HJccd8DURCAQCwe4IUXMIuOy4YcaLSADK3+qPxtBdS9gs0ZFhpWOMhe6UBPRBP8zt9iIyXQtxZS4gKXHCfkcxNF1jQ9sGXqky8mR6owM3/MOZJzMYXdfp6HiL6pq78fsrALBYMigu/jl5uechywcmosKxPnupkVXlHahxf8lqklk4PpulU/OZPSwDRR4a6YiqGk+tq+ev71bQ2WtULJ9UkML/WzSaCSHofqKMHncICWg2d/Bo5ousTtyEw+TgvqQ/U/JJEp6sD2g75okDtpv62LbqXdqqK7HYHawxJzPL0chkfdtAlEYgEAgEhxQhag6WaAjWP2KImNo1oIYJWmU6Miy0F6bgSTIZN7E4Cc5RRkQmcyFO58gDyi3ZW55MrjOXxcMWH7Y8mcHouo7b/SFV1Xfg820FwGRKpqjoJxTkX4iiOA6oz61NXp7b0MBLnzXTExiwl6YUpnDutAJO38VeGrzvG1tbuf2tsv5K2iUZTn69YBQnJjvxvF5DV60XAK/s50nXa7ye+iExNBbnLOTKh
u8T2dFBy5gH8eYeuN3URyQYYPVTjwPQO28hYysqmW/7BEXSYNhJUPjFU8AFAoFAsP8IUXOwKGb44Hb8eOjItdCe5cLn0Ic0SUo6Jm4tnYrDcWBioyvYxZu1b/Jy1ct7zJNZXLqYKVlTDkuezK5093xKddUd9Hg+BYz6TAUFP6Cw4IeYzUn73V+7L8RLm5p5bkMjZW2+/u3ZSXF7aWo+w3axlwbzSXUXt76xk80NPQBkJFi48uSRLC3NIPBOPR2fGxW+I0R5Pv1dnk1/m5AUIcmWxD0FfyLrPQW/tJ3mY+8j4jTspmGl/0dR0U8POAfokxeewd/TTWJWDi9hZaGtl4n6jniU5jcH1KdAIBAIvhghag6SmBZk/fR0/H0lr9EBmZSU6WS6TsXlOhWbLfeA+u7Lk3ml+hXWNK054nkyu+L1bTXqM3WtAkCWLeTnXXhA9ZkiMY33dhqL471fNmAvWUwyC8YZ9tLc4bvbS4OpaPPx5zd38s6OeEKyReHH80r54dQCtDXNdL28CTQdDZ13kj/mCdcrdCledEljaek5XO4+n/Ar7XTnrqJt7BPocp/d9FdSU2cc4FWCnrZWNrz2IgANsxcw7fPPON7yMbKko49YgJQ39YD7FggEAsHeOSBRc99993H77bfT0tLCuHHjuPvuu5k3b95e2//973/n3nvvpba2lsLCQn7zm99w0UUXDWlz9913c//991NfX09GRgZLly7l1ltvxWazARCLxbjxxht58sknaW1tJScnh0suuYQbbrgB+SiuxmoyJSAlZCP5e0lLnYXLtQCX6+QDsizgq5Ensyt+fyXV1XfT3vEGYNRnyslZSknxL7DZcva5H13X2dbs5bkNjby0ual/xhjA5MIUlk7NZ/HE3IHaVnuhzRvirhXlPLO+AU03krXPn17AFfOHYf+8C99fN6GHDQG43rmdhzNfoNbahKwrpDlTuW3MzRS96yDU0Ujb+McH7Ka0eYwb+5cD/u76WPXEw6ixGDnjJ/FaVGKppZvxehlIIJ207KD6FggEAsHe2W9R8/TTT3PVVVdx3333MWfOHB544AEWLVrE9u3bKSws3K39/fffz/XXX89DDz3E9OnTWbduHT/+8Y9JTU1lyZIlADz55JNcd911PPLII8yePZvy8nIuueQSAO666y4A/vznP/OPf/yDxx57jHHjxrF+/Xp+8IMfkJyczJVXXnkQl+DgGTf2DqzWnAOyXvr4ojyZ00tPZ8mwJYc9T2ZXgsEGamruoaX1RQbqM50Rr89UvM/9dPaG+2cv7WwdsJcyE618Z0o+S6fmMzxz7/ZSH95QlAdWVfHw6hpCUWN15gXjsvj1qaPIbgjgfWALXo+RHFxtaeKh7OfY7CzDHLOCBGcMX8KV0qWEn2mm11pG86z7iDgOjd3UR/3Wz6n89CMkWWbdhNnM3vAJxysfIUmgj1qMlD3hoPoXCAQCwd6RdF3Xv7zZADNnzmTKlCncf//9/dvGjBnDWWedxa233rpb+9mzZzNnzhxuv/32/m1XXXUV69evZ/Xq1QD84he/YMeOHbz77rv9ba655hrWrVvHhx9+CMDixYvJysri4Ycf7m9zzjnn4HA4eOKJJ/Zp7F6vl+TkZDweD0lJBy5ADhV9eTKvVL3Ctq5t/duPRp7MYMLhdmpr76Op+amB+kwZp1Ba+n/7XJ/JsJeM2Usry9qJDbKXTh2b1W8vmZQvP7dITOPJT+r423uVuP2GaJlWlMr1p41mXFjC83oN0VYjObhT8fBI1gusTPoUs2olooTIsGdw05QbGfOxi+C2Try5H9I2djm6HMFqyWLcuLsPym7qQ9NUll97JR31tZScfBq3JxVyyY7X+An/Rgekn30CmaMP+jgCgUDwbWNf79/7FamJRCJs2LCB6667bsj2U089lbVr1+5xn3A43G8h9WG321m3bh3RaBSz2czcuXNZvnw569atY8aMGVRXV/P6669z8cUX9+8zd+5c/vGPf1BeXs7IkSP57LPPWL16NXfffff+nMJR54vyZObkzWHJsCVHNE9mMNFoN3V1D9LQ+DiaFgKMGUClw64mOemYfepjW7Mnbi819wsQgGMKDHvpjIm5JDu+2F7qQ9N0XtvSwu1vlfUXAC11Oblu4WjmpzjxvllLZ0UPAAEpzL9dr/Fy6kp0ZBTNTMQUYnHpYn7p+gWRZxsJ+Jtom/A43hzjb9Wwm+7Y73ygvbHl3bfpqK/F5kzglewRHL/+I06QPjI+HHOmEDQCgUBwmNkvUdPZ2YmqqmRlZQ3ZnpWVRWtr6x73WbBgAf/85z8566yzmDJlChs2bOCRRx4hGo3S2dlJTk4O559/Ph0dHcydOxdd14nFYlx++eVDxNO1116Lx+Nh9OjRKIqCqqrccsstXHDBBXsdbzgcJhwO97/3er37c7qHjC/KkxmfPp7FwxazqGTREc2TGUws1huvz/TPgfpMSZMpHXYNaamzvnT/rt4wL242Zi/taBm4xq5EK9+ZksfSKfmMyNr3OlUAa6s6+dMbO/m80dPf1/+dPJJzhmXgf7eBjs0VoEMMlZdTV/FUxhv0KkEc0UT8Fg9ptjR+O+O3TCsfhu+xasLORprn3EfE1mc3XU1R0U8O2m7qI+TvZc3TRsSwYMm5PBLWmCM1MpIadF1COvl3h+Q4AoFAINg7B5QovKeK1Htbb2XZsmW0trZy7LHHous6WVlZXHLJJdx2220oirFq7sqVK7nlllu47777mDlzJpWVlVx55ZXk5OSwbJmRWPn000+zfPly/v3vfzNu3Dg2b97MVVddRW5u7pCIzmBuvfVWfv/73x/IKR4Sqj3VvFr1Kq9Wv7rHPJnFwxZTmlx61MZn1GdaHq/P5AYgIWEMw0qv+dL6TFFV4/24vfTezkH2kiJzStxemjdi3+ylwexs9fKnN3aysqwDAKdF4Sfzh3HptAJia1roeGUjxIxjrUrYyKNZL9Bm6SI9mg16AL/Fw8LihVw34peoL7Tgbag37KZxT6JL4UNqNw3m4//+h6DPS1peAY+Z0zjp8zWcQDxKM+5sSB92SI8nEAgEgt3Zr5yaSCSCw+Hg2Wef5eyzz+7ffuWVV7J582ZWrVq1132j0ShtbW3k5OTw4IMPcu2119LT04Msy8ybN49jjz12SN7N8uXLueyyy+jt7UWWZQoKCrjuuuv4+c9/3t/m5ptvZvny5ezcuXOPx9xTpKagoOCw5tTsLU8mwZzAguIFRy1PZjCaFqW55Vlqa+4lHDEKPDocJZSW/B+ZmYu+MHqxo8XLs+uN2Utdg+2l/GSWTs1nyTG5pDj2r1YVQHNPkDtXlPPfjY3oOphkif+ZWcgv5g/DtsWN7/16tEAMgO22Wv6R/RQV9npSYy5Ceoig2UeKNYUbjr2Bud2T6XmxEjXmp238crxZRu7Wobab+nA3N/LYL3+OpqqM+NmvWOaOcdVn/+EHPGtEaa76DFKLDukxBQKB4NvEYcmpsVgsTJ06lRUrVgwRNStWrODMM8/8wn3NZjP5+fkAPPXUUyxevLh/KnYgENhtWraiKOi6Tp/m2lsb
TdP2ekyr1YrVevhzU74sT2bxsMUcn388NpPtizs6zBj1mV6mpuYegqF6AGzWXKM+U/bZe63P5PZHeGmzMXtpW/OAvZSRELeXpuYzcj/tpT48wSj3r6zi0TU1hGPGd3n6hBx+depIMpsCeB7YgqfbEKYtFjf3Z/2HT53bsKlOSrUxVJt2AHBS4Un8ZvL1KG920725jHBCAy3H/oOwtQlJUigtuZqiossOmd00mFVPPIymqpROmc49vXDylk84ESNvRx93DpIQNAKBQHBE2G/76eqrr+bCCy9k2rRpzJo1iwcffJD6+np++tOfAnD99dfT1NTE448bS8SXl5ezbt06Zs6cSXd3N3feeSdbt27lscce6+9zyZIl3HnnnUyePLnfflq2bBlnnHFGv0W1ZMkSbrnlFgoLCxk3bhybNm3izjvv5NJLLz0U1+GA8UV8LPjvAnyRganKX4U8mcHouk5H59tUV9/VX5/JbE6npPjn5OWdv8f6TFFVY1VZB89uaOC9ne1EVUNcmhWp3146boRrv+2lPsIxlSc+quPe9yv7yyHMLEnj+tPGMDoCnqcrcDca+T0+c4iH0//LipS1oEscox/LTtNmqtlBkiWJ38z8DScpc3A/WE7IHcST/wHtY55ElyJYrdmG3ZQy/YDG+WXUbt5A9cZPkRWF6Iln0Nrcw0VaBUU0oeky8sI/HJbjCgQCgWB39lvUnHfeeXR1dXHTTTfR0tLC+PHjef311ykqMn6NtrS0UF9f399eVVXuuOMOysrKMJvNnHDCCaxdu5bi4uL+NjfccAOSJHHDDTfQ1NSEy+XqFzF9/O1vf2PZsmX87Gc/o729ndzcXH7yk5/w29/+9iBO/+BJtCQyMnUkLb0tX4k8mcEY9ZlWx+szbQHAZEqiqPAnFBRctMf6TDtbvTy3vpEXNzf1F4QEmJBn2EtnHJNLqnP/7aU+NE3n5c+auf2tMpp6ggCMzErgukWjmZvixPtmHZ07jfyeqKLyTMrbPJvxFmE5wsjIMVgTFT4LfwzA/Pz5/Hbmb7F9GqJjxedoUoj2KcvxZBh2U3racYwd+5dDbjf1ocZivP/4PwGYtHAx13b4OXnruv4oDWO/A0kHtpq0QCAQCPaf/V6n5uvM4VqnpivYRaot9ajmyexKT896qqrvpKfnEwAUxRGvz/Sj3RYJ7O6zlzY2srVpsL1k4ezJRu2l0dkHf70+rOjgT2/s7LewspNsXH3KSM4a4aL3nXoCG9pAB13SeSd5PQ+7nsVj6sUVymNe+gm8GniGiB4h0ZzIdTOv47T0U+l+ppxwtcewm2Y8QNjUeNjtpj42vvEy7//rQeyJSZh+/v+4p6GT/7fxEb7HS2iajPzrMkjIPGzHFwgEgm8LhyWnRrBn0u2HJxJwIPh826iqvpOurpWAUZ8pL+9/KC766ZDl/2OqxqryDp7b0Mg7O9qG2EsnjTbspfmjXJgP0F4azLZmD396YycfVnQCkGg18dPjh3HJ1AJiH7fQcccG9PgKwZ8lV/O39MdpsrZjjyZyjvIDanK28Hz3cgDm5M3hxlk3klxjou2vm9CCUbyFq2kb/QQ6ht00ftxfSUmZdtDj/iKCPi8fPftvAGae930uavNxwrZP+2c86WO+IwSNQCAQHGGEqPmG4PdXUV1zN+3trwMgScqg+kwDFkhZq4/nNjTwwqZmOnsHZoaNz0ti6ZR8zpiUR9pB2EuDaewOcMfb5by4uQldNwTT948t4hfzh2Hd5sbz101ofiOfpjmpmztSHma7sxpZU5jjP42p48bxYMPfCHWHcJqd/Hr6rzmz8Ay8r9fQ9UkrmhKifcaTeFKMVacNu+kOLJbDn8e09tknCfl7cRUW8056EVJjB/PCm8mhHVWTUc68/cs7EQgEAsEhRYiarznBYGO8PtML9NVnyspaQmnJFTgcRq2onkCElz8zFsfrW8wOIN1p4azJxuylMTmHzo7rCUT4+/uVPLa2johqRGDOOCaXX54yEldrEM+DWwl2Gvk0vQlh/pb8bz5I/BQkGOGZwsVjLuHF2OPcU2sItJk5M7lp9k1keJPo+PtmYu1BQokNtM18kJDcYNhNpddQVPjjw2o39dHZUMdnK4zintO+/yOWtPlZuHPDQJRmxJngOPoJ4gKBQPBtQ4iarynhcAe1dX+nqWmgPlNGxsmUlv4fiQmjicUXx3t2QwPvbG/vFxcmWeKkMZksnVrA8YfIXuojFFV5bG0tf3+/Em/IWFNm9rB0rl80hpGqhOeZCrrqjVliMbvO8tTXeS7pDVRJw9VbwPeSf4zzuCi3bLuWYCyI3WTnl9N+ydIRS/GvbaH9jc3oqoZ32Frahz+OpoePmN3Uh67rrHz8n+iaxvDps3g0JGELBTgh8CmZdKGqCqaldx+RsQgEAoFgKELUfM2IRnvi9Zke66/PlJo6m2Gl15CcPImKNh/3frCD5zc10eEbsJfG5iSxdGo+Z07KJT3h0K7do2o6L2xq4s63y2j2GGManZ3IdYtGMzs1Ad9btXRs6wJAN8O7mRu4z7mcoBLGEUnm5N6lnH/q6fy16c988pmR2Dwtaxo3zbmJXCkL92PbCZV1oykhOub8hx7nKtAhPX0+Y8f85YjYTX1UbVhH3eebUEwmRn/3In5R3sbi8o0cjzEjSytdjGJPOWLjEQgEAsEAQtR8TYjFemloeJS6QfWZkpImM6z0ahTbdF7+vJnn1q/ms0H2UprTwlmT8jhnah7jcpMP+Zh0XWdVuTGjaWerEYHJTbZx9amjOGO4C//7DbSvK+tzxdiWV8+ttn/QZe7BpJqZ0bqIn06/jLrcz/jxxgvxR/3YFBtXTb2KC0ZfQKTCQ9szG9F6o4SSG2k79kFCev0Rt5v6iEWjrHrCmMI99fSz+HN9B0mBXhb0fkQG3cRiCubv/f2IjUcgEAgEQxGi5iuOqoZpanqS2rr7B9VnGk1R8dXscI9n2ZtNrNj2zhB76YTRmSydms8JozKxmA7PTX9Lo4db39jB2iojApNkM/HzE4Zz0bQCoh+3GjOaIsbKym25Pm623UeltQ6AER3TuCD9B0y/uJQ/bbmZNZ+sAWBy5mT+MOcPFDoK8LxWS+/qJnR0fGM+oq3wsaNiNw1m05uv0NPagjMllZSTF/PG1nrOqNzM/L4oTeFCsB7YysoCgUAgOHiEqPmKomlRWlqeo6b2XsJhowK63V6MlHwV79cN54V3mmjzru9vPzo7kXOnFXDmpFwyDrG9NJgGd4Db3yrj5c+aAaOA5cWzi/jZ/GFYtnfTffcmNJ+xaF/ApfLXpCf4wLIOgCxfMQt93+P87yxgg/Ih33vvu/iiPiyyhSumXMH3x3wfrStM+782E232oylBOo97hm7z+3G76XjGjrn9iNpNfQQ8PXz836cAmHv+RfxmRy1pvb0s9n5AKl6iERPmix884uMSCAQCwQBC1HzF0HWVtrZXqa65m2DQWJk5JpVQGfkZK3ZksrnBA1QDkOowc+YkY/bSuNykL6yqfbC4/RH+9l4Fyz+uI6rqSBKcPSmPq08ZQXp7GM9DW/G3BQBQkyX
+k/U2/5ZfRJd0EsKpzG46k+/NOoe82XZu/vS3rGo0ip9OyJjAzXNvpiSphMCnbfS8UoUe1YhkNNMy/R+EVMNuGlZ6DYVH2G4azOqnnyASDJBVOhzPuKl8sq2eM6s2cxxGDpCacyJma8JRGZtAIBAIDISo+Yqg6zqdnSuoqr4Lv78cTZco88xgo3spq2sdRGIa4EGRJU4YZdhLJ44+fPZSH8GIyiNravjHyip8YWNG07wRGVy3aDQjNBnPc1V0VcfzeOwyHxR9zl/0B4nKMUyqhSlNp3Cm61yOv2IsH3re44rX/og34sUsm/n5pJ9z8biLkUM67n/vJLilEx0d/+RPacl8GE09unZTH201VWx5720Ajr/ox/zPtmrSe3s50/M+yfQSCZuwXvavozY+gUAgEBgIUXOU0XUdd/caqqvuwOv7nFZ/JmtbzuGT1jl0Bvq+Ho1RWYmcOy2fMyfl4Uo8/JXHVU3nuQ0N3LWiglavMaNpXG4S1y0azaz0RDxv1dL+WYfR2CRRPryVG7mbbskDSIxun8mJveewaOl0EodL/Pbj63m3/l0AxqaP5ZY5tzA8dTjhGg8dT5WhesJo5hDuE5+jS3/nqNtNfei6zsrHHgJdZ9Ts49iQ6KKstZmzKjcxD8NWUzOOw2J1HrUxCgQCgcBAiJqjSI9nA1VVd9Dc8Tmftk1mbfM1VPYU9X+e4jBz1hGyl/rQdZ33drbz5zd3Ut5mzLLKS7HzqwWjWDzCRe/KRlo/KgNVBwk6h4W42XYfZWolADme4cxtPJsFc+cydWER7za/wy0v3UJ3uBuTbOKnE3/KpRMuxaSb8Lxdi+/9BtAhlt9G8zH3EYzWxe2mX1JY+KOjZjf1UfHJGhp3bMVksTL7exdz0pZ60v29fMfzLokECAfN2K59/KiOUSAQCAQGQtQcBXy+7ZRX3snqihbWNs9gY/t3iWpmABRZ4viRLsNeGpOJ1aQcsXFtqu/m1jd2sq7GmGWV4jDzixOG8/1p+UTXtdH2l/XoIWNGU7hQ5m+pT/Ju5ENQISmUway6M5iXNZ/514xGTw5z3UfX8lbtWwCMSh3FLXNvYVTaKGLuEB1PbSNS70NHJzh7I01JD6JFvxp2Ux+xSIRVyx8FYNqS7/CcN0qLZOI7VZuYg5GkHUs+FqtdzHgSCASCrwJC1BxB/P5qPvzsAV7ZEmRt8/F0h1P7PxuZlcC5Uws4c3IumYm2Izqu2k4/t79VxmtbWgCwmmR+MKeEn84vxbyzm+67N6N6jIX89Ewz/y14n4cD/4YIWGI2pjQuYGbwJI4/dxzDprh4r+E9bnrpJtwhN4qk8OOJP+ayCZdhVswENrfT/UIlelhFc0Rwn/gsXZEVoBl207ixf8FsTv2i4R4xNrz2It6ONhLS0hl/+tlc+vEO0gO9nNOzAidBQn4L9l8uP9rDFAgEAkEcIWqOAO3d9Sz/4Hle36FQ2XNc//Ykm8xZkwtYOjWfCXnJR8ReGkxnb5i/vVvBk5/UE9OMGU3nTMnn6lNGktYRwvPQVnpb/ABISWY+HlnGrYF7CAciSLrEmLbZzGg6jZnzxjBjcQlB2c/1q6/nterXABieMpyb597MuPRxaOEY7v+WEdjYDoA6soumUfcQDH+17KY+et1dfPLCMwAc971LuLeuFY+ssLTiU2bHozRRxxRsCSlHcZQCgUAgGIwQNYcJVdP5YGcVj69ey+raJKLaCABkSWfuMCfnzxzNSUfYXuojEInxzw9reGBVFf74AnknjHJx7aLRlGoynuer6CzvBkCyKtSO7+Z3sTtp8xuCJL9nNLNrz2Rszhjm/3okGfmJrGpYxY0f3UhnsBNZkrl0/KVcfszlWBQLkQYf7qd2EusKoUs64ZO20KDcixaOYLXmMH78X0lJnnrEr8MX8eF/HiMaDpEzcjQZM+bw4OqtpAd8LPW+jZ0wgV4Ljv8ncmkEAoHgq4QQNYeYmk4/z3xayXPra+jwm4F0APKTPJw7tYALZk0nM+nI2kt9xFSNZ9Y3ctc75f11oSbmJ3PdotHMSE/E+3Yt7ZvaQQcUCe8E+KPpb3zm3wpASjCTWbVnMTJ6DLOXDmfMrBx8MR83rL6Bl6peAqAkuYRb5tzCBNcEdE3Ht6oBz1t1oOmQqtF9/DN0+N+Mz246gXFjb//K2E19tFSWsf2D9wA48eLLuGVHLWFZ5uzyT5jFJgCipgk40rKO5jAFAoFAsAtC1BwCfKEor29p4dn19ayv66u9ZMZhCjC3sI4L50xn7tjTjri91Ieu67y9vY3b3txJVYdhJxWmOfjVglEsGuHCv6qR1jVlEDNKLWhj7DyY9iwvuV+HMFhjDqY1LGJs2xwmzC1g1pnDsCWYWdO0ht+t/R1tgTYkJC4edzE/n/RzbCYbqjeM++kywlXx6zHZS33BHQT9cbtp2K8oLPjhV8Zu6kPXdd5/7CEAxh53IsHcIp75eDtpfi9LfW9hJYLfa8X560eP8kgFAoFAsCtC1BwknmCUWbe+SyBu40hojM/YwUkljSydvYS87KVHTcwAbKjr5tbXd7C+zrCT0pwW/vfE4XxvagGR9W20/2U9WsBYVE8pcvJayUf8vf2fxNwxZF1hXOtcpjYuID8ni/m/GkV2aTK9kV5uXHsL/634LwCFiYXcPPdmJmdOBiC4vYvu58qNfs0SkYVbqYv+FS301bWb+ti5ZhUt5TsxW23Mu+Bifrq1Ck2SOK1sDTPYDEBEH40zu+iLOxIIBALBEUeImoPErngoSarEHbQzJ/cTTihpYerYH5OV9cujGoWo6ujltjd38ta2NgBsZpkfzi3hsuNKMZd7cN+zCdVtLKqnuOxsnlDHzZ134m3zAlDUPY5ZtWeSSS4zzyll/HF5yIrMxy0f89s1v6XFb8yU+v6Y73PFlCuwm+zoUZWe12rwf2x8JhfIdB77Hzo8bwBfXbupj2goxAdPGhGYGWedS7li4x1/hDRfD+f0vo2FGL4eGwlXPHCURyoQCASCPSFEzUFisWRw3byNEC2jtPQKcrK/gyybj9p42n0h/vpOBU992oCq6cgSfHdaAVedPJLUrjA9D2/D12gsqicnmmmbFuFG/01Ut9QAkB7KZVb1WeR7RjFiehZzlg7HmWwlEA1w56d38nTZ0wDkJeTxhzl/YHr2dACirX66/rOTWLz+k3xckNrUPxP01CJJJoYN++VX0m4azLqX/0uvu4skVxZTTj+TMzaUA3DmzlVM53MAIuFSEkvHHM1hCgQCgWAvCFFzCJg+6Y+YTakoyuEvX7A3esMxHvygmn9+WN1vhZ08JpNrF46mWFLwvFhFxw5jUT3JIhOZ4eAvykN80PohAE4tkak1ixjdfixpWYnM/8FI8kcb5Qk+bf2UZWuW0dTbBMB5o87j6qlX4zA70HUd/0ct9LxeDTEdKcFE9LRt1HjvQAsadtOE8feQnDzlKFyVfcfb2c76lw07bf73f8AqX5jNYY0MXzdnB97FhIrXbSfxF/ce5ZEKBAKBYG8IUXMIsFmzj9qxo6rGU+vq+eu7FXT2RgCYVJDC/zttDFPTE/C+U0
fbp63GjCYZTFNSeSL9VZY3/AdN1zBhYkLzfCY1noJTdjLtrGImnVyIYpIJxoLcs/Eelu8wFpjLceZw05ybODbnWABUf5Tu58oJxcWSebSVtkmP0+E27KaM9BMZO/a2r6zdNJgPnvwXsWiE/DHjKZ0xm4tWbwHgO9vfYyrG65CvgKRxX81cIIFAIBAIUfO1Rdd13tjayu1vlVHTacxoKslw8qsFo1gwIoPeD5tpfWQHetSY0WQZk8r7wzdzd92v6a037KcRvslMr1hMUjiD4okZzPvuCJIy7ABsat/EsjXLqPPWAXDOiHP45bRfkmBJACBU0Y37mXI0XwRMEuYFUapMvyPorovbTX2zm45ekvS+0rhzG2VrPwBJ4viLf8zTLV3UqjrZPZ2cFXoPBY2eTidJP73taA9VIBAIBF+AEDVfQ9bVuLn1jR1squ8BICPBwpUnjeC8qflENnbQ9pcNaL1RAMwFiVRO7eLmpl/RWNkIQK5azLQdS8j1DScxzca8S0dQcowLgFAsxL2b7uXx7Y+jo5PpyOSm2TcxJ28OAHpMw/N2Hb0fGH0pmTYiC7dS0X4bWvTrYzf1oWuaUYUbmHDCKSQWlvDHDz8DJM7d8Q6T2AZAqCublJknHMWRCgQCgeDLEKLma0RFm48/v7mTd3YYK/s6LAo/mlfKj+eVYKry4P7bZmIdQQBM6Ta8c0zc6vkr68uMZf1T5DSmVC1iROs0FEVh8sJCpp5WjNlirGr8ecfn3LDmBmo8RtLwmcPO5Nczfk2SJQmAaGcQ9392Em0yIj22mYm0DHuY9tbXAcjIOImxY27DbE45YtfkYNm26l3aqiux2B3MOe9CHqxvo0uXyHO3sTjyPjI63e1Okn70u6M9VIFAIBB8CULUfA1o84a4a0U5z6xvQNONSt7nTy/gypNHkNwdwfOv7URqjanYstOEPC+d+5XlvFDxIjo6FsnC1K5TGFsxH7NmJW9UKvMvGElqthOAiBrh/s/u55Gtj6DpGhn2DG6cdSPzC+YDhtUV2NBOz8uV6BEN2WHCfIZGefBqgp2G3TR82K8pKLj0a2E39REJBlj9lFHq4NhzzifkSOSeTbWAxHk73mYiOwEItmSQetKZR2+gAoFAINgnhKj5CuMLRXlgVTX/XF1NKJ4bs2BcFr9eOJpCScH7cg0dWzoBkMwyttlZvJS5kn/sfJBgzIjYTNZmM3bTKSRG0nAkWZhz7nBGTMvqFx/burZxw+obqOypBOD00tO5fsb1JFuTAdCCMbpfqCD4uXEcS2kSoRM3U9b4JzQtgs2ay/jx95CcPPmIXptDwScvPIO/p5uU7BymLFrCbyubCSBR0tHEadFVSBJ0tSaSdMmvjvZQBQKBQLAPCFHzFSQS03jykzr+9l4lbr8xo2lqUSrXLxrN5IwEvO/W0/ZJq1FPSQLH1Cw2jKrm9rLf07LVWPhumGUUU7YsJt1diCTBhBPymXFGKVa78ZVH1SgPbnmQhz5/CFVXSbOl8dtjf8tJRSf1jyNc68H9VBlqTxhkcJ7sotH1d9rrv752Ux89ba1seO1FAOZf+CMaoxr/aukCJM7f8RbjpQp0HQL1yaSd8f2jOlaBQCAQ7BtC1HyF0HWdVz9v4fa3yqh3G4vYlbqcXLtwNCcPz8C/tpnWR3eih411aGyjUmmbGeW62pv4bONnALismcxuOpPsinFISGSVJDH/glG4ChP7j1PmLuOGNTew023YKwuKF/Cbmb8h1WZMvdZVHd/79XjfrQcdlDQblrN1dnReTrCj/mtrNw1m1RMPo8ZiFE6YxLCpM/jJlmpUJEa31LBA+xAk6GpJIumCy7+25ygQCATfNoSo+YqwtqqTP72xk88bjQKQrkQrV508gu9OySeyuYO2OzageY2ojTkvAfWEZP7S/SCvffoaAHbFzgnqGeR8MA2zZsHqNDH77OGMmZ2DJBs35agW5ZEtj/CPz/9BTIuRYk3hN8f+hoXFC/vHEesJ4X6qrD9Hxz7ZRWDmOnbU3Iquf73tpj7qt35O5acfIckyJ1z0I7b0Bnm5ywfAd8veYrRUjaaDvyqBwvsvP8qjFQgEAsG+IkTNUWZnq5c/v7GT98s6AHBaFH4yfxg/nFuMXOuj697N/aUHlBQrtpNzeVJ+kcc2P0ZIDSEhcVziyQz/5HjMHmMNmTFzcph19jDsCZb+41R2V/KbNb9he9d2AE4qPIkbjr2BDHtGf5vA5x10P1+BHlKRrAqJZ+ZQZ7mT9uo+u+lkxo7589fSbupD01RWPvYgAMecsoj0giIui5dDmFRfzimsBaCzKZmkpRchKcpRG6tAIBAI9o8DKsRz3333UVJSgs1mY+rUqXz44Ydf2P7vf/87Y8aMwW63M2rUKB5//PHd2tx9992MGjUKu91OQUEB//d//0coFOr/vLi4GEmSdnv8/Oc/P5BTOOo09wT55bOfseivH/J+WQcmWeKiWUWs+vUJ/HRUNoHHd9D16DZibQEku4mk04r59KwWllZfzANbHiCkhpiYMokfd/6WMW8vxuxJID0vge/8aionXjimX9DEtBgPb3mY7776XbZ3bSfJksSf5v2Ju46/q1/QaBEV93PluP+9Ez2kYilIxPEjG1vDl9Le/jqSZGLE8N8wccI/vtaCBmDLu2/TUV+LzZnA7HP/h5VuH2t9QWRN5dyKtxkh1aHpEoFyB8k/vOZoD1cgEAgE+8F+R2qefvpprrrqKu677z7mzJnDAw88wKJFi9i+fTuFhYW7tb///vu5/vrreeihh5g+fTrr1q3jxz/+MampqSxZsgSAJ598kuuuu45HHnmE2bNnU15eziWXXALAXXfdBcCnn36Kqqr9/W7dupVTTjmFc88990DO+6jhCUa5f2UVj66pIRwzZjSdPiGHXy4YRYGs4Hm1hvbNRtQGRSJhTi7V49xcveWXbK8xoix5zjxOi16A8mY+6BJmq8LMM0qZcLxRSbuPak81y1Yv4/NOoxjj/Pz5/G7W73A5XP1tIk29uP+zk1hnECRIOD4f35jVbK38o2E32fIYP+4ekpMnHZkLdBgJ+XtZ8/QTAMw693tYExK58WPjmk6v3cEJyscAdDQkk7jkO8gWy177EggEAsFXD0nXdX1/dpg5cyZTpkzh/vvv7982ZswYzjrrLG699dbd2s+ePZs5c+Zw++2392+76qqrWL9+PatXrwbgF7/4BTt27ODdd9/tb3PNNdewbt26vUaBrrrqKl599VUqKir2OZHT6/WSnJyMx+MhKSlpn/Y5VIRjKk98VMe971fSEzBW+51Rksb1i0ZzTEYC3vcb6F3bDKrxdTgmuQjMsXJn1d9YUbcCAKfZyXdSv0fqyglEPEa74dMymXPOCBJSB4ppqprK8h3LuWfjPUS0CInmRK6dcS1nDDuj/1rpmk7v6iY8b9WCqqMkW0g6N5fq0G20t39z7KbBrHz8ITa89hJpeQVcdNvfeL7Ty//uqMcSi3LL+3/jQstLqLpM3dsuildsRE5IONpDFggEAgH7fv/er0hNJBJhw4YNXHfddUO2n3rqqaxdu3aP+4TDY
Ww225BtdruddevWEY1GMZvNzJ07l+XLl7Nu3TpmzJhBdXU1r7/+OhdffPFex7F8+XKuvvrqLxQ04XCYcDjc/97r9e7rqR4yNE3n5c+a+cvbZTR2G2vHjMhM4LpFozlhWAb+j1toeXQnejAGgHVYMuZTsnmkaznLVy4nqkWRJZnFeWcyZstJeD5QiaCTnGln/vmjKBibNuR4dd46lq1Zxqb2TQDMyZ3DjbNvJNs5UHRT9UVwP1NGuKIHANu4dMwLI3xWeRHBYN/spmspKPjBN2bmj7u5kU1vvgrACRf9iKgk88dKo+r4rMrPmW/+FID2+mQST10gBI1AIBB8DdkvUdPZ2YmqqmRlZQ3ZnpWVRWtr6x73WbBgAf/85z8566yzmDJlChs2bOCRRx4hGo3S2dlJTk4O559/Ph0dHcydOxdd14nFYlx++eW7iac+XnzxRXp6evotqr1x66238vvf/35/TvGQsrqik1vf2MG2ZkNMZSVZufqUkXxnUh7RrV203bnBWAMGMGc7SFhYxOvS+9z7yf/iDhmVr2dmzWSh/wI6npfxqCqKWWbaoiImn1KEYh6wmjRd4z87/8PdG+4mpIZwmp38atqv+M6I7wwRJsGdbrqfLUfzR5HMMkmnl9CT/y6fb/3m2U2DWfXEw2iqSumU6RRPmsp99e20RFWcoQBn171Doa2ZmCbTu9VCyR1H729GIBAIBAfOAc1+2vXXu67re/1Fv2zZMlpbWzn22GPRdZ2srCwuueQSbrvtNpT4zJKVK1dyyy23cN999zFz5kwqKyu58sorycnJYdmyZbv1+fDDD7No0SJyc3O/cJzXX389V199df97r9dLQUHB/p7ufrOt2cOf3tjJhxXGKryJVhM/PX4Yl84pQar34f7H5/31k5QkC0mnFrElp4bbN/yE8m5jJk5xUjEXZlxG8I002txhQKdofDrzzhtJsss+5HgNvgZ+u+a3rG8zajzNzJ7JTXNuIjdh4ProUQ3PGzWGxQWYc5wkfTePys4/0F7+BtBnN92G2Zx8WK/PkaZ28waqN36KrCjMv/CH9ERj3F1jLFI4t/wz5lk3ANBel0LS/LmY0tK+qDuBQCAQfEXZL1GTkZGBoii7RWXa29t3i970YbfbeeSRR3jggQdoa2sjJyeHBx98kMTERDIyjNk3y5Yt48ILL+RHP/oRABMmTMDv93PZZZfxm9/8BlkeiEjU1dXxzjvv8Pzzz3/peK1WK1ar9UvbHSoauwPc+XY5L2xuQtfBrEh8/9gi/vfEEST6onie3EGorBvAmDJ9fAHuCVGu+/wWVm5bCUCSJYkfDP8Rrk+PofGtHiBMQqqVeeeNpOSYjCHiUdd1ni1/lr+s/wvBWBC7yc41U6/h3FHnIksD1yza5jcKUbYaU8MT5uQizfGzaef5BEP1SJKZ4cOvpSD/km+M3dSHGovx/uP/BGDywsWk5eZzU2UzXk0nrdfDmc3vkWdvI6op9H5upuS1PxzlEQsEAoHgQNkvUWOxWJg6dSorVqzg7LPP7t++YsUKzjzziwv+mc1m8vPzAXjqqadYvHhxv1gJBAJDhAuAoijous6uecyPPvoomZmZnH766fsz9MNKTyDC39+v5LG1dURUY0bTGcfk8stTR5FrUvC+XkvbxjbQAVki4dgc9Lkp3F/5T556/SliegxFUvjuyO8yx72YnY+7aYz2IMsSk04pZNppxZitQ9dLae5t5ndrf8fHLcaMnWlZ07hpzk0UJA5EonRdx/9JKz2vVkNMQ04wk7J0BF0Jr1Px2SC7afzfSE465ohdryPJZytex93UgD0xiWPPuYDGUIR/NhhVzo8r28hs20YA2mpSSDx2CubcvKM5XIFAIBAcBPttP1199dVceOGFTJs2jVmzZvHggw9SX1/PT3/6U8CwfJqamvrXoikvL2fdunXMnDmT7u5u7rzzTrZu3cpjjz3W3+eSJUu48847mTx5cr/9tGzZMs4444x+iwpA0zQeffRRLr74Ykymr8a6gR2+MCffuQpP0JjRNKs0netPG834jAR8qxppW92EHi9GaZ+QgePUfF7ofIX73r4PT9hYPfi4/OP4ftqPqHkpzNZWw7LKG5nCceePIi3XOeR4uq7zfMXz3L7+dvxRPzbFxlVTr+KC0RcMic6o/ijd/60gtL0LAOvIVJLOzqG88XffeLupj6DPy0fP/huAOeddiM2ZwJ+31xEBcns6OK39A7IdnUQ0E/7PzZS8IKI0AoFA8HVmv5XBeeedR1dXFzfddBMtLS2MHz+e119/naKiIgBaWlqor6/vb6+qKnfccQdlZWWYzWZOOOEE1q5dS3FxcX+bG264AUmSuOGGG2hqasLlcrFkyRJuueWWIcd+5513qK+v59JLLz3A0z30uBKtzCxJo94d4NpFo5lfmk5gXSutj+5A8xszmizFSSSfVsI6+TNuX3MtNZ4aAIanDOd/x15F7IMMNj7bBoA90cycpSMYOSNrNyuo1d/KjR/dyJqmNQBMck3i5rk3U5RUNKRdqKoH99NlRlkFRSJ5YQnaRDcbti39xttNg1n77JOE/L24CouZcNKpbOsN8lybYf8dt2MDs+ybAWitSiXhmNFYSkccxdEKBAKB4GDZ73Vqvs4crnVqPIEoTqtCZHsX3jdriXUZKyGbXHaSF5bQlOPmLxvuYE2zIUZSran87JifMarlWNa/XEckpCJJMH5+PjPPKMHqMA/pX9d1Xq56mT+v+zO+qA+LbOGKKVfw/THfR5EHIlm6quFdUY9vVQPoxvFTzxtFOy9QURGv3WTLN2o3fUPtpj46G+p4/Nf/i65pnLvsjxSOn8gFm6t4v9vHsPZGblj/AIucHxDSzNS9kE7Rv5/GNmHS0R62QCAQCPbAYVmnRrBnbO0B3K/VEGkwiiLKCWaSTikiPN7CHVvu57kNz6HqKibZxPfHfJ8zk85jw7PNrG2oBiCzKJH53xtFZtHuX1RHoIObPrqJlY0rAZiQMYGb595MaXLpkHaxriBdT5URjY/BOSMb50IXO6uuo6PjTQBcGacwZsyfv7F2Ux+6rrPy8X+iaxrDp8+icPxEPnT7eL/bh6xpHLdjAzMcxirLbRWp2EcXCUEjEAgE3wCEqDlIVF+Ejoe2gKojmWUSjsvHNieLp2qe5oGXHsAXNUTGSYUn8fMx/0vjO1HeWm1M27Y6TBx71jDGzs1FlnefJv96zev88ZM/4o14MctmfjbpZ1wy7hJM8tCvzb+xjZ4Xq9AjKpLNROo5w4kVtbJ+89nfKrupj6oN66j7fBOKycT8C3+IpuvcVGUstDe2uYYTej8hPaGHoGoluEWm8KEbjvKIBQKBQHAoEKLmIFESLSTMyUUPqSSeVMgqz2ruePNyGnwNAIxOG82vpv6KxNp8PvhLFaFeI6F49KxsZp09HEfS7vWFuoJd3PzxzbxT/w4AY9LGcMvcWxiROjTnQwvF6HmxkkC8VpSlJInU746i1f8MFRsG7KYJ4/9GUtLEw3kZvjLEolFWPWFM4Z56+lmkZGXzQls3W3pDmGNR5pVtZIbTiNK0VqRg
K87GMeu4ozlkgUAgEBwihKg5BCQvKqGsu4xrPr6cT1uN5fYz7BlcMfkKZttOZM2TlbRU7QQgLdfJ/AtGkTsiZY99vVX7Frd8fAvd4W5MkomfHPMTfjjhh5jloXk24Xov7qfKUN0hkCHppCLs81LYUXbNgN3kOpUxo/+M2Xxk61wdTTa9+Qo9rS04U1KZefZ3CWsaf6wyFhyc1FDBcaFPSUnw4VdthLfI5N/966M8YoFAIBAcKoSoOUi8ES93rL+DFypeQEfHIlu4eNzFXDjiYra92c5/39+IrumYrAozFpcw8cR8FEXerZ/uUDd//OSPvFlrCJJRqaO4ee7NjE4bPaSdrun4VjbgfacONFBSraSdP5pwai2frr+4324aMfw68vMv/lbYTX0EPD18/N+nAJh7/kVY7A4ebGinIRzFEQ4yt3wjU51bAWgtS8GSnYLz5NOO5pAFAoFAcAgRouYgsSpWPm7+GB2dRcWLuHLKlQTKTLx08xb8nggAw6a4mHvuCBJSbXvs4936d7npo5twh9woksKPJvyIn0z8CWZlaHQm1hPG/XQZkRpjfRv7MS5SzhpGc+e/43ZT9FtnNw1m9dNPEAkGyCodzrj5J+GNqdxVa0yVn1a7k1mxjSTZ/PTG7ES3SuT+4YpvlegTCASCbzpC1BwkVsXKjbNvxG6yU6yP5INHymjYYayFkuSyc9z5Iykal77HfT1hD39a9yderTaqRw9PGc7Nc29mXPq43doGt3bi/m8FejCGZFFIOXMY5gk2tpVdSUfHW8C3027qo62mii3vvQ3A8Rf/GEmWubemme6YSorfx6yKzUxJ2AFA685UTGkOEs8672gOWSAQCASHGCFqDgHTM2aw4c06/vP2J2gxHcUkM2VhEVMWFGIyK3vc54PGD7hx7Y10BDuQJZkfjPsBP5v0MyzK0MRhLaLiebUa/zqj3pY5P4H080cTsFSwef0VhEIN31q7qQ9d11n52EOg64yafRz5o8fRHIrwQIORQD2zZhsz2UyCFMAbcxLdppN93WVIyp6/G4FAIBB8PRGi5iDxe8I8f/sGvJ3GgnuF49KYd95IUjIde2zvi/i47dPbeLHyRcCoxn3L3FuY6NrdLoo09+L+z05iHUGQIHF+PoknFdLUupyKLX12UwETxt/zrbSb+qj4ZA2NO7Zislg57n8uAeD22lbCuk62p5PpFZ8xKdVI1G7dkYIp0UzyBT84iiMWCAQCweFAiJqDxJFkISnDjqbqzP3uCEonufYaLVnTtIbfrf0dbYE2JCQuGnsRv5j8C2ymobk2uq7Tu6YZzxs1oOrIiRbSzhuJUqSwdef/DrKbFjBm9J++lXZTH7FIhFXLHwVg2pLvkJSRyU5/kKdb3AAcW72NGeYtOKQQ3bFE1O0amb+4ENmy+1R6gUAgEHy9EaLmIJEkiZMvGYvZpmCx7fly+qN+bv/0dv5b8V8AChMLuXnuzUzOnLxbW9UXofu5ckJlRl6ObUwaqUtH4ld3sOXTwXbT9eTnX/SttJsGs+G1F/F2tJGQls6MM84B4JaqFjSgpKOZKdVbmJheBkD7tmQUm0zqJZcfxRELBAKB4HAhRM0hwJli3etnn7R8wm/X/JZmv7FWyv+M+R+umHwFDvPu9lSozI372XK03iiYZFIWl+CYkU1T0+NUVP5J2E270Ovu4pMXngHguO9dgtlmY213Lyu6vEi6xsya7Uy1bsMmRXDHktF2qGRcshTZ6fySngUCgUDwdUSImsNEIBrgrg138VSZsW5KXkIef5jzB6ZnT9+trR7T8LxZS+9qYyl/U5aD9AtGQ7rK1m0/p6PDmNUj7KahfPifx4iGQ+SMHM3oucej6zp/iC+0N6a5lmNqtjE+qwKAti1JSGadtJ/+39EcskAgEAgOI0LUHAbWt65n2ZplNPY2AnDeqPO4eurVe4zORNsDuP+zk2iLHwDnrBxSTivBF9zGlk//l1CoEUmyMGLE9eTnXfitt5v6aKksY/sH7wFw4sWXIUkSL7f3sMkXwKTGmFa7kynObVilKB3RVCiLkXruaSgpKUd34AKBQCA4bAhRcwgJxoLcs/EentzxJDo6Oc4cfj/798zKnbVbW13XCXzaRs8rVehRDdlpInXpSGyj02ho/BeVlX8WdtNe0HWd9x97CICxx51I9vCRRDWdW6uNKM0xDZVMrNvG2NwqADo+T0RWVNJ+cd1RG7NAIBAIDj9C1BwiNrdv5oY1N1DnrQPgnBHn8MtpvyTBkrBbWy0Qpfv5CoJbuwCwDk8h7buj0OxBtmy5nI7OFQC4XAsZM/pWYTftws41q2gp34nZamPeBRcD8ERzJzXBCPZIiMn15UxK2olZitEWTUeqiJK86DjMWZlHeeQCgUAgOJwIUXOQhNUw9266l8e2PYaOzv9v787Dojrvxv+/Z4MZtmFHVBDcEAU3cEFwSxqtUVyapprFaOqWfpO2Jn2+z1db/T3PE6s0bWPsEm3UmKqx0eRR0zQxi01UVFQiatSoICKCCCL7OsMs5/fH6BgCJqLIKHxe1zXXBefc53B/Ts6V+fg597nvYI9g/mfE/5DUJan59jmVlG07h62yAdQqjOMj8BrZhaqak5z+8hfyuOl7WEwmUrc4XuEeOvVxvPwDqLbaePUbyyHEXjpNn7CLABR/5Y1W1UDAwt+4rM9CCCHahiQ1d8lkNfFhzocoKEzpMYX/HPqf+Lg1rawoNoWqzy9RvScfFNAGGvCfEYWui1ejx00GfTgxMX/GxyfWBdHc/9I/2E5NWSk+QSHETZoKwOq8YkotVox11fQryKG/XxZalY1CSxDabDM+I+Nw69bNtR0XQghxz0lSc5eM7kZ+m/hbLHYLY8LGNNvGWmaibOs5GvKqAfCIC8F3cg9s6ppGj5uCgyYQHZ2CVuvdVt1/oFSVFHP0A8dcP6OffhadmztXzRb+ll8MwLCLZxlw6RS9IxyPAK+d8ERHAwG/WuqyPgshhGg7ktS0gsQuibfcV3eimPKd2ShmGyq9Br9pvfAYEERl1VecPv3Nx02/pmuXp+Vx03dI3fJ3rJYGukbH0GuY45q/mltEvV0hpKqM3ldy6ReYjUZl57IlBN2FBrziotD36ePingshhGgLktTcI3azlYp/XqDumKOK4NbNB//pUWj83MnLf0seN7XQ5XNfk5mWCiqVYxVulYrztSa2XHEMth6e8zUD8k/Rs3s+ACXHPXDHTMCLS1zZbSGEEG1Ikpp7oCG/mtKt57CVmhwLUT4Ujs9D4VjtVZw6tVAeN7WQYrc7VuEGYsc+QkhkDwBW5BRiAyJKCom8WkDfkBzUKoU8a2fcc8x4RHfDIz7ehT0XQgjRliSpaUWKXaE69TJVn10Cu4LG1x3/GVG4RxivP276OSZTgTxuaqGv933O1Zxs3AweJE6fCUB6RQ0fl1SiUhSGXjzDoLzjdO/lmOyw7Jged0wE/FLmpRFCiI5EkppWYqs0U/ZuJuYLlQAYYgPxm9YTlUHb+HGTIZyYmL/g4x3j4h4/GBrq6ziwdRMAwx+bgaev3/XlEAoBiCq6RNdrRUR
1yUOlgovWrrjnmHDvFozn6NGu7LoQQog2JklNK6j/upTy7VnY66yo3NT4Tu6BR1wIVmsVZ079JyUl/wYgOPhRovuskMdNLXBk57vUVpTj2ymUwROSAfi4pJIvq2rR2mwMyT1HXP4xIvpcQQHKM9wxUE/gL/5DqmBCCNHBSFJzl2yVZkrfOQtWBV0XL8fcM0EeVFae4PTXv3A+burd6zd06fKUfNG2QMXVIjI+eh+A0TPnotHqsNoVVuQ4qjT9L2cTWlJM7zDH4OAcazcMF+tx6+SL9w8fdVW3hRBCuIgkNXdJY3TH+MNIbFVmjOMiQKMiL28D2RdeQVGs8rjpLuzb/CY2q5Xw2IH0iBsKwD8KS8muM6O3mBmYf54hl48S1vcqdlRUHdeix0LAcy+g0mhc3HshhBBtTZKaVuCd1AUAi6WCM6f+nzxuagV5p0+S/eUhVCo1Y5+Zi0qlotZq44+5RQDEXcok5Fox3SMci1hm2yLRX6hH6+eJ8UePu7LrQgghXESSmlbS9HHTErp0eVIeN90Bu93G3o1rARgwbgKB4REA/C3/GsUNVnzqa+l75SJDrxylS8w1bKipOabCAATMmYfKzc11nRdCCOEyktTcJUVRyM/fQPaF3zsfN8XG/BVv736u7toD69Tnn3EtLxe9pxcjHn8KgGsNFlZfXw5h6MUzhBYVEtnDUbU5b++B4UItGk93fJ+c6bJ+CyGEcC1Jau6SueEqORf/jKJY5XFTKzDV1nBw22YAEh5/EoO3Y3HQlblXqbXZCaoup8e1AoYWH6NTSClWNNSfUNADfjNnovbwcGHvhRBCuJL6Tg5avXo1kZGR6PV64uLi2L9//3e2f/3114mOjsZgMBAVFcWmTZuatFm1ahVRUVEYDAbCwsJ48cUXMZlMjdoUFBTw9NNPExAQgIeHBwMHDiQjI+NOQmg1evdO9I3+PVG9Xyam358loblLh7e/Q311Ff5dwhjwiOMNppw6M5uvlACO5RC6XL5MeC9H1SZL6YU+qw61uxb/Z+e6rN9CCCFcr8WVmm3btrFw4UJWr15NYmIib7zxBhMmTODMmTOEh4c3ab9mzRoWL17MunXrGDJkCOnp6cybNw8/Pz+Skx3zjmzZsoVFixaxYcMGRowYQVZWFrNnzwbgtddeA6C8vJzExETGjh3Lxx9/THBwMBcuXMDX1/fOo28lwcHjXd2FdqHsymWOf/IhAGOemYtG67g9V+RcwapAeGkRXcqvMbTsGMFdy7GgxXzCgh7wffxHaIxGF/ZeCCGEq7U4qVm5ciVz5sxh7lzHv4pXrVrFp59+ypo1a0hJSWnSfvPmzSxYsIDp06cD0L17dw4fPswrr7ziTGoOHTpEYmIiTz75JAARERE88cQTpKenO8/zyiuvEBYWxltvveXcFhER0dLui/vYvs1vYrfZ6D54CJED4wA4VlnLh9ccyyEMu3iGbpdyCYtyVG3O0Qd9ZhkqjQr/+c+7sutCCCHuAy16/NTQ0EBGRgbjxo1rtH3cuHGkpaU1e4zZbEav1zfaZjAYSE9Px2KxAJCUlERGRoYzicnJyWHXrl1MnDjRecwHH3xAfHw8jz/+OMHBwQwaNIh169Z9Z3/NZjNVVVWNPuL+lHsig5xjX6LWaBg9cw7gGIT98gXHK9u9r+YRWFPJkKrjBLhXYkaH9WQ9AMbkCeiCg13WdyGEEPeHFiU1JSUl2Gw2QkJCGm0PCQmhqKio2WPGjx/P+vXrycjIQFEUjh49yoYNG7BYLJSUOP7FPWPGDJYtW0ZSUhI6nY4ePXowduxYFi26uSBhTk4Oa9asoVevXnz66ac899xz/OIXv2h2fM4NKSkpGI1G5ycsLKwl4Yo2YrNa2bNpPQCDfjgJ/85dAdhdWsXhylq0djtDcs/RPTubLn3KATin6of+TD2oIOD//NJlfRdCCHH/uKOBwt+ee0VRlFvOx7J06VImTJjA8OHD0el0TJkyxTleRnN91te9e/eyfPlyVq9ezbFjx9ixYwcffvghy5Ytc57HbrczePBgVqxYwaBBg1iwYAHz5s1jzZo1t+zn4sWLqaysdH7y8/PvJFxxj321exdlBfkYvH0Y/tgTAFjtCr+9vmhlzOVsjPW1DKk7gZ9bFSbcsZ+uAcDnB6Nwa2YslxBCiI6nRUlNYGAgGo2mSVWmuLi4SfXmBoPBwIYNG6irqyM3N5e8vDwiIiLw9vYmMDAQcCQ+M2fOZO7cucTGxjJt2jRWrFhBSkoKdrsdgNDQUPr27dvo3NHR0eTl5d2yv+7u7vj4+DT6iPtLfXUVh977BwCJ059G7+kFwLtFZWTVmTDYLAzKO0/vzLOERlcAcEYTi/5ULQABL/zKJf0WQghx/2lRUuPm5kZcXBy7d+9utH337t2MGDHiO4/V6XR07doVjUbD1q1bmTRpEmq148/X1dU5f75Bo9GgKAqKogCQmJhIZmZmozZZWVl069atJSGI+0zae1sw1dYQFB5B7MOOt8jqbHZ+f9GROA/MPYdXg4lBltMY3WqoQ4/663JAhVfCYPRRvV3YeyGEEPeTFr/99NJLLzFz5kzi4+NJSEhg7dq15OXl8dxzzwGORz4FBQXOsS5ZWVmkp6czbNgwysvLWblyJadPn2bjxo3OcyYnJ7Ny5UoGDRrEsGHDyM7OZunSpUyePNn5iOrFF19kxIgRrFixgp/85Cekp6ezdu1a1q5d2xrXQbhASf4lvtr9MQBjZs1HrXb8t15/+RpFDRaMDSZiCi7S5+vThEZXAnBGOxDPk5cAFQG/+A9XdV0IIcR9qMVJzfTp0yktLeXll1+msLCQmJgYdu3a5ayYFBYWNnokZLPZePXVV8nMzESn0zF27FjS0tIavY69ZMkSVCoVS5YsoaCggKCgIJKTk1m+fLmzzZAhQ9i5cyeLFy/m5ZdfJjIyklWrVvHUU0/dRfjCVRRFYe+m9Sh2Oz2HJBAe0x+A0gYrf7l0FYC4C6fxtDQwSHUGb10tNXjgdrYE7Co8+vfBY9AgV4YghBDiPqNSbjzf6QCqqqowGo1UVlbK+BoXyz56hH/+YRkarZbZK/+Gb0gnAJaev8y6yyUE11Ux7csviD9+lIdij+KprSfdbSQ+W86j2NSErVuH18gkF0chhBCiLdzu9/cdvf0kxN2wWizs2+x4hXvwxKnOhOZSvZm/F5QCMCT7FJ4NDcTqzuGpracKL/RZRSg2NfqeYXgmJbqs/0IIIe5PktSINnf8k39RUVSIp68fw6f9xLn9dzmFWBSF8MoSwsqvEXsig+A+1QB87T4E9wzHzwEvvHTLKQSEEEJ0XJLUiDZVV1nB4e1bAUia8QxuBseq2ieq6thZXIEKhaHZp/Cpr6efZzYeWhPl+OCVfQW7RY1b5yC8H3nElSEIIYS4T0lSI9rUgW2baaivI6R7T/qNfhhwDBr+7Y3lEK5dIbCmkgEnviS4j2NZizOG4egzKgAIeP6XqK6/ESeEEEJ8kyQ1os1cvXiBU198BsCYWfNQXZ+baE9ZNQcqatAqCnEXvsa/uoYo40X0mgZK8cV4MR+rSYM2wI
jx+iKoQgghxLdJUiPahKIo7N24DhSFqISRdO3TDwDbN6o0/a7k4GOuY8CJdIJujKXxGIH+S8fg4YD5z6Fyc3NNAEIIIe57ktSINnH+yEEunz2NVufGqKefdW7ffrWcM7UmPBQ7g3LPEVxRQc/gPNzVForxJyD/EpZaLRovA76P/+Q7/oIQQoiOTpIacc9ZGxrY9/ZbAMRPfgyfwGAATDY7r+Q4Fq3sf/EsequFgSeOENTLsVjl114jMaQXA+D/7BzUHh4u6L0QQogHhSQ14p7L+Oh9qq5dxcs/gKGTH3Nuf7OghAKzBV+bhZiCC3S+VkJk50Lc1BaKCCKkMBdzpQ61Xoff00+7MAIhhBAPAklqxD1VU1bKkZ3vAjDqydno9HoAyi1W/nx9OYSB50+htdsZdOoIQT0dbzx97T0aj8MFAPg+8SQao9EFvRdCCPEgkaRG3FP739mIxWwitFcUfZLGOLf/+dJVKq02Qi0mel/No1thIeHdrqJV2ygghC4lOdSXuqHSavB/9qeuC0AIIcQDQ5Iacc8UZmdyJvULAMbOnu+cBTjf1MCbl0sAGHTuGBpg4Ol0ArpfH0tjfAiPQ7kAGKdNRRcc3OZ9F0II8eCRpEbcE4qisGfjOgD6jnqI0J5Rzn2v5BTSoCh0r68mrKyY7pfy6NKzBK3KRh6d6VaZQ22RHlQqAuYvcFUIQgghHjCS1Ih74tzBfRRmnUPnrmfkE7Oc27+uqWf71XIABp7JQAsMyPySgAhHleaM7w/wPJgFgM+ER3ALC2vzvgshhHgwSVIjWp3FZCJ1i+MV7qFTH8fLP8C577cXrqAAMVUlBNdU0Cv7AqG9y9Go7OQQRvfaXKrzHYOJA5573hXdF0II8YCSpEa0uvQPtlNTVopPUDBxk6Y6t6eWVbOnrBot0P/sMXRA/wsZBHRzVGnO+j2C14GvABVeoxLR9+7tiu4LIYR4QElSI1pVVUkxRz/YDsDop3+Kzs0dALuisOz6cgiDSgrwMdURdfYcIdGVqFUK54kgynKZykuOCfYC/88LrglACCHEA0uSGtGqUrf8Haulga7RMfQalujc/n5xBadq6jGg0DfrK/RAbN4J/MJqATgbMB6v1MNgV+ExeACGgQNdE4AQQogHliQ1otVcPvc1mWmpoFI5VuG+/gq32W4n5fpyCPFXcjBYGuhz8iRB/apRqxTO0YO+XKHigqNKE/B/fu6yGIQQQjy4JKkRrUKx2x2rcAOxYx8hJLKHc9/fC0rINzXgj0LvC2fwVKBf4Sn8uzrG0pwLnIDP3r0oNjX6Pj3xTBzhkhiEEEI82CSpEa3i632fczUnGzeDgcTpM53bKy1WVuU6lkOIu3gGnd1G9LEMAvs7Hjudpjex2kLKszwBCPjZz50VHiGEEKIlJKkRd62hvo4DWzcBMPyxJ/D09XPu+2teMeVWG12wEZmXjdGuEF16Fr/QWuyoyAp6FOOeT7Bb1LiFd8H7kR+4KgwhhBAPOElqxF07svNdaivK8e0UyuAJyc7tV0wNrLt8DYDB506gRqHvkcPOKs0p+jDAUEJZphfgmJdGpZZbUgghxJ2RbxBxVyquFpHx0fsAjJ45F41W59z3+4tFmOwKUfYGOl/NJ8Bmo3dNNsZOdY4qTfCj+H3xPjaTBm1wAMZJE10UhRBCiPZAkhpxV/ZtfhOb1Up47EB6xA11bj9bU8+7RWUAxJ5KRwX0PXiQgIGOKs1x+hHnVU7Z2etvPM1dgMrNrc37L4QQov2QpEbcsbzTJ8n+8hAqlZqxz8xtNMB3eU4hdiDOWkdgRQkhFgs9rLn4BNVjQ0128ET8976LpVaLxuiF7+M/dl0gQggh2gVJasQdsdtt7N24FoAB4yYQGB7h3HewvJp/l1ahBaJOHAKgX+o+Agc4qjQZxBLvV03Z1wYA/J+dg9pgaNP+CyGEaH8kqRF35NTnn3EtLxe9pxcjHn/KuV1RFJZdcEy0N8JUiU9tNV1NZiI0BXgFmLCg4UKnSQTtextzpQ61QY/fk0+6KgwhhBDtiCQ1osVMtTUc3LYZgITHn8Tg7ePc98G1Ck5U1+GhVhFxLA2Avnv3EHC9SnOUAQwNqKPslGP8jN9TT6Px8UEIIYS4W5LUiBY7vP0d6qur8O8SxoBHHnVub/jGcggjK6/iYTETWVtLmMdVPP3MNKAlp9MkQva/RX2pGyqdFv9Zz7gqDCGEEO2MJDWiRcquXOb4Jx8CMOaZuWi0Wue+zVdKya1vwF+jouuJI6hUKvrs3eus0hxhEMNDGig76bjtjI89hjYoqO2DEEII0S7dUVKzevVqIiMj0ev1xMXFsX///u9s//rrrxMdHY3BYCAqKopNmzY1abNq1SqioqIwGAyEhYXx4osvYjKZnPv/+7//G5VK1ejTqVOnO+m+uAv7Nr+J3Waj++AhRA6Mc26vttp4NbcIgFHFeejsNnpWVNDFeA0PoxkzblwKnURo2lpqi/SgVhMwd66rwhBCCNEOab+/SWPbtm1j4cKFrF69msTERN544w0mTJjAmTNnCA8Pb9J+zZo1LF68mHXr1jFkyBDS09OZN28efn5+JCc7Zp/dsmULixYtYsOGDYwYMYKsrCxmz54NwGuvveY8V79+/fj3v//t/F2j0bS0++Iu5J7IIOfYl6g1GkbPnNNo3+q8YsosNsK0KoK+PoZGpSJq3z4CxjqqNIcYREJnG2XvKQD4THwUt65d2zwGIYQQ7VeLk5qVK1cyZ84c5l7/V/aqVav49NNPWbNmDSkpKU3ab968mQULFjB9+nQAunfvzuHDh3nllVecSc2hQ4dITEzkyetvwURERPDEE0+Qnp7euLNarVRnXMRmtbJn03oABv1wEv6dbyYkV80W/pbvWA5hRF4mGkWhd2kpoUGlGLwbqMed/M6TGH7kr1y8rAcgcP78tg9CCCFEu9aix08NDQ1kZGQwbty4RtvHjRtHWlpas8eYzWb0en2jbQaDgfT0dCwWCwBJSUlkZGQ4k5icnBx27drFxImNp80/f/48nTt3JjIykhkzZpCTk/Od/TWbzVRVVTX6iDvz1e5dlBXkY/D2YfhjTzTa98fcIurtdvrqVPieP4NOo6H3/lQC+tcBkEY8I7qqKcswAyq8HhqLe69eLohCCCFEe9aipKakpASbzUZISEij7SEhIRQVFTV7zPjx41m/fj0ZGRkoisLRo0fZsGEDFouFkpISAGbMmMGyZctISkpCp9PRo0cPxo4dy6JFi5znGTZsGJs2beLTTz9l3bp1FBUVMWLECEpLS2/Z35SUFIxGo/MTFhbWknDFdfXVVRx67x8AJE5/Gr2nl3NfVq2JLVcc/w2GZJ5ABUQXXCEktAK9ZwO1GLjceSJhX/6NylzHBHuBCxa0eQxCCCHavzsaKPzN6fDBMeHat7fdsHTpUiZMmMDw4cPR6XRMmTLFOV7mxpiYvXv3snz5clavXs2xY8fYsWMHH374IcuWLXOeZ8KECTz22GPExsbygx/8gI8++giAjRs33rKfixcvprKy0vnJz8+/k3A7vLT3tmCqrSEwPILYh8Y32
rci5wp2IEGrYMi/iF6rpWfaAQKur8R9gCEkdtNRllEJigqPofEYBgxwQRRCCCHauxYlNYGBgWg0miZVmeLi4ibVmxsMBgMbNmygrq6O3Nxc8vLyiIiIwNvbm8DAQMCR+MycOZO5c+cSGxvLtGnTWLFiBSkpKdjt9mbP6+npSWxsLOfPn79lf93d3fHx8Wn0ES1Tkn+Jr3Z/DMDYWfNQf2NwdnpFDZ+UVKEG+p5yPDrsm5tLUFgl7gYL1XhwpfMEup34GxUXHAtXBj73szaPQQghRMfQoqTGzc2NuLg4du/e3Wj77t27GTFixHceq9Pp6Nq1KxqNhq1btzJp0iTUasefr6urc/58g0ajQVEUFEVp9nxms5mzZ88SGhrakhBECyiKwt5N61HsdnoOSSA8ZkCjfS9fuALAI1obuuJCPHU6uh85RECsYyzNfoYysrsn5UeKUWxq9P2i8UhIcEksQggh2r8Wv/300ksvMXPmTOLj40lISGDt2rXk5eXx3HPPAY5HPgUFBc65aLKyskhPT2fYsGGUl5ezcuVKTp8+3eixUXJyMitXrmTQoEEMGzaM7Oxsli5dyuTJk52PqP7jP/6D5ORkwsPDKS4u5re//S1VVVXMmjWrNa6DaMaFjHQunTyORqtt8gr3rpJKjlbVYVCriMg4AEDMuUwCI6pw01uoxJuiLhMYd+oNLpz3BCBgwXO3fEwphBBC3K0WJzXTp0+ntLSUl19+mcLCQmJiYti1axfdunUDoLCwkLy8PGd7m83Gq6++SmZmJjqdjrFjx5KWlkZERISzzZIlS1CpVCxZsoSCggKCgoJITk5m+fLlzjaXL1/miSeeoKSkhKCgIIYPH87hw4edf1e0LqvFwr7Njle4B0+cim/IzVfpLXaFFdcXrZygmFBVlGN0c6Pb8XQCkh1VmlSGMqqXL5VrLmG3+OAW0Q3vH/yg7QMRQgjRYaiUWz3faYeqqqowGo1UVlbK+Jrv8eW/dpD69gY8jL7M+dNa3Awezn0bC0r4f1mX8ddq+MmhT6CulsRzmfQ3HSBkYBXl+LCjyxJmN2zjwrrL2EwaQlNS8J021XUBCSGEeGDd7ve3rP0kmqirrODw9q0AjHxiVqOEptZq44/Xl0OY1FAJdbUEuLsTdvoYAf0cVZq9DGd0nyCq0s5iM2nQdgrBOGli0z8khBBCtCJJakQTB7ZtpqG+jpDuPek3+uFG+9bkX+Nag5Vwdy2eh/cBEPPlUfx716LVWSnBj9LOD9M9601KzznmswmYOw+VTtfmcQghhOhYJKkRjVy9eIFTX3wGwJhZ81B94620aw0WVucXA/BoVTGKxUInd3dCz5/Gv++NKk0Co2PDqN6fgaVWi8bPF9/HftT2gQghhOhwJKkRToqisHfjOlAUohJG0rVPv0b7X829Sp3NTqzBDVW6442nfmlpBETVoNXauEoA5Z3H0CP775SedVRp/GfNRm0wtHksQgghOh5JaoTT+SMHuXz2NFqdG6OefrbRvgt1Jt6+4ljW4uHiXBS7nXB3d0IuZeIfXQ84qjRjBkRSm7oPc6UOtacHfk8+0eTvCCGEEPeCJDUCAGtDA/vefguA+MmP4RMY3Gj/ipxCrAqM8nLHdMwxe3D0nr34R9ei0dgoJJiqzqPpcfFtSs54A+D35FNo5C0zIYQQbUSSGgFAxkfvU3XtKl7+AQyd/FjjfZW1fHStEjWQkHsWgB7u7gQUX8Q/yjGW5gsSGDO4F/X7PsFU6obKzQ3/Wc+0dRhCCCE6MElqBDVlpRzZ+S4Ao56cjU6vd+5TFIVl15dDmOjjTuXXJ1GpVER99hmB0TWo1XYu04m6zkn0zNtG6RnH7MG+P34M7fW1vYQQQoi2IEmNYP87G7GYTYT2iqJP0phG+3aXVnG4sha9WkXM2eMAROl0+FUU4NvLMZbmC0YwOr4vpr07qL2qB7Ua/5/O+fafEUIIIe4pSWo6uMLsTM6kfgHA2NnzG63NZLXfrNI87qWjPDsLtVpN7492Edi3GrXaTi5dMIUOp1fBdkrPOCbpMyZPwq1rl7YPRgghRIcmSU0HpigKezauA6DvqIcI7RnVaP+2ojLO15nx02oIO3EYgH5qNT6mq/j2cFRp9jCCMcMG0rDvHaovOx5bBcyd24ZRCCGEEA6S1HRg5w7uozDrHDp3PSOfaLzaeZ3Nzh8uOpZDeMIAZfl56LRaevzrQwL71aBSK1wgHEvnofQq/Celp90BFV4/eBj3Xr1cEI0QQoiOTpKaDspiMpG6xfEK99Cpj+PlH9Bo/7r8axQ1WAjT6zCm7wcgxmbHy1aKMfLmWJoxwwdj3f93Ki85JtgLnD+/DaMQQgghbpKkpoNK/2A7NWWl+AQFEzdpaqN9JQ1W/pJ3FYAnNRbKi4vRu7sT+c9/EhRTjUqlkEUkSuc4ehXvovSUBhQVHsOHYejf3wXRCCGEEJLUdEhVJcUc/WA7AKOe+ik6N/dG+1ddKqLGZifWSw9pewEYUG/CU1uJT7cbY2kSGDNiCLbU9VTkOF7jDlywoO2CEEIIIb5FkpoOKHXL37FaGugaHUPv4YmN9l2qN7OxoBSAx63VVFZU4Gkw0NVZpYEz9ETVeRC9Sv9N2Wk7ik2FPjYGj+HDXRGOEEIIAUhS0+FcPvc1mWmpoFI5VuH+xivcACk5hVgUhdG+nlQd3AfAwKoqPA01+ISbULi+EnficOz711B+3lGlCZg/v8m5hBBCiLYkSU0HotjtjlW4gdixjxAS2aPR/uNVdbxfXIEKSK65Rk1NDb5eXoRer9IAnCYKTWgsvStTKT9lxm5R49a9O94PP9zW4QghhBCNSFLTgXy973Ou5mTjZjCQOH1mo32KovDb6xPtTQ304fLBVAAGXivB06ce764m7KjYy3DGjExE2f9Xys7dqNLMQ6WWW0kIIYRryTdRB9FQX8eBrZsAGP6jGXj6+jXa/0VZNQcranBTqRhdfAmTyUSg0UjQv/5FUKyjSnOSaNxC+9G79ggVJyuxmTXoOodinDixzeMRQgghvk2Smg7iyM53qa0ox7dTKIMmTG60z/aNKs0zIUYuHE4DYGD+ZTz96/EKNWNDzT6GMWbUSEhdRdk5LwD858xBpdO1bTBCCCFEMySp6QAqrhaR8dH7AIyeORftt5KQ94rKOFtrwqjV0P/iWSwWC6H+/vjv2uWs0hynH/rQaHqbv6Lq5DUstVo0/v74PvZYW4cjhBBCNEuSmg5g3+Y3sVmthMcOpEfc0Eb76m12fn99OYT5wT6cO/olAAPPZ+MZbMYzpAErGlIZxuhRo2D/q5ScvV6lmTULtV7ftsEIIYQQtyBJTTuXd/ok2V8eQqVSM/aZuU1eu37z8jWumC10cdcRfvYEdrudboGB+Oz+jKD+jipNBrF4dOpFlO0cNV/l01ClQ+3lid+TT7giJCGEEKJZktS0Y3a7jb0b1wLQ/5EJBIZHNNpfZrHy5+vLIfws0JMzX51wtD19Gs9QMx6BDVjQsp8hjBkzxlGlOeOo0vg9+RQab++2CkUIIYT4XpLUtGOnPv+Ma3m56D29SPzJ
U032/+nSVaqsdvp66vE8fgRFUegZEoLH3r0ExdYAkM4AvDr1JEp1ibqTmZjK3FC5u+E/65m2DkcIIYT4TpLUtFOm2hoObtsMQMLjT2Lw9mm0P6/ezFuXSwD4ma87mWfPAtDvyy/x6mLC4N9AAzoOEs+Y0aNRHXiV0jOOyozvjx9HG9B4VW8hhBDC1SSpaacOb3+H+uoq/LuEMeCRR5vs//3FIhoUhSRfL+qPHACgb6dQ9GlpBPV3VGkOMwifTt2Jcr9G/Ylj1F51B42GgJ8+26axCCGEELdDkpp2qOzKZY5/8iEAY56Zi0arbbT/VHUd26+WAzDHADkXLqBWq+mzfz/eYSb0Rgsm3EkjjjFjxjiqNNffeDJOmoSuS5e2DUgIIYS4DZLUtEP7Nr+J3WYjclA8kQPjmuz/7YVCFGBasC8FB/YCEBvSCd2xo84qzSEG49spgijPaszHD1B92QAqFQHz57VhJEIIIcTtk6Smnck9kUHOsS9RazSMeWZuk/37yqrZV16NTqViOvVcvnwZrVZLr3//G59u9bh7W6hDz2EGMXr0aFT7b1ZpvH/wMO49ejQ5pxBCCHE/uKOkZvXq1URGRqLX64mLi2P//v3f2f71118nOjoag8FAVFQUmzZtatJm1apVREVFYTAYCAsL48UXX8RkMjV7vpSUFFQqFQsXLryT7rdbNquVPZvWAzDoh5Pw79y10X77N5ZDmNU5gDP79jjaBgejOX2SoNhaAA4Sj1+nbvTxtWI5/hmVlwwABMyf31ahCCGEEC2m/f4mjW3bto2FCxeyevVqEhMTeeONN5gwYQJnzpwhPDy8Sfs1a9awePFi1q1bx5AhQ0hPT2fevHn4+fmRnJwMwJYtW1i0aBEbNmxgxIgRZGVlMXv2bABee+21Ruf78ssvWbt2Lf3797+DcNu3r3bvoqwgH4O3D8Mfazox3s6r5Zyqqcdbo2Z8XSmfFxfj7u5O5EcfYYysw83TQi0epDOQH40ejerA7yk95wmKCs8RCRhiY10QlRBCCHF7WlypWblyJXPmzGHu3LlER0ezatUqwsLCWLNmTbPtN2/ezIIFC5g+fTrdu3dnxowZzJkzh1deecXZ5tChQyQmJvLkk08SERHBuHHjeOKJJzh69Gijc9XU1PDUU0+xbt06/Pz8vv2nOrT66ioOvfcPABKnP43e06vRfpPNTsrFQgCeDwvi2L69AMQHBKA+n+ms0uxnCP4hXYkK1GLN+CcVOZ6AVGmEEELc/1qU1DQ0NJCRkcG4ceMabR83bhxpaWnNHmM2m9F/a30gg8FAeno6FosFgKSkJDIyMkhPTwcgJyeHXbt2MXHixEbHPf/880ycOJEf/OAHt9Vfs9lMVVVVo097lfbeFky1NQSGRxD70Pgm+/9eUMJlk4VQdx3xJQWUl5fj6elJ15078e1eh85gpRovjtKfMWPGoD64irJMDxSbCn3//ngMG+aCqIQQQojb16KkpqSkBJvNRkhISKPtISEhFBUVNXvM+PHjWb9+PRkZGSiKwtGjR9mwYQMWi4WSEsfkbzNmzGDZsmUkJSWh0+no0aMHY8eOZdGiRc7zbN26lWPHjpGSknLb/U1JScFoNDo/YWFhLQn3gVGSf4mvdn8MwNhZ81BrNI32V1qs/OmSYzmEl8KCOJK6D4AhPj6o8i4SGOOo0qQylICQLkR18sB2dBvl2Y4qTeCC+U3WjBJCCCHuN3c0UPjbX3CKotzyS2/p0qVMmDCB4cOHo9PpmDJlinO8jOb6l+/evXtZvnw5q1ev5tixY+zYsYMPP/yQZcuWAZCfn88vf/lL3n777SZVn++yePFiKisrnZ/8/Pw7iPb+pigKezetR7Hb6TkkgfCYAU3a/CWvmHKrjShPPd0unae6uhqj0Ujou+/h16MWrd5KJT4cox+jR49GnfZnyrPcsVvUuPXsgdfYsS6ITAghhGiZFiU1gYGBaDSaJlWZ4uLiJtWbGwwGAxs2bKCuro7c3Fzy8vKIiIjA29ubwMBAwJH4zJw5k7lz5xIbG8u0adNYsWIFKSkp2O12MjIyKC4uJi4uDq1Wi1arZd++ffz5z39Gq9Vis9ma/dvu7u74+Pg0+rQ3FzLSuXTyOBqtltEz5zTZX2BqYN3lawD8Z1gghw46Zg8eajCgFOUTEFMHwF6GERjShT5d/bCnb6Ys83qVZt48VGp5818IIcT9r0XfVm5ubsTFxbF79+5G23fv3s2IESO+81idTkfXrl3RaDRs3bqVSZMmob7+ZVlXV+f8+QaNRoOiKCiKwsMPP8ypU6c4ceKE8xMfH89TTz3FiRMnnBWfjsZqsbBvs+MV7sETp+Ib0qlJm99fLMJsVxhu9ER/9iT19fUEBgQQ+I938O9Vi9bNSjm+fEW0o0pz+HUqsjXYzBp0Xbrg82jTJRaEEEKI+1GLX+l+6aWXmDlzJvHx8SQkJLB27Vry8vJ47rnnAMcjn4KCAudcNFlZWaSnpzNs2DDKy8tZuXIlp0+fZuPGjc5zJicns3LlSgYNGsSwYcPIzs5m6dKlTJ48GY1Gg7e3NzExMY364enpSUBAQJPtHcnxT/5FRVEhHkZfhk/7SZP9Z2vqebeoDID/6OzHvo/fBWCoRgNlRQSMdFRp9jCMwOBQ+nQLQdm+wTnZnv+cn6LS6dooGiGEEOLutDipmT59OqWlpbz88ssUFhYSExPDrl276NatGwCFhYXk5eU529tsNl599VUyMzPR6XSMHTuWtLQ0IiIinG2WLFmCSqViyZIlFBQUEBQURHJyMsuXL7/7CNupusoKDm/fCsDIJ2bhZvBo0ubGcgjJQb7UnDiKxWKhc6dOGDe/jX9ULRqdjRJVAKeUPjw+ZgzqI3+jItuOtU6LJiAA3x/9qI2jEkIIIe6cSlEUxdWdaCtVVVUYjUYqKysf+PE1n639C6c+/5SQ7j15avnKJuNeDpRX8+MTF9Cq4KPoLnyw9m/YbDYmBwTgue6v9Jx8DY3Wxns8yrXgkTz37FOoVsWSs9ONhiodQb96icB5ss6TEEII17vd728ZAfoAunrxAqe++AyAMbOaDuS1KwrLri+HMLNzILmH07DZbHQLC8Pz7S0ERNWg0dq4qgriDL0d89IcfZPqC2YaqnSovb3xe6LpjMRCCCHE/UySmgeMoijs3bgOFIWohJF07dOvSZsPiiv4qroeT42a2T46Tpw4AUB8TQ0qUxn+fRxjab5QEggKDqFPj24oaa9TesYxlsbvySfReHk1Oa8QQghxP5Ok5gFz/shBLp89jVbnxqinn22yv8FuJyXn+nII4cGc3J+Koij06t4dt7e3EBBdg1pjp1DViUy6O954Or6ZuotVmMrcUOn1+D8zs63DEkIIIe6aJDUPEGtDA/vefguA+MmP4RMY3KTNpiulXDI1EOymZYrWxtdffw3A4GvXUFsq8OtdD8DnSgLBwSFE9+4BaX+m5Iw3AL4//jHagIA2ikgIIYRoPZLUPEAyPnqfqmtX8fIPYOjkx5rsr7baWJnrmBjxPyI6cWjvHgBioqJQbfkHAX1rUKvtXFZ1IZt
ujirNya3UXyymrtgdtFoCftq0+iOEEEI8CCSpeUDUlJVyZKdjnplRT85G18xyEa/nFVNmsdHTw50RDdVkZ2ejVqvpfykPDVX49XSMpflcGU5QUDDRUb3hwGuUXB9LY0xORte5c9sFJYQQQrQiSWoeEPvf2YjFbCK0VxR9ksY02V9ktvBGfjEAv44MZd8XXwAwMDoa+zvvENi3BpVaIVcVzkXCHW88ndmJOfcyNQUGUKkImDe3LUMSQgghWpUkNQ+AwuxMzqQ6kpSxs5tfMfsPFwuptysM8fGkR/lV8vPz0Wq19D2XiVZbi2+PG2NphhMUFER0nz6w/1Xn7MHeP/gB7t27t11QQgghRCuTpOY+pygKezauA6DvyLGE9oxq0iaz1sQ7hY7lEJZ078QX16s08X37Yn33XQJjqlGpFHJUkeTTxTGWJmsXDRfPU3nJAEDA/PltFJEQQghxb0hSc587d3AfhVnn0LnrSXpyVrNtVuRcwQ5MCDTiWXCJq1ev4u7uTtSJr9Dp6zFGOKo0/75epenbty+k/pGyc16gqPAcMQJDbMddQ0sIIUT7IEnNfcxiMpG6xfEK99Cpj+PtH9ikzeGKGj4tqUKjgkURwc4qzbB+/TDt2HG9SgPn1b24QidHlebCF1gvnqTiomO9KKnSCCGEaA8kqbmPpX+wnZqyUnyCgombNLXJfuUbyyE8FRpATdY5ysrK8PDwoHvaIdw96/EJv16lsQ+7WaXZ/0fKMj1RbCr0A/rjMWxoW4YlhBBC3BOS1NynqkqKOfrBdgBGPfVTdG7uTdp8dK2SjKo6DGo1v+gawN69ewFI6NuX+g8+cFZpzqr7cJUgR5Um7xC27MOUn/cEIHDBgmYHHgshhBAPGq2rOyCal7rl71gtDXTp04/ewxOb7LfYFVZcXw7hZ+FB5H11gurqanx8fAj/Yg92owmfMBMKKr6wD71ZpdnyGOXnPbFb1bj36onXmDFtHJkQQghxb0il5j50+dzXZKalgkp1y1e4txSWklNvJlCn5afBPuzfvx+AxD7R1O3aRWBMNQBfq/tyjQBHlabwOPbMPZRlOV7jDpjXdIVvIYQQ4kEllZr7jGK3O1bhBmLHPkJIZI8mbWqsNv540bEcwksRIZz+8kvq6+sJCAgg9ONd2AIa8O5iRlGp+cI+hMDAQEeV5t2ZVOR4YDOr0XXpgs+jj7ZpbEIIIcS9JP9Mv898ve9zruZk42YwkDi9+dWy1+QXU2KxEmlw40dGA2lpaQAk9epF7e5/ExTrqNKcVMdQhp+jSnPtLMqZDyk95xhLEzB3Diqt5LRCCCHaD0lq7iMN9XUc2LoJgOE/moGnr1+TNsVmC2vyrwHw6+6dOZJ2kIaGBjp16oT/zp0Ygsx4dTJjV2nZY4snMDCQfv36wf6VVF4yYK3TogkMxPijH7VpbEIIIcS9JknNfeTIzneprSjHNySUQRMmN9vm1dwi6mx2Bvt4MNIN0tPTAUiKiKBu3z6C+tcAcEIdSwVGR5Wm/CLK6R3OJRECZs9C7d70bSohhBDiQSZJzX2i4moRGR+9D8DoZ+ai1ematMmuM/F2YSkAS3t0Zv/+/dhsNsLDw/Hatg2PkAY8g8zY1brGVZoDr1Gd70ZDlQ61jw++M2a0ZWhCCCFEm5Ck5j6xb/Ob2KxWwmMH0iOu+cnwUnIKsSnwSIAPvW1mjh07BkBi587UHzpEcH/HWJoM1QCq8XJUaaqvoJzY6qzS+D35BBovr7YJSgghhGhDktTcB/JOnyT7y0OoVGrGPjO32Ve4j1bW8tG1StTAb3qEsmfPHhRFoVevXrhvfhuvUDOGgAZsanf22QbfrNIc/DN1hSpMZW6o9Hr8n3mm7QMUQggh2oAkNS5mt9vYu3EtAP0fmUBgeESTNt9cDmFGqD++1ZWcPn0agAQ/f+ozjhI0wDGW5kvVIGrwZNSoUajrSuDYRkrOeAPg+/jjaP392yAqIYQQou1JUuNipz7/jGt5ubh7ejLi8SebbfNpSRVHKmsxqFX838hOfP755wDExMSgeustvLua0Ps2YNUYSLUNJCAggJiYGDj0V+qv2qgrdgetloBnZ7dhZEIIIUTbkqTGhUy1NRzcthmAEY8/hYePsUkbq11heY6jSjOvaxCWq0WcP38elUrFUIMB0+mTzjeejqgGU4fBMZbGVAFfvknJGcf4GePkyeg6d26bwIQQQggXkKTGhQ5v30p9dRX+XcIY8Ejzs/tuLSrjfJ0Zf52G58ODnVWaQYMGYV23Hp+wetx9LFi0nuy39r9ZpUlfi6nYRE2BAVQqAubObcvQhBBCiDYnSY2LlF25zPFP/gXAmGfmomlmdt9am40/XHQsWvlit04U517k0qVLaDQa4hQwZ54lqH8tAIeIx4TeUaWx1MLhNc43nrwfeQT37pFtFJkQQgjhGpLUuMi+zW9it9mIHBRP5MC4Ztuszb/G1QYr4Xo3ng71c1ZphgwZgmntWozd6nHzsmDR+XDAGnOzSvPlmzSUVFOV5wFAwPz5bRaXEEII4SqS1LhA7okMco59iVqjYcwzzT8WKmmw8npeMQCLu4eSk5lJUVERbm5uDKyvpyEnm8DrVZoDSjwNuDmqNDYzHHqd0nNeoIBnYiKGmH5tFpsQQgjhKpLUtDGb1cqeTesBGPTDSfh37tpsu9dyi6ix2envbWBSgDdffPEFAAnDhlHzt7/h270ONw8LDW6+pFn73qzSHNuMtaSEyovXF66UKo0QQogOQpKaNvbV7o8pK8jH4O3D8MeeaLbNxTozG6+UAPD/9ejMqZMnKS0txWAw0Le0FGv+JQJjHVWaVGUIFnSOeWnsVjj4J8qyPFFsYBg4EI+hQ9osNiGEEMKV7iipWb16NZGRkej1euLi4ti/f/93tn/99deJjo7GYDAQFRXFpk2bmrRZtWoVUVFRGAwGwsLCePHFFzGZTM79a9asoX///vj4+ODj40NCQgIff/zxnXTfZeqrqzj03hYAEqc/jd6z+eUKUi4WYlVgrL83w70N7N27F4CkESOo/tsb+PaoRae3YnYP5LClz80qzclt2EoKKM++vnDl/PnNzk4shBBCtEdNX7n5Htu2bWPhwoWsXr2axMRE3njjDSZMmMCZM2cIDw9v0n7NmjUsXryYdevWMWTIENLT05k3bx5+fn4kJycDsGXLFhYtWsSGDRsYMWIEWVlZzJ49G4DXXnsNgK5du/K73/2Onj17ArBx40amTJnC8ePHHcsBPADS3tuCqbaGwPAIYh8a32yb41V1fFBcgQrHopVHjx6lqqoKb29veuflUXr1MoGT6wDYY4/HipZRo0ahUQEHVlJ+3hO7RYV7r154jRnddsEJIYQQLtbiSs3KlSuZM2cOc+fOJTo6mlWrVhEWFsaaNWuabb9582YWLFjA9OnT6d69OzNmzGDOnDm88sorzjaHDh0iMTGRJ598koiICMaNG8cTTzzB0aNHnW2Sk5N59NFH6d27N71792b58uV4eXlx+PDhOwi77ZXkX+Kr3Y7K0thZ81BrNE3afHM5hMc7+dFDpyY1NRWA0Y
mJVLyxFr+edWjdrZj0IXxp6Y2/v7+jSvP1TuzFFynLciyJEDB/Piq1PF0UQgjRcbToW6+hoYGMjAzGjRvXaPu4ceNIS0tr9hiz2Yxer2+0zWAwkJ6ejsViASApKYmMjAzS09MByMnJYdeuXUycOLHZc9psNrZu3UptbS0JCQktCcElFEVh76b1KHY7PYckEB4zoNl2n5dVk1ZRg7taxX9GhnL48GHq6urw9/en27lz2MuKCOjnqNJ8YYvHhobRo0ejUalg/6tU5HhgM6vQhYXhM+GHbRmiEEII4XItevxUUlKCzWYjJCSk0faQkBCKioqaPWb8+PGsX7+eqVOnMnjwYDIyMtiwYQMWi4WSkhJCQ0OZMWMG165dIykpCUVRsFqt/OxnP2PRokWNznXq1CkSEhIwmUx4eXmxc+dO+vbte8v+ms1mzGaz8/eqqqqWhNtqco6lc+nkcTRaLaNnzmm2jU1R+O31Ks2cLkH4261svp4ojklMpPwXv8Svdy1aNyv1Hp05WtfzZpUm62OUwjOUZnYCIGDOT1E1M5mfEEII0Z7d0fOJbw8+VRTllgNSly5dyoQJExg+fDg6nY4pU6Y4x8torj+C2bt3L8uXL2f16tUcO3aMHTt28OGHH7Js2bJG54qKiuLEiRMcPnyYn/3sZ8yaNYszZ87csp8pKSkYjUbnJyws7E7CvStWi4W911/hHjxxKr4hnZpt925RGedqTfhqNfyiWzAHDx7EbDYTEhJCaEYGStU1Avo6qjS7LUOwo3aMpVGrIfWPVF4yYK1VowkKxDhtWpvFJ4QQQtwvWpTUBAYGotFomlRliouLm1RvbjAYDGzYsIG6ujpyc3PJy8sjIiICb29vAgMDAUfiM3PmTObOnUtsbCzTpk1jxYoVpKSkYLfbnedyc3OjZ8+exMfHk5KSwoABA/jTn/50y/4uXryYyspK5yc/P78l4baK45/8i4qiQjyMvgyf9pNm29Tb7PzhouOa/rJbCOr6Oo4cOQLA2BEjKH9zA/5RNWi0Nmo9wzluicDf35/Y2FjI2YNy+Ril53wACJg9G7W7e9sEJ4QQQtxHWpTUuLm5ERcXx+7duxtt3717NyNGjPjOY3U6HV27dkWj0bB161YmTZqE+vpA1rq6OufPN2g0GhRFQVGUW55TUZRGj5e+zd3d3fkK+I1PW6qrrODw9q0AJD3xDG4Gj2bbrb98jStmC13cdTzbJZDU1FSsVithYWH4798P9eUERNcDjiqNcqNKo9FA6qtUF+hpqNKg9vHBd/qMNotPCCGEuJ+0eODFSy+9xMyZM4mPjychIYG1a9eSl5fHc889BziqIwUFBc65aLKyskhPT2fYsGGUl5ezcuVKTp8+zcaNG53nTE5OZuXKlQwaNIhhw4aRnZ3N0qVLmTx5svMR1a9//WsmTJhAWFgY1dXVbN26lb179/LJJ5+0xnW4Jw5s20xDfR0h3XsSM/oHzbYps1j5S95VABZ1D6WusoJjx44BMHbYMMpnP0tAnxrUGhs1Xt35qibsZpXm0iGU3AOUng0CwP/pp9B4ebZNcEIIIcR9psVJzfTp0yktLeXll1+msLCQmJgYdu3aRbdu3QAoLCwkLy/P2d5ms/Hqq6+SmZmJTqdj7NixpKWlERER4WyzZMkSVCoVS5YsoaCggKCgIJKTk1m+fLmzzdWrV5k5cyaFhYUYjUb69+/PJ598wiOPPHIX4d87xbk5nPriMwDGzJp3y9er/5R7lSqrnX5eeh4L8WPnjh3Y7XZ69uyJ1+5/U2GpxD/KMZbm04Y4FFQ3qzT7/0htkTumMh0qgwG/mTPbLD4hhBDifqNSvuv5TjtTVVWF0WiksrLynj6KUhSFd/9nMZfPniYqYSSTFv6/Ztvl1ZtJOnKOBkVh64DuRFvqnfP9zJkxg9onnyKodyEBfWqp9unNq1WP4ufnzwsvvIDm6klYO4ZLXwRSV+yG3zMz6fTrX9+zmIQQQghXud3vb5md7R44f+Qgl8+eRqtzY9TTz96y3SsXi2hQFEb5eTHG38e5aGXfvn1x++cHaJRq/Ho7xtJ8bI4DVI55aTQa2P8qdSU66ordQKcj4Nlb/x0hhBCiI5CkppVZGxrY9/ZbAMRP/hE+gcHNtjtZXcf2q+UALOnRmfz8fDIzM1GpVIzqP4Dyf/yDgL7VqNV2Kn37ccYcgp+fn2MsTfE5OPsvSs84Zg82Tk5GFxraNgEKIYQQ9ylJalpZxkfvU3XtKl7+AQyd/ONbtrsx0d6PQvyI9TLw+eefAzBw4EB47z00mlr8ejqqNLvqB8E3x9IcWImpQkvNFT2oVATMmXuvwxJCCCHue5LUtKKaslKO7HwXgFFPzkb3reUhbthbVkVqeQ1uKhX/L7ITOTk55ObmotFoSIyOpuLddwnqV4NKpVDhP5BMcxB+fn70798fynLg1HuUnnWsxO09fjzu3SPbLEYhhBDifiVJTSva/85GLGYTob2i6JM0ptk29m8sWvlsl0DC9W7OKk18fDyWzW+jda/H2N1Rpfmw1rFO1M0qzSoaqlVU5TnmvAmcP+8eRyWEEEI8GCSpaSWF2ZmcSXUM9B07e/4tl43YfrWcr2tM+GjV/DIihLNnz3LlyhV0Oh3Du3enYscOAvtVo1IplAUOIdvsf7NKU1kAJ/5B6TkvUMAzKQn9d6x9JYQQQnQkktS0AkVR2LNxHQB9R44ltGdUs+1MNju/yykE4OfhIfhq1M43nhISEqjf8BZuniaMEderNDX9gW9UadL+gqXWRuVFx6OnwAXz72lcQgghxINEkppWcO7gPgqzzqFz15P05KxbtnuroIQCs4XO7jrmdg3i5MmTlJSUYDAYiOvcmcp//YugmGpUKigNHkGOyedmlabmGmT8nbJMLxSbgmHQIAzx8W0YpRBCCHF/k6TmLllMJlL/8XcAhk59HG//wGbbVVis/OmSYzmE/xvZCZ1iZ8+ePQAkJSVRs3Yt7t5mfMJNKKj4oDoGgJEjRzqqNIdfx1ZnouKC4zXugPnzbvmISwghhOiIJKm5S3a7nV5DEvDtFErcpKm3bPfnS8VUWG308dTzk07+ZGRkUFlZiZeXF/19fana9TFBsdUAlHYayaV6T3x9fRkwYADUl0P6esrPe2K3KLhHReE1ZkzbBCiEEEI8IFq89pNozN3Dg4eeXYC14Vm0bm7NtrlsauDNgmsA/KZ7KDaLhdTUVABGjx5NxZo16P0a8O5qQlGp+WelY/CvcyxN+jrsdTWUne8MKATMkyqNEEII8W1SqWklt0poAH5/sRCzXWGErxc/CPDhyJEj1NbW4ufnR7S7npp/f05Q/xoArnUaS3694WaVxlwDh1dTccEDm0lBFxaGzw/Ht1VYQgghxANDkpp77ExNPe8VOZZDWNqjMyaTiYMHDwIwduxYyv76FwwBDXiFmlBUGv5Z4XhzylmlyXgLpaac0ixfAALmzEGllQKbEEII8W2S1Nxjyy5cQQEmB/syyMeDgwcPYjKZCA4OpofFSm3qfoL6O8bSFHcZR0G9+80qjcUEaX+h8pIBa62CNigI4
7SpLo1HCCGEuF9JUnMPHSivZk9ZNVoVLI4Mpbq6msOHDwPw0EMPUfKXv+ARbMYzxIyicWNnaU/gG1Wa45tRqq5SmukHgP/s2ajd3V0WjxBCCHE/k6TmHrErCi9fXw5hVudAIj3cSU1NxWq10rVrV7qWV1B3+JBzLM3VzuMpqtferNLYLHDwz1QX6GmoBLXRiO/06a4MSQghhLivSVJzj3xQXMHJ6nq8NGpejOhEeXk5GRkZwM0qjWcnMx6BZhStnh0l3YFvzEtz8l2UijxKzl2v0jz1FBovT5fFI4QQQtzvJKm5B8x2OyuuL4fwfHgwgW5a9u7di91up3v37gQXXKH+WIazSlPY5VGK69U3qzR2GxxYSW2RO+ZSFSqDAb+ZT7syJCGEEOK+J0nNPbCpoJQ8UwMhblrmhwVRXFzMV199BcDDDz/MtT/9Ca/OZgz+DSg6T96/FgE4qjRarRbO/BNKsyk95wuA308eR+vn56JohBBCiAeDJDWtrMpq47VLRQD838hQPDUa56KV0dHR+GRmYjp9iqABjipNQddJFNcpGI1GR5VGUWD/q9SV6Ki7qgGdDv9nn3VZPEIIIcSDQiY8aWWv5xVTZrHRy8OdGZ38uXz5MufOnUOlUjF2zBiuzV+Ad1cTemMDirs37xeHATZGjRrlqNJkfgxXT1N6LggA45TJ6Dp1cm1QQgghxANAKjWtqNDcwNr8YgB+070zWrXKWaXp378/+uPHMWedI2hALQAFXadQUmtrXKVJ/SOmCi01l3WgUhEwZ47L4hFCCCEeJFKpaUV/uFhEvV1hqNGT8YE+5OTkkJOTg1qtZsyoUVx7ZhY+4fW4ezeg6H3ZURQKWG+OpcnZCwVHKT0XAID3D8fjHhnp0piEEEKIB4VUalpJZq2JrYVlAPx/PToD8PnnnwMQHx+P+uBBGnKyCervqNLkh02jrNaK0Whk4MCBjpOk/pGGag1VlxwT7AXOm9e2QQghhBAPMKnUtJLlF65gByYGGYk3enLu3DkKCgrQ6XSMTEjg6vQZGCPqcfO0oHgEsvNKMNBws0qTnw65+x2zByvgOXIk+r59XR2WEEII8cCQSk0rOFRRw2elVWhUsLh7KHa73VmlGT58OLbPP8dy+ZKzSnMp7EeU1zY0qdJY6tVUXvQAIHDBfFeEIoQQQjywpFJzlxRFYdn15RCeCg2gp4eer776imvXrqHX6xkeH8+VadPw7V6HzmBB8Qrh/cv+gPlmlabwKzj/KWWZRhSbgmHwYDzi410bmBBCCPGAkUrNXSqxWDHb7Xho1PxHRCesVit79uwBIDExEfNHH2ErukJgTB0Al8Ifp6LWjI+Pz80qzf5XsTWoqMjxBiBgvoylEUIIIVpKKjV3KchNx+74KM7Vmgh21/Hll19SUVGBp6cnQwYMIP/Xv8G3Zy06vQXFpws7L3kDppvz0lzLhDMfUJblib3BjntUFF6jR7s6LCGEEOKBI5WaVqBWqejrZaChoYF9+/YBMHr0aGp37MBWdtVZpbkY/jiVtabGVZoDr2G3QvkFxzIIAfPnoVKpXBGGEEII8UCTpKYVpaenU1NTg6+vL4P69KF07Tr8etWidbOi+Ebwz1zHKtvOsTTluXDyXSoueGCrt6ELD8dn/HjXBiGEEEI8oCSpaSX19fUcOHAAgDFjxlC5dRtKVQmB/eoByAl/nMqaOnx8fBg0aJDjoIN/QrHaKM12TLYXMHcOKq08ERRCCCHuxB0lNatXryYyMhK9Xk9cXBz79+//zvavv/460dHRGAwGoqKi2LRpU5M2q1atIioqCoPBQFhYGC+++CImk8m5PyUlhSFDhuDt7U1wcDBTp04lMzPzTrp/T6SlpWEymQgKCqJfZCSlb76Jf1QtGp0VJaAn/8xxTKjnrNJUXYHjb1OZ64G12oo2OBjj1KmuDUIIIYR4gLU4qdm2bRsLFy7kN7/5DcePH2fkyJFMmDCBvLy8ZtuvWbOGxYsX89///d98/fXX/M///A/PP/88//rXv5xttmzZwqJFi/iv//ovzp49y5tvvsm2bdtYvHixs82+fft4/vnnOXz4MLt378ZqtTJu3Dhqa2vvIOzWVVNTw+HDhwF46KGHqNj8NtSV4x/tGEuTHfYTqmpqG1dp0v6KYmmgNDsQAP/Zs1G7ubmk/0IIIUR7oFIURWnJAcOGDWPw4MGsWbPGuS06OpqpU6eSkpLSpP2IESNITEzkD3/4g3PbwoULOXr0qPNxzQsvvMDZs2edE9YB/OpXvyI9Pf2WVaBr164RHBzMvn37GDVq1G31vaqqCqPRSGVlJT4+Prd1zO34+OOPOXLkCF26dOHZH/+YC4+MIyDyCoH9alCC+7Ky7jGqa2qYOHEiQ4YMgdpSWBVD1QU7BWn+qI1Gen3xOWpPz1brkxBCCNFe3O73d4sqNQ0NDWRkZDBu3LhG28eNG0daWlqzx5jNZvR6faNtBoOB9PR0LBYLAElJSWRkZJCeng5ATk4Ou3btYuLEibfsS2VlJQD+/v63bGM2m6mqqmr0aW01NTUcPXoUgIcffpiyt/6OqqES/z6OsTTZXR+nuqamcZXm8GqUhjpKzgc7Ynj6aUlohBBCiLvUoqSmpKQEm81GSEhIo+0hISEUFRU1e8z48eNZv349GRkZKIrC0aNH2bBhAxaLhZKSEgBmzJjBsmXLSEpKQqfT0aNHD8aOHcuiRYuaPaeiKLz00kskJSURExNzy/6mpKRgNBqdn7CwsJaEe1u8vLyYOXMmCQkJhBuNlL39NgHRNag1NpRO/fkgyw44EjetVgv1FZC+ltoid8zXrKg8PPB7+qlW75cQQgjR0dzRQOFvz6OiKMot51ZZunQpEyZMYPjw4eh0OqZMmcLs2bMB0Gg0AOzdu5fly5ezevVqjh07xo4dO/jwww9ZtmxZs+d84YUXOHnyJO+888539nPx4sVUVlY6P/n5+S2M9PZEREQwfvx4StetR22vxi/KUaXJ6voTqmtq8Pb2ZvDgwY7GX64DcxWl16s0fj/5CVo/v3vSLyGEEKIjaVFSExgYiEajaVKVKS4ublK9ucFgMLBhwwbq6urIzc0lLy+PiIgIvL29CQx0DJJdunQpM2fOZO7cucTGxjJt2jRWrFhBSkoKdru90fl+/vOf88EHH7Bnzx66du36nf11d3fHx8en0edesVwtpvyddwiMrkGttqN0iefDc2bgG288NdTCodXUXXOj7ooNdDr8n519z/okhBBCdCQtSmrc3NyIi4tj9+7djbbv3r2bESNGfOexOp2Orl27otFo2Lp1K5MmTUKtdvz5uro65883aDQaFEXhxjhmRVF44YUX2LFjB1988QWRkZEt6fo9V/rGG2jUdfj1clRpMrs83rRKk/F3qC+jNDsIAN+pU9DdIhkUQgghRMu0eKa3l156iZkzZxIfH09CQgJr164lLy+P5557DnA88ikoKHDORZOVlUV6ejrDhg2jvLyclStXcvr0aTZu3Og8Z3JyMitXrmTQoEEMGzaM7Oxsli5dyuTJk52PqJ5//nn+8Y9/8M9//hNvb29n
tchoNGIwGO76QtwNS0EB5e+9R6f+1ajUdpTwRHaddbxq7qzSWM2Q9hdM5VpqLimgVhMwZ45L+y2EEEK0Jy1OaqZPn05paSkvv/wyhYWFxMTEsGvXLrp16wZAYWFhozlrbDYbr776KpmZmeh0OsaOHUtaWhoRERHONkuWLEGlUrFkyRIKCgoICgoiOTmZ5cuXO9vceIV8zJgxjfrz1ltvOcfouErJ3/6Gzq0e356OKs25Lj+mKi8Pb2/vm288ndgC1YWUZncGwOeH43H7xjUQQgghxN1p8Tw1D7J7MU+NtbSU82PGEjqoGN/u9di7j2VVySiqqqp49NFHGTp0KNgs8JfBNOQXcGFXJ1AUInfuQB8d3Sp9EEIIIdqzezJPjWhKGxBA9zf/iLG7Y1DwudDHqKqqalylOfW/UJHnmD1YUfAcPUoSGiGEEKKVSVLTCtzz30WFHXuv8Xxy6hqAc84d7HY4sBJLvZrKCzoAAufPd2V3hRBCiHZJkpq7VXMNzjrWsTobMtVZpXG+8XT2AyjJoiw7AMVqxxAXh0dcnAs7LIQQQrRPLR4oLL7FKwhe+BJb5qd8etDxRpazSqMosP9VbGYV5dkGwErgAqnSCCGEEPeCVGpag284x9QDqaqqwsvL62aV5vxuKDpJ2QU/FLMV9+hoPEeOdG1fhRBCiHZKkppWYLVanauJjxw58maVJvUP2C0qyrMdI7UD58295XISQgghhLg7ktS0ghMnTjSt0uTuh8vplF/0wVbXgK5bON7jx7u2o0IIIUQ7JknNXbJaraSmpgLfGEsDkPpH7DYoy/YHIGDuXFTXZ0cWQgghROuTpOYuWSwWevfujdFoJO7GW02Xj8LFfVRe8sJaZUYbHIxxyhTXdlQIIYRo5+Ttp7tkMBiYNGkSVqvVscYTQOofUexQeiEEqMf/p8+idnNzaT+FEEKI9k4qNa3EmdAUnYasj6m+7IGltB6N0Yjf44+7tnNCCCFEByBJTWvb/yqKAiU5joUr/WbORO3p6eJOCSGEEO2fJDWtqeQ8fL2T2kJ3zEV1qDw88H/6KVf3SgghhOgQJKlpTQdeAxRKLoYD4Dd9OhpfX5d2SQghhOgoJKlpLRV5cHIbddfcqM+vRaXT4T97tqt7JYQQQnQYktS0loN/AruVktxuABinTkUXEuziTgkhhBAdhyQ1raG6CI5txlSupfZCLajVBMyd4+peCSGEEB2KJDWt4dBfwWam9JKjSuPzwx/i1q2bizslhBBCdCyS1NytujL4cgMN1RqqskwABMyf5+JOCSGEEB2PJDV3S+8L09ZQenUA2BW8Ro9G36ePq3slhBBCdDiS1NwttRpLwHAqvioFIGDBfBd3SAghhOiYJKlpBWVv/R0sFgzxcXgMHuzq7gghhBAdkiQ1d8lWUUH5u+8CELhggYt7I4QQQnRcskr3XVJ7e9N5xQpq9uzBMynJ1d0RQgghOixJau6SSqPB54fj8fnheFd3RQghhOjQ5PGTEEIIIdoFSWqEEEII0S5IUiOEEEKIdkGSGiGEEEK0C5LUCCGEEKJdkKRGCCGEEO3CHSU1q1evJjIyEr1eT1xcHPv37//O9q+//jrR0dEYDAaioqLYtGlTkzarVq0iKioKg8FAWFgYL774IiaTybk/NTWV5ORkOnfujEql4v3337+TrgshhBCinWrxPDXbtm1j4cKFrF69msTERN544w0mTJjAmTNnCA8Pb9J+zZo1LF68mHXr1jFkyBDS09OZN28efn5+JCcnA7BlyxYWLVrEhg0bGDFiBFlZWcyePRuA1157DYDa2loGDBjAs88+y2OPPXYXIQshhBCiPVIpiqK05IBhw4YxePBg1qxZ49wWHR3N1KlTSUlJadJ+xIgRJCYm8oc//MG5beHChRw9epQDBw4A8MILL3D27Fk+//xzZ5tf/epXpKenN1sFUqlU7Ny5k6lTp7ak61RVVWE0GqmsrMTHx6dFxwohhBDCNW73+7tFj58aGhrIyMhg3LhxjbaPGzeOtLS0Zo8xm83o9fpG2wwGA+np6VgsFgCSkpLIyMggPT0dgJycHHbt2sXEiRNb0r1m/3ZVVVWjjxBCCCHapxYlNSUlJdhsNkJCQhptDwkJoaioqNljxo8fz/r168nIyEBRFI4ePcqGDRuwWCyUlJQAMGPGDJYtW0ZSUhI6nY4ePXowduxYFi1adIdhOaSkpGA0Gp2fsLCwuzqfEEIIIe5fdzRQWKVSNfpdUZQm225YunQpEyZMYPjw4eh0OqZMmeIcL6PRaADYu3cvy5cvZ/Xq1Rw7dowdO3bw4YcfsmzZsjvpntPixYuprKx0fvLz8+/qfEIIIYS4f7UoqQkMDESj0TSpyhQXFzep3txgMBjYsGEDdXV15ObmkpeXR0REBN7e3gQGBgKOxGfmzJnMnTuX2NhYpk2bxooVK0hJScFut99haODu7o6Pj0+jjxBCCCHapxa9/eTm5kZcXBy7d+9m2rRpzu27d+9mypQp33msTqeja9euAGzdupVJkyahVjtyqrq6OufPN2g0GhRFoYXjmL/TjXPJ2BohhBDiwXHje/v7coIWv9L90ksvMXPmTOLj40lISGDt2rXk5eXx3HPPAY5HPgUFBc65aLKyskhPT2fYsGGUl5ezcuVKTp8+zcaNG53nTE5OZuXKlQwaNIhhw4aRnZ3N0qVLmTx5svMRVU1NDdnZ2c5jLl68yIkTJ/D392/2VfLmVFdXA8jYGiGEEOIBVF1djdFovOX+Fic106dPp7S0lJdffpnCwkJiYmLYtWsX3bp1A6CwsJC8vDxne5vNxquvvkpmZiY6nY6xY8eSlpZGRESEs82SJUtQqVQsWbKEgoICgoKCSE5OZvny5c42R48eZezYsc7fX3rpJQBmzZrF3//+99vqe+fOncnPz8fb2/uWY4DuRFVVFWFhYeTn58sjru8h1+r2ybVqGblet0+u1e2Ta3X77uW1UhSF6upqOnfu/J3tWjxPjWhK5r+5fXKtbp9cq5aR63X75FrdPrlWt+9+uFay9pMQQggh2gVJaoQQQgjRLkhS0wrc3d35r//6L9zd3V3dlfueXKvbJ9eqZeR63T65VrdPrtXtux+ulYypEUIIIUS7IJUaIYQQQrQLktQIIYQQol2QpEYIIYQQ7YIkNUIIIYRoFySpuU2rV68mMjISvV5PXFwc+/fv/872+/btIy4uDr1eT/fu3fnb3/7WRj11vZZcq71796JSqZp8zp0714Y9do3U1FSSk5Pp3LkzKpWK999//3uP6aj3VUuvVUe+r1JSUhgyZAje3t4EBwczdepUMjMzv/e4jnhv3cm16qj31po1a+jfv79zceiEhAQ+/vjj7zzGFfeUJDW3Ydu2bSxcuJDf/OY3HD9+nJEjRzJhwoRGy0F808WLF3n00UcZOXIkx48f59e//jW/+MUv2L59exv3vO219FrdkJmZSWFhofPTq1evNuqx69TW1jJgwAD++te/3lb7jnxftfRa3dAR76t9+/bx/PPPc/jwYXbv3o3VamXcuHHU1tbe8piOem/dybW6oaP
dW127duV3v/sdR48e5ejRozz00ENMmTKFr7/+utn2LrunFPG9hg4dqjz33HONtvXp00dZtGhRs+3/8z//U+nTp0+jbQsWLFCGDx9+z/p4v2jptdqzZ48CKOXl5W3Qu/sXoOzcufM723Tk++qbbudayX11U3FxsQIo+/btu2Ububccbudayb11k5+fn7J+/fpm97nqnpJKzfdoaGggIyODcePGNdo+btw40tLSmj3m0KFDTdqPHz+eo0ePYrFY7llfXe1OrtUNgwYNIjQ0lIcffpg9e/bcy24+sDrqfXU35L6CyspKAPz9/W/ZRu4th9u5Vjd05HvLZrOxdetWamtrSUhIaLaNq+4pSWq+R0lJCTabjZCQkEbbQ0JCKCoqavaYoqKiZttbrVZKSkruWV9d7U6uVWhoKGvXrmX79u3s2LGDqKgoHn74YVJTU9uiyw+Ujnpf3Qm5rxwUReGll14iKSmJmJiYW7aTe+v2r1VHvrdOnTqFl5cX7u7uPPfcc+zcuZO+ffs229ZV95T2np25nVGpVI1+VxSlybbva9/c9vaoJdcqKiqKqKgo5+8JCQnk5+fzxz/+kVGjRt3Tfj6IOvJ91RJyXzm88MILnDx5kgMHDnxv245+b93uterI91ZUVBQnTpygoqKC7du3M2vWLPbt23fLxMYV95RUar5HYGAgGo2mSaWhuLi4SRZ6Q6dOnZptr9VqCQgIuGd9dbU7uVbNGT58OOfPn2/t7j3wOup91Vo62n3185//nA8++IA9e/bQtWvX72zb0e+tllyr5nSUe8vNzY2ePXsSHx9PSkoKAwYM4E9/+lOzbV11T0lS8z3c3NyIi4tj9+7djbbv3r2bESNGNHtMQkJCk/afffYZ8fHx6HS6e9ZXV7uTa9Wc48ePExoa2trde+B11PuqtXSU+0pRFF544QV27NjBF198QWRk5Pce01HvrTu5Vs3pKPfWtymKgtlsbnafy+6pezoMuZ3YunWrotPplDfffFM5c+aMsnDhQsXT01PJzc1VFEVRFi1apMycOdPZPicnR/Hw8FBefPFF5cyZM8qbb76p6HQ65X//939dFUKbaem1eu2115SdO3cqWVlZyunTp5VFixYpgLJ9+3ZXhdBmqqurlePHjyvHjx9XAGXlypXK8ePHlUuXLimKIvfVN7X0WnXk++pnP/uZYjQalb179yqFhYXOT11dnbON3FsOd3KtOuq9tXjxYiU1NVW5ePGicvLkSeXXv/61olarlc8++0xRlPvnnpKk5ja9/vrrSrdu3RQ3Nzdl8ODBjV75mzVrljJ69OhG7ffu3asMGjRIcXNzUyIiIpQ1a9a0cY9dpyXX6pVXXlF69Oih6PV6xc/PT0lKSlI++ugjF/S67d14NfTbn1mzZimKIvfVN7X0WnXk+6q56wQob731lrON3FsOd3KtOuq99dOf/tT5//WgoCDl4YcfdiY0inL/3FMqRbk+ckcIIYQQ4gEmY2qEEEII0S5IUiOEEEKIdkGSGiGEEEK0C5LUCCGEEKJdkKRGCCGEEO2CJDVCCCGEaBckqRFCCCFEuyBJjRBCCCHaBUlqhBBCCNEuSFIjhBBCiHZBkhohhBBCtAuS1AghhBCiXfj/AcH61vC/7JFTAAAAAElFTkSuQmCC\n", 138 | "text/plain": [ 139 | "
" 140 | ] 141 | }, 142 | "metadata": {}, 143 | "output_type": "display_data" 144 | } 145 | ], 146 | "source": [ 147 | "draw_db = torch.stack(weights_log, dim=0)[...,0]\n", 148 | "plt.plot(draw_db)" 149 | ] 150 | }, 151 | { 152 | "cell_type": "code", 153 | "execution_count": null, 154 | "metadata": {}, 155 | "outputs": [], 156 | "source": [ 157 | "class Cube(nn.Module):\n", 158 | " def __init__(self, cfg):\n", 159 | " super().__init__()\n", 160 | " self.cube_min = torch.tensor(cfg[\"cube_min\"])\n", 161 | " self.cube_max = torch.tensor(cfg[\"cube_max\"])\n", 162 | " self.nB = cfg['nB']\n", 163 | " self.cube_shape = cfg['cube_shape']\n", 164 | " cube = []\n", 165 | " for i in range(self.cube_shape[0]):\n", 166 | " for j in range(self.cube_shape[1]):\n", 167 | " for k in range(self.cube_shape[2]):\n", 168 | " cube.append([i,j,k])\n", 169 | " # (B,Np,3)\n", 170 | " self.cube = torch.tensor(cube.repeat(self.nB,1,1),requires_grad=False)\n", 171 | " self.cube = self.cube * (self.cube_max - self.cube_min)[None,None] + self.cube_min[None]\n", 172 | "\n", 173 | " def __call__(self, x, R, t, weights = None):\n", 174 | " \"\"\"\n", 175 | " Args:\n", 176 | " x : (B,V,J,2)\n", 177 | " R : ((M),B,V,*)\n", 178 | " t : ((M),B,V,*)\n", 179 | " weights: (M,B)\n", 180 | " \n", 181 | " \"\"\"\n", 182 | " if weights is not None:\n", 183 | " # (M,B,V,1,3,3)@(1,B,1,Np,3,1) + (M,B,V,1,3,1) -> (M,B,V,Np,3,1) -> (M,B,V,Np,2)\n", 184 | " reproj = homo_to_eulid( (R.matrix[:,:,:,None] @ self.cube[None,:,None,:,:,None] + t.trans[:,:,:,None]).squeeze(-1))\n", 185 | " # ((M,B,V,1,Np,2) - (1,B,V,J,1,2)) * (M,N,1,1,1,1) -> (M,B,V,J,Np,2) -> (B,V,J,Np,2)\n", 186 | " # (B,V,J,Np,2)**2 -> (B,V,J,Np,2) ** 2 -> (B,J,Np)\n", 187 | " mpjpe = (\n", 188 | " (reproj[:,:,:,None] - x[None,...,None,:])**2 * weights[...,None,None,None,None]\n", 189 | " ).sum(0) / weights.sum(0)[...,None,None,None,None]\n", 190 | " \n", 191 | " \n", 192 | " else:\n", 193 | " # (B,V,1,3,3)@(B,1,Np,3,1) + (B,V,1,3,1) -> (B,V,Np,3,1) -> (B,V,Np,2)\n", 194 | " reproj = homo_to_eulid( (R.matrix[:,:,None] @ self.cube[:,None,:,:,None] + t.trans[:,:,None]).squeeze(-1))\n", 195 | " # ((B,V,1,Np,2) - (B,V,J,1,2))**2 -> (B,V,J,Np,2) ** 2 -> (B,J,Np)\n", 196 | " mpjpe = (reproj[:,:,None] - x[...,None,:])**2\n", 197 | " \n", 198 | " mpjpe = mpjpe.sum(-4).sum(-1)\n", 199 | " mpjpe = mpjpe.view(self.nB,-1,*self.cube_shape)\n", 200 | " return torch.exp(- mpjpe)" 201 | ] 202 | }, 203 | { 204 | "cell_type": "code", 205 | "execution_count": null, 206 | "metadata": {}, 207 | "outputs": [], 208 | "source": [] 209 | } 210 | ], 211 | "metadata": { 212 | "kernelspec": { 213 | "display_name": "Python 3 (ipykernel)", 214 | "language": "python", 215 | "name": "python3" 216 | }, 217 | "language_info": { 218 | "codemirror_mode": { 219 | "name": "ipython", 220 | "version": 3 221 | }, 222 | "file_extension": ".py", 223 | "mimetype": "text/x-python", 224 | "name": "python", 225 | "nbconvert_exporter": "python", 226 | "pygments_lexer": "ipython3", 227 | "version": "3.8.17" 228 | } 229 | }, 230 | "nbformat": 4, 231 | "nbformat_minor": 2 232 | } 233 | -------------------------------------------------------------------------------- /v1/calibration.py: -------------------------------------------------------------------------------- 1 | import torch 2 | from triangulation import ProbabilisticTriangulation 3 | import cv2 4 | import numpy as np 5 | from utils import * 6 | 7 | class CalibrationBatch(): 8 | def __init__(self,cfg, points2d, confi2d): 9 | """ 10 | points2d : (B,V,J,2) 11 | confi2d 
/v1/calibration.py: -------------------------------------------------------------------------------- 1 | import torch 2 | from triangulation import ProbabilisticTriangulation 3 | import cv2 4 | import numpy as np 5 | from utils import * 6 | 7 | class CalibrationBatch(): 8 | def __init__(self,cfg, points2d, confi2d): 9 | """ 10 | points2d : (B,V,J,2) 11 | confi2d : (B,V,J) 12 | points3d : (B,J,3) 13 | confi3d : (B,J) 14 | R : (B,V,3,3) 15 | t : (B,V,3,1) 16 | isdistribution : bool 17 | """ 18 | self.n_batch,self.n_view,self.n_joint = points2d.shape[:3] 19 | self.M = cfg["M"] 20 | self.points2d = points2d 21 | self.confi2d = confi2d 22 | self.points3d = torch.zeros((self.n_batch,self.n_joint,3)) 23 | self.confi3d = torch.zeros((self.n_batch,self.n_joint)) 24 | self.R = torch.zeros((self.n_batch,self.n_view,3,3)) 25 | self.t = torch.zeros((self.n_batch,self.n_view,3,1)) 26 | self.prob_tri = ProbabilisticTriangulation(cfg) 27 | self.buffer_weights = None 28 | 29 | 30 | def weighted_triangulation(self, points2d, confi2d, R ,t): 31 | """ 32 | Args: 33 | points2d : (V',J,2) 34 | confi2d : (V',J) 35 | R : (V',3,3) 36 | t : (V',3,1) 37 | Returns: 38 | points3d : (J,3) 39 | confi3d : (J) 40 | """ 41 | n_view_filter= points2d.shape[0] 42 | points3d = torch.zeros((self.n_joint, 3)) 43 | confi3d = torch.zeros((self.n_joint)) 44 | # print(points2d.shape,confi2d.shape,R.shape,t.shape) 45 | for j in range(self.n_joint): 46 | A = [] 47 | for i in range(n_view_filter): 48 | if confi2d[i,j] > 0.5: 49 | P = torch.cat([R[i],t[i]],dim=1) 50 | P3T = P[2] 51 | A.append(confi2d[i,j] * (points2d[i,j,0]*P3T - P[0])) 52 | A.append(confi2d[i,j] * (points2d[i,j,1]*P3T - P[1])) 53 | A = torch.stack(A) 54 | # print(A.shape) 55 | if A.shape[0] >= 4: 56 | u, s, vh = torch.linalg.svd(A) 57 | error = s[-1] 58 | X = vh[len(s) - 1] 59 | points3d[j,:] = X[:3] / X[3] 60 | confi3d[j] = torch.exp(-torch.abs(error)) 61 | else: 62 | points3d[j] = torch.tensor([0.0,0.0,0.0]) 63 | confi3d[j] = 0 64 | 65 | return points3d, confi3d 66 | 67 | def weighted_triangulation_sample(self, points2d, confi2d, R ,t): 68 | """ 69 | Args: 70 | points2d : (B,V',J,2) 71 | confi2d : (B,V',J) 72 | R : ((M),B, V',3,3) 73 | t : ((M),B, V',3,1) 74 | Returns: 75 | sample_points3d : ((M),B,J,3) 76 | sample_confi3d : ((M),B,J) 77 | """ 78 | if len(R.matrix.shape[:-2]) > len(points2d.shape[:-2]): 79 | nM = R.matrix.shape[0] 80 | sample_points3d = torch.zeros((nM,self.n_batch,self.n_joint,3)) 81 | sample_confi3d = torch.zeros((nM,self.n_batch,self.n_joint)) 82 | for i in range(nM): 83 | for j in range(self.n_batch): 84 | sample_points3d[i,j], sample_confi3d[i,j] = self.weighted_triangulation( 85 | points2d[j], confi2d[j], R.matrix[i,j], t.trans[i,j] 86 | ) 87 | return sample_points3d, sample_confi3d 88 | 89 | else: 90 | sample_points3d = torch.zeros((self.n_batch,self.n_joint,3)) 91 | sample_confi3d = torch.zeros((self.n_batch,self.n_joint)) 92 | 93 | for j in range(self.n_batch): 94 | sample_points3d[j], sample_confi3d[j] = self.weighted_triangulation( 95 | points2d[j], confi2d[j], R.matrix[j], t.trans[j] 96 | ) 97 | 98 | return sample_points3d, sample_confi3d 99 | 100 | def pnp(self,batch_id): 101 | self.R[batch_id,0] = torch.eye(3) 102 | self.t[batch_id,0] = torch.zeros((3,1)) 103 | for i in range(1, self.n_view): 104 | mask = torch.logical_and(self.confi2d[batch_id,i]>0.8,self.confi3d[batch_id]>0.8) 105 | p2d = self.points2d[batch_id,i,mask].numpy() 106 | p3d = self.points3d[batch_id,mask].numpy() 107 | ret, rvec, tvec = cv2.solvePnP(p3d, p2d, np.eye(3), np.zeros(5)) 108 | R, _ = cv2.Rodrigues(rvec) 109 | self.R[batch_id,i] = torch.tensor(R) 110 | self.t[batch_id,i] = torch.tensor(tvec) 111 | 112 |
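    # Editor's note (added commentary, not in the original file): for each joint,
    # weighted_triangulation above stacks the confidence-weighted DLT rows
    #     c_ij * (u_ij * P_i^(3) - P_i^(1))  and  c_ij * (v_ij * P_i^(3) - P_i^(2)),
    # where P_i = [R_i | t_i] and P_i^(k) is its k-th row, over all views with
    # c_ij > 0.5. With at least two such views (4 rows) the homogeneous 3D point is
    # the right singular vector of A for the smallest singular value, and that
    # singular value is the algebraic residual, so exp(-|s_min|) doubles as the 3D
    # confidence. monte_carlo below then alternates this triangulation with a
    # reweighting of sampled camera poses, i.e. the Monte Carlo update of the pose
    # distribution described in the paper.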
113 | def eight_point(self): 114 | for batch_id in range(self.n_batch): 115 | mask = torch.logical_and(self.confi2d[batch_id,0]>0.5, self.confi2d[batch_id,1]>0.5) 116 | 117 | p0 = self.points2d[batch_id,0,mask].numpy() 118 | p1 = self.points2d[batch_id,1,mask].numpy() 119 | # p0,p1 (N,2) 120 | # print(p0.shape,p1.shape) 121 | E, mask = cv2.findEssentialMat(p0, p1, focal=1.0, pp=(0., 0.), 122 | method=cv2.RANSAC, prob=0.999, threshold=0.0003) 123 | p0_inliers = p0[mask.ravel() == 1] 124 | p1_inliers = p1[mask.ravel() == 1] 125 | point, R, t,mask = cv2.recoverPose(E, p0_inliers, p1_inliers) 126 | self.R[batch_id,0],self.t[batch_id,0] = torch.eye(3), torch.zeros((3,1)) 127 | self.R[batch_id,1],self.t[batch_id,1] = torch.tensor(R),torch.tensor(t) 128 | 129 | # print(self.R[batch_id,1],self.t[batch_id,1]) 130 | 131 | self.points3d[batch_id], self.confi3d[batch_id] = self.weighted_triangulation( 132 | self.points2d[batch_id,:2],self.confi2d[batch_id,:2],self.R[batch_id,:2],self.t[batch_id,:2] 133 | ) 134 | 135 | self.pnp(batch_id) 136 | 137 | # print(self.R[batch_id,0],self.t[batch_id,0]) 138 | # print(self.mpjpe(2)) 139 | # print(self.confi3d[batch_id]) 140 | 141 | self.points3d[batch_id], self.confi3d[batch_id] = self.weighted_triangulation( 142 | self.points2d[batch_id],self.confi2d[batch_id],self.R[batch_id],self.t[batch_id] 143 | ) 144 | # print(self.confi3d[batch_id]) 145 | # print(self.mpjpe(self.n_view)) 146 | 147 | def monte_carlo(self): 148 | self.eight_point() 149 | self.prob_tri.update_paramater_init(self.points3d, self.points2d ,self.R,self.t) 150 | 151 | weights_log = [] 152 | for i in range(4): 153 | rot,t = self.prob_tri.sample(self.M) 154 | # print(self.points2d.shape, self.confi2d.shape, rot.quan.shape, t.vector.shape) 155 | sample_points3d, sample_confi3d = self.weighted_triangulation_sample(self.points2d, self.confi2d, rot, t) 156 | weights = cal_mpjpe_batch(sample_points3d, self.points2d, rot, t) 157 | weights_log.append(weights) 158 | self.prob_tri.update_paramater_with_weights(rot,t,weights) 159 | 160 | rot,t = self.prob_tri.getbuffer_Rt() 161 | # print(self.points2d.shape, self.confi2d.shape, rot.quan.shape, t.vector.shape) 162 | points3d, confi3d = self.weighted_triangulation_sample(self.points2d, self.confi2d, rot, t) 163 | self.buffer_weights = cal_mpjpe_batch(points3d,self.points2d,rot,t) 164 | 165 | rot,t = self.prob_tri.getbest_Rt() 166 | self.R = rot.matrix 167 | self.t = t.trans 168 | return weights_log 169 | 170 | def mpjpe(self, n_view_filter): 171 | return (homo_to_eulid((self.R[...,:n_view_filter,None,:,:] @ self.points3d[...,None,:,:,None] + self.t[...,:n_view_filter,None,:,:]).squeeze(-1)) - self.points2d[:,:n_view_filter] ).mean() 172 | 173 | 174 | 175 | 176 | -------------------------------------------------------------------------------- /v1/distribution.py: -------------------------------------------------------------------------------- 1 | from pyro.distributions import TorchDistribution,constraints 2 | from pyro.distributions import MultivariateStudentT 3 | from torch.distributions.multivariate_normal import _batch_mahalanobis, _standard_normal, _batch_mv 4 | 5 | from pyro.distributions.util import broadcast_shape 6 | import torch 7 | import math 8 | 9 | 10 | 11 | def cholesky_wrapper(mat, default_diag=None, force_cpu=True): 12 | device = mat.device 13 | if force_cpu: 14 | mat = mat.cpu() 15 | try: 16 | tril = torch.linalg.cholesky(mat, upper=False) 17 | except RuntimeError: 18 | n_dims = mat.size(-1) 19 | tril = [] 20 | default_tril_single = torch.diag(mat.new_tensor(default_diag)) if default_diag is not None \ 21 | else torch.eye(n_dims, dtype=mat.dtype, device=mat.device) 22 | for cov in mat.reshape(-1, n_dims, n_dims): 23 | try: 24 |
tril.append(torch.linalg.cholesky(cov, upper=False)) 25 | except RuntimeError: 26 | tril.append(default_tril_single) 27 | tril = torch.stack(tril, dim=0).reshape(mat.shape) 28 | return tril.to(device) 29 | 30 | 31 | class AngularCentralGaussian(TorchDistribution): 32 | arg_constraints = {'scale_tril': constraints.lower_cholesky} 33 | has_rsample = True 34 | 35 | def __init__(self, scale_tril, validate_args=None, eps=1e-6): 36 | q = scale_tril.size(-1) 37 | assert q > 1 38 | assert scale_tril.shape[-2:] == (q, q) 39 | batch_shape = scale_tril.shape[:-2] 40 | event_shape = (q,) 41 | self.scale_tril = scale_tril.expand(batch_shape + (-1, -1)) 42 | self._unbroadcasted_scale_tril = scale_tril 43 | self.q = q 44 | self.area = 2 * math.pi ** (0.5 * q) / math.gamma(0.5 * q) 45 | self.eps = eps 46 | super().__init__(batch_shape, event_shape, validate_args=validate_args) 47 | 48 | def log_prob(self, value): 49 | if self._validate_args: 50 | self._validate_sample(value) 51 | value = value.expand( 52 | broadcast_shape(value.shape[:-1], self._unbroadcasted_scale_tril.shape[:-2]) 53 | + self.event_shape) 54 | M = _batch_mahalanobis(self._unbroadcasted_scale_tril, value) 55 | half_log_det = self._unbroadcasted_scale_tril.diagonal(dim1=-2, dim2=-1).log().sum(-1) 56 | return M.log() * (-self.q / 2) - half_log_det - math.log(self.area) 57 | 58 | def rsample(self, sample_shape=torch.Size()): 59 | shape = self._extended_shape(sample_shape) 60 | normal = _standard_normal(shape, 61 | dtype=self._unbroadcasted_scale_tril.dtype, 62 | device=self._unbroadcasted_scale_tril.device) 63 | gaussian_samples = _batch_mv(self._unbroadcasted_scale_tril, normal) 64 | gaussian_samples_norm = gaussian_samples.norm(dim=-1) 65 | samples = gaussian_samples / gaussian_samples_norm.unsqueeze(-1) 66 | samples[gaussian_samples_norm < self.eps] = samples.new_tensor( 67 | [1.] + [0. 
for _ in range(self.q - 1)]) 68 | return samples 69 | 70 | 71 | class AngularCentralGaussianMultiView(): 72 | def __init__(self, scale_tril, df): 73 | self.scale_tril = scale_tril 74 | self.nV = scale_tril.shape[-3] 75 | self.distri = [AngularCentralGaussian(scale_tril[...,i,:,:]) for i in range(self.nV)] 76 | 77 | def __call__(self,sample_shape=torch.Size()): 78 | return torch.stack([self.distri[i](sample_shape) for i in range(self.nV)],dim=-2) 79 | 80 | class MultivariateStudentTMultiView(): 81 | def __init__(self, loc, scale_tril, df): 82 | self.loc = loc 83 | self.scale_tril = scale_tril 84 | self.df = df 85 | self.nV = scale_tril.shape[-3] 86 | self.distri = [ MultivariateStudentT(loc=self.loc[...,i,:], scale_tril = scale_tril[...,i,:,:],df=self.df) for i in range(self.nV)] 87 | 88 | def __call__(self,sample_shape=torch.Size()): 89 | return torch.stack([self.distri[i](sample_shape) for i in range(self.nV)],dim=-2) -------------------------------------------------------------------------------- /v1/triangulation.py: -------------------------------------------------------------------------------- 1 | import torch 2 | from distribution import AngularCentralGaussian, cholesky_wrapper 3 | from pyro.distributions import MultivariateStudentT 4 | from utils import * 5 | import torch.nn.functional as F 6 | 7 | class ProbabilisticTriangulation(): 8 | def __init__(self, cfg): 9 | """ 10 | Members: 11 | expect_quan: Rotation (B,V,*) 12 | tril_R: (B,V-1,4,4) 13 | mu_t: Translation (B,V,*) 14 | tril_t: (B,V-1,3,3) 15 | """ 16 | self.nB = cfg["nB"] 17 | self.nV = cfg["nV"] 18 | self.M = cfg["M"] 19 | self.isDistr = cfg["isDistr"] 20 | 21 | if self.isDistr: 22 | self.expect_quan = Rotation(torch.tensor([1.,0.,0.,0.]).repeat(self.nB,self.nV,1)) 23 | self.mu_t = Translation(torch.zeros(self.nB,self.nV,3)) 24 | 25 | self.tril_R = torch.eye(4,4).repeat(self.nB, self.nV-1, 1, 1) 26 | self.tril_t = torch.eye(3,3).repeat(self.nB, self.nV-1, 1, 1) 27 | # conv_quan (B,V,4,4) 28 | self.distrR = AngularCentralGaussian(self.tril_R) 29 | # mu_t (B,V,3) conv_t (B,V,3,3) 30 | self.distrT = MultivariateStudentT(loc=self.mu_t.distr_norm(),scale_tril=self.tril_t,df=3) 31 | 32 | self.bufferR, self.bufferT = None,None 33 | 34 | else: 35 | self.bufferR = Rotation(torch.randn(self.M//8,self.nB, self.nV-1, 4)) 36 | self.bufferT = Translation(torch.randn(self.M//8,self.nB, self.nV-1, 3)) 37 | 38 | self.lr = 1e-2 39 | 40 | def sample(self, nM): 41 | 42 | if self.isDistr: 43 | if self.bufferR is not None: 44 | nM -= self.bufferR.quan.shape[-4] 45 | rot = Rotation(self.distrR((nM,))) 46 | t = Translation(self.distrT((nM,))) 47 | if self.bufferR is not None: 48 | # print(rot.quan.shape, self.bufferR.quan.shape) 49 | rot.cat(self.bufferR) 50 | t.cat(self.bufferT) 51 | return rot,t 52 | 53 | else: 54 | buffer_nM = self.bufferR.quan.shape[-4] 55 | 56 | temp_buffer_quan = self.bufferR.quan[...,1:,:].repeat(nM//buffer_nM, 1, 1, 1) 57 | rot = Rotation( 58 | temp_buffer_quan + torch.randn_like(temp_buffer_quan) * self.lr 59 | ) 60 | 61 | temp_buffer_vector = self.bufferT.vector[...,1:,:].repeat(nM//buffer_nM, 1, 1, 1) 62 | t = Translation( 63 | temp_buffer_vector + torch.randn_like(temp_buffer_vector) * self.lr 64 | ) 65 | self.lr *= 0.1 66 | rot.random(lr = self.lr) 67 | t.random(lr = self.lr*10) 68 | return rot,t 69 | 70 | 71 | def update_paramater_init(self,points3d,points2d, rot,t): 72 | """ 73 | Args: 74 | rot Tensor -> Rotation: (B,V,3,3) 75 | t Tensor -> Translation: (B,V,3,1) 76 | Returns: 77 | sample_quan : (M,B,V,4) 78 | 
sample_t : (M,B,V,3) 79 | weights: (M,B) 80 | """ 81 | 82 | self.lr = 1e-3 83 | self.bufferR = Rotation(rot.repeat(self.M//8,1,1,1,1)) 84 | self.bufferT = Translation(t.repeat(self.M//8,1,1,1,1)) 85 | self.bufferR.random(self.lr) 86 | self.bufferT.random(self.lr*10) 87 | rot, t = self.sample(self.M) 88 | 89 | # weights = torch.cat([ 90 | # torch.ones(self.M-1,self.nB) * (0.5/(self.M-1)), 91 | # torch.ones(1,self.nB)*0.5, 92 | # ], dim = 0) 93 | weights = cal_mpjpe_batch(points3d,points2d, rot,t) 94 | self.update_paramater_with_weights(rot, t, weights) 95 | 96 | def update_paramater_with_weights(self,rot,t, weights): 97 | """ 98 | Args: 99 | rot : (M,B,V,*) 100 | t : (M,B,V,*) 101 | weights : (M,B) 102 | Returns: 103 | conv_quan : (B,V,4,4) 104 | mu_t : (B,V,3) 105 | conv_t : (B,V,3,3) 106 | """ 107 | 108 | 109 | topk_weight, indices = torch.topk(weights, self.M//8, dim=0) 110 | # indices (M/2, B) -> (M/2,B,V,*) 111 | indices = indices[...,None,None] 112 | half_quan = rot.quan.gather(0, indices.expand(-1,-1,self.nV,4)) 113 | half_vector = t.vector.gather(0, indices.expand(-1,-1,self.nV,3)) 114 | 115 | rot = Rotation(half_quan) 116 | t = Translation(half_vector) 117 | weights = topk_weight 118 | 119 | # (M,B,V,4) * (M,B,1,1) -> (B,V,4) 120 | self.expect_quan = Rotation( 121 | (rot.quan * weights[...,None,None]).sum(0) / weights.sum(0)[...,None,None] 122 | ) 123 | # (M,B,V,3) * (M,B) -> (B,V,3) 124 | self.mu_t = Translation( 125 | (t.vector * weights[...,None,None]).sum(0) / weights.sum(0)[...,None,None] 126 | ) 127 | 128 | if self.isDistr: 129 | # (B,V-1,4,M) @ (B,V-1,M,4) -> (B,V-1,4,4) 130 | conv_quan = ( 131 | rot.distr_norm().permute(1,2,3,0) @ (rot.distr_norm() * weights[...,None,None]).permute(1,2,0,3) 132 | ) / weights.sum(0)[...,None,None,None] 133 | 134 | # u,s,vt = torch.linalg.svd(conv_quan) 135 | # s *= torch.tensor([1,0.1,0.01,0.001])[None,None] 136 | # conv_quan = u @ torch.diag_embed(s) @ vt 137 | 138 | self.tril_quan = cholesky_wrapper(conv_quan) 139 | 140 | # (M,B,V,3) - (1,B,V,3) -> (M,B,V,3) -> (M,B ,V,3) 141 | centered_t = Translation(t.vector - self.mu_t.vector[None]).distr_norm() 142 | # (B,V-1,3,M) @ (B,V-1,M,3) -> (B,V-1,3,3) 143 | conv_t = ( 144 | centered_t.permute(1,2,3,0) @ (centered_t * weights[...,None,None]).permute(1,2,0,3) 145 | ) / weights.sum(0)[...,None,None,None] 146 | 147 | # u,s,vt = torch.linalg.svd(conv_t) 148 | # s *= torch.tensor([1,0.1,0.01])[None,None] 149 | # conv_t = u @ torch.diag_embed(s) @ vt 150 | 151 | self.tril_t = cholesky_wrapper(conv_t) 152 | 153 | self.distrR = AngularCentralGaussian(self.tril_quan) 154 | self.distrT = MultivariateStudentT(loc=self.mu_t.distr_norm(),scale_tril=self.tril_t,df = 3) 155 | 156 | 157 | self.bufferR = Rotation(half_quan) 158 | self.bufferT = Translation(half_vector) 159 | 160 | 161 | def getbest_Rt(self): 162 | return Rotation(self.bufferR.quan[0]), Translation(self.bufferT.vector[0]) 163 | 164 | def getbuffer_Rt(self): 165 | return self.bufferR, self.bufferT 166 | 167 | 168 | -------------------------------------------------------------------------------- /v1/utils.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | import torch 3 | import cv2 4 | import torch.nn.functional as F 5 | 6 | JOINT_LINKS = [(0,1),(0,2),(1,3),(2,4),(0,5),(0,6),(5,7),(7,9),(6,8),(8,10),(5,11),(6,12),(11,12),(11,13),(13,15),(12,14),(14,16)] 7 | 8 | 9 | def cal_mpjpe(points3d, points2d, R, t): 10 | """ 11 | Args: 12 | points3d : (...,J,3) 13 | points2d : (...,V,J,2) 14 | R : 
(...,V,3,3) 15 | t: (...,V,3,1) 16 | """ 17 | return (homo_to_eulid((R[...,None,:,:] @ points3d[...,None,:,:,None] + t[...,None,:,:]).squeeze(-1)) - points2d ).mean() 18 | 19 | 20 | def cal_mpjpe_batch(points3d, points2d, rot, t): 21 | """ 22 | Args: 23 | points3d : ((M),...,J,3) 24 | points2d : (...,V,J,2) 25 | rot : Rotation ((M),...,V,*) 26 | t: Translation ((M),...,V,*) 27 | Returns: 28 | weights : (...) or ((M),...) 29 | """ 30 | if(len(rot.quan.shape[:-1]) > len(points2d.shape[:-2])): 31 | return torch.pow(torch.exp( 32 | -( 33 | homo_to_eulid( 34 | (rot.matrix[...,None,:,:] @ points3d[...,None,:,:,None] + t.trans[...,None,:,:] 35 | ).squeeze(-1) 36 | ) - points2d[None] 37 | ).norm(dim=-1).mean((-1,-2)) 38 | ), 4) 39 | else: 40 | return torch.exp( 41 | -( 42 | homo_to_eulid( 43 | (rot.matrix[...,None,:,:] @ points3d[...,None,:,:,None] + t.trans[...,None,:,:] 44 | ).squeeze(-1) 45 | ) - points2d 46 | ).norm(dim=-1).mean((-1,-2)) 47 | ) 48 | 49 | def eulid_to_homo(points): 50 | """ 51 | points: (...,N,M) 52 | return: (...,N,M+1) 53 | """ 54 | if isinstance(points, np.ndarray): 55 | return np.concatenate([points, np.ones((*points.shape[:-1],1))], axis=-1) 56 | elif torch.is_tensor(points): 57 | return torch.cat([points, torch.ones((*points.shape[:-1],1),dtype=points.dtype,device=points.device)],dim=-1) 58 | else: 59 | raise TypeError("Works Only with numpy arrays and Pytorch tensors") 60 | 61 | def homo_to_eulid(points): 62 | """ 63 | points: (...,N,M+1) 64 | return: (...,N,M) 65 | """ 66 | if isinstance(points, np.ndarray): 67 | return points[...,:-1] / points[...,-1,None] 68 | elif torch.is_tensor(points): 69 | return points[...,:-1] / points[...,-1,None] 70 | else: 71 | raise TypeError("Works Only with numpy arrays and Pytorch tensors") 72 | 73 | def calIOU(b1,b2): 74 | """ 75 | Input: 76 | b1,b2: [x1,y1,x2,y2] 77 | """ 78 | s1 = (b1[2] - b1[0]) * (b1[3]-b1[1]) 79 | s2 = (b2[2] - b2[0]) * (b2[3]-b2[1]) 80 | a = max(0,min(b1[2],b2[2]) - max(b1[0],b2[0])) * max(0,min(b1[3],b2[3]) - max(b1[1],b2[1])) 81 | return a/(s1+s2-a) 82 | 83 | 84 | def quaternion_to_matrix(quaternions: torch.Tensor) -> torch.Tensor: 85 | """ 86 | Convert rotations given as quaternions to rotation matrices. 87 | 88 | Args: 89 | quaternions: quaternions with real part first, 90 | as tensor of shape (..., 4). 91 | 92 | Returns: 93 | Rotation matrices as tensor of shape (..., 3, 3). 94 | """ 95 | r, i, j, k = torch.unbind(quaternions, -1) 96 | # pyre-fixme[58]: `/` is not supported for operand types `float` and `Tensor`. 97 | two_s = 2.0 / (quaternions * quaternions).sum(-1) 98 | 99 | o = torch.stack( 100 | ( 101 | 1 - two_s * (j * j + k * k), 102 | two_s * (i * j - k * r), 103 | two_s * (i * k + j * r), 104 | two_s * (i * j + k * r), 105 | 1 - two_s * (i * i + k * k), 106 | two_s * (j * k - i * r), 107 | two_s * (i * k - j * r), 108 | two_s * (j * k + i * r), 109 | 1 - two_s * (i * i + j * j), 110 | ), 111 | -1, 112 | ) 113 | return o.reshape(quaternions.shape[:-1] + (3, 3)) 114 | 115 | def _sqrt_positive_part(x: torch.Tensor) -> torch.Tensor: 116 | """ 117 | Returns torch.sqrt(torch.max(0, x)) 118 | but with a zero subgradient where x is 0. 119 | """ 120 | ret = torch.zeros_like(x) 121 | positive_mask = x > 0 122 | ret[positive_mask] = torch.sqrt(x[positive_mask]) 123 | return ret 124 | 125 | def matrix_to_quaternion(matrix: torch.Tensor) -> torch.Tensor: 126 | """ 127 | Convert rotations given as rotation matrices to quaternions. 128 | 129 | Args: 130 | matrix: Rotation matrices as tensor of shape (..., 3, 3). 
131 | 132 | Returns: 133 | quaternions with real part first, as tensor of shape (..., 4). 134 | """ 135 | if matrix.size(-1) != 3 or matrix.size(-2) != 3: 136 | raise ValueError(f"Invalid rotation matrix shape {matrix.shape}.") 137 | 138 | batch_dim = matrix.shape[:-2] 139 | m00, m01, m02, m10, m11, m12, m20, m21, m22 = torch.unbind( 140 | matrix.reshape(batch_dim + (9,)), dim=-1 141 | ) 142 | 143 | q_abs = _sqrt_positive_part( 144 | torch.stack( 145 | [ 146 | 1.0 + m00 + m11 + m22, 147 | 1.0 + m00 - m11 - m22, 148 | 1.0 - m00 + m11 - m22, 149 | 1.0 - m00 - m11 + m22, 150 | ], 151 | dim=-1, 152 | ) 153 | ) 154 | 155 | # we produce the desired quaternion multiplied by each of r, i, j, k 156 | quat_by_rijk = torch.stack( 157 | [ 158 | # pyre-fixme[58]: `**` is not supported for operand types `Tensor` and 159 | # `int`. 160 | torch.stack([q_abs[..., 0] ** 2, m21 - m12, m02 - m20, m10 - m01], dim=-1), 161 | # pyre-fixme[58]: `**` is not supported for operand types `Tensor` and 162 | # `int`. 163 | torch.stack([m21 - m12, q_abs[..., 1] ** 2, m10 + m01, m02 + m20], dim=-1), 164 | # pyre-fixme[58]: `**` is not supported for operand types `Tensor` and 165 | # `int`. 166 | torch.stack([m02 - m20, m10 + m01, q_abs[..., 2] ** 2, m12 + m21], dim=-1), 167 | # pyre-fixme[58]: `**` is not supported for operand types `Tensor` and 168 | # `int`. 169 | torch.stack([m10 - m01, m20 + m02, m21 + m12, q_abs[..., 3] ** 2], dim=-1), 170 | ], 171 | dim=-2, 172 | ) 173 | 174 | # We floor here at 0.1 but the exact level is not important; if q_abs is small, 175 | # the candidate won't be picked. 176 | flr = torch.tensor(0.1).to(dtype=q_abs.dtype, device=q_abs.device) 177 | quat_candidates = quat_by_rijk / (2.0 * q_abs[..., None].max(flr)) 178 | 179 | # if not for numerical problems, quat_candidates[i] should be same (up to a sign), 180 | # forall i; we pick the best-conditioned one (with the largest denominator) 181 | 182 | return quat_candidates[ 183 | F.one_hot(q_abs.argmax(dim=-1), num_classes=4) > 0.5, : 184 | ].reshape(batch_dim + (4,)) 185 | 186 | class Rotation(): 187 | def __init__(self, rot): 188 | """ 189 | quan ((M),B,V,4) 190 | matrix ((M),B,V,3,3) 191 | """ 192 | assert((rot.shape[-1] == 4) or (rot.shape[-1] == 3 and rot.shape[-2] == 3)) 193 | 194 | if rot.shape[-1] == 4: 195 | rot = self.standard_quan(rot) 196 | self.quan = rot 197 | self.matrix = quaternion_to_matrix(self.quan) 198 | 199 | else: 200 | self.matrix = rot 201 | self.quan = matrix_to_quaternion(self.matrix) 202 | 203 | def cat(self,rot): 204 | assert(isinstance(rot, (Rotation, torch.Tensor))) 205 | if isinstance(rot, torch.Tensor): 206 | rot = Rotation(rot) 207 | assert(rot.quan.shape[-3:] == self.quan.shape[-3:]) 208 | if len(self.quan.shape) > len(rot.quan.shape): 209 | self.quan = torch.cat([self.quan, rot.quan[None]], dim=-4) 210 | self.matrix = torch.cat([self.matrix, rot.matrix[None]], dim=-5) 211 | else: 212 | self.quan = torch.cat([self.quan, rot.quan], dim=-4) 213 | self.matrix = torch.cat([self.matrix, rot.matrix], dim=-5) 214 | 215 | def distr_norm(self): 216 | return self.quan[...,1:,:] 217 | 218 | def standard_quan(self,rot): 219 | rot = rot / torch.clamp_(rot.norm(dim = -1)[...,None],min=1e-4) 220 | size = rot.shape 221 | rot0 = torch.tensor([1.,0.,0.,0.]).repeat(*(size[:-2]),1,1) 222 | if F.l1_loss(rot0,rot[...,0:1,:]) < 1e-6: 223 | return rot 224 | else: 225 | return torch.cat([rot0, rot],dim=-2) 226 | 227 | def random(self, lr): 228 | assert(len(self.quan.shape) == 4) 229 | self.quan[1:,:,1:] +=
torch.randn_like(self.quan[1:,:,1:]) * lr 230 | self.quan = self.standard_quan(self.quan) 231 | self.matrix = quaternion_to_matrix(self.quan) 232 | 233 | 234 | 235 | 236 | class Translation(): 237 | def __init__(self, t): 238 | """ 239 | vector ((M),B,V,3) 240 | trans ((M),B,V,3,1) 241 | """ 242 | assert(t.shape[-1] == 3 or (t.shape[-1]==1 and t.shape[-2]==3)) 243 | if t.shape[-1] == 3: 244 | t = self.standard_vector(t) 245 | self.vector = t 246 | self.trans = t.unsqueeze(-1) 247 | else: 248 | self.vector = t.squeeze(-1) 249 | self.trans = t 250 | # (M,B) 251 | t_norm = self.vector[...,1,:].norm(dim=-1) 252 | if not (t_norm == 0.0).any(): 253 | self.vector[...,1:,:] /= t_norm[...,None,None] 254 | self.trans = self.vector.unsqueeze(-1) 255 | 256 | def cat(self,t): 257 | assert(isinstance(t, (Translation, torch.Tensor))) 258 | if isinstance(t ,torch.Tensor): 259 | t = Translation(t) 260 | assert(t.trans.shape[-3:] == self.trans.shape[-3:]) 261 | if len(self.vector.shape) > len(t.vector.shape): 262 | self.vector = torch.cat([self.vector, t.vector[None]],dim=-4) 263 | self.trans = torch.cat([self.trans, t.trans[None]], dim=-5) 264 | else: 265 | self.vector = torch.cat([self.vector, t.vector],dim=-4) 266 | self.trans = torch.cat([self.trans, t.trans], dim=-5) 267 | 268 | def distr_norm(self): 269 | return self.vector[...,1:,:] 270 | 271 | def standard_vector(self,t): 272 | size = t.shape 273 | t0 = torch.tensor([0.,0.,0.]).repeat(*(size[:-2]),1,1) 274 | if F.l1_loss(t0,t[...,0:1,:]) < 1e-6: 275 | return t / torch.clamp_(t[...,1,:].norm(dim = -1)[...,None,None],min=1e-4) 276 | else: 277 | return torch.cat([t0, t/torch.clamp_(t[...,0,:].norm(dim = -1)[...,None,None],min=1e-4)],dim=-2) 278 | 279 | def random(self, lr): 280 | assert(len(self.vector.shape) == 4) 281 | self.vector[1:,:,1:] += torch.randn_like(self.vector[1:,:,1:]) * lr 282 | self.vector = self.standard_vector(self.vector) 283 | self.trans = self.vector.unsqueeze(-1) --------------------------------------------------------------------------------
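# Editor's note: a minimal round-trip check for the helpers in v1/utils.py above
# (a sketch; the shapes and the 1e-5 tolerance are assumptions). Rotation
# normalizes raw quaternions and prepends an identity view 0, and
# quaternion_to_matrix / matrix_to_quaternion should invert each other up to the
# quaternion sign ambiguity (q and -q encode the same rotation).
import torch
q = torch.randn(2, 3, 4)            # (B, V-1, 4) raw quaternions
rot = Rotation(q)                   # view 0 becomes the identity quaternion
assert rot.quan.shape == (2, 4, 4)  # (B, V, 4) after the identity is prepended
q2 = matrix_to_quaternion(rot.matrix)
err = torch.minimum((q2 - rot.quan).abs(), (q2 + rot.quan).abs())
assert err.max() < 1e-5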