├── LICENSE
├── README.md
├── code
│   ├── inference.py
│   └── requirements.txt
├── files
│   ├── coco_labels.txt
│   ├── imagenet1000_clsidx_to_labels.txt
│   ├── init.sh
│   └── mscoco_label_map.pbtxt
├── get_started.md
├── images
│   ├── doc
│   │   ├── 001.png
│   │   ├── 002.png
│   │   ├── 003.png
│   │   ├── 004.png
│   │   ├── 005.png
│   │   ├── 006.png
│   │   ├── 007.png
│   │   ├── 008.png
│   │   ├── 009.png
│   │   ├── 010.png
│   │   ├── 011.png
│   │   ├── 012.png
│   │   └── 013.png
│   ├── imagenet_test
│   │   ├── n02129604_7580_tiger.jpg
│   │   ├── n03584254_2594_iPod.jpg
│   │   ├── n04517823_9843_vacuum.jpg
│   │   └── n07745940_2123_strawberry.jpg
│   └── test
│       ├── 000000029984.jpg
│       ├── 000000059044.jpg
│       ├── 000000119088.jpg
│       ├── 000000119233.jpg
│       ├── 000000133819.jpg
│       └── 000000133969.jpg
├── mxnet-serving-endpoint.ipynb
├── pytorch-serving-endpoint.ipynb
├── pytorch-serving-neo-inf1.ipynb
├── pytorch-serving-neo.ipynb
├── src
│   ├── inference_mxnet.py
│   ├── inference_pytorch.py
│   ├── inference_pytorch_inf1.py
│   ├── inference_pytorch_neo.py
│   └── utils.py
└── tensorflow-serving-endpoint.ipynb
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | Copyright 2019 Amazon.com, Inc. or its affiliates. All Rights Reserved.
2 | 
3 | Permission is hereby granted, free of charge, to any person obtaining a copy of
4 | this software and associated documentation files (the "Software"), to deal in
5 | the Software without restriction, including without limitation the rights to
6 | use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
7 | the Software, and to permit persons to whom the Software is furnished to do so.
8 | 
9 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
10 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
11 | FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
12 | COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
13 | IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
14 | CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
15 | 
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Deep Learning Inference Hands-on Lab
2 | 
3 | ## Introduction
4 | 
5 | Do you have to both train and run inference for your deep learning models on Amazon SageMaker? You do not.
6 | 
7 | If you want to run only inference on SageMaker without training there, you can deploy models trained on-premises, or pre-trained models from a public model zoo, directly to an Amazon SageMaker Endpoint without building a Docker image. In other words, you get auto-scaling, A/B testing, and high availability
8 | without any complicated setup.
9 | 
10 | In this workshop you will learn how to host models pre-trained with a deep learning framework on an Amazon SageMaker Endpoint.
11 | 
12 | - [Prerequisites (Optional)](get_started.md)
13 | 
14 | ### PyTorch
15 | - [Host a pre-trained Faster R-CNN model from PyTorch on an Endpoint](pytorch-serving-endpoint.ipynb)
16 | - [Compile a pre-trained MnasNet model from PyTorch with SageMaker Neo](pytorch-serving-neo.ipynb)
17 | - [Compile a pre-trained MnasNet model from PyTorch with SageMaker Neo and deploy it to an Inf1 instance](pytorch-serving-neo-inf1.ipynb)
18 | 
19 | ### MXNet
20 | - [Host a pre-trained YOLOv3 model from MXNet on an Endpoint](mxnet-serving-endpoint.ipynb)
21 | 
22 | ### TensorFlow
23 | - [Host a pre-trained MobileNet-V2 model from TensorFlow on an Endpoint](tensorflow-serving-endpoint.ipynb)
24 | 
25 | ## License Summary
26 | 
27 | This sample code is provided under the MIT-0 license. See the LICENSE file.
--------------------------------------------------------------------------------
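Deploying without a custom Docker image, as the README describes, comes down to packaging the model artifacts (plus, optionally, `code/inference.py` and `code/requirements.txt` as seen in this repo) into a `model.tar.gz` that the endpoint downloads from S3. A minimal standard-library sketch of that packaging step — the archive layout shown is a hypothetical example for a TensorFlow SavedModel, not taken from this repo:

```python
import io
import tarfile

def package_model(files):
    """Build an in-memory model.tar.gz from a {archive_path: bytes} mapping."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode='w:gz') as tar:
        for name, payload in files.items():
            info = tarfile.TarInfo(name=name)
            info.size = len(payload)           # tarfile needs the size up front
            tar.addfile(info, io.BytesIO(payload))
    return buf.getvalue()

# Hypothetical layout: SavedModel under a numeric version dir, handlers under code/
archive = package_model({
    'model/1/saved_model.pb': b'...',
    'code/inference.py': b'# input/output handlers',
    'code/requirements.txt': b'numpy==1.22.0\nPillow\n',
})
```

The resulting bytes would then be uploaded to S3 and referenced when creating the SageMaker model.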
/code/inference.py:
--------------------------------------------------------------------------------
1 | import base64
2 | import io
3 | import json
4 | import requests
5 | from PIL import Image
6 | import numpy as np
7 |
8 | def input_handler(data, context):
9 |     """ Pre-process request input before it is sent to TensorFlow Serving REST API
10 | 
11 |     Args:
12 |         data (obj): the request data stream
13 |         context (Context): an object containing request and configuration details
14 | 
15 |     Returns:
16 |         (dict): a JSON-serializable dict that contains request body and headers
17 |     """
18 | 
19 |     if context.request_content_type == 'application/x-image':
20 |         image = Image.open(io.BytesIO(data.read()))
21 |         #image = Image.open(io.BytesIO(open(data, 'rb').read()))
22 |         image_np = np.asarray(image)
23 |         instance = image_np[np.newaxis, ...]
24 |         return json.dumps({"instances": instance.tolist()})
25 |     else:
26 |         _return_error(415, 'Unsupported content type "{}"'.format(context.request_content_type or 'Unknown'))
27 | 
28 | 
29 | def output_handler(response, context):
30 |     """Post-process TensorFlow Serving output before it is returned to the client.
31 | 
32 |     Args:
33 |         response (obj): the TensorFlow Serving response
34 |         context (Context): an object containing request and configuration details
35 | 
36 |     Returns:
37 |         (bytes, string): data to return to client, response content type
38 |     """
39 |     if response.status_code != 200:
40 |         _return_error(response.status_code, response.content.decode('utf-8'))
41 |     response_content_type = context.accept_header
42 |     prediction = response.content
43 |     return prediction, response_content_type
44 | 
45 | 
46 | def _return_error(code, message):
47 |     raise ValueError('Error: {}, {}'.format(str(code), message))
--------------------------------------------------------------------------------
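The handlers above run inside the TensorFlow Serving container, so `Image`, `np`, and the serving `Context` are only available there. As a rough local illustration of the same request contract — a stand-in that replaces PIL/NumPy decoding with plain nested lists, with `FakeContext` and `input_handler_sketch` being hypothetical names — pre-processing amounts to prepending a batch dimension and serializing under the `"instances"` key the TF Serving REST API expects:

```python
import io
import json

class FakeContext:
    """Minimal stand-in for the serving Context object (hypothetical)."""
    request_content_type = 'application/json'
    accept_header = 'application/json'

def input_handler_sketch(data, context):
    # Mirrors input_handler: read the request stream, add a batch
    # dimension, and emit the {"instances": [...]} request body.
    if context.request_content_type == 'application/json':
        pixels = json.loads(data.read())
        return json.dumps({"instances": [pixels]})
    raise ValueError('Error: 415, Unsupported content type "{}"'.format(
        context.request_content_type or 'Unknown'))

# A 2x2 "image" stands in for np.asarray(image).tolist()
body = input_handler_sketch(io.BytesIO(b'[[0, 1], [2, 3]]'), FakeContext())
```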
/code/requirements.txt:
--------------------------------------------------------------------------------
1 | numpy==1.22.0
2 | Pillow
--------------------------------------------------------------------------------
/files/coco_labels.txt:
--------------------------------------------------------------------------------
1 | 1,1,person
2 | 2,2,bicycle
3 | 3,3,car
4 | 4,4,motorcycle
5 | 5,5,airplane
6 | 6,6,bus
7 | 7,7,train
8 | 8,8,truck
9 | 9,9,boat
10 | 10,10,traffic light
11 | 11,11,fire hydrant
12 | 13,12,stop sign
13 | 14,13,parking meter
14 | 15,14,bench
15 | 16,15,bird
16 | 17,16,cat
17 | 18,17,dog
18 | 19,18,horse
19 | 20,19,sheep
20 | 21,20,cow
21 | 22,21,elephant
22 | 23,22,bear
23 | 24,23,zebra
24 | 25,24,giraffe
25 | 27,25,backpack
26 | 28,26,umbrella
27 | 31,27,handbag
28 | 32,28,tie
29 | 33,29,suitcase
30 | 34,30,frisbee
31 | 35,31,skis
32 | 36,32,snowboard
33 | 37,33,sports ball
34 | 38,34,kite
35 | 39,35,baseball bat
36 | 40,36,baseball glove
37 | 41,37,skateboard
38 | 42,38,surfboard
39 | 43,39,tennis racket
40 | 44,40,bottle
41 | 46,41,wine glass
42 | 47,42,cup
43 | 48,43,fork
44 | 49,44,knife
45 | 50,45,spoon
46 | 51,46,bowl
47 | 52,47,banana
48 | 53,48,apple
49 | 54,49,sandwich
50 | 55,50,orange
51 | 56,51,broccoli
52 | 57,52,carrot
53 | 58,53,hot dog
54 | 59,54,pizza
55 | 60,55,donut
56 | 61,56,cake
57 | 62,57,chair
58 | 63,58,couch
59 | 64,59,potted plant
60 | 65,60,bed
61 | 67,61,dining table
62 | 70,62,toilet
63 | 72,63,tv
64 | 73,64,laptop
65 | 74,65,mouse
66 | 75,66,remote
67 | 76,67,keyboard
68 | 77,68,cell phone
69 | 78,69,microwave
70 | 79,70,oven
71 | 80,71,toaster
72 | 81,72,sink
73 | 82,73,refrigerator
74 | 84,74,book
75 | 85,75,clock
76 | 86,76,vase
77 | 87,77,scissors
78 | 88,78,teddy bear
79 | 89,79,hair drier
80 | 90,80,toothbrush
--------------------------------------------------------------------------------
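Each line of `coco_labels.txt` holds three comma-separated fields: the original COCO category id, a contiguous model index, and the class name. The original ids have gaps (e.g. 12, 26, and 29 are unused), which is why the remapping exists. A small parser sketch — `parse_coco_labels` is a hypothetical helper, not part of this repo:

```python
def parse_coco_labels(lines):
    """Map original COCO category id -> (contiguous index, class name)."""
    labels = {}
    for line in lines:
        line = line.strip()
        if not line:
            continue
        # Split at most twice so class names could themselves contain commas
        coco_id, index, name = line.split(',', 2)
        labels[int(coco_id)] = (int(index), name)
    return labels

# A few sample lines from the file above
sample = ["1,1,person", "13,12,stop sign", "90,80,toothbrush"]
labels = parse_coco_labels(sample)
```

In practice the same function would be fed `open('files/coco_labels.txt')`.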
/files/imagenet1000_clsidx_to_labels.txt:
--------------------------------------------------------------------------------
1 | {0: 'tench, Tinca tinca',
2 | 1: 'goldfish, Carassius auratus',
3 | 2: 'great white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias',
4 | 3: 'tiger shark, Galeocerdo cuvieri',
5 | 4: 'hammerhead, hammerhead shark',
6 | 5: 'electric ray, crampfish, numbfish, torpedo',
7 | 6: 'stingray',
8 | 7: 'cock',
9 | 8: 'hen',
10 | 9: 'ostrich, Struthio camelus',
11 | 10: 'brambling, Fringilla montifringilla',
12 | 11: 'goldfinch, Carduelis carduelis',
13 | 12: 'house finch, linnet, Carpodacus mexicanus',
14 | 13: 'junco, snowbird',
15 | 14: 'indigo bunting, indigo finch, indigo bird, Passerina cyanea',
16 | 15: 'robin, American robin, Turdus migratorius',
17 | 16: 'bulbul',
18 | 17: 'jay',
19 | 18: 'magpie',
20 | 19: 'chickadee',
21 | 20: 'water ouzel, dipper',
22 | 21: 'kite',
23 | 22: 'bald eagle, American eagle, Haliaeetus leucocephalus',
24 | 23: 'vulture',
25 | 24: 'great grey owl, great gray owl, Strix nebulosa',
26 | 25: 'European fire salamander, Salamandra salamandra',
27 | 26: 'common newt, Triturus vulgaris',
28 | 27: 'eft',
29 | 28: 'spotted salamander, Ambystoma maculatum',
30 | 29: 'axolotl, mud puppy, Ambystoma mexicanum',
31 | 30: 'bullfrog, Rana catesbeiana',
32 | 31: 'tree frog, tree-frog',
33 | 32: 'tailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui',
34 | 33: 'loggerhead, loggerhead turtle, Caretta caretta',
35 | 34: 'leatherback turtle, leatherback, leathery turtle, Dermochelys coriacea',
36 | 35: 'mud turtle',
37 | 36: 'terrapin',
38 | 37: 'box turtle, box tortoise',
39 | 38: 'banded gecko',
40 | 39: 'common iguana, iguana, Iguana iguana',
41 | 40: 'American chameleon, anole, Anolis carolinensis',
42 | 41: 'whiptail, whiptail lizard',
43 | 42: 'agama',
44 | 43: 'frilled lizard, Chlamydosaurus kingi',
45 | 44: 'alligator lizard',
46 | 45: 'Gila monster, Heloderma suspectum',
47 | 46: 'green lizard, Lacerta viridis',
48 | 47: 'African chameleon, Chamaeleo chamaeleon',
49 | 48: 'Komodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis',
50 | 49: 'African crocodile, Nile crocodile, Crocodylus niloticus',
51 | 50: 'American alligator, Alligator mississipiensis',
52 | 51: 'triceratops',
53 | 52: 'thunder snake, worm snake, Carphophis amoenus',
54 | 53: 'ringneck snake, ring-necked snake, ring snake',
55 | 54: 'hognose snake, puff adder, sand viper',
56 | 55: 'green snake, grass snake',
57 | 56: 'king snake, kingsnake',
58 | 57: 'garter snake, grass snake',
59 | 58: 'water snake',
60 | 59: 'vine snake',
61 | 60: 'night snake, Hypsiglena torquata',
62 | 61: 'boa constrictor, Constrictor constrictor',
63 | 62: 'rock python, rock snake, Python sebae',
64 | 63: 'Indian cobra, Naja naja',
65 | 64: 'green mamba',
66 | 65: 'sea snake',
67 | 66: 'horned viper, cerastes, sand viper, horned asp, Cerastes cornutus',
68 | 67: 'diamondback, diamondback rattlesnake, Crotalus adamanteus',
69 | 68: 'sidewinder, horned rattlesnake, Crotalus cerastes',
70 | 69: 'trilobite',
71 | 70: 'harvestman, daddy longlegs, Phalangium opilio',
72 | 71: 'scorpion',
73 | 72: 'black and gold garden spider, Argiope aurantia',
74 | 73: 'barn spider, Araneus cavaticus',
75 | 74: 'garden spider, Aranea diademata',
76 | 75: 'black widow, Latrodectus mactans',
77 | 76: 'tarantula',
78 | 77: 'wolf spider, hunting spider',
79 | 78: 'tick',
80 | 79: 'centipede',
81 | 80: 'black grouse',
82 | 81: 'ptarmigan',
83 | 82: 'ruffed grouse, partridge, Bonasa umbellus',
84 | 83: 'prairie chicken, prairie grouse, prairie fowl',
85 | 84: 'peacock',
86 | 85: 'quail',
87 | 86: 'partridge',
88 | 87: 'African grey, African gray, Psittacus erithacus',
89 | 88: 'macaw',
90 | 89: 'sulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita',
91 | 90: 'lorikeet',
92 | 91: 'coucal',
93 | 92: 'bee eater',
94 | 93: 'hornbill',
95 | 94: 'hummingbird',
96 | 95: 'jacamar',
97 | 96: 'toucan',
98 | 97: 'drake',
99 | 98: 'red-breasted merganser, Mergus serrator',
100 | 99: 'goose',
101 | 100: 'black swan, Cygnus atratus',
102 | 101: 'tusker',
103 | 102: 'echidna, spiny anteater, anteater',
104 | 103: 'platypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus',
105 | 104: 'wallaby, brush kangaroo',
106 | 105: 'koala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus',
107 | 106: 'wombat',
108 | 107: 'jellyfish',
109 | 108: 'sea anemone, anemone',
110 | 109: 'brain coral',
111 | 110: 'flatworm, platyhelminth',
112 | 111: 'nematode, nematode worm, roundworm',
113 | 112: 'conch',
114 | 113: 'snail',
115 | 114: 'slug',
116 | 115: 'sea slug, nudibranch',
117 | 116: 'chiton, coat-of-mail shell, sea cradle, polyplacophore',
118 | 117: 'chambered nautilus, pearly nautilus, nautilus',
119 | 118: 'Dungeness crab, Cancer magister',
120 | 119: 'rock crab, Cancer irroratus',
121 | 120: 'fiddler crab',
122 | 121: 'king crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica',
123 | 122: 'American lobster, Northern lobster, Maine lobster, Homarus americanus',
124 | 123: 'spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish',
125 | 124: 'crayfish, crawfish, crawdad, crawdaddy',
126 | 125: 'hermit crab',
127 | 126: 'isopod',
128 | 127: 'white stork, Ciconia ciconia',
129 | 128: 'black stork, Ciconia nigra',
130 | 129: 'spoonbill',
131 | 130: 'flamingo',
132 | 131: 'little blue heron, Egretta caerulea',
133 | 132: 'American egret, great white heron, Egretta albus',
134 | 133: 'bittern',
135 | 134: 'crane',
136 | 135: 'limpkin, Aramus pictus',
137 | 136: 'European gallinule, Porphyrio porphyrio',
138 | 137: 'American coot, marsh hen, mud hen, water hen, Fulica americana',
139 | 138: 'bustard',
140 | 139: 'ruddy turnstone, Arenaria interpres',
141 | 140: 'red-backed sandpiper, dunlin, Erolia alpina',
142 | 141: 'redshank, Tringa totanus',
143 | 142: 'dowitcher',
144 | 143: 'oystercatcher, oyster catcher',
145 | 144: 'pelican',
146 | 145: 'king penguin, Aptenodytes patagonica',
147 | 146: 'albatross, mollymawk',
148 | 147: 'grey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus',
149 | 148: 'killer whale, killer, orca, grampus, sea wolf, Orcinus orca',
150 | 149: 'dugong, Dugong dugon',
151 | 150: 'sea lion',
152 | 151: 'Chihuahua',
153 | 152: 'Japanese spaniel',
154 | 153: 'Maltese dog, Maltese terrier, Maltese',
155 | 154: 'Pekinese, Pekingese, Peke',
156 | 155: 'Shih-Tzu',
157 | 156: 'Blenheim spaniel',
158 | 157: 'papillon',
159 | 158: 'toy terrier',
160 | 159: 'Rhodesian ridgeback',
161 | 160: 'Afghan hound, Afghan',
162 | 161: 'basset, basset hound',
163 | 162: 'beagle',
164 | 163: 'bloodhound, sleuthhound',
165 | 164: 'bluetick',
166 | 165: 'black-and-tan coonhound',
167 | 166: 'Walker hound, Walker foxhound',
168 | 167: 'English foxhound',
169 | 168: 'redbone',
170 | 169: 'borzoi, Russian wolfhound',
171 | 170: 'Irish wolfhound',
172 | 171: 'Italian greyhound',
173 | 172: 'whippet',
174 | 173: 'Ibizan hound, Ibizan Podenco',
175 | 174: 'Norwegian elkhound, elkhound',
176 | 175: 'otterhound, otter hound',
177 | 176: 'Saluki, gazelle hound',
178 | 177: 'Scottish deerhound, deerhound',
179 | 178: 'Weimaraner',
180 | 179: 'Staffordshire bullterrier, Staffordshire bull terrier',
181 | 180: 'American Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier',
182 | 181: 'Bedlington terrier',
183 | 182: 'Border terrier',
184 | 183: 'Kerry blue terrier',
185 | 184: 'Irish terrier',
186 | 185: 'Norfolk terrier',
187 | 186: 'Norwich terrier',
188 | 187: 'Yorkshire terrier',
189 | 188: 'wire-haired fox terrier',
190 | 189: 'Lakeland terrier',
191 | 190: 'Sealyham terrier, Sealyham',
192 | 191: 'Airedale, Airedale terrier',
193 | 192: 'cairn, cairn terrier',
194 | 193: 'Australian terrier',
195 | 194: 'Dandie Dinmont, Dandie Dinmont terrier',
196 | 195: 'Boston bull, Boston terrier',
197 | 196: 'miniature schnauzer',
198 | 197: 'giant schnauzer',
199 | 198: 'standard schnauzer',
200 | 199: 'Scotch terrier, Scottish terrier, Scottie',
201 | 200: 'Tibetan terrier, chrysanthemum dog',
202 | 201: 'silky terrier, Sydney silky',
203 | 202: 'soft-coated wheaten terrier',
204 | 203: 'West Highland white terrier',
205 | 204: 'Lhasa, Lhasa apso',
206 | 205: 'flat-coated retriever',
207 | 206: 'curly-coated retriever',
208 | 207: 'golden retriever',
209 | 208: 'Labrador retriever',
210 | 209: 'Chesapeake Bay retriever',
211 | 210: 'German short-haired pointer',
212 | 211: 'vizsla, Hungarian pointer',
213 | 212: 'English setter',
214 | 213: 'Irish setter, red setter',
215 | 214: 'Gordon setter',
216 | 215: 'Brittany spaniel',
217 | 216: 'clumber, clumber spaniel',
218 | 217: 'English springer, English springer spaniel',
219 | 218: 'Welsh springer spaniel',
220 | 219: 'cocker spaniel, English cocker spaniel, cocker',
221 | 220: 'Sussex spaniel',
222 | 221: 'Irish water spaniel',
223 | 222: 'kuvasz',
224 | 223: 'schipperke',
225 | 224: 'groenendael',
226 | 225: 'malinois',
227 | 226: 'briard',
228 | 227: 'kelpie',
229 | 228: 'komondor',
230 | 229: 'Old English sheepdog, bobtail',
231 | 230: 'Shetland sheepdog, Shetland sheep dog, Shetland',
232 | 231: 'collie',
233 | 232: 'Border collie',
234 | 233: 'Bouvier des Flandres, Bouviers des Flandres',
235 | 234: 'Rottweiler',
236 | 235: 'German shepherd, German shepherd dog, German police dog, alsatian',
237 | 236: 'Doberman, Doberman pinscher',
238 | 237: 'miniature pinscher',
239 | 238: 'Greater Swiss Mountain dog',
240 | 239: 'Bernese mountain dog',
241 | 240: 'Appenzeller',
242 | 241: 'EntleBucher',
243 | 242: 'boxer',
244 | 243: 'bull mastiff',
245 | 244: 'Tibetan mastiff',
246 | 245: 'French bulldog',
247 | 246: 'Great Dane',
248 | 247: 'Saint Bernard, St Bernard',
249 | 248: 'Eskimo dog, husky',
250 | 249: 'malamute, malemute, Alaskan malamute',
251 | 250: 'Siberian husky',
252 | 251: 'dalmatian, coach dog, carriage dog',
253 | 252: 'affenpinscher, monkey pinscher, monkey dog',
254 | 253: 'basenji',
255 | 254: 'pug, pug-dog',
256 | 255: 'Leonberg',
257 | 256: 'Newfoundland, Newfoundland dog',
258 | 257: 'Great Pyrenees',
259 | 258: 'Samoyed, Samoyede',
260 | 259: 'Pomeranian',
261 | 260: 'chow, chow chow',
262 | 261: 'keeshond',
263 | 262: 'Brabancon griffon',
264 | 263: 'Pembroke, Pembroke Welsh corgi',
265 | 264: 'Cardigan, Cardigan Welsh corgi',
266 | 265: 'toy poodle',
267 | 266: 'miniature poodle',
268 | 267: 'standard poodle',
269 | 268: 'Mexican hairless',
270 | 269: 'timber wolf, grey wolf, gray wolf, Canis lupus',
271 | 270: 'white wolf, Arctic wolf, Canis lupus tundrarum',
272 | 271: 'red wolf, maned wolf, Canis rufus, Canis niger',
273 | 272: 'coyote, prairie wolf, brush wolf, Canis latrans',
274 | 273: 'dingo, warrigal, warragal, Canis dingo',
275 | 274: 'dhole, Cuon alpinus',
276 | 275: 'African hunting dog, hyena dog, Cape hunting dog, Lycaon pictus',
277 | 276: 'hyena, hyaena',
278 | 277: 'red fox, Vulpes vulpes',
279 | 278: 'kit fox, Vulpes macrotis',
280 | 279: 'Arctic fox, white fox, Alopex lagopus',
281 | 280: 'grey fox, gray fox, Urocyon cinereoargenteus',
282 | 281: 'tabby, tabby cat',
283 | 282: 'tiger cat',
284 | 283: 'Persian cat',
285 | 284: 'Siamese cat, Siamese',
286 | 285: 'Egyptian cat',
287 | 286: 'cougar, puma, catamount, mountain lion, painter, panther, Felis concolor',
288 | 287: 'lynx, catamount',
289 | 288: 'leopard, Panthera pardus',
290 | 289: 'snow leopard, ounce, Panthera uncia',
291 | 290: 'jaguar, panther, Panthera onca, Felis onca',
292 | 291: 'lion, king of beasts, Panthera leo',
293 | 292: 'tiger, Panthera tigris',
294 | 293: 'cheetah, chetah, Acinonyx jubatus',
295 | 294: 'brown bear, bruin, Ursus arctos',
296 | 295: 'American black bear, black bear, Ursus americanus, Euarctos americanus',
297 | 296: 'ice bear, polar bear, Ursus Maritimus, Thalarctos maritimus',
298 | 297: 'sloth bear, Melursus ursinus, Ursus ursinus',
299 | 298: 'mongoose',
300 | 299: 'meerkat, mierkat',
301 | 300: 'tiger beetle',
302 | 301: 'ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle',
303 | 302: 'ground beetle, carabid beetle',
304 | 303: 'long-horned beetle, longicorn, longicorn beetle',
305 | 304: 'leaf beetle, chrysomelid',
306 | 305: 'dung beetle',
307 | 306: 'rhinoceros beetle',
308 | 307: 'weevil',
309 | 308: 'fly',
310 | 309: 'bee',
311 | 310: 'ant, emmet, pismire',
312 | 311: 'grasshopper, hopper',
313 | 312: 'cricket',
314 | 313: 'walking stick, walkingstick, stick insect',
315 | 314: 'cockroach, roach',
316 | 315: 'mantis, mantid',
317 | 316: 'cicada, cicala',
318 | 317: 'leafhopper',
319 | 318: 'lacewing, lacewing fly',
320 | 319: "dragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk",
321 | 320: 'damselfly',
322 | 321: 'admiral',
323 | 322: 'ringlet, ringlet butterfly',
324 | 323: 'monarch, monarch butterfly, milkweed butterfly, Danaus plexippus',
325 | 324: 'cabbage butterfly',
326 | 325: 'sulphur butterfly, sulfur butterfly',
327 | 326: 'lycaenid, lycaenid butterfly',
328 | 327: 'starfish, sea star',
329 | 328: 'sea urchin',
330 | 329: 'sea cucumber, holothurian',
331 | 330: 'wood rabbit, cottontail, cottontail rabbit',
332 | 331: 'hare',
333 | 332: 'Angora, Angora rabbit',
334 | 333: 'hamster',
335 | 334: 'porcupine, hedgehog',
336 | 335: 'fox squirrel, eastern fox squirrel, Sciurus niger',
337 | 336: 'marmot',
338 | 337: 'beaver',
339 | 338: 'guinea pig, Cavia cobaya',
340 | 339: 'sorrel',
341 | 340: 'zebra',
342 | 341: 'hog, pig, grunter, squealer, Sus scrofa',
343 | 342: 'wild boar, boar, Sus scrofa',
344 | 343: 'warthog',
345 | 344: 'hippopotamus, hippo, river horse, Hippopotamus amphibius',
346 | 345: 'ox',
347 | 346: 'water buffalo, water ox, Asiatic buffalo, Bubalus bubalis',
348 | 347: 'bison',
349 | 348: 'ram, tup',
350 | 349: 'bighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis',
351 | 350: 'ibex, Capra ibex',
352 | 351: 'hartebeest',
353 | 352: 'impala, Aepyceros melampus',
354 | 353: 'gazelle',
355 | 354: 'Arabian camel, dromedary, Camelus dromedarius',
356 | 355: 'llama',
357 | 356: 'weasel',
358 | 357: 'mink',
359 | 358: 'polecat, fitch, foulmart, foumart, Mustela putorius',
360 | 359: 'black-footed ferret, ferret, Mustela nigripes',
361 | 360: 'otter',
362 | 361: 'skunk, polecat, wood pussy',
363 | 362: 'badger',
364 | 363: 'armadillo',
365 | 364: 'three-toed sloth, ai, Bradypus tridactylus',
366 | 365: 'orangutan, orang, orangutang, Pongo pygmaeus',
367 | 366: 'gorilla, Gorilla gorilla',
368 | 367: 'chimpanzee, chimp, Pan troglodytes',
369 | 368: 'gibbon, Hylobates lar',
370 | 369: 'siamang, Hylobates syndactylus, Symphalangus syndactylus',
371 | 370: 'guenon, guenon monkey',
372 | 371: 'patas, hussar monkey, Erythrocebus patas',
373 | 372: 'baboon',
374 | 373: 'macaque',
375 | 374: 'langur',
376 | 375: 'colobus, colobus monkey',
377 | 376: 'proboscis monkey, Nasalis larvatus',
378 | 377: 'marmoset',
379 | 378: 'capuchin, ringtail, Cebus capucinus',
380 | 379: 'howler monkey, howler',
381 | 380: 'titi, titi monkey',
382 | 381: 'spider monkey, Ateles geoffroyi',
383 | 382: 'squirrel monkey, Saimiri sciureus',
384 | 383: 'Madagascar cat, ring-tailed lemur, Lemur catta',
385 | 384: 'indri, indris, Indri indri, Indri brevicaudatus',
386 | 385: 'Indian elephant, Elephas maximus',
387 | 386: 'African elephant, Loxodonta africana',
388 | 387: 'lesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens',
389 | 388: 'giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca',
390 | 389: 'barracouta, snoek',
391 | 390: 'eel',
392 | 391: 'coho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch',
393 | 392: 'rock beauty, Holocanthus tricolor',
394 | 393: 'anemone fish',
395 | 394: 'sturgeon',
396 | 395: 'gar, garfish, garpike, billfish, Lepisosteus osseus',
397 | 396: 'lionfish',
398 | 397: 'puffer, pufferfish, blowfish, globefish',
399 | 398: 'abacus',
400 | 399: 'abaya',
401 | 400: "academic gown, academic robe, judge's robe",
402 | 401: 'accordion, piano accordion, squeeze box',
403 | 402: 'acoustic guitar',
404 | 403: 'aircraft carrier, carrier, flattop, attack aircraft carrier',
405 | 404: 'airliner',
406 | 405: 'airship, dirigible',
407 | 406: 'altar',
408 | 407: 'ambulance',
409 | 408: 'amphibian, amphibious vehicle',
410 | 409: 'analog clock',
411 | 410: 'apiary, bee house',
412 | 411: 'apron',
413 | 412: 'ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin',
414 | 413: 'assault rifle, assault gun',
415 | 414: 'backpack, back pack, knapsack, packsack, rucksack, haversack',
416 | 415: 'bakery, bakeshop, bakehouse',
417 | 416: 'balance beam, beam',
418 | 417: 'balloon',
419 | 418: 'ballpoint, ballpoint pen, ballpen, Biro',
420 | 419: 'Band Aid',
421 | 420: 'banjo',
422 | 421: 'bannister, banister, balustrade, balusters, handrail',
423 | 422: 'barbell',
424 | 423: 'barber chair',
425 | 424: 'barbershop',
426 | 425: 'barn',
427 | 426: 'barometer',
428 | 427: 'barrel, cask',
429 | 428: 'barrow, garden cart, lawn cart, wheelbarrow',
430 | 429: 'baseball',
431 | 430: 'basketball',
432 | 431: 'bassinet',
433 | 432: 'bassoon',
434 | 433: 'bathing cap, swimming cap',
435 | 434: 'bath towel',
436 | 435: 'bathtub, bathing tub, bath, tub',
437 | 436: 'beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon',
438 | 437: 'beacon, lighthouse, beacon light, pharos',
439 | 438: 'beaker',
440 | 439: 'bearskin, busby, shako',
441 | 440: 'beer bottle',
442 | 441: 'beer glass',
443 | 442: 'bell cote, bell cot',
444 | 443: 'bib',
445 | 444: 'bicycle-built-for-two, tandem bicycle, tandem',
446 | 445: 'bikini, two-piece',
447 | 446: 'binder, ring-binder',
448 | 447: 'binoculars, field glasses, opera glasses',
449 | 448: 'birdhouse',
450 | 449: 'boathouse',
451 | 450: 'bobsled, bobsleigh, bob',
452 | 451: 'bolo tie, bolo, bola tie, bola',
453 | 452: 'bonnet, poke bonnet',
454 | 453: 'bookcase',
455 | 454: 'bookshop, bookstore, bookstall',
456 | 455: 'bottlecap',
457 | 456: 'bow',
458 | 457: 'bow tie, bow-tie, bowtie',
459 | 458: 'brass, memorial tablet, plaque',
460 | 459: 'brassiere, bra, bandeau',
461 | 460: 'breakwater, groin, groyne, mole, bulwark, seawall, jetty',
462 | 461: 'breastplate, aegis, egis',
463 | 462: 'broom',
464 | 463: 'bucket, pail',
465 | 464: 'buckle',
466 | 465: 'bulletproof vest',
467 | 466: 'bullet train, bullet',
468 | 467: 'butcher shop, meat market',
469 | 468: 'cab, hack, taxi, taxicab',
470 | 469: 'caldron, cauldron',
471 | 470: 'candle, taper, wax light',
472 | 471: 'cannon',
473 | 472: 'canoe',
474 | 473: 'can opener, tin opener',
475 | 474: 'cardigan',
476 | 475: 'car mirror',
477 | 476: 'carousel, carrousel, merry-go-round, roundabout, whirligig',
478 | 477: "carpenter's kit, tool kit",
479 | 478: 'carton',
480 | 479: 'car wheel',
481 | 480: 'cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM',
482 | 481: 'cassette',
483 | 482: 'cassette player',
484 | 483: 'castle',
485 | 484: 'catamaran',
486 | 485: 'CD player',
487 | 486: 'cello, violoncello',
488 | 487: 'cellular telephone, cellular phone, cellphone, cell, mobile phone',
489 | 488: 'chain',
490 | 489: 'chainlink fence',
491 | 490: 'chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour',
492 | 491: 'chain saw, chainsaw',
493 | 492: 'chest',
494 | 493: 'chiffonier, commode',
495 | 494: 'chime, bell, gong',
496 | 495: 'china cabinet, china closet',
497 | 496: 'Christmas stocking',
498 | 497: 'church, church building',
499 | 498: 'cinema, movie theater, movie theatre, movie house, picture palace',
500 | 499: 'cleaver, meat cleaver, chopper',
501 | 500: 'cliff dwelling',
502 | 501: 'cloak',
503 | 502: 'clog, geta, patten, sabot',
504 | 503: 'cocktail shaker',
505 | 504: 'coffee mug',
506 | 505: 'coffeepot',
507 | 506: 'coil, spiral, volute, whorl, helix',
508 | 507: 'combination lock',
509 | 508: 'computer keyboard, keypad',
510 | 509: 'confectionery, confectionary, candy store',
511 | 510: 'container ship, containership, container vessel',
512 | 511: 'convertible',
513 | 512: 'corkscrew, bottle screw',
514 | 513: 'cornet, horn, trumpet, trump',
515 | 514: 'cowboy boot',
516 | 515: 'cowboy hat, ten-gallon hat',
517 | 516: 'cradle',
518 | 517: 'crane',
519 | 518: 'crash helmet',
520 | 519: 'crate',
521 | 520: 'crib, cot',
522 | 521: 'Crock Pot',
523 | 522: 'croquet ball',
524 | 523: 'crutch',
525 | 524: 'cuirass',
526 | 525: 'dam, dike, dyke',
527 | 526: 'desk',
528 | 527: 'desktop computer',
529 | 528: 'dial telephone, dial phone',
530 | 529: 'diaper, nappy, napkin',
531 | 530: 'digital clock',
532 | 531: 'digital watch',
533 | 532: 'dining table, board',
534 | 533: 'dishrag, dishcloth',
535 | 534: 'dishwasher, dish washer, dishwashing machine',
536 | 535: 'disk brake, disc brake',
537 | 536: 'dock, dockage, docking facility',
538 | 537: 'dogsled, dog sled, dog sleigh',
539 | 538: 'dome',
540 | 539: 'doormat, welcome mat',
541 | 540: 'drilling platform, offshore rig',
542 | 541: 'drum, membranophone, tympan',
543 | 542: 'drumstick',
544 | 543: 'dumbbell',
545 | 544: 'Dutch oven',
546 | 545: 'electric fan, blower',
547 | 546: 'electric guitar',
548 | 547: 'electric locomotive',
549 | 548: 'entertainment center',
550 | 549: 'envelope',
551 | 550: 'espresso maker',
552 | 551: 'face powder',
553 | 552: 'feather boa, boa',
554 | 553: 'file, file cabinet, filing cabinet',
555 | 554: 'fireboat',
556 | 555: 'fire engine, fire truck',
557 | 556: 'fire screen, fireguard',
558 | 557: 'flagpole, flagstaff',
559 | 558: 'flute, transverse flute',
560 | 559: 'folding chair',
561 | 560: 'football helmet',
562 | 561: 'forklift',
563 | 562: 'fountain',
564 | 563: 'fountain pen',
565 | 564: 'four-poster',
566 | 565: 'freight car',
567 | 566: 'French horn, horn',
568 | 567: 'frying pan, frypan, skillet',
569 | 568: 'fur coat',
570 | 569: 'garbage truck, dustcart',
571 | 570: 'gasmask, respirator, gas helmet',
572 | 571: 'gas pump, gasoline pump, petrol pump, island dispenser',
573 | 572: 'goblet',
574 | 573: 'go-kart',
575 | 574: 'golf ball',
576 | 575: 'golfcart, golf cart',
577 | 576: 'gondola',
578 | 577: 'gong, tam-tam',
579 | 578: 'gown',
580 | 579: 'grand piano, grand',
581 | 580: 'greenhouse, nursery, glasshouse',
582 | 581: 'grille, radiator grille',
583 | 582: 'grocery store, grocery, food market, market',
584 | 583: 'guillotine',
585 | 584: 'hair slide',
586 | 585: 'hair spray',
587 | 586: 'half track',
588 | 587: 'hammer',
589 | 588: 'hamper',
590 | 589: 'hand blower, blow dryer, blow drier, hair dryer, hair drier',
591 | 590: 'hand-held computer, hand-held microcomputer',
592 | 591: 'handkerchief, hankie, hanky, hankey',
593 | 592: 'hard disc, hard disk, fixed disk',
594 | 593: 'harmonica, mouth organ, harp, mouth harp',
595 | 594: 'harp',
596 | 595: 'harvester, reaper',
597 | 596: 'hatchet',
598 | 597: 'holster',
599 | 598: 'home theater, home theatre',
600 | 599: 'honeycomb',
601 | 600: 'hook, claw',
602 | 601: 'hoopskirt, crinoline',
603 | 602: 'horizontal bar, high bar',
604 | 603: 'horse cart, horse-cart',
605 | 604: 'hourglass',
606 | 605: 'iPod',
607 | 606: 'iron, smoothing iron',
608 | 607: "jack-o'-lantern",
609 | 608: 'jean, blue jean, denim',
610 | 609: 'jeep, landrover',
611 | 610: 'jersey, T-shirt, tee shirt',
612 | 611: 'jigsaw puzzle',
613 | 612: 'jinrikisha, ricksha, rickshaw',
614 | 613: 'joystick',
615 | 614: 'kimono',
616 | 615: 'knee pad',
617 | 616: 'knot',
618 | 617: 'lab coat, laboratory coat',
619 | 618: 'ladle',
620 | 619: 'lampshade, lamp shade',
621 | 620: 'laptop, laptop computer',
622 | 621: 'lawn mower, mower',
623 | 622: 'lens cap, lens cover',
624 | 623: 'letter opener, paper knife, paperknife',
625 | 624: 'library',
626 | 625: 'lifeboat',
627 | 626: 'lighter, light, igniter, ignitor',
628 | 627: 'limousine, limo',
629 | 628: 'liner, ocean liner',
630 | 629: 'lipstick, lip rouge',
631 | 630: 'Loafer',
632 | 631: 'lotion',
633 | 632: 'loudspeaker, speaker, speaker unit, loudspeaker system, speaker system',
634 | 633: "loupe, jeweler's loupe",
635 | 634: 'lumbermill, sawmill',
636 | 635: 'magnetic compass',
637 | 636: 'mailbag, postbag',
638 | 637: 'mailbox, letter box',
639 | 638: 'maillot',
640 | 639: 'maillot, tank suit',
641 | 640: 'manhole cover',
642 | 641: 'maraca',
643 | 642: 'marimba, xylophone',
644 | 643: 'mask',
645 | 644: 'matchstick',
646 | 645: 'maypole',
647 | 646: 'maze, labyrinth',
648 | 647: 'measuring cup',
649 | 648: 'medicine chest, medicine cabinet',
650 | 649: 'megalith, megalithic structure',
651 | 650: 'microphone, mike',
652 | 651: 'microwave, microwave oven',
653 | 652: 'military uniform',
654 | 653: 'milk can',
655 | 654: 'minibus',
656 | 655: 'miniskirt, mini',
657 | 656: 'minivan',
658 | 657: 'missile',
659 | 658: 'mitten',
660 | 659: 'mixing bowl',
661 | 660: 'mobile home, manufactured home',
662 | 661: 'Model T',
663 | 662: 'modem',
664 | 663: 'monastery',
665 | 664: 'monitor',
666 | 665: 'moped',
667 | 666: 'mortar',
668 | 667: 'mortarboard',
669 | 668: 'mosque',
670 | 669: 'mosquito net',
671 | 670: 'motor scooter, scooter',
672 | 671: 'mountain bike, all-terrain bike, off-roader',
673 | 672: 'mountain tent',
674 | 673: 'mouse, computer mouse',
675 | 674: 'mousetrap',
676 | 675: 'moving van',
677 | 676: 'muzzle',
678 | 677: 'nail',
679 | 678: 'neck brace',
680 | 679: 'necklace',
681 | 680: 'nipple',
682 | 681: 'notebook, notebook computer',
683 | 682: 'obelisk',
684 | 683: 'oboe, hautboy, hautbois',
685 | 684: 'ocarina, sweet potato',
686 | 685: 'odometer, hodometer, mileometer, milometer',
687 | 686: 'oil filter',
688 | 687: 'organ, pipe organ',
689 | 688: 'oscilloscope, scope, cathode-ray oscilloscope, CRO',
690 | 689: 'overskirt',
691 | 690: 'oxcart',
692 | 691: 'oxygen mask',
693 | 692: 'packet',
694 | 693: 'paddle, boat paddle',
695 | 694: 'paddlewheel, paddle wheel',
696 | 695: 'padlock',
697 | 696: 'paintbrush',
698 | 697: "pajama, pyjama, pj's, jammies",
699 | 698: 'palace',
700 | 699: 'panpipe, pandean pipe, syrinx',
701 | 700: 'paper towel',
702 | 701: 'parachute, chute',
703 | 702: 'parallel bars, bars',
704 | 703: 'park bench',
705 | 704: 'parking meter',
706 | 705: 'passenger car, coach, carriage',
707 | 706: 'patio, terrace',
708 | 707: 'pay-phone, pay-station',
709 | 708: 'pedestal, plinth, footstall',
710 | 709: 'pencil box, pencil case',
711 | 710: 'pencil sharpener',
712 | 711: 'perfume, essence',
713 | 712: 'Petri dish',
714 | 713: 'photocopier',
715 | 714: 'pick, plectrum, plectron',
716 | 715: 'pickelhaube',
717 | 716: 'picket fence, paling',
718 | 717: 'pickup, pickup truck',
719 | 718: 'pier',
720 | 719: 'piggy bank, penny bank',
721 | 720: 'pill bottle',
722 | 721: 'pillow',
723 | 722: 'ping-pong ball',
724 | 723: 'pinwheel',
725 | 724: 'pirate, pirate ship',
726 | 725: 'pitcher, ewer',
727 | 726: "plane, carpenter's plane, woodworking plane",
728 | 727: 'planetarium',
729 | 728: 'plastic bag',
730 | 729: 'plate rack',
731 | 730: 'plow, plough',
732 | 731: "plunger, plumber's helper",
733 | 732: 'Polaroid camera, Polaroid Land camera',
734 | 733: 'pole',
735 | 734: 'police van, police wagon, paddy wagon, patrol wagon, wagon, black Maria',
736 | 735: 'poncho',
737 | 736: 'pool table, billiard table, snooker table',
738 | 737: 'pop bottle, soda bottle',
739 | 738: 'pot, flowerpot',
740 | 739: "potter's wheel",
741 | 740: 'power drill',
742 | 741: 'prayer rug, prayer mat',
743 | 742: 'printer',
744 | 743: 'prison, prison house',
745 | 744: 'projectile, missile',
746 | 745: 'projector',
747 | 746: 'puck, hockey puck',
748 | 747: 'punching bag, punch bag, punching ball, punchball',
749 | 748: 'purse',
750 | 749: 'quill, quill pen',
751 | 750: 'quilt, comforter, comfort, puff',
752 | 751: 'racer, race car, racing car',
753 | 752: 'racket, racquet',
754 | 753: 'radiator',
755 | 754: 'radio, wireless',
756 | 755: 'radio telescope, radio reflector',
757 | 756: 'rain barrel',
758 | 757: 'recreational vehicle, RV, R.V.',
759 | 758: 'reel',
760 | 759: 'reflex camera',
761 | 760: 'refrigerator, icebox',
762 | 761: 'remote control, remote',
763 | 762: 'restaurant, eating house, eating place, eatery',
764 | 763: 'revolver, six-gun, six-shooter',
765 | 764: 'rifle',
766 | 765: 'rocking chair, rocker',
767 | 766: 'rotisserie',
768 | 767: 'rubber eraser, rubber, pencil eraser',
769 | 768: 'rugby ball',
770 | 769: 'rule, ruler',
771 | 770: 'running shoe',
772 | 771: 'safe',
773 | 772: 'safety pin',
774 | 773: 'saltshaker, salt shaker',
775 | 774: 'sandal',
776 | 775: 'sarong',
777 | 776: 'sax, saxophone',
778 | 777: 'scabbard',
779 | 778: 'scale, weighing machine',
780 | 779: 'school bus',
781 | 780: 'schooner',
782 | 781: 'scoreboard',
783 | 782: 'screen, CRT screen',
784 | 783: 'screw',
785 | 784: 'screwdriver',
786 | 785: 'seat belt, seatbelt',
787 | 786: 'sewing machine',
788 | 787: 'shield, buckler',
789 | 788: 'shoe shop, shoe-shop, shoe store',
790 | 789: 'shoji',
791 | 790: 'shopping basket',
792 | 791: 'shopping cart',
793 | 792: 'shovel',
794 | 793: 'shower cap',
795 | 794: 'shower curtain',
796 | 795: 'ski',
797 | 796: 'ski mask',
798 | 797: 'sleeping bag',
799 | 798: 'slide rule, slipstick',
800 | 799: 'sliding door',
801 | 800: 'slot, one-armed bandit',
802 | 801: 'snorkel',
803 | 802: 'snowmobile',
804 | 803: 'snowplow, snowplough',
805 | 804: 'soap dispenser',
806 | 805: 'soccer ball',
807 | 806: 'sock',
808 | 807: 'solar dish, solar collector, solar furnace',
809 | 808: 'sombrero',
810 | 809: 'soup bowl',
811 | 810: 'space bar',
812 | 811: 'space heater',
813 | 812: 'space shuttle',
814 | 813: 'spatula',
815 | 814: 'speedboat',
816 | 815: "spider web, spider's web",
817 | 816: 'spindle',
818 | 817: 'sports car, sport car',
819 | 818: 'spotlight, spot',
820 | 819: 'stage',
821 | 820: 'steam locomotive',
822 | 821: 'steel arch bridge',
823 | 822: 'steel drum',
824 | 823: 'stethoscope',
825 | 824: 'stole',
826 | 825: 'stone wall',
827 | 826: 'stopwatch, stop watch',
828 | 827: 'stove',
829 | 828: 'strainer',
830 | 829: 'streetcar, tram, tramcar, trolley, trolley car',
831 | 830: 'stretcher',
832 | 831: 'studio couch, day bed',
833 | 832: 'stupa, tope',
834 | 833: 'submarine, pigboat, sub, U-boat',
835 | 834: 'suit, suit of clothes',
836 | 835: 'sundial',
837 | 836: 'sunglass',
838 | 837: 'sunglasses, dark glasses, shades',
839 | 838: 'sunscreen, sunblock, sun blocker',
840 | 839: 'suspension bridge',
841 | 840: 'swab, swob, mop',
842 | 841: 'sweatshirt',
843 | 842: 'swimming trunks, bathing trunks',
844 | 843: 'swing',
845 | 844: 'switch, electric switch, electrical switch',
846 | 845: 'syringe',
847 | 846: 'table lamp',
848 | 847: 'tank, army tank, armored combat vehicle, armoured combat vehicle',
849 | 848: 'tape player',
850 | 849: 'teapot',
851 | 850: 'teddy, teddy bear',
852 | 851: 'television, television system',
853 | 852: 'tennis ball',
854 | 853: 'thatch, thatched roof',
855 | 854: 'theater curtain, theatre curtain',
856 | 855: 'thimble',
857 | 856: 'thresher, thrasher, threshing machine',
858 | 857: 'throne',
859 | 858: 'tile roof',
860 | 859: 'toaster',
861 | 860: 'tobacco shop, tobacconist shop, tobacconist',
862 | 861: 'toilet seat',
863 | 862: 'torch',
864 | 863: 'totem pole',
865 | 864: 'tow truck, tow car, wrecker',
866 | 865: 'toyshop',
867 | 866: 'tractor',
868 | 867: 'trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi',
869 | 868: 'tray',
870 | 869: 'trench coat',
871 | 870: 'tricycle, trike, velocipede',
872 | 871: 'trimaran',
873 | 872: 'tripod',
874 | 873: 'triumphal arch',
875 | 874: 'trolleybus, trolley coach, trackless trolley',
876 | 875: 'trombone',
877 | 876: 'tub, vat',
878 | 877: 'turnstile',
879 | 878: 'typewriter keyboard',
880 | 879: 'umbrella',
881 | 880: 'unicycle, monocycle',
882 | 881: 'upright, upright piano',
883 | 882: 'vacuum, vacuum cleaner',
884 | 883: 'vase',
885 | 884: 'vault',
886 | 885: 'velvet',
887 | 886: 'vending machine',
888 | 887: 'vestment',
889 | 888: 'viaduct',
890 | 889: 'violin, fiddle',
891 | 890: 'volleyball',
892 | 891: 'waffle iron',
893 | 892: 'wall clock',
894 | 893: 'wallet, billfold, notecase, pocketbook',
895 | 894: 'wardrobe, closet, press',
896 | 895: 'warplane, military plane',
897 | 896: 'washbasin, handbasin, washbowl, lavabo, wash-hand basin',
898 | 897: 'washer, automatic washer, washing machine',
899 | 898: 'water bottle',
900 | 899: 'water jug',
901 | 900: 'water tower',
902 | 901: 'whiskey jug',
903 | 902: 'whistle',
904 | 903: 'wig',
905 | 904: 'window screen',
906 | 905: 'window shade',
907 | 906: 'Windsor tie',
908 | 907: 'wine bottle',
909 | 908: 'wing',
910 | 909: 'wok',
911 | 910: 'wooden spoon',
912 | 911: 'wool, woolen, woollen',
913 | 912: 'worm fence, snake fence, snake-rail fence, Virginia fence',
914 | 913: 'wreck',
915 | 914: 'yawl',
916 | 915: 'yurt',
917 | 916: 'web site, website, internet site, site',
918 | 917: 'comic book',
919 | 918: 'crossword puzzle, crossword',
920 | 919: 'street sign',
921 | 920: 'traffic light, traffic signal, stoplight',
922 | 921: 'book jacket, dust cover, dust jacket, dust wrapper',
923 | 922: 'menu',
924 | 923: 'plate',
925 | 924: 'guacamole',
926 | 925: 'consomme',
927 | 926: 'hot pot, hotpot',
928 | 927: 'trifle',
929 | 928: 'ice cream, icecream',
930 | 929: 'ice lolly, lolly, lollipop, popsicle',
931 | 930: 'French loaf',
932 | 931: 'bagel, beigel',
933 | 932: 'pretzel',
934 | 933: 'cheeseburger',
935 | 934: 'hotdog, hot dog, red hot',
936 | 935: 'mashed potato',
937 | 936: 'head cabbage',
938 | 937: 'broccoli',
939 | 938: 'cauliflower',
940 | 939: 'zucchini, courgette',
941 | 940: 'spaghetti squash',
942 | 941: 'acorn squash',
943 | 942: 'butternut squash',
944 | 943: 'cucumber, cuke',
945 | 944: 'artichoke, globe artichoke',
946 | 945: 'bell pepper',
947 | 946: 'cardoon',
948 | 947: 'mushroom',
949 | 948: 'Granny Smith',
950 | 949: 'strawberry',
951 | 950: 'orange',
952 | 951: 'lemon',
953 | 952: 'fig',
954 | 953: 'pineapple, ananas',
955 | 954: 'banana',
956 | 955: 'jackfruit, jak, jack',
957 | 956: 'custard apple',
958 | 957: 'pomegranate',
959 | 958: 'hay',
960 | 959: 'carbonara',
961 | 960: 'chocolate sauce, chocolate syrup',
962 | 961: 'dough',
963 | 962: 'meat loaf, meatloaf',
964 | 963: 'pizza, pizza pie',
965 | 964: 'potpie',
966 | 965: 'burrito',
967 | 966: 'red wine',
968 | 967: 'espresso',
969 | 968: 'cup',
970 | 969: 'eggnog',
971 | 970: 'alp',
972 | 971: 'bubble',
973 | 972: 'cliff, drop, drop-off',
974 | 973: 'coral reef',
975 | 974: 'geyser',
976 | 975: 'lakeside, lakeshore',
977 | 976: 'promontory, headland, head, foreland',
978 | 977: 'sandbar, sand bar',
979 | 978: 'seashore, coast, seacoast, sea-coast',
980 | 979: 'valley, vale',
981 | 980: 'volcano',
982 | 981: 'ballplayer, baseball player',
983 | 982: 'groom, bridegroom',
984 | 983: 'scuba diver',
985 | 984: 'rapeseed',
986 | 985: 'daisy',
987 | 986: "yellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum",
988 | 987: 'corn',
989 | 988: 'acorn',
990 | 989: 'hip, rose hip, rosehip',
991 | 990: 'buckeye, horse chestnut, conker',
992 | 991: 'coral fungus',
993 | 992: 'agaric',
994 | 993: 'gyromitra',
995 | 994: 'stinkhorn, carrion fungus',
996 | 995: 'earthstar',
997 | 996: 'hen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa',
998 | 997: 'bolete',
999 | 998: 'ear, spike, capitulum',
1000 | 999: 'toilet tissue, toilet paper, bathroom tissue'}
--------------------------------------------------------------------------------
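The class-index file above (`files/imagenet1000_clsidx_to_labels.txt`) is a plain Python dict literal, so it can be loaded with `ast.literal_eval`. A minimal sketch, using a small inline excerpt in the same format rather than the real file path:

```python
import ast

# A small excerpt in the same {index: label} dict-literal format as
# files/imagenet1000_clsidx_to_labels.txt; the full file can be loaded
# the same way with ast.literal_eval(open(path).read()).
labels_text = "{948: 'Granny Smith', 949: 'strawberry', 950: 'orange'}"
idx_to_label = ast.literal_eval(labels_text)

# Map a classifier's argmax index to a human-readable label.
top_idx = 949  # e.g., int(np.argmax(logits)) from an ImageNet classifier
print(idx_to_label[top_idx])  # strawberry
```

This is how the notebooks can turn a model's top-1 class index into the label strings listed above.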
/files/init.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 |
3 | set -e
4 |
5 | git clone https://github.com/tensorflow/models.git
6 |
7 | source activate tensorflow_p36
8 |
9 | cd /home/ec2-user/SageMaker/tfs-workshop/files/models/research
10 | protoc object_detection/protos/*.proto --python_out=.
11 |
12 | python setup.py build
13 | python setup.py install
14 |
15 | export PYTHONPATH=/home/ec2-user/SageMaker/tfs-workshop/files/models/research
16 |
17 | source deactivate
--------------------------------------------------------------------------------
/files/mscoco_label_map.pbtxt:
--------------------------------------------------------------------------------
1 | item {
2 | name: "/m/01g317"
3 | id: 1
4 | display_name: "person"
5 | }
6 | item {
7 | name: "/m/0199g"
8 | id: 2
9 | display_name: "bicycle"
10 | }
11 | item {
12 | name: "/m/0k4j"
13 | id: 3
14 | display_name: "car"
15 | }
16 | item {
17 | name: "/m/04_sv"
18 | id: 4
19 | display_name: "motorcycle"
20 | }
21 | item {
22 | name: "/m/05czz6l"
23 | id: 5
24 | display_name: "airplane"
25 | }
26 | item {
27 | name: "/m/01bjv"
28 | id: 6
29 | display_name: "bus"
30 | }
31 | item {
32 | name: "/m/07jdr"
33 | id: 7
34 | display_name: "train"
35 | }
36 | item {
37 | name: "/m/07r04"
38 | id: 8
39 | display_name: "truck"
40 | }
41 | item {
42 | name: "/m/019jd"
43 | id: 9
44 | display_name: "boat"
45 | }
46 | item {
47 | name: "/m/015qff"
48 | id: 10
49 | display_name: "traffic light"
50 | }
51 | item {
52 | name: "/m/01pns0"
53 | id: 11
54 | display_name: "fire hydrant"
55 | }
56 | item {
57 | name: "/m/02pv19"
58 | id: 13
59 | display_name: "stop sign"
60 | }
61 | item {
62 | name: "/m/015qbp"
63 | id: 14
64 | display_name: "parking meter"
65 | }
66 | item {
67 | name: "/m/0cvnqh"
68 | id: 15
69 | display_name: "bench"
70 | }
71 | item {
72 | name: "/m/015p6"
73 | id: 16
74 | display_name: "bird"
75 | }
76 | item {
77 | name: "/m/01yrx"
78 | id: 17
79 | display_name: "cat"
80 | }
81 | item {
82 | name: "/m/0bt9lr"
83 | id: 18
84 | display_name: "dog"
85 | }
86 | item {
87 | name: "/m/03k3r"
88 | id: 19
89 | display_name: "horse"
90 | }
91 | item {
92 | name: "/m/07bgp"
93 | id: 20
94 | display_name: "sheep"
95 | }
96 | item {
97 | name: "/m/01xq0k1"
98 | id: 21
99 | display_name: "cow"
100 | }
101 | item {
102 | name: "/m/0bwd_0j"
103 | id: 22
104 | display_name: "elephant"
105 | }
106 | item {
107 | name: "/m/01dws"
108 | id: 23
109 | display_name: "bear"
110 | }
111 | item {
112 | name: "/m/0898b"
113 | id: 24
114 | display_name: "zebra"
115 | }
116 | item {
117 | name: "/m/03bk1"
118 | id: 25
119 | display_name: "giraffe"
120 | }
121 | item {
122 | name: "/m/01940j"
123 | id: 27
124 | display_name: "backpack"
125 | }
126 | item {
127 | name: "/m/0hnnb"
128 | id: 28
129 | display_name: "umbrella"
130 | }
131 | item {
132 | name: "/m/080hkjn"
133 | id: 31
134 | display_name: "handbag"
135 | }
136 | item {
137 | name: "/m/01rkbr"
138 | id: 32
139 | display_name: "tie"
140 | }
141 | item {
142 | name: "/m/01s55n"
143 | id: 33
144 | display_name: "suitcase"
145 | }
146 | item {
147 | name: "/m/02wmf"
148 | id: 34
149 | display_name: "frisbee"
150 | }
151 | item {
152 | name: "/m/071p9"
153 | id: 35
154 | display_name: "skis"
155 | }
156 | item {
157 | name: "/m/06__v"
158 | id: 36
159 | display_name: "snowboard"
160 | }
161 | item {
162 | name: "/m/018xm"
163 | id: 37
164 | display_name: "sports ball"
165 | }
166 | item {
167 | name: "/m/02zt3"
168 | id: 38
169 | display_name: "kite"
170 | }
171 | item {
172 | name: "/m/03g8mr"
173 | id: 39
174 | display_name: "baseball bat"
175 | }
176 | item {
177 | name: "/m/03grzl"
178 | id: 40
179 | display_name: "baseball glove"
180 | }
181 | item {
182 | name: "/m/06_fw"
183 | id: 41
184 | display_name: "skateboard"
185 | }
186 | item {
187 | name: "/m/019w40"
188 | id: 42
189 | display_name: "surfboard"
190 | }
191 | item {
192 | name: "/m/0dv9c"
193 | id: 43
194 | display_name: "tennis racket"
195 | }
196 | item {
197 | name: "/m/04dr76w"
198 | id: 44
199 | display_name: "bottle"
200 | }
201 | item {
202 | name: "/m/09tvcd"
203 | id: 46
204 | display_name: "wine glass"
205 | }
206 | item {
207 | name: "/m/08gqpm"
208 | id: 47
209 | display_name: "cup"
210 | }
211 | item {
212 | name: "/m/0dt3t"
213 | id: 48
214 | display_name: "fork"
215 | }
216 | item {
217 | name: "/m/04ctx"
218 | id: 49
219 | display_name: "knife"
220 | }
221 | item {
222 | name: "/m/0cmx8"
223 | id: 50
224 | display_name: "spoon"
225 | }
226 | item {
227 | name: "/m/04kkgm"
228 | id: 51
229 | display_name: "bowl"
230 | }
231 | item {
232 | name: "/m/09qck"
233 | id: 52
234 | display_name: "banana"
235 | }
236 | item {
237 | name: "/m/014j1m"
238 | id: 53
239 | display_name: "apple"
240 | }
241 | item {
242 | name: "/m/0l515"
243 | id: 54
244 | display_name: "sandwich"
245 | }
246 | item {
247 | name: "/m/0cyhj_"
248 | id: 55
249 | display_name: "orange"
250 | }
251 | item {
252 | name: "/m/0hkxq"
253 | id: 56
254 | display_name: "broccoli"
255 | }
256 | item {
257 | name: "/m/0fj52s"
258 | id: 57
259 | display_name: "carrot"
260 | }
261 | item {
262 | name: "/m/01b9xk"
263 | id: 58
264 | display_name: "hot dog"
265 | }
266 | item {
267 | name: "/m/0663v"
268 | id: 59
269 | display_name: "pizza"
270 | }
271 | item {
272 | name: "/m/0jy4k"
273 | id: 60
274 | display_name: "donut"
275 | }
276 | item {
277 | name: "/m/0fszt"
278 | id: 61
279 | display_name: "cake"
280 | }
281 | item {
282 | name: "/m/01mzpv"
283 | id: 62
284 | display_name: "chair"
285 | }
286 | item {
287 | name: "/m/02crq1"
288 | id: 63
289 | display_name: "couch"
290 | }
291 | item {
292 | name: "/m/03fp41"
293 | id: 64
294 | display_name: "potted plant"
295 | }
296 | item {
297 | name: "/m/03ssj5"
298 | id: 65
299 | display_name: "bed"
300 | }
301 | item {
302 | name: "/m/04bcr3"
303 | id: 67
304 | display_name: "dining table"
305 | }
306 | item {
307 | name: "/m/09g1w"
308 | id: 70
309 | display_name: "toilet"
310 | }
311 | item {
312 | name: "/m/07c52"
313 | id: 72
314 | display_name: "tv"
315 | }
316 | item {
317 | name: "/m/01c648"
318 | id: 73
319 | display_name: "laptop"
320 | }
321 | item {
322 | name: "/m/020lf"
323 | id: 74
324 | display_name: "mouse"
325 | }
326 | item {
327 | name: "/m/0qjjc"
328 | id: 75
329 | display_name: "remote"
330 | }
331 | item {
332 | name: "/m/01m2v"
333 | id: 76
334 | display_name: "keyboard"
335 | }
336 | item {
337 | name: "/m/050k8"
338 | id: 77
339 | display_name: "cell phone"
340 | }
341 | item {
342 | name: "/m/0fx9l"
343 | id: 78
344 | display_name: "microwave"
345 | }
346 | item {
347 | name: "/m/029bxz"
348 | id: 79
349 | display_name: "oven"
350 | }
351 | item {
352 | name: "/m/01k6s3"
353 | id: 80
354 | display_name: "toaster"
355 | }
356 | item {
357 | name: "/m/0130jx"
358 | id: 81
359 | display_name: "sink"
360 | }
361 | item {
362 | name: "/m/040b_t"
363 | id: 82
364 | display_name: "refrigerator"
365 | }
366 | item {
367 | name: "/m/0bt_c3"
368 | id: 84
369 | display_name: "book"
370 | }
371 | item {
372 | name: "/m/01x3z"
373 | id: 85
374 | display_name: "clock"
375 | }
376 | item {
377 | name: "/m/02s195"
378 | id: 86
379 | display_name: "vase"
380 | }
381 | item {
382 | name: "/m/01lsmm"
383 | id: 87
384 | display_name: "scissors"
385 | }
386 | item {
387 | name: "/m/0kmg4"
388 | id: 88
389 | display_name: "teddy bear"
390 | }
391 | item {
392 | name: "/m/03wvsk"
393 | id: 89
394 | display_name: "hair drier"
395 | }
396 | item {
397 | name: "/m/012xff"
398 | id: 90
399 | display_name: "toothbrush"
400 | }
--------------------------------------------------------------------------------
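The TensorFlow object detection API installed by `init.sh` ships a proper parser for label maps like the one above (`label_map_util`), but the format is simple enough that a minimal regex sketch can build an `{id: display_name}` lookup without it. The fragment below is a hypothetical two-item excerpt in the same format as `files/mscoco_label_map.pbtxt`:

```python
import re

# A two-item fragment in the same format as files/mscoco_label_map.pbtxt.
pbtxt = '''
item {
  name: "/m/01g317"
  id: 1
  display_name: "person"
}
item {
  name: "/m/0199g"
  id: 2
  display_name: "bicycle"
}
'''

# Minimal regex parse: pair each id with the display_name that follows it.
pattern = re.compile(r'id:\s*(\d+)\s*display_name:\s*"([^"]+)"')
id_to_name = {int(i): name for i, name in pattern.findall(pbtxt)}
print(id_to_name[1])  # person
```

Note that COCO class ids are not contiguous (e.g., id 12 is absent above), so a dict lookup like this is safer than indexing into a list.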
/get_started.md:
--------------------------------------------------------------------------------
1 | # Prerequisites
2 |
3 | ## Creating an S3 Bucket
4 |
5 | SageMaker uses S3 as its data and model store. Here we create an S3 bucket for that purpose. This lab uses the `Seoul (ap-northeast-2)` region.
6 |
7 | 1. Sign in to the [AWS Management Console](https://console.aws.amazon.com/).
8 | If AWS provisioned a temporary account for you via Event Engine, open the URL you received and enter your team hash code.
9 | For the Event Engine access guide, see https://bit.ly/workshop-guide-sagemaker.
10 |
11 | 1. Navigate to S3 in the AWS Services list.
12 | 1. Choose the `"+ Create Bucket"` button.
13 | 1. Configure the settings below, then click Create at the bottom left of the screen.
14 |
15 | * Bucket name: sagemaker-hol-{userid} [must be globally unique]
16 | * Region : Asia Pacific (Seoul)
17 | 
18 |
19 | ## Creating a Notebook Instance
20 |
21 | 1. To create a new notebook instance, select Notebook Instances in the left panel menu, then click the `Create notebook instance` button at the top right.
22 |
23 | 
24 |
25 | 1. Enter `sagemaker-inference-hol-[YOUR-NAME]` as the notebook instance name, then select the `ml.t2.medium` instance type.
26 |
27 | 
28 |
29 | 1. For the IAM role, choose `Create a new role`. In the pop-up window, select `Specific S3 Bucket` under `S3 buckets you specify – optional`, and enter the name of the S3 bucket you created above (e.g., sagemaker-xxxxx) in the text field. Then click `Create role`.
30 |
31 | 
32 |
33 | 1. Back on the Create notebook instance page, click `Create notebook instance`.
34 |
35 | ## Accessing the Notebook Instance
36 |
37 | 1. Wait until the instance status changes to `InService`. This usually takes about 5 minutes.
38 | 1. Click `Open Jupyter` to open the Jupyter home page of the notebook instance you just created.
39 |
40 | 
41 |
42 | 1. To clone the files needed for this hands-on lab from the GitHub repository, click the New button at the top right of the Jupyter home page, then click Terminal.
43 |
44 | 
45 |
46 | 1. When the black Terminal screen appears, enter the commands below in order. The time required varies with network speed, but it takes about 10 seconds on average.
47 |
48 | ```
49 | $ cd SageMaker
50 | $ git clone https://github.com/daekeun-ml/tfs-workshop.git
51 | ```
52 | 
53 |
54 | 1. Once the GitHub code has been cloned, install the Google object detection API with the commands below. The API is not required for TensorFlow Serving itself, but it is useful for visualizing the inference results. This step also varies with network speed, but takes about 2 minutes.
55 |
56 | ```
57 | $ cd tfs-workshop/files
58 | $ bash init.sh
59 | ```
60 | 
61 |
62 | 1. Verify that the message `Finished processing dependencies for object-detection==0.1` was printed, as in the screenshot below.
63 |
64 | 
65 |
66 | 1. Return to the Jupyter Notebook home page, verify that the `tfs-workshop` folder was created, and click into it.
67 |
68 | 
69 |
70 | 1. Verify that the files in the screenshot below were created. The `.ipynb` files are the code you will run in the next modules.
71 |
72 | 
73 |
74 | Congratulations, the setup is complete.
75 | To host a model pre-trained with TensorFlow on an Endpoint, run `tensorflow-serving-endpoint.ipynb`;
76 | to host a model pre-trained with MXNet (GluonCV) on an Endpoint, run `mxnet-serving-endpoint.ipynb`.
77 |
--------------------------------------------------------------------------------
/images/doc/001.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/doc/001.png
--------------------------------------------------------------------------------
/images/doc/002.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/doc/002.png
--------------------------------------------------------------------------------
/images/doc/003.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/doc/003.png
--------------------------------------------------------------------------------
/images/doc/004.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/doc/004.png
--------------------------------------------------------------------------------
/images/doc/005.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/doc/005.png
--------------------------------------------------------------------------------
/images/doc/006.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/doc/006.png
--------------------------------------------------------------------------------
/images/doc/007.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/doc/007.png
--------------------------------------------------------------------------------
/images/doc/008.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/doc/008.png
--------------------------------------------------------------------------------
/images/doc/009.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/doc/009.png
--------------------------------------------------------------------------------
/images/doc/010.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/doc/010.png
--------------------------------------------------------------------------------
/images/doc/011.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/doc/011.png
--------------------------------------------------------------------------------
/images/doc/012.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/doc/012.png
--------------------------------------------------------------------------------
/images/doc/013.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/doc/013.png
--------------------------------------------------------------------------------
/images/imagenet_test/n02129604_7580_tiger.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/imagenet_test/n02129604_7580_tiger.jpg
--------------------------------------------------------------------------------
/images/imagenet_test/n03584254_2594_iPod.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/imagenet_test/n03584254_2594_iPod.jpg
--------------------------------------------------------------------------------
/images/imagenet_test/n04517823_9843_vacuum.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/imagenet_test/n04517823_9843_vacuum.jpg
--------------------------------------------------------------------------------
/images/imagenet_test/n07745940_2123_strawberry.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/imagenet_test/n07745940_2123_strawberry.jpg
--------------------------------------------------------------------------------
/images/test/000000029984.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/test/000000029984.jpg
--------------------------------------------------------------------------------
/images/test/000000059044.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/test/000000059044.jpg
--------------------------------------------------------------------------------
/images/test/000000119088.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/test/000000119088.jpg
--------------------------------------------------------------------------------
/images/test/000000119233.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/test/000000119233.jpg
--------------------------------------------------------------------------------
/images/test/000000133819.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/test/000000133819.jpg
--------------------------------------------------------------------------------
/images/test/000000133969.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/daekeun-ml/tfs-workshop/56951eb66d7400f992f7da8591d319c44e3e223f/images/test/000000133969.jpg
--------------------------------------------------------------------------------
/pytorch-serving-endpoint.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# Hosting a Pre-trained Model on a SageMaker Endpoint for Object Detection (PyTorch)\n",
8 | "---\n",
9 | "\n",
10 | "Do you have to train a model on Amazon SageMaker before you can run inference there? No, you don't.<br>\n",
11 | "If you only want to run inference on SageMaker, you can deploy a model trained on-premises, or a pre-trained model from a public model zoo, directly to a SageMaker Endpoint without building a Docker image. All you need to write is the inference entrypoint.\n",
12 | "\n",
13 | "In this notebook, we deploy a pre-trained `faster_rcnn` model to a SageMaker endpoint using the PyTorch API, then run object detection. \n",
14 | "\n",
15 | "## Pre-requisites\n",
16 | "\n",
17 | "- Basic usage: [PyTorch](https://pytorch.org/tutorials/)\n",
18 | "- AWS services: [AWS S3](https://docs.aws.amazon.com/s3/index.html), [Amazon SageMaker](https://aws.amazon.com/sagemaker/)"
19 | ]
20 | },
21 | {
22 | "cell_type": "code",
23 | "execution_count": null,
24 | "metadata": {},
25 | "outputs": [],
26 | "source": [
27 | "%load_ext autoreload\n",
28 | "%autoreload 2"
29 | ]
30 | },
31 | {
32 | "cell_type": "markdown",
33 | "metadata": {},
34 | "source": [
35 | "Upgrade the SageMaker SDK to the latest version. This notebook must be run with SDK version 2.x or later."
36 | ]
37 | },
38 | {
39 | "cell_type": "code",
40 | "execution_count": null,
41 | "metadata": {},
42 | "outputs": [],
43 | "source": [
44 | "import sys, sagemaker, boto3\n",
45 | "!{sys.executable} -m pip install -qU \"sagemaker>=2.11.0\"\n",
46 | "print(sagemaker.__version__)"
47 | ]
48 | },
49 | {
50 | "cell_type": "markdown",
51 | "metadata": {},
52 | "source": [
53 | "<br>\n",
54 | "\n",
55 | "# 1. Inference script\n",
56 | "---\n",
57 | "\n",
58 | "The code cell below saves the SageMaker inference script to the `src` directory.<br>\n",
59 | "\n",
60 | "This script uses the interface of the SageMaker inference toolkit, a high-level toolkit that makes it easy to deploy MMS (Multi Model Server), the inference server optimized for MXNet, or TorchServe, the inference server optimized for PyTorch, on SageMaker; you only need to implement the handler functions defined in the interface. For the MXNet and PyTorch entrypoint interface, choose one of the two options below. This example demonstrates Option 2.\n",
61 | "\n",
62 | "\n",
63 | "### Option 1.\n",
64 | "- `model_fn(model_dir)`: Defines the deep learning network architecture and loads the model artifacts stored in model_dir on S3.\n",
65 | "- `input_fn(request_body, content_type)`: Preprocesses the input data (e.g., converts the bytearray sent in request_body to a PIL.Image, then applies cropping, resizing, normalization, etc.). content_type supports various input formats (e.g., application/x-npy, application/json, application/csv).\n",
66 | "- `predict_fn(input_object, model)`: Runs inference on the data received from input_fn. \n",
67 | "- `output_fn(prediction, accept_type)`: Applies additional transformations to the inference result from predict_fn and returns it to the front end. \n",
68 | "\n",
69 | "### Option 2. \n",
70 | "- `model_fn(model_dir)`: Defines the deep learning network architecture and loads the model artifacts stored in model_dir on S3.\n",
71 | "- `transform_fn(model, request_body, content_type, accept_type)`: Combines input_fn(), predict_fn(), and output_fn() into a single transform_fn()."
72 | ]
73 | },
74 | {
75 | "cell_type": "code",
76 | "execution_count": null,
77 | "metadata": {},
78 | "outputs": [],
79 | "source": [
80 | "%%writefile src/inference_pytorch.py\n",
81 | "\n",
82 | "# Built-Ins\n",
83 | "import io, os, sys\n",
84 | "import json\n",
85 | "import subprocess, time\n",
86 | "\n",
87 | "import numpy as np\n",
88 | "from base64 import b64decode\n",
89 | "from PIL import Image\n",
90 | "\n",
91 | "import torch\n",
92 | "import torchvision\n",
93 | "from torchvision import datasets, transforms, models\n",
94 | "from torchvision.models.detection import FasterRCNN\n",
95 | "import torchvision.transforms as transforms\n",
96 | "device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')\n",
97 | " \n",
98 | " \n",
99 | "def model_fn(model_dir=None):\n",
100 | " '''\n",
101 | " Loads the model into memory from storage and return the model.\n",
102 | " '''\n",
103 | " model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)\n",
104 | " # load the model onto the computation device\n",
105 | " model = model.eval().to(device) \n",
106 | " return model\n",
107 | "\n",
108 | "\n",
109 | "def transform_fn(model, request_body, content_type='application/x-image', accept_type=None):\n",
110 | " '''\n",
111 | " Deserialize the request body and predicts on the deserialized object with the model from model_fn()\n",
112 | " '''\n",
113 | " if content_type == 'application/x-image': \n",
114 | " img = np.array(Image.open(io.BytesIO(request_body)))\n",
115 | " elif content_type == 'application/x-npy': \n",
116 | " img = np.frombuffer(request_body, dtype='uint8').reshape(137, 236) \n",
117 | " else:\n",
118 | " raise ValueError(\n",
119 | " 'Requested unsupported ContentType in content_type : ' + content_type)\n",
120 | "\n",
121 | " t0 = time.time()\n",
122 | " \n",
123 | " test_transforms = transforms.Compose([\n",
124 | " transforms.ToTensor()\n",
125 | " ])\n",
126 | " img_tensor = test_transforms(img).to(device)\n",
127 | " img_tensor = img_tensor.unsqueeze(0)\n",
128 | " \n",
129 | " with torch.no_grad(): \n",
130 | " result = model(img_tensor)\n",
131 | "\n",
132 | " t1 = time.time() - t0\n",
133 | " print(\"--- Elapsed time: %s secs ---\" % t1)\n",
134 | " \n",
135 | " scores = result[0]['scores'].detach().cpu().numpy()\n",
136 | " bboxes = result[0]['boxes'].detach().cpu().numpy()\n",
137 | " cids = result[0]['labels'].detach().cpu().numpy() \n",
138 | " \n",
139 | " outputs = json.dumps({'score': scores.tolist(), \n",
140 | " 'bbox': bboxes.tolist(),\n",
141 | " 'cid': cids.tolist()})\n",
142 | " \n",
143 | " return outputs\n"
144 | ]
145 | },
146 | {
147 | "cell_type": "markdown",
148 | "metadata": {},
149 | "source": [
150 | "Define the utility functions needed for object detection."
151 | ]
152 | },
153 | {
154 | "cell_type": "code",
155 | "execution_count": null,
156 | "metadata": {},
157 | "outputs": [],
158 | "source": [
159 | "%%writefile src/utils.py\n",
160 | "\n",
161 | "def get_label_map(label_file):\n",
162 | " label_map = {}\n",
163 | " labels = open(label_file, 'r')\n",
164 | " \n",
165 | " for line in labels:\n",
166 | " line = line.rstrip(\"\\n\")\n",
167 | " ids = line.split(',')\n",
168 | " label_map[int(ids[0])] = ids[2] \n",
169 | " \n",
170 | " return label_map\n",
171 | "\n",
172 | "\n",
173 | "def get_label_map_imagenet(label_file):\n",
174 | " label_map = {}\n",
175 | " with open(label_file, 'r') as f:\n",
176 | " for line in f:\n",
177 | " key, val = line.strip().split(':')\n",
178 | " label_map[key] = val.replace(',', '')\n",
179 | " return label_map\n",
180 | "\n",
181 | "\n",
182 | "def delete_endpoint(client, endpoint_name):\n",
183 | " response = client.describe_endpoint_config(EndpointConfigName=endpoint_name)\n",
184 | " model_name = response['ProductionVariants'][0]['ModelName']\n",
185 | "\n",
186 | " client.delete_model(ModelName=model_name) \n",
187 | " client.delete_endpoint(EndpointName=endpoint_name)\n",
188 | " client.delete_endpoint_config(EndpointConfigName=endpoint_name) \n",
189 | " \n",
190 | " print(f'--- Deleted model: {model_name}')\n",
191 | " print(f'--- Deleted endpoint: {endpoint_name}')\n",
192 | " print(f'--- Deleted endpoint_config: {endpoint_name}') \n",
193 | " \n",
194 | " \n",
195 | "def plot_bbox(img_resized, bboxes, scores, cids, class_info, framework='pytorch', threshold=0.5):\n",
196 | "\n",
197 | " import numpy as np\n",
198 | " import random\n",
199 | " import matplotlib.patches as patches\n",
200 | " import matplotlib.pyplot as plt\n",
201 | " \n",
202 | " if framework=='mxnet':\n",
203 | " img_np = img_resized.asnumpy()\n",
204 | " scores = scores.asnumpy()\n",
205 | " bboxes = bboxes.asnumpy()\n",
206 | " cids = cids.asnumpy()\n",
207 | " else:\n",
208 | " img_np = img_resized\n",
209 | " scores = np.array(scores)\n",
210 | " bboxes = np.array(bboxes)\n",
211 | " cids = np.array(cids) \n",
212 | "\n",
213 | " # Get only results that are above the threshold. Default threshold is 0.5. \n",
214 | " scores = scores[scores > threshold]\n",
215 | " num_detections = len(scores)\n",
216 | " bboxes = bboxes[:num_detections, :]\n",
217 | " cids = cids[:num_detections].astype('int').squeeze()\n",
218 | "\n",
219 | " # Get bounding-box colors\n",
220 | " cmap = plt.get_cmap('tab20b')\n",
221 | " colors = [cmap(i) for i in np.linspace(0, 1, 20)]\n",
222 | " random.seed(42)\n",
223 | " random.shuffle(colors)\n",
224 | " \n",
225 | " plt.figure()\n",
226 | " fig, ax = plt.subplots(1, figsize=(10,10))\n",
227 | " ax.imshow(img_np)\n",
228 | "\n",
229 | " if cids is not None:\n",
230 | " # Get unique class labels \n",
231 | " unique_labels = set(list(cids.astype('int').squeeze()))\n",
232 | " unique_labels = np.array(list(unique_labels))\n",
233 | " n_cls_preds = len(unique_labels)\n",
234 | " bbox_colors = colors[:n_cls_preds]\n",
235 | "\n",
236 | " for b, cls_pred, cls_conf in zip(bboxes, cids, scores):\n",
237 | " x1, y1, x2, y2 = b[0], b[1], b[2], b[3]\n",
238 | " predicted_class = class_info[int(cls_pred)]\n",
239 | " label = '{} {:.2f}'.format(predicted_class, cls_conf)\n",
240 | " \n",
241 | " # Get box height and width\n",
242 | " box_h = y2 - y1\n",
243 | " box_w = x2 - x1\n",
244 | "\n",
245 | " # Add a box with the color for this class\n",
246 | " color = bbox_colors[int(np.where(unique_labels == int(cls_pred))[0])]\n",
247 | " bbox = patches.Rectangle((x1, y1), box_w, box_h, linewidth=3, edgecolor=color, facecolor='none')\n",
248 | " ax.add_patch(bbox)\n",
249 | "\n",
250 | " plt.text(x1, y1, s=label, color='white', verticalalignment='top',\n",
251 | " bbox={'color': color, 'pad': 0})\n"
252 | ]
253 | },
254 | {
255 | "cell_type": "markdown",
256 | "metadata": {},
257 | "source": [
258 | "\n",
259 | "\n",
260 | "# 2. Local Endpoint Inference\n",
261 | "---\n",
262 | "\n",
263 | "Deploying a trained model straight to a production environment without sufficient validation and testing carries many risks. We therefore recommend deploying the model in the notebook instance's local environment, using local mode, before launching an inference instance for production. This is called a Local Mode Endpoint.\n",
264 | "\n",
265 | "First, before deploying the local mode endpoint container, we will run inference directly in the local environment to check the results, and then deploy a local mode endpoint."
266 | ]
267 | },
268 | {
269 | "cell_type": "code",
270 | "execution_count": null,
271 | "metadata": {},
272 | "outputs": [],
273 | "source": [
274 | "import os\n",
275 | "import json\n",
276 | "import numpy as np\n",
277 | "from io import BytesIO\n",
278 | "from PIL import Image\n",
279 | "from src.inference_pytorch import model_fn, transform_fn\n",
280 | "from src.utils import get_label_map, delete_endpoint, plot_bbox\n",
281 | "\n",
282 | "label_map = get_label_map('files/coco_labels.txt')\n",
283 | "label_list = list(label_map.values())\n",
284 | "print(label_map)\n",
285 | "\n",
286 | "path = \"./images/test\"\n",
287 | "img_list = os.listdir(path)\n",
288 | "img_path_list = [os.path.join(path, img) for img in img_list]\n",
289 | "\n",
290 | "test_idx = 0\n",
291 | "img_path = img_path_list[test_idx]\n",
292 | "\n",
293 | "with open(img_path, mode='rb') as file:\n",
294 | " img_byte = bytearray(file.read())"
295 | ]
296 | },
297 | {
298 | "cell_type": "markdown",
299 | "metadata": {},
300 | "source": [
301 | "## 2.1. Local Inference without Endpoint\n",
302 | "\n",
303 | "Before deploying a local mode endpoint, you can debug by running inference directly in the local environment as shown below."
304 | ]
305 | },
306 | {
307 | "cell_type": "code",
308 | "execution_count": null,
309 | "metadata": {},
310 | "outputs": [],
311 | "source": [
312 | "model = model_fn()\n",
313 | "response_body = transform_fn(model, img_byte)\n",
314 | "outputs = json.loads(response_body)"
315 | ]
316 | },
317 | {
318 | "cell_type": "code",
319 | "execution_count": null,
320 | "metadata": {},
321 | "outputs": [],
322 | "source": [
323 | "img = Image.open(img_path)\n",
324 | "plot_bbox(img, outputs['bbox'], outputs['score'], outputs['cid'], class_info=label_map)"
325 | ]
326 | },
327 | {
328 | "cell_type": "markdown",
329 | "metadata": {},
330 | "source": [
331 | "## 2.2. Local Mode Endpoint\n",
332 | "\n",
333 | "Now let's deploy in local mode.\n",
334 | "\n",
335 | "Since we use the pre-trained model directly without fine-tuning, `model.tar.gz` consists of an empty 0-byte file.\n",
336 | "If you want to use a model fine-tuned on premises, compress the model parameters (e.g., model.pth) into `model.tar.gz`."
337 | ]
338 | },
339 | {
340 | "cell_type": "code",
341 | "execution_count": null,
342 | "metadata": {},
343 | "outputs": [],
344 | "source": [
345 | "f = open(\"model.pth\", 'w')\n",
346 | "f.close()\n",
347 | "!tar -czf model.tar.gz model.pth"
348 | ]
349 | },
350 | {
351 | "cell_type": "code",
352 | "execution_count": null,
353 | "metadata": {},
354 | "outputs": [],
355 | "source": [
356 | "import os\n",
357 | "import time\n",
358 | "from sagemaker.deserializers import JSONDeserializer\n",
359 | "from sagemaker.serializers import IdentitySerializer\n",
360 | "from sagemaker.pytorch.model import PyTorchModel\n",
361 | "role = sagemaker.get_execution_role()"
362 | ]
363 | },
364 | {
365 | "cell_type": "code",
366 | "execution_count": null,
367 | "metadata": {},
368 | "outputs": [],
369 | "source": [
370 | "endpoint_name = \"local-endpoint-pytorch-{}\".format(int(time.time()))\n",
371 | "local_model_path = f'file://{os.getcwd()}/model.tar.gz'"
372 | ]
373 | },
374 | {
375 | "cell_type": "markdown",
376 | "metadata": {},
377 | "source": [
378 | "\n",
379 | "After running the code cell below, check the logs; you can see the configuration values for torchserve.\n",
380 | "\n",
381 | "```bash\n",
382 | "algo-1-9m80o_1 | ['torchserve', '--start', '--model-store', '/.sagemaker/ts/models', '--ts-config', '/etc/sagemaker-ts.properties', '--log-config', '/opt/conda/lib/python3.6/site-packages/sagemaker_pytorch_serving_container/etc/log4j.properties', '--models', 'model.mar']\n",
383 | "algo-1-9m80o_1 | 2020-12-29 14:07:56,008 [INFO ] main org.pytorch.serve.ModelServer - \n",
384 | "algo-1-9m80o_1 | Torchserve version: 0.2.1\n",
385 | "algo-1-9m80o_1 | TS Home: /opt/conda/lib/python3.6/site-packages\n",
386 | "algo-1-9m80o_1 | Current directory: /\n",
387 | "algo-1-9m80o_1 | Temp directory: /home/model-server/tmp\n",
388 | "...\n",
389 | "```"
390 | ]
391 | },
392 | {
393 | "cell_type": "markdown",
394 | "metadata": {},
395 | "source": [
396 | "**[Note]** Starting with SageMaker SDK v2, instead of setting the content_type property of the serializer and deserializer directly, you must create class instances. When content_type is `'application/x-image'`, use the `IdentitySerializer` class."
397 | ]
398 | },
399 | {
400 | "cell_type": "code",
401 | "execution_count": null,
402 | "metadata": {},
403 | "outputs": [],
404 | "source": [
405 | "model = PyTorchModel(model_data=local_model_path,\n",
406 | " role=role,\n",
407 | " entry_point='inference_pytorch.py', \n",
408 | " source_dir='src',\n",
409 | " framework_version='1.6.0',\n",
410 | " py_version='py3')\n",
411 | "\n",
412 | "predictor = model.deploy(\n",
413 | " initial_instance_count=1,\n",
414 | " instance_type='local',\n",
415 | " serializer=IdentitySerializer(content_type='application/x-image'),\n",
416 | " deserializer=JSONDeserializer()\n",
417 | ")"
418 | ]
419 | },
420 | {
421 | "cell_type": "markdown",
422 | "metadata": {},
423 | "source": [
424 | "Since the container was deployed locally, you can confirm that it is currently running."
425 | ]
426 | },
427 | {
428 | "cell_type": "code",
429 | "execution_count": null,
430 | "metadata": {},
431 | "outputs": [],
432 | "source": [
433 | "!docker ps"
434 | ]
435 | },
436 | {
437 | "cell_type": "markdown",
438 | "metadata": {},
439 | "source": [
440 | "### Invoking the endpoint with the SageMaker SDK"
441 | ]
442 | },
443 | {
444 | "cell_type": "code",
445 | "execution_count": null,
446 | "metadata": {},
447 | "outputs": [],
448 | "source": [
449 | "outputs = predictor.predict(img_byte)"
450 | ]
451 | },
452 | {
453 | "cell_type": "code",
454 | "execution_count": null,
455 | "metadata": {},
456 | "outputs": [],
457 | "source": [
458 | "plot_bbox(img, outputs['bbox'], outputs['score'], outputs['cid'], class_info=label_map)"
459 | ]
460 | },
461 | {
462 | "cell_type": "markdown",
463 | "metadata": {},
464 | "source": [
465 | "### Invoking the endpoint with the Boto3 API\n",
466 | "\n",
467 | "As in the code cell above, you can run inference with the SageMaker SDK `predict()` method, but this time we will run inference with boto3's `invoke_endpoint()` method.\n",
468 | "Boto3 is a service-level, low-level SDK. Unlike the SageMaker SDK, a high-level SDK that abstracts some features with a focus on ML experimentation,\n",
469 | "Boto3 gives you full control over the SageMaker API and is well suited for production and automation work.\n",
470 | "\n",
471 | "Note that when creating a runtime client instance to call `invoke_endpoint()`, in local deployment mode you must use `sagemaker.local.LocalSagemakerRuntimeClient()`."
472 | ]
473 | },
474 | {
475 | "cell_type": "code",
476 | "execution_count": null,
477 | "metadata": {},
478 | "outputs": [],
479 | "source": [
480 | "runtime_client = sagemaker.local.LocalSagemakerRuntimeClient()\n",
481 | "endpoint_name = model.endpoint_name\n",
482 | "\n",
483 | "response = runtime_client.invoke_endpoint(\n",
484 | " EndpointName=endpoint_name, \n",
485 | " ContentType='application/x-image',\n",
486 | " Accept='application/json',\n",
487 | " Body=img_byte\n",
488 | " )\n",
489 | "outputs = json.loads(response['Body'].read().decode())"
490 | ]
491 | },
492 | {
493 | "cell_type": "code",
494 | "execution_count": null,
495 | "metadata": {},
496 | "outputs": [],
497 | "source": [
498 | "plot_bbox(img, outputs['bbox'], outputs['score'], outputs['cid'], class_info=label_map)"
499 | ]
500 | },
501 | {
502 | "cell_type": "markdown",
503 | "metadata": {},
504 | "source": [
505 | "### Local Mode Endpoint Clean-up\n",
506 | "\n",
507 | "If you are no longer using the endpoint, you should delete it. With the SageMaker SDK, you can simply delete it with the `delete_endpoint()` method.\n",
508 | "Note that since the inference container was deployed on the notebook instance, keeping the endpoint running does not incur any additional charges.\n",
509 | "\n",
510 | "\n",
511 | "```python\n",
512 | "# SageMaker SDK\n",
513 | "predictor.delete_endpoint()\n",
514 | "\n",
515 | "# Boto3 API\n",
516 | "client.delete_model(ModelName=model_name)\n",
517 | "client.delete_endpoint(EndpointName=endpoint_name)\n",
518 | "client.delete_endpoint_config(EndpointConfigName=endpoint_name)\n",
519 | "\n",
520 | "# Delete the local containers directly\n",
521 | "!docker rm $(docker ps -a -q)\n",
522 | "```"
523 | ]
524 | },
525 | {
526 | "cell_type": "code",
527 | "execution_count": null,
528 | "metadata": {},
529 | "outputs": [],
530 | "source": [
531 | "predictor.delete_endpoint()"
532 | ]
533 | },
534 | {
535 | "cell_type": "markdown",
536 | "metadata": {},
537 | "source": [
538 | "\n",
539 | "\n",
540 | "# 3. SageMaker Hosted Endpoint Inference\n",
541 | "---\n",
542 | "\n",
543 | "Now let's deploy an endpoint to the actual production environment. Most of the code is the same as for the local mode endpoint; you only need to change the model artifact path (`model_data`) and the instance type (`instance_type`)."
544 | ]
545 | },
546 | {
547 | "cell_type": "markdown",
548 | "metadata": {},
549 | "source": [
550 | "#### [Caution] Do not run the code cell below as-is; be sure to change the bucket name.\n",
551 | "```python\n",
552 | "bucket = '[YOUR-S3-BUCKET]' # as-is\n",
553 | "bucket = 'sagemaker-hol-daekeun' # to-be\n",
554 | "```"
555 | ]
556 | },
557 | {
558 | "cell_type": "code",
559 | "execution_count": null,
560 | "metadata": {},
561 | "outputs": [],
562 | "source": [
563 | "role = sagemaker.get_execution_role()\n",
564 | "#bucket = '[YOUR-S3-BUCKET]' # Be sure to change the bucket name.\n",
565 | "bucket = sagemaker.Session().default_bucket() # Bucket automatically created by SageMaker"
566 | ]
567 | },
568 | {
569 | "cell_type": "code",
570 | "execution_count": null,
571 | "metadata": {},
572 | "outputs": [],
573 | "source": [
574 | "%%bash -s \"$role\" \"$bucket\"\n",
575 | "ROLE=$1\n",
576 | "BUCKET=$2\n",
577 | "\n",
578 | "aws s3 cp model.tar.gz s3://$BUCKET/model.tar.gz"
579 | ]
580 | },
581 | {
582 | "cell_type": "markdown",
583 | "metadata": {},
584 | "source": [
585 | "Since SageMaker provisions a managed deployment cluster and deploys the inference container, starting the inference service takes about 5-10 minutes."
586 | ]
587 | },
588 | {
589 | "cell_type": "code",
590 | "execution_count": null,
591 | "metadata": {},
592 | "outputs": [],
593 | "source": [
594 | "%%time\n",
595 | "\n",
596 | "model_path = 's3://{}/model.tar.gz'.format(bucket)\n",
597 | "endpoint_name = \"endpoint-pytorch-object-detection-{}\".format(int(time.time()))\n",
598 | "\n",
599 | "model = PyTorchModel(model_data=model_path,\n",
600 | " role=role,\n",
601 | " entry_point='inference_pytorch.py', \n",
602 | " source_dir='src',\n",
603 | " framework_version='1.6.0',\n",
604 | " py_version='py3')\n",
605 | "\n",
606 | "predictor = model.deploy(\n",
607 | " initial_instance_count=1,\n",
608 | " instance_type='ml.c5.large',\n",
609 | " serializer=IdentitySerializer(content_type='application/x-image'),\n",
610 | " deserializer=JSONDeserializer()\n",
611 | ")"
612 | ]
613 | },
614 | {
615 | "cell_type": "code",
616 | "execution_count": null,
617 | "metadata": {},
618 | "outputs": [],
619 | "source": [
620 | "outputs = predictor.predict(img_byte)"
621 | ]
622 | },
623 | {
624 | "cell_type": "code",
625 | "execution_count": null,
626 | "metadata": {},
627 | "outputs": [],
628 | "source": [
629 | "plot_bbox(img, outputs['bbox'], outputs['score'], outputs['cid'], class_info=label_map)"
630 | ]
631 | },
632 | {
633 | "cell_type": "markdown",
634 | "metadata": {},
635 | "source": [
636 | "### Endpoint Clean-up\n",
637 | "\n",
638 | "To avoid charges from the SageMaker endpoint, be sure to delete the endpoint when you finish this hands-on lab."
639 | ]
640 | },
641 | {
642 | "cell_type": "code",
643 | "execution_count": null,
644 | "metadata": {},
645 | "outputs": [],
646 | "source": [
647 | "predictor.delete_endpoint()"
648 | ]
649 | }
650 | ],
651 | "metadata": {
652 | "kernelspec": {
653 | "display_name": "conda_pytorch_p36",
654 | "language": "python",
655 | "name": "conda_pytorch_p36"
656 | },
657 | "language_info": {
658 | "codemirror_mode": {
659 | "name": "ipython",
660 | "version": 3
661 | },
662 | "file_extension": ".py",
663 | "mimetype": "text/x-python",
664 | "name": "python",
665 | "nbconvert_exporter": "python",
666 | "pygments_lexer": "ipython3",
667 | "version": "3.6.10"
668 | }
669 | },
670 | "nbformat": 4,
671 | "nbformat_minor": 4
672 | }
673 |
--------------------------------------------------------------------------------
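The notebook above packs an empty `model.pth` into `model.tar.gz` with shell commands. For reference, the same packaging step can be sketched in pure Python with the standard library; the file names here are just examples:

```python
import os
import tarfile
import tempfile

def pack_model(model_file: str, archive_path: str) -> str:
    """Pack a model parameter file into the gzipped tarball that
    SageMaker expects as model_data (model.tar.gz)."""
    with tarfile.open(archive_path, 'w:gz') as tar:
        # arcname keeps only the base name so the file sits at the
        # archive root, where model_fn() would look for it.
        tar.add(model_file, arcname=os.path.basename(model_file))
    return archive_path

# Example: an empty placeholder file, as in the notebook above
tmpdir = tempfile.mkdtemp()
model_file = os.path.join(tmpdir, 'model.pth')
open(model_file, 'w').close()
archive = pack_model(model_file, os.path.join(tmpdir, 'model.tar.gz'))

with tarfile.open(archive, 'r:gz') as tar:
    print(tar.getnames())  # ['model.pth']
```

For a fine-tuned model, you would pass the real checkpoint path instead of the empty placeholder.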
/src/inference_mxnet.py:
--------------------------------------------------------------------------------
1 |
2 | import os
3 | import time
4 | import sys
5 | import json
6 | import subprocess
7 | from base64 import b64decode
8 |
9 | # Install/Update GluonCV:
10 | subprocess.call([sys.executable, '-m', 'pip', 'install', 'gluoncv'])
11 |
12 | import mxnet as mx
13 | import gluoncv as gcv
14 | ctx = mx.cpu()
15 |
16 | def model_fn(model_dir=None):
17 | '''
18 | Loads the model into memory from storage and return the model.
19 | '''
20 | net = gcv.model_zoo.get_model(
21 | 'yolo3_darknet53_coco',
22 | pretrained=True,
23 | ctx=ctx,
24 | )
25 | net.hybridize(static_alloc=True, static_shape=True)
26 | #net.load_parameters(os.path.join(model_dir, 'model.params'), ctx=ctx)
27 | return net
28 |
29 | def input_fn(request_body, content_type):
30 | '''
31 | Deserialize the request body
32 | '''
33 | if content_type == 'application/json':
34 | D = json.loads(request_body)
35 |
36 | short = D.get('short')
37 | image = b64decode(D['image'])
38 | x, _ = gcv.data.transforms.presets.yolo.transform_test(
39 | mx.image.imdecode(image), short=short
40 | )
41 | return x
42 | else:
43 | raise RuntimeError(f'Unsupported content type: {content_type}')
44 |
45 |
46 | def predict_fn(input_object, model):
47 | '''
48 | Predicts on the deserialized object with the model from model_fn().
49 | '''
50 | x = input_object
51 |
52 | t0 = time.time()
53 | cid, score, bbox = model(x.as_in_context(ctx))
54 | t1 = time.time() - t0
55 | print("--- Elapsed time: %s secs ---" % t1)
56 |
57 | return x.shape, cid[0], score[0], bbox[0]
58 |
59 |
60 | def output_fn(prediction, content_type):
61 | '''
62 | Serializes predictions from predict_fn() to JSON format.
63 | '''
64 | shape, cid, score, bbox = prediction
65 | if content_type == 'application/json':
66 | return json.dumps({
67 | 'shape': shape,
68 | 'cid': cid.asnumpy().tolist(),
69 | 'score': score.asnumpy().tolist(),
70 | 'bbox': bbox.asnumpy().tolist()
71 | })
72 |
--------------------------------------------------------------------------------
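The `input_fn()` in the script above expects a JSON body carrying a base64-encoded image plus a `short` resize parameter. A minimal client-side sketch of building such a request body (the dummy bytes below stand in for a real JPEG payload):

```python
import json
from base64 import b64encode, b64decode

def build_request(image_bytes: bytes, short: int = 416) -> str:
    """Build the JSON body that input_fn() deserializes:
    a base64-encoded image and the target short-side size."""
    return json.dumps({
        'image': b64encode(image_bytes).decode('utf-8'),
        'short': short,
    })

# Round-trip check with dummy bytes in place of real JPEG data
body = build_request(b'\xff\xd8\xff\xe0fake-jpeg-bytes')
decoded = json.loads(body)
assert b64decode(decoded['image']).startswith(b'\xff\xd8')
print(decoded['short'])  # 416
```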
/src/inference_pytorch.py:
--------------------------------------------------------------------------------
1 |
2 | # Built-Ins
3 | import io, os, sys
4 | import json
5 | import subprocess, time
6 |
7 | import numpy as np
8 | from base64 import b64decode
9 | from PIL import Image
10 |
11 | import torch
12 | import torchvision
13 | from torchvision import datasets, transforms, models
14 | from torchvision.models.detection import FasterRCNN
15 | import torchvision.transforms as transforms
16 | device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
17 |
18 |
19 | def model_fn(model_dir=None):
20 | '''
21 | Loads the model into memory from storage and return the model.
22 | '''
23 | model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)
24 | # load the model onto the computation device
25 | model = model.eval().to(device)
26 | return model
27 |
28 |
29 | def transform_fn(model, request_body, content_type='application/x-image', accept_type=None):
30 | '''
31 | Deserialize the request body and predicts on the deserialized object with the model from model_fn()
32 | '''
33 | if content_type == 'application/x-image':
34 | img = np.array(Image.open(io.BytesIO(request_body)))
35 | elif content_type == 'application/x-npy':
36 | img = np.frombuffer(request_body, dtype='uint8').reshape(137, 236)
37 | else:
38 | raise ValueError(
39 | 'Requested unsupported ContentType in content_type : ' + content_type)
40 |
41 | t0 = time.time()
42 |
43 | test_transforms = transforms.Compose([
44 | transforms.ToTensor()
45 | ])
46 | img_tensor = test_transforms(img).to(device)
47 | img_tensor = img_tensor.unsqueeze(0)
48 |
49 | with torch.no_grad():
50 | result = model(img_tensor)
51 |
52 | t1 = time.time() - t0
53 | print("--- Elapsed time: %s secs ---" % t1)
54 |
55 | scores = result[0]['scores'].detach().cpu().numpy()
56 | bboxes = result[0]['boxes'].detach().cpu().numpy()
57 | cids = result[0]['labels'].detach().cpu().numpy()
58 |
59 | outputs = json.dumps({'score': scores.tolist(),
60 | 'bbox': bboxes.tolist(),
61 | 'cid': cids.tolist()})
62 |
63 | return outputs
64 |
--------------------------------------------------------------------------------
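The `application/x-npy` branch of `transform_fn()` above assumes a raw `uint8` buffer with a fixed 137x236 shape. A small sketch of how a client would produce such a request body, and how the server recovers the array, mirroring the `np.frombuffer` call in the script:

```python
import numpy as np

# Client side: a grayscale image as a uint8 array, sent as raw bytes
img = np.arange(137 * 236, dtype='uint8').reshape(137, 236)
request_body = img.tobytes()

# Server side: the inverse operation performed in transform_fn()
recovered = np.frombuffer(request_body, dtype='uint8').reshape(137, 236)
assert np.array_equal(img, recovered)
```

Note that the shape is not part of the payload, so client and server must agree on it out of band.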
/src/inference_pytorch_inf1.py:
--------------------------------------------------------------------------------
1 |
2 | def input_fn(request_body, request_content_type):
3 | import torch
4 | import torchvision.transforms as transforms
5 | from PIL import Image
6 | import io
7 | f = io.BytesIO(request_body)
8 | input_image = Image.open(f).convert('RGB')
9 | preprocess = transforms.Compose([
10 | transforms.Resize(255),
11 | transforms.CenterCrop(224),
12 | transforms.ToTensor(),
13 | transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
14 | ])
15 | input_tensor = preprocess(input_image)
16 | input_batch = input_tensor.unsqueeze(0)
17 | return input_batch
18 |
--------------------------------------------------------------------------------
/src/inference_pytorch_neo.py:
--------------------------------------------------------------------------------
1 |
2 | import io
3 | import json
4 | import logging
5 | import os
6 | import pickle
7 |
8 | import numpy as np
9 | import torch
10 | import torchvision.transforms as transforms
11 | from PIL import Image # Training container doesn't have this package
12 |
13 | logger = logging.getLogger(__name__)
14 | logger.setLevel(logging.DEBUG)
15 |
16 | def model_fn(model_dir):
17 |
18 | logger.info('model_fn')
19 | with torch.neo.config(model_dir=model_dir, neo_runtime=True):
20 | device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
21 | # The compiled model is saved as "compiled.pt"
22 | model = torch.jit.load(os.path.join(model_dir, 'compiled.pt'))
23 | model = model.to(device)
24 |
25 | # It is recommended to run warm-up inference during model load
26 | sample_input_path = os.path.join(model_dir, 'sample_input.pkl')
27 | with open(sample_input_path, 'rb') as input_file:
28 | model_input = pickle.load(input_file)
29 | if torch.is_tensor(model_input):
30 | model_input = model_input.to(device)
31 | model(model_input)
32 | elif isinstance(model_input, tuple):
33 | model_input = (inp.to(device)
34 | for inp in model_input if torch.is_tensor(inp))
35 | model(*model_input)
36 | else:
37 | print("Only supports a torch tensor or a tuple of torch tensors")
38 |
39 | return model
40 |
41 |
42 | def transform_fn(model, payload, request_content_type='application/octet-stream',
43 | response_content_type='application/json'):
44 |
45 | logger.info('Invoking user-defined transform function')
46 |
47 | if request_content_type != 'application/octet-stream':
48 | raise RuntimeError(
49 | 'Content type must be application/octet-stream. Provided: {0}'.format(request_content_type))
50 |
51 | # preprocess
52 | decoded = Image.open(io.BytesIO(payload))
53 | preprocess = transforms.Compose([
54 | transforms.Resize(256),
55 | transforms.CenterCrop(224),
56 | transforms.ToTensor(),
57 | transforms.Normalize(
58 | mean=[
59 | 0.485, 0.456, 0.406], std=[
60 | 0.229, 0.224, 0.225]),
61 | ])
62 | normalized = preprocess(decoded)
63 | batchified = normalized.unsqueeze(0)
64 |
65 | # predict
66 | device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
67 | batchified = batchified.to(device)
68 | result = model.forward(batchified)
69 |
70 | # Softmax (assumes batch size 1)
71 | result = np.squeeze(result.detach().cpu().numpy())
72 | result_exp = np.exp(result - np.max(result))
73 | result = result_exp / np.sum(result_exp)
74 |
75 | response_body = json.dumps(result.tolist())
76 |
77 | return response_body, response_content_type
78 |
--------------------------------------------------------------------------------
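The post-processing step in `transform_fn()` above applies a numerically stable softmax, subtracting the max before exponentiating so `np.exp` cannot overflow even for large logits. Isolated as a small function:

```python
import numpy as np

def stable_softmax(logits: np.ndarray) -> np.ndarray:
    """Softmax with the max subtracted first; shifting by a constant
    does not change the result but prevents exp() overflow."""
    shifted = logits - np.max(logits)
    exp = np.exp(shifted)
    return exp / np.sum(exp)

# Large logits that would overflow a naive np.exp(logits)
probs = stable_softmax(np.array([1000.0, 1001.0, 1002.0]))
print(probs)  # finite probabilities summing to 1
```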
/src/utils.py:
--------------------------------------------------------------------------------
1 | def get_label_map(label_file):
2 | label_map = {}
3 | labels = open(label_file, 'r')
4 |
5 | for line in labels:
6 | line = line.rstrip("\n")
7 | ids = line.split(',')
8 | label_map[int(ids[0])] = ids[2]
9 |
10 | return label_map
11 |
12 |
13 | def get_label_map_imagenet(label_file):
14 | label_map = {}
15 | with open(label_file, 'r') as f:
16 | for line in f:
17 | key, val = line.strip().split(':')
18 | label_map[key] = val.replace(',', '')
19 | return label_map
20 |
21 |
22 | def delete_endpoint(client, endpoint_name):
23 | response = client.describe_endpoint_config(EndpointConfigName=endpoint_name)
24 | model_name = response['ProductionVariants'][0]['ModelName']
25 |
26 | client.delete_model(ModelName=model_name)
27 | client.delete_endpoint(EndpointName=endpoint_name)
28 | client.delete_endpoint_config(EndpointConfigName=endpoint_name)
29 |
30 | print(f'--- Deleted model: {model_name}')
31 | print(f'--- Deleted endpoint: {endpoint_name}')
32 | print(f'--- Deleted endpoint_config: {endpoint_name}')
33 |
34 |
35 | def plot_bbox(img_resized, bboxes, scores, cids, class_info, framework='pytorch', threshold=0.5):
36 |
37 | import numpy as np
38 | import random
39 | import matplotlib.patches as patches
40 | import matplotlib.pyplot as plt
41 |
42 | if framework=='mxnet':
43 | img_np = img_resized.asnumpy()
44 | scores = scores.asnumpy()
45 | bboxes = bboxes.asnumpy()
46 | cids = cids.asnumpy()
47 | else:
48 | img_np = img_resized
49 | scores = np.array(scores)
50 | bboxes = np.array(bboxes)
51 | cids = np.array(cids)
52 |
53 | # Get only results that are above the threshold. Default threshold is 0.5.
54 | scores = scores[scores > threshold]
55 | num_detections = len(scores)
56 | bboxes = bboxes[:num_detections, :]
57 | cids = cids[:num_detections].astype('int').squeeze()
58 |
59 | # Get bounding-box colors
60 | cmap = plt.get_cmap('tab20b')
61 | colors = [cmap(i) for i in np.linspace(0, 1, 20)]
62 | random.seed(42)
63 | random.shuffle(colors)
64 |
65 | plt.figure()
66 | fig, ax = plt.subplots(1, figsize=(10,10))
67 | ax.imshow(img_np)
68 |
69 | if cids is not None:
70 | # Get unique class labels
71 | unique_labels = set(list(cids.astype('int').squeeze()))
72 | unique_labels = np.array(list(unique_labels))
73 | n_cls_preds = len(unique_labels)
74 | bbox_colors = colors[:n_cls_preds]
75 |
76 | for b, cls_pred, cls_conf in zip(bboxes, cids, scores):
77 | x1, y1, x2, y2 = b[0], b[1], b[2], b[3]
78 | predicted_class = class_info[int(cls_pred)]
79 | label = '{} {:.2f}'.format(predicted_class, cls_conf)
80 |
81 | # Get box height and width
82 | box_h = y2 - y1
83 | box_w = x2 - x1
84 |
85 | # Add a box with the color for this class
86 | color = bbox_colors[int(np.where(unique_labels == int(cls_pred))[0])]
87 | bbox = patches.Rectangle((x1, y1), box_w, box_h, linewidth=3, edgecolor=color, facecolor='none')
88 | ax.add_patch(bbox)
89 |
90 | plt.text(x1, y1, s=label, color='white', verticalalignment='top',
91 | bbox={'color': color, 'pad': 0})
92 |
--------------------------------------------------------------------------------
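To illustrate the label-file format `get_label_map_imagenet()` above assumes, here is a self-contained round trip; the parsing logic is copied from utils.py, and the two sample lines are made up in the style of `files/imagenet1000_clsidx_to_labels.txt`:

```python
import os
import tempfile

def get_label_map_imagenet(label_file):
    # Same parsing as utils.py: "index: label," lines
    label_map = {}
    with open(label_file, 'r') as f:
        for line in f:
            key, val = line.strip().split(':')
            label_map[key] = val.replace(',', '')
    return label_map

# Two made-up sample lines in the expected "index: label," format
sample = "0: tench,\n1: goldfish,\n"
path = os.path.join(tempfile.mkdtemp(), 'labels.txt')
with open(path, 'w') as f:
    f.write(sample)

label_map = get_label_map(path) if False else get_label_map_imagenet(path)
print(label_map)  # note the leading space kept in each value
```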
/tensorflow-serving-endpoint.ipynb:
--------------------------------------------------------------------------------
1 | {
2 | "cells": [
3 | {
4 | "cell_type": "markdown",
5 | "metadata": {},
6 | "source": [
7 | "# Hosting a Pre-trained Model on a SageMaker Endpoint and Performing Object Detection (TensorFlow)\n",
8 | "---\n",
9 | "\n",
10 | "*This notebook was newly organized for a hands-on lab based on the AWS machine learning blog post [Performing batch inference with TensorFlow Serving in Amazon SageMaker](https://aws.amazon.com/ko/blogs/machine-learning/performing-batch-inference-with-tensorflow-serving-in-amazon-sagemaker/).*\n",
11 | "\n",
12 | "Do you have to train in Amazon SageMaker first in order to run inference there? No.\n",
13 | "If you only want to run inference in SageMaker, you can deploy a model trained on premises or a pre-trained model from a public model zoo to a SageMaker Endpoint as-is, without building a Docker image. \n",
14 | "All you have to do is write the inference entrypoint.\n",
15 | "\n",
16 | "First, let's take a quick look at the TensorFlow entrypoint interface. \n",
17 | "\n",
18 | "- `input_handler(data, context)`: Preprocesses the input data before it is sent to the TensorFlow Serving REST API (e.g., converts the bytearray sent in data to a PIL.Image, then performs preprocessing such as cropping, resizing, and normalization). context.request_content_type can be handled in various ways depending on the input data type (e.g., application/x-npy, application/json, application/csv).\n",
19 | "- `output_handler(data, context)`: Applies additional transformations to the TensorFlow Serving inference result and sends it to the front end. \n",
20 | "\n",
21 | "In this notebook, we take the `ssd_mobilenet_v2_coco` model from the object detection models provided in the [TensorFlow model zoo](https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/detection_model_zoo.md), deploy it to an endpoint, and run inference.\n",
22 | " \n",
23 | "For a TensorFlow batch transform inference example on a large dataset, see the AWS machine learning blog post [Performing batch inference with TensorFlow Serving in Amazon SageMaker](https://aws.amazon.com/ko/blogs/machine-learning/performing-batch-inference-with-tensorflow-serving-in-amazon-sagemaker/).\n",
24 | "\n",
25 | "## Pre-requisites\n",
26 | "\n",
27 | "- Basic usage: [TensorFlow](https://www.tensorflow.org/tutorials)\n",
28 | "- AWS services: [AWS S3](https://docs.aws.amazon.com/s3/index.html), [Amazon SageMaker](https://aws.amazon.com/sagemaker/)"
29 | ]
30 | },
31 | {
32 | "cell_type": "markdown",
33 | "metadata": {},
34 | "source": [
35 | "
\n",
36 | "\n",
37 | "# 1. Deployment\n",
38 | "----\n",
39 | "\n",
40 | "\n",
41 | "## SavedModel 다운로드 및 확인\n",
42 | "\n",
43 | "Copy the `ssd_mobilenet_v2_coco` model artifact from the TensorFlow model zoo to local storage and extract it. This model artifact contains a\n",
44 | "[TensorFlow SavedModel](https://www.tensorflow.org/guide/saved_model). The model zoo offers other models as well, so feel free to experiment with a different one later.\n",
45 | "After extraction you will see the following directories and files:\n",
46 | "```\n",
47 | ".\n",
48 | "├── checkpoint\n",
49 | "├── frozen_inference_graph.pb\n",
50 | "├── model.ckpt.data-00000-of-00001\n",
51 | "├── model.ckpt.index\n",
52 | "├── model.ckpt.meta\n",
53 | "├── pipeline.config\n",
54 | "└── saved_model\n",
55 | " ├── saved_model.pb\n",
56 | " └── variables\n",
57 | "```\n"
58 | ]
59 | },
60 | {
61 | "cell_type": "code",
62 | "execution_count": null,
63 | "metadata": {},
64 | "outputs": [],
65 | "source": [
66 | "%%bash\n",
67 | "\n",
68 | "wget -q http://download.tensorflow.org/models/object_detection/ssd_mobilenet_v2_coco_2018_03_29.tar.gz\n",
69 | "tar zxvf ssd_mobilenet_v2_coco_2018_03_29.tar.gz"
70 | ]
71 | },
72 | {
73 | "cell_type": "markdown",
74 | "metadata": {},
75 | "source": [
76 | "Of the pre-trained model files, the TFS container only needs the model artifact `saved_model.pb` inside the `saved_model` directory, together with its\n",
77 | "`variables` subdirectory.\n",
78 | "In this notebook we rename that folder to the saved model's version number (here, `1`)."
79 | ]
80 | },
81 | {
82 | "cell_type": "code",
83 | "execution_count": null,
84 | "metadata": {},
85 | "outputs": [],
86 | "source": [
87 | "!mv ssd_mobilenet_v2_coco_2018_03_29/saved_model ssd_mobilenet_v2_coco_2018_03_29/1"
88 | ]
89 | },
90 | {
91 | "cell_type": "markdown",
92 | "metadata": {},
93 | "source": [
94 | "SageMaker's TFS container uses the model's `serving_default` SignatureDef, which is declared when the TensorFlow SavedModel is exported.\n",
95 | "Run `saved_model_cli show` to inspect which values go in as inputs and which come out as outputs."
96 | ]
97 | },
98 | {
99 | "cell_type": "code",
100 | "execution_count": null,
101 | "metadata": {},
102 | "outputs": [],
103 | "source": [
104 | "!saved_model_cli show --dir ssd_mobilenet_v2_coco_2018_03_29/1/ --all"
105 | ]
106 | },
107 | {
108 | "cell_type": "markdown",
109 | "metadata": {},
110 | "source": [
111 | "## Pre/post-processing inference script\n",
112 | "\n",
113 | "The SageMaker TFS container serves inference requests over a REST API by default and supports data pre-processing and post-processing scripts, so there is no need to build your own Docker container and push it to Amazon ECR (Elastic Container Registry).\n",
114 | "All you have to implement is the entrypoint script:\n",
115 | "\n",
116 | "- Implement the `input_handler` and `output_handler` interfaces in `inference.py`\n",
117 | "- Optionally write a `requirements.txt` file if the script depends on specific packages\n",
118 | "\n",
119 | "Then place `inference.py` and `requirements.txt` in the `code` directory, and everything is ready.\n",
120 | "\n",
121 | "For details on `input_handler` and `output_handler`, see the [SageMaker TensorFlow Serving container README](https://github.com/aws/sagemaker-tensorflow-serving-container/blob/master/README.md)."
122 | ]
123 | },
124 | {
125 | "cell_type": "markdown",
126 | "metadata": {},
127 | "source": [
128 | "Let's look at `inference.py` first.\n",
129 | "On the pre-processing side, the input data is converted into the input format the TFS REST API requires; on the post-processing side, the prediction result is returned to the client in the standard TFS format. This notebook provides a script for image data, but TFS also accepts TFRecord input, so you can freely adapt the script to your own input format. \n",
130 | "\n",
131 | "```\n",
132 | "Endpoint Request -> inference.py -> Endpoint Response\n",
133 | " ├── input_handler()\n",
134 | " ├── actual inference on container\n",
135 | " └── output_handler()\n",
136 | "```"
137 | ]
138 | },
139 | {
140 | "cell_type": "code",
141 | "execution_count": null,
142 | "metadata": {},
143 | "outputs": [],
144 | "source": [
145 | "%%writefile code/inference.py\n",
146 | "import base64\n",
147 | "import io\n",
148 | "import json\n",
149 | "import requests\n",
150 | "from PIL import Image\n",
151 | "import numpy as np\n",
152 | "\n",
153 | "def input_handler(data, context):\n",
154 | " \"\"\" Pre-process request input before it is sent to TensorFlow Serving REST API\n",
155 | "\n",
156 | " Args:\n",
157 | " data (obj): the request data stream\n",
158 | " context (Context): an object containing request and configuration details\n",
159 | "\n",
160 | " Returns:\n",
161 | " (dict): a JSON-serializable dict that contains request body and headers\n",
162 | " \"\"\"\n",
163 | "\n",
164 | " if context.request_content_type == 'application/x-image':\n",
165 | " image = Image.open(io.BytesIO(data.read()))\n",
166 | " #image = Image.open(io.BytesIO(open(data, 'rb').read()))\n",
167 | " image_np = np.asarray(image)\n",
168 | " instance = image_np[np.newaxis, ...]\n",
169 | " return json.dumps({\"instances\": instance.tolist()})\n",
170 | " else:\n",
171 | " _return_error(415, 'Unsupported content type \"{}\"'.format(context.request_content_type or 'Unknown'))\n",
172 | "\n",
173 | "\n",
174 | "def output_handler(response, context):\n",
175 | " \"\"\"Post-process TensorFlow Serving output before it is returned to the client.\n",
176 | "\n",
177 | " Args:\n",
178 | " response (obj): the TensorFlow serving response\n",
179 | " context (Context): an object containing request and configuration details\n",
180 | "\n",
181 | " Returns:\n",
182 | " (bytes, string): data to return to client, response content type\n",
183 | " \"\"\"\n",
184 | " if response.status_code != 200:\n",
185 | " _return_error(response.status_code, response.content.decode('utf-8'))\n",
186 | " response_content_type = context.accept_header\n",
187 | " prediction = response.content\n",
188 | " return prediction, response_content_type\n",
189 | "\n",
190 | "\n",
191 | "def _return_error(code, message):\n",
192 | " raise ValueError('Error: {}, {}'.format(str(code), message))"
193 | ]
194 | },
195 | {
196 | "cell_type": "code",
197 | "execution_count": null,
198 | "metadata": {},
199 | "outputs": [],
200 | "source": [
201 | "!pygmentize code/inference.py"
202 | ]
203 | },
204 | {
205 | "cell_type": "markdown",
206 | "metadata": {},
207 | "source": [
208 | "#### Tip\n",
209 | "You can find various examples of pre-processing and post-processing scripts for `inference.py` under **4. Pre/Post-Processing** at:\n",
210 | "https://github.com/aws/sagemaker-tensorflow-serving-container"
211 | ]
212 | },
213 | {
214 | "cell_type": "markdown",
215 | "metadata": {},
216 | "source": [
217 | "## Model Packaging\n",
218 | "\n",
219 | "Once the pre-processing and post-processing scripts are written, package them together with the TensorFlow SavedModel into a `model.tar.gz` file. The files and directories inside `model.tar.gz` are structured as follows:\n",
220 | "\n",
221 | "```\n",
222 | "├── code\n",
223 | "│ ├── inference.py\n",
224 | "│ └── requirements.txt\n",
225 | "└── 1\n",
226 | " ├── saved_model.pb\n",
227 | " └── variables\n",
228 | "```\n"
229 | ]
230 | },
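The same packaging can also be done from Python with the standard library's `tarfile` module. This is just a sketch equivalent to the `tar` command in the next cell; `package_model` is a helper name invented here, not part of any SageMaker API:

```python
import os
import tarfile


def package_model(archive_path, code_dir, model_dir, version='1'):
    """Build model.tar.gz with the layout the SageMaker TFS container expects:
    code/ (inference.py, requirements.txt) at the archive root, and the
    SavedModel under a version-numbered directory such as 1/."""
    with tarfile.open(archive_path, 'w:gz') as tar:
        # Archive the code directory as-is under 'code/'.
        tar.add(code_dir, arcname='code')
        # Archive <model_dir>/<version> as '<version>/' at the root.
        tar.add(os.path.join(model_dir, version), arcname=version)
    return archive_path
```

Called as `package_model('model.tar.gz', 'code', 'ssd_mobilenet_v2_coco_2018_03_29')`, it produces the same archive layout as the shell command below.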
231 | {
232 | "cell_type": "code",
233 | "execution_count": null,
234 | "metadata": {},
235 | "outputs": [],
236 | "source": [
237 | "!tar -czvf model.tar.gz code --directory=ssd_mobilenet_v2_coco_2018_03_29 1\n",
238 | "!rm ssd_mobilenet_v2_coco_2018_03_29.tar.gz"
239 | ]
240 | },
241 | {
242 | "cell_type": "markdown",
243 | "metadata": {},
244 | "source": [
245 | "## Model Deployment \n",
246 | "\n",
247 | "#### CPU/GPU\n",
248 | "\n",
249 | "In this notebook we run inference on a general-purpose CPU instance. Why?\n",
250 | "When you look at the total cost of a machine learning workload, inference often accounts for 90% of it. **In other words, simply reusing GPUs for inference can be very expensive.**\n",
251 | "MobileNet is a lightweight CNN architecture designed for devices with limited compute, such as mobile phones; it uses inverted residuals, which combine depthwise separable convolutions with linear bottlenecks, to reduce the parameter count substantially compared to typical CNN architectures.\n",
252 | "So unless you need to batch-transform large volumes of data or your application demands very low latency (e.g., 10 ms), a CPU instance is sufficient for inference.\n",
255 | "\n",
256 | "#### Elastic Inference\n",
257 | "For applications that require very fast responses, you may be tempted to use a GPU instance.\n",
258 | "However, a GPU instance type used for real-time inference is usually underutilized: unlike training, real-time inference does not continuously feed large volumes of data to the model.\n",
259 | "Elastic Inference provides GPU acceleration sized for inference, so pairing a general-purpose CPU instance with Elastic Inference lets you host an Endpoint at a much lower cost than a full GPU instance. \n",
260 | "See the following link for details:\n",
261 | "https://docs.aws.amazon.com/ko_kr/sagemaker/latest/dg/ei.html\n",
262 | "\n",
263 | "\n",
264 | "#### How to Use\n",
265 | "Specify the container image that matches your use case, and add `AcceleratorType` when applying Elastic Inference.\n",
266 | "\n",
267 | "For CPU/GPU, the container image is:\n",
268 | "```\n",
269 | "520713654638.dkr.ecr.{REGION}.amazonaws.com/sagemaker-tensorflow-serving:{TENSORFLOW_SERVING_VERSION}-{cpu|gpu}\n",
270 | "```\n",
271 | "For Elastic Inference, the container image is:\n",
272 | "```\n",
273 | "520713654638.dkr.ecr.{REGION}.amazonaws.com/sagemaker-tensorflow-serving-eia:{TENSORFLOW_SERVING_VERSION}-cpu\n",
274 | "```\n",
275 | "\n",
276 | "To use Elastic Inference from the CLI, specify\n",
277 | "```\n",
278 | "aws sagemaker create-endpoint-config --production-variants AcceleratorType=[YOUR_EIA_INSTANCE_TYPE], ...\n",
279 | "```\n",
280 | "and from the SageMaker SDK, specify\n",
281 | "```\n",
282 | "model.deploy(accelerator_type=[YOUR_EIA_INSTANCE_TYPE],...)\n",
283 | "```\n",
286 | "\n",
287 | "\n",
288 | "\n",
289 | "#### [Caution] Do not run the code cell below as-is; be sure to change the bucket name first.\n",
290 | "```python\n",
291 | "bucket = '[YOUR-S3-BUCKET]' # as-is\n",
292 | "bucket = 'sagemaker-hol-daekeun' # to-be\n",
293 | "```"
294 | ]
295 | },
296 | {
297 | "cell_type": "code",
298 | "execution_count": null,
299 | "metadata": {},
300 | "outputs": [],
301 | "source": [
302 | "import numpy as np\n",
303 | "import os\n",
304 | "import time\n",
305 | "import sagemaker\n",
306 | "from sagemaker import get_execution_role\n",
307 | "\n",
308 | "sagemaker_session = sagemaker.Session()\n",
309 | "\n",
310 | "role = get_execution_role() # role arn\n",
311 | "region = sagemaker_session.boto_region_name\n",
312 | "#bucket = '[YOUR-S3-BUCKET]' # replace with your own bucket name\n",
313 | "bucket = sagemaker_session.default_bucket()\n",
314 | "prefix = 'sagemaker/pretrained-model-inference'\n",
315 | "\n",
316 | "tfs_version = '1.13.0'\n",
317 | "processor_type = 'cpu'\n",
318 | "\n",
319 | "t = int(time.time())\n",
321 | "\n",
322 | "model_url = \"s3://{}/model.tar.gz\".format(os.path.join(bucket, prefix))\n",
323 | "model_name = \"ssd-mobilenet-v2-tfs-{}\".format(t)\n",
324 | "\n",
325 | "endpoint_cfg_name = \"ssd-mobilenet-v2-tfs-endpoint-config-{}\".format(t)\n",
326 | "endpoint_name = \"ssd-mobilenet-v2-tfs-endpoint-{}\".format(t)\n",
327 | "\n",
328 | "container = \"520713654638.dkr.ecr.{}.amazonaws.com/sagemaker-tensorflow-serving:{}-{}\".format(\n",
329 | " region, tfs_version, processor_type)\n",
330 | "\n",
331 | "print('Region:\\t{}'.format(region))\n",
332 | "print('S3 URL:\\ts3://{}/{}'.format(bucket, prefix))\n",
333 | "#print('Role:\\t{}'.format(role))\n",
334 | "print(container)"
335 | ]
336 | },
337 | {
338 | "cell_type": "markdown",
339 | "metadata": {},
340 | "source": [
341 | "Now that the parameters are set, let's deploy the Endpoint using the AWS CLI (Command Line Interface).\n",
342 | "Of course, you can also create the SageMaker model and deploy the Endpoint with the SageMaker SDK. See the code snippet below for how to do it with the SDK.\n",
343 | "\n",
344 | "```python\n",
345 | "from sagemaker.tensorflow.serving import Model\n",
346 | "\n",
347 | "model_data = sagemaker_session.upload_data('model.tar.gz', bucket, prefix)\n",
348 | " \n",
349 | "# The \"Model\" object doesn't create a SageMaker Model until a Transform Job or Endpoint is created.\n",
350 | "tensorflow_serving_model = Model(model_data=model_data,\n",
351 | " role=role,\n",
352 | " framework_version='1.13.0',\n",
353 | " sagemaker_session=sagemaker_session)\n",
354 | "\n",
355 | "predictor = tensorflow_serving_model.deploy(initial_instance_count=1,\n",
356 | " instance_type='ml.c5.large',\n",
357 | "                                           endpoint_name='ssd-mobilenet-v2-tfs-endpoint-sdk',\n",
358 | " wait=False)\n",
359 | "```"
360 | ]
361 | },
362 | {
363 | "cell_type": "code",
364 | "execution_count": null,
365 | "metadata": {},
366 | "outputs": [],
367 | "source": [
368 | "%%bash -s \"$role\" \"$region\" \"$bucket\" \"$prefix\" \"$model_url\" \"$model_name\" \"$endpoint_cfg_name\" \"$endpoint_name\" \"$container\"\n",
369 | "# For convenience, pass in the variables defined in the Python set-up cell above\n",
370 | "\n",
371 | "timestamp() {\n",
372 | " date +%Y-%m-%d-%H-%M-%S\n",
373 | "}\n",
374 | "\n",
375 | "ROLE_ARN=$1\n",
376 | "REGION=$2\n",
377 | "BUCKET=$3\n",
378 | "PREFIX=$4\n",
379 | "MODEL_DATA_URL=$5\n",
380 | "MODEL_NAME=$6\n",
381 | "ENDPOINT_CONFIG_NAME=$7\n",
382 | "ENDPOINT_NAME=$8\n",
383 | "CONTAINER=$9\n",
384 | "\n",
385 | "aws s3 cp model.tar.gz $MODEL_DATA_URL\n",
386 | "\n",
387 | "aws sagemaker create-model \\\n",
388 | " --model-name $MODEL_NAME \\\n",
389 | " --primary-container Image=$CONTAINER,ModelDataUrl=$MODEL_DATA_URL \\\n",
390 | " --execution-role-arn $ROLE_ARN\n",
391 | " \n",
392 | "VARIANT_NAME=\"TFS\"\n",
393 | "INITIAL_INSTANCE_COUNT=1\n",
394 | "INSTANCE_TYPE=\"ml.c5.large\"\n",
395 | "aws sagemaker create-endpoint-config \\\n",
396 | " --endpoint-config-name $ENDPOINT_CONFIG_NAME \\\n",
397 | " --production-variants VariantName=$VARIANT_NAME,ModelName=$MODEL_NAME,InitialInstanceCount=$INITIAL_INSTANCE_COUNT,InstanceType=$INSTANCE_TYPE\n",
398 | "\n",
399 | "aws sagemaker create-endpoint \\\n",
400 | " --endpoint-name $ENDPOINT_NAME \\\n",
401 | " --endpoint-config-name $ENDPOINT_CONFIG_NAME "
402 | ]
403 | },
404 | {
405 | "cell_type": "markdown",
406 | "metadata": {},
407 | "source": [
408 | "It takes about 10 minutes for the Endpoint to be created. Please do not run **2. Real time Inference** until the Endpoint is ready. You can check its status on the SageMaker dashboard; wait until the Status becomes InService\n",
409 | "(see the figure below).\n",
410 | ""
411 | ]
412 | },
413 | {
414 | "cell_type": "markdown",
415 | "metadata": {},
416 | "source": [
417 | "\n",
418 | "\n",
419 | "# 2. Real time Inference\n",
420 | "----\n",
421 | "\n",
422 | "### Introduction\n",
423 | "\n",
424 | "Once the Endpoint has been created, let's run object detection on the test images using the pre-installed object detection API.\n",
425 | "For reference, `show_detection_result()` draws the bounding boxes of the detected objects using the TensorFlow Object Detection API's built-in functions, while `my_show_detection_result()` draws them with hand-written logic."
426 | ]
427 | },
428 | {
429 | "cell_type": "code",
430 | "execution_count": null,
431 | "metadata": {},
432 | "outputs": [],
433 | "source": [
434 | "def get_label_map(label_file):\n",
435 | " label_map = {}\n",
436 | " labels = open(label_file, 'r')\n",
437 | " \n",
438 | " for line in labels:\n",
439 | " line = line.rstrip(\"\\n\")\n",
440 | " ids = line.split(',')\n",
441 | " label_map[int(ids[0])] = ids[2] \n",
442 | " \n",
443 | " return label_map\n",
444 | "\n",
445 | "\n",
446 | "def show_detection_result(img_filepath, endpoint_name, min_score_thresh=0.5):\n",
447 | " import numpy as np\n",
448 | " from PIL import Image\n",
449 | " from object_detection.utils import ops as utils_ops\n",
450 | " from object_detection.utils import label_map_util\n",
451 | " from object_detection.utils import visualization_utils as vis_util\n",
452 | "\n",
453 | " with open(img_filepath, 'rb') as data:\n",
454 | " json_response = sagemaker_client.invoke_endpoint(\n",
455 | " EndpointName=endpoint_name,\n",
456 | " ContentType='application/x-image',\n",
457 | " Body=data)\n",
458 | "\n",
459 | " res_json = json.loads(json_response['Body'].read().decode(\"utf-8\"))\n",
460 | " predictions = res_json['predictions'][0]\n",
461 | " \n",
462 | " image = Image.open(img_filepath)\n",
463 | " width, height = image.size \n",
464 | " image_np = np.array(image)\n",
465 | " image = image_np.reshape([-1, height, width, 3])\n",
466 | "\n",
467 | " category_index = label_map_util.create_category_index_from_labelmap('files/mscoco_label_map.pbtxt', \n",
468 | " use_display_name=True)\n",
469 | "\n",
470 | " vis_util.visualize_boxes_and_labels_on_image_array(\n",
471 | " image_np,\n",
472 | " np.array(predictions['detection_boxes']),\n",
473 | " np.array(predictions['detection_classes']).astype(np.int64),\n",
474 | " np.array(predictions['detection_scores']),\n",
475 | " category_index=category_index,\n",
476 | " instance_masks=predictions.get('detection_masks_reframed', None),\n",
477 | " min_score_thresh=min_score_thresh,\n",
478 | " use_normalized_coordinates=True,\n",
479 | " line_thickness=4)\n",
480 | " \n",
481 | " display(Image.fromarray(image_np))\n",
482 | " \n",
483 | " return predictions\n",
484 | "\n",
485 | "\n",
486 | "def my_show_detection_result(img_filepath, endpoint_name, label_map, min_score_thresh=0.5):\n",
487 | " import numpy as np\n",
488 | " from PIL import Image\n",
489 | " import matplotlib.patches as patches\n",
490 | " import matplotlib.pyplot as plt\n",
491 | " import random \n",
492 | "\n",
493 | " with open(img_filepath, 'rb') as data:\n",
494 | " json_response = sagemaker_client.invoke_endpoint(\n",
495 | " EndpointName=endpoint_name,\n",
496 | " ContentType='application/x-image',\n",
497 | " Body=data)\n",
498 | "\n",
499 | " res_json = json.loads(json_response['Body'].read().decode(\"utf-8\"))\n",
500 | " predictions = res_json['predictions'][0]\n",
501 | " \n",
502 | " image = Image.open(img_filepath)\n",
503 | " width, height = image.size \n",
504 | " image_np = np.array(image)\n",
505 | "\n",
506 | " num_detections = int(predictions['num_detections'])\n",
507 | " bboxes = predictions['detection_boxes'][0:num_detections]\n",
508 | " scores = np.array(predictions['detection_scores'][0:num_detections])\n",
509 | "\n",
510 | " if min_score_thresh is not None:\n",
511 | " scores = scores[scores > min_score_thresh]\n",
512 | "        num_detections = len(scores)  # detection_scores are sorted in descending order\n",
513 | " bboxes = predictions['detection_boxes'][0:num_detections]\n",
514 | "\n",
515 | " cids = predictions['detection_classes'][0:num_detections]\n",
516 | " cids = list(map(int, cids))\n",
517 | " unique_labels = set(cids)\n",
518 | "\n",
519 | "\n",
521 | "    fig, ax = plt.subplots(1, figsize=(10,10))\n",
522 | " ax.imshow(image_np)\n",
523 | " \n",
524 | " # Get bounding-box colors\n",
525 | " cmap = plt.get_cmap('tab20b')\n",
526 | " bbox_colors = [cmap(i) for i in np.linspace(0, 1, len(label_map))]\n",
527 | "\n",
528 | " for bbox, cls_pred, cls_conf in zip(bboxes, cids, scores):\n",
529 | " ymin, xmin, ymax, xmax = bbox\n",
530 | " (xmin, xmax, ymin, ymax) = (xmin * width, xmax * width, ymin * height, ymax * height)\n",
531 | "\n",
532 | " # Get box height and width\n",
533 | " box_h = ymax - ymin\n",
534 | " box_w = xmax - xmin\n",
535 | "\n",
536 | " # Add a box with the color for this class\n",
537 | "        color = bbox_colors[cls_pred % len(bbox_colors)]\n",
538 | "\n",
539 | " bbox = patches.Rectangle((xmin, ymin), box_w, box_h, linewidth=3, edgecolor=color, facecolor='none')\n",
540 | " ax.add_patch(bbox) \n",
541 | " label = '{} {:.3f}'.format(label_map[cls_pred], cls_conf)\n",
542 | " plt.text(xmin, ymin, s=label, color='white', verticalalignment='top',\n",
543 | " bbox={'color': color, 'pad': 0}) \n",
544 | "\n",
545 | "\n",
546 | " return predictions"
547 | ]
548 | },
549 | {
550 | "cell_type": "markdown",
551 | "metadata": {},
552 | "source": [
553 | "As test images, we prepared six of the 2017 Val images from the COCO dataset (http://cocodataset.org/#download). If the hands-on has gone well so far, feel free to upload and test your own images in addition to the six provided.\n",
554 | "\n",
555 | "For reference, the default threshold for the bounding boxes output by Object Detection is 0.5."
556 | ]
557 | },
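The thresholding described above boils down to a filter over the parallel prediction arrays. Because the TF Object Detection API returns `detection_scores` sorted in descending order, keeping the first k entries is enough. A small sketch with a made-up `predictions` dict (not real endpoint output; `filter_detections` is a name invented here):

```python
import numpy as np


def filter_detections(predictions, min_score_thresh=0.5):
    """Keep only detections whose score exceeds the threshold.
    Assumes detection_scores are sorted in descending order."""
    scores = np.asarray(predictions['detection_scores'])
    keep = int(np.sum(scores > min_score_thresh))  # count of entries above threshold
    return {
        'detection_boxes': predictions['detection_boxes'][:keep],
        'detection_classes': predictions['detection_classes'][:keep],
        'detection_scores': scores[:keep].tolist(),
    }


# Made-up predictions: three boxes, two of them above the default 0.5 threshold.
preds = {
    'detection_boxes': [[0.1, 0.1, 0.5, 0.5], [0.2, 0.2, 0.6, 0.6], [0.0, 0.0, 0.1, 0.1]],
    'detection_classes': [18.0, 1.0, 3.0],
    'detection_scores': [0.97, 0.81, 0.12],
}
print(filter_detections(preds)['detection_scores'])  # → [0.97, 0.81]
```

`my_show_detection_result()` above applies the same idea before drawing boxes.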
558 | {
559 | "cell_type": "markdown",
560 | "metadata": {},
561 | "source": [
562 | "**[Note] If you shut down and restart this Jupyter notebook without deleting the endpoint, you do not need to deploy a new one: simply set the endpoint name from the code cells above, or from the SageMaker console, and run inference directly. (See the figure below.)**\n",
563 | "\n",
564 | ""
565 | ]
566 | },
567 | {
568 | "cell_type": "code",
569 | "execution_count": null,
570 | "metadata": {},
571 | "outputs": [],
572 | "source": [
573 | "endpoint_name"
574 | ]
575 | },
576 | {
577 | "cell_type": "code",
578 | "execution_count": null,
579 | "metadata": {},
580 | "outputs": [],
581 | "source": [
582 | "#endpoint_name = '[YOUR-ENDPOINT-NAME]'"
583 | ]
584 | },
585 | {
586 | "cell_type": "code",
587 | "execution_count": null,
588 | "metadata": {},
589 | "outputs": [],
590 | "source": [
591 | "%matplotlib inline\n",
592 | "import os, boto3, json\n",
593 | "sagemaker_client = boto3.client('sagemaker-runtime')\n",
594 | "\n",
595 | "path = \"./images/test\"\n",
596 | "img_list = os.listdir(path)\n",
597 | "img_list = [os.path.join(path, img) for img in img_list]\n",
598 | "label_map = get_label_map('files/coco_labels.txt')\n",
599 | "\n",
600 | "for img in img_list:\n",
601 | " pred = my_show_detection_result(img, endpoint_name, label_map)"
602 | ]
603 | },
604 | {
605 | "cell_type": "markdown",
606 | "metadata": {},
607 | "source": [
608 | "## (Optional) Clean up\n",
609 | "\n",
610 | "To avoid being charged for the SageMaker Endpoint, be sure to delete the Endpoint when you have finished this hands-on lab."
611 | ]
612 | }
613 | ],
614 | "metadata": {
615 | "kernelspec": {
616 | "display_name": "conda_tensorflow_p36",
617 | "language": "python",
618 | "name": "conda_tensorflow_p36"
619 | },
620 | "language_info": {
621 | "codemirror_mode": {
622 | "name": "ipython",
623 | "version": 3
624 | },
625 | "file_extension": ".py",
626 | "mimetype": "text/x-python",
627 | "name": "python",
628 | "nbconvert_exporter": "python",
629 | "pygments_lexer": "ipython3",
630 | "version": "3.6.10"
631 | }
632 | },
633 | "nbformat": 4,
634 | "nbformat_minor": 4
635 | }
636 |
--------------------------------------------------------------------------------