├── LICENSE
├── README.md
├── SketchyDataset_README.md
├── check
│   ├── __init__.py
│   └── feature_vis.py
├── data
│   ├── __init__.py
│   ├── image_input.py
│   └── triplet_input.py
├── feature
│   └── feature_extract.py
├── models
│   ├── TripletEmbedding.py
│   ├── TripletLoss.py
│   ├── __init__.py
│   ├── sketch_resnet.py
│   └── vgg.py
├── record
│   ├── feature_vis
│   │   ├── resnet50_0.png
│   │   ├── resnet50_315.png
│   │   ├── resnet50_64_265.png
│   │   ├── resnet50_940.png
│   │   ├── resnet_150.png
│   │   ├── resnet_90.png
│   │   ├── resnet_imagenet.png
│   │   ├── resnet_separate.png
│   │   ├── vgg16_190epoch.png
│   │   ├── vgg16_5epoch.png
│   │   ├── vgg16_995epoch.png
│   │   └── vgg16_baseline.png
│   └── retrieval_reslut
│       └── result.png
├── retrieval_test.py
├── test_img.txt
├── test_sketch.txt
├── train.py
└── utils
    ├── __init__.py
    ├── extractor.py
    ├── feature_visualizer.py
    ├── test.py
    └── visualize.py
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2019
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # SketchyDatabase
2 |
3 | [](LICENSE)
4 |
5 | This project is a repo of [**The Sketchy Database: Learning to Retrieve Badly Drawn Bunnies**](https://www.cc.gatech.edu/~hays/tmp/sketchy-database.pdf).
6 |
7 | The [**homepage**](http://sketchy.eye.gatech.edu/) of the original project.
8 |
9 | Get the dataset via Google Drive [sketchydataset](https://drive.google.com/file/d/0B7ISyeE8QtDdTjE1MG9Gcy1kSkE/view?usp=sharing&resourcekey=0-r6nB4crmdU-LK7H38xnOUw) [SketchyDataset Intro](https://github.com/CDOTAD/SketchyDatabase/blob/master/SketchyDataset_README.md)
10 |
11 | # DataSet
12 |
13 | Sketchy Database
14 |
15 | ### Test Set
16 |
17 | Since I didn't notice that the Sketchy Database already contains a list of test photos, I randomly chose the test photos and their related sketches myself. The test set is listed in [TEST_IMG](test_img.txt) and [TEST_SKETCH](test_sketch.txt).
18 |
19 | | category | photo | sketch |
20 | | :---: | :---: | :---: |
21 | | airplane | 10 | 75 |
22 | | alarm_clock | | 52 |
23 | | ant | | 53 |
24 | | . | | . |
25 | | . | | . |
26 | | . | | . |
27 | | window | | 54 |
28 | | wine_bottle | | 52 |
29 | | zebra | | 66 |
30 | | **Total** | 1250 | 7875 |
31 |
32 | ### The Dataset Structure in My Project
33 |
34 | ```Bash
35 | Dataset
36 | ├── photo-train # the training set of photos
37 | ├── sketch-triplet-train # the training set of sketches
38 | ├── photo-test # the testing set of photos
39 | ├── sketch-triplet-test # the testing set of sketches
40 | ```
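A photo is paired with its sketches by filename prefix: the sketches of photo `x.jpg` are named `x-1.png`, `x-2.png`, and so on, which is the rule `data/triplet_input.py` uses to pick a positive sketch. A minimal, self-contained sketch of that pairing rule, using made-up file names:

```python
import os
import tempfile

def related_sketches(photo_fname, sketch_dir):
    # Sketch files are named '<photo_id>-<k>.png', so a sketch belongs to a
    # photo when the part before the first '-' equals the photo's basename.
    photo_id = os.path.splitext(photo_fname)[0]
    return sorted(f for f in os.listdir(sketch_dir)
                  if f.split('-')[0] == photo_id)

# Build a tiny fake category directory to demonstrate the rule.
sketch_dir_path = tempfile.mkdtemp()
for name in ('n02691156_10151-1.png', 'n02691156_10151-2.png',
             'n02691156_196-1.png'):
    open(os.path.join(sketch_dir_path, name), 'w').close()

print(related_sketches('n02691156_10151.jpg', sketch_dir_path))
# ['n02691156_10151-1.png', 'n02691156_10151-2.png']
```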
41 | # Test
42 |
43 | Use [feature_extract.py](https://github.com/CDOTAD/SketchyDatabase/blob/master/feature/feature_extract.py) to extract the feature files ('\*.pkl').
44 |
45 | Use [retrieval_test.py](https://github.com/CDOTAD/SketchyDatabase/blob/master/retrieval_test.py) to compute the retrieval results.
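For reference, recall@K here means: a query sketch counts as a hit if its own paired photo is among the K nearest photo features. A minimal sketch of that metric with NumPy (loading the `.pkl` files is omitted; features are assumed stacked one row per image, with `pair_idx[i]` the photo row for sketch `i`):

```python
import numpy as np

def recall_at_k(sketch_feats, photo_feats, pair_idx, k):
    # Fraction of sketches whose paired photo (pair_idx[i]) appears among
    # the k nearest photo features under euclidean distance.
    d2 = (np.sum(sketch_feats ** 2, axis=1, keepdims=True)
          + np.sum(photo_feats ** 2, axis=1)
          - 2.0 * sketch_feats @ photo_feats.T)
    topk = np.argsort(d2, axis=1)[:, :k]
    hits = [pair_idx[i] in topk[i] for i in range(len(pair_idx))]
    return float(np.mean(hits))

# Toy example: 3 photos (one-hot), 4 sketches lying near their paired photos.
photos = np.eye(3)
sketches = np.array([[0.9, 0.1, 0.0],
                     [0.0, 1.1, 0.0],
                     [0.1, 0.0, 0.8],
                     [0.2, 0.9, 0.1]])
pairs = [0, 1, 2, 1]
print(recall_at_k(sketches, photos, pairs, k=1))  # 1.0
```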
46 |
47 | # Testing Result
48 |
49 | GoogLeNet, which performed best in the original paper, had no PyTorch implementation available, so I used vgg16 instead.
50 |
51 | | model | epoch | recall@1 | recall@5|
52 | | :---: | :---: | :---: | :---: |
53 | | resnet34(pretrained;mixed training set;metric='cosine') | | | |
54 | | | 90 | 8.51% | 18.68% |
55 | | | 150 | 9.31% | 20.44% |
56 | |resnet34(pretrained;mixed training set;metric='euclidean')| | | |
57 | | | 90 | 6.45% | 14.79% |
58 | | | 150 | 6.96% | 16.46% |
59 | |resnet34(150 epoch;triplet loss m=0.02;metric='euclidean';lr=1e-5 batch_size=16)| | | |
60 | | | 85 | 9.87% | 22.37% |
61 | |vgg16(pretrained;triplet loss m=0.3;metric='euclidean';lr=1e-5;batch_size=16) | | | |
62 | | | 0 | 0.17% | 0.72% |
63 | | | 5 | 17.59% | 45.51% |
64 | | | 190 | 31.03% | 67.86% |
65 | | | 275 | 32.22% | 68.48% |
66 | | | 975 | 35.24% | 71.53% |
67 | |vgg16(fine-tune(275epoch);m=0.15;metric='euclidean';lr=1e-7;batch_size=16) | | | |
68 | | | 55 | 33.22% | 70.04% |
69 | | | 625 | 35.78% | 72.44% |
70 | | | 995 | 36.09% | 73.02% |
71 | |resnet50(pretrained; triplet loss m=0.15; metric='euclidean'; lr=1e-7;batch_size=16)||||
72 | | | 0 | 0.71% | 11.48% |
73 | | | 55 | 10.18% | 29.94% |
74 | | | 940 | 15.17% | 47.61% |
75 | |resnet50(pretrained; triplet loss m=0.1; metric='euclidean'; lr=1e-6 batch_size=32)||| |
76 | | | 315 | 19.58% | 57.19% |
77 | |resnet50(pretrained; triplet loss m=0.3; metric='euclidean'; lr=1e-5 batch_size=48)||| |
78 | | | 20 | 21.56% | 57.50% |
79 | | | 95 | 30.32% | 71.73% |
80 | | | 265 | 40.08% | 78.83% |
81 | | | 930 | 46.04% | 83.30% |
82 |
83 | I have no idea why resnet34 performed so poorly, while vgg16 and resnet50 did quite well.
84 |
85 | ### Retrieval Result
86 |
87 | I randomly chose 20 sketches as queries; the retrieval results are shown below. The model used is the [resnet50](#resnet) (pretrained; triplet loss m=0.3; metric='euclidean'; lr=1e-5; batch_size=48) after 265 training epochs.
88 |
89 | 
90 |
91 | # Feature Visualization via t-SNE
92 |
93 | All visualized categories are the first ten categories in alphabetical order.
94 |
95 | **The boxes represent the photos, while the points represent the sketches.**
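Plots like those below can be reproduced with scikit-learn's t-SNE. This is a minimal sketch using random stand-in features (real features would come from the extracted `.pkl` files, with photos and sketches then scattered using square vs. dot markers, colored by category):

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
# Stand-in for extracted features: 10 categories, 12 images each, 64-d
# (real vgg16/resnet50 features are 4096-d/2048-d).
labels = np.repeat(np.arange(10), 12)
feats = rng.normal(size=(120, 64)) + labels[:, None]

# Project to 2-D for plotting; perplexity must stay below the sample count.
emb = TSNE(n_components=2, perplexity=10, init='random',
           random_state=0).fit_transform(feats)
print(emb.shape)  # (120, 2)
```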
96 |
97 | | model | vis |
98 | | :---: | :---: |
99 | | resnet34 pretrained on ImageNet | |
100 | | **pretrained; sketch branch & photo branch trained separately** | |
101 | | resnet34 | |
102 | | **pretrained; mixed training set** | |
103 | | resnet34 after 90 training epoch | |
104 | | resnet34 after 150 training epoch | |
105 | | **pretrained; triplet loss m=0.3 lr=1e-5** | |
106 | | vgg16 after 0 training epoch | |
107 | | vgg16 after 5 training epoch | |
108 | | vgg16 after 190 training epoch | |
109 | | **fine tune; triplet loss m=0.15 lr=1e-7** | |
110 | | vgg16(fine tune) after 995 training epoch | |
111 | | **pretrained; triplet loss m=0.15 lr=1e-7** | |
112 | | resnet50 after 0 training epoch | |
113 | | resnet50 after 940 training epoch | |
114 | | **pretrained; triplet loss m=0.1 lr=1e-6** | |
115 | | resnet50 after 315 training epoch | |
116 | | **pretrained; triplet loss m=0.3 lr=1e-5 batch_size=48** | |
117 | | resnet50 after 265 training epoch | |
118 |
--------------------------------------------------------------------------------
/SketchyDataset_README.md:
--------------------------------------------------------------------------------
1 | Sketchy Database - Rendered sketches and augmented photos
2 |
3 | Contents:
4 | photo - a directory containing two different renderings of
5 | all photographs contained within the Sketchy Database
6 | sketch - a directory containing six different renderings of
7 | all sketches contained within the Sketchy Database
8 |
9 | Photographs
10 | All photographs are rendered in JPEG format. Resizing is
11 | performed via OpenCV Imgproc (typically area interpolation
12 | for full images renderings and cubic for bounding box
13 | renderings).
14 |
15 | Augmentations (directories within 'photo')
16 |
17 | tx_000000000000 : image is non-uniformly scaled to 256x256
18 | tx_000100000000 : image bounding box scaled to 256x256 with
19 | an additional +10% on each edge; note
20 | that due to position within the image,
21 | sometimes the object is not centered
22 |
23 | Sketches
24 | All sketches are rendered in PNG format. The original
25 | sketch canvas size is 640x480. In rendering the sketch to a
26 | 256x256 canvas, we take the original photo aspect ratio
27 | as well as the original sketch canvas aspect ratio into
28 | account. We render sketches such that they are consistent
29 | with the transformation made to the image (non-uniform
30 | scale to 256x256). In order to ensure sketches remain fully
31 | on the canvas, some minor adjustments to scale and/or
32 | location are occasionally necessary.
33 | All sketches are rendered using custom OpenGL code, with
34 | a PNG encoding provided by Java's ImageIO API.
35 |
36 | Augmentations (directories within 'sketch')
37 |
38 | tx_000000000000 : sketch canvas is rendered to 256x256
39 | such that it undergoes the same
40 | scaling as the paired photo
41 | tx_000100000000 : sketch is centered and uniformly scaled
42 | such that its greatest dimension (x or y)
43 | fills 78% of the canvas (roughly the same
44 | as in Eitz 2012 sketch data set)
45 | tx_000000000010 : sketch is translated such that it is
46 | centered on the object bounding box
47 | tx_000000000110 : sketch is centered on bounding box and
48 | is uniformly scaled such that one dimension
49 | (x or y; whichever requires the least amount
50 | of scaling) fits within the bounding box
51 | tx_000000001010 : sketch is centered on bounding box and
52 | is uniformly scaled such that one dimension
53 | (x or y; whichever requires the most amount
54 | of scaling) fits within the bounding box
55 | tx_000000001110 : sketch is centered on bounding box and
56 | is non-uniformly scaled such that it
57 | completely fits within the bounding box
58 |
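To make the directory codes above easier to use from code, they can be captured in a small lookup table. This is a sketch; the descriptions simply restate the list above, and `sketch_dir` is a hypothetical helper:

```python
# Sketch: lookup of the sketch augmentation directories described above.
SKETCH_AUG = {
    'tx_000000000000': 'same scaling as the paired photo',
    'tx_000100000000': 'centered, uniformly scaled to 78% of the canvas',
    'tx_000000000010': 'translated to the center of the object bounding box',
    'tx_000000000110': 'centered on bbox, fit by dimension needing least scaling',
    'tx_000000001010': 'centered on bbox, fit by dimension needing most scaling',
    'tx_000000001110': 'non-uniformly scaled to fit the bounding box',
}

def sketch_dir(dataset_root, aug_code):
    # Resolve a rendering directory, failing loudly on unknown codes.
    if aug_code not in SKETCH_AUG:
        raise KeyError('unknown sketch augmentation: %s' % aug_code)
    return '%s/sketch/%s' % (dataset_root, aug_code)

print(sketch_dir('/path/to/sketchy', 'tx_000100000000'))
# /path/to/sketchy/sketch/tx_000100000000
```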
59 |
60 |
--------------------------------------------------------------------------------
/check/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/CDOTAD/SketchyDatabase/0d8f5abf98c8a6d9847e16258493a1e56371b54c/check/__init__.py
--------------------------------------------------------------------------------
/check/feature_vis.py:
--------------------------------------------------------------------------------
1 | from utils.feature_visualizer import FeatureVisualizer
2 |
3 | test_set_root = '/data1/zzl/dataset/sketch-triplet-test'
4 | test_photo_root = '/data1/zzl/dataset/photo-test'
5 | vis = FeatureVisualizer(data_root=test_set_root, feature_root='../feature/sketch-vgg-190epoch.pkl',
6 | env='caffe2torch_featurevis')
7 | # vis.visualize(10, win='training together sketch')
8 |
9 | vis.embedding_vis(em_data_root=test_photo_root, em_feature_root='../feature/photo-vgg-190epoch.pkl',
10 | min_class=0, max_class=10, win='embedding')
11 |
--------------------------------------------------------------------------------
/data/__init__.py:
--------------------------------------------------------------------------------
1 | from data.triplet_input import TripleDataset
2 | from data.image_input import ImageDataset
3 | import torch.utils.data
4 |
5 |
6 | class TripleDataLoader(object):
7 | def __init__(self, opt):
8 | self.dataset = TripleDataset(opt.photo_root, opt.sketch_root)
9 | self.dataloader = torch.utils.data.DataLoader(
10 | self.dataset,
11 | shuffle=True,
12 | batch_size=opt.batch_size,
13 | num_workers=4,
14 | drop_last=True
15 | )
16 |
17 | def load_data(self):
18 | return self
19 |
20 | def __len__(self):
21 | return len(self.dataset)
22 |
23 | def __iter__(self):
24 | for i, data in enumerate(self.dataloader):
25 | yield data
26 |
27 |
28 | class ImageDataLoader(object):
29 | def __init__(self, opt):
30 | self.dataset = ImageDataset(opt.image_root)
31 | self.dataloader = torch.utils.data.DataLoader(
32 | self.dataset,
33 | shuffle=False,
34 | batch_size=opt.batch_size,
35 | num_workers=4,
36 | drop_last=False
37 | )
38 |
39 | def load_data(self):
40 | return self
41 |
42 | def __len__(self):
43 | return len(self.dataset)
44 |
45 | def __iter__(self):
46 | for i, data in enumerate(self.dataloader):
47 | yield data
--------------------------------------------------------------------------------
/data/image_input.py:
--------------------------------------------------------------------------------
1 | import os
2 | import torch.utils.data as data
3 | import torchvision.transforms as transforms
4 | from PIL import Image
5 |
6 |
7 | def make_dataset(root):
8 | images = []
9 |
10 | cnames = os.listdir(root)
11 | for cname in cnames:
12 | c_path = os.path.join(root, cname)
13 | if os.path.isdir(c_path):
14 | fnames = os.listdir(c_path)
15 | for fname in fnames:
16 | path = os.path.join(c_path, fname)
17 | images.append(path)
18 |
19 | return images
20 |
21 |
22 | class ImageDataset(data.Dataset):
23 | def __init__(self, image_root):
24 | self.transform = transforms.Compose([
25 | transforms.Resize(224),
26 | transforms.CenterCrop(224),
27 | transforms.ToTensor(),
28 | transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))
29 | ])
30 |
31 | self.image_root = image_root
32 | self.image_paths = sorted(make_dataset(self.image_root))
33 |
34 | self.len = len(self.image_paths)
35 |
36 | def __getitem__(self, index):
37 |
38 | image_path = self.image_paths[index]
39 |
40 | image = Image.open(image_path).convert('RGB')
41 | image = self.transform(image)
42 |
43 | image_path = image_path.split('/')
44 | cname = image_path[-2]
45 | fname = image_path[-1]
46 |
47 | name = cname + '/' + fname
48 |
49 | return {'I': image, 'N': name}
50 |
51 | def __len__(self):
52 | return self.len
53 |
54 |
--------------------------------------------------------------------------------
/data/triplet_input.py:
--------------------------------------------------------------------------------
1 | import os
2 | import torch.utils.data as data
3 | import torchvision.transforms as transforms
4 | import numpy as np
5 | from PIL import Image
6 |
7 |
8 | def find_classes(root):
9 | classes = [d for d in os.listdir(root) if os.path.isdir(os.path.join(root, d))]
10 | classes.sort()
11 | class_to_idx = {classes[i]: i for i in range(len(classes))}
12 | return classes, class_to_idx
13 |
14 |
15 | def make_dataset(root):
16 | images = []
17 |
18 | cnames = os.listdir(root)
19 | for cname in cnames:
20 | c_path = os.path.join(root, cname)
21 | if os.path.isdir(c_path):
22 | fnames = os.listdir(c_path)
23 | for fname in fnames:
24 | path = os.path.join(c_path, fname)
25 | images.append(path)
26 |
27 | return images
28 |
29 |
30 | class TripleDataset(data.Dataset):
31 | def __init__(self, photo_root, sketch_root):
32 | super(TripleDataset, self).__init__()
33 | self.transform = transforms.Compose([
34 | transforms.Resize(224),
35 | transforms.CenterCrop(224),
36 | transforms.ToTensor(),
37 | transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))
38 | ])
39 | classes, class_to_idx = find_classes(photo_root)
40 |
41 | self.photo_root = photo_root
42 | self.sketch_root = sketch_root
43 |
44 | self.photo_paths = sorted(make_dataset(self.photo_root))
45 | self.classes = classes
46 | self.class_to_idx = class_to_idx
47 |
48 | self.len = len(self.photo_paths)
49 |
50 | def __getitem__(self, index):
51 |
52 | photo_path = self.photo_paths[index]
53 | sketch_path, label = self._getrelate_sketch(photo_path)
54 |
55 | photo = Image.open(photo_path).convert('RGB')
56 | sketch = Image.open(sketch_path).convert('RGB')
57 |
58 | P = self.transform(photo)
59 | S = self.transform(sketch)
60 | L = label
61 | return {'P': P, 'S': S, 'L': L}
62 |
63 | def __len__(self):
64 | return self.len
65 |
66 | def _getrelate_sketch(self, photo_path):
67 |
68 | paths = photo_path.split('/')
69 | fname = paths[-1].split('.')[0]
70 | cname = paths[-2]
71 |
72 | label = self.class_to_idx[cname]
73 |
74 | sketchs = sorted(os.listdir(os.path.join(self.sketch_root, cname)))
75 |
76 | sketch_rel = []
77 | for sketch_name in sketchs:
78 | if sketch_name.split('-')[0] == fname:
79 | sketch_rel.append(sketch_name)
80 |
81 | rnd = np.random.randint(0, len(sketch_rel))
82 |
83 | sketch = sketch_rel[rnd]
84 |
85 | return os.path.join(self.sketch_root, cname, sketch), label
86 |
--------------------------------------------------------------------------------
/feature/feature_extract.py:
--------------------------------------------------------------------------------
1 | from utils.extractor import Extractor
2 | from models.vgg import vgg16
3 | from models.sketch_resnet import resnet50
4 | import torch as t
5 | from torch import nn
6 | import os
7 |
8 | # The script to extract sketches or photos' features using the trained model
9 |
10 | os.environ['CUDA_VISIBLE_DEVICES'] = '1'
11 |
12 | train_set_root = '/data1/zzl/dataset/sketch-triplet-train'
13 | test_set_root = '/data1/zzl/dataset/sketch-triplet-test'
14 |
15 | train_photo_root = '/data1/zzl/dataset/photo-train'
16 | test_photo_root = '/data1/zzl/dataset/photo-test'
17 |
18 | # The trained model root for resnet
19 | SKETCH_RESNET = '/data1/zzl/model/caffe2torch/mixed_triplet_loss/sketch/sketch_resnet_85.pth'
20 | PHOTO_RESNET = '/data1/zzl/model/caffe2torch/mixed_triplet_loss/photo/photo_resnet_85.pth'
21 |
22 | # The trained model root for vgg
23 | SKETCH_VGG = '/data1/zzl/model/caffe2torch/vgg_triplet_loss/sketch/sketch_vgg_190.pth'
24 | PHOTO_VGG = '/data1/zzl/model/caffe2torch/vgg_triplet_loss/photo/photo_vgg_190.pth'
25 |
26 | FINE_TUNE_RESNET = '/data1/zzl/model/caffe2torch/fine_tune/model_270.pth'
27 |
28 | device = 'cuda:1'
29 |
30 | '''vgg'''
31 | vgg = vgg16(pretrained=False)
32 | vgg.classifier[6] = nn.Linear(in_features=4096, out_features=125, bias=True)
33 | vgg.load_state_dict(t.load(PHOTO_VGG, map_location=t.device('cpu')))
34 | vgg.cuda()
35 |
36 | ext = Extractor(pretrained=False)
37 | ext.reload_model(vgg)
38 |
39 | photo_feature = ext.extract_with_dataloader(test_photo_root, 'photo-vgg-190epoch.pkl')
40 |
41 | vgg.load_state_dict(t.load(SKETCH_VGG, map_location=t.device('cpu')))
42 | ext.reload_model(vgg)
43 |
44 | sketch_feature = ext.extract_with_dataloader(test_set_root, 'sketch-vgg-190epoch.pkl')
45 |
46 |
47 | '''resnet'''
48 | resnet = resnet50()
49 | resnet.fc = nn.Linear(in_features=2048, out_features=125)
50 | resnet.load_state_dict(t.load(PHOTO_RESNET, map_location=t.device('cpu')))
51 | resnet.cuda()
52 |
53 | ext = Extractor(pretrained=False)
54 | ext.reload_model(resnet)
55 |
56 | photo_feature = ext.extract_with_dataloader(test_photo_root, 'photo-resnet-epoch.pkl')
57 |
58 | resnet.load_state_dict(t.load(SKETCH_RESNET, map_location=t.device('cpu')))
59 | ext.reload_model(resnet)
60 |
61 | sketch_feature = ext.extract_with_dataloader(test_set_root, 'sketch-resnet-epoch.pkl')
62 |
63 |
--------------------------------------------------------------------------------
/models/TripletEmbedding.py:
--------------------------------------------------------------------------------
1 | import torch as t
2 | from torch import nn
3 | from data import TripleDataLoader
4 | from models.vgg import vgg16
5 | from models.sketch_resnet import resnet34, resnet50
6 | from utils.visualize import Visualizer
7 | from torchnet.meter import AverageValueMeter
8 | # import tqdm
9 | from utils.extractor import Extractor
10 | from sklearn.neighbors import NearestNeighbors
11 | from torch.nn import DataParallel
12 | from .TripletLoss import TripletLoss
13 | from utils.test import Tester
14 | import os
15 | import numpy as np
16 |
17 |
18 | class Config(object):
19 | def __init__(self):
20 | return
21 |
22 |
23 | class TripletNet(object):
24 |
25 | def __init__(self, opt):
26 | # train config
27 | self.photo_root = opt.photo_root
28 | self.sketch_root = opt.sketch_root
29 | self.batch_size = opt.batch_size
30 | # self.device = opt.device
31 | self.epochs = opt.epochs
32 | self.lr = opt.lr
33 |
34 | # testing config
35 | self.photo_test = opt.photo_test
36 | self.sketch_test = opt.sketch_test
37 | self.test = opt.test
38 | self.test_f = opt.test_f
39 |
40 | self.save_model = opt.save_model
41 | self.save_dir = opt.save_dir
42 |
43 | # vis
44 | self.vis = opt.vis
45 | self.env = opt.env
46 |
47 | # fine_tune
48 | self.fine_tune = opt.fine_tune
49 | self.model_root = opt.model_root
50 |
51 | # dataloader config
52 | data_opt = Config()
53 | data_opt.photo_root = opt.photo_root
54 | data_opt.sketch_root = opt.sketch_root
55 | data_opt.batch_size = opt.batch_size
56 |
57 | self.dataloader_opt = data_opt
58 |
59 | # triplet config
60 | self.margin = opt.margin
61 | self.p = opt.p
62 |
63 | # feature extractor net
64 | self.net = opt.net
65 | self.cat = opt.cat
66 |
67 | def _get_vgg16(self, pretrained=True):
68 | model = vgg16(pretrained=pretrained)
69 | model.classifier[6] = nn.Linear(in_features=4096, out_features=125, bias=True)
70 |
71 | return model
72 |
73 | def _get_resnet34(self, pretrained=True):
74 | model = resnet34(pretrained=pretrained)
75 | model.fc = nn.Linear(in_features=512, out_features=125)
76 |
77 | return model
78 |
79 | def _get_resnet50(self, pretrained=True):
80 | model = resnet50(pretrained=pretrained)
81 | model.fc = nn.Linear(in_features=2048, out_features=125)
82 |
83 | return model
84 |
85 | def train(self):
86 |
87 | if self.net == 'vgg16':
88 | photo_net = DataParallel(self._get_vgg16()).cuda()
89 | sketch_net = DataParallel(self._get_vgg16()).cuda()
90 | elif self.net == 'resnet34':
91 | photo_net = DataParallel(self._get_resnet34()).cuda()
92 | sketch_net = DataParallel(self._get_resnet34()).cuda()
93 | elif self.net == 'resnet50':
94 | photo_net = DataParallel(self._get_resnet50()).cuda()
95 | sketch_net = DataParallel(self._get_resnet50()).cuda()
96 |
97 | if self.fine_tune:
98 | photo_net_root = self.model_root
99 | sketch_net_root = self.model_root.replace('photo', 'sketch')
100 |
101 | photo_net.load_state_dict(t.load(photo_net_root, map_location=t.device('cpu')))
102 | sketch_net.load_state_dict(t.load(sketch_net_root, map_location=t.device('cpu')))
103 |
104 | print('net')
105 | print(photo_net)
106 |
107 | # triplet_loss = nn.TripletMarginLoss(margin=self.margin, p=self.p).cuda()
108 | photo_cat_loss = nn.CrossEntropyLoss().cuda()
109 | sketch_cat_loss = nn.CrossEntropyLoss().cuda()
110 |
111 | my_triplet_loss = TripletLoss().cuda()
112 |
113 | # optimizer
114 | photo_optimizer = t.optim.Adam(photo_net.parameters(), lr=self.lr)
115 | sketch_optimizer = t.optim.Adam(sketch_net.parameters(), lr=self.lr)
116 |
117 | if self.vis:
118 | vis = Visualizer(self.env)
119 |
120 | triplet_loss_meter = AverageValueMeter()
121 | sketch_cat_loss_meter = AverageValueMeter()
122 | photo_cat_loss_meter = AverageValueMeter()
123 |
124 | data_loader = TripleDataLoader(self.dataloader_opt)
125 | dataset = data_loader.load_data()
126 |
127 | for epoch in range(self.epochs):
128 |
129 | print('---------------{0}---------------'.format(epoch))
130 |
131 | if self.test and epoch % self.test_f == 0:
132 |
133 | tester_config = Config()
134 | tester_config.test_bs = 128
135 | tester_config.photo_net = photo_net
136 | tester_config.sketch_net = sketch_net
137 |
138 | tester_config.photo_test = self.photo_test
139 | tester_config.sketch_test = self.sketch_test
140 |
141 | tester = Tester(tester_config)
142 | test_result = tester.test_instance_recall()
143 |
144 | result_key = list(test_result.keys())
145 | vis.plot('recall', np.array([test_result[result_key[0]], test_result[result_key[1]]]),
146 | legend=[result_key[0], result_key[1]])
147 | if self.save_model:
148 | t.save(photo_net.state_dict(), self.save_dir + '/photo' + '/photo_' + self.net + '_%s.pth' % epoch)
149 | t.save(sketch_net.state_dict(), self.save_dir + '/sketch' + '/sketch_' + self.net + '_%s.pth' % epoch)
150 |
151 | photo_net.train()
152 | sketch_net.train()
153 |
154 | for ii, data in enumerate(dataset):
155 |
156 | photo_optimizer.zero_grad()
157 | sketch_optimizer.zero_grad()
158 |
159 | photo = data['P'].cuda()
160 | sketch = data['S'].cuda()
161 | label = data['L'].cuda()
162 |
163 | p_cat, p_feature = photo_net(photo)
164 | s_cat, s_feature = sketch_net(sketch)
165 |
166 | # category loss
167 | p_cat_loss = photo_cat_loss(p_cat, label)
168 | s_cat_loss = sketch_cat_loss(s_cat, label)
169 |
170 | photo_cat_loss_meter.add(p_cat_loss.item())
171 | sketch_cat_loss_meter.add(s_cat_loss.item())
172 |
173 | # triplet loss
174 | loss = p_cat_loss + s_cat_loss
175 |
176 | # tri_record = 0.
177 | '''
178 | for i in range(self.batch_size):
179 | # negative
180 | negative_feature = t.cat([p_feature[0:i, :], p_feature[i + 1:, :]], dim=0)
181 | # print('negative_feature.size :', negative_feature.size())
182 | # photo_feature
183 | anchor_feature = s_feature[i, :]
184 | anchor_feature = anchor_feature.expand_as(negative_feature)
185 | # print('anchor_feature.size :', anchor_feature.size())
186 |
187 | # positive
188 | positive_feature = p_feature[i, :]
189 | positive_feature = positive_feature.expand_as(negative_feature)
190 | # print('positive_feature.size :', positive_feature.size())
191 |
192 | tri_loss = triplet_loss(anchor_feature, positive_feature, negative_feature)
193 |
194 | tri_record = tri_record + tri_loss
195 |
196 | # print('tri_loss :', tri_loss)
197 | loss = loss + tri_loss
198 | '''
199 | # print('tri_record : ', tri_record)
200 |
201 | my_tri_loss = my_triplet_loss(s_feature, p_feature) / (self.batch_size - 1)
202 | triplet_loss_meter.add(my_tri_loss.item())
203 | # print('my_tri_loss : ', my_tri_loss)
204 |
205 | # print(tri_record - my_tri_loss)
206 | loss = loss + my_tri_loss
207 | # print('loss :', loss)
208 | # loss = loss / opt.batch_size
209 |
210 | loss.backward()
211 |
212 | photo_optimizer.step()
213 | sketch_optimizer.step()
214 |
215 | if self.vis:
216 | vis.plot('triplet_loss', np.array([triplet_loss_meter.value()[0], photo_cat_loss_meter.value()[0],
217 | sketch_cat_loss_meter.value()[0]]),
218 | legend=['triplet_loss', 'photo_cat_loss', 'sketch_cat_loss'])
219 |
220 | triplet_loss_meter.reset()
221 | photo_cat_loss_meter.reset()
222 | sketch_cat_loss_meter.reset()
223 |
224 |
225 |
226 |
227 |
228 |
229 |
230 |
231 |
232 |
--------------------------------------------------------------------------------
/models/TripletLoss.py:
--------------------------------------------------------------------------------
1 | import torch
2 | import torch.nn as nn
3 | import torch.nn.functional as F
4 |
5 |
6 | class TripletLoss(nn.Module):
7 | def __init__(self, p=2, m=0.3):
8 | super(TripletLoss, self).__init__()
9 |
10 | self.p = p
11 | self.m = m
12 |
13 | def forward(self, sketch, photo):
14 |
15 | bs = sketch.size(0)
16 |
17 | pos_distance = sketch - photo
18 | pos_distance = torch.pow(pos_distance, 2)
19 | pos_distance = torch.sqrt(torch.sum(pos_distance, dim=1))
20 |
21 | sketch_self = sketch.unsqueeze(0)
22 | photo_T = photo.unsqueeze(1)
23 |
24 | negative_distance = sketch_self - photo_T
25 | negative_distance = torch.pow(negative_distance, 2)
26 | negative_distance = torch.sqrt(torch.sum(negative_distance, dim=2))
27 |
28 | triplet_loss = pos_distance - negative_distance # bs x bs
29 | # print('TripletLoss.forward.triplet_loss', triplet_loss)
30 | triplet_loss = triplet_loss + self.m
31 | eye = torch.eye(bs).cuda()
32 | triplet_loss = triplet_loss * (1 - eye)
33 | triplet_loss = F.relu(triplet_loss)
34 |
35 | triplet_loss = torch.sum(triplet_loss, dim=1)
36 |
37 | return torch.sum(triplet_loss)
38 |
--------------------------------------------------------------------------------
/models/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/CDOTAD/SketchyDatabase/0d8f5abf98c8a6d9847e16258493a1e56371b54c/models/__init__.py
--------------------------------------------------------------------------------
/models/sketch_resnet.py:
--------------------------------------------------------------------------------
1 | import torch.nn as nn
2 | import math
3 | import torch.utils.model_zoo as model_zoo
4 |
5 | __all__ = ['ResNet', 'resnet50', 'resnet34']
6 |
7 | model_urls = {
8 | 'resnet18': 'https://download.pytorch.org/models/resnet18-5c106cde.pth',
9 | 'resnet34': 'https://download.pytorch.org/models/resnet34-333f7ec4.pth',
10 | 'resnet50': 'https://download.pytorch.org/models/resnet50-19c8e357.pth',
11 | 'resnet101': 'https://download.pytorch.org/models/resnet101-5d3b4d8f.pth',
12 | 'resnet152': 'https://download.pytorch.org/models/resnet152-b121ed2d.pth',
13 | }
14 |
15 |
16 | def conv3x3(in_planes, out_planes, stride=1):
17 | """3x3 convolution with padding"""
18 | return nn.Conv2d(in_planes, out_planes, kernel_size=3, stride=stride,
19 | padding=1, bias=False)
20 |
21 |
22 | class BasicBlock(nn.Module):
23 | expansion = 1
24 |
25 | def __init__(self, inplanes, planes, stride=1, downsample=None):
26 | super(BasicBlock, self).__init__()
27 | self.conv1 = conv3x3(inplanes, planes, stride)
28 | self.bn1 = nn.BatchNorm2d(planes)
29 | self.relu = nn.ReLU(inplace=True)
30 | self.conv2 = conv3x3(planes, planes)
31 | self.bn2 = nn.BatchNorm2d(planes)
32 | self.downsample = downsample
33 | self.stride = stride
34 |
35 | def forward(self, x):
36 | residual = x
37 |
38 | out = self.conv1(x)
39 | out = self.bn1(out)
40 | out = self.relu(out)
41 |
42 | out = self.conv2(out)
43 | out = self.bn2(out)
44 |
45 | if self.downsample is not None:
46 | residual = self.downsample(x)
47 |
48 | out += residual
49 | out = self.relu(out)
50 |
51 | return out
52 |
53 |
54 | class Bottleneck(nn.Module):
55 | expansion = 4
56 |
57 | def __init__(self, inplanes, planes, stride=1, downsample=None):
58 | super(Bottleneck, self).__init__()
59 | self.conv1 = nn.Conv2d(inplanes, planes, kernel_size=1, bias=False)
60 | self.bn1 = nn.BatchNorm2d(planes)
61 | self.conv2 = nn.Conv2d(planes, planes, kernel_size=3, stride=stride,
62 | padding=1, bias=False)
63 | self.bn2 = nn.BatchNorm2d(planes)
64 | self.conv3 = nn.Conv2d(planes, planes * 4, kernel_size=1, bias=False)
65 | self.bn3 = nn.BatchNorm2d(planes * 4)
66 | self.relu = nn.ReLU(inplace=True)
67 | self.downsample = downsample
68 | self.stride = stride
69 |
70 | def forward(self, x):
71 | residual = x
72 |
73 | out = self.conv1(x)
74 | out = self.bn1(out)
75 | out = self.relu(out)
76 |
77 | out = self.conv2(out)
78 | out = self.bn2(out)
79 | out = self.relu(out)
80 |
81 | out = self.conv3(out)
82 | out = self.bn3(out)
83 |
84 | if self.downsample is not None:
85 | residual = self.downsample(x)
86 |
87 | out += residual
88 | out = self.relu(out)
89 |
90 | return out
91 |
92 |
93 |
94 |
95 | class ResNet(nn.Module):
96 |
97 | def __init__(self, block, layers, num_classes=1000):
98 | self.inplanes = 64
99 | super(ResNet, self).__init__()
100 | self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3,
101 | bias=False)
102 | self.bn1 = nn.BatchNorm2d(64)
103 | self.relu = nn.ReLU(inplace=True)
104 | self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
105 | self.layer1 = self._make_layer(block, 64, layers[0])
106 | self.layer2 = self._make_layer(block, 128, layers[1], stride=2)
107 | self.layer3 = self._make_layer(block, 256, layers[2], stride=2)
108 | self.layer4 = self._make_layer(block, 512, layers[3], stride=2)
109 | self.avgpool = nn.AdaptiveAvgPool2d(1)
110 |
111 | self.fc = nn.Linear(512 * block.expansion, num_classes)
112 |
113 | for m in self.modules():
114 | if isinstance(m, nn.Conv2d):
115 | n = m.kernel_size[0] * m.kernel_size[1] * m.out_channels
116 | m.weight.data.normal_(0, math.sqrt(2. / n))
117 | elif isinstance(m, nn.BatchNorm2d):
118 | m.weight.data.fill_(1)
119 | m.bias.data.zero_()
120 |
121 | def _make_layer(self, block, planes, blocks, stride=1):
122 | downsample = None
123 | if stride != 1 or self.inplanes != planes * block.expansion:
124 | downsample = nn.Sequential(
125 | nn.Conv2d(self.inplanes, planes * block.expansion,
126 | kernel_size=1, stride=stride, bias=False),
127 | nn.BatchNorm2d(planes * block.expansion),
128 | )
129 |
130 | layers = []
131 | layers.append(block(self.inplanes, planes, stride, downsample))
132 | self.inplanes = planes * block.expansion
133 |         for _ in range(1, blocks):
134 | layers.append(block(self.inplanes, planes))
135 |
136 | return nn.Sequential(*layers)
137 |
138 | def forward(self, x):
139 | x = self.conv1(x)
140 | x = self.bn1(x)
141 | x = self.relu(x)
142 | x = self.maxpool(x)
143 |
144 | x = self.layer1(x)
145 | x = self.layer2(x)
146 | x = self.layer3(x)
147 | x = self.layer4(x)
148 |
149 | x = self.avgpool(x)
150 | x = x.view(x.size(0), -1)
151 |
152 | feature = x
153 | x = self.fc(x)
154 |
155 | return x, feature
156 |
157 |
158 | def resnet34(pretrained=False, **kwargs):
159 | """Constructs a ResNet-34 model.
160 |
161 | Args:
162 | pretrained (bool): If True, returns a model pre-trained on ImageNet
163 | """
164 | model = ResNet(BasicBlock, [3, 4, 6, 3], **kwargs)
165 | if pretrained:
166 | model.load_state_dict(model_zoo.load_url(model_urls['resnet34']))
167 | return model
168 |
169 |
170 | def resnet50(pretrained=False, **kwargs):
171 | """Constructs a ResNet-50 model.
172 |
173 | Args:
174 | pretrained (bool): If True, returns a model pre-trained on ImageNet
175 | """
176 | model = ResNet(Bottleneck, [3, 4, 6, 3], **kwargs)
177 | if pretrained:
178 | model.load_state_dict(model_zoo.load_url(model_urls['resnet50']), strict=False)
179 | return model
180 |
181 |
182 |
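A note on the weight initialization above: both this file and `vgg.py` draw conv weights from `N(0, sqrt(2/n))` with `n = k_h * k_w * out_channels`, i.e. He (Kaiming) initialization. A minimal NumPy sketch of that formula (illustration only, not the repo's code — the helper name `he_std` is made up here):

```python
import numpy as np

# He (Kaiming) initialization as used in sketch_resnet.py / vgg.py:
# for a conv layer with a k_h x k_w kernel and out_channels filters,
# weights ~ N(0, sqrt(2 / n)) with n = k_h * k_w * out_channels.
def he_std(kernel_h, kernel_w, out_channels):
    n = kernel_h * kernel_w * out_channels
    return np.sqrt(2.0 / n)

rng = np.random.default_rng(0)
std = he_std(3, 3, 64)                       # a 3x3 conv with 64 output channels
w = rng.normal(0.0, std, size=(64, 64, 3, 3))  # (out, in, k_h, k_w) weight tensor

# the empirical std of the sampled weights should be close to the target
print(round(std, 4), round(float(w.std()), 4))
```

In modern PyTorch the same effect is obtained with `nn.init.kaiming_normal_(m.weight, mode='fan_out', nonlinearity='relu')`, which is what torchvision's ResNet now uses.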
--------------------------------------------------------------------------------
/models/vgg.py:
--------------------------------------------------------------------------------
1 | import torch.nn as nn
2 | import torch.utils.model_zoo as model_zoo
3 | import math
4 |
5 | __all__ = [
6 | 'VGG', 'vgg16',
7 | ]
8 |
9 |
10 | model_urls = {
11 | 'vgg16': 'https://download.pytorch.org/models/vgg16-397923af.pth',
12 | }
13 |
14 |
15 | class VGG(nn.Module):
16 |
17 | def __init__(self, features, num_classes=1000, init_weights=True):
18 | super(VGG, self).__init__()
19 | self.features = features
20 | self.avgpool = nn.AdaptiveAvgPool2d(1)
21 |
22 | self.classifier = nn.Sequential(
23 | nn.Linear(512 * 7 * 7, 4096),
24 | nn.ReLU(True),
25 | nn.Dropout(),
26 | nn.Linear(4096, 4096),
27 | nn.ReLU(True),
28 | nn.Dropout(),
29 | nn.Linear(4096, num_classes),
30 | )
31 | if init_weights:
32 | self._initialize_weights()
33 |
34 | def forward(self, x):
35 | x = self.features(x)
36 |         # global-average-pool the conv output and flatten to a 512-d embedding
37 |         feature = self.avgpool(x).view(x.size(0), -1)
38 |         x = x.view(x.size(0), -1)
39 | x = self.classifier(x)
40 |
41 | return x, feature
42 |
43 | def _initialize_weights(self):
44 | for m in self.modules():
45 | if isinstance(m, nn.Conv2d):
46 | n = m.kernel_size[0] * m.kernel_size[1] * m.out_channels
47 | m.weight.data.normal_(0, math.sqrt(2. / n))
48 | if m.bias is not None:
49 | m.bias.data.zero_()
50 | elif isinstance(m, nn.BatchNorm2d):
51 | m.weight.data.fill_(1)
52 | m.bias.data.zero_()
53 | elif isinstance(m, nn.Linear):
54 | m.weight.data.normal_(0, 0.01)
55 | m.bias.data.zero_()
56 |
57 |
58 | def make_layers(cfg, batch_norm=False):
59 | layers = []
60 | in_channels = 3
61 | for v in cfg:
62 | if v == 'M':
63 | layers += [nn.MaxPool2d(kernel_size=2, stride=2)]
64 | else:
65 | conv2d = nn.Conv2d(in_channels, v, kernel_size=3, padding=1)
66 | if batch_norm:
67 | layers += [conv2d, nn.BatchNorm2d(v), nn.ReLU(inplace=True)]
68 | else:
69 | layers += [conv2d, nn.ReLU(inplace=True)]
70 | in_channels = v
71 | return nn.Sequential(*layers)
72 |
73 |
74 | cfg = {
75 | 'A': [64, 'M', 128, 'M', 256, 256, 'M', 512, 512, 'M', 512, 512, 'M'],
76 | 'B': [64, 64, 'M', 128, 128, 'M', 256, 256, 'M', 512, 512, 'M', 512, 512, 'M'],
77 | 'D': [64, 64, 'M', 128, 128, 'M', 256, 256, 256, 'M', 512, 512, 512, 'M', 512, 512, 512, 'M'],
78 | 'E': [64, 64, 'M', 128, 128, 'M', 256, 256, 256, 256, 'M', 512, 512, 512, 512, 'M', 512, 512, 512, 512, 'M'],
79 | }
80 |
81 |
82 | def vgg16(pretrained=False, **kwargs):
83 | """VGG 16-layer model (configuration "D")
84 |
85 | Args:
86 | pretrained (bool): If True, returns a model pre-trained on ImageNet
87 | """
88 | if pretrained:
89 | kwargs['init_weights'] = False
90 | model = VGG(make_layers(cfg['D']), **kwargs)
91 | if pretrained:
92 | model.load_state_dict(model_zoo.load_url(model_urls['vgg16']))
93 | return model
94 |
95 |
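The `cfg['D']` list above defines VGG-16: each integer is a 3x3 conv layer's output width and each `'M'` is a max-pool, giving 13 conv layers, which together with the 3 `nn.Linear` layers in `VGG.classifier` make the 16 weight layers of the name. A quick sanity check (the `'D'` list is copied verbatim from `cfg` above):

```python
# 'D' configuration copied from cfg in vgg.py; 'M' marks a max-pool.
cfg_d = [64, 64, 'M', 128, 128, 'M', 256, 256, 256, 'M',
         512, 512, 512, 'M', 512, 512, 512, 'M']

n_conv = sum(1 for v in cfg_d if v != 'M')   # conv layers emitted by make_layers
n_pool = cfg_d.count('M')                    # max-pool layers
n_fc = 3                                     # Linear layers in VGG.classifier

print(n_conv, n_pool, n_conv + n_fc)  # -> 13 5 16
```

Five pools halve a 224x224 input down to 7x7, which is why the first classifier layer expects `512 * 7 * 7` inputs.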
--------------------------------------------------------------------------------
/record/feature_vis/resnet50_0.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/CDOTAD/SketchyDatabase/0d8f5abf98c8a6d9847e16258493a1e56371b54c/record/feature_vis/resnet50_0.png
--------------------------------------------------------------------------------
/record/feature_vis/resnet50_315.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/CDOTAD/SketchyDatabase/0d8f5abf98c8a6d9847e16258493a1e56371b54c/record/feature_vis/resnet50_315.png
--------------------------------------------------------------------------------
/record/feature_vis/resnet50_64_265.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/CDOTAD/SketchyDatabase/0d8f5abf98c8a6d9847e16258493a1e56371b54c/record/feature_vis/resnet50_64_265.png
--------------------------------------------------------------------------------
/record/feature_vis/resnet50_940.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/CDOTAD/SketchyDatabase/0d8f5abf98c8a6d9847e16258493a1e56371b54c/record/feature_vis/resnet50_940.png
--------------------------------------------------------------------------------
/record/feature_vis/resnet_150.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/CDOTAD/SketchyDatabase/0d8f5abf98c8a6d9847e16258493a1e56371b54c/record/feature_vis/resnet_150.png
--------------------------------------------------------------------------------
/record/feature_vis/resnet_90.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/CDOTAD/SketchyDatabase/0d8f5abf98c8a6d9847e16258493a1e56371b54c/record/feature_vis/resnet_90.png
--------------------------------------------------------------------------------
/record/feature_vis/resnet_imagenet.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/CDOTAD/SketchyDatabase/0d8f5abf98c8a6d9847e16258493a1e56371b54c/record/feature_vis/resnet_imagenet.png
--------------------------------------------------------------------------------
/record/feature_vis/resnet_separate.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/CDOTAD/SketchyDatabase/0d8f5abf98c8a6d9847e16258493a1e56371b54c/record/feature_vis/resnet_separate.png
--------------------------------------------------------------------------------
/record/feature_vis/vgg16_190epoch.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/CDOTAD/SketchyDatabase/0d8f5abf98c8a6d9847e16258493a1e56371b54c/record/feature_vis/vgg16_190epoch.png
--------------------------------------------------------------------------------
/record/feature_vis/vgg16_5epoch.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/CDOTAD/SketchyDatabase/0d8f5abf98c8a6d9847e16258493a1e56371b54c/record/feature_vis/vgg16_5epoch.png
--------------------------------------------------------------------------------
/record/feature_vis/vgg16_995epoch.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/CDOTAD/SketchyDatabase/0d8f5abf98c8a6d9847e16258493a1e56371b54c/record/feature_vis/vgg16_995epoch.png
--------------------------------------------------------------------------------
/record/feature_vis/vgg16_baseline.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/CDOTAD/SketchyDatabase/0d8f5abf98c8a6d9847e16258493a1e56371b54c/record/feature_vis/vgg16_baseline.png
--------------------------------------------------------------------------------
/record/retrieval_reslut/result.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/CDOTAD/SketchyDatabase/0d8f5abf98c8a6d9847e16258493a1e56371b54c/record/retrieval_reslut/result.png
--------------------------------------------------------------------------------
/retrieval_test.py:
--------------------------------------------------------------------------------
1 | import pickle
2 | from sklearn.neighbors import NearestNeighbors
3 | import numpy as np
4 | from PIL import Image
5 | import os
6 | import torchvision.transforms as transforms
7 | from utils.visualize import Visualizer
8 | import tqdm
9 |
10 | PHOTO_ROOT = '/data1/zzl/dataset/photo-test'
11 | SKETCH_ROOT = '/data1/zzl/dataset/sketch-triplet-test'
12 |
13 | photo_data = pickle.load(open('feature/photo-resnet50_64-265.pkl', 'rb'))
14 | sketch_data = pickle.load(open('feature/sketch-resnet50_64-265.pkl', 'rb'))
15 | # print(photo_data['name'][0])
16 | photo_feature = photo_data['feature']
17 | photo_name = photo_data['name']
18 |
19 | sketch_feature = sketch_data['feature']
20 | sketch_name = sketch_data['name']
21 | # print(np.size(photo_feature, 0))
22 |
23 | nbrs = NearestNeighbors(n_neighbors=np.size(photo_feature, 0),
24 | algorithm='brute', metric='euclidean').fit(photo_feature)
25 |
26 | s_len = np.size(sketch_feature, 0)
27 | picked = [0] * s_len
28 |
29 | transform = transforms.Compose([
30 | transforms.ToTensor()
31 | ])
32 |
33 | vis = Visualizer('caffe2torch_test')
34 | count = 0
35 | for ii in range(20):
36 | index = np.random.randint(0, s_len)
37 | while picked[index]:
38 | index = np.random.randint(0, s_len)
39 |
40 | picked[index] = 1
41 |
42 | query_feature = sketch_feature[index]
43 | query_feature = np.reshape(query_feature, [1, np.shape(query_feature)[0]])
44 | query_name = sketch_name[index]
45 | # print(query_name)
46 | distances, indices = nbrs.kneighbors(query_feature)
47 |
48 | query_split = query_name.split('/')
49 | query_class = query_split[0]
50 | query_img = query_split[1]
51 |
52 | query_image = np.array(Image.open(os.path.join(SKETCH_ROOT, query_name)).convert('RGB'))
53 |     for i, idx in enumerate(indices[0][:5]):
54 |         retrieved_name = photo_name[idx]
55 |         retrieved_im = np.array(Image.open(os.path.join(PHOTO_ROOT, retrieved_name)).convert('RGB'))
56 |         query_image = np.append(query_image, retrieved_im, axis=1)
57 |
58 |         retrieved_class = retrieved_name.split('/')[0]
59 |         retrieved_name = retrieved_name.split('/')[1]
60 |         retrieved_name = retrieved_name.split('.')[0]
61 |
62 |         if retrieved_class == query_class:
63 |             print(ii, 'correct class', query_name, retrieved_name)
64 |             if query_img.find(retrieved_name) != -1:
65 |                 print(ii, 'correct item', query_name)
66 |                 count += 1
67 |
68 | if ii == 0:
69 | result = query_image
70 | else:
71 | result = np.append(result, query_image, axis=0)
72 |
73 | result = transform(result)
74 | vis.images(result.numpy(), win='result')
75 |
76 | print(count)
77 |
78 |
79 | count = 0
80 | count_5 = 0
81 | K = 5
82 |
83 | div = 0
84 |
85 | for ii, (query_sketch, query_name) in tqdm.tqdm(enumerate(zip(sketch_feature, sketch_name))):
86 | query_sketch = np.reshape(query_sketch, [1, np.shape(query_sketch)[0]])
87 |
88 | query_split = query_name.split('/')
89 | query_class = query_split[0]
90 | query_img = query_split[1]
91 |
92 | distances, indices = nbrs.kneighbors(query_sketch)
93 |
94 | div += distances[0][1] - distances[0][0]
95 |
96 | # top K
97 |
98 |     for i, idx in enumerate(indices[0][:K]):
99 |
100 |         retrieved_name = photo_name[idx]
101 |         retrieved_class = retrieved_name.split('/')[0]
102 |
103 |         retrieved_name = retrieved_name.split('/')[1]
104 |         retrieved_name = retrieved_name.split('.')[0]
105 |
106 |         if retrieved_class == query_class:
107 |             if query_img.find(retrieved_name) != -1:
108 |                 if i == 0:
109 |                     count += 1
110 |                 count_5 += 1
111 |                 break
112 | recall = count / (ii+1)
113 | recall_5 = count_5 / (ii+1)
114 | print('recall@1 :', recall, ' recall@5 :', recall_5, 'div :', div/(ii+1))
115 |
116 |
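The evaluation loop above is brute-force nearest-neighbour search in feature space plus recall@1/recall@K bookkeeping. A self-contained sketch of the same logic on toy 2-D features (hypothetical data; same `sklearn` API as the script):

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

# toy gallery of 4 photo features and 2 sketch queries (hypothetical data)
photo_feature = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0]])
sketch_feature = np.array([[0.1, 0.0],   # nearest gallery item: photo 0
                           [4.8, 5.1]])  # nearest gallery item: photo 3
gt = [0, 3]  # ground-truth photo index for each sketch

nbrs = NearestNeighbors(n_neighbors=len(photo_feature),
                        algorithm='brute', metric='euclidean').fit(photo_feature)

K = 2
count = count_k = 0
for query, target in zip(sketch_feature, gt):
    _, indices = nbrs.kneighbors(query.reshape(1, -1))
    top_k = indices[0][:K]
    if top_k[0] == target:       # recall@1: correct item ranked first
        count += 1
    if target in top_k:          # recall@K: correct item anywhere in the top K
        count_k += 1

recall_1 = count / len(sketch_feature)
recall_k = count_k / len(sketch_feature)
print(recall_1, recall_k)  # -> 1.0 1.0 on this toy data
```

The script's version additionally matches by class/filename strings because gallery and query live in different directories, but the ranking itself is exactly this `kneighbors` call.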
--------------------------------------------------------------------------------
/test_img.txt:
--------------------------------------------------------------------------------
1 | airplane/n02691156_2701.jpg
2 | airplane/n02691156_8876.jpg
3 | airplane/n02691156_8352.jpg
4 | airplane/n02691156_7027.jpg
5 | airplane/n02691156_7472.jpg
6 | airplane/n02691156_5813.jpg
7 | airplane/n02691156_53586.jpg
8 | airplane/n02691156_4134.jpg
9 | airplane/n02691156_9491.jpg
10 | airplane/n02691156_3791.jpg
11 | alarm_clock/n02694662_7256.jpg
12 | alarm_clock/n02694662_14931.jpg
13 | alarm_clock/n02694662_7591.jpg
14 | alarm_clock/n02694662_1079.jpg
15 | alarm_clock/n02694662_14901.jpg
16 | alarm_clock/n02694662_12296.jpg
17 | alarm_clock/n02694662_17547.jpg
18 | alarm_clock/n02694662_11650.jpg
19 | alarm_clock/n02694662_9154.jpg
20 | alarm_clock/n02694662_4008.jpg
21 | ant/n02219486_677.jpg
22 | ant/n02219486_17589.jpg
23 | ant/n02219486_30822.jpg
24 | ant/n02219486_29662.jpg
25 | ant/n02219486_28415.jpg
26 | ant/n02219486_25482.jpg
27 | ant/n02219486_24244.jpg
28 | ant/n02219486_26134.jpg
29 | ant/n02219486_27825.jpg
30 | ant/n02219486_21712.jpg
31 | ape/n02483362_6484.jpg
32 | ape/n02470325_11452.jpg
33 | ape/n02480495_788.jpg
34 | ape/n02483362_5683.jpg
35 | ape/n02480495_162.jpg
36 | ape/n02483708_6167.jpg
37 | ape/n02480495_25570.jpg
38 | ape/n02481823_3822.jpg
39 | ape/n02481823_3882.jpg
40 | ape/n02481500_7639.jpg
41 | apple/n07739125_10746.jpg
42 | apple/n07739125_5971.jpg
43 | apple/n07739125_7934.jpg
44 | apple/n07739125_10944.jpg
45 | apple/n07739125_4074.jpg
46 | apple/n07739125_9823.jpg
47 | apple/n07739125_1406.jpg
48 | apple/n07739125_6509.jpg
49 | apple/n07739125_6351.jpg
50 | apple/n07739125_12353.jpg
51 | armor/n02895154_34811.jpg
52 | armor/n03146219_14890.jpg
53 | armor/n02895154_14298.jpg
54 | armor/n03000247_55528.jpg
55 | armor/n03000247_68189.jpg
56 | armor/n03000247_17925.jpg
57 | armor/n02895154_36195.jpg
58 | armor/n03146219_4724.jpg
59 | armor/n03000247_9005.jpg
60 | armor/n03000247_71108.jpg
61 | axe/n02764044_13358.jpg
62 | axe/n02764044_8541.jpg
63 | axe/n02764044_20350.jpg
64 | axe/n02764044_25921.jpg
65 | axe/n02764044_56605.jpg
66 | axe/n02764044_33304.jpg
67 | axe/n02764044_30843.jpg
68 | axe/n02764044_23413.jpg
69 | axe/n02764044_17192.jpg
70 | axe/n02764044_64984.jpg
71 | banana/n07753592_6075.jpg
72 | banana/n07753592_2049.jpg
73 | banana/n07753592_7792.jpg
74 | banana/n07753592_20750.jpg
75 | banana/n07753592_1416.jpg
76 | banana/n07753592_10233.jpg
77 | banana/n07753592_1193.jpg
78 | banana/n07753592_11641.jpg
79 | banana/n07753592_12072.jpg
80 | banana/n07753592_6756.jpg
81 | bat/n02139199_12462.jpg
82 | bat/n02139199_11487.jpg
83 | bat/n02139199_8857.jpg
84 | bat/n02147947_6236.jpg
85 | bat/n02149420_2799.jpg
86 | bat/n02147947_3903.jpg
87 | bat/n02147328_2008.jpg
88 | bat/n02147328_642.jpg
89 | bat/n02139199_16145.jpg
90 | bat/n02149420_3768.jpg
91 | bear/n02131653_7966.jpg
92 | bear/n02131653_12462.jpg
93 | bear/n02131653_1550.jpg
94 | bear/n02131653_2786.jpg
95 | bear/n02131653_6170.jpg
96 | bear/n02131653_3012.jpg
97 | bear/n02131653_5922.jpg
98 | bear/n02131653_7163.jpg
99 | bear/n02131653_6207.jpg
100 | bear/n02131653_5813.jpg
101 | bee/n02206856_3121.jpg
102 | bee/n02206856_5850.jpg
103 | bee/n02206856_4261.jpg
104 | bee/n02206856_6361.jpg
105 | bee/n02206856_6903.jpg
106 | bee/n02206856_1815.jpg
107 | bee/n02206856_6552.jpg
108 | bee/n02206856_8544.jpg
109 | bee/n02206856_1139.jpg
110 | bee/n02206856_2419.jpg
111 | beetle/n02169497_3918.jpg
112 | beetle/n02169497_5428.jpg
113 | beetle/n02169497_197.jpg
114 | beetle/n02167151_967.jpg
115 | beetle/n02169497_3809.jpg
116 | beetle/n02168699_7300.jpg
117 | beetle/n02176261_7522.jpg
118 | beetle/n02167151_7464.jpg
119 | beetle/n02169497_2275.jpg
120 | beetle/n02167151_9858.jpg
121 | bell/n02824448_19917.jpg
122 | bell/n02824448_3251.jpg
123 | bell/n02824448_1680.jpg
124 | bell/n02824448_13821.jpg
125 | bell/n03028596_4575.jpg
126 | bell/n02824448_13582.jpg
127 | bell/n02824448_30419.jpg
128 | bell/n02824448_11201.jpg
129 | bell/n03028596_2273.jpg
130 | bell/n02824448_15595.jpg
131 | bench/n03891251_15.jpg
132 | bench/n02828884_4015.jpg
133 | bench/n02828884_2200.jpg
134 | bench/n03891251_7764.jpg
135 | bench/n02828884_7329.jpg
136 | bench/n03891251_1444.jpg
137 | bench/n03891251_3664.jpg
138 | bench/n03891251_2480.jpg
139 | bench/n02828884_10535.jpg
140 | bench/n02828884_1200.jpg
141 | bicycle/n02834778_6942.jpg
142 | bicycle/n02834778_11645.jpg
143 | bicycle/n03792782_4600.jpg
144 | bicycle/n04126066_9549.jpg
145 | bicycle/n02834778_2873.jpg
146 | bicycle/n02834778_12279.jpg
147 | bicycle/n02834778_7767.jpg
148 | bicycle/n02834778_11636.jpg
149 | bicycle/n02834778_6186.jpg
150 | bicycle/n02834778_12174.jpg
151 | blimp/n02850950_12373.jpg
152 | blimp/n02850950_3246.jpg
153 | blimp/n02850950_22283.jpg
154 | blimp/n02850950_3765.jpg
155 | blimp/n02850950_10002.jpg
156 | blimp/n02850950_58.jpg
157 | blimp/n02850950_13395.jpg
158 | blimp/n02850950_1715.jpg
159 | blimp/n02850950_18396.jpg
160 | blimp/n02850950_4074.jpg
161 | bread/n07682316_7608.jpg
162 | bread/n07683786_25458.jpg
163 | bread/n07679356_27111.jpg
164 | bread/n07683786_6427.jpg
165 | bread/n07684084_2909.jpg
166 | bread/n07684084_1579.jpg
167 | bread/n07684084_2352.jpg
168 | bread/n07684084_368.jpg
169 | bread/n07683786_14450.jpg
170 | bread/n07682316_1438.jpg
171 | butterfly/n02274259_21905.jpg
172 | butterfly/n02274259_18681.jpg
173 | butterfly/n02274259_5479.jpg
174 | butterfly/n02274259_20043.jpg
175 | butterfly/n02274259_19471.jpg
176 | butterfly/n02274259_14994.jpg
177 | butterfly/n02274259_23833.jpg
178 | butterfly/n02274259_15828.jpg
179 | butterfly/n02274259_8298.jpg
180 | butterfly/n02274259_19616.jpg
181 | cabin/n02932400_7727.jpg
182 | cabin/n02932400_11404.jpg
183 | cabin/n02932400_4934.jpg
184 | cabin/n02932400_9565.jpg
185 | cabin/n02932400_5733.jpg
186 | cabin/n03686924_4151.jpg
187 | cabin/n03686924_2295.jpg
188 | cabin/n02932400_1945.jpg
189 | cabin/n02932400_1408.jpg
190 | cabin/n03686924_10083.jpg
191 | camel/n02437136_3994.jpg
192 | camel/n02437136_3674.jpg
193 | camel/n02437136_4845.jpg
194 | camel/n02437136_584.jpg
195 | camel/n02437136_3428.jpg
196 | camel/n02437136_4429.jpg
197 | camel/n02437136_8319.jpg
198 | camel/n02437136_14156.jpg
199 | camel/n02437136_257.jpg
200 | camel/n02437136_4752.jpg
201 | candle/n02948072_3987.jpg
202 | candle/n02948072_21856.jpg
203 | candle/n02948072_12086.jpg
204 | candle/n02948072_29247.jpg
205 | candle/n02948072_1666.jpg
206 | candle/n02948072_33885.jpg
207 | candle/n02948072_6033.jpg
208 | candle/n02948072_5309.jpg
209 | candle/n02948072_34562.jpg
210 | candle/n02948072_14341.jpg
211 | cannon/n02950826_10208.jpg
212 | cannon/n02950826_14185.jpg
213 | cannon/n02950826_1353.jpg
214 | cannon/n02950826_18518.jpg
215 | cannon/n02950826_5567.jpg
216 | cannon/n02950826_4158.jpg
217 | cannon/n02950826_17195.jpg
218 | cannon/n02950826_2757.jpg
219 | cannon/n02950826_4653.jpg
220 | cannon/n02950826_12133.jpg
221 | car_(sedan)/n04166281_5551.jpg
222 | car_(sedan)/n02958343_1853.jpg
223 | car_(sedan)/n02958343_520.jpg
224 | car_(sedan)/n04166281_1917.jpg
225 | car_(sedan)/n04166281_263.jpg
226 | car_(sedan)/n02958343_12252.jpg
227 | car_(sedan)/n02958343_4750.jpg
228 | car_(sedan)/n02958343_6586.jpg
229 | car_(sedan)/n04166281_185.jpg
230 | car_(sedan)/n04166281_4002.jpg
231 | castle/n02980441_8875.jpg
232 | castle/n02980441_22886.jpg
233 | castle/n02980441_14457.jpg
234 | castle/n02980441_11297.jpg
235 | castle/n02980441_11789.jpg
236 | castle/n02980441_8524.jpg
237 | castle/n02980441_1533.jpg
238 | castle/n02980441_14009.jpg
239 | castle/n02980441_11128.jpg
240 | castle/n02980441_6317.jpg
241 | cat/n02121620_10527.jpg
242 | cat/n02121620_4954.jpg
243 | cat/n02121620_51.jpg
244 | cat/n02121620_31930.jpg
245 | cat/n02121620_5192.jpg
246 | cat/n02121620_9283.jpg
247 | cat/n02121620_7791.jpg
248 | cat/n02121620_531.jpg
249 | cat/n02121620_20153.jpg
250 | cat/n02121620_13243.jpg
251 | chair/n03001627_3318.jpg
252 | chair/n03001627_6105.jpg
253 | chair/n02738535_904.jpg
254 | chair/n03001627_12243.jpg
255 | chair/n02738535_14114.jpg
256 | chair/n02738535_6285.jpg
257 | chair/n02738535_1312.jpg
258 | chair/n03001627_2245.jpg
259 | chair/n03001627_1776.jpg
260 | chair/n02738535_7205.jpg
261 | chicken/n01791625_8908.jpg
262 | chicken/n01791625_4689.jpg
263 | chicken/n01791625_15751.jpg
264 | chicken/n01791625_16214.jpg
265 | chicken/n01791625_4044.jpg
266 | chicken/n01791625_7354.jpg
267 | chicken/n01791625_1914.jpg
268 | chicken/n01791625_4567.jpg
269 | chicken/n01791625_17466.jpg
270 | chicken/n01791625_1081.jpg
271 | church/n03028079_9810.jpg
272 | church/n03028079_5432.jpg
273 | church/n03028079_2585.jpg
274 | church/n03028079_29412.jpg
275 | church/n03028079_9283.jpg
276 | church/n03028079_2718.jpg
277 | church/n03028079_13935.jpg
278 | church/n03028079_10579.jpg
279 | church/n03028079_11250.jpg
280 | church/n03028079_15095.jpg
281 | couch/n04256520_22692.jpg
282 | couch/n04256520_21394.jpg
283 | couch/n04256520_20048.jpg
284 | couch/n04256520_14377.jpg
285 | couch/n04256520_24708.jpg
286 | couch/n04256520_28871.jpg
287 | couch/n04256520_3251.jpg
288 | couch/n04256520_19834.jpg
289 | couch/n04256520_3986.jpg
290 | couch/n04256520_1299.jpg
291 | cow/n01887787_7620.jpg
292 | cow/n01887787_5589.jpg
293 | cow/n02403454_3113.jpg
294 | cow/n01887787_1.jpg
295 | cow/n02404432_9953.jpg
296 | cow/n01887787_7746.jpg
297 | cow/n01887787_6114.jpg
298 | cow/n01887787_591.jpg
299 | cow/n01887787_6520.jpg
300 | cow/n01887787_5137.jpg
301 | crab/n01980166_9179.jpg
302 | crab/n01978455_795.jpg
303 | crab/n01981276_14823.jpg
304 | crab/n01978455_88.jpg
305 | crab/n01978455_2200.jpg
306 | crab/n01978455_9813.jpg
307 | crab/n01980166_9451.jpg
308 | crab/n01978455_5679.jpg
309 | crab/n01978455_2536.jpg
310 | crab/n01978455_11617.jpg
311 | crocodilian/n01698640_1904.jpg
312 | crocodilian/n01697457_7336.jpg
313 | crocodilian/n01698640_2000.jpg
314 | crocodilian/n01698640_3055.jpg
315 | crocodilian/n01698640_4477.jpg
316 | crocodilian/n01698640_7483.jpg
317 | crocodilian/n01697457_8961.jpg
318 | crocodilian/n01697457_9993.jpg
319 | crocodilian/n01697457_2737.jpg
320 | crocodilian/n01698640_7977.jpg
321 | cup/n03147509_12636.jpg
322 | cup/n03147509_12211.jpg
323 | cup/n03063073_9082.jpg
324 | cup/n03063073_933.jpg
325 | cup/n03147509_21660.jpg
326 | cup/n03063073_3117.jpg
327 | cup/n03147509_13746.jpg
328 | cup/n03147509_20483.jpg
329 | cup/n03063073_3104.jpg
330 | cup/n03147509_20180.jpg
331 | deer/n02430045_2684.jpg
332 | deer/n02431628_9369.jpg
333 | deer/n02432983_12240.jpg
334 | deer/n02433925_15161.jpg
335 | deer/n02431628_3868.jpg
336 | deer/n02432983_8470.jpg
337 | deer/n02433925_22295.jpg
338 | deer/n02432983_7830.jpg
339 | deer/n02433925_12638.jpg
340 | deer/n02430045_5969.jpg
341 | dog/n02106662_13178.jpg
342 | dog/n02106662_18405.jpg
343 | dog/n02103406_7708.jpg
344 | dog/n02103406_5563.jpg
345 | dog/n02106662_15858.jpg
346 | dog/n02106662_2157.jpg
347 | dog/n02103406_2706.jpg
348 | dog/n02106662_8511.jpg
349 | dog/n02103406_2976.jpg
350 | dog/n02109525_13700.jpg
351 | dolphin/n02068974_6910.jpg
352 | dolphin/n02068974_3178.jpg
353 | dolphin/n02068974_613.jpg
354 | dolphin/n02068974_5747.jpg
355 | dolphin/n02068974_4003.jpg
356 | dolphin/n02068974_7259.jpg
357 | dolphin/n02068974_212.jpg
358 | dolphin/n02068974_4210.jpg
359 | dolphin/n02068974_4066.jpg
360 | dolphin/n02072040_156.jpg
361 | door/n03222176_11589.jpg
362 | door/n03222318_13609.jpg
363 | door/n03222176_13242.jpg
364 | door/n03222318_10058.jpg
365 | door/n03222318_10367.jpg
366 | door/n03222176_3296.jpg
367 | door/n03222318_13264.jpg
368 | door/n03222318_7012.jpg
369 | door/n03222176_16166.jpg
370 | door/n03226880_3614.jpg
371 | duck/n01846331_6199.jpg
372 | duck/n01846331_4246.jpg
373 | duck/n01846331_16000.jpg
374 | duck/n01846331_15616.jpg
375 | duck/n01846331_6072.jpg
376 | duck/n01846331_16486.jpg
377 | duck/n01846331_16375.jpg
378 | duck/n01846331_3530.jpg
379 | duck/n01846331_13988.jpg
380 | duck/n01846331_5951.jpg
381 | elephant/n02503517_8222.jpg
382 | elephant/n02503517_5527.jpg
383 | elephant/n02503517_8205.jpg
384 | elephant/n02503517_6880.jpg
385 | elephant/n02503517_3808.jpg
386 | elephant/n02503517_9827.jpg
387 | elephant/n02503517_10193.jpg
388 | elephant/n02503517_3799.jpg
389 | elephant/n02503517_4687.jpg
390 | elephant/n02503517_5117.jpg
391 | eyeglasses/n04272054_17512.jpg
392 | eyeglasses/n04356056_7859.jpg
393 | eyeglasses/n04356056_1773.jpg
394 | eyeglasses/n04356056_1989.jpg
395 | eyeglasses/n04356056_1223.jpg
396 | eyeglasses/n04356056_1937.jpg
397 | eyeglasses/n04272054_6470.jpg
398 | eyeglasses/n04272054_3524.jpg
399 | eyeglasses/n04356056_2972.jpg
400 | eyeglasses/n04272054_17451.jpg
401 | fan/n03271574_2512.jpg
402 | fan/n03271574_14023.jpg
403 | fan/n03271574_304.jpg
404 | fan/n03271574_9446.jpg
405 | fan/n03271574_8678.jpg
406 | fan/n03271574_6313.jpg
407 | fan/n03271574_12669.jpg
408 | fan/n03271574_11720.jpg
409 | fan/n03271574_1965.jpg
410 | fan/n03271574_4669.jpg
411 | fish/n02607072_3344.jpg
412 | fish/n02607072_3296.jpg
413 | fish/n02607072_909.jpg
414 | fish/n02534734_8449.jpg
415 | fish/n02605316_6101.jpg
416 | fish/n02605316_2350.jpg
417 | fish/n02513560_8353.jpg
418 | fish/n02643566_671.jpg
419 | fish/n02606052_1521.jpg
420 | fish/n02514041_5398.jpg
421 | flower/n11669921_22603.jpg
422 | flower/n11669921_22395.jpg
423 | flower/n11939491_3128.jpg
424 | flower/n11939491_19282.jpg
425 | flower/n11939491_8731.jpg
426 | flower/n11939491_55149.jpg
427 | flower/n11669921_21097.jpg
428 | flower/n11669921_23918.jpg
429 | flower/n11939491_44353.jpg
430 | flower/n11939491_35516.jpg
431 | frog/n01639765_19550.jpg
432 | frog/n01639765_14517.jpg
433 | frog/n01639765_20156.jpg
434 | frog/n01639765_7349.jpg
435 | frog/n01641577_13063.jpg
436 | frog/n01641577_8170.jpg
437 | frog/n01641391_19772.jpg
438 | frog/n01641577_218.jpg
439 | frog/n01639765_10652.jpg
440 | frog/n01639765_10465.jpg
441 | geyser/n09288635_11801.jpg
442 | geyser/n09288635_14337.jpg
443 | geyser/n09288635_14555.jpg
444 | geyser/n09288635_25596.jpg
445 | geyser/n09288635_13949.jpg
446 | geyser/n09288635_11393.jpg
447 | geyser/n09288635_11016.jpg
448 | geyser/n09288635_12147.jpg
449 | geyser/n09288635_10966.jpg
450 | geyser/n09288635_29220.jpg
451 | giraffe/n02439033_8875.jpg
452 | giraffe/n02439033_13602.jpg
453 | giraffe/n02439033_11666.jpg
454 | giraffe/n02439033_8846.jpg
455 | giraffe/n02439033_14962.jpg
456 | giraffe/n02439033_13384.jpg
457 | giraffe/n02439033_3944.jpg
458 | giraffe/n02439033_11894.jpg
459 | giraffe/n02439033_221.jpg
460 | giraffe/n02439033_9719.jpg
461 | guitar/n02676566_4605.jpg
462 | guitar/n02676566_8467.jpg
463 | guitar/n03272010_6662.jpg
464 | guitar/n03272010_12151.jpg
465 | guitar/n02676566_9062.jpg
466 | guitar/n02676566_307.jpg
467 | guitar/n03467517_22246.jpg
468 | guitar/n02676566_7385.jpg
469 | guitar/n02676566_6980.jpg
470 | guitar/n02676566_7830.jpg
471 | hamburger/n07697313_13156.jpg
472 | hamburger/n07697313_4367.jpg
473 | hamburger/n07697313_8515.jpg
474 | hamburger/n07697313_2730.jpg
475 | hamburger/n07697313_12432.jpg
476 | hamburger/n07697313_2172.jpg
477 | hamburger/n07697313_9789.jpg
478 | hamburger/n07697313_11264.jpg
479 | hamburger/n07697313_12711.jpg
480 | hamburger/n07697313_14416.jpg
481 | hammer/n03481172_16541.jpg
482 | hammer/n03481172_19147.jpg
483 | hammer/n03481172_4618.jpg
484 | hammer/n03481172_1565.jpg
485 | hammer/n03481172_11589.jpg
486 | hammer/n03481172_12416.jpg
487 | hammer/n03481172_22505.jpg
488 | hammer/n03481172_26915.jpg
489 | hammer/n03481172_15468.jpg
490 | hammer/n03481172_15211.jpg
491 | harp/n03495258_14373.jpg
492 | harp/n03495258_15784.jpg
493 | harp/n03495258_5207.jpg
494 | harp/n03495258_12596.jpg
495 | harp/n03495258_9888.jpg
496 | harp/n03495258_15623.jpg
497 | harp/n03495258_15739.jpg
498 | harp/n03495258_20155.jpg
499 | harp/n03495258_15256.jpg
500 | harp/n03495258_14000.jpg
501 | hat/n02859184_51537.jpg
502 | hat/n02954340_7738.jpg
503 | hat/n03124170_6952.jpg
504 | hat/n02954340_13808.jpg
505 | hat/n02859184_41293.jpg
506 | hat/n04259630_11895.jpg
507 | hat/n02859184_22326.jpg
508 | hat/n03497657_10828.jpg
509 | hat/n02859184_47904.jpg
510 | hat/n03124170_216.jpg
511 | hedgehog/n01872401_10227.jpg
512 | hedgehog/n01872401_12468.jpg
513 | hedgehog/n02346627_785.jpg
514 | hedgehog/n01894207_264.jpg
515 | hedgehog/n02346627_10071.jpg
516 | hedgehog/n01872401_15771.jpg
517 | hedgehog/n02346627_1757.jpg
518 | hedgehog/n02346627_15025.jpg
519 | hedgehog/n01872401_2948.jpg
520 | hedgehog/n01872401_9740.jpg
521 | helicopter/n03512147_1830.jpg
522 | helicopter/n03512147_2013.jpg
523 | helicopter/n03512147_224.jpg
524 | helicopter/n03512147_39770.jpg
525 | helicopter/n03512147_2302.jpg
526 | helicopter/n03512147_2935.jpg
527 | helicopter/n03512147_2669.jpg
528 | helicopter/n03512147_582.jpg
529 | helicopter/n03512147_6451.jpg
530 | helicopter/n03512147_1414.jpg
531 | hermit_crab/n01986214_5645.jpg
532 | hermit_crab/n01986214_10516.jpg
533 | hermit_crab/n01986214_6402.jpg
534 | hermit_crab/n01986214_9627.jpg
535 | hermit_crab/n01986214_4221.jpg
536 | hermit_crab/n01986214_17404.jpg
537 | hermit_crab/n01986214_11695.jpg
538 | hermit_crab/n01986214_15370.jpg
539 | hermit_crab/n01986214_18591.jpg
540 | hermit_crab/n01986214_3659.jpg
541 | horse/n02374451_9225.jpg
542 | horse/n02374451_8269.jpg
543 | horse/n02374451_4818.jpg
544 | horse/n02374451_262.jpg
545 | horse/n02374451_11492.jpg
546 | horse/n02374451_8065.jpg
547 | horse/n02374451_12174.jpg
548 | horse/n02374451_503.jpg
549 | horse/n02374451_11894.jpg
550 | horse/n02374451_17474.jpg
551 | hot-air_balloon/n02782093_5111.jpg
552 | hot-air_balloon/n02782093_1989.jpg
553 | hot-air_balloon/n03541923_1963.jpg
554 | hot-air_balloon/n03541923_3571.jpg
555 | hot-air_balloon/n02782093_5421.jpg
556 | hot-air_balloon/n03541923_7575.jpg
557 | hot-air_balloon/n03541923_3913.jpg
558 | hot-air_balloon/n02782093_4675.jpg
559 | hot-air_balloon/n03541923_1367.jpg
560 | hot-air_balloon/n02782093_622.jpg
561 | hotdog/n07697537_23696.jpg
562 | hotdog/n07697537_32913.jpg
563 | hotdog/n07697537_23589.jpg
564 | hotdog/n07697537_13566.jpg
565 | hotdog/n07697537_1271.jpg
566 | hotdog/n07697537_23978.jpg
567 | hotdog/n07697537_34124.jpg
568 | hotdog/n07697537_14033.jpg
569 | hotdog/n07697537_19413.jpg
570 | hotdog/n07697537_18866.jpg
571 | hourglass/n03544143_8556.jpg
572 | hourglass/n03544143_4358.jpg
573 | hourglass/n03544143_10288.jpg
574 | hourglass/n03544143_577.jpg
575 | hourglass/n03544143_3348.jpg
576 | hourglass/n03544143_4868.jpg
577 | hourglass/n03544143_11550.jpg
578 | hourglass/n03544143_15035.jpg
579 | hourglass/n03544143_17851.jpg
580 | hourglass/n03544143_10028.jpg
581 | jack-o-lantern/n03590841_5679.jpg
582 | jack-o-lantern/n03590841_11444.jpg
583 | jack-o-lantern/n03590841_6382.jpg
584 | jack-o-lantern/n03590841_3947.jpg
585 | jack-o-lantern/n03590841_11344.jpg
586 | jack-o-lantern/n03590841_6963.jpg
587 | jack-o-lantern/n03590841_10464.jpg
588 | jack-o-lantern/n03590841_6798.jpg
589 | jack-o-lantern/n03590841_11395.jpg
590 | jack-o-lantern/n03590841_7100.jpg
591 | jellyfish/n01910747_1552.jpg
592 | jellyfish/n01910747_10023.jpg
593 | jellyfish/n01910747_563.jpg
594 | jellyfish/n01910747_13178.jpg
595 | jellyfish/n01910747_2724.jpg
596 | jellyfish/n01910747_1522.jpg
597 | jellyfish/n01910747_2663.jpg
598 | jellyfish/n01910747_11859.jpg
599 | jellyfish/n01910747_11139.jpg
600 | jellyfish/n01910747_3076.jpg
601 | kangaroo/n01877134_11382.jpg
602 | kangaroo/n01877134_1720.jpg
603 | kangaroo/n01877134_1108.jpg
604 | kangaroo/n01877134_9114.jpg
605 | kangaroo/n01877134_9452.jpg
606 | kangaroo/n01877134_11335.jpg
607 | kangaroo/n01877134_10341.jpg
608 | kangaroo/n01877134_10270.jpg
609 | kangaroo/n01877134_10244.jpg
610 | kangaroo/n01877134_728.jpg
611 | knife/n02976123_8715.jpg
612 | knife/n03624134_13816.jpg
613 | knife/n02976123_3654.jpg
614 | knife/n03624134_17319.jpg
615 | knife/n02976123_4150.jpg
616 | knife/n03624134_15272.jpg
617 | knife/n02976123_1752.jpg
618 | knife/n03624134_12494.jpg
619 | knife/n03624134_15440.jpg
620 | knife/n03624134_12920.jpg
621 | lion/n02129165_13679.jpg
622 | lion/n02129165_325.jpg
623 | lion/n02129165_5082.jpg
624 | lion/n02129165_20590.jpg
625 | lion/n02129165_5079.jpg
626 | lion/n02129165_4588.jpg
627 | lion/n02129165_4058.jpg
628 | lion/n02129165_12630.jpg
629 | lion/n02129165_1901.jpg
630 | lion/n02129165_2496.jpg
631 | lizard/n01674464_1831.jpg
632 | lizard/n01693334_10211.jpg
633 | lizard/n01693334_9741.jpg
634 | lizard/n01674464_1561.jpg
635 | lizard/n01693334_1360.jpg
636 | lizard/n01674464_11126.jpg
637 | lizard/n01674464_2019.jpg
638 | lizard/n01674464_10844.jpg
639 | lizard/n01674464_7290.jpg
640 | lizard/n01683558_5963.jpg
641 | lobster/n01985128_9104.jpg
642 | lobster/n01985128_32662.jpg
643 | lobster/n01985128_20238.jpg
644 | lobster/n01985128_9799.jpg
645 | lobster/n01985128_749.jpg
646 | lobster/n01985128_5607.jpg
647 | lobster/n01985128_27838.jpg
648 | lobster/n01985128_12648.jpg
649 | lobster/n01985128_29355.jpg
650 | lobster/n01985128_52414.jpg
651 | motorcycle/n03790512_12830.jpg
652 | motorcycle/n03790512_1734.jpg
653 | motorcycle/n03790512_2611.jpg
654 | motorcycle/n03790512_345.jpg
655 | motorcycle/n03790512_12653.jpg
656 | motorcycle/n03790512_400.jpg
657 | motorcycle/n03790512_393.jpg
658 | motorcycle/n03790512_10417.jpg
659 | motorcycle/n03790512_13786.jpg
660 | motorcycle/n03790512_13819.jpg
661 | mouse/n02330245_822.jpg
662 | mouse/n02330245_9247.jpg
663 | mouse/n02330245_137.jpg
664 | mouse/n02330245_3252.jpg
665 | mouse/n02330245_15445.jpg
666 | mouse/n02330245_15553.jpg
667 | mouse/n02330245_11922.jpg
668 | mouse/n02330245_12266.jpg
669 | mouse/n02330245_15069.jpg
670 | mouse/n02330245_3938.jpg
671 | mushroom/n12998815_9340.jpg
672 | mushroom/n12998815_5759.jpg
673 | mushroom/n12997919_7227.jpg
674 | mushroom/n12998815_12868.jpg
675 | mushroom/n12998815_6887.jpg
676 | mushroom/n12997919_3128.jpg
677 | mushroom/n12997919_2931.jpg
678 | mushroom/n12997919_7536.jpg
679 | mushroom/n12998815_22994.jpg
680 | mushroom/n12997919_3413.jpg
681 | owl/n01621127_2580.jpg
682 | owl/n01621127_13869.jpg
683 | owl/n01621127_2379.jpg
684 | owl/n01621127_2756.jpg
685 | owl/n01621127_14495.jpg
686 | owl/n01621127_208.jpg
687 | owl/n01621127_18305.jpg
688 | owl/n01621127_2733.jpg
689 | owl/n01621127_12894.jpg
690 | owl/n01621127_2025.jpg
691 | parrot/n01816887_4698.jpg
692 | parrot/n01817953_3417.jpg
693 | parrot/n01816887_224.jpg
694 | parrot/n01816887_8397.jpg
695 | parrot/n01817953_9201.jpg
696 | parrot/n01816887_1320.jpg
697 | parrot/n01816887_9494.jpg
698 | parrot/n01816887_6792.jpg
699 | parrot/n01820546_6753.jpg
700 | parrot/n01816887_9978.jpg
701 | pear/n12651611_7758.jpg
702 | pear/n12651611_10828.jpg
703 | pear/n12651611_7402.jpg
704 | pear/n07768230_3086.jpg
705 | pear/n07768230_13745.jpg
706 | pear/n07767847_4987.jpg
707 | pear/n12651611_6870.jpg
708 | pear/n07767847_5698.jpg
709 | pear/n07767847_10507.jpg
710 | pear/n12651611_3083.jpg
711 | penguin/n02055803_3489.jpg
712 | penguin/n02056570_8253.jpg
713 | penguin/n02055803_7917.jpg
714 | penguin/n02055803_1502.jpg
715 | penguin/n02055803_2199.jpg
716 | penguin/n02056570_8709.jpg
717 | penguin/n02055803_422.jpg
718 | penguin/n02055803_3848.jpg
719 | penguin/n02056570_9810.jpg
720 | penguin/n02055803_6994.jpg
721 | piano/n03452741_8023.jpg
722 | piano/n03452741_6003.jpg
723 | piano/n03452741_6174.jpg
724 | piano/n03452741_4470.jpg
725 | piano/n03452741_10838.jpg
726 | piano/n03452741_6507.jpg
727 | piano/n03452741_10036.jpg
728 | piano/n03452741_16897.jpg
729 | piano/n03452741_15379.jpg
730 | piano/n03452741_8763.jpg
731 | pickup_truck/n03930630_5135.jpg
732 | pickup_truck/n03930630_9324.jpg
733 | pickup_truck/n03930630_6704.jpg
734 | pickup_truck/n03930630_10956.jpg
735 | pickup_truck/n03930630_7422.jpg
736 | pickup_truck/n03930630_5801.jpg
737 | pickup_truck/n03930630_7937.jpg
738 | pickup_truck/n03930630_661.jpg
739 | pickup_truck/n03930630_19292.jpg
740 | pickup_truck/n03930630_11741.jpg
741 | pig/n02395406_33040.jpg
742 | pig/n02395406_5730.jpg
743 | pig/n02395406_7226.jpg
744 | pig/n02395406_11157.jpg
745 | pig/n02395406_33109.jpg
746 | pig/n02395406_2776.jpg
747 | pig/n02395406_27034.jpg
748 | pig/n02395406_32520.jpg
749 | pig/n02395406_2303.jpg
750 | pig/n02395406_8445.jpg
751 | pineapple/n07753275_20080.jpg
752 | pineapple/n07753275_21346.jpg
753 | pineapple/n07753275_28461.jpg
754 | pineapple/n07753275_23767.jpg
755 | pineapple/n07753275_27111.jpg
756 | pineapple/n07753275_17554.jpg
757 | pineapple/n07753275_26165.jpg
758 | pineapple/n07753275_15275.jpg
759 | pineapple/n07753275_26035.jpg
760 | pineapple/n07753275_29605.jpg
761 | pistol/n03948459_12984.jpg
762 | pistol/n03948459_18932.jpg
763 | pistol/n04086273_3414.jpg
764 | pistol/n03948459_8078.jpg
765 | pistol/n03948459_13616.jpg
766 | pistol/n03948459_10942.jpg
767 | pistol/n04086273_5765.jpg
768 | pistol/n03948459_20127.jpg
769 | pistol/n03948459_17330.jpg
770 | pistol/n04086273_23984.jpg
771 | pizza/n07873807_12811.jpg
772 | pizza/n07873807_8370.jpg
773 | pizza/n07873807_1766.jpg
774 | pizza/n07873807_10042.jpg
775 | pizza/n07873807_12794.jpg
776 | pizza/n07873807_9429.jpg
777 | pizza/n07873807_2885.jpg
778 | pizza/n07873807_10932.jpg
779 | pizza/n07873807_3557.jpg
780 | pizza/n07873807_5785.jpg
781 | pretzel/n07695742_8381.jpg
782 | pretzel/n07695742_9706.jpg
783 | pretzel/n07695742_4492.jpg
784 | pretzel/n07695742_6230.jpg
785 | pretzel/n07695742_8444.jpg
786 | pretzel/n07695742_3358.jpg
787 | pretzel/n07695742_6804.jpg
788 | pretzel/n07695742_4716.jpg
789 | pretzel/n07695742_4484.jpg
790 | pretzel/n07695742_1832.jpg
791 | rabbit/n02325366_327.jpg
792 | rabbit/n02325366_6689.jpg
793 | rabbit/n02325366_11692.jpg
794 | rabbit/n02325366_5095.jpg
795 | rabbit/n02325366_6527.jpg
796 | rabbit/n02325366_6132.jpg
797 | rabbit/n02325366_5669.jpg
798 | rabbit/n02325366_6270.jpg
799 | rabbit/n02325366_9233.jpg
800 | rabbit/n02325366_5308.jpg
801 | raccoon/n02508021_10406.jpg
802 | raccoon/n02508021_12493.jpg
803 | raccoon/n02508021_2938.jpg
804 | raccoon/n02508021_1752.jpg
805 | raccoon/n02508021_2458.jpg
806 | raccoon/n02508021_15891.jpg
807 | raccoon/n02508021_4712.jpg
808 | raccoon/n02508021_6078.jpg
809 | raccoon/n02508021_750.jpg
810 | raccoon/n02508021_10788.jpg
811 | racket/n04039381_14533.jpg
812 | racket/n02772700_1761.jpg
813 | racket/n02772700_2240.jpg
814 | racket/n02772700_1967.jpg
815 | racket/n04039381_8327.jpg
816 | racket/n04039381_8917.jpg
817 | racket/n04039381_593.jpg
818 | racket/n04039381_4860.jpg
819 | racket/n04039381_7370.jpg
820 | racket/n04409806_5226.jpg
821 | ray/n01496331_33236.jpg
822 | ray/n01496331_10373.jpg
823 | ray/n01496331_34865.jpg
824 | ray/n01498041_2351.jpg
825 | ray/n01496331_21710.jpg
826 | ray/n01496331_15801.jpg
827 | ray/n01496331_23623.jpg
828 | ray/n01498041_4234.jpg
829 | ray/n01498041_197.jpg
830 | ray/n01496331_13467.jpg
831 | rhinoceros/n02391994_3673.jpg
832 | rhinoceros/n02391994_14936.jpg
833 | rhinoceros/n02391994_18653.jpg
834 | rhinoceros/n02391994_16313.jpg
835 | rhinoceros/n02391994_17073.jpg
836 | rhinoceros/n02391994_15908.jpg
837 | rhinoceros/n02391994_3259.jpg
838 | rhinoceros/n02391994_19014.jpg
839 | rhinoceros/n02391994_530.jpg
840 | rhinoceros/n02391994_12492.jpg
841 | rifle/n03416775_11882.jpg
842 | rifle/n03416775_14703.jpg
843 | rifle/n02749479_7185.jpg
844 | rifle/n02749479_407.jpg
845 | rifle/n02907391_21279.jpg
846 | rifle/n04090263_24512.jpg
847 | rifle/n02749479_1437.jpg
848 | rifle/n02749479_7403.jpg
849 | rifle/n03416775_8292.jpg
850 | rifle/n02749479_4107.jpg
851 | rocket/n04099429_20475.jpg
852 | rocket/n04415663_4998.jpg
853 | rocket/n04415663_11259.jpg
854 | rocket/n04415663_4677.jpg
855 | rocket/n03773504_21983.jpg
856 | rocket/n04099429_8939.jpg
857 | rocket/n03773504_22120.jpg
858 | rocket/n04415663_4099.jpg
859 | rocket/n04099429_1534.jpg
860 | rocket/n03773504_6665.jpg
861 | sailboat/n04128499_6235.jpg
862 | sailboat/n04128499_2799.jpg
863 | sailboat/n04128499_5397.jpg
864 | sailboat/n04128499_6797.jpg
865 | sailboat/n04128499_2893.jpg
866 | sailboat/n04128499_6762.jpg
867 | sailboat/n04128499_7174.jpg
868 | sailboat/n04128499_230.jpg
869 | sailboat/n04128499_3865.jpg
870 | sailboat/n04128499_13658.jpg
871 | saw/n04140064_8460.jpg
872 | saw/n04140064_18814.jpg
873 | saw/n02770585_3240.jpg
874 | saw/n04140064_23555.jpg
875 | saw/n04140064_45049.jpg
876 | saw/n04140064_10178.jpg
877 | saw/n04140064_18657.jpg
878 | saw/n04140064_3739.jpg
879 | saw/n04140064_19771.jpg
880 | saw/n02770585_5006.jpg
881 | saxophone/n04141076_41535.jpg
882 | saxophone/n04141076_4140.jpg
883 | saxophone/n04141076_39197.jpg
884 | saxophone/n04141076_35278.jpg
885 | saxophone/n04141076_10007.jpg
886 | saxophone/n04141076_4403.jpg
887 | saxophone/n04141076_39548.jpg
888 | saxophone/n04141076_12329.jpg
889 | saxophone/n04141076_41723.jpg
890 | saxophone/n04141076_42145.jpg
891 | scissors/n04148054_2603.jpg
892 | scissors/n04148054_13312.jpg
893 | scissors/n04148054_15870.jpg
894 | scissors/n04148054_7900.jpg
895 | scissors/n04148054_20645.jpg
896 | scissors/n04148054_1233.jpg
897 | scissors/n04148054_11540.jpg
898 | scissors/n04148054_370.jpg
899 | scissors/n04148054_17514.jpg
900 | scissors/n04148054_8138.jpg
901 | scorpion/n01770393_3955.jpg
902 | scorpion/n01770393_144.jpg
903 | scorpion/n01770393_2975.jpg
904 | scorpion/n01770393_13358.jpg
905 | scorpion/n01770393_6144.jpg
906 | scorpion/n01770393_3338.jpg
907 | scorpion/n01770393_12256.jpg
908 | scorpion/n01770393_10666.jpg
909 | scorpion/n01770393_27241.jpg
910 | scorpion/n01770393_1283.jpg
911 | sea_turtle/n01664065_9290.jpg
912 | sea_turtle/n01664065_15939.jpg
913 | sea_turtle/n01663401_3035.jpg
914 | sea_turtle/n01663401_8444.jpg
915 | sea_turtle/n01663401_11120.jpg
916 | sea_turtle/n01663401_1997.jpg
917 | sea_turtle/n01663401_8261.jpg
918 | sea_turtle/n01664065_2112.jpg
919 | sea_turtle/n01664065_26514.jpg
920 | sea_turtle/n01664065_4095.jpg
921 | seagull/n02041246_8583.jpg
922 | seagull/n02041246_7834.jpg
923 | seagull/n02041246_1005.jpg
924 | seagull/n02041246_18235.jpg
925 | seagull/n02041246_4512.jpg
926 | seagull/n02041246_18641.jpg
927 | seagull/n02041246_5731.jpg
928 | seagull/n02041246_35326.jpg
929 | seagull/n02041246_4572.jpg
930 | seagull/n02041246_16400.jpg
931 | seal/n02076196_17306.jpg
932 | seal/n02077923_95.jpg
933 | seal/n02077923_1297.jpg
934 | seal/n02077923_1231.jpg
935 | seal/n02077923_12945.jpg
936 | seal/n02077923_3713.jpg
937 | seal/n02077923_521.jpg
938 | seal/n02076196_1919.jpg
939 | seal/n02076196_2469.jpg
940 | seal/n02076196_10596.jpg
941 | shark/n01491361_7484.jpg
942 | shark/n01483021_50.jpg
943 | shark/n01494475_2974.jpg
944 | shark/n01482330_3285.jpg
945 | shark/n01494475_8365.jpg
946 | shark/n01494475_714.jpg
947 | shark/n01491361_4444.jpg
948 | shark/n01491361_4716.jpg
949 | shark/n01482330_1753.jpg
950 | shark/n01489920_9229.jpg
951 | sheep/n02412080_2578.jpg
952 | sheep/n02412080_7720.jpg
953 | sheep/n02411705_13543.jpg
954 | sheep/n02411705_14656.jpg
955 | sheep/n02411705_1556.jpg
956 | sheep/n02412080_7753.jpg
957 | sheep/n02411705_6546.jpg
958 | sheep/n02412080_6209.jpg
959 | sheep/n02413131_1409.jpg
960 | sheep/n02412080_7894.jpg
961 | shoe/n03680355_2585.jpg
962 | shoe/n04120489_3237.jpg
963 | shoe/n04120489_18.jpg
964 | shoe/n03680355_4637.jpg
965 | shoe/n04120489_1990.jpg
966 | shoe/n04120489_3518.jpg
967 | shoe/n04120489_3775.jpg
968 | shoe/n04546081_2665.jpg
969 | shoe/n04120489_4013.jpg
970 | shoe/n03680355_887.jpg
971 | skyscraper/n04233124_4190.jpg
972 | skyscraper/n04233124_3400.jpg
973 | skyscraper/n04233124_8777.jpg
974 | skyscraper/n04233124_1041.jpg
975 | skyscraper/n04233124_11521.jpg
976 | skyscraper/n04233124_18366.jpg
977 | skyscraper/n04233124_15373.jpg
978 | skyscraper/n04233124_11849.jpg
979 | skyscraper/n04233124_15445.jpg
980 | skyscraper/n04233124_3550.jpg
981 | snail/n01944390_3725.jpg
982 | snail/n01944390_5807.jpg
983 | snail/n01944390_10342.jpg
984 | snail/n01944390_3028.jpg
985 | snail/n01944390_4191.jpg
986 | snail/n01944390_6382.jpg
987 | snail/n01944390_4196.jpg
988 | snail/n01944390_10864.jpg
989 | snail/n01944390_7676.jpg
990 | snail/n01944390_11743.jpg
991 | snake/n01748264_14024.jpg
992 | snake/n01726692_2433.jpg
993 | snake/n01726692_5470.jpg
994 | snake/n01726692_80.jpg
995 | snake/n01726692_1615.jpg
996 | snake/n01726692_2164.jpg
997 | snake/n01726692_981.jpg
998 | snake/n01726692_9144.jpg
999 | snake/n01729977_5068.jpg
1000 | snake/n01726692_182.jpg
1001 | songbird/n01527347_331.jpg
1002 | songbird/n01531178_11416.jpg
1003 | songbird/n01594787_5190.jpg
1004 | songbird/n01592084_5207.jpg
1005 | songbird/n01592084_760.jpg
1006 | songbird/n01530575_509.jpg
1007 | songbird/n01531178_6171.jpg
1008 | songbird/n01558993_19761.jpg
1009 | songbird/n01558993_2370.jpg
1010 | songbird/n01527347_17690.jpg
1011 | spider/n01772222_954.jpg
1012 | spider/n01772222_3092.jpg
1013 | spider/n01772222_3438.jpg
1014 | spider/n01772222_10676.jpg
1015 | spider/n01772222_6998.jpg
1016 | spider/n01772222_12437.jpg
1017 | spider/n01772222_8205.jpg
1018 | spider/n01772222_1171.jpg
1019 | spider/n01772222_9346.jpg
1020 | spider/n01772222_8692.jpg
1021 | spoon/n04350769_6642.jpg
1022 | spoon/n04350769_483.jpg
1023 | spoon/n03633091_15236.jpg
1024 | spoon/n04350769_5916.jpg
1025 | spoon/n03633091_1273.jpg
1026 | spoon/n04597913_6066.jpg
1027 | spoon/n04284002_20512.jpg
1028 | spoon/n04350769_9981.jpg
1029 | spoon/n03633091_15372.jpg
1030 | spoon/n04350769_1634.jpg
1031 | squirrel/n02355227_13541.jpg
1032 | squirrel/n02355227_13077.jpg
1033 | squirrel/n02355227_15209.jpg
1034 | squirrel/n02355227_15129.jpg
1035 | squirrel/n02355227_12764.jpg
1036 | squirrel/n02355227_9552.jpg
1037 | squirrel/n02355227_14224.jpg
1038 | squirrel/n02355227_451.jpg
1039 | squirrel/n02355227_14716.jpg
1040 | squirrel/n02355227_15878.jpg
1041 | starfish/n02317335_832.jpg
1042 | starfish/n02317335_16368.jpg
1043 | starfish/n02317335_2072.jpg
1044 | starfish/n02317335_572.jpg
1045 | starfish/n02317335_762.jpg
1046 | starfish/n02317335_4568.jpg
1047 | starfish/n02317335_22895.jpg
1048 | starfish/n02317335_19633.jpg
1049 | starfish/n02317335_26045.jpg
1050 | starfish/n02317335_16276.jpg
1051 | strawberry/n07745940_2642.jpg
1052 | strawberry/n07745940_5035.jpg
1053 | strawberry/n07745940_991.jpg
1054 | strawberry/n07745940_2329.jpg
1055 | strawberry/n07745940_2244.jpg
1056 | strawberry/n07745940_20814.jpg
1057 | strawberry/n07745940_12035.jpg
1058 | strawberry/n07745940_3145.jpg
1059 | strawberry/n07745940_5599.jpg
1060 | strawberry/n07745940_20922.jpg
1061 | swan/n01858441_9652.jpg
1062 | swan/n01858441_2937.jpg
1063 | swan/n01858441_1041.jpg
1064 | swan/n01860187_739.jpg
1065 | swan/n01858441_7162.jpg
1066 | swan/n01858441_10732.jpg
1067 | swan/n01858441_2850.jpg
1068 | swan/n01858441_9626.jpg
1069 | swan/n01858845_12958.jpg
1070 | swan/n01860187_950.jpg
1071 | sword/n04373894_29721.jpg
1072 | sword/n04373894_11246.jpg
1073 | sword/n04373894_37359.jpg
1074 | sword/n04147793_13881.jpg
1075 | sword/n04373894_40106.jpg
1076 | sword/n04373894_35504.jpg
1077 | sword/n04373894_36791.jpg
1078 | sword/n04373894_32014.jpg
1079 | sword/n04373894_47693.jpg
1080 | sword/n04147793_6149.jpg
1081 | table/n04379964_24141.jpg
1082 | table/n03201208_32524.jpg
1083 | table/n03201208_25958.jpg
1084 | table/n04379243_1127.jpg
1085 | table/n04379964_2419.jpg
1086 | table/n04379243_23142.jpg
1087 | table/n04379243_24695.jpg
1088 | table/n03201208_37167.jpg
1089 | table/n03201208_2363.jpg
1090 | table/n04379964_30971.jpg
1091 | tank/n04389033_3055.jpg
1092 | tank/n04389033_24620.jpg
1093 | tank/n04389033_25719.jpg
1094 | tank/n04389033_8657.jpg
1095 | tank/n04389033_19025.jpg
1096 | tank/n04389033_23639.jpg
1097 | tank/n04389033_25541.jpg
1098 | tank/n04389033_12442.jpg
1099 | tank/n04389033_14209.jpg
1100 | tank/n04389033_20803.jpg
1101 | teapot/n04398044_18256.jpg
1102 | teapot/n04398044_19613.jpg
1103 | teapot/n04398044_2917.jpg
1104 | teapot/n04398044_5486.jpg
1105 | teapot/n04398044_14381.jpg
1106 | teapot/n04398044_7880.jpg
1107 | teapot/n04398044_25219.jpg
1108 | teapot/n04398044_35589.jpg
1109 | teapot/n04398044_11626.jpg
1110 | teapot/n04398044_31041.jpg
1111 | teddy_bear/n04399382_22297.jpg
1112 | teddy_bear/n04399382_16584.jpg
1113 | teddy_bear/n04399382_1100.jpg
1114 | teddy_bear/n04399382_26765.jpg
1115 | teddy_bear/n04399382_4704.jpg
1116 | teddy_bear/n04399382_34899.jpg
1117 | teddy_bear/n04399382_2073.jpg
1118 | teddy_bear/n04399382_11606.jpg
1119 | teddy_bear/n04399382_33819.jpg
1120 | teddy_bear/n04399382_28912.jpg
1121 | tiger/n02129604_9840.jpg
1122 | tiger/n02129604_15991.jpg
1123 | tiger/n02129604_1431.jpg
1124 | tiger/n02129604_1675.jpg
1125 | tiger/n02129604_7019.jpg
1126 | tiger/n02129604_8618.jpg
1127 | tiger/n02129604_15877.jpg
1128 | tiger/n02129604_10691.jpg
1129 | tiger/n02129604_17243.jpg
1130 | tiger/n02129604_570.jpg
1131 | tree/n12755727_1126.jpg
1132 | tree/n12726670_6812.jpg
1133 | tree/n12713866_6140.jpg
1134 | tree/n12305089_6316.jpg
1135 | tree/n12593994_8214.jpg
1136 | tree/n11646344_1934.jpg
1137 | tree/n12582231_1696.jpg
1138 | tree/n12587803_33319.jpg
1139 | tree/n12713866_9719.jpg
1140 | tree/n12523475_9706.jpg
1141 | trumpet/n03110669_93834.jpg
1142 | trumpet/n03110669_148863.jpg
1143 | trumpet/n03110669_89645.jpg
1144 | trumpet/n03110669_96018.jpg
1145 | trumpet/n03110669_98252.jpg
1146 | trumpet/n03110669_91100.jpg
1147 | trumpet/n03110669_120627.jpg
1148 | trumpet/n03110669_98315.jpg
1149 | trumpet/n03110669_73253.jpg
1150 | trumpet/n03110669_19514.jpg
1151 | turtle/n01669191_7682.jpg
1152 | turtle/n01669191_3313.jpg
1153 | turtle/n01669191_1717.jpg
1154 | turtle/n01669191_5474.jpg
1155 | turtle/n01669191_10039.jpg
1156 | turtle/n01669191_9556.jpg
1157 | turtle/n01669191_9607.jpg
1158 | turtle/n01669191_6969.jpg
1159 | turtle/n01669191_329.jpg
1160 | turtle/n01669191_9750.jpg
1161 | umbrella/n04507155_8835.jpg
1162 | umbrella/n04507155_1165.jpg
1163 | umbrella/n04507155_15141.jpg
1164 | umbrella/n04507155_17309.jpg
1165 | umbrella/n04507155_3113.jpg
1166 | umbrella/n04507155_4852.jpg
1167 | umbrella/n04507155_2158.jpg
1168 | umbrella/n04507155_9727.jpg
1169 | umbrella/n04507155_5583.jpg
1170 | umbrella/n04507155_17155.jpg
1171 | violin/n04536866_1290.jpg
1172 | violin/n02803934_21947.jpg
1173 | violin/n04536866_1926.jpg
1174 | violin/n04536866_2123.jpg
1175 | violin/n04536866_6286.jpg
1176 | violin/n04536866_10581.jpg
1177 | violin/n04536866_9941.jpg
1178 | violin/n04536866_12600.jpg
1179 | violin/n04536866_7704.jpg
1180 | violin/n02803934_1380.jpg
1181 | volcano/n09472597_38264.jpg
1182 | volcano/n09472597_52209.jpg
1183 | volcano/n09472597_13955.jpg
1184 | volcano/n09472597_35779.jpg
1185 | volcano/n09472597_34930.jpg
1186 | volcano/n09472597_446.jpg
1187 | volcano/n09472597_12664.jpg
1188 | volcano/n09472597_16556.jpg
1189 | volcano/n09472597_39459.jpg
1190 | volcano/n09472597_36414.jpg
1191 | wading_bird/n02007558_4041.jpg
1192 | wading_bird/n02007558_4727.jpg
1193 | wading_bird/n02009912_10484.jpg
1194 | wading_bird/n02009912_2366.jpg
1195 | wading_bird/n02009912_13753.jpg
1196 | wading_bird/n02012849_2562.jpg
1197 | wading_bird/n02009912_6953.jpg
1198 | wading_bird/n02009912_13815.jpg
1199 | wading_bird/n02000954_14335.jpg
1200 | wading_bird/n02009912_13292.jpg
1201 | wheelchair/n04576002_8360.jpg
1202 | wheelchair/n04576002_388.jpg
1203 | wheelchair/n04576002_1665.jpg
1204 | wheelchair/n04576002_6833.jpg
1205 | wheelchair/n04576002_602.jpg
1206 | wheelchair/n04576002_6836.jpg
1207 | wheelchair/n04576002_8668.jpg
1208 | wheelchair/n04576002_7851.jpg
1209 | wheelchair/n04576002_7247.jpg
1210 | wheelchair/n04576002_8774.jpg
1211 | windmill/n04587404_14737.jpg
1212 | windmill/n04587404_18560.jpg
1213 | windmill/n04587559_13032.jpg
1214 | windmill/n04587559_11407.jpg
1215 | windmill/n04587559_9281.jpg
1216 | windmill/n04587559_13370.jpg
1217 | windmill/n04587404_14153.jpg
1218 | windmill/n04587559_11131.jpg
1219 | windmill/n04587559_9715.jpg
1220 | windmill/n04587559_12849.jpg
1221 | window/n04589593_11620.jpg
1222 | window/n04587648_16394.jpg
1223 | window/n04589593_12197.jpg
1224 | window/n04587648_4526.jpg
1225 | window/n04589593_13410.jpg
1226 | window/n04587648_811.jpg
1227 | window/n04589593_12080.jpg
1228 | window/n04587648_3571.jpg
1229 | window/n03227184_3521.jpg
1230 | window/n04587648_16618.jpg
1231 | wine_bottle/n04591713_4821.jpg
1232 | wine_bottle/n04591713_1989.jpg
1233 | wine_bottle/n04591713_3955.jpg
1234 | wine_bottle/n04591713_3642.jpg
1235 | wine_bottle/n04591713_4054.jpg
1236 | wine_bottle/n04591713_6022.jpg
1237 | wine_bottle/n04591713_3570.jpg
1238 | wine_bottle/n04591713_727.jpg
1239 | wine_bottle/n04591713_1407.jpg
1240 | wine_bottle/n04591713_110.jpg
1241 | zebra/n02391049_1950.jpg
1242 | zebra/n02391049_9822.jpg
1243 | zebra/n02391049_1551.jpg
1244 | zebra/n02391049_4069.jpg
1245 | zebra/n02391049_10452.jpg
1246 | zebra/n02391049_1526.jpg
1247 | zebra/n02391049_7077.jpg
1248 | zebra/n02391049_7261.jpg
1249 | zebra/n02391049_6467.jpg
1250 | zebra/n02391049_5695.jpg
1251 |
--------------------------------------------------------------------------------
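Each line of the listing above follows a `category/filename.jpg` pattern, one entry per photo. A minimal sketch of parsing such a list into per-category groups; `group_by_category` is a hypothetical helper for illustration, not part of this repo:

```python
from collections import defaultdict

def group_by_category(lines):
    """Group 'category/file.jpg' entries (hypothetical helper) by category."""
    groups = defaultdict(list)
    for line in lines:
        line = line.strip()
        if not line:
            continue  # the list file ends with a blank line
        category, fname = line.split('/', 1)
        groups[category].append(fname)
    return dict(groups)

# a few entries copied from the listing above
sample = [
    'zebra/n02391049_1950.jpg',
    'zebra/n02391049_9822.jpg',
    'violin/n04536866_1290.jpg',
    '',
]
print(group_by_category(sample))
```

The same split drives the retrieval evaluation: an image's ground-truth category is simply the directory component of its path.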
/train.py:
--------------------------------------------------------------------------------
1 | import argparse
2 | import os
3 | from models.TripletEmbedding import TripletNet
4 |
5 |
6 | def str2bool(v):
7 | if v.lower() in ('yes', 'true', 't', 'y', '1'):
8 | return True
9 | elif v.lower() in ('no', 'false', 'f', 'n', '0'):
10 | return False
11 | else:
12 | raise argparse.ArgumentTypeError('Unsupported value encountered.')
13 |
14 |
15 | def parase_args():
16 | parser = argparse.ArgumentParser()
17 |
18 | parser.add_argument('--photo_root', type=str, default='/data1/zzl/dataset/photo-train', help='Training photo root')
19 | parser.add_argument('--sketch_root', type=str, default='/data1/zzl/dataset/sketch-triplet-train',
20 | help='Training sketch root')
21 |     parser.add_argument('--batch_size', type=int, default=16, help='The size of batch (default: 16)')
22 | parser.add_argument('--device', type=str, default='0', help='The cuda device to be used (default: 0)')
23 | parser.add_argument('--epochs', type=int, default=1000, help='The number of epochs to run (default: 1000)')
24 | parser.add_argument('--lr', type=float, default=1e-5, help='The learning rate of the model')
25 |
26 | parser.add_argument('--test', type=str2bool, nargs='?', default=True)
27 | parser.add_argument('--test_f', type=int, default=5, help='The frequency of testing (default: 5)')
28 | parser.add_argument('--photo_test', type=str, default='/data1/zzl/dataset/photo-test', help='Testing photo root')
29 | parser.add_argument('--sketch_test', type=str, default='/data1/zzl/dataset/sketch-triplet-test',
30 | help='Testing sketch root')
31 |
32 | parser.add_argument('--save_model', type=str2bool, nargs='?', default=False)
33 | parser.add_argument('--save_dir', type=str, default='/data1/zzl/model/caffe2torch/vgg_triplet_loss',
34 | help='The folder to save the model status')
35 |
36 | parser.add_argument('--vis', type=str2bool, nargs='?', default=True, help='Whether to visualize')
37 | parser.add_argument('--env', type=str, default='caffe2torch_tripletloss', help='The visualization environment')
38 |
39 | parser.add_argument('--fine_tune', type=str2bool, nargs='?', default=False, help='Whether to fine tune')
40 |     parser.add_argument('--model_root', type=str, default=None, help='The model status file\'s root')
41 |
42 | parser.add_argument('--margin', type=float, default=0.3, help='The margin of the triplet loss')
43 | parser.add_argument('--p', type=int, default=2, help='The p of the triplet loss')
44 |
45 | parser.add_argument('--net', type=str, default='vgg16', help='The model to be used (vgg16, resnet34, resnet50)')
46 | parser.add_argument('--cat', type=str2bool, nargs='?', default=True, help='Whether to use category loss')
47 |
48 | return check_args(parser.parse_args())
49 |
50 |
51 | def check_args(args):
52 |     if args.save_model:
53 |         save_photo_dir = os.path.join(args.save_dir, 'photo')
54 |         save_sketch_dir = os.path.join(args.save_dir, 'sketch')
55 |         # makedirs creates parent directories and tolerates existing ones
56 |         os.makedirs(save_photo_dir, exist_ok=True)
57 |         os.makedirs(save_sketch_dir, exist_ok=True)
58 | 
59 |     try:
60 |         assert args.epochs >= 1
61 |     except AssertionError:
62 |         print('number of epochs must be larger than or equal to one')
63 |         return None
64 | 
65 |     try:
66 |         assert args.batch_size >= 1
67 |     except AssertionError:
68 |         print('batch size must be larger than or equal to one')
69 |         return None
70 | 
71 |     try:
72 |         assert args.net in ['vgg16', 'resnet34', 'resnet50']
73 |     except AssertionError:
74 |         print("net model must be chosen from ['vgg16', 'resnet34', 'resnet50']")
75 |         return None
76 | 
77 |     if args.fine_tune:
78 |         try:
79 |             assert args.model_root is not None
80 |         except AssertionError:
81 |             print('you should specify the model status file')
82 |             return None
83 | 
84 |     return args
85 | 
86 | 
87 | def main():
88 |     args = parase_args()
89 |     if args is None:
90 |         exit()
91 | 
92 |     os.environ["CUDA_VISIBLE_DEVICES"] = str(args.device)
93 | 
94 |     tripletNet = TripletNet(args)
95 |     tripletNet.train()
96 | 
97 | 
98 | if __name__ == '__main__':
99 |     main()
--------------------------------------------------------------------------------
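train.py parses boolean flags through `str2bool` rather than `type=bool`, because `bool('False')` is `True` for any non-empty string. A minimal, self-contained sketch of this behavior, reusing the same function body as above:

```python
import argparse

def str2bool(v):
    # accepts common truthy/falsy spellings, case-insensitively (as in train.py)
    if v.lower() in ('yes', 'true', 't', 'y', '1'):
        return True
    elif v.lower() in ('no', 'false', 'f', 'n', '0'):
        return False
    raise argparse.ArgumentTypeError('Unsupported value encountered.')

parser = argparse.ArgumentParser()
parser.add_argument('--save_model', type=str2bool, nargs='?', default=False)

args = parser.parse_args(['--save_model', 'yes'])
print(args.save_model)  # True
```

With `nargs='?'` the value is optional on the command line; when the flag is omitted entirely, the `default` applies.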
/utils/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/CDOTAD/SketchyDatabase/0d8f5abf98c8a6d9847e16258493a1e56371b54c/utils/__init__.py
--------------------------------------------------------------------------------
/utils/extractor.py:
--------------------------------------------------------------------------------
1 | import os
2 | from PIL import Image
3 | import torch as t
4 | import torchvision as tv
5 | from torch import nn
6 | import pickle
7 | from utils.visualize import Visualizer
8 | from data import ImageDataLoader
9 | import numpy as np
10 |
11 |
12 | class Config(object):
13 |     # empty option holder; attributes are attached dynamically
14 |     pass
15 |
16 |
17 | class Extractor(object):
18 |
19 |     def __init__(self, e_model, batch_size=128, cat_info=True,
20 |                  vis=False, dataloader=None):
21 |         self.batch_size = batch_size
22 |         self.cat_info = cat_info
23 | 
24 |         self.model = e_model
25 | 
26 |         # always set the attribute so extract() can test it safely
27 |         self.dataloader = dataloader
28 |         if not dataloader:
29 |             self.transform = tv.transforms.Compose([
30 |                 tv.transforms.Resize(224),
31 |                 tv.transforms.ToTensor(),
32 |                 tv.transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))
33 |             ])
34 | self.vis = vis
35 | if self.vis:
36 | self.viser = Visualizer('caffe2torch_test')
37 |
38 |     # extract the features of every image under data_root via self.model,
39 |     # dispatching to the dataloader-based or per-image implementation
40 | @t.no_grad()
41 | def extract(self, data_root, out_root=None):
42 | if self.dataloader:
43 | return self._extract_with_dataloader(data_root=data_root, cat_info=self.cat_info, out_root=out_root)
44 | else:
45 | return self._extract_without_dataloader(data_root=data_root, cat_info=self.cat_info, out_root=out_root)
46 |
47 |     # per-image extraction: each file is loaded and transformed individually;
48 |     # when cat_info is set, the model returns (category, feature) and out[1] is kept
49 | @t.no_grad()
50 | def _extract_without_dataloader(self, data_root, cat_info, out_root):
51 | feature = []
52 | name = []
53 |
54 | self.model.eval()
55 |
56 | cnames = sorted(os.listdir(data_root))
57 |
58 | for cname in cnames:
59 | c_path = os.path.join(data_root, cname)
60 | if os.path.isdir(c_path):
61 | fnames = sorted(os.listdir(c_path))
62 | for fname in fnames:
63 | path = os.path.join(c_path, fname)
64 |
65 |                     image = Image.open(path).convert('RGB')  # ensure 3-channel input
66 | image = self.transform(image)
67 | image = image[None]
68 | image = image.cuda()
69 |
70 | if self.vis:
71 | self.viser.images(image.cpu().numpy() * 0.5 + 0.5, win='extractor')
72 | out = self.model(image)
73 | if cat_info:
74 | i_feature = out[1]
75 | else:
76 | i_feature = out
77 |
78 | feature.append(i_feature.cpu().squeeze().numpy())
79 | name.append(cname + '/' + fname)
80 |
81 | data = {'name': name, 'feature': feature}
82 |         if out_root:
83 |             # context manager ensures the output file is closed
84 |             with open(out_root, 'wb') as out:
85 |                 pickle.dump(data, out)
86 | 
87 |
88 | return data
89 |
90 |     # batched extraction: inputs are loaded through ImageDataLoader,
91 |     # so features are accumulated batch by batch;
92 |     # when cat_info is set, the model returns (category, feature) and out[1] is kept
93 | @t.no_grad()
94 | def _extract_with_dataloader(self, data_root, cat_info, out_root):
95 | names = []
96 |
97 | self.model.eval()
98 |
99 | opt = Config()
100 | opt.image_root = data_root
101 |         opt.batch_size = self.batch_size  # honor the configured batch size
102 |
103 | dataloader = ImageDataLoader(opt)
104 | dataset = dataloader.load_data()
105 |
106 | for i, data in enumerate(dataset):
107 | image = data['I'].cuda()
108 | name = data['N']
109 |
110 | out = self.model(image)
111 | if cat_info:
112 | i_feature = out[1]
113 | else:
114 | i_feature = out
115 | if i == 0:
116 | feature = i_feature.cpu().squeeze().numpy()
117 |
118 | else:
119 | feature = np.append(feature, i_feature.cpu().squeeze().numpy(), axis=0)
120 |
121 | names += name
122 |
123 | data = {'name': names, 'feature': feature}
124 |         if out_root:
125 |             # context manager ensures the output file is closed
126 |             with open(out_root, 'wb') as out:
127 |                 pickle.dump(data, out)
128 | 
129 |
130 | return data
131 |
132 | # reload model with model file
133 |     # the reloaded model contains a fully connected layer
134 | def reload_state_dict_with_fc(self, state_file):
135 | temp_model = tv.models.resnet34(pretrained=False)
136 | temp_model.fc = nn.Linear(512, 125)
137 | temp_model.load_state_dict(t.load(state_file))
138 |
139 | pretrained_dict = temp_model.state_dict()
140 |
141 | model_dict = self.model.state_dict()
142 |
143 | pretrained_dict = {k: v for k, v in pretrained_dict.items() if k in model_dict}
144 |
145 | model_dict.update(pretrained_dict)
146 | self.model.load_state_dict(model_dict)
147 |
148 | # reload model with model file
149 |     # the reloaded model doesn't contain a fully connected layer
150 | def reload_state_dic(self, state_file):
151 | self.model.load_state_dict(t.load(state_file))
152 |
153 | # reload model with model object directly
154 | def reload_model(self, model):
155 | self.model = model
156 |
--------------------------------------------------------------------------------
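`reload_state_dict_with_fc` above copies only the pretrained weights whose keys also exist in the target model, so layers the target lacks (or extra heads it has) keep their current parameters. The same filtering pattern, sketched with plain dicts standing in for real state dicts (toy float values, not tensors):

```python
# pretrained checkpoint includes an fc head the target model lacks
pretrained_dict = {'conv1.weight': 1.0, 'bn1.weight': 2.0, 'fc.weight': 3.0}
# target model has a different head ('embed.weight') instead of 'fc.weight'
model_dict = {'conv1.weight': 0.0, 'bn1.weight': 0.0, 'embed.weight': 0.0}

# keep only the keys shared with the target, then merge into its state dict
filtered = {k: v for k, v in pretrained_dict.items() if k in model_dict}
model_dict.update(filtered)

print(model_dict)  # conv1/bn1 come from the checkpoint; embed.weight is untouched
```

In real PyTorch code the merged dict is then passed to `model.load_state_dict(model_dict)`, exactly as the method does.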
/utils/feature_visualizer.py:
--------------------------------------------------------------------------------
1 | import visdom
2 | import pickle
3 | import os
4 | import numpy as np
5 | from sklearn.manifold import TSNE
6 | import matplotlib.pyplot as plt
7 |
8 |
9 | class FeatureVisualizer(object):
10 |
11 | def __init__(self, data_root, feature_root, env):
12 | self.data_root = data_root
13 |
14 | self.feature_root = feature_root
15 | self.feature_data = pickle.load(open(self.feature_root, 'rb'))
16 |
17 | self.vis = visdom.Visdom(env=env)
18 |
19 | self.c_index = dict()
20 | cnames = sorted(os.listdir(data_root))
21 | for i, cname in enumerate(cnames):
22 | self.c_index[cname] = i
23 |
24 | cname_list, item_num = self._get_feature_info(data_root)
25 | self.cname_list = cname_list
26 | self.item_num = item_num
27 |
28 | # for (cname, num) in zip(cname_list, item_num):
29 | # print(cname, ':', num)
30 |
31 | # visualize the self.data_root
32 | def visualize(self, max_class, win='Feature Vis'):
33 |
34 | calculate_num = 0
35 |
36 | for i in range(max_class):
37 |
38 | calculate_num += self.item_num[i]
39 |
40 | feature = self.feature_data['feature'][:calculate_num]
41 | name = self.feature_data['name'][:calculate_num]
42 |
43 | feature_class = []
44 | for f_name in name:
45 | class_name = f_name.split('/')[0]
46 | feature_class.append(self.c_index[class_name])
47 | # print(np.shape(feature))
48 |
49 |         tsne = TSNE(n_components=2, init='pca', random_state=0)
50 |         X_tsne = tsne.fit_transform(feature)
51 |
52 |         # print(np.shape(X_tsne))
53 |
54 |         # print(set(feature_class))
55 |         plt.figure()
56 |         plt.scatter(X_tsne[:, 0], X_tsne[:, 1], c=np.array(feature_class), marker='.', cmap=plt.cm.Spectral)
57 | plt.colorbar()
58 | plt.grid(True)
59 | plt.xlabel('1st')
60 | plt.ylabel('2nd')
61 | plt.title('Feature Vis')
62 |
63 | self.vis.matplot(plt, win=win, opts=dict(title=win))
64 |
65 |     # visualize two feature sets (e.g. photo and sketch embeddings) in one shared t-SNE plot
66 | def embedding_vis(self, em_data_root, em_feature_root, max_class, min_class=0, win='embedding vis'):
67 |
68 | cal_a = 0
69 | cal_b = 0
70 |
71 | em_cname_list, em_item_num = self._get_feature_info(em_data_root)
72 |
73 | em_feature_data = pickle.load(open(em_feature_root, 'rb'))
74 |
75 | start_a = 0
76 | start_b = 0
77 | for i in range(min_class):
78 | start_a += self.item_num[i]
79 | start_b += em_item_num[i]
80 |
81 | for i in range(min_class, max_class):
82 |
83 | cal_a += self.item_num[i]
84 | cal_b += em_item_num[i]
85 |
86 | a_feature = self.feature_data['feature'][start_a:cal_a+start_a]
87 | a_name = self.feature_data['name'][start_a:cal_a+start_a]
88 | print('np.shape(a_feature)', np.shape(a_feature))
89 |
90 | b_feature = em_feature_data['feature'][start_b:cal_b+start_b]
91 | b_name = em_feature_data['name'][start_b:cal_b+start_b]
92 | print('np.shape(b_feature)', np.shape(b_feature))
93 |
94 | a_class = []
95 | for name in a_name:
96 | c_name = name.split('/')[0]
97 | a_class.append(self.c_index[c_name])
98 |
99 | b_class = []
100 | for name in b_name:
101 | c_name = name.split('/')[0]
102 | b_class.append(self.c_index[c_name])
103 |
104 | tsne = TSNE(n_components=2, init='pca', random_state=0)
105 | feature = np.append(a_feature, b_feature, axis=0)
106 |
107 | print('np.shape(feature) :', np.shape(feature))
108 |
109 |         X_tsne = tsne.fit_transform(feature)
110 |
111 |         X_a = X_tsne[:cal_a]
112 |         X_b = X_tsne[cal_a:]
113 |
114 | print('np.shape(X_b) :', np.shape(X_b))
115 | print('np.shape(b_class) :', np.shape(b_class))
116 | print('a_class :', a_class)
117 | print('b_class :', b_class)
118 |
119 | plt.figure()
120 | plt.scatter(X_a[:, 0], X_a[:, 1], c=np.array(a_class), marker='.', cmap=plt.cm.Spectral)
121 | plt.scatter(X_b[:, 0], X_b[:, 1], c=np.array(b_class), marker='s', cmap=plt.cm.Spectral)
122 |
123 | plt.colorbar()
124 | plt.grid(True)
125 | plt.xlabel('1st')
126 | plt.ylabel('2nd')
127 |
128 | plt.title('Embedding Vis')
129 | self.vis.matplot(plt, win=win, opts=dict(title=win))
130 |
131 | def _get_feature_info(self, data_root):
132 | item_num = []
133 | cname_list = []
134 |
135 | cnames = sorted(os.listdir(data_root))
136 | for cname in cnames:
137 | num = len(os.listdir(os.path.join(data_root, cname)))
138 |
139 | item_num.append(num)
140 | cname_list.append(cname)
141 |
142 | return cname_list, item_num
143 |
144 |
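Both `visualize` and `embedding_vis` slice the flat feature matrix by summing per-class item counts (classes sorted by name, as built in `_get_feature_info`). A minimal sketch of that boundary arithmetic, with hypothetical counts:

```python
# Sketch of the slice-boundary arithmetic used by visualize()/embedding_vis():
# item_num[i] is the file count of the i-th (name-sorted) class, so the
# features of classes [min_class, max_class) occupy one contiguous row range.

def class_slice(item_num, min_class, max_class):
    """Return (start, end) row indices covering classes min_class..max_class-1."""
    start = sum(item_num[:min_class])
    end = start + sum(item_num[min_class:max_class])
    return start, end

item_num = [3, 5, 2]                 # hypothetical per-class file counts
print(class_slice(item_num, 0, 2))   # (0, 8)  -> rows of the first two classes
print(class_slice(item_num, 1, 3))   # (3, 10) -> rows of classes 1 and 2
```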
--------------------------------------------------------------------------------
/utils/test.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | from sklearn.neighbors import NearestNeighbors
3 | from utils.extractor import Extractor
4 | from utils.visualize import Visualizer
5 | import torch as t
6 |
7 |
8 | class Tester(object):
9 |
10 | def __init__(self, opt):
11 |
12 | # self.vis = opt.vis
13 | self.test_bs = opt.test_bs
14 |
15 | self.photo_net = opt.photo_net
16 | self.sketch_net = opt.sketch_net
17 |
18 | self.photo_test = opt.photo_test
19 | self.sketch_test = opt.sketch_test
20 |
21 | self.eps = 1e-8
22 |
23 | @t.no_grad()
24 | def _extract_feature(self):
25 | with t.no_grad():
26 | self.photo_net.eval()
27 | self.sketch_net.eval()
28 |
29 | extractor = Extractor(e_model=self.photo_net, vis=False, dataloader=True)
30 | photo_data = extractor.extract(self.photo_test)
31 |
32 | extractor.reload_model(self.sketch_net)
33 | sketch_data = extractor.extract(self.sketch_test)
34 |
35 | photo_name = photo_data['name']
36 | photo_feature = photo_data['feature']
37 |
38 | sketch_name = sketch_data['name']
39 | sketch_feature = sketch_data['feature']
40 |
41 | return photo_name, photo_feature, sketch_name, sketch_feature
42 |
43 | @t.no_grad()
44 | def _extract_feature_embedding(self):
45 | with t.no_grad():
46 | self.photo_net.eval()
47 | self.sketch_net.eval()
48 |
49 | extractor = Extractor(e_model=self.photo_net, cat_info=False, vis=False, dataloader=True)
50 | photo_data = extractor.extract(self.photo_test, batch_size=self.test_bs)
51 |
52 | extractor.reload_model(self.sketch_net)
53 | sketch_data = extractor.extract(self.sketch_test, batch_size=self.test_bs)
54 |
55 | photo_name = photo_data['name']
56 | photo_feature = photo_data['feature']
57 |
58 | sketch_name = sketch_data['name']
59 | sketch_feature = sketch_data['feature']
60 |
61 | return photo_name, photo_feature, sketch_name, sketch_feature
62 |
63 | @t.no_grad()
64 | def test_category_recall(self):
65 |
66 | photo_name, photo_feature, sketch_name, sketch_feature = self._extract_feature()
67 |
68 | nbrs = NearestNeighbors(n_neighbors=np.size(photo_feature, 0),
69 | algorithm='brute', metric='euclidean').fit(photo_feature)
70 |
71 | count_1 = 0
72 | count_5 = 0
73 | K = 5
74 | for ii, (query_sketch, query_name) in enumerate(zip(sketch_feature, sketch_name)):
75 | query_sketch = np.reshape(query_sketch, [1, np.shape(query_sketch)[0]])
76 |
77 | query_split = query_name.split('/')
78 | query_class = query_split[0]
79 |
80 | distance, indices = nbrs.kneighbors(query_sketch)
81 |
82 | for i, indice in enumerate(indices[0][:K]):
83 |
84 | retrieved_name = photo_name[indice]
85 | retrieved_class = retrieved_name.split('/')[0]
86 |
87 | if retrieved_class == query_class:
88 |
89 | if i == 0:
90 | count_1 += 1
91 | count_5 += 1
92 | break
93 |
94 | recall_1 = count_1 / (ii + 1)
95 | recall_5 = count_5 / (ii + 1)
96 |
97 | print('recall@1 :', recall_1, ' recall@5 :', recall_5)
98 | return {'recall@1': recall_1, 'recall@5': recall_5}
99 |
100 | @t.no_grad()
101 | def test_instance_recall(self):
102 | photo_name, photo_feature, sketch_name, sketch_feature = self._extract_feature()
103 |
104 | nbrs = NearestNeighbors(n_neighbors=np.size(photo_feature, 0),
105 | algorithm='brute', metric='euclidean').fit(photo_feature)
106 |
107 | count_1 = 0
108 | count_5 = 0
109 | K = 5
110 | for ii, (query_sketch, query_name) in enumerate(zip(sketch_feature, sketch_name)):
111 | query_sketch = np.reshape(query_sketch, [1, np.shape(query_sketch)[0]])
112 |
113 | query_split = query_name.split('/')
114 | query_class = query_split[0]
115 | query_img = query_split[1]
116 |
117 | distance, indices = nbrs.kneighbors(query_sketch)
118 |
119 | for i, indice in enumerate(indices[0][:K]):
120 |
121 | retrieved_name = photo_name[indice]
122 | retrieved_class = retrieved_name.split('/')[0]
123 |
124 | retrieved_name = retrieved_name.split('/')[1]
125 | retrieved_name = retrieved_name.split('.')[0]
126 | if retrieved_class == query_class:
127 | if query_img.split('-')[0] == retrieved_name:
128 | if i == 0:
129 | count_1 += 1
130 | count_5 += 1
131 | break
132 |
133 | recall_1 = count_1 / (ii + 1)
134 | recall_5 = count_5 / (ii + 1)
135 |
136 | print('recall@1 :', recall_1, ' recall@5 :', recall_5)
137 | return {'recall@1': recall_1, 'recall@5': recall_5}
138 |
139 |
140 |
141 |
142 |
143 |
144 |
145 |
146 |
147 |
148 |
149 |
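The recall@K logic in `Tester` above can be reduced to a self-contained sketch: rank the photo gallery by Euclidean distance to each sketch query and count a hit if any of the top K share the query's class. This version uses brute-force distances instead of sklearn's `NearestNeighbors`; the vectors and labels are hypothetical toy data.

```python
# Minimal, self-contained version of the category-recall@K computation in
# Tester.test_category_recall, using brute-force Euclidean distance.
import math

def recall_at_k(queries, query_labels, gallery, gallery_labels, k):
    """Fraction of queries whose top-k retrieved items contain the true class."""
    hits = 0
    for q, q_lab in zip(queries, query_labels):
        dists = [math.dist(q, g) for g in gallery]
        ranked = sorted(range(len(gallery)), key=lambda i: dists[i])[:k]
        if any(gallery_labels[i] == q_lab for i in ranked):
            hits += 1
    return hits / len(queries)

gallery = [(0.0, 0.0), (1.0, 0.0), (5.0, 5.0)]   # photo features (toy)
gallery_labels = ['cat', 'cat', 'dog']
queries = [(0.1, 0.1), (4.9, 5.1)]               # sketch features (toy)
query_labels = ['cat', 'dog']

print(recall_at_k(queries, query_labels, gallery, gallery_labels, k=1))  # 1.0
```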
--------------------------------------------------------------------------------
/utils/visualize.py:
--------------------------------------------------------------------------------
1 | # coding:utf8
2 | from itertools import chain
3 | import visdom
4 | import torch
5 | import time
6 | import torchvision as tv
7 | import numpy as np
8 |
9 |
10 | class Visualizer():
11 | """
12 |     Wraps the basic visdom operations; you can still call the native
13 |     visdom interface via `self.vis.function`.
14 | """
15 |
16 | def __init__(self, env='default', **kwargs):
17 | import visdom
18 |         self.vis = visdom.Visdom(env=env, use_incoming_socket=False, **kwargs)
19 |
20 |         # x coordinate of the next point for each named curve;
21 |         # e.g. ('loss', 23) means the 23rd point of the loss curve
22 | self.index = {}
23 | self.log_text = ''
24 |
25 | def reinit(self, env='default', **kwargs):
26 |         """
27 |         Reconfigure visdom (e.g. switch to a different env).
28 |         """
29 |         self.vis = visdom.Visdom(env=env, use_incoming_socket=False, **kwargs)
30 | return self
31 |
32 | def plot_many(self, d):
33 |         """
34 |         Plot several curves at once.
35 |         @params d: dict of (name, value) pairs, e.g. ('loss', 0.11)
36 |         """
37 | for k, v in d.items():
38 | self.plot(k, v)
39 |
40 | def img_many(self, d):
41 | for k, v in d.items():
42 | self.img(k, v)
43 |
44 | def plot(self, name, y, legend=None):
45 | """
46 | self.plot('loss',1.00)
47 | """
48 | x = self.index.get(name, 0)
49 | self.vis.line(Y=np.array([y]), X=np.array([x]),
50 | win=(name),
51 | opts=dict(title=name) if legend is None else dict(title=name, legend=legend),
52 | update=None if x == 0 else 'append'
53 | )
54 | self.index[name] = x + 1
55 |
56 | def img(self, name, img_):
57 | """
58 | self.img('input_img',t.Tensor(64,64))
59 | """
60 |
61 | if len(img_.size()) < 3:
62 | img_ = img_.cpu().unsqueeze(0)
63 | self.vis.image(img_.cpu(),
64 | win=(name),
65 | opts=dict(title=name)
66 | )
67 |
68 | def img_grid_many(self, d):
69 | for k, v in d.items():
70 | self.img_grid(k, v)
71 |
72 | def img_grid(self, name, input_3d):
73 |         """
74 |         Turn a batch of images into one grid image, e.g. an input of
75 |         shape (36, 64, 64) becomes a 6x6 grid of 64x64 cells.
76 |         """
77 | self.img(name, tv.utils.make_grid(
78 | input_3d.cpu()[0].unsqueeze(1).clamp(max=1, min=0)))
79 |
80 | def log(self, info, win='log_text'):
81 | """
82 | self.log({'loss':1,'lr':0.0001})
83 | """
84 |
85 |         self.log_text += ('[{time}] {info} <br>'.format(
86 |             time=time.strftime('%m%d_%H%M%S'),
87 |             info=info))
88 | self.vis.text(self.log_text, win=win)
89 |
90 | def __getattr__(self, name):
91 | return getattr(self.vis, name)
92 |
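The per-curve x-coordinate bookkeeping in `Visualizer.plot` above — each named curve keeps its own counter, the first call creates the window and later calls append — can be isolated into a small sketch with the visdom call stubbed out:

```python
# Sketch of the x-index bookkeeping used by Visualizer.plot: the dict maps
# a curve name to the x coordinate of its next point, and the update mode
# mirrors visdom's convention (None creates the window, 'append' extends it).

class PlotIndex:
    def __init__(self):
        self.index = {}

    def next_point(self, name):
        """Return (x, update_mode) for the next point of curve `name`."""
        x = self.index.get(name, 0)
        self.index[name] = x + 1
        return x, (None if x == 0 else 'append')

p = PlotIndex()
print(p.next_point('loss'))  # (0, None)      -> first call creates the window
print(p.next_point('loss'))  # (1, 'append')  -> later calls append
print(p.next_point('acc'))   # (0, None)      -> each curve counts independently
```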
--------------------------------------------------------------------------------