├── .gitignore
├── README.md
├── config
│   └── model_config.py
├── data
│   ├── DomainNet
│   │   ├── Painting
│   │   │   └── image_unida_list.txt
│   │   ├── Real
│   │   │   └── image_unida_list.txt
│   │   └── Sketch
│   │       └── image_unida_list.txt
│   ├── Office
│   │   ├── Amazon
│   │   │   └── image_unida_list.txt
│   │   ├── Dslr
│   │   │   └── image_unida_list.txt
│   │   └── Webcam
│   │       └── image_unida_list.txt
│   ├── OfficeHome
│   │   ├── Art
│   │   │   └── image_unida_list.txt
│   │   ├── Clipart
│   │   │   └── image_unida_list.txt
│   │   ├── Product
│   │   │   └── image_unida_list.txt
│   │   └── Realworld
│   │       └── image_unida_list.txt
│   └── VisDA
│       ├── train
│       │   └── image_unida_list.txt
│       └── validation
│           └── image_unida_list.txt
├── dataset
│   └── dataset.py
├── environment.yml
├── figures
│   ├── GLC_framework.png
│   └── SFUNIDA.png
├── model
│   └── SFUniDA.py
├── scripts
│   ├── train_source_OPDA.sh
│   ├── train_source_OSDA.sh
│   ├── train_source_PDA.sh
│   ├── train_target_OPDA.sh
│   ├── train_target_OSDA.sh
│   └── train_target_PDA.sh
├── train_source.py
├── train_target.py
└── utils
    └── net_utils.py
/.gitignore:
--------------------------------------------------------------------------------
1 | # Byte-compiled / optimized / DLL files
2 | __pycache__/
3 | *.py[cod]
4 | *$py.class
5 |
6 | # C extensions
7 | *.so
8 |
9 | # Distribution / packaging
10 | .Python
11 | build/
12 | develop-eggs/
13 | dist/
14 | downloads/
15 | eggs/
16 | .eggs/
17 | lib/
18 | lib64/
19 | parts/
20 | sdist/
21 | var/
22 | wheels/
23 | pip-wheel-metadata/
24 | share/python-wheels/
25 | *.egg-info/
26 | .installed.cfg
27 | *.egg
28 | MANIFEST
29 |
30 | # PyInstaller
31 | # Usually these files are written by a python script from a template
32 | # before PyInstaller builds the exe, so as to inject date/other infos into it.
33 | *.manifest
34 | *.spec
35 |
36 | # Installer logs
37 | pip-log.txt
38 | pip-delete-this-directory.txt
39 |
40 | # Unit test / coverage reports
41 | htmlcov/
42 | .tox/
43 | .nox/
44 | .coverage
45 | .coverage.*
46 | .cache
47 | nosetests.xml
48 | coverage.xml
49 | *.cover
50 | *.py,cover
51 | .hypothesis/
52 | .pytest_cache/
53 |
54 | # Translations
55 | *.mo
56 | *.pot
57 |
58 | # Django stuff:
59 | *.log
60 | local_settings.py
61 | db.sqlite3
62 | db.sqlite3-journal
63 |
64 | # Flask stuff:
65 | instance/
66 | .webassets-cache
67 |
68 | # Scrapy stuff:
69 | .scrapy
70 |
71 | # Sphinx documentation
72 | docs/_build/
73 |
74 | # PyBuilder
75 | target/
76 |
77 | # Jupyter Notebook
78 | .ipynb_checkpoints
79 |
80 | # IPython
81 | profile_default/
82 | ipython_config.py
83 |
84 | # pyenv
85 | .python-version
86 |
87 | # pipenv
88 | # According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
89 | # However, in case of collaboration, if having platform-specific dependencies or dependencies
90 | # having no cross-platform support, pipenv may install dependencies that don't work, or not
91 | # install all needed dependencies.
92 | #Pipfile.lock
93 |
94 | # PEP 582; used by e.g. github.com/David-OConnor/pyflow
95 | __pypackages__/
96 |
97 | # Celery stuff
98 | celerybeat-schedule
99 | celerybeat.pid
100 |
101 | # SageMath parsed files
102 | *.sage.py
103 |
104 | # Environments
105 | .env
106 | .venv
107 | env/
108 | venv/
109 | ENV/
110 | env.bak/
111 | venv.bak/
112 |
113 | # Spyder project settings
114 | .spyderproject
115 | .spyproject
116 |
117 | # Rope project settings
118 | .ropeproject
119 |
120 | # mkdocs documentation
121 | /site
122 |
123 | # mypy
124 | .mypy_cache/
125 | .dmypy.json
126 | dmypy.json
127 |
128 | # Pyre type checker
129 | .pyre/
130 |
131 | # Wandb
132 | wandb
133 |
134 | # VSCODE
135 | .vscode
136 |
137 | # checkpoints
138 | # checkpoints_sfda
139 | *.pth
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 |
2 | # [Upcycling Models under Domain and Category Shift [CVPR-2023]](https://arxiv.org/abs/2303.07110)
3 |
4 | [PWC leaderboard: Universal Domain Adaptation on Office-31](https://paperswithcode.com/sota/universal-domain-adaptation-on-office-31?p=upcycling-models-under-domain-and-category)
5 | [PWC leaderboard: Universal Domain Adaptation on Office-Home](https://paperswithcode.com/sota/universal-domain-adaptation-on-office-home?p=upcycling-models-under-domain-and-category)
6 | [PWC leaderboard: Universal Domain Adaptation on VisDA2017](https://paperswithcode.com/sota/universal-domain-adaptation-on-visda2017?p=upcycling-models-under-domain-and-category)
7 | [PWC leaderboard: Universal Domain Adaptation on DomainNet](https://paperswithcode.com/sota/universal-domain-adaptation-on-domainnet?p=upcycling-models-under-domain-and-category)
8 |
9 | #### 🌟🌟🌟: Our new work on source-free universal domain adaptation has been accepted by CVPR-2024! The paper "LEAD: Learning Decomposition for Source-free Universal Domain Adaptation" is available at https://arxiv.org/abs/2403.03421. The code has been made public at https://github.com/ispc-lab/LEAD.
10 |
11 | #### ✨✨✨: We provide a substantial extension to this paper. "GLC++: Source-Free Universal Domain Adaptation through Global-Local Clustering and Contrastive Affinity Learning" is available at https://arxiv.org/abs/2403.14410. The code has been made public at https://github.com/ispc-lab/GLC-plus.
12 |
13 | ## Introduction
14 | Deep neural networks (DNNs) often perform poorly in the presence of domain shift and category shift. To address this, in this paper we explore Source-free Universal Domain Adaptation (SF-UniDA). SF-UniDA is appealing in that universal model adaptation can be resolved on the basis of a standard pre-trained closed-set model alone, i.e., without source raw data or a dedicated model architecture. To achieve this, we develop a generic global and local clustering technique (GLC). GLC is equipped with an innovative one-vs-all global pseudo-labeling strategy to separate "known" and "unknown" data samples under various category-shift scenarios. Remarkably, in the most challenging open-partial-set DA scenario, GLC outperforms UMAD by 14.8% on the VisDA benchmark.
15 |
16 | ![SF-UniDA setting](./figures/SFUNIDA.png)
17 |
18 | ## Framework
19 | ![GLC framework](./figures/GLC_framework.png)
20 |
21 | ## Prerequisites
22 | - python3, pytorch, numpy, PIL, scipy, sklearn, tqdm, etc.
23 | - We have provided our conda environment file in `./environment.yml`; a quick import check is sketched below.
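
To quickly verify that these packages are available in the activated environment, a small check along the following lines can be run (an illustrative sketch only; the import names are taken from the list above, with pytorch importing as `torch`):

```
import importlib

# Illustrative check that the packages listed above can be imported.
for name in ("torch", "numpy", "PIL", "scipy", "sklearn", "tqdm"):
    importlib.import_module(name)
    print(f"{name}: OK")
```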
24 |
25 | ## Dataset
26 | We have conducted extensive experiments on four datasets with three category-shift scenarios, i.e., Partial-set DA (PDA), Open-set DA (OSDA), and Open-partial DA (OPDA). The class split for each scenario is detailed below. Here, $\mathcal{Y}$, $\mathcal{\bar{Y}_s}$, and $\mathcal{\bar{Y}_t}$ denote the source-target-shared classes, the source-private classes, and the target-private classes, respectively; each table cell gives $\mathcal{Y}/\mathcal{\bar{Y}_s}/\mathcal{\bar{Y}_t}$.
27 |
28 | | Datasets | OPDA | OSDA | PDA |
29 | | ----------- | -------- | -------- | -------- |
30 | | Office-31 | 10/10/11 | 10/0/11 | 10/21/0 |
31 | | Office-Home | 10/5/50 | 25/0/40 | 25/40/0 |
32 | | VisDA-C | 6/3/3 | 6/0/6 | 6/6/0 |
33 | | DomainNet | 150/50/145 | - | - |
34 |
35 |
36 | Please manually download these datasets from their official websites and unzip them into the `./data` folder. To ease your implementation, we have provided an `image_unida_list.txt` for each dataset subdomain; a minimal parsing sketch follows the layout below.
37 |
38 | ```
39 | ./data
40 | ├── Office
41 | │   ├── Amazon
42 | │   │   ├── ...
43 | │   │   └── image_unida_list.txt
44 | │   ├── Dslr
45 | │   │   ├── ...
46 | │   │   └── image_unida_list.txt
47 | │   └── Webcam
48 | │       ├── ...
49 | │       └── image_unida_list.txt
50 | ├── OfficeHome
51 | │   └── ...
52 | └── VisDA
53 |     └── ...
54 | ```
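
Each `image_unida_list.txt` stores one sample per line in the form `<relative image path> <integer label>`. The snippet below is only an illustration of that format; the repository's actual loader lives in `dataset/dataset.py` (not reproduced here), and `read_unida_list` is a hypothetical helper name:

```
from pathlib import Path

def read_unida_list(list_file, image_root):
    # Each non-empty line is "<relative image path> <integer label>".
    samples = []
    for line in Path(list_file).read_text().splitlines():
        if line.strip():
            rel_path, label = line.rsplit(" ", 1)
            samples.append((Path(image_root) / rel_path, int(label)))
    return samples

samples = read_unida_list("./data/Office/Dslr/image_unida_list.txt", "./data/Office/Dslr")
print(len(samples), samples[0])
```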
55 |
56 | ## Training
57 | 1. Open-partial Domain Adaptation (OPDA) on Office, OfficeHome, and VisDA
58 | ```
59 | # Source Model Preparation
60 | bash ./scripts/train_source_OPDA.sh
61 | # Target Model Adaptation
62 | bash ./scripts/train_target_OPDA.sh
63 | ```
64 | 2. Open-set Domain Adaptation (OSDA) on Office, OfficeHome, and VisDA
65 | ```
66 | # Source Model Preparation
67 | bash ./scripts/train_source_OSDA.sh
68 | # Target Model Adaptation
69 | bash ./scripts/train_target_OSDA.sh
70 | ```
71 | 3. Partial-set Domain Adaptation (PDA) on Office, OfficeHome, and VisDA
72 | ```
73 | # Source Model Preparation
74 | bash ./scripts/train_source_PDA.sh
75 | # Target Model Adaptation
76 | bash ./scripts/train_target_PDA.sh
77 | ```
78 |
79 |
105 |
106 | ## Citation
107 | If you find our codebase helpful, please star our project and cite our papers:
108 | ```
109 | @inproceedings{sanqing2023GLC,
110 | title={Upcycling Models under Domain and Category Shift},
111 | author={Qu, Sanqing and Zou, Tianpei and Röhrbein, Florian and Lu, Cewu and Chen, Guang and Tao, Dacheng and Jiang, Changjun},
112 | booktitle={CVPR},
113 | year={2023},
114 | }
115 |
116 | @inproceedings{sanqing2022BMD,
117 | title={BMD: A general class-balanced multicentric dynamic prototype strategy for source-free domain adaptation},
118 | author={Qu, Sanqing and Chen, Guang and Zhang, Jing and Li, Zhijun and He, Wei and Tao, Dacheng},
119 | booktitle={ECCV},
120 | year={2022}
121 | }
122 | ```
123 |
124 | ## Contact
125 | - sanqingqu@gmail.com or 2011444@tongji.edu.cn
126 |
--------------------------------------------------------------------------------
/config/model_config.py:
--------------------------------------------------------------------------------
1 | import os
2 | import argparse
3 |
4 | def build_args():
5 |
6 |     parser = argparse.ArgumentParser(description="This script is used for Source-free Universal Domain Adaptation")
7 |
8 | parser.add_argument("--dataset", type=str, default="Office")
9 | parser.add_argument("--backbone_arch", type=str, default="resnet50")
10 | parser.add_argument("--embed_feat_dim", type=int, default=256)
11 | parser.add_argument("--s_idx", type=int, default=0)
12 | parser.add_argument("--t_idx", type=int, default=1)
13 |
14 | parser.add_argument("--checkpoint", default=None, type=str)
15 | parser.add_argument("--epochs", default=50, type=int)
16 |
17 | parser.add_argument("--lr", type=float, default=1e-2)
18 | parser.add_argument("--gpu", default='0', type=str)
19 | parser.add_argument("--num_workers", type=int, default=6)
20 | parser.add_argument("--batch_size", type=int, default=64)
21 | parser.add_argument("--weight_decay", type=float, default=1e-3)
22 |
23 | parser.add_argument("--test", action="store_true")
24 | parser.add_argument("--seed", default=2021, type=int)
25 | # we set lam_psd to 0.3 for Office and VisDA, 1.5 for OfficeHome and DomainNet
26 | parser.add_argument("--lam_psd", default=0.3, type=float)
27 | parser.add_argument("--lam_knn", default=1.0, type=float)
28 | parser.add_argument("--local_K", default=4, type=int)
29 | parser.add_argument("--w_0", default=0.55, type=float)
30 | parser.add_argument("--rho", default=0.75, type=float)
31 |
32 | parser.add_argument("--source_train_type", default="smooth", type=str, help="vanilla, smooth")
33 | parser.add_argument("--target_label_type", default="OPDA", type=str)
34 | parser.add_argument("--target_private_class_num", default=None, type=int)
35 | parser.add_argument("--note", default="GLC_CVPR23")
36 |
37 | args = parser.parse_args()
38 |
39 | '''
40 | assume classes across domains are the same.
41 | [0 1 ............................................................................ N - 1]
42 | |---- common classes --||---- source private classes --||---- target private classes --|
43 |
44 | |-------------------------------------------------|
45 | | DATASET PARTITION |
46 | |-------------------------------------------------|
47 | |DATASET | class split(com/sou_pri/tar_pri) |
48 | |-------------------------------------------------|
49 | |DATASET | PDA | OSDA | OPDA/UniDA |
50 | |-------------------------------------------------|
51 | |Office-31 | 10/21/0 | 10/0/11 | 10/10/11 |
52 | |-------------------------------------------------|
53 | |OfficeHome | 25/40/0 | 25/0/40 | 10/5/50 |
54 | |-------------------------------------------------|
55 | |VisDA-C | 6/6/0 | 6/0/6 | 6/3/3 |
56 | |-------------------------------------------------|
57 | |DomainNet | | | 150/50/145 |
58 | |-------------------------------------------------|
59 | '''
60 |
61 | if args.dataset == "Office":
62 | domain_list = ['Amazon', 'Dslr', 'Webcam']
63 | args.source_data_dir = os.path.join("./data/Office", domain_list[args.s_idx])
64 | args.target_data_dir = os.path.join("./data/Office", domain_list[args.t_idx])
65 | args.target_domain_list = [domain_list[idx] for idx in range(3) if idx != args.s_idx]
66 | args.target_domain_dir_list = [os.path.join("./data/Office", item) for item in args.target_domain_list]
67 |
68 | args.shared_class_num = 10
69 |
70 | if args.target_label_type == "PDA":
71 | args.source_private_class_num = 21
72 | args.target_private_class_num = 0
73 |
74 | elif args.target_label_type == "OSDA":
75 | args.source_private_class_num = 0
76 | if args.target_private_class_num is None:
77 | args.target_private_class_num = 11
78 |
79 | elif args.target_label_type == "OPDA":
80 | args.source_private_class_num = 10
81 | if args.target_private_class_num is None:
82 | args.target_private_class_num = 11
83 |
84 | elif args.target_label_type == "CLDA":
85 | args.shared_class_num = 31
86 | args.source_private_class_num = 0
87 | args.target_private_class_num = 0
88 |
89 | else:
90 | raise NotImplementedError("Unknown target label type specified")
91 |
92 | elif args.dataset == "OfficeHome":
93 | domain_list = ['Art', 'Clipart', 'Product', 'Realworld']
94 | args.source_data_dir = os.path.join("./data/OfficeHome", domain_list[args.s_idx])
95 | args.target_data_dir = os.path.join("./data/OfficeHome", domain_list[args.t_idx])
96 | args.target_domain_list = [domain_list[idx] for idx in range(4) if idx != args.s_idx]
97 | args.target_domain_dir_list = [os.path.join("./data/OfficeHome", item) for item in args.target_domain_list]
98 |
99 | if args.target_label_type == "PDA":
100 | args.shared_class_num = 25
101 | args.source_private_class_num = 40
102 | args.target_private_class_num = 0
103 |
104 | elif args.target_label_type == "OSDA":
105 | args.shared_class_num = 25
106 | args.source_private_class_num = 0
107 | if args.target_private_class_num is None:
108 | args.target_private_class_num = 40
109 |
110 | elif args.target_label_type == "OPDA":
111 | args.shared_class_num = 10
112 | args.source_private_class_num = 5
113 | if args.target_private_class_num is None:
114 | args.target_private_class_num = 50
115 |
116 | elif args.target_label_type == "CLDA":
117 | args.shared_class_num = 65
118 | args.source_private_class_num = 0
119 | args.target_private_class_num = 0
120 | else:
121 | raise NotImplementedError("Unknown target label type specified")
122 |
123 | elif args.dataset == "VisDA":
124 | args.source_data_dir = "./data/VisDA/train/"
125 | args.target_data_dir = "./data/VisDA/validation/"
126 |         args.target_domain_list = ["validation"]
127 | args.target_domain_dir_list = [args.target_data_dir]
128 |
129 | args.shared_class_num = 6
130 | if args.target_label_type == "PDA":
131 | args.source_private_class_num = 6
132 | args.target_private_class_num = 0
133 |
134 | elif args.target_label_type == "OSDA":
135 | args.source_private_class_num = 0
136 | args.target_private_class_num = 6
137 |
138 | elif args.target_label_type == "OPDA":
139 | args.source_private_class_num = 3
140 | args.target_private_class_num = 3
141 |
142 | elif args.target_label_type == "CLDA":
143 | args.shared_class_num = 12
144 | args.source_private_class_num = 0
145 | args.target_private_class_num = 0
146 |
147 | else:
148 | raise NotImplementedError("Unknown target label type specified", args.target_label_type)
149 |
150 | elif args.dataset == "DomainNet":
151 | domain_list = ["Painting", "Real", "Sketch"]
152 | args.source_data_dir = os.path.join("./data/DomainNet", domain_list[args.s_idx])
153 | args.target_data_dir = os.path.join("./data/DomainNet", domain_list[args.t_idx])
154 | args.target_domain_list = [domain_list[idx] for idx in range(3) if idx != args.s_idx]
155 | args.target_domain_dir_list = [os.path.join("./data/DomainNet", item) for item in args.target_domain_list]
156 | args.embed_feat_dim = 512 # considering that DomainNet involves more than 256 categories.
157 |
158 | args.shared_class_num = 150
159 | if args.target_label_type == "OPDA":
160 | args.source_private_class_num = 50
161 | args.target_private_class_num = 145
162 | else:
163 | raise NotImplementedError("Unknown target label type specified")
164 |
165 | args.source_class_num = args.shared_class_num + args.source_private_class_num
166 | args.target_class_num = args.shared_class_num + args.target_private_class_num
167 | args.class_num = args.source_class_num
168 |
169 | args.source_class_list = [i for i in range(args.source_class_num)]
170 | args.target_class_list = [i for i in range(args.shared_class_num)]
171 | if args.target_private_class_num > 0:
172 | args.target_class_list.append(args.source_class_num)
173 |
174 | return args
175 |
--------------------------------------------------------------------------------
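
The snippet below is a minimal, illustrative usage sketch of `build_args()` from `config/model_config.py` above; it is not a repository file. It assumes the interpreter is launched from the repository root and adds `./config` to `sys.path` so the module can be imported directly, then prints the class split derived for Office under OPDA:

```
import sys

# Illustrative only: make config/model_config.py importable from the repository root.
sys.path.insert(0, "./config")
sys.argv = ["demo", "--dataset", "Office", "--s_idx", "0", "--t_idx", "1",
            "--target_label_type", "OPDA"]

from model_config import build_args

args = build_args()
print(args.source_class_num)   # 20 = 10 shared + 10 source-private classes
print(args.target_class_num)   # 21 = 10 shared + 11 target-private classes
print(args.target_class_list)  # [0, 1, ..., 9, 20]; 20 is the aggregated "unknown" label
```
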
/data/Office/Dslr/image_unida_list.txt:
--------------------------------------------------------------------------------
1 | back_pack/frame_0001.jpg 0
2 | back_pack/frame_0002.jpg 0
3 | back_pack/frame_0003.jpg 0
4 | back_pack/frame_0004.jpg 0
5 | back_pack/frame_0005.jpg 0
6 | back_pack/frame_0006.jpg 0
7 | back_pack/frame_0007.jpg 0
8 | back_pack/frame_0008.jpg 0
9 | back_pack/frame_0009.jpg 0
10 | back_pack/frame_0010.jpg 0
11 | back_pack/frame_0011.jpg 0
12 | back_pack/frame_0012.jpg 0
13 | bike/frame_0001.jpg 1
14 | bike/frame_0002.jpg 1
15 | bike/frame_0003.jpg 1
16 | bike/frame_0004.jpg 1
17 | bike/frame_0005.jpg 1
18 | bike/frame_0006.jpg 1
19 | bike/frame_0007.jpg 1
20 | bike/frame_0008.jpg 1
21 | bike/frame_0009.jpg 1
22 | bike/frame_0010.jpg 1
23 | bike/frame_0011.jpg 1
24 | bike/frame_0012.jpg 1
25 | bike/frame_0013.jpg 1
26 | bike/frame_0014.jpg 1
27 | bike/frame_0015.jpg 1
28 | bike/frame_0016.jpg 1
29 | bike/frame_0017.jpg 1
30 | bike/frame_0018.jpg 1
31 | bike/frame_0019.jpg 1
32 | bike/frame_0020.jpg 1
33 | bike/frame_0021.jpg 1
34 | calculator/frame_0001.jpg 2
35 | calculator/frame_0002.jpg 2
36 | calculator/frame_0003.jpg 2
37 | calculator/frame_0004.jpg 2
38 | calculator/frame_0005.jpg 2
39 | calculator/frame_0006.jpg 2
40 | calculator/frame_0007.jpg 2
41 | calculator/frame_0008.jpg 2
42 | calculator/frame_0009.jpg 2
43 | calculator/frame_0010.jpg 2
44 | calculator/frame_0011.jpg 2
45 | calculator/frame_0012.jpg 2
46 | headphones/frame_0001.jpg 3
47 | headphones/frame_0002.jpg 3
48 | headphones/frame_0003.jpg 3
49 | headphones/frame_0004.jpg 3
50 | headphones/frame_0005.jpg 3
51 | headphones/frame_0006.jpg 3
52 | headphones/frame_0007.jpg 3
53 | headphones/frame_0008.jpg 3
54 | headphones/frame_0009.jpg 3
55 | headphones/frame_0010.jpg 3
56 | headphones/frame_0011.jpg 3
57 | headphones/frame_0012.jpg 3
58 | headphones/frame_0013.jpg 3
59 | keyboard/frame_0001.jpg 4
60 | keyboard/frame_0002.jpg 4
61 | keyboard/frame_0003.jpg 4
62 | keyboard/frame_0004.jpg 4
63 | keyboard/frame_0005.jpg 4
64 | keyboard/frame_0006.jpg 4
65 | keyboard/frame_0007.jpg 4
66 | keyboard/frame_0008.jpg 4
67 | keyboard/frame_0009.jpg 4
68 | keyboard/frame_0010.jpg 4
69 | laptop_computer/frame_0001.jpg 5
70 | laptop_computer/frame_0002.jpg 5
71 | laptop_computer/frame_0003.jpg 5
72 | laptop_computer/frame_0004.jpg 5
73 | laptop_computer/frame_0005.jpg 5
74 | laptop_computer/frame_0006.jpg 5
75 | laptop_computer/frame_0007.jpg 5
76 | laptop_computer/frame_0008.jpg 5
77 | laptop_computer/frame_0009.jpg 5
78 | laptop_computer/frame_0010.jpg 5
79 | laptop_computer/frame_0011.jpg 5
80 | laptop_computer/frame_0012.jpg 5
81 | laptop_computer/frame_0013.jpg 5
82 | laptop_computer/frame_0014.jpg 5
83 | laptop_computer/frame_0015.jpg 5
84 | laptop_computer/frame_0016.jpg 5
85 | laptop_computer/frame_0017.jpg 5
86 | laptop_computer/frame_0018.jpg 5
87 | laptop_computer/frame_0019.jpg 5
88 | laptop_computer/frame_0020.jpg 5
89 | laptop_computer/frame_0021.jpg 5
90 | laptop_computer/frame_0022.jpg 5
91 | laptop_computer/frame_0023.jpg 5
92 | laptop_computer/frame_0024.jpg 5
93 | monitor/frame_0001.jpg 6
94 | monitor/frame_0002.jpg 6
95 | monitor/frame_0003.jpg 6
96 | monitor/frame_0004.jpg 6
97 | monitor/frame_0005.jpg 6
98 | monitor/frame_0006.jpg 6
99 | monitor/frame_0007.jpg 6
100 | monitor/frame_0008.jpg 6
101 | monitor/frame_0009.jpg 6
102 | monitor/frame_0010.jpg 6
103 | monitor/frame_0011.jpg 6
104 | monitor/frame_0012.jpg 6
105 | monitor/frame_0013.jpg 6
106 | monitor/frame_0014.jpg 6
107 | monitor/frame_0015.jpg 6
108 | monitor/frame_0016.jpg 6
109 | monitor/frame_0017.jpg 6
110 | monitor/frame_0018.jpg 6
111 | monitor/frame_0019.jpg 6
112 | monitor/frame_0020.jpg 6
113 | monitor/frame_0021.jpg 6
114 | monitor/frame_0022.jpg 6
115 | mouse/frame_0001.jpg 7
116 | mouse/frame_0002.jpg 7
117 | mouse/frame_0003.jpg 7
118 | mouse/frame_0004.jpg 7
119 | mouse/frame_0005.jpg 7
120 | mouse/frame_0006.jpg 7
121 | mouse/frame_0007.jpg 7
122 | mouse/frame_0008.jpg 7
123 | mouse/frame_0009.jpg 7
124 | mouse/frame_0010.jpg 7
125 | mouse/frame_0011.jpg 7
126 | mouse/frame_0012.jpg 7
127 | mug/frame_0001.jpg 8
128 | mug/frame_0002.jpg 8
129 | mug/frame_0003.jpg 8
130 | mug/frame_0004.jpg 8
131 | mug/frame_0005.jpg 8
132 | mug/frame_0006.jpg 8
133 | mug/frame_0007.jpg 8
134 | mug/frame_0008.jpg 8
135 | projector/frame_0001.jpg 9
136 | projector/frame_0002.jpg 9
137 | projector/frame_0003.jpg 9
138 | projector/frame_0004.jpg 9
139 | projector/frame_0005.jpg 9
140 | projector/frame_0006.jpg 9
141 | projector/frame_0007.jpg 9
142 | projector/frame_0008.jpg 9
143 | projector/frame_0009.jpg 9
144 | projector/frame_0010.jpg 9
145 | projector/frame_0011.jpg 9
146 | projector/frame_0012.jpg 9
147 | projector/frame_0013.jpg 9
148 | projector/frame_0014.jpg 9
149 | projector/frame_0015.jpg 9
150 | projector/frame_0016.jpg 9
151 | projector/frame_0017.jpg 9
152 | projector/frame_0018.jpg 9
153 | projector/frame_0019.jpg 9
154 | projector/frame_0020.jpg 9
155 | projector/frame_0021.jpg 9
156 | projector/frame_0022.jpg 9
157 | projector/frame_0023.jpg 9
158 | bike_helmet/frame_0001.jpg 10
159 | bike_helmet/frame_0002.jpg 10
160 | bike_helmet/frame_0003.jpg 10
161 | bike_helmet/frame_0004.jpg 10
162 | bike_helmet/frame_0005.jpg 10
163 | bike_helmet/frame_0006.jpg 10
164 | bike_helmet/frame_0007.jpg 10
165 | bike_helmet/frame_0008.jpg 10
166 | bike_helmet/frame_0009.jpg 10
167 | bike_helmet/frame_0010.jpg 10
168 | bike_helmet/frame_0011.jpg 10
169 | bike_helmet/frame_0012.jpg 10
170 | bike_helmet/frame_0013.jpg 10
171 | bike_helmet/frame_0014.jpg 10
172 | bike_helmet/frame_0015.jpg 10
173 | bike_helmet/frame_0016.jpg 10
174 | bike_helmet/frame_0017.jpg 10
175 | bike_helmet/frame_0018.jpg 10
176 | bike_helmet/frame_0019.jpg 10
177 | bike_helmet/frame_0020.jpg 10
178 | bike_helmet/frame_0021.jpg 10
179 | bike_helmet/frame_0022.jpg 10
180 | bike_helmet/frame_0023.jpg 10
181 | bike_helmet/frame_0024.jpg 10
182 | bookcase/frame_0001.jpg 11
183 | bookcase/frame_0002.jpg 11
184 | bookcase/frame_0003.jpg 11
185 | bookcase/frame_0004.jpg 11
186 | bookcase/frame_0005.jpg 11
187 | bookcase/frame_0006.jpg 11
188 | bookcase/frame_0007.jpg 11
189 | bookcase/frame_0008.jpg 11
190 | bookcase/frame_0009.jpg 11
191 | bookcase/frame_0010.jpg 11
192 | bookcase/frame_0011.jpg 11
193 | bookcase/frame_0012.jpg 11
194 | bottle/frame_0001.jpg 12
195 | bottle/frame_0002.jpg 12
196 | bottle/frame_0003.jpg 12
197 | bottle/frame_0004.jpg 12
198 | bottle/frame_0005.jpg 12
199 | bottle/frame_0006.jpg 12
200 | bottle/frame_0007.jpg 12
201 | bottle/frame_0008.jpg 12
202 | bottle/frame_0009.jpg 12
203 | bottle/frame_0010.jpg 12
204 | bottle/frame_0011.jpg 12
205 | bottle/frame_0012.jpg 12
206 | bottle/frame_0013.jpg 12
207 | bottle/frame_0014.jpg 12
208 | bottle/frame_0015.jpg 12
209 | bottle/frame_0016.jpg 12
210 | desk_chair/frame_0001.jpg 13
211 | desk_chair/frame_0002.jpg 13
212 | desk_chair/frame_0003.jpg 13
213 | desk_chair/frame_0004.jpg 13
214 | desk_chair/frame_0005.jpg 13
215 | desk_chair/frame_0006.jpg 13
216 | desk_chair/frame_0007.jpg 13
217 | desk_chair/frame_0008.jpg 13
218 | desk_chair/frame_0009.jpg 13
219 | desk_chair/frame_0010.jpg 13
220 | desk_chair/frame_0011.jpg 13
221 | desk_chair/frame_0012.jpg 13
222 | desk_chair/frame_0013.jpg 13
223 | desk_lamp/frame_0001.jpg 14
224 | desk_lamp/frame_0002.jpg 14
225 | desk_lamp/frame_0003.jpg 14
226 | desk_lamp/frame_0004.jpg 14
227 | desk_lamp/frame_0005.jpg 14
228 | desk_lamp/frame_0006.jpg 14
229 | desk_lamp/frame_0007.jpg 14
230 | desk_lamp/frame_0008.jpg 14
231 | desk_lamp/frame_0009.jpg 14
232 | desk_lamp/frame_0010.jpg 14
233 | desk_lamp/frame_0011.jpg 14
234 | desk_lamp/frame_0012.jpg 14
235 | desk_lamp/frame_0013.jpg 14
236 | desk_lamp/frame_0014.jpg 14
237 | desktop_computer/frame_0001.jpg 15
238 | desktop_computer/frame_0002.jpg 15
239 | desktop_computer/frame_0003.jpg 15
240 | desktop_computer/frame_0004.jpg 15
241 | desktop_computer/frame_0005.jpg 15
242 | desktop_computer/frame_0006.jpg 15
243 | desktop_computer/frame_0007.jpg 15
244 | desktop_computer/frame_0008.jpg 15
245 | desktop_computer/frame_0009.jpg 15
246 | desktop_computer/frame_0010.jpg 15
247 | desktop_computer/frame_0011.jpg 15
248 | desktop_computer/frame_0012.jpg 15
249 | desktop_computer/frame_0013.jpg 15
250 | desktop_computer/frame_0014.jpg 15
251 | desktop_computer/frame_0015.jpg 15
252 | file_cabinet/frame_0001.jpg 16
253 | file_cabinet/frame_0002.jpg 16
254 | file_cabinet/frame_0003.jpg 16
255 | file_cabinet/frame_0004.jpg 16
256 | file_cabinet/frame_0005.jpg 16
257 | file_cabinet/frame_0006.jpg 16
258 | file_cabinet/frame_0007.jpg 16
259 | file_cabinet/frame_0008.jpg 16
260 | file_cabinet/frame_0009.jpg 16
261 | file_cabinet/frame_0010.jpg 16
262 | file_cabinet/frame_0011.jpg 16
263 | file_cabinet/frame_0012.jpg 16
264 | file_cabinet/frame_0013.jpg 16
265 | file_cabinet/frame_0014.jpg 16
266 | file_cabinet/frame_0015.jpg 16
267 | letter_tray/frame_0001.jpg 17
268 | letter_tray/frame_0002.jpg 17
269 | letter_tray/frame_0003.jpg 17
270 | letter_tray/frame_0004.jpg 17
271 | letter_tray/frame_0005.jpg 17
272 | letter_tray/frame_0006.jpg 17
273 | letter_tray/frame_0007.jpg 17
274 | letter_tray/frame_0008.jpg 17
275 | letter_tray/frame_0009.jpg 17
276 | letter_tray/frame_0010.jpg 17
277 | letter_tray/frame_0011.jpg 17
278 | letter_tray/frame_0012.jpg 17
279 | letter_tray/frame_0013.jpg 17
280 | letter_tray/frame_0014.jpg 17
281 | letter_tray/frame_0015.jpg 17
282 | letter_tray/frame_0016.jpg 17
283 | mobile_phone/frame_0001.jpg 18
284 | mobile_phone/frame_0002.jpg 18
285 | mobile_phone/frame_0003.jpg 18
286 | mobile_phone/frame_0004.jpg 18
287 | mobile_phone/frame_0005.jpg 18
288 | mobile_phone/frame_0006.jpg 18
289 | mobile_phone/frame_0007.jpg 18
290 | mobile_phone/frame_0008.jpg 18
291 | mobile_phone/frame_0009.jpg 18
292 | mobile_phone/frame_0010.jpg 18
293 | mobile_phone/frame_0011.jpg 18
294 | mobile_phone/frame_0012.jpg 18
295 | mobile_phone/frame_0013.jpg 18
296 | mobile_phone/frame_0014.jpg 18
297 | mobile_phone/frame_0015.jpg 18
298 | mobile_phone/frame_0016.jpg 18
299 | mobile_phone/frame_0017.jpg 18
300 | mobile_phone/frame_0018.jpg 18
301 | mobile_phone/frame_0019.jpg 18
302 | mobile_phone/frame_0020.jpg 18
303 | mobile_phone/frame_0021.jpg 18
304 | mobile_phone/frame_0022.jpg 18
305 | mobile_phone/frame_0023.jpg 18
306 | mobile_phone/frame_0024.jpg 18
307 | mobile_phone/frame_0025.jpg 18
308 | mobile_phone/frame_0026.jpg 18
309 | mobile_phone/frame_0027.jpg 18
310 | mobile_phone/frame_0028.jpg 18
311 | mobile_phone/frame_0029.jpg 18
312 | mobile_phone/frame_0030.jpg 18
313 | mobile_phone/frame_0031.jpg 18
314 | paper_notebook/frame_0001.jpg 19
315 | paper_notebook/frame_0002.jpg 19
316 | paper_notebook/frame_0003.jpg 19
317 | paper_notebook/frame_0004.jpg 19
318 | paper_notebook/frame_0005.jpg 19
319 | paper_notebook/frame_0006.jpg 19
320 | paper_notebook/frame_0007.jpg 19
321 | paper_notebook/frame_0008.jpg 19
322 | paper_notebook/frame_0009.jpg 19
323 | paper_notebook/frame_0010.jpg 19
324 | pen/frame_0001.jpg 20
325 | pen/frame_0002.jpg 20
326 | pen/frame_0003.jpg 20
327 | pen/frame_0004.jpg 20
328 | pen/frame_0005.jpg 20
329 | pen/frame_0006.jpg 20
330 | pen/frame_0007.jpg 20
331 | pen/frame_0008.jpg 20
332 | pen/frame_0009.jpg 20
333 | pen/frame_0010.jpg 20
334 | phone/frame_0001.jpg 21
335 | phone/frame_0002.jpg 21
336 | phone/frame_0003.jpg 21
337 | phone/frame_0004.jpg 21
338 | phone/frame_0005.jpg 21
339 | phone/frame_0006.jpg 21
340 | phone/frame_0007.jpg 21
341 | phone/frame_0008.jpg 21
342 | phone/frame_0009.jpg 21
343 | phone/frame_0010.jpg 21
344 | phone/frame_0011.jpg 21
345 | phone/frame_0012.jpg 21
346 | phone/frame_0013.jpg 21
347 | printer/frame_0001.jpg 22
348 | printer/frame_0002.jpg 22
349 | printer/frame_0003.jpg 22
350 | printer/frame_0004.jpg 22
351 | printer/frame_0005.jpg 22
352 | printer/frame_0006.jpg 22
353 | printer/frame_0007.jpg 22
354 | printer/frame_0008.jpg 22
355 | printer/frame_0009.jpg 22
356 | printer/frame_0010.jpg 22
357 | printer/frame_0011.jpg 22
358 | printer/frame_0012.jpg 22
359 | printer/frame_0013.jpg 22
360 | printer/frame_0014.jpg 22
361 | printer/frame_0015.jpg 22
362 | punchers/frame_0001.jpg 23
363 | punchers/frame_0002.jpg 23
364 | punchers/frame_0003.jpg 23
365 | punchers/frame_0004.jpg 23
366 | punchers/frame_0005.jpg 23
367 | punchers/frame_0006.jpg 23
368 | punchers/frame_0007.jpg 23
369 | punchers/frame_0008.jpg 23
370 | punchers/frame_0009.jpg 23
371 | punchers/frame_0010.jpg 23
372 | punchers/frame_0011.jpg 23
373 | punchers/frame_0012.jpg 23
374 | punchers/frame_0013.jpg 23
375 | punchers/frame_0014.jpg 23
376 | punchers/frame_0015.jpg 23
377 | punchers/frame_0016.jpg 23
378 | punchers/frame_0017.jpg 23
379 | punchers/frame_0018.jpg 23
380 | ring_binder/frame_0001.jpg 24
381 | ring_binder/frame_0002.jpg 24
382 | ring_binder/frame_0003.jpg 24
383 | ring_binder/frame_0004.jpg 24
384 | ring_binder/frame_0005.jpg 24
385 | ring_binder/frame_0006.jpg 24
386 | ring_binder/frame_0007.jpg 24
387 | ring_binder/frame_0008.jpg 24
388 | ring_binder/frame_0009.jpg 24
389 | ring_binder/frame_0010.jpg 24
390 | ruler/frame_0001.jpg 25
391 | ruler/frame_0002.jpg 25
392 | ruler/frame_0003.jpg 25
393 | ruler/frame_0004.jpg 25
394 | ruler/frame_0005.jpg 25
395 | ruler/frame_0006.jpg 25
396 | ruler/frame_0007.jpg 25
397 | scissors/frame_0001.jpg 26
398 | scissors/frame_0002.jpg 26
399 | scissors/frame_0003.jpg 26
400 | scissors/frame_0004.jpg 26
401 | scissors/frame_0005.jpg 26
402 | scissors/frame_0006.jpg 26
403 | scissors/frame_0007.jpg 26
404 | scissors/frame_0008.jpg 26
405 | scissors/frame_0009.jpg 26
406 | scissors/frame_0010.jpg 26
407 | scissors/frame_0011.jpg 26
408 | scissors/frame_0012.jpg 26
409 | scissors/frame_0013.jpg 26
410 | scissors/frame_0014.jpg 26
411 | scissors/frame_0015.jpg 26
412 | scissors/frame_0016.jpg 26
413 | scissors/frame_0017.jpg 26
414 | scissors/frame_0018.jpg 26
415 | speaker/frame_0001.jpg 27
416 | speaker/frame_0002.jpg 27
417 | speaker/frame_0003.jpg 27
418 | speaker/frame_0004.jpg 27
419 | speaker/frame_0005.jpg 27
420 | speaker/frame_0006.jpg 27
421 | speaker/frame_0007.jpg 27
422 | speaker/frame_0008.jpg 27
423 | speaker/frame_0009.jpg 27
424 | speaker/frame_0010.jpg 27
425 | speaker/frame_0011.jpg 27
426 | speaker/frame_0012.jpg 27
427 | speaker/frame_0013.jpg 27
428 | speaker/frame_0014.jpg 27
429 | speaker/frame_0015.jpg 27
430 | speaker/frame_0016.jpg 27
431 | speaker/frame_0017.jpg 27
432 | speaker/frame_0018.jpg 27
433 | speaker/frame_0019.jpg 27
434 | speaker/frame_0020.jpg 27
435 | speaker/frame_0021.jpg 27
436 | speaker/frame_0022.jpg 27
437 | speaker/frame_0023.jpg 27
438 | speaker/frame_0024.jpg 27
439 | speaker/frame_0025.jpg 27
440 | speaker/frame_0026.jpg 27
441 | stapler/frame_0001.jpg 28
442 | stapler/frame_0002.jpg 28
443 | stapler/frame_0003.jpg 28
444 | stapler/frame_0004.jpg 28
445 | stapler/frame_0005.jpg 28
446 | stapler/frame_0006.jpg 28
447 | stapler/frame_0007.jpg 28
448 | stapler/frame_0008.jpg 28
449 | stapler/frame_0009.jpg 28
450 | stapler/frame_0010.jpg 28
451 | stapler/frame_0011.jpg 28
452 | stapler/frame_0012.jpg 28
453 | stapler/frame_0013.jpg 28
454 | stapler/frame_0014.jpg 28
455 | stapler/frame_0015.jpg 28
456 | stapler/frame_0016.jpg 28
457 | stapler/frame_0017.jpg 28
458 | stapler/frame_0018.jpg 28
459 | stapler/frame_0019.jpg 28
460 | stapler/frame_0020.jpg 28
461 | stapler/frame_0021.jpg 28
462 | tape_dispenser/frame_0001.jpg 29
463 | tape_dispenser/frame_0002.jpg 29
464 | tape_dispenser/frame_0003.jpg 29
465 | tape_dispenser/frame_0004.jpg 29
466 | tape_dispenser/frame_0005.jpg 29
467 | tape_dispenser/frame_0006.jpg 29
468 | tape_dispenser/frame_0007.jpg 29
469 | tape_dispenser/frame_0008.jpg 29
470 | tape_dispenser/frame_0009.jpg 29
471 | tape_dispenser/frame_0010.jpg 29
472 | tape_dispenser/frame_0011.jpg 29
473 | tape_dispenser/frame_0012.jpg 29
474 | tape_dispenser/frame_0013.jpg 29
475 | tape_dispenser/frame_0014.jpg 29
476 | tape_dispenser/frame_0015.jpg 29
477 | tape_dispenser/frame_0016.jpg 29
478 | tape_dispenser/frame_0017.jpg 29
479 | tape_dispenser/frame_0018.jpg 29
480 | tape_dispenser/frame_0019.jpg 29
481 | tape_dispenser/frame_0020.jpg 29
482 | tape_dispenser/frame_0021.jpg 29
483 | tape_dispenser/frame_0022.jpg 29
484 | trash_can/frame_0001.jpg 30
485 | trash_can/frame_0002.jpg 30
486 | trash_can/frame_0003.jpg 30
487 | trash_can/frame_0004.jpg 30
488 | trash_can/frame_0005.jpg 30
489 | trash_can/frame_0006.jpg 30
490 | trash_can/frame_0007.jpg 30
491 | trash_can/frame_0008.jpg 30
492 | trash_can/frame_0009.jpg 30
493 | trash_can/frame_0010.jpg 30
494 | trash_can/frame_0011.jpg 30
495 | trash_can/frame_0012.jpg 30
496 | trash_can/frame_0013.jpg 30
497 | trash_can/frame_0014.jpg 30
498 | trash_can/frame_0015.jpg 30
--------------------------------------------------------------------------------
/data/Office/Webcam/image_unida_list.txt:
--------------------------------------------------------------------------------
1 | back_pack/frame_0001.jpg 0
2 | back_pack/frame_0002.jpg 0
3 | back_pack/frame_0003.jpg 0
4 | back_pack/frame_0004.jpg 0
5 | back_pack/frame_0005.jpg 0
6 | back_pack/frame_0006.jpg 0
7 | back_pack/frame_0007.jpg 0
8 | back_pack/frame_0008.jpg 0
9 | back_pack/frame_0009.jpg 0
10 | back_pack/frame_0010.jpg 0
11 | back_pack/frame_0011.jpg 0
12 | back_pack/frame_0012.jpg 0
13 | back_pack/frame_0013.jpg 0
14 | back_pack/frame_0014.jpg 0
15 | back_pack/frame_0015.jpg 0
16 | back_pack/frame_0016.jpg 0
17 | back_pack/frame_0017.jpg 0
18 | back_pack/frame_0018.jpg 0
19 | back_pack/frame_0019.jpg 0
20 | back_pack/frame_0020.jpg 0
21 | back_pack/frame_0021.jpg 0
22 | back_pack/frame_0022.jpg 0
23 | back_pack/frame_0023.jpg 0
24 | back_pack/frame_0024.jpg 0
25 | back_pack/frame_0025.jpg 0
26 | back_pack/frame_0026.jpg 0
27 | back_pack/frame_0027.jpg 0
28 | back_pack/frame_0028.jpg 0
29 | back_pack/frame_0029.jpg 0
30 | bike/frame_0001.jpg 1
31 | bike/frame_0002.jpg 1
32 | bike/frame_0003.jpg 1
33 | bike/frame_0004.jpg 1
34 | bike/frame_0005.jpg 1
35 | bike/frame_0006.jpg 1
36 | bike/frame_0007.jpg 1
37 | bike/frame_0008.jpg 1
38 | bike/frame_0009.jpg 1
39 | bike/frame_0010.jpg 1
40 | bike/frame_0011.jpg 1
41 | bike/frame_0012.jpg 1
42 | bike/frame_0013.jpg 1
43 | bike/frame_0014.jpg 1
44 | bike/frame_0015.jpg 1
45 | bike/frame_0016.jpg 1
46 | bike/frame_0017.jpg 1
47 | bike/frame_0018.jpg 1
48 | bike/frame_0019.jpg 1
49 | bike/frame_0020.jpg 1
50 | bike/frame_0021.jpg 1
51 | calculator/frame_0001.jpg 2
52 | calculator/frame_0002.jpg 2
53 | calculator/frame_0003.jpg 2
54 | calculator/frame_0004.jpg 2
55 | calculator/frame_0005.jpg 2
56 | calculator/frame_0006.jpg 2
57 | calculator/frame_0007.jpg 2
58 | calculator/frame_0008.jpg 2
59 | calculator/frame_0009.jpg 2
60 | calculator/frame_0010.jpg 2
61 | calculator/frame_0011.jpg 2
62 | calculator/frame_0012.jpg 2
63 | calculator/frame_0013.jpg 2
64 | calculator/frame_0014.jpg 2
65 | calculator/frame_0015.jpg 2
66 | calculator/frame_0016.jpg 2
67 | calculator/frame_0017.jpg 2
68 | calculator/frame_0018.jpg 2
69 | calculator/frame_0019.jpg 2
70 | calculator/frame_0020.jpg 2
71 | calculator/frame_0021.jpg 2
72 | calculator/frame_0022.jpg 2
73 | calculator/frame_0023.jpg 2
74 | calculator/frame_0024.jpg 2
75 | calculator/frame_0025.jpg 2
76 | calculator/frame_0026.jpg 2
77 | calculator/frame_0027.jpg 2
78 | calculator/frame_0028.jpg 2
79 | calculator/frame_0029.jpg 2
80 | calculator/frame_0030.jpg 2
81 | calculator/frame_0031.jpg 2
82 | headphones/frame_0001.jpg 3
83 | headphones/frame_0002.jpg 3
84 | headphones/frame_0003.jpg 3
85 | headphones/frame_0004.jpg 3
86 | headphones/frame_0005.jpg 3
87 | headphones/frame_0006.jpg 3
88 | headphones/frame_0007.jpg 3
89 | headphones/frame_0008.jpg 3
90 | headphones/frame_0009.jpg 3
91 | headphones/frame_0010.jpg 3
92 | headphones/frame_0011.jpg 3
93 | headphones/frame_0012.jpg 3
94 | headphones/frame_0013.jpg 3
95 | headphones/frame_0014.jpg 3
96 | headphones/frame_0015.jpg 3
97 | headphones/frame_0016.jpg 3
98 | headphones/frame_0017.jpg 3
99 | headphones/frame_0018.jpg 3
100 | headphones/frame_0019.jpg 3
101 | headphones/frame_0020.jpg 3
102 | headphones/frame_0021.jpg 3
103 | headphones/frame_0022.jpg 3
104 | headphones/frame_0023.jpg 3
105 | headphones/frame_0024.jpg 3
106 | headphones/frame_0025.jpg 3
107 | headphones/frame_0026.jpg 3
108 | headphones/frame_0027.jpg 3
109 | keyboard/frame_0001.jpg 4
110 | keyboard/frame_0002.jpg 4
111 | keyboard/frame_0003.jpg 4
112 | keyboard/frame_0004.jpg 4
113 | keyboard/frame_0005.jpg 4
114 | keyboard/frame_0006.jpg 4
115 | keyboard/frame_0007.jpg 4
116 | keyboard/frame_0008.jpg 4
117 | keyboard/frame_0009.jpg 4
118 | keyboard/frame_0010.jpg 4
119 | keyboard/frame_0011.jpg 4
120 | keyboard/frame_0012.jpg 4
121 | keyboard/frame_0013.jpg 4
122 | keyboard/frame_0014.jpg 4
123 | keyboard/frame_0015.jpg 4
124 | keyboard/frame_0016.jpg 4
125 | keyboard/frame_0017.jpg 4
126 | keyboard/frame_0018.jpg 4
127 | keyboard/frame_0019.jpg 4
128 | keyboard/frame_0020.jpg 4
129 | keyboard/frame_0021.jpg 4
130 | keyboard/frame_0022.jpg 4
131 | keyboard/frame_0023.jpg 4
132 | keyboard/frame_0024.jpg 4
133 | keyboard/frame_0025.jpg 4
134 | keyboard/frame_0026.jpg 4
135 | keyboard/frame_0027.jpg 4
136 | laptop_computer/frame_0001.jpg 5
137 | laptop_computer/frame_0002.jpg 5
138 | laptop_computer/frame_0003.jpg 5
139 | laptop_computer/frame_0004.jpg 5
140 | laptop_computer/frame_0005.jpg 5
141 | laptop_computer/frame_0006.jpg 5
142 | laptop_computer/frame_0007.jpg 5
143 | laptop_computer/frame_0008.jpg 5
144 | laptop_computer/frame_0009.jpg 5
145 | laptop_computer/frame_0010.jpg 5
146 | laptop_computer/frame_0011.jpg 5
147 | laptop_computer/frame_0012.jpg 5
148 | laptop_computer/frame_0013.jpg 5
149 | laptop_computer/frame_0014.jpg 5
150 | laptop_computer/frame_0015.jpg 5
151 | laptop_computer/frame_0016.jpg 5
152 | laptop_computer/frame_0017.jpg 5
153 | laptop_computer/frame_0018.jpg 5
154 | laptop_computer/frame_0019.jpg 5
155 | laptop_computer/frame_0020.jpg 5
156 | laptop_computer/frame_0021.jpg 5
157 | laptop_computer/frame_0022.jpg 5
158 | laptop_computer/frame_0023.jpg 5
159 | laptop_computer/frame_0024.jpg 5
160 | laptop_computer/frame_0025.jpg 5
161 | laptop_computer/frame_0026.jpg 5
162 | laptop_computer/frame_0027.jpg 5
163 | laptop_computer/frame_0028.jpg 5
164 | laptop_computer/frame_0029.jpg 5
165 | laptop_computer/frame_0030.jpg 5
166 | monitor/frame_0001.jpg 6
167 | monitor/frame_0002.jpg 6
168 | monitor/frame_0003.jpg 6
169 | monitor/frame_0004.jpg 6
170 | monitor/frame_0005.jpg 6
171 | monitor/frame_0006.jpg 6
172 | monitor/frame_0007.jpg 6
173 | monitor/frame_0008.jpg 6
174 | monitor/frame_0009.jpg 6
175 | monitor/frame_0010.jpg 6
176 | monitor/frame_0011.jpg 6
177 | monitor/frame_0012.jpg 6
178 | monitor/frame_0013.jpg 6
179 | monitor/frame_0014.jpg 6
180 | monitor/frame_0015.jpg 6
181 | monitor/frame_0016.jpg 6
182 | monitor/frame_0017.jpg 6
183 | monitor/frame_0018.jpg 6
184 | monitor/frame_0019.jpg 6
185 | monitor/frame_0020.jpg 6
186 | monitor/frame_0021.jpg 6
187 | monitor/frame_0022.jpg 6
188 | monitor/frame_0023.jpg 6
189 | monitor/frame_0024.jpg 6
190 | monitor/frame_0025.jpg 6
191 | monitor/frame_0026.jpg 6
192 | monitor/frame_0027.jpg 6
193 | monitor/frame_0028.jpg 6
194 | monitor/frame_0029.jpg 6
195 | monitor/frame_0030.jpg 6
196 | monitor/frame_0031.jpg 6
197 | monitor/frame_0032.jpg 6
198 | monitor/frame_0033.jpg 6
199 | monitor/frame_0034.jpg 6
200 | monitor/frame_0035.jpg 6
201 | monitor/frame_0036.jpg 6
202 | monitor/frame_0037.jpg 6
203 | monitor/frame_0038.jpg 6
204 | monitor/frame_0039.jpg 6
205 | monitor/frame_0040.jpg 6
206 | monitor/frame_0041.jpg 6
207 | monitor/frame_0042.jpg 6
208 | monitor/frame_0043.jpg 6
209 | mouse/frame_0001.jpg 7
210 | mouse/frame_0002.jpg 7
211 | mouse/frame_0003.jpg 7
212 | mouse/frame_0004.jpg 7
213 | mouse/frame_0005.jpg 7
214 | mouse/frame_0006.jpg 7
215 | mouse/frame_0007.jpg 7
216 | mouse/frame_0008.jpg 7
217 | mouse/frame_0009.jpg 7
218 | mouse/frame_0010.jpg 7
219 | mouse/frame_0011.jpg 7
220 | mouse/frame_0012.jpg 7
221 | mouse/frame_0013.jpg 7
222 | mouse/frame_0014.jpg 7
223 | mouse/frame_0015.jpg 7
224 | mouse/frame_0016.jpg 7
225 | mouse/frame_0017.jpg 7
226 | mouse/frame_0018.jpg 7
227 | mouse/frame_0019.jpg 7
228 | mouse/frame_0020.jpg 7
229 | mouse/frame_0021.jpg 7
230 | mouse/frame_0022.jpg 7
231 | mouse/frame_0023.jpg 7
232 | mouse/frame_0024.jpg 7
233 | mouse/frame_0025.jpg 7
234 | mouse/frame_0026.jpg 7
235 | mouse/frame_0027.jpg 7
236 | mouse/frame_0028.jpg 7
237 | mouse/frame_0029.jpg 7
238 | mouse/frame_0030.jpg 7
239 | mug/frame_0001.jpg 8
240 | mug/frame_0002.jpg 8
241 | mug/frame_0003.jpg 8
242 | mug/frame_0004.jpg 8
243 | mug/frame_0005.jpg 8
244 | mug/frame_0006.jpg 8
245 | mug/frame_0007.jpg 8
246 | mug/frame_0008.jpg 8
247 | mug/frame_0009.jpg 8
248 | mug/frame_0010.jpg 8
249 | mug/frame_0011.jpg 8
250 | mug/frame_0012.jpg 8
251 | mug/frame_0013.jpg 8
252 | mug/frame_0014.jpg 8
253 | mug/frame_0015.jpg 8
254 | mug/frame_0016.jpg 8
255 | mug/frame_0017.jpg 8
256 | mug/frame_0018.jpg 8
257 | mug/frame_0019.jpg 8
258 | mug/frame_0020.jpg 8
259 | mug/frame_0021.jpg 8
260 | mug/frame_0022.jpg 8
261 | mug/frame_0023.jpg 8
262 | mug/frame_0024.jpg 8
263 | mug/frame_0025.jpg 8
264 | mug/frame_0026.jpg 8
265 | mug/frame_0027.jpg 8
266 | projector/frame_0001.jpg 9
267 | projector/frame_0002.jpg 9
268 | projector/frame_0003.jpg 9
269 | projector/frame_0004.jpg 9
270 | projector/frame_0005.jpg 9
271 | projector/frame_0006.jpg 9
272 | projector/frame_0007.jpg 9
273 | projector/frame_0008.jpg 9
274 | projector/frame_0009.jpg 9
275 | projector/frame_0010.jpg 9
276 | projector/frame_0011.jpg 9
277 | projector/frame_0012.jpg 9
278 | projector/frame_0013.jpg 9
279 | projector/frame_0014.jpg 9
280 | projector/frame_0015.jpg 9
281 | projector/frame_0016.jpg 9
282 | projector/frame_0017.jpg 9
283 | projector/frame_0018.jpg 9
284 | projector/frame_0019.jpg 9
285 | projector/frame_0020.jpg 9
286 | projector/frame_0021.jpg 9
287 | projector/frame_0022.jpg 9
288 | projector/frame_0023.jpg 9
289 | projector/frame_0024.jpg 9
290 | projector/frame_0025.jpg 9
291 | projector/frame_0026.jpg 9
292 | projector/frame_0027.jpg 9
293 | projector/frame_0028.jpg 9
294 | projector/frame_0029.jpg 9
295 | projector/frame_0030.jpg 9
296 | bike_helmet/frame_0001.jpg 10
297 | bike_helmet/frame_0002.jpg 10
298 | bike_helmet/frame_0003.jpg 10
299 | bike_helmet/frame_0004.jpg 10
300 | bike_helmet/frame_0005.jpg 10
301 | bike_helmet/frame_0006.jpg 10
302 | bike_helmet/frame_0007.jpg 10
303 | bike_helmet/frame_0008.jpg 10
304 | bike_helmet/frame_0009.jpg 10
305 | bike_helmet/frame_0010.jpg 10
306 | bike_helmet/frame_0011.jpg 10
307 | bike_helmet/frame_0012.jpg 10
308 | bike_helmet/frame_0013.jpg 10
309 | bike_helmet/frame_0014.jpg 10
310 | bike_helmet/frame_0015.jpg 10
311 | bike_helmet/frame_0016.jpg 10
312 | bike_helmet/frame_0017.jpg 10
313 | bike_helmet/frame_0018.jpg 10
314 | bike_helmet/frame_0019.jpg 10
315 | bike_helmet/frame_0020.jpg 10
316 | bike_helmet/frame_0021.jpg 10
317 | bike_helmet/frame_0022.jpg 10
318 | bike_helmet/frame_0023.jpg 10
319 | bike_helmet/frame_0024.jpg 10
320 | bike_helmet/frame_0025.jpg 10
321 | bike_helmet/frame_0026.jpg 10
322 | bike_helmet/frame_0027.jpg 10
323 | bike_helmet/frame_0028.jpg 10
324 | bookcase/frame_0001.jpg 11
325 | bookcase/frame_0002.jpg 11
326 | bookcase/frame_0003.jpg 11
327 | bookcase/frame_0004.jpg 11
328 | bookcase/frame_0005.jpg 11
329 | bookcase/frame_0006.jpg 11
330 | bookcase/frame_0007.jpg 11
331 | bookcase/frame_0008.jpg 11
332 | bookcase/frame_0009.jpg 11
333 | bookcase/frame_0010.jpg 11
334 | bookcase/frame_0011.jpg 11
335 | bookcase/frame_0012.jpg 11
336 | bottle/frame_0001.jpg 12
337 | bottle/frame_0002.jpg 12
338 | bottle/frame_0003.jpg 12
339 | bottle/frame_0004.jpg 12
340 | bottle/frame_0005.jpg 12
341 | bottle/frame_0006.jpg 12
342 | bottle/frame_0007.jpg 12
343 | bottle/frame_0008.jpg 12
344 | bottle/frame_0009.jpg 12
345 | bottle/frame_0010.jpg 12
346 | bottle/frame_0011.jpg 12
347 | bottle/frame_0012.jpg 12
348 | bottle/frame_0013.jpg 12
349 | bottle/frame_0014.jpg 12
350 | bottle/frame_0015.jpg 12
351 | bottle/frame_0016.jpg 12
352 | desk_chair/frame_0001.jpg 13
353 | desk_chair/frame_0002.jpg 13
354 | desk_chair/frame_0003.jpg 13
355 | desk_chair/frame_0004.jpg 13
356 | desk_chair/frame_0005.jpg 13
357 | desk_chair/frame_0006.jpg 13
358 | desk_chair/frame_0007.jpg 13
359 | desk_chair/frame_0008.jpg 13
360 | desk_chair/frame_0009.jpg 13
361 | desk_chair/frame_0010.jpg 13
362 | desk_chair/frame_0011.jpg 13
363 | desk_chair/frame_0012.jpg 13
364 | desk_chair/frame_0013.jpg 13
365 | desk_chair/frame_0014.jpg 13
366 | desk_chair/frame_0015.jpg 13
367 | desk_chair/frame_0016.jpg 13
368 | desk_chair/frame_0017.jpg 13
369 | desk_chair/frame_0018.jpg 13
370 | desk_chair/frame_0019.jpg 13
371 | desk_chair/frame_0020.jpg 13
372 | desk_chair/frame_0021.jpg 13
373 | desk_chair/frame_0022.jpg 13
374 | desk_chair/frame_0023.jpg 13
375 | desk_chair/frame_0024.jpg 13
376 | desk_chair/frame_0025.jpg 13
377 | desk_chair/frame_0026.jpg 13
378 | desk_chair/frame_0027.jpg 13
379 | desk_chair/frame_0028.jpg 13
380 | desk_chair/frame_0029.jpg 13
381 | desk_chair/frame_0030.jpg 13
382 | desk_chair/frame_0031.jpg 13
383 | desk_chair/frame_0032.jpg 13
384 | desk_chair/frame_0033.jpg 13
385 | desk_chair/frame_0034.jpg 13
386 | desk_chair/frame_0035.jpg 13
387 | desk_chair/frame_0036.jpg 13
388 | desk_chair/frame_0037.jpg 13
389 | desk_chair/frame_0038.jpg 13
390 | desk_chair/frame_0039.jpg 13
391 | desk_chair/frame_0040.jpg 13
392 | desk_lamp/frame_0001.jpg 14
393 | desk_lamp/frame_0002.jpg 14
394 | desk_lamp/frame_0003.jpg 14
395 | desk_lamp/frame_0004.jpg 14
396 | desk_lamp/frame_0005.jpg 14
397 | desk_lamp/frame_0006.jpg 14
398 | desk_lamp/frame_0007.jpg 14
399 | desk_lamp/frame_0008.jpg 14
400 | desk_lamp/frame_0009.jpg 14
401 | desk_lamp/frame_0010.jpg 14
402 | desk_lamp/frame_0011.jpg 14
403 | desk_lamp/frame_0012.jpg 14
404 | desk_lamp/frame_0013.jpg 14
405 | desk_lamp/frame_0014.jpg 14
406 | desk_lamp/frame_0015.jpg 14
407 | desk_lamp/frame_0016.jpg 14
408 | desk_lamp/frame_0017.jpg 14
409 | desk_lamp/frame_0018.jpg 14
410 | desktop_computer/frame_0001.jpg 15
411 | desktop_computer/frame_0002.jpg 15
412 | desktop_computer/frame_0003.jpg 15
413 | desktop_computer/frame_0004.jpg 15
414 | desktop_computer/frame_0005.jpg 15
415 | desktop_computer/frame_0006.jpg 15
416 | desktop_computer/frame_0007.jpg 15
417 | desktop_computer/frame_0008.jpg 15
418 | desktop_computer/frame_0009.jpg 15
419 | desktop_computer/frame_0010.jpg 15
420 | desktop_computer/frame_0011.jpg 15
421 | desktop_computer/frame_0012.jpg 15
422 | desktop_computer/frame_0013.jpg 15
423 | desktop_computer/frame_0014.jpg 15
424 | desktop_computer/frame_0015.jpg 15
425 | desktop_computer/frame_0016.jpg 15
426 | desktop_computer/frame_0017.jpg 15
427 | desktop_computer/frame_0018.jpg 15
428 | desktop_computer/frame_0019.jpg 15
429 | desktop_computer/frame_0020.jpg 15
430 | desktop_computer/frame_0021.jpg 15
431 | file_cabinet/frame_0001.jpg 16
432 | file_cabinet/frame_0002.jpg 16
433 | file_cabinet/frame_0003.jpg 16
434 | file_cabinet/frame_0004.jpg 16
435 | file_cabinet/frame_0005.jpg 16
436 | file_cabinet/frame_0006.jpg 16
437 | file_cabinet/frame_0007.jpg 16
438 | file_cabinet/frame_0008.jpg 16
439 | file_cabinet/frame_0009.jpg 16
440 | file_cabinet/frame_0010.jpg 16
441 | file_cabinet/frame_0011.jpg 16
442 | file_cabinet/frame_0012.jpg 16
443 | file_cabinet/frame_0013.jpg 16
444 | file_cabinet/frame_0014.jpg 16
445 | file_cabinet/frame_0015.jpg 16
446 | file_cabinet/frame_0016.jpg 16
447 | file_cabinet/frame_0017.jpg 16
448 | file_cabinet/frame_0018.jpg 16
449 | file_cabinet/frame_0019.jpg 16
450 | letter_tray/frame_0001.jpg 17
451 | letter_tray/frame_0002.jpg 17
452 | letter_tray/frame_0003.jpg 17
453 | letter_tray/frame_0004.jpg 17
454 | letter_tray/frame_0005.jpg 17
455 | letter_tray/frame_0006.jpg 17
456 | letter_tray/frame_0007.jpg 17
457 | letter_tray/frame_0008.jpg 17
458 | letter_tray/frame_0009.jpg 17
459 | letter_tray/frame_0010.jpg 17
460 | letter_tray/frame_0011.jpg 17
461 | letter_tray/frame_0012.jpg 17
462 | letter_tray/frame_0013.jpg 17
463 | letter_tray/frame_0014.jpg 17
464 | letter_tray/frame_0015.jpg 17
465 | letter_tray/frame_0016.jpg 17
466 | letter_tray/frame_0017.jpg 17
467 | letter_tray/frame_0018.jpg 17
468 | letter_tray/frame_0019.jpg 17
469 | mobile_phone/frame_0001.jpg 18
470 | mobile_phone/frame_0002.jpg 18
471 | mobile_phone/frame_0003.jpg 18
472 | mobile_phone/frame_0004.jpg 18
473 | mobile_phone/frame_0005.jpg 18
474 | mobile_phone/frame_0006.jpg 18
475 | mobile_phone/frame_0007.jpg 18
476 | mobile_phone/frame_0008.jpg 18
477 | mobile_phone/frame_0009.jpg 18
478 | mobile_phone/frame_0010.jpg 18
479 | mobile_phone/frame_0011.jpg 18
480 | mobile_phone/frame_0012.jpg 18
481 | mobile_phone/frame_0013.jpg 18
482 | mobile_phone/frame_0014.jpg 18
483 | mobile_phone/frame_0015.jpg 18
484 | mobile_phone/frame_0016.jpg 18
485 | mobile_phone/frame_0017.jpg 18
486 | mobile_phone/frame_0018.jpg 18
487 | mobile_phone/frame_0019.jpg 18
488 | mobile_phone/frame_0020.jpg 18
489 | mobile_phone/frame_0021.jpg 18
490 | mobile_phone/frame_0022.jpg 18
491 | mobile_phone/frame_0023.jpg 18
492 | mobile_phone/frame_0024.jpg 18
493 | mobile_phone/frame_0025.jpg 18
494 | mobile_phone/frame_0026.jpg 18
495 | mobile_phone/frame_0027.jpg 18
496 | mobile_phone/frame_0028.jpg 18
497 | mobile_phone/frame_0029.jpg 18
498 | mobile_phone/frame_0030.jpg 18
499 | paper_notebook/frame_0001.jpg 19
500 | paper_notebook/frame_0002.jpg 19
501 | paper_notebook/frame_0003.jpg 19
502 | paper_notebook/frame_0004.jpg 19
503 | paper_notebook/frame_0005.jpg 19
504 | paper_notebook/frame_0006.jpg 19
505 | paper_notebook/frame_0007.jpg 19
506 | paper_notebook/frame_0008.jpg 19
507 | paper_notebook/frame_0009.jpg 19
508 | paper_notebook/frame_0010.jpg 19
509 | paper_notebook/frame_0011.jpg 19
510 | paper_notebook/frame_0012.jpg 19
511 | paper_notebook/frame_0013.jpg 19
512 | paper_notebook/frame_0014.jpg 19
513 | paper_notebook/frame_0015.jpg 19
514 | paper_notebook/frame_0016.jpg 19
515 | paper_notebook/frame_0017.jpg 19
516 | paper_notebook/frame_0018.jpg 19
517 | paper_notebook/frame_0019.jpg 19
518 | paper_notebook/frame_0020.jpg 19
519 | paper_notebook/frame_0021.jpg 19
520 | paper_notebook/frame_0022.jpg 19
521 | paper_notebook/frame_0023.jpg 19
522 | paper_notebook/frame_0024.jpg 19
523 | paper_notebook/frame_0025.jpg 19
524 | paper_notebook/frame_0026.jpg 19
525 | paper_notebook/frame_0027.jpg 19
526 | paper_notebook/frame_0028.jpg 19
527 | pen/frame_0001.jpg 20
528 | pen/frame_0002.jpg 20
529 | pen/frame_0003.jpg 20
530 | pen/frame_0004.jpg 20
531 | pen/frame_0005.jpg 20
532 | pen/frame_0006.jpg 20
533 | pen/frame_0007.jpg 20
534 | pen/frame_0008.jpg 20
535 | pen/frame_0009.jpg 20
536 | pen/frame_0010.jpg 20
537 | pen/frame_0011.jpg 20
538 | pen/frame_0012.jpg 20
539 | pen/frame_0013.jpg 20
540 | pen/frame_0014.jpg 20
541 | pen/frame_0015.jpg 20
542 | pen/frame_0016.jpg 20
543 | pen/frame_0017.jpg 20
544 | pen/frame_0018.jpg 20
545 | pen/frame_0019.jpg 20
546 | pen/frame_0020.jpg 20
547 | pen/frame_0021.jpg 20
548 | pen/frame_0022.jpg 20
549 | pen/frame_0023.jpg 20
550 | pen/frame_0024.jpg 20
551 | pen/frame_0025.jpg 20
552 | pen/frame_0026.jpg 20
553 | pen/frame_0027.jpg 20
554 | pen/frame_0028.jpg 20
555 | pen/frame_0029.jpg 20
556 | pen/frame_0030.jpg 20
557 | pen/frame_0031.jpg 20
558 | pen/frame_0032.jpg 20
559 | phone/frame_0001.jpg 21
560 | phone/frame_0002.jpg 21
561 | phone/frame_0003.jpg 21
562 | phone/frame_0004.jpg 21
563 | phone/frame_0005.jpg 21
564 | phone/frame_0006.jpg 21
565 | phone/frame_0007.jpg 21
566 | phone/frame_0008.jpg 21
567 | phone/frame_0009.jpg 21
568 | phone/frame_0010.jpg 21
569 | phone/frame_0011.jpg 21
570 | phone/frame_0012.jpg 21
571 | phone/frame_0013.jpg 21
572 | phone/frame_0014.jpg 21
573 | phone/frame_0015.jpg 21
574 | phone/frame_0016.jpg 21
575 | printer/frame_0001.jpg 22
576 | printer/frame_0002.jpg 22
577 | printer/frame_0003.jpg 22
578 | printer/frame_0004.jpg 22
579 | printer/frame_0005.jpg 22
580 | printer/frame_0006.jpg 22
581 | printer/frame_0007.jpg 22
582 | printer/frame_0008.jpg 22
583 | printer/frame_0009.jpg 22
584 | printer/frame_0010.jpg 22
585 | printer/frame_0011.jpg 22
586 | printer/frame_0012.jpg 22
587 | printer/frame_0013.jpg 22
588 | printer/frame_0014.jpg 22
589 | printer/frame_0015.jpg 22
590 | printer/frame_0016.jpg 22
591 | printer/frame_0017.jpg 22
592 | printer/frame_0018.jpg 22
593 | printer/frame_0019.jpg 22
594 | printer/frame_0020.jpg 22
595 | punchers/frame_0001.jpg 23
596 | punchers/frame_0002.jpg 23
597 | punchers/frame_0003.jpg 23
598 | punchers/frame_0004.jpg 23
599 | punchers/frame_0005.jpg 23
600 | punchers/frame_0006.jpg 23
601 | punchers/frame_0007.jpg 23
602 | punchers/frame_0008.jpg 23
603 | punchers/frame_0009.jpg 23
604 | punchers/frame_0010.jpg 23
605 | punchers/frame_0011.jpg 23
606 | punchers/frame_0012.jpg 23
607 | punchers/frame_0013.jpg 23
608 | punchers/frame_0014.jpg 23
609 | punchers/frame_0015.jpg 23
610 | punchers/frame_0016.jpg 23
611 | punchers/frame_0017.jpg 23
612 | punchers/frame_0018.jpg 23
613 | punchers/frame_0019.jpg 23
614 | punchers/frame_0020.jpg 23
615 | punchers/frame_0021.jpg 23
616 | punchers/frame_0022.jpg 23
617 | punchers/frame_0023.jpg 23
618 | punchers/frame_0024.jpg 23
619 | punchers/frame_0025.jpg 23
620 | punchers/frame_0026.jpg 23
621 | punchers/frame_0027.jpg 23
622 | ring_binder/frame_0001.jpg 24
623 | ring_binder/frame_0002.jpg 24
624 | ring_binder/frame_0003.jpg 24
625 | ring_binder/frame_0004.jpg 24
626 | ring_binder/frame_0005.jpg 24
627 | ring_binder/frame_0006.jpg 24
628 | ring_binder/frame_0007.jpg 24
629 | ring_binder/frame_0008.jpg 24
630 | ring_binder/frame_0009.jpg 24
631 | ring_binder/frame_0010.jpg 24
632 | ring_binder/frame_0011.jpg 24
633 | ring_binder/frame_0012.jpg 24
634 | ring_binder/frame_0013.jpg 24
635 | ring_binder/frame_0014.jpg 24
636 | ring_binder/frame_0015.jpg 24
637 | ring_binder/frame_0016.jpg 24
638 | ring_binder/frame_0017.jpg 24
639 | ring_binder/frame_0018.jpg 24
640 | ring_binder/frame_0019.jpg 24
641 | ring_binder/frame_0020.jpg 24
642 | ring_binder/frame_0021.jpg 24
643 | ring_binder/frame_0022.jpg 24
644 | ring_binder/frame_0023.jpg 24
645 | ring_binder/frame_0024.jpg 24
646 | ring_binder/frame_0025.jpg 24
647 | ring_binder/frame_0026.jpg 24
648 | ring_binder/frame_0027.jpg 24
649 | ring_binder/frame_0028.jpg 24
650 | ring_binder/frame_0029.jpg 24
651 | ring_binder/frame_0030.jpg 24
652 | ring_binder/frame_0031.jpg 24
653 | ring_binder/frame_0032.jpg 24
654 | ring_binder/frame_0033.jpg 24
655 | ring_binder/frame_0034.jpg 24
656 | ring_binder/frame_0035.jpg 24
657 | ring_binder/frame_0036.jpg 24
658 | ring_binder/frame_0037.jpg 24
659 | ring_binder/frame_0038.jpg 24
660 | ring_binder/frame_0039.jpg 24
661 | ring_binder/frame_0040.jpg 24
662 | ruler/frame_0001.jpg 25
663 | ruler/frame_0002.jpg 25
664 | ruler/frame_0003.jpg 25
665 | ruler/frame_0004.jpg 25
666 | ruler/frame_0005.jpg 25
667 | ruler/frame_0006.jpg 25
668 | ruler/frame_0007.jpg 25
669 | ruler/frame_0008.jpg 25
670 | ruler/frame_0009.jpg 25
671 | ruler/frame_0010.jpg 25
672 | ruler/frame_0011.jpg 25
673 | scissors/frame_0001.jpg 26
674 | scissors/frame_0002.jpg 26
675 | scissors/frame_0003.jpg 26
676 | scissors/frame_0004.jpg 26
677 | scissors/frame_0005.jpg 26
678 | scissors/frame_0006.jpg 26
679 | scissors/frame_0007.jpg 26
680 | scissors/frame_0008.jpg 26
681 | scissors/frame_0009.jpg 26
682 | scissors/frame_0010.jpg 26
683 | scissors/frame_0011.jpg 26
684 | scissors/frame_0012.jpg 26
685 | scissors/frame_0013.jpg 26
686 | scissors/frame_0014.jpg 26
687 | scissors/frame_0015.jpg 26
688 | scissors/frame_0016.jpg 26
689 | scissors/frame_0017.jpg 26
690 | scissors/frame_0018.jpg 26
691 | scissors/frame_0019.jpg 26
692 | scissors/frame_0020.jpg 26
693 | scissors/frame_0021.jpg 26
694 | scissors/frame_0022.jpg 26
695 | scissors/frame_0023.jpg 26
696 | scissors/frame_0024.jpg 26
697 | scissors/frame_0025.jpg 26
698 | speaker/frame_0001.jpg 27
699 | speaker/frame_0002.jpg 27
700 | speaker/frame_0003.jpg 27
701 | speaker/frame_0004.jpg 27
702 | speaker/frame_0005.jpg 27
703 | speaker/frame_0006.jpg 27
704 | speaker/frame_0007.jpg 27
705 | speaker/frame_0008.jpg 27
706 | speaker/frame_0009.jpg 27
707 | speaker/frame_0010.jpg 27
708 | speaker/frame_0011.jpg 27
709 | speaker/frame_0012.jpg 27
710 | speaker/frame_0013.jpg 27
711 | speaker/frame_0014.jpg 27
712 | speaker/frame_0015.jpg 27
713 | speaker/frame_0016.jpg 27
714 | speaker/frame_0017.jpg 27
715 | speaker/frame_0018.jpg 27
716 | speaker/frame_0019.jpg 27
717 | speaker/frame_0020.jpg 27
718 | speaker/frame_0021.jpg 27
719 | speaker/frame_0022.jpg 27
720 | speaker/frame_0023.jpg 27
721 | speaker/frame_0024.jpg 27
722 | speaker/frame_0025.jpg 27
723 | speaker/frame_0026.jpg 27
724 | speaker/frame_0027.jpg 27
725 | speaker/frame_0028.jpg 27
726 | speaker/frame_0029.jpg 27
727 | speaker/frame_0030.jpg 27
728 | stapler/frame_0001.jpg 28
729 | stapler/frame_0002.jpg 28
730 | stapler/frame_0003.jpg 28
731 | stapler/frame_0004.jpg 28
732 | stapler/frame_0005.jpg 28
733 | stapler/frame_0006.jpg 28
734 | stapler/frame_0007.jpg 28
735 | stapler/frame_0008.jpg 28
736 | stapler/frame_0009.jpg 28
737 | stapler/frame_0010.jpg 28
738 | stapler/frame_0011.jpg 28
739 | stapler/frame_0012.jpg 28
740 | stapler/frame_0013.jpg 28
741 | stapler/frame_0014.jpg 28
742 | stapler/frame_0015.jpg 28
743 | stapler/frame_0016.jpg 28
744 | stapler/frame_0017.jpg 28
745 | stapler/frame_0018.jpg 28
746 | stapler/frame_0019.jpg 28
747 | stapler/frame_0020.jpg 28
748 | stapler/frame_0021.jpg 28
749 | stapler/frame_0022.jpg 28
750 | stapler/frame_0023.jpg 28
751 | stapler/frame_0024.jpg 28
752 | tape_dispenser/frame_0001.jpg 29
753 | tape_dispenser/frame_0002.jpg 29
754 | tape_dispenser/frame_0003.jpg 29
755 | tape_dispenser/frame_0004.jpg 29
756 | tape_dispenser/frame_0005.jpg 29
757 | tape_dispenser/frame_0006.jpg 29
758 | tape_dispenser/frame_0007.jpg 29
759 | tape_dispenser/frame_0008.jpg 29
760 | tape_dispenser/frame_0009.jpg 29
761 | tape_dispenser/frame_0010.jpg 29
762 | tape_dispenser/frame_0011.jpg 29
763 | tape_dispenser/frame_0012.jpg 29
764 | tape_dispenser/frame_0013.jpg 29
765 | tape_dispenser/frame_0014.jpg 29
766 | tape_dispenser/frame_0015.jpg 29
767 | tape_dispenser/frame_0016.jpg 29
768 | tape_dispenser/frame_0017.jpg 29
769 | tape_dispenser/frame_0018.jpg 29
770 | tape_dispenser/frame_0019.jpg 29
771 | tape_dispenser/frame_0020.jpg 29
772 | tape_dispenser/frame_0021.jpg 29
773 | tape_dispenser/frame_0022.jpg 29
774 | tape_dispenser/frame_0023.jpg 29
775 | trash_can/frame_0001.jpg 30
776 | trash_can/frame_0002.jpg 30
777 | trash_can/frame_0003.jpg 30
778 | trash_can/frame_0004.jpg 30
779 | trash_can/frame_0005.jpg 30
780 | trash_can/frame_0006.jpg 30
781 | trash_can/frame_0007.jpg 30
782 | trash_can/frame_0008.jpg 30
783 | trash_can/frame_0009.jpg 30
784 | trash_can/frame_0010.jpg 30
785 | trash_can/frame_0011.jpg 30
786 | trash_can/frame_0012.jpg 30
787 | trash_can/frame_0013.jpg 30
788 | trash_can/frame_0014.jpg 30
789 | trash_can/frame_0015.jpg 30
790 | trash_can/frame_0016.jpg 30
791 | trash_can/frame_0017.jpg 30
792 | trash_can/frame_0018.jpg 30
793 | trash_can/frame_0019.jpg 30
794 | trash_can/frame_0020.jpg 30
795 | trash_can/frame_0021.jpg 30
--------------------------------------------------------------------------------
/data/OfficeHome/Art/image_unida_list.txt:
--------------------------------------------------------------------------------
1 | Alarm_Clock/00001.jpg 0
2 | Alarm_Clock/00002.jpg 0
3 | Alarm_Clock/00003.jpg 0
4 | Alarm_Clock/00004.jpg 0
5 | Alarm_Clock/00005.jpg 0
6 | Alarm_Clock/00006.jpg 0
7 | Alarm_Clock/00007.jpg 0
8 | Alarm_Clock/00008.jpg 0
9 | Alarm_Clock/00009.jpg 0
10 | Alarm_Clock/00010.jpg 0
11 | Alarm_Clock/00011.jpg 0
12 | Alarm_Clock/00012.jpg 0
13 | Alarm_Clock/00013.jpg 0
14 | Alarm_Clock/00014.jpg 0
15 | Alarm_Clock/00015.jpg 0
16 | Alarm_Clock/00016.jpg 0
17 | Alarm_Clock/00017.jpg 0
18 | Alarm_Clock/00018.jpg 0
19 | Alarm_Clock/00019.jpg 0
20 | Alarm_Clock/00020.jpg 0
21 | Alarm_Clock/00021.jpg 0
22 | Alarm_Clock/00022.jpg 0
23 | Alarm_Clock/00023.jpg 0
24 | Alarm_Clock/00024.jpg 0
25 | Alarm_Clock/00025.jpg 0
26 | Alarm_Clock/00026.jpg 0
27 | Alarm_Clock/00027.jpg 0
28 | Alarm_Clock/00028.jpg 0
29 | Alarm_Clock/00029.jpg 0
30 | Alarm_Clock/00030.jpg 0
31 | Alarm_Clock/00031.jpg 0
32 | Alarm_Clock/00032.jpg 0
33 | Alarm_Clock/00033.jpg 0
34 | Alarm_Clock/00034.jpg 0
35 | Alarm_Clock/00035.jpg 0
36 | Alarm_Clock/00036.jpg 0
37 | Alarm_Clock/00037.jpg 0
38 | Alarm_Clock/00038.jpg 0
39 | Alarm_Clock/00039.jpg 0
40 | Alarm_Clock/00040.jpg 0
41 | Alarm_Clock/00041.jpg 0
42 | Alarm_Clock/00042.jpg 0
43 | Alarm_Clock/00043.jpg 0
44 | Alarm_Clock/00044.jpg 0
45 | Alarm_Clock/00045.jpg 0
46 | Alarm_Clock/00046.jpg 0
47 | Alarm_Clock/00047.jpg 0
48 | Alarm_Clock/00048.jpg 0
49 | Alarm_Clock/00049.jpg 0
50 | Alarm_Clock/00050.jpg 0
51 | Alarm_Clock/00051.jpg 0
52 | Alarm_Clock/00052.jpg 0
53 | Alarm_Clock/00053.jpg 0
54 | Alarm_Clock/00054.jpg 0
55 | Alarm_Clock/00055.jpg 0
56 | Alarm_Clock/00056.jpg 0
57 | Alarm_Clock/00057.jpg 0
58 | Alarm_Clock/00058.jpg 0
59 | Alarm_Clock/00059.jpg 0
60 | Alarm_Clock/00060.jpg 0
61 | Alarm_Clock/00061.jpg 0
62 | Alarm_Clock/00062.jpg 0
63 | Alarm_Clock/00063.jpg 0
64 | Alarm_Clock/00064.jpg 0
65 | Alarm_Clock/00065.jpg 0
66 | Alarm_Clock/00066.jpg 0
67 | Alarm_Clock/00067.jpg 0
68 | Alarm_Clock/00068.jpg 0
69 | Alarm_Clock/00069.jpg 0
70 | Alarm_Clock/00070.jpg 0
71 | Alarm_Clock/00071.jpg 0
72 | Alarm_Clock/00072.jpg 0
73 | Alarm_Clock/00073.jpg 0
74 | Alarm_Clock/00074.jpg 0
75 | Backpack/00001.jpg 1
76 | Backpack/00002.jpg 1
77 | Backpack/00003.jpg 1
78 | Backpack/00004.jpg 1
79 | Backpack/00005.jpg 1
80 | Backpack/00006.jpg 1
81 | Backpack/00007.jpg 1
82 | Backpack/00008.jpg 1
83 | Backpack/00009.jpg 1
84 | Backpack/00010.jpg 1
85 | Backpack/00011.jpg 1
86 | Backpack/00012.jpg 1
87 | Backpack/00013.jpg 1
88 | Backpack/00014.jpg 1
89 | Backpack/00015.jpg 1
90 | Backpack/00016.jpg 1
91 | Backpack/00017.jpg 1
92 | Backpack/00018.jpg 1
93 | Backpack/00019.jpg 1
94 | Backpack/00020.jpg 1
95 | Backpack/00021.jpg 1
96 | Backpack/00022.jpg 1
97 | Backpack/00023.jpg 1
98 | Backpack/00024.jpg 1
99 | Backpack/00025.jpg 1
100 | Backpack/00026.jpg 1
101 | Backpack/00027.jpg 1
102 | Backpack/00028.jpg 1
103 | Backpack/00029.jpg 1
104 | Backpack/00030.jpg 1
105 | Backpack/00031.jpg 1
106 | Backpack/00032.jpg 1
107 | Backpack/00033.jpg 1
108 | Backpack/00034.jpg 1
109 | Backpack/00035.jpg 1
110 | Backpack/00036.jpg 1
111 | Backpack/00037.jpg 1
112 | Backpack/00038.jpg 1
113 | Backpack/00039.jpg 1
114 | Backpack/00040.jpg 1
115 | Backpack/00041.jpg 1
116 | Batteries/00001.jpg 2
117 | Batteries/00002.jpg 2
118 | Batteries/00003.jpg 2
119 | Batteries/00004.jpg 2
120 | Batteries/00005.jpg 2
121 | Batteries/00006.jpg 2
122 | Batteries/00007.jpg 2
123 | Batteries/00008.jpg 2
124 | Batteries/00009.jpg 2
125 | Batteries/00010.jpg 2
126 | Batteries/00011.jpg 2
127 | Batteries/00012.jpg 2
128 | Batteries/00013.jpg 2
129 | Batteries/00014.jpg 2
130 | Batteries/00015.jpg 2
131 | Batteries/00016.jpg 2
132 | Batteries/00017.jpg 2
133 | Batteries/00018.jpg 2
134 | Batteries/00019.jpg 2
135 | Batteries/00020.jpg 2
136 | Batteries/00021.jpg 2
137 | Batteries/00022.jpg 2
138 | Batteries/00023.jpg 2
139 | Batteries/00024.jpg 2
140 | Batteries/00025.jpg 2
141 | Batteries/00026.jpg 2
142 | Batteries/00027.jpg 2
143 | Bed/00001.jpg 3
144 | Bed/00002.jpg 3
145 | Bed/00003.jpg 3
146 | Bed/00004.jpg 3
147 | Bed/00005.jpg 3
148 | Bed/00006.jpg 3
149 | Bed/00007.jpg 3
150 | Bed/00008.jpg 3
151 | Bed/00009.jpg 3
152 | Bed/00010.jpg 3
153 | Bed/00011.jpg 3
154 | Bed/00012.jpg 3
155 | Bed/00013.jpg 3
156 | Bed/00014.jpg 3
157 | Bed/00015.jpg 3
158 | Bed/00016.jpg 3
159 | Bed/00017.jpg 3
160 | Bed/00018.jpg 3
161 | Bed/00019.jpg 3
162 | Bed/00020.jpg 3
163 | Bed/00021.jpg 3
164 | Bed/00022.jpg 3
165 | Bed/00023.jpg 3
166 | Bed/00024.jpg 3
167 | Bed/00025.jpg 3
168 | Bed/00026.jpg 3
169 | Bed/00027.jpg 3
170 | Bed/00028.jpg 3
171 | Bed/00029.jpg 3
172 | Bed/00030.jpg 3
173 | Bed/00031.jpg 3
174 | Bed/00032.jpg 3
175 | Bed/00033.jpg 3
176 | Bed/00034.jpg 3
177 | Bed/00035.jpg 3
178 | Bed/00036.jpg 3
179 | Bed/00037.jpg 3
180 | Bed/00038.jpg 3
181 | Bed/00039.jpg 3
182 | Bed/00040.jpg 3
183 | Bike/00001.jpg 4
184 | Bike/00002.jpg 4
185 | Bike/00003.jpg 4
186 | Bike/00004.jpg 4
187 | Bike/00005.jpg 4
188 | Bike/00006.jpg 4
189 | Bike/00007.jpg 4
190 | Bike/00008.jpg 4
191 | Bike/00009.jpg 4
192 | Bike/00010.jpg 4
193 | Bike/00011.jpg 4
194 | Bike/00012.jpg 4
195 | Bike/00013.jpg 4
196 | Bike/00014.jpg 4
197 | Bike/00015.jpg 4
198 | Bike/00016.jpg 4
199 | Bike/00017.jpg 4
200 | Bike/00018.jpg 4
201 | Bike/00019.jpg 4
202 | Bike/00020.jpg 4
203 | Bike/00021.jpg 4
204 | Bike/00022.jpg 4
205 | Bike/00023.jpg 4
206 | Bike/00024.jpg 4
207 | Bike/00025.jpg 4
208 | Bike/00026.jpg 4
209 | Bike/00027.jpg 4
210 | Bike/00028.jpg 4
211 | Bike/00029.jpg 4
212 | Bike/00030.jpg 4
213 | Bike/00031.jpg 4
214 | Bike/00032.jpg 4
215 | Bike/00033.jpg 4
216 | Bike/00034.jpg 4
217 | Bike/00035.jpg 4
218 | Bike/00036.jpg 4
219 | Bike/00037.jpg 4
220 | Bike/00038.jpg 4
221 | Bike/00039.jpg 4
222 | Bike/00040.jpg 4
223 | Bike/00041.jpg 4
224 | Bike/00042.jpg 4
225 | Bike/00043.jpg 4
226 | Bike/00044.jpg 4
227 | Bike/00045.jpg 4
228 | Bike/00046.jpg 4
229 | Bike/00047.jpg 4
230 | Bike/00048.jpg 4
231 | Bike/00049.jpg 4
232 | Bike/00050.jpg 4
233 | Bike/00051.jpg 4
234 | Bike/00052.jpg 4
235 | Bike/00053.jpg 4
236 | Bike/00054.jpg 4
237 | Bike/00055.jpg 4
238 | Bike/00056.jpg 4
239 | Bike/00057.jpg 4
240 | Bike/00058.jpg 4
241 | Bike/00059.jpg 4
242 | Bike/00060.jpg 4
243 | Bike/00061.jpg 4
244 | Bike/00062.jpg 4
245 | Bike/00063.jpg 4
246 | Bike/00064.jpg 4
247 | Bike/00065.jpg 4
248 | Bike/00066.jpg 4
249 | Bike/00067.jpg 4
250 | Bike/00068.jpg 4
251 | Bike/00069.jpg 4
252 | Bike/00070.jpg 4
253 | Bike/00071.jpg 4
254 | Bike/00072.jpg 4
255 | Bike/00073.jpg 4
256 | Bike/00074.jpg 4
257 | Bike/00075.jpg 4
258 | Bottle/00001.jpg 5
259 | Bottle/00002.jpg 5
260 | Bottle/00003.jpg 5
261 | Bottle/00004.jpg 5
262 | Bottle/00005.jpg 5
263 | Bottle/00006.jpg 5
264 | Bottle/00007.jpg 5
265 | Bottle/00008.jpg 5
266 | Bottle/00009.jpg 5
267 | Bottle/00010.jpg 5
268 | Bottle/00011.jpg 5
269 | Bottle/00012.jpg 5
270 | Bottle/00013.jpg 5
271 | Bottle/00014.jpg 5
272 | Bottle/00015.jpg 5
273 | Bottle/00016.jpg 5
274 | Bottle/00017.jpg 5
275 | Bottle/00018.jpg 5
276 | Bottle/00019.jpg 5
277 | Bottle/00020.jpg 5
278 | Bottle/00021.jpg 5
279 | Bottle/00022.jpg 5
280 | Bottle/00023.jpg 5
281 | Bottle/00024.jpg 5
282 | Bottle/00025.jpg 5
283 | Bottle/00026.jpg 5
284 | Bottle/00027.jpg 5
285 | Bottle/00028.jpg 5
286 | Bottle/00029.jpg 5
287 | Bottle/00030.jpg 5
288 | Bottle/00031.jpg 5
289 | Bottle/00032.jpg 5
290 | Bottle/00033.jpg 5
291 | Bottle/00034.jpg 5
292 | Bottle/00035.jpg 5
293 | Bottle/00036.jpg 5
294 | Bottle/00037.jpg 5
295 | Bottle/00038.jpg 5
296 | Bottle/00039.jpg 5
297 | Bottle/00040.jpg 5
298 | Bottle/00041.jpg 5
299 | Bottle/00042.jpg 5
300 | Bottle/00043.jpg 5
301 | Bottle/00044.jpg 5
302 | Bottle/00045.jpg 5
303 | Bottle/00046.jpg 5
304 | Bottle/00047.jpg 5
305 | Bottle/00048.jpg 5
306 | Bottle/00049.jpg 5
307 | Bottle/00050.jpg 5
308 | Bottle/00051.jpg 5
309 | Bottle/00052.jpg 5
310 | Bottle/00053.jpg 5
311 | Bottle/00054.jpg 5
312 | Bottle/00055.jpg 5
313 | Bottle/00056.jpg 5
314 | Bottle/00057.jpg 5
315 | Bottle/00058.jpg 5
316 | Bottle/00059.jpg 5
317 | Bottle/00060.jpg 5
318 | Bottle/00061.jpg 5
319 | Bottle/00062.jpg 5
320 | Bottle/00063.jpg 5
321 | Bottle/00064.jpg 5
322 | Bottle/00065.jpg 5
323 | Bottle/00066.jpg 5
324 | Bottle/00067.jpg 5
325 | Bottle/00068.jpg 5
326 | Bottle/00069.jpg 5
327 | Bottle/00070.jpg 5
328 | Bottle/00071.jpg 5
329 | Bottle/00072.jpg 5
330 | Bottle/00073.jpg 5
331 | Bottle/00074.jpg 5
332 | Bottle/00075.jpg 5
333 | Bottle/00076.jpg 5
334 | Bottle/00077.jpg 5
335 | Bottle/00078.jpg 5
336 | Bottle/00079.jpg 5
337 | Bottle/00080.jpg 5
338 | Bottle/00081.jpg 5
339 | Bottle/00082.jpg 5
340 | Bottle/00083.jpg 5
341 | Bottle/00084.jpg 5
342 | Bottle/00085.jpg 5
343 | Bottle/00086.jpg 5
344 | Bottle/00087.jpg 5
345 | Bottle/00088.jpg 5
346 | Bottle/00089.jpg 5
347 | Bottle/00090.jpg 5
348 | Bottle/00091.jpg 5
349 | Bottle/00092.jpg 5
350 | Bottle/00093.jpg 5
351 | Bottle/00094.jpg 5
352 | Bottle/00095.jpg 5
353 | Bottle/00096.jpg 5
354 | Bottle/00097.jpg 5
355 | Bottle/00098.jpg 5
356 | Bottle/00099.jpg 5
357 | Bucket/00001.jpg 6
358 | Bucket/00002.jpg 6
359 | Bucket/00003.jpg 6
360 | Bucket/00004.jpg 6
361 | Bucket/00005.jpg 6
362 | Bucket/00006.jpg 6
363 | Bucket/00007.jpg 6
364 | Bucket/00008.jpg 6
365 | Bucket/00009.jpg 6
366 | Bucket/00010.jpg 6
367 | Bucket/00011.jpg 6
368 | Bucket/00012.jpg 6
369 | Bucket/00013.jpg 6
370 | Bucket/00014.jpg 6
371 | Bucket/00015.jpg 6
372 | Bucket/00016.jpg 6
373 | Bucket/00017.jpg 6
374 | Bucket/00018.jpg 6
375 | Bucket/00019.jpg 6
376 | Bucket/00020.jpg 6
377 | Bucket/00021.jpg 6
378 | Bucket/00022.jpg 6
379 | Bucket/00023.jpg 6
380 | Bucket/00024.jpg 6
381 | Bucket/00025.jpg 6
382 | Bucket/00026.jpg 6
383 | Bucket/00027.jpg 6
384 | Bucket/00028.jpg 6
385 | Bucket/00029.jpg 6
386 | Bucket/00030.jpg 6
387 | Bucket/00031.jpg 6
388 | Bucket/00032.jpg 6
389 | Bucket/00033.jpg 6
390 | Bucket/00034.jpg 6
391 | Bucket/00035.jpg 6
392 | Bucket/00036.jpg 6
393 | Bucket/00037.jpg 6
394 | Bucket/00038.jpg 6
395 | Bucket/00039.jpg 6
396 | Bucket/00040.jpg 6
397 | Calculator/00001.jpg 7
398 | Calculator/00002.jpg 7
399 | Calculator/00003.jpg 7
400 | Calculator/00004.jpg 7
401 | Calculator/00005.jpg 7
402 | Calculator/00006.jpg 7
403 | Calculator/00007.jpg 7
404 | Calculator/00008.jpg 7
405 | Calculator/00009.jpg 7
406 | Calculator/00010.jpg 7
407 | Calculator/00011.jpg 7
408 | Calculator/00012.jpg 7
409 | Calculator/00013.jpg 7
410 | Calculator/00014.jpg 7
411 | Calculator/00015.jpg 7
412 | Calculator/00016.jpg 7
413 | Calculator/00017.jpg 7
414 | Calculator/00018.jpg 7
415 | Calculator/00019.jpg 7
416 | Calculator/00020.jpg 7
417 | Calculator/00021.jpg 7
418 | Calculator/00022.jpg 7
419 | Calculator/00023.jpg 7
420 | Calculator/00024.jpg 7
421 | Calculator/00025.jpg 7
422 | Calculator/00026.jpg 7
423 | Calculator/00027.jpg 7
424 | Calculator/00028.jpg 7
425 | Calculator/00029.jpg 7
426 | Calculator/00030.jpg 7
427 | Calculator/00031.jpg 7
428 | Calculator/00032.jpg 7
429 | Calculator/00033.jpg 7
430 | Calendar/00001.jpg 8
431 | Calendar/00002.jpg 8
432 | Calendar/00003.jpg 8
433 | Calendar/00004.jpg 8
434 | Calendar/00005.jpg 8
435 | Calendar/00006.jpg 8
436 | Calendar/00007.jpg 8
437 | Calendar/00008.jpg 8
438 | Calendar/00009.jpg 8
439 | Calendar/00010.jpg 8
440 | Calendar/00011.jpg 8
441 | Calendar/00012.jpg 8
442 | Calendar/00013.jpg 8
443 | Calendar/00014.jpg 8
444 | Calendar/00015.jpg 8
445 | Calendar/00016.jpg 8
446 | Calendar/00017.jpg 8
447 | Calendar/00018.jpg 8
448 | Calendar/00019.jpg 8
449 | Calendar/00020.jpg 8
450 | Candles/00001.jpg 9
451 | Candles/00002.jpg 9
452 | Candles/00003.jpg 9
453 | Candles/00004.jpg 9
454 | Candles/00005.jpg 9
455 | Candles/00006.jpg 9
456 | Candles/00007.jpg 9
457 | Candles/00008.jpg 9
458 | Candles/00009.jpg 9
459 | Candles/00010.jpg 9
460 | Candles/00011.jpg 9
461 | Candles/00012.jpg 9
462 | Candles/00013.jpg 9
463 | Candles/00014.jpg 9
464 | Candles/00015.jpg 9
465 | Candles/00016.jpg 9
466 | Candles/00017.jpg 9
467 | Candles/00018.jpg 9
468 | Candles/00019.jpg 9
469 | Candles/00020.jpg 9
470 | Candles/00021.jpg 9
471 | Candles/00022.jpg 9
472 | Candles/00023.jpg 9
473 | Candles/00024.jpg 9
474 | Candles/00025.jpg 9
475 | Candles/00026.jpg 9
476 | Candles/00027.jpg 9
477 | Candles/00028.jpg 9
478 | Candles/00029.jpg 9
479 | Candles/00030.jpg 9
480 | Candles/00031.jpg 9
481 | Candles/00032.jpg 9
482 | Candles/00033.jpg 9
483 | Candles/00034.jpg 9
484 | Candles/00035.jpg 9
485 | Candles/00036.jpg 9
486 | Candles/00037.jpg 9
487 | Candles/00038.jpg 9
488 | Candles/00039.jpg 9
489 | Candles/00040.jpg 9
490 | Candles/00041.jpg 9
491 | Candles/00042.jpg 9
492 | Candles/00043.jpg 9
493 | Candles/00044.jpg 9
494 | Candles/00045.jpg 9
495 | Candles/00046.jpg 9
496 | Candles/00047.jpg 9
497 | Candles/00048.jpg 9
498 | Candles/00049.jpg 9
499 | Candles/00050.jpg 9
500 | Candles/00051.jpg 9
501 | Candles/00052.jpg 9
502 | Candles/00053.jpg 9
503 | Candles/00054.jpg 9
504 | Candles/00055.jpg 9
505 | Candles/00056.jpg 9
506 | Candles/00057.jpg 9
507 | Candles/00058.jpg 9
508 | Candles/00059.jpg 9
509 | Candles/00060.jpg 9
510 | Candles/00061.jpg 9
511 | Candles/00062.jpg 9
512 | Candles/00063.jpg 9
513 | Candles/00064.jpg 9
514 | Candles/00065.jpg 9
515 | Candles/00066.jpg 9
516 | Candles/00067.jpg 9
517 | Candles/00068.jpg 9
518 | Candles/00069.jpg 9
519 | Candles/00070.jpg 9
520 | Candles/00071.jpg 9
521 | Candles/00072.jpg 9
522 | Candles/00073.jpg 9
523 | Candles/00074.jpg 9
524 | Candles/00075.jpg 9
525 | Candles/00076.jpg 9
526 | Chair/00001.jpg 10
527 | Chair/00002.jpg 10
528 | Chair/00003.jpg 10
529 | Chair/00004.jpg 10
530 | Chair/00005.jpg 10
531 | Chair/00006.jpg 10
532 | Chair/00007.jpg 10
533 | Chair/00008.jpg 10
534 | Chair/00009.jpg 10
535 | Chair/00010.jpg 10
536 | Chair/00011.jpg 10
537 | Chair/00012.jpg 10
538 | Chair/00013.jpg 10
539 | Chair/00014.jpg 10
540 | Chair/00015.jpg 10
541 | Chair/00016.jpg 10
542 | Chair/00017.jpg 10
543 | Chair/00018.jpg 10
544 | Chair/00019.jpg 10
545 | Chair/00020.jpg 10
546 | Chair/00021.jpg 10
547 | Chair/00022.jpg 10
548 | Chair/00023.jpg 10
549 | Chair/00024.jpg 10
550 | Chair/00025.jpg 10
551 | Chair/00026.jpg 10
552 | Chair/00027.jpg 10
553 | Chair/00028.jpg 10
554 | Chair/00029.jpg 10
555 | Chair/00030.jpg 10
556 | Chair/00031.jpg 10
557 | Chair/00032.jpg 10
558 | Chair/00033.jpg 10
559 | Chair/00034.jpg 10
560 | Chair/00035.jpg 10
561 | Chair/00036.jpg 10
562 | Chair/00037.jpg 10
563 | Chair/00038.jpg 10
564 | Chair/00039.jpg 10
565 | Chair/00040.jpg 10
566 | Chair/00041.jpg 10
567 | Chair/00042.jpg 10
568 | Chair/00043.jpg 10
569 | Chair/00044.jpg 10
570 | Chair/00045.jpg 10
571 | Chair/00046.jpg 10
572 | Chair/00047.jpg 10
573 | Chair/00048.jpg 10
574 | Chair/00049.jpg 10
575 | Chair/00050.jpg 10
576 | Chair/00051.jpg 10
577 | Chair/00052.jpg 10
578 | Chair/00053.jpg 10
579 | Chair/00054.jpg 10
580 | Chair/00055.jpg 10
581 | Chair/00056.jpg 10
582 | Chair/00057.jpg 10
583 | Chair/00058.jpg 10
584 | Chair/00059.jpg 10
585 | Chair/00060.jpg 10
586 | Chair/00061.jpg 10
587 | Chair/00062.jpg 10
588 | Chair/00063.jpg 10
589 | Chair/00064.jpg 10
590 | Chair/00065.jpg 10
591 | Chair/00066.jpg 10
592 | Chair/00067.jpg 10
593 | Chair/00068.jpg 10
594 | Chair/00069.jpg 10
595 | Clipboards/00001.jpg 11
596 | Clipboards/00002.jpg 11
597 | Clipboards/00003.jpg 11
598 | Clipboards/00004.jpg 11
599 | Clipboards/00005.jpg 11
600 | Clipboards/00006.jpg 11
601 | Clipboards/00007.jpg 11
602 | Clipboards/00008.jpg 11
603 | Clipboards/00009.jpg 11
604 | Clipboards/00010.jpg 11
605 | Clipboards/00011.jpg 11
606 | Clipboards/00012.jpg 11
607 | Clipboards/00013.jpg 11
608 | Clipboards/00014.jpg 11
609 | Clipboards/00015.jpg 11
610 | Clipboards/00016.jpg 11
611 | Clipboards/00017.jpg 11
612 | Clipboards/00018.jpg 11
613 | Clipboards/00019.jpg 11
614 | Clipboards/00020.jpg 11
615 | Clipboards/00021.jpg 11
616 | Clipboards/00022.jpg 11
617 | Clipboards/00023.jpg 11
618 | Clipboards/00024.jpg 11
619 | Clipboards/00025.jpg 11
620 | Computer/00001.jpg 12
621 | Computer/00002.jpg 12
622 | Computer/00003.jpg 12
623 | Computer/00004.jpg 12
624 | Computer/00005.jpg 12
625 | Computer/00006.jpg 12
626 | Computer/00007.jpg 12
627 | Computer/00008.jpg 12
628 | Computer/00009.jpg 12
629 | Computer/00010.jpg 12
630 | Computer/00011.jpg 12
631 | Computer/00012.jpg 12
632 | Computer/00013.jpg 12
633 | Computer/00014.jpg 12
634 | Computer/00015.jpg 12
635 | Computer/00016.jpg 12
636 | Computer/00017.jpg 12
637 | Computer/00018.jpg 12
638 | Computer/00019.jpg 12
639 | Computer/00020.jpg 12
640 | Computer/00021.jpg 12
641 | Computer/00022.jpg 12
642 | Computer/00023.jpg 12
643 | Computer/00024.jpg 12
644 | Computer/00025.jpg 12
645 | Computer/00026.jpg 12
646 | Computer/00027.jpg 12
647 | Computer/00028.jpg 12
648 | Computer/00029.jpg 12
649 | Computer/00030.jpg 12
650 | Computer/00031.jpg 12
651 | Computer/00032.jpg 12
652 | Computer/00033.jpg 12
653 | Computer/00034.jpg 12
654 | Computer/00035.jpg 12
655 | Computer/00036.jpg 12
656 | Computer/00037.jpg 12
657 | Computer/00038.jpg 12
658 | Computer/00039.jpg 12
659 | Computer/00040.jpg 12
660 | Computer/00041.jpg 12
661 | Computer/00042.jpg 12
662 | Computer/00043.jpg 12
663 | Computer/00044.jpg 12
664 | Couch/00001.jpg 13
665 | Couch/00002.jpg 13
666 | Couch/00003.jpg 13
667 | Couch/00004.jpg 13
668 | Couch/00005.jpg 13
669 | Couch/00006.jpg 13
670 | Couch/00007.jpg 13
671 | Couch/00008.jpg 13
672 | Couch/00009.jpg 13
673 | Couch/00010.jpg 13
674 | Couch/00011.jpg 13
675 | Couch/00012.jpg 13
676 | Couch/00013.jpg 13
677 | Couch/00014.jpg 13
678 | Couch/00015.jpg 13
679 | Couch/00016.jpg 13
680 | Couch/00017.jpg 13
681 | Couch/00018.jpg 13
682 | Couch/00019.jpg 13
683 | Couch/00020.jpg 13
684 | Couch/00021.jpg 13
685 | Couch/00022.jpg 13
686 | Couch/00023.jpg 13
687 | Couch/00024.jpg 13
688 | Couch/00025.jpg 13
689 | Couch/00026.jpg 13
690 | Couch/00027.jpg 13
691 | Couch/00028.jpg 13
692 | Couch/00029.jpg 13
693 | Couch/00030.jpg 13
694 | Couch/00031.jpg 13
695 | Couch/00032.jpg 13
696 | Couch/00033.jpg 13
697 | Couch/00034.jpg 13
698 | Couch/00035.jpg 13
699 | Couch/00036.jpg 13
700 | Couch/00037.jpg 13
701 | Couch/00038.jpg 13
702 | Couch/00039.jpg 13
703 | Couch/00040.jpg 13
704 | Curtains/00001.jpg 14
705 | Curtains/00002.jpg 14
706 | Curtains/00003.jpg 14
707 | Curtains/00004.jpg 14
708 | Curtains/00005.jpg 14
709 | Curtains/00006.jpg 14
710 | Curtains/00007.jpg 14
711 | Curtains/00008.jpg 14
712 | Curtains/00009.jpg 14
713 | Curtains/00010.jpg 14
714 | Curtains/00011.jpg 14
715 | Curtains/00012.jpg 14
716 | Curtains/00013.jpg 14
717 | Curtains/00014.jpg 14
718 | Curtains/00015.jpg 14
719 | Curtains/00016.jpg 14
720 | Curtains/00017.jpg 14
721 | Curtains/00018.jpg 14
722 | Curtains/00019.jpg 14
723 | Curtains/00020.jpg 14
724 | Curtains/00021.jpg 14
725 | Curtains/00022.jpg 14
726 | Curtains/00023.jpg 14
727 | Curtains/00024.jpg 14
728 | Curtains/00025.jpg 14
729 | Curtains/00026.jpg 14
730 | Curtains/00027.jpg 14
731 | Curtains/00028.jpg 14
732 | Curtains/00029.jpg 14
733 | Curtains/00030.jpg 14
734 | Curtains/00031.jpg 14
735 | Curtains/00032.jpg 14
736 | Curtains/00033.jpg 14
737 | Curtains/00034.jpg 14
738 | Curtains/00035.jpg 14
739 | Curtains/00036.jpg 14
740 | Curtains/00037.jpg 14
741 | Curtains/00038.jpg 14
742 | Curtains/00039.jpg 14
743 | Curtains/00040.jpg 14
744 | Desk_Lamp/00001.jpg 15
745 | Desk_Lamp/00002.jpg 15
746 | Desk_Lamp/00003.jpg 15
747 | Desk_Lamp/00004.jpg 15
748 | Desk_Lamp/00005.jpg 15
749 | Desk_Lamp/00006.jpg 15
750 | Desk_Lamp/00007.jpg 15
751 | Desk_Lamp/00008.jpg 15
752 | Desk_Lamp/00009.jpg 15
753 | Desk_Lamp/00010.jpg 15
754 | Desk_Lamp/00011.jpg 15
755 | Desk_Lamp/00012.jpg 15
756 | Desk_Lamp/00013.jpg 15
757 | Desk_Lamp/00014.jpg 15
758 | Desk_Lamp/00015.jpg 15
759 | Desk_Lamp/00016.jpg 15
760 | Desk_Lamp/00017.jpg 15
761 | Desk_Lamp/00018.jpg 15
762 | Desk_Lamp/00019.jpg 15
763 | Desk_Lamp/00020.jpg 15
764 | Desk_Lamp/00021.jpg 15
765 | Desk_Lamp/00022.jpg 15
766 | Desk_Lamp/00023.jpg 15
767 | Drill/00001.jpg 16
768 | Drill/00002.jpg 16
769 | Drill/00003.jpg 16
770 | Drill/00004.jpg 16
771 | Drill/00005.jpg 16
772 | Drill/00006.jpg 16
773 | Drill/00007.jpg 16
774 | Drill/00008.jpg 16
775 | Drill/00009.jpg 16
776 | Drill/00010.jpg 16
777 | Drill/00011.jpg 16
778 | Drill/00012.jpg 16
779 | Drill/00013.jpg 16
780 | Drill/00014.jpg 16
781 | Drill/00015.jpg 16
782 | Eraser/00001.jpg 17
783 | Eraser/00002.jpg 17
784 | Eraser/00003.jpg 17
785 | Eraser/00004.jpg 17
786 | Eraser/00005.jpg 17
787 | Eraser/00006.jpg 17
788 | Eraser/00007.jpg 17
789 | Eraser/00008.jpg 17
790 | Eraser/00009.jpg 17
791 | Eraser/00010.jpg 17
792 | Eraser/00011.jpg 17
793 | Eraser/00012.jpg 17
794 | Eraser/00013.jpg 17
795 | Eraser/00014.jpg 17
796 | Eraser/00015.jpg 17
797 | Eraser/00016.jpg 17
798 | Eraser/00017.jpg 17
799 | Eraser/00018.jpg 17
800 | Exit_Sign/00001.jpg 18
801 | Exit_Sign/00002.jpg 18
802 | Exit_Sign/00003.jpg 18
803 | Exit_Sign/00004.jpg 18
804 | Exit_Sign/00005.jpg 18
805 | Exit_Sign/00006.jpg 18
806 | Exit_Sign/00007.jpg 18
807 | Exit_Sign/00008.jpg 18
808 | Exit_Sign/00009.jpg 18
809 | Exit_Sign/00010.jpg 18
810 | Exit_Sign/00011.jpg 18
811 | Exit_Sign/00012.jpg 18
812 | Exit_Sign/00013.jpg 18
813 | Exit_Sign/00014.jpg 18
814 | Exit_Sign/00015.jpg 18
815 | Exit_Sign/00016.jpg 18
816 | Exit_Sign/00017.jpg 18
817 | Exit_Sign/00018.jpg 18
818 | Exit_Sign/00019.jpg 18
819 | Exit_Sign/00020.jpg 18
820 | Exit_Sign/00021.jpg 18
821 | Fan/00001.jpg 19
822 | Fan/00002.jpg 19
823 | Fan/00003.jpg 19
824 | Fan/00004.jpg 19
825 | Fan/00005.jpg 19
826 | Fan/00006.jpg 19
827 | Fan/00007.jpg 19
828 | Fan/00008.jpg 19
829 | Fan/00009.jpg 19
830 | Fan/00010.jpg 19
831 | Fan/00011.jpg 19
832 | Fan/00012.jpg 19
833 | Fan/00013.jpg 19
834 | Fan/00014.jpg 19
835 | Fan/00015.jpg 19
836 | Fan/00016.jpg 19
837 | Fan/00017.jpg 19
838 | Fan/00018.jpg 19
839 | Fan/00019.jpg 19
840 | Fan/00020.jpg 19
841 | Fan/00021.jpg 19
842 | Fan/00022.jpg 19
843 | Fan/00023.jpg 19
844 | Fan/00024.jpg 19
845 | Fan/00025.jpg 19
846 | Fan/00026.jpg 19
847 | Fan/00027.jpg 19
848 | Fan/00028.jpg 19
849 | Fan/00029.jpg 19
850 | Fan/00030.jpg 19
851 | Fan/00031.jpg 19
852 | Fan/00032.jpg 19
853 | Fan/00033.jpg 19
854 | Fan/00034.jpg 19
855 | Fan/00035.jpg 19
856 | Fan/00036.jpg 19
857 | Fan/00037.jpg 19
858 | Fan/00038.jpg 19
859 | Fan/00039.jpg 19
860 | Fan/00040.jpg 19
861 | Fan/00041.jpg 19
862 | Fan/00042.jpg 19
863 | Fan/00043.jpg 19
864 | Fan/00044.jpg 19
865 | Fan/00045.jpg 19
866 | File_Cabinet/00001.jpg 20
867 | File_Cabinet/00002.jpg 20
868 | File_Cabinet/00003.jpg 20
869 | File_Cabinet/00004.jpg 20
870 | File_Cabinet/00005.jpg 20
871 | File_Cabinet/00006.jpg 20
872 | File_Cabinet/00007.jpg 20
873 | File_Cabinet/00008.jpg 20
874 | File_Cabinet/00009.jpg 20
875 | File_Cabinet/00010.jpg 20
876 | File_Cabinet/00011.jpg 20
877 | File_Cabinet/00012.jpg 20
878 | File_Cabinet/00013.jpg 20
879 | File_Cabinet/00014.jpg 20
880 | File_Cabinet/00015.jpg 20
881 | File_Cabinet/00016.jpg 20
882 | File_Cabinet/00017.jpg 20
883 | File_Cabinet/00018.jpg 20
884 | File_Cabinet/00019.jpg 20
885 | File_Cabinet/00020.jpg 20
886 | File_Cabinet/00021.jpg 20
887 | File_Cabinet/00022.jpg 20
888 | Flipflops/00001.jpg 21
889 | Flipflops/00002.jpg 21
890 | Flipflops/00003.jpg 21
891 | Flipflops/00004.jpg 21
892 | Flipflops/00005.jpg 21
893 | Flipflops/00006.jpg 21
894 | Flipflops/00007.jpg 21
895 | Flipflops/00008.jpg 21
896 | Flipflops/00009.jpg 21
897 | Flipflops/00010.jpg 21
898 | Flipflops/00011.jpg 21
899 | Flipflops/00012.jpg 21
900 | Flipflops/00013.jpg 21
901 | Flipflops/00014.jpg 21
902 | Flipflops/00015.jpg 21
903 | Flipflops/00016.jpg 21
904 | Flipflops/00017.jpg 21
905 | Flipflops/00018.jpg 21
906 | Flipflops/00019.jpg 21
907 | Flipflops/00020.jpg 21
908 | Flipflops/00021.jpg 21
909 | Flipflops/00022.jpg 21
910 | Flipflops/00023.jpg 21
911 | Flipflops/00024.jpg 21
912 | Flipflops/00025.jpg 21
913 | Flipflops/00026.jpg 21
914 | Flipflops/00027.jpg 21
915 | Flipflops/00028.jpg 21
916 | Flipflops/00029.jpg 21
917 | Flipflops/00030.jpg 21
918 | Flipflops/00031.jpg 21
919 | Flipflops/00032.jpg 21
920 | Flipflops/00033.jpg 21
921 | Flipflops/00034.jpg 21
922 | Flipflops/00035.jpg 21
923 | Flipflops/00036.jpg 21
924 | Flipflops/00037.jpg 21
925 | Flipflops/00038.jpg 21
926 | Flipflops/00039.jpg 21
927 | Flipflops/00040.jpg 21
928 | Flipflops/00041.jpg 21
929 | Flipflops/00042.jpg 21
930 | Flipflops/00043.jpg 21
931 | Flipflops/00044.jpg 21
932 | Flipflops/00045.jpg 21
933 | Flipflops/00046.jpg 21
934 | Flowers/00001.jpg 22
935 | Flowers/00002.jpg 22
936 | Flowers/00003.jpg 22
937 | Flowers/00004.jpg 22
938 | Flowers/00005.jpg 22
939 | Flowers/00006.jpg 22
940 | Flowers/00007.jpg 22
941 | Flowers/00008.jpg 22
942 | Flowers/00009.jpg 22
943 | Flowers/00010.jpg 22
944 | Flowers/00011.jpg 22
945 | Flowers/00012.jpg 22
946 | Flowers/00013.jpg 22
947 | Flowers/00014.jpg 22
948 | Flowers/00015.jpg 22
949 | Flowers/00016.jpg 22
950 | Flowers/00017.jpg 22
951 | Flowers/00018.jpg 22
952 | Flowers/00019.jpg 22
953 | Flowers/00020.jpg 22
954 | Flowers/00021.jpg 22
955 | Flowers/00022.jpg 22
956 | Flowers/00023.jpg 22
957 | Flowers/00024.jpg 22
958 | Flowers/00025.jpg 22
959 | Flowers/00026.jpg 22
960 | Flowers/00027.jpg 22
961 | Flowers/00028.jpg 22
962 | Flowers/00029.jpg 22
963 | Flowers/00030.jpg 22
964 | Flowers/00031.jpg 22
965 | Flowers/00032.jpg 22
966 | Flowers/00033.jpg 22
967 | Flowers/00034.jpg 22
968 | Flowers/00035.jpg 22
969 | Flowers/00036.jpg 22
970 | Flowers/00037.jpg 22
971 | Flowers/00038.jpg 22
972 | Flowers/00039.jpg 22
973 | Flowers/00040.jpg 22
974 | Flowers/00041.jpg 22
975 | Flowers/00042.jpg 22
976 | Flowers/00043.jpg 22
977 | Flowers/00044.jpg 22
978 | Flowers/00045.jpg 22
979 | Flowers/00046.jpg 22
980 | Flowers/00047.jpg 22
981 | Flowers/00048.jpg 22
982 | Flowers/00049.jpg 22
983 | Flowers/00050.jpg 22
984 | Flowers/00051.jpg 22
985 | Flowers/00052.jpg 22
986 | Flowers/00053.jpg 22
987 | Flowers/00054.jpg 22
988 | Flowers/00055.jpg 22
989 | Flowers/00056.jpg 22
990 | Flowers/00057.jpg 22
991 | Flowers/00058.jpg 22
992 | Flowers/00059.jpg 22
993 | Flowers/00060.jpg 22
994 | Flowers/00061.jpg 22
995 | Flowers/00062.jpg 22
996 | Flowers/00063.jpg 22
997 | Flowers/00064.jpg 22
998 | Flowers/00065.jpg 22
999 | Flowers/00066.jpg 22
1000 | Flowers/00067.jpg 22
1001 | Flowers/00068.jpg 22
1002 | Flowers/00069.jpg 22
1003 | Flowers/00070.jpg 22
1004 | Flowers/00071.jpg 22
1005 | Flowers/00072.jpg 22
1006 | Flowers/00073.jpg 22
1007 | Flowers/00074.jpg 22
1008 | Flowers/00075.jpg 22
1009 | Flowers/00076.jpg 22
1010 | Flowers/00077.jpg 22
1011 | Flowers/00078.jpg 22
1012 | Flowers/00079.jpg 22
1013 | Flowers/00080.jpg 22
1014 | Flowers/00081.jpg 22
1015 | Flowers/00082.jpg 22
1016 | Flowers/00083.jpg 22
1017 | Flowers/00084.jpg 22
1018 | Flowers/00085.jpg 22
1019 | Flowers/00086.jpg 22
1020 | Flowers/00087.jpg 22
1021 | Flowers/00088.jpg 22
1022 | Flowers/00089.jpg 22
1023 | Flowers/00090.jpg 22
1024 | Folder/00001.jpg 23
1025 | Folder/00002.jpg 23
1026 | Folder/00003.jpg 23
1027 | Folder/00004.jpg 23
1028 | Folder/00005.jpg 23
1029 | Folder/00006.jpg 23
1030 | Folder/00007.jpg 23
1031 | Folder/00008.jpg 23
1032 | Folder/00009.jpg 23
1033 | Folder/00010.jpg 23
1034 | Folder/00011.jpg 23
1035 | Folder/00012.jpg 23
1036 | Folder/00013.jpg 23
1037 | Folder/00014.jpg 23
1038 | Folder/00015.jpg 23
1039 | Folder/00016.jpg 23
1040 | Folder/00017.jpg 23
1041 | Folder/00018.jpg 23
1042 | Folder/00019.jpg 23
1043 | Folder/00020.jpg 23
1044 | Fork/00001.jpg 24
1045 | Fork/00002.jpg 24
1046 | Fork/00003.jpg 24
1047 | Fork/00004.jpg 24
1048 | Fork/00005.jpg 24
1049 | Fork/00006.jpg 24
1050 | Fork/00007.jpg 24
1051 | Fork/00008.jpg 24
1052 | Fork/00009.jpg 24
1053 | Fork/00010.jpg 24
1054 | Fork/00011.jpg 24
1055 | Fork/00012.jpg 24
1056 | Fork/00013.jpg 24
1057 | Fork/00014.jpg 24
1058 | Fork/00015.jpg 24
1059 | Fork/00016.jpg 24
1060 | Fork/00017.jpg 24
1061 | Fork/00018.jpg 24
1062 | Fork/00019.jpg 24
1063 | Fork/00020.jpg 24
1064 | Fork/00021.jpg 24
1065 | Fork/00022.jpg 24
1066 | Fork/00023.jpg 24
1067 | Fork/00024.jpg 24
1068 | Fork/00025.jpg 24
1069 | Fork/00026.jpg 24
1070 | Fork/00027.jpg 24
1071 | Fork/00028.jpg 24
1072 | Fork/00029.jpg 24
1073 | Fork/00030.jpg 24
1074 | Fork/00031.jpg 24
1075 | Fork/00032.jpg 24
1076 | Fork/00033.jpg 24
1077 | Fork/00034.jpg 24
1078 | Fork/00035.jpg 24
1079 | Fork/00036.jpg 24
1080 | Fork/00037.jpg 24
1081 | Fork/00038.jpg 24
1082 | Fork/00039.jpg 24
1083 | Fork/00040.jpg 24
1084 | Fork/00041.jpg 24
1085 | Fork/00042.jpg 24
1086 | Fork/00043.jpg 24
1087 | Fork/00044.jpg 24
1088 | Fork/00045.jpg 24
1089 | Fork/00046.jpg 24
1090 | Glasses/00001.jpg 25
1091 | Glasses/00002.jpg 25
1092 | Glasses/00003.jpg 25
1093 | Glasses/00004.jpg 25
1094 | Glasses/00005.jpg 25
1095 | Glasses/00006.jpg 25
1096 | Glasses/00007.jpg 25
1097 | Glasses/00008.jpg 25
1098 | Glasses/00009.jpg 25
1099 | Glasses/00010.jpg 25
1100 | Glasses/00011.jpg 25
1101 | Glasses/00012.jpg 25
1102 | Glasses/00013.jpg 25
1103 | Glasses/00014.jpg 25
1104 | Glasses/00015.jpg 25
1105 | Glasses/00016.jpg 25
1106 | Glasses/00017.jpg 25
1107 | Glasses/00018.jpg 25
1108 | Glasses/00019.jpg 25
1109 | Glasses/00020.jpg 25
1110 | Glasses/00021.jpg 25
1111 | Glasses/00022.jpg 25
1112 | Glasses/00023.jpg 25
1113 | Glasses/00024.jpg 25
1114 | Glasses/00025.jpg 25
1115 | Glasses/00026.jpg 25
1116 | Glasses/00027.jpg 25
1117 | Glasses/00028.jpg 25
1118 | Glasses/00029.jpg 25
1119 | Glasses/00030.jpg 25
1120 | Glasses/00031.jpg 25
1121 | Glasses/00032.jpg 25
1122 | Glasses/00033.jpg 25
1123 | Glasses/00034.jpg 25
1124 | Glasses/00035.jpg 25
1125 | Glasses/00036.jpg 25
1126 | Glasses/00037.jpg 25
1127 | Glasses/00038.jpg 25
1128 | Glasses/00039.jpg 25
1129 | Glasses/00040.jpg 25
1130 | Hammer/00001.jpg 26
1131 | Hammer/00002.jpg 26
1132 | Hammer/00003.jpg 26
1133 | Hammer/00004.jpg 26
1134 | Hammer/00005.jpg 26
1135 | Hammer/00006.jpg 26
1136 | Hammer/00007.jpg 26
1137 | Hammer/00008.jpg 26
1138 | Hammer/00009.jpg 26
1139 | Hammer/00010.jpg 26
1140 | Hammer/00011.jpg 26
1141 | Hammer/00012.jpg 26
1142 | Hammer/00013.jpg 26
1143 | Hammer/00014.jpg 26
1144 | Hammer/00015.jpg 26
1145 | Hammer/00016.jpg 26
1146 | Hammer/00017.jpg 26
1147 | Hammer/00018.jpg 26
1148 | Hammer/00019.jpg 26
1149 | Hammer/00020.jpg 26
1150 | Hammer/00021.jpg 26
1151 | Hammer/00022.jpg 26
1152 | Hammer/00023.jpg 26
1153 | Hammer/00024.jpg 26
1154 | Hammer/00025.jpg 26
1155 | Hammer/00026.jpg 26
1156 | Hammer/00027.jpg 26
1157 | Hammer/00028.jpg 26
1158 | Hammer/00029.jpg 26
1159 | Hammer/00030.jpg 26
1160 | Hammer/00031.jpg 26
1161 | Hammer/00032.jpg 26
1162 | Hammer/00033.jpg 26
1163 | Hammer/00034.jpg 26
1164 | Hammer/00035.jpg 26
1165 | Hammer/00036.jpg 26
1166 | Hammer/00037.jpg 26
1167 | Hammer/00038.jpg 26
1168 | Hammer/00039.jpg 26
1169 | Hammer/00040.jpg 26
1170 | Helmet/00001.jpg 27
1171 | Helmet/00002.jpg 27
1172 | Helmet/00003.jpg 27
1173 | Helmet/00004.jpg 27
1174 | Helmet/00005.jpg 27
1175 | Helmet/00006.jpg 27
1176 | Helmet/00007.jpg 27
1177 | Helmet/00008.jpg 27
1178 | Helmet/00009.jpg 27
1179 | Helmet/00010.jpg 27
1180 | Helmet/00011.jpg 27
1181 | Helmet/00012.jpg 27
1182 | Helmet/00013.jpg 27
1183 | Helmet/00014.jpg 27
1184 | Helmet/00015.jpg 27
1185 | Helmet/00016.jpg 27
1186 | Helmet/00017.jpg 27
1187 | Helmet/00018.jpg 27
1188 | Helmet/00019.jpg 27
1189 | Helmet/00020.jpg 27
1190 | Helmet/00021.jpg 27
1191 | Helmet/00022.jpg 27
1192 | Helmet/00023.jpg 27
1193 | Helmet/00024.jpg 27
1194 | Helmet/00025.jpg 27
1195 | Helmet/00026.jpg 27
1196 | Helmet/00027.jpg 27
1197 | Helmet/00028.jpg 27
1198 | Helmet/00029.jpg 27
1199 | Helmet/00030.jpg 27
1200 | Helmet/00031.jpg 27
1201 | Helmet/00032.jpg 27
1202 | Helmet/00033.jpg 27
1203 | Helmet/00034.jpg 27
1204 | Helmet/00035.jpg 27
1205 | Helmet/00036.jpg 27
1206 | Helmet/00037.jpg 27
1207 | Helmet/00038.jpg 27
1208 | Helmet/00039.jpg 27
1209 | Helmet/00040.jpg 27
1210 | Helmet/00041.jpg 27
1211 | Helmet/00042.jpg 27
1212 | Helmet/00043.jpg 27
1213 | Helmet/00044.jpg 27
1214 | Helmet/00045.jpg 27
1215 | Helmet/00046.jpg 27
1216 | Helmet/00047.jpg 27
1217 | Helmet/00048.jpg 27
1218 | Helmet/00049.jpg 27
1219 | Helmet/00050.jpg 27
1220 | Helmet/00051.jpg 27
1221 | Helmet/00052.jpg 27
1222 | Helmet/00053.jpg 27
1223 | Helmet/00054.jpg 27
1224 | Helmet/00055.jpg 27
1225 | Helmet/00056.jpg 27
1226 | Helmet/00057.jpg 27
1227 | Helmet/00058.jpg 27
1228 | Helmet/00059.jpg 27
1229 | Helmet/00060.jpg 27
1230 | Helmet/00061.jpg 27
1231 | Helmet/00062.jpg 27
1232 | Helmet/00063.jpg 27
1233 | Helmet/00064.jpg 27
1234 | Helmet/00065.jpg 27
1235 | Helmet/00066.jpg 27
1236 | Helmet/00067.jpg 27
1237 | Helmet/00068.jpg 27
1238 | Helmet/00069.jpg 27
1239 | Helmet/00070.jpg 27
1240 | Helmet/00071.jpg 27
1241 | Helmet/00072.jpg 27
1242 | Helmet/00073.jpg 27
1243 | Helmet/00074.jpg 27
1244 | Helmet/00075.jpg 27
1245 | Helmet/00076.jpg 27
1246 | Helmet/00077.jpg 27
1247 | Helmet/00078.jpg 27
1248 | Helmet/00079.jpg 27
1249 | Kettle/00001.jpg 28
1250 | Kettle/00002.jpg 28
1251 | Kettle/00003.jpg 28
1252 | Kettle/00004.jpg 28
1253 | Kettle/00005.jpg 28
1254 | Kettle/00006.jpg 28
1255 | Kettle/00007.jpg 28
1256 | Kettle/00008.jpg 28
1257 | Kettle/00009.jpg 28
1258 | Kettle/00010.jpg 28
1259 | Kettle/00011.jpg 28
1260 | Kettle/00012.jpg 28
1261 | Kettle/00013.jpg 28
1262 | Kettle/00014.jpg 28
1263 | Kettle/00015.jpg 28
1264 | Kettle/00016.jpg 28
1265 | Kettle/00017.jpg 28
1266 | Kettle/00018.jpg 28
1267 | Kettle/00019.jpg 28
1268 | Kettle/00020.jpg 28
1269 | Kettle/00021.jpg 28
1270 | Kettle/00022.jpg 28
1271 | Kettle/00023.jpg 28
1272 | Kettle/00024.jpg 28
1273 | Kettle/00025.jpg 28
1274 | Kettle/00026.jpg 28
1275 | Kettle/00027.jpg 28
1276 | Kettle/00028.jpg 28
1277 | Kettle/00029.jpg 28
1278 | Kettle/00030.jpg 28
1279 | Kettle/00031.jpg 28
1280 | Kettle/00032.jpg 28
1281 | Kettle/00033.jpg 28
1282 | Kettle/00034.jpg 28
1283 | Kettle/00035.jpg 28
1284 | Kettle/00036.jpg 28
1285 | Kettle/00037.jpg 28
1286 | Kettle/00038.jpg 28
1287 | Kettle/00039.jpg 28
1288 | Kettle/00040.jpg 28
1289 | Kettle/00041.jpg 28
1290 | Kettle/00042.jpg 28
1291 | Kettle/00043.jpg 28
1292 | Kettle/00044.jpg 28
1293 | Kettle/00045.jpg 28
1294 | Kettle/00046.jpg 28
1295 | Keyboard/00001.jpg 29
1296 | Keyboard/00002.jpg 29
1297 | Keyboard/00003.jpg 29
1298 | Keyboard/00004.jpg 29
1299 | Keyboard/00005.jpg 29
1300 | Keyboard/00006.jpg 29
1301 | Keyboard/00007.jpg 29
1302 | Keyboard/00008.jpg 29
1303 | Keyboard/00009.jpg 29
1304 | Keyboard/00010.jpg 29
1305 | Keyboard/00011.jpg 29
1306 | Keyboard/00012.jpg 29
1307 | Keyboard/00013.jpg 29
1308 | Keyboard/00014.jpg 29
1309 | Keyboard/00015.jpg 29
1310 | Keyboard/00016.jpg 29
1311 | Keyboard/00017.jpg 29
1312 | Keyboard/00018.jpg 29
1313 | Knives/00001.jpg 30
1314 | Knives/00002.jpg 30
1315 | Knives/00003.jpg 30
1316 | Knives/00004.jpg 30
1317 | Knives/00005.jpg 30
1318 | Knives/00006.jpg 30
1319 | Knives/00007.jpg 30
1320 | Knives/00008.jpg 30
1321 | Knives/00009.jpg 30
1322 | Knives/00010.jpg 30
1323 | Knives/00011.jpg 30
1324 | Knives/00012.jpg 30
1325 | Knives/00013.jpg 30
1326 | Knives/00014.jpg 30
1327 | Knives/00015.jpg 30
1328 | Knives/00016.jpg 30
1329 | Knives/00017.jpg 30
1330 | Knives/00018.jpg 30
1331 | Knives/00019.jpg 30
1332 | Knives/00020.jpg 30
1333 | Knives/00021.jpg 30
1334 | Knives/00022.jpg 30
1335 | Knives/00023.jpg 30
1336 | Knives/00024.jpg 30
1337 | Knives/00025.jpg 30
1338 | Knives/00026.jpg 30
1339 | Knives/00027.jpg 30
1340 | Knives/00028.jpg 30
1341 | Knives/00029.jpg 30
1342 | Knives/00030.jpg 30
1343 | Knives/00031.jpg 30
1344 | Knives/00032.jpg 30
1345 | Knives/00033.jpg 30
1346 | Knives/00034.jpg 30
1347 | Knives/00035.jpg 30
1348 | Knives/00036.jpg 30
1349 | Knives/00037.jpg 30
1350 | Knives/00038.jpg 30
1351 | Knives/00039.jpg 30
1352 | Knives/00040.jpg 30
1353 | Knives/00041.jpg 30
1354 | Knives/00042.jpg 30
1355 | Knives/00043.jpg 30
1356 | Knives/00044.jpg 30
1357 | Knives/00045.jpg 30
1358 | Knives/00046.jpg 30
1359 | Knives/00047.jpg 30
1360 | Knives/00048.jpg 30
1361 | Knives/00049.jpg 30
1362 | Knives/00050.jpg 30
1363 | Knives/00051.jpg 30
1364 | Knives/00052.jpg 30
1365 | Knives/00053.jpg 30
1366 | Knives/00054.jpg 30
1367 | Knives/00055.jpg 30
1368 | Knives/00056.jpg 30
1369 | Knives/00057.jpg 30
1370 | Knives/00058.jpg 30
1371 | Knives/00059.jpg 30
1372 | Knives/00060.jpg 30
1373 | Knives/00061.jpg 30
1374 | Knives/00062.jpg 30
1375 | Knives/00063.jpg 30
1376 | Knives/00064.jpg 30
1377 | Knives/00065.jpg 30
1378 | Knives/00066.jpg 30
1379 | Knives/00067.jpg 30
1380 | Knives/00068.jpg 30
1381 | Knives/00069.jpg 30
1382 | Knives/00070.jpg 30
1383 | Knives/00071.jpg 30
1384 | Knives/00072.jpg 30
1385 | Lamp_Shade/00001.jpg 31
1386 | Lamp_Shade/00002.jpg 31
1387 | Lamp_Shade/00003.jpg 31
1388 | Lamp_Shade/00004.jpg 31
1389 | Lamp_Shade/00005.jpg 31
1390 | Lamp_Shade/00006.jpg 31
1391 | Lamp_Shade/00007.jpg 31
1392 | Lamp_Shade/00008.jpg 31
1393 | Lamp_Shade/00009.jpg 31
1394 | Lamp_Shade/00010.jpg 31
1395 | Lamp_Shade/00011.jpg 31
1396 | Lamp_Shade/00012.jpg 31
1397 | Lamp_Shade/00013.jpg 31
1398 | Lamp_Shade/00014.jpg 31
1399 | Lamp_Shade/00015.jpg 31
1400 | Lamp_Shade/00016.jpg 31
1401 | Lamp_Shade/00017.jpg 31
1402 | Lamp_Shade/00018.jpg 31
1403 | Lamp_Shade/00019.jpg 31
1404 | Lamp_Shade/00020.jpg 31
1405 | Lamp_Shade/00021.jpg 31
1406 | Lamp_Shade/00022.jpg 31
1407 | Lamp_Shade/00023.jpg 31
1408 | Lamp_Shade/00024.jpg 31
1409 | Lamp_Shade/00025.jpg 31
1410 | Lamp_Shade/00026.jpg 31
1411 | Lamp_Shade/00027.jpg 31
1412 | Lamp_Shade/00028.jpg 31
1413 | Lamp_Shade/00029.jpg 31
1414 | Lamp_Shade/00030.jpg 31
1415 | Lamp_Shade/00031.jpg 31
1416 | Lamp_Shade/00032.jpg 31
1417 | Lamp_Shade/00033.jpg 31
1418 | Lamp_Shade/00034.jpg 31
1419 | Lamp_Shade/00035.jpg 31
1420 | Lamp_Shade/00036.jpg 31
1421 | Lamp_Shade/00037.jpg 31
1422 | Lamp_Shade/00038.jpg 31
1423 | Lamp_Shade/00039.jpg 31
1424 | Lamp_Shade/00040.jpg 31
1425 | Lamp_Shade/00041.jpg 31
1426 | Lamp_Shade/00042.jpg 31
1427 | Lamp_Shade/00043.jpg 31
1428 | Lamp_Shade/00044.jpg 31
1429 | Lamp_Shade/00045.jpg 31
1430 | Lamp_Shade/00046.jpg 31
1431 | Lamp_Shade/00047.jpg 31
1432 | Lamp_Shade/00048.jpg 31
1433 | Lamp_Shade/00049.jpg 31
1434 | Laptop/00001.jpg 32
1435 | Laptop/00002.jpg 32
1436 | Laptop/00003.jpg 32
1437 | Laptop/00004.jpg 32
1438 | Laptop/00005.jpg 32
1439 | Laptop/00006.jpg 32
1440 | Laptop/00007.jpg 32
1441 | Laptop/00008.jpg 32
1442 | Laptop/00009.jpg 32
1443 | Laptop/00010.jpg 32
1444 | Laptop/00011.jpg 32
1445 | Laptop/00012.jpg 32
1446 | Laptop/00013.jpg 32
1447 | Laptop/00014.jpg 32
1448 | Laptop/00015.jpg 32
1449 | Laptop/00016.jpg 32
1450 | Laptop/00017.jpg 32
1451 | Laptop/00018.jpg 32
1452 | Laptop/00019.jpg 32
1453 | Laptop/00020.jpg 32
1454 | Laptop/00021.jpg 32
1455 | Laptop/00022.jpg 32
1456 | Laptop/00023.jpg 32
1457 | Laptop/00024.jpg 32
1458 | Laptop/00025.jpg 32
1459 | Laptop/00026.jpg 32
1460 | Laptop/00027.jpg 32
1461 | Laptop/00028.jpg 32
1462 | Laptop/00029.jpg 32
1463 | Laptop/00030.jpg 32
1464 | Laptop/00031.jpg 32
1465 | Laptop/00032.jpg 32
1466 | Laptop/00033.jpg 32
1467 | Laptop/00034.jpg 32
1468 | Laptop/00035.jpg 32
1469 | Laptop/00036.jpg 32
1470 | Laptop/00037.jpg 32
1471 | Laptop/00038.jpg 32
1472 | Laptop/00039.jpg 32
1473 | Laptop/00040.jpg 32
1474 | Laptop/00041.jpg 32
1475 | Laptop/00042.jpg 32
1476 | Laptop/00043.jpg 32
1477 | Laptop/00044.jpg 32
1478 | Laptop/00045.jpg 32
1479 | Laptop/00046.jpg 32
1480 | Laptop/00047.jpg 32
1481 | Laptop/00048.jpg 32
1482 | Laptop/00049.jpg 32
1483 | Laptop/00050.jpg 32
1484 | Laptop/00051.jpg 32
1485 | Marker/00001.jpg 33
1486 | Marker/00002.jpg 33
1487 | Marker/00003.jpg 33
1488 | Marker/00004.jpg 33
1489 | Marker/00005.jpg 33
1490 | Marker/00006.jpg 33
1491 | Marker/00007.jpg 33
1492 | Marker/00008.jpg 33
1493 | Marker/00009.jpg 33
1494 | Marker/00010.jpg 33
1495 | Marker/00011.jpg 33
1496 | Marker/00012.jpg 33
1497 | Marker/00013.jpg 33
1498 | Marker/00014.jpg 33
1499 | Marker/00015.jpg 33
1500 | Marker/00016.jpg 33
1501 | Marker/00017.jpg 33
1502 | Marker/00018.jpg 33
1503 | Marker/00019.jpg 33
1504 | Marker/00020.jpg 33
1505 | Monitor/00001.jpg 34
1506 | Monitor/00002.jpg 34
1507 | Monitor/00003.jpg 34
1508 | Monitor/00004.jpg 34
1509 | Monitor/00005.jpg 34
1510 | Monitor/00006.jpg 34
1511 | Monitor/00007.jpg 34
1512 | Monitor/00008.jpg 34
1513 | Monitor/00009.jpg 34
1514 | Monitor/00010.jpg 34
1515 | Monitor/00011.jpg 34
1516 | Monitor/00012.jpg 34
1517 | Monitor/00013.jpg 34
1518 | Monitor/00014.jpg 34
1519 | Monitor/00015.jpg 34
1520 | Monitor/00016.jpg 34
1521 | Monitor/00017.jpg 34
1522 | Monitor/00018.jpg 34
1523 | Monitor/00019.jpg 34
1524 | Monitor/00020.jpg 34
1525 | Monitor/00021.jpg 34
1526 | Monitor/00022.jpg 34
1527 | Monitor/00023.jpg 34
1528 | Monitor/00024.jpg 34
1529 | Monitor/00025.jpg 34
1530 | Monitor/00026.jpg 34
1531 | Monitor/00027.jpg 34
1532 | Monitor/00028.jpg 34
1533 | Monitor/00029.jpg 34
1534 | Monitor/00030.jpg 34
1535 | Monitor/00031.jpg 34
1536 | Monitor/00032.jpg 34
1537 | Monitor/00033.jpg 34
1538 | Monitor/00034.jpg 34
1539 | Monitor/00035.jpg 34
1540 | Monitor/00036.jpg 34
1541 | Monitor/00037.jpg 34
1542 | Monitor/00038.jpg 34
1543 | Monitor/00039.jpg 34
1544 | Monitor/00040.jpg 34
1545 | Monitor/00041.jpg 34
1546 | Monitor/00042.jpg 34
1547 | Mop/00001.jpg 35
1548 | Mop/00002.jpg 35
1549 | Mop/00003.jpg 35
1550 | Mop/00004.jpg 35
1551 | Mop/00005.jpg 35
1552 | Mop/00006.jpg 35
1553 | Mop/00007.jpg 35
1554 | Mop/00008.jpg 35
1555 | Mop/00009.jpg 35
1556 | Mop/00010.jpg 35
1557 | Mop/00011.jpg 35
1558 | Mop/00012.jpg 35
1559 | Mop/00013.jpg 35
1560 | Mop/00014.jpg 35
1561 | Mop/00015.jpg 35
1562 | Mop/00016.jpg 35
1563 | Mop/00017.jpg 35
1564 | Mop/00018.jpg 35
1565 | Mop/00019.jpg 35
1566 | Mop/00020.jpg 35
1567 | Mop/00021.jpg 35
1568 | Mop/00022.jpg 35
1569 | Mop/00023.jpg 35
1570 | Mop/00024.jpg 35
1571 | Mop/00025.jpg 35
1572 | Mop/00026.jpg 35
1573 | Mop/00027.jpg 35
1574 | Mop/00028.jpg 35
1575 | Mop/00029.jpg 35
1576 | Mop/00030.jpg 35
1577 | Mop/00031.jpg 35
1578 | Mop/00032.jpg 35
1579 | Mouse/00001.jpg 36
1580 | Mouse/00002.jpg 36
1581 | Mouse/00003.jpg 36
1582 | Mouse/00004.jpg 36
1583 | Mouse/00005.jpg 36
1584 | Mouse/00006.jpg 36
1585 | Mouse/00007.jpg 36
1586 | Mouse/00008.jpg 36
1587 | Mouse/00009.jpg 36
1588 | Mouse/00010.jpg 36
1589 | Mouse/00011.jpg 36
1590 | Mouse/00012.jpg 36
1591 | Mouse/00013.jpg 36
1592 | Mouse/00014.jpg 36
1593 | Mouse/00015.jpg 36
1594 | Mouse/00016.jpg 36
1595 | Mouse/00017.jpg 36
1596 | Mouse/00018.jpg 36
1597 | Mug/00001.jpg 37
1598 | Mug/00002.jpg 37
1599 | Mug/00003.jpg 37
1600 | Mug/00004.jpg 37
1601 | Mug/00005.jpg 37
1602 | Mug/00006.jpg 37
1603 | Mug/00007.jpg 37
1604 | Mug/00008.jpg 37
1605 | Mug/00009.jpg 37
1606 | Mug/00010.jpg 37
1607 | Mug/00011.jpg 37
1608 | Mug/00012.jpg 37
1609 | Mug/00013.jpg 37
1610 | Mug/00014.jpg 37
1611 | Mug/00015.jpg 37
1612 | Mug/00016.jpg 37
1613 | Mug/00017.jpg 37
1614 | Mug/00018.jpg 37
1615 | Mug/00019.jpg 37
1616 | Mug/00020.jpg 37
1617 | Mug/00021.jpg 37
1618 | Mug/00022.jpg 37
1619 | Mug/00023.jpg 37
1620 | Mug/00024.jpg 37
1621 | Mug/00025.jpg 37
1622 | Mug/00026.jpg 37
1623 | Mug/00027.jpg 37
1624 | Mug/00028.jpg 37
1625 | Mug/00029.jpg 37
1626 | Mug/00030.jpg 37
1627 | Mug/00031.jpg 37
1628 | Mug/00032.jpg 37
1629 | Mug/00033.jpg 37
1630 | Mug/00034.jpg 37
1631 | Mug/00035.jpg 37
1632 | Mug/00036.jpg 37
1633 | Mug/00037.jpg 37
1634 | Mug/00038.jpg 37
1635 | Mug/00039.jpg 37
1636 | Mug/00040.jpg 37
1637 | Mug/00041.jpg 37
1638 | Mug/00042.jpg 37
1639 | Mug/00043.jpg 37
1640 | Mug/00044.jpg 37
1641 | Mug/00045.jpg 37
1642 | Mug/00046.jpg 37
1643 | Mug/00047.jpg 37
1644 | Mug/00048.jpg 37
1645 | Mug/00049.jpg 37
1646 | Notebook/00001.jpg 38
1647 | Notebook/00002.jpg 38
1648 | Notebook/00003.jpg 38
1649 | Notebook/00004.jpg 38
1650 | Notebook/00005.jpg 38
1651 | Notebook/00006.jpg 38
1652 | Notebook/00007.jpg 38
1653 | Notebook/00008.jpg 38
1654 | Notebook/00009.jpg 38
1655 | Notebook/00010.jpg 38
1656 | Notebook/00011.jpg 38
1657 | Notebook/00012.jpg 38
1658 | Notebook/00013.jpg 38
1659 | Notebook/00014.jpg 38
1660 | Notebook/00015.jpg 38
1661 | Notebook/00016.jpg 38
1662 | Notebook/00017.jpg 38
1663 | Notebook/00018.jpg 38
1664 | Notebook/00019.jpg 38
1665 | Notebook/00020.jpg 38
1666 | Notebook/00021.jpg 38
1667 | Oven/00001.jpg 39
1668 | Oven/00002.jpg 39
1669 | Oven/00003.jpg 39
1670 | Oven/00004.jpg 39
1671 | Oven/00005.jpg 39
1672 | Oven/00006.jpg 39
1673 | Oven/00007.jpg 39
1674 | Oven/00008.jpg 39
1675 | Oven/00009.jpg 39
1676 | Oven/00010.jpg 39
1677 | Oven/00011.jpg 39
1678 | Oven/00012.jpg 39
1679 | Oven/00013.jpg 39
1680 | Oven/00014.jpg 39
1681 | Oven/00015.jpg 39
1682 | Oven/00016.jpg 39
1683 | Oven/00017.jpg 39
1684 | Oven/00018.jpg 39
1685 | Oven/00019.jpg 39
1686 | Oven/00020.jpg 39
1687 | Pan/00001.jpg 40
1688 | Pan/00002.jpg 40
1689 | Pan/00003.jpg 40
1690 | Pan/00004.jpg 40
1691 | Pan/00005.jpg 40
1692 | Pan/00006.jpg 40
1693 | Pan/00007.jpg 40
1694 | Pan/00008.jpg 40
1695 | Pan/00009.jpg 40
1696 | Pan/00010.jpg 40
1697 | Pan/00011.jpg 40
1698 | Pan/00012.jpg 40
1699 | Pan/00013.jpg 40
1700 | Pan/00014.jpg 40
1701 | Pan/00015.jpg 40
1702 | Pan/00016.jpg 40
1703 | Pan/00017.jpg 40
1704 | Pan/00018.jpg 40
1705 | Pan/00019.jpg 40
1706 | Paper_Clip/00001.jpg 41
1707 | Paper_Clip/00002.jpg 41
1708 | Paper_Clip/00003.jpg 41
1709 | Paper_Clip/00004.jpg 41
1710 | Paper_Clip/00005.jpg 41
1711 | Paper_Clip/00006.jpg 41
1712 | Paper_Clip/00007.jpg 41
1713 | Paper_Clip/00008.jpg 41
1714 | Paper_Clip/00009.jpg 41
1715 | Paper_Clip/00010.jpg 41
1716 | Paper_Clip/00011.jpg 41
1717 | Paper_Clip/00012.jpg 41
1718 | Paper_Clip/00013.jpg 41
1719 | Paper_Clip/00014.jpg 41
1720 | Paper_Clip/00015.jpg 41
1721 | Paper_Clip/00016.jpg 41
1722 | Paper_Clip/00017.jpg 41
1723 | Paper_Clip/00018.jpg 41
1724 | Paper_Clip/00019.jpg 41
1725 | Pen/00001.jpg 42
1726 | Pen/00002.jpg 42
1727 | Pen/00003.jpg 42
1728 | Pen/00004.jpg 42
1729 | Pen/00005.jpg 42
1730 | Pen/00006.jpg 42
1731 | Pen/00007.jpg 42
1732 | Pen/00008.jpg 42
1733 | Pen/00009.jpg 42
1734 | Pen/00010.jpg 42
1735 | Pen/00011.jpg 42
1736 | Pen/00012.jpg 42
1737 | Pen/00013.jpg 42
1738 | Pen/00014.jpg 42
1739 | Pen/00015.jpg 42
1740 | Pen/00016.jpg 42
1741 | Pen/00017.jpg 42
1742 | Pen/00018.jpg 42
1743 | Pen/00019.jpg 42
1744 | Pen/00020.jpg 42
1745 | Pencil/00001.jpg 43
1746 | Pencil/00002.jpg 43
1747 | Pencil/00003.jpg 43
1748 | Pencil/00004.jpg 43
1749 | Pencil/00005.jpg 43
1750 | Pencil/00006.jpg 43
1751 | Pencil/00007.jpg 43
1752 | Pencil/00008.jpg 43
1753 | Pencil/00009.jpg 43
1754 | Pencil/00010.jpg 43
1755 | Pencil/00011.jpg 43
1756 | Pencil/00012.jpg 43
1757 | Pencil/00013.jpg 43
1758 | Pencil/00014.jpg 43
1759 | Pencil/00015.jpg 43
1760 | Pencil/00016.jpg 43
1761 | Pencil/00017.jpg 43
1762 | Pencil/00018.jpg 43
1763 | Pencil/00019.jpg 43
1764 | Pencil/00020.jpg 43
1765 | Pencil/00021.jpg 43
1766 | Pencil/00022.jpg 43
1767 | Pencil/00023.jpg 43
1768 | Pencil/00024.jpg 43
1769 | Pencil/00025.jpg 43
1770 | Pencil/00026.jpg 43
1771 | Postit_Notes/00001.jpg 44
1772 | Postit_Notes/00002.jpg 44
1773 | Postit_Notes/00003.jpg 44
1774 | Postit_Notes/00004.jpg 44
1775 | Postit_Notes/00005.jpg 44
1776 | Postit_Notes/00006.jpg 44
1777 | Postit_Notes/00007.jpg 44
1778 | Postit_Notes/00008.jpg 44
1779 | Postit_Notes/00009.jpg 44
1780 | Postit_Notes/00010.jpg 44
1781 | Postit_Notes/00011.jpg 44
1782 | Postit_Notes/00012.jpg 44
1783 | Postit_Notes/00013.jpg 44
1784 | Postit_Notes/00014.jpg 44
1785 | Postit_Notes/00015.jpg 44
1786 | Postit_Notes/00016.jpg 44
1787 | Postit_Notes/00017.jpg 44
1788 | Postit_Notes/00018.jpg 44
1789 | Postit_Notes/00019.jpg 44
1790 | Printer/00001.jpg 45
1791 | Printer/00002.jpg 45
1792 | Printer/00003.jpg 45
1793 | Printer/00004.jpg 45
1794 | Printer/00005.jpg 45
1795 | Printer/00006.jpg 45
1796 | Printer/00007.jpg 45
1797 | Printer/00008.jpg 45
1798 | Printer/00009.jpg 45
1799 | Printer/00010.jpg 45
1800 | Printer/00011.jpg 45
1801 | Printer/00012.jpg 45
1802 | Printer/00013.jpg 45
1803 | Printer/00014.jpg 45
1804 | Printer/00015.jpg 45
1805 | Printer/00016.jpg 45
1806 | Printer/00017.jpg 45
1807 | Printer/00018.jpg 45
1808 | Push_Pin/00001.jpg 46
1809 | Push_Pin/00002.jpg 46
1810 | Push_Pin/00003.jpg 46
1811 | Push_Pin/00004.jpg 46
1812 | Push_Pin/00005.jpg 46
1813 | Push_Pin/00006.jpg 46
1814 | Push_Pin/00007.jpg 46
1815 | Push_Pin/00008.jpg 46
1816 | Push_Pin/00009.jpg 46
1817 | Push_Pin/00010.jpg 46
1818 | Push_Pin/00011.jpg 46
1819 | Push_Pin/00012.jpg 46
1820 | Push_Pin/00013.jpg 46
1821 | Push_Pin/00014.jpg 46
1822 | Push_Pin/00015.jpg 46
1823 | Push_Pin/00016.jpg 46
1824 | Push_Pin/00017.jpg 46
1825 | Push_Pin/00018.jpg 46
1826 | Push_Pin/00019.jpg 46
1827 | Push_Pin/00020.jpg 46
1828 | Push_Pin/00021.jpg 46
1829 | Push_Pin/00022.jpg 46
1830 | Push_Pin/00023.jpg 46
1831 | Push_Pin/00024.jpg 46
1832 | Radio/00001.jpg 47
1833 | Radio/00002.jpg 47
1834 | Radio/00003.jpg 47
1835 | Radio/00004.jpg 47
1836 | Radio/00005.jpg 47
1837 | Radio/00006.jpg 47
1838 | Radio/00007.jpg 47
1839 | Radio/00008.jpg 47
1840 | Radio/00009.jpg 47
1841 | Radio/00010.jpg 47
1842 | Radio/00011.jpg 47
1843 | Radio/00012.jpg 47
1844 | Radio/00013.jpg 47
1845 | Radio/00014.jpg 47
1846 | Radio/00015.jpg 47
1847 | Radio/00016.jpg 47
1848 | Radio/00017.jpg 47
1849 | Radio/00018.jpg 47
1850 | Radio/00019.jpg 47
1851 | Radio/00020.jpg 47
1852 | Radio/00021.jpg 47
1853 | Radio/00022.jpg 47
1854 | Radio/00023.jpg 47
1855 | Radio/00024.jpg 47
1856 | Radio/00025.jpg 47
1857 | Radio/00026.jpg 47
1858 | Radio/00027.jpg 47
1859 | Radio/00028.jpg 47
1860 | Radio/00029.jpg 47
1861 | Radio/00030.jpg 47
1862 | Radio/00031.jpg 47
1863 | Radio/00032.jpg 47
1864 | Radio/00033.jpg 47
1865 | Radio/00034.jpg 47
1866 | Radio/00035.jpg 47
1867 | Radio/00036.jpg 47
1868 | Radio/00037.jpg 47
1869 | Radio/00038.jpg 47
1870 | Radio/00039.jpg 47
1871 | Radio/00040.jpg 47
1872 | Radio/00041.jpg 47
1873 | Radio/00042.jpg 47
1874 | Radio/00043.jpg 47
1875 | Radio/00044.jpg 47
1876 | Radio/00045.jpg 47
1877 | Radio/00046.jpg 47
1878 | Radio/00047.jpg 47
1879 | Refrigerator/00001.jpg 48
1880 | Refrigerator/00002.jpg 48
1881 | Refrigerator/00003.jpg 48
1882 | Refrigerator/00004.jpg 48
1883 | Refrigerator/00005.jpg 48
1884 | Refrigerator/00006.jpg 48
1885 | Refrigerator/00007.jpg 48
1886 | Refrigerator/00008.jpg 48
1887 | Refrigerator/00009.jpg 48
1888 | Refrigerator/00010.jpg 48
1889 | Refrigerator/00011.jpg 48
1890 | Refrigerator/00012.jpg 48
1891 | Refrigerator/00013.jpg 48
1892 | Refrigerator/00014.jpg 48
1893 | Refrigerator/00015.jpg 48
1894 | Refrigerator/00016.jpg 48
1895 | Refrigerator/00017.jpg 48
1896 | Refrigerator/00018.jpg 48
1897 | Refrigerator/00019.jpg 48
1898 | Refrigerator/00020.jpg 48
1899 | Refrigerator/00021.jpg 48
1900 | Refrigerator/00022.jpg 48
1901 | Refrigerator/00023.jpg 48
1902 | Refrigerator/00024.jpg 48
1903 | Refrigerator/00025.jpg 48
1904 | Refrigerator/00026.jpg 48
1905 | Refrigerator/00027.jpg 48
1906 | Refrigerator/00028.jpg 48
1907 | Refrigerator/00029.jpg 48
1908 | Refrigerator/00030.jpg 48
1909 | Refrigerator/00031.jpg 48
1910 | Refrigerator/00032.jpg 48
1911 | Refrigerator/00033.jpg 48
1912 | Refrigerator/00034.jpg 48
1913 | Refrigerator/00035.jpg 48
1914 | Refrigerator/00036.jpg 48
1915 | Refrigerator/00037.jpg 48
1916 | Refrigerator/00038.jpg 48
1917 | Refrigerator/00039.jpg 48
1918 | Refrigerator/00040.jpg 48
1919 | Refrigerator/00041.jpg 48
1920 | Refrigerator/00042.jpg 48
1921 | Refrigerator/00043.jpg 48
1922 | Refrigerator/00044.jpg 48
1923 | Refrigerator/00045.jpg 48
1924 | Refrigerator/00046.jpg 48
1925 | Refrigerator/00047.jpg 48
1926 | Refrigerator/00048.jpg 48
1927 | Refrigerator/00049.jpg 48
1928 | Ruler/00001.jpg 49
1929 | Ruler/00002.jpg 49
1930 | Ruler/00003.jpg 49
1931 | Ruler/00004.jpg 49
1932 | Ruler/00005.jpg 49
1933 | Ruler/00006.jpg 49
1934 | Ruler/00007.jpg 49
1935 | Ruler/00008.jpg 49
1936 | Ruler/00009.jpg 49
1937 | Ruler/00010.jpg 49
1938 | Ruler/00011.jpg 49
1939 | Ruler/00012.jpg 49
1940 | Ruler/00013.jpg 49
1941 | Ruler/00014.jpg 49
1942 | Ruler/00015.jpg 49
1943 | Scissors/00001.jpg 50
1944 | Scissors/00002.jpg 50
1945 | Scissors/00003.jpg 50
1946 | Scissors/00004.jpg 50
1947 | Scissors/00005.jpg 50
1948 | Scissors/00006.jpg 50
1949 | Scissors/00007.jpg 50
1950 | Scissors/00008.jpg 50
1951 | Scissors/00009.jpg 50
1952 | Scissors/00010.jpg 50
1953 | Scissors/00011.jpg 50
1954 | Scissors/00012.jpg 50
1955 | Scissors/00013.jpg 50
1956 | Scissors/00014.jpg 50
1957 | Scissors/00015.jpg 50
1958 | Scissors/00016.jpg 50
1959 | Scissors/00017.jpg 50
1960 | Scissors/00018.jpg 50
1961 | Scissors/00019.jpg 50
1962 | Scissors/00020.jpg 50
1963 | Screwdriver/00001.jpg 51
1964 | Screwdriver/00002.jpg 51
1965 | Screwdriver/00003.jpg 51
1966 | Screwdriver/00004.jpg 51
1967 | Screwdriver/00005.jpg 51
1968 | Screwdriver/00006.jpg 51
1969 | Screwdriver/00007.jpg 51
1970 | Screwdriver/00008.jpg 51
1971 | Screwdriver/00009.jpg 51
1972 | Screwdriver/00010.jpg 51
1973 | Screwdriver/00011.jpg 51
1974 | Screwdriver/00012.jpg 51
1975 | Screwdriver/00013.jpg 51
1976 | Screwdriver/00014.jpg 51
1977 | Screwdriver/00015.jpg 51
1978 | Screwdriver/00016.jpg 51
1979 | Screwdriver/00017.jpg 51
1980 | Screwdriver/00018.jpg 51
1981 | Screwdriver/00019.jpg 51
1982 | Screwdriver/00020.jpg 51
1983 | Screwdriver/00021.jpg 51
1984 | Screwdriver/00022.jpg 51
1985 | Screwdriver/00023.jpg 51
1986 | Screwdriver/00024.jpg 51
1987 | Screwdriver/00025.jpg 51
1988 | Screwdriver/00026.jpg 51
1989 | Screwdriver/00027.jpg 51
1990 | Screwdriver/00028.jpg 51
1991 | Screwdriver/00029.jpg 51
1992 | Screwdriver/00030.jpg 51
1993 | Shelf/00001.jpg 52
1994 | Shelf/00002.jpg 52
1995 | Shelf/00003.jpg 52
1996 | Shelf/00004.jpg 52
1997 | Shelf/00005.jpg 52
1998 | Shelf/00006.jpg 52
1999 | Shelf/00007.jpg 52
2000 | Shelf/00008.jpg 52
2001 | Shelf/00009.jpg 52
2002 | Shelf/00010.jpg 52
2003 | Shelf/00011.jpg 52
2004 | Shelf/00012.jpg 52
2005 | Shelf/00013.jpg 52
2006 | Shelf/00014.jpg 52
2007 | Shelf/00015.jpg 52
2008 | Shelf/00016.jpg 52
2009 | Shelf/00017.jpg 52
2010 | Shelf/00018.jpg 52
2011 | Shelf/00019.jpg 52
2012 | Shelf/00020.jpg 52
2013 | Shelf/00021.jpg 52
2014 | Shelf/00022.jpg 52
2015 | Shelf/00023.jpg 52
2016 | Shelf/00024.jpg 52
2017 | Shelf/00025.jpg 52
2018 | Shelf/00026.jpg 52
2019 | Shelf/00027.jpg 52
2020 | Shelf/00028.jpg 52
2021 | Shelf/00029.jpg 52
2022 | Shelf/00030.jpg 52
2023 | Shelf/00031.jpg 52
2024 | Shelf/00032.jpg 52
2025 | Shelf/00033.jpg 52
2026 | Shelf/00034.jpg 52
2027 | Shelf/00035.jpg 52
2028 | Shelf/00036.jpg 52
2029 | Shelf/00037.jpg 52
2030 | Shelf/00038.jpg 52
2031 | Shelf/00039.jpg 52
2032 | Shelf/00040.jpg 52
2033 | Shelf/00041.jpg 52
2034 | Shelf/00042.jpg 52
2035 | Sink/00001.jpg 53
2036 | Sink/00002.jpg 53
2037 | Sink/00003.jpg 53
2038 | Sink/00004.jpg 53
2039 | Sink/00005.jpg 53
2040 | Sink/00006.jpg 53
2041 | Sink/00007.jpg 53
2042 | Sink/00008.jpg 53
2043 | Sink/00009.jpg 53
2044 | Sink/00010.jpg 53
2045 | Sink/00011.jpg 53
2046 | Sink/00012.jpg 53
2047 | Sink/00013.jpg 53
2048 | Sink/00014.jpg 53
2049 | Sink/00015.jpg 53
2050 | Sink/00016.jpg 53
2051 | Sink/00017.jpg 53
2052 | Sink/00018.jpg 53
2053 | Sink/00019.jpg 53
2054 | Sink/00020.jpg 53
2055 | Sink/00021.jpg 53
2056 | Sink/00022.jpg 53
2057 | Sink/00023.jpg 53
2058 | Sink/00024.jpg 53
2059 | Sink/00025.jpg 53
2060 | Sink/00026.jpg 53
2061 | Sink/00027.jpg 53
2062 | Sink/00028.jpg 53
2063 | Sink/00029.jpg 53
2064 | Sink/00030.jpg 53
2065 | Sink/00031.jpg 53
2066 | Sink/00032.jpg 53
2067 | Sink/00033.jpg 53
2068 | Sink/00034.jpg 53
2069 | Sink/00035.jpg 53
2070 | Sink/00036.jpg 53
2071 | Sink/00037.jpg 53
2072 | Sink/00038.jpg 53
2073 | Sink/00039.jpg 53
2074 | Sink/00040.jpg 53
2075 | Sink/00041.jpg 53
2076 | Sneakers/00001.jpg 54
2077 | Sneakers/00002.jpg 54
2078 | Sneakers/00003.jpg 54
2079 | Sneakers/00004.jpg 54
2080 | Sneakers/00005.jpg 54
2081 | Sneakers/00006.jpg 54
2082 | Sneakers/00007.jpg 54
2083 | Sneakers/00008.jpg 54
2084 | Sneakers/00009.jpg 54
2085 | Sneakers/00010.jpg 54
2086 | Sneakers/00011.jpg 54
2087 | Sneakers/00012.jpg 54
2088 | Sneakers/00013.jpg 54
2089 | Sneakers/00014.jpg 54
2090 | Sneakers/00015.jpg 54
2091 | Sneakers/00016.jpg 54
2092 | Sneakers/00017.jpg 54
2093 | Sneakers/00018.jpg 54
2094 | Sneakers/00019.jpg 54
2095 | Sneakers/00020.jpg 54
2096 | Sneakers/00021.jpg 54
2097 | Sneakers/00022.jpg 54
2098 | Sneakers/00023.jpg 54
2099 | Sneakers/00024.jpg 54
2100 | Sneakers/00025.jpg 54
2101 | Sneakers/00026.jpg 54
2102 | Sneakers/00027.jpg 54
2103 | Sneakers/00028.jpg 54
2104 | Sneakers/00029.jpg 54
2105 | Sneakers/00030.jpg 54
2106 | Sneakers/00031.jpg 54
2107 | Sneakers/00032.jpg 54
2108 | Sneakers/00033.jpg 54
2109 | Sneakers/00034.jpg 54
2110 | Sneakers/00035.jpg 54
2111 | Sneakers/00036.jpg 54
2112 | Sneakers/00037.jpg 54
2113 | Sneakers/00038.jpg 54
2114 | Sneakers/00039.jpg 54
2115 | Sneakers/00040.jpg 54
2116 | Sneakers/00041.jpg 54
2117 | Sneakers/00042.jpg 54
2118 | Sneakers/00043.jpg 54
2119 | Sneakers/00044.jpg 54
2120 | Sneakers/00045.jpg 54
2121 | Sneakers/00046.jpg 54
2122 | Soda/00001.jpg 55
2123 | Soda/00002.jpg 55
2124 | Soda/00003.jpg 55
2125 | Soda/00004.jpg 55
2126 | Soda/00005.jpg 55
2127 | Soda/00006.jpg 55
2128 | Soda/00007.jpg 55
2129 | Soda/00008.jpg 55
2130 | Soda/00009.jpg 55
2131 | Soda/00010.jpg 55
2132 | Soda/00011.jpg 55
2133 | Soda/00012.jpg 55
2134 | Soda/00013.jpg 55
2135 | Soda/00014.jpg 55
2136 | Soda/00015.jpg 55
2137 | Soda/00016.jpg 55
2138 | Soda/00017.jpg 55
2139 | Soda/00018.jpg 55
2140 | Soda/00019.jpg 55
2141 | Soda/00020.jpg 55
2142 | Soda/00021.jpg 55
2143 | Soda/00022.jpg 55
2144 | Soda/00023.jpg 55
2145 | Soda/00024.jpg 55
2146 | Soda/00025.jpg 55
2147 | Soda/00026.jpg 55
2148 | Soda/00027.jpg 55
2149 | Soda/00028.jpg 55
2150 | Soda/00029.jpg 55
2151 | Soda/00030.jpg 55
2152 | Soda/00031.jpg 55
2153 | Soda/00032.jpg 55
2154 | Soda/00033.jpg 55
2155 | Soda/00034.jpg 55
2156 | Soda/00035.jpg 55
2157 | Soda/00036.jpg 55
2158 | Soda/00037.jpg 55
2159 | Soda/00038.jpg 55
2160 | Soda/00039.jpg 55
2161 | Soda/00040.jpg 55
2162 | Speaker/00001.jpg 56
2163 | Speaker/00002.jpg 56
2164 | Speaker/00003.jpg 56
2165 | Speaker/00004.jpg 56
2166 | Speaker/00005.jpg 56
2167 | Speaker/00006.jpg 56
2168 | Speaker/00007.jpg 56
2169 | Speaker/00008.jpg 56
2170 | Speaker/00009.jpg 56
2171 | Speaker/00010.jpg 56
2172 | Speaker/00011.jpg 56
2173 | Speaker/00012.jpg 56
2174 | Speaker/00013.jpg 56
2175 | Speaker/00014.jpg 56
2176 | Speaker/00015.jpg 56
2177 | Speaker/00016.jpg 56
2178 | Speaker/00017.jpg 56
2179 | Speaker/00018.jpg 56
2180 | Speaker/00019.jpg 56
2181 | Speaker/00020.jpg 56
2182 | Spoon/00001.jpg 57
2183 | Spoon/00002.jpg 57
2184 | Spoon/00003.jpg 57
2185 | Spoon/00004.jpg 57
2186 | Spoon/00005.jpg 57
2187 | Spoon/00006.jpg 57
2188 | Spoon/00007.jpg 57
2189 | Spoon/00008.jpg 57
2190 | Spoon/00009.jpg 57
2191 | Spoon/00010.jpg 57
2192 | Spoon/00011.jpg 57
2193 | Spoon/00012.jpg 57
2194 | Spoon/00013.jpg 57
2195 | Spoon/00014.jpg 57
2196 | Spoon/00015.jpg 57
2197 | Spoon/00016.jpg 57
2198 | Spoon/00017.jpg 57
2199 | Spoon/00018.jpg 57
2200 | Spoon/00019.jpg 57
2201 | Spoon/00020.jpg 57
2202 | Spoon/00021.jpg 57
2203 | Spoon/00022.jpg 57
2204 | Spoon/00023.jpg 57
2205 | Spoon/00024.jpg 57
2206 | Spoon/00025.jpg 57
2207 | Spoon/00026.jpg 57
2208 | Spoon/00027.jpg 57
2209 | Spoon/00028.jpg 57
2210 | Spoon/00029.jpg 57
2211 | Spoon/00030.jpg 57
2212 | Spoon/00031.jpg 57
2213 | Spoon/00032.jpg 57
2214 | Spoon/00033.jpg 57
2215 | Spoon/00034.jpg 57
2216 | Spoon/00035.jpg 57
2217 | Spoon/00036.jpg 57
2218 | Spoon/00037.jpg 57
2219 | Spoon/00038.jpg 57
2220 | Spoon/00039.jpg 57
2221 | Spoon/00040.jpg 57
2222 | Spoon/00041.jpg 57
2223 | Spoon/00042.jpg 57
2224 | Spoon/00043.jpg 57
2225 | Spoon/00044.jpg 57
2226 | Spoon/00045.jpg 57
2227 | Spoon/00046.jpg 57
2228 | TV/00001.jpg 58
2229 | TV/00002.jpg 58
2230 | TV/00003.jpg 58
2231 | TV/00004.jpg 58
2232 | TV/00005.jpg 58
2233 | TV/00006.jpg 58
2234 | TV/00007.jpg 58
2235 | TV/00008.jpg 58
2236 | TV/00009.jpg 58
2237 | TV/00010.jpg 58
2238 | TV/00011.jpg 58
2239 | TV/00012.jpg 58
2240 | TV/00013.jpg 58
2241 | TV/00014.jpg 58
2242 | TV/00015.jpg 58
2243 | TV/00016.jpg 58
2244 | TV/00017.jpg 58
2245 | TV/00018.jpg 58
2246 | TV/00019.jpg 58
2247 | TV/00020.jpg 58
2248 | TV/00021.jpg 58
2249 | TV/00022.jpg 58
2250 | TV/00023.jpg 58
2251 | TV/00024.jpg 58
2252 | TV/00025.jpg 58
2253 | TV/00026.jpg 58
2254 | TV/00027.jpg 58
2255 | TV/00028.jpg 58
2256 | TV/00029.jpg 58
2257 | TV/00030.jpg 58
2258 | TV/00031.jpg 58
2259 | TV/00032.jpg 58
2260 | TV/00033.jpg 58
2261 | TV/00034.jpg 58
2262 | TV/00035.jpg 58
2263 | TV/00036.jpg 58
2264 | TV/00037.jpg 58
2265 | TV/00038.jpg 58
2266 | TV/00039.jpg 58
2267 | TV/00040.jpg 58
2268 | Table/00001.jpg 59
2269 | Table/00002.jpg 59
2270 | Table/00003.jpg 59
2271 | Table/00004.jpg 59
2272 | Table/00005.jpg 59
2273 | Table/00006.jpg 59
2274 | Table/00007.jpg 59
2275 | Table/00008.jpg 59
2276 | Table/00009.jpg 59
2277 | Table/00010.jpg 59
2278 | Table/00011.jpg 59
2279 | Table/00012.jpg 59
2280 | Table/00013.jpg 59
2281 | Table/00014.jpg 59
2282 | Table/00015.jpg 59
2283 | Table/00016.jpg 59
2284 | Telephone/00001.jpg 60
2285 | Telephone/00002.jpg 60
2286 | Telephone/00003.jpg 60
2287 | Telephone/00004.jpg 60
2288 | Telephone/00005.jpg 60
2289 | Telephone/00006.jpg 60
2290 | Telephone/00007.jpg 60
2291 | Telephone/00008.jpg 60
2292 | Telephone/00009.jpg 60
2293 | Telephone/00010.jpg 60
2294 | Telephone/00011.jpg 60
2295 | Telephone/00012.jpg 60
2296 | Telephone/00013.jpg 60
2297 | Telephone/00014.jpg 60
2298 | Telephone/00015.jpg 60
2299 | Telephone/00016.jpg 60
2300 | Telephone/00017.jpg 60
2301 | Telephone/00018.jpg 60
2302 | Telephone/00019.jpg 60
2303 | Telephone/00020.jpg 60
2304 | Telephone/00021.jpg 60
2305 | Telephone/00022.jpg 60
2306 | Telephone/00023.jpg 60
2307 | Telephone/00024.jpg 60
2308 | Telephone/00025.jpg 60
2309 | Telephone/00026.jpg 60
2310 | Telephone/00027.jpg 60
2311 | Telephone/00028.jpg 60
2312 | Telephone/00029.jpg 60
2313 | Telephone/00030.jpg 60
2314 | Telephone/00031.jpg 60
2315 | Telephone/00032.jpg 60
2316 | Telephone/00033.jpg 60
2317 | Telephone/00034.jpg 60
2318 | Telephone/00035.jpg 60
2319 | Telephone/00036.jpg 60
2320 | Telephone/00037.jpg 60
2321 | Telephone/00038.jpg 60
2322 | Telephone/00039.jpg 60
2323 | Telephone/00040.jpg 60
2324 | Telephone/00041.jpg 60
2325 | Telephone/00042.jpg 60
2326 | Telephone/00043.jpg 60
2327 | Telephone/00044.jpg 60
2328 | ToothBrush/00001.jpg 61
2329 | ToothBrush/00002.jpg 61
2330 | ToothBrush/00003.jpg 61
2331 | ToothBrush/00004.jpg 61
2332 | ToothBrush/00005.jpg 61
2333 | ToothBrush/00006.jpg 61
2334 | ToothBrush/00007.jpg 61
2335 | ToothBrush/00008.jpg 61
2336 | ToothBrush/00009.jpg 61
2337 | ToothBrush/00010.jpg 61
2338 | ToothBrush/00011.jpg 61
2339 | ToothBrush/00012.jpg 61
2340 | ToothBrush/00013.jpg 61
2341 | ToothBrush/00014.jpg 61
2342 | ToothBrush/00015.jpg 61
2343 | ToothBrush/00016.jpg 61
2344 | ToothBrush/00017.jpg 61
2345 | ToothBrush/00018.jpg 61
2346 | ToothBrush/00019.jpg 61
2347 | ToothBrush/00020.jpg 61
2348 | ToothBrush/00021.jpg 61
2349 | ToothBrush/00022.jpg 61
2350 | ToothBrush/00023.jpg 61
2351 | ToothBrush/00024.jpg 61
2352 | ToothBrush/00025.jpg 61
2353 | ToothBrush/00026.jpg 61
2354 | ToothBrush/00027.jpg 61
2355 | ToothBrush/00028.jpg 61
2356 | ToothBrush/00029.jpg 61
2357 | ToothBrush/00030.jpg 61
2358 | ToothBrush/00031.jpg 61
2359 | ToothBrush/00032.jpg 61
2360 | ToothBrush/00033.jpg 61
2361 | ToothBrush/00034.jpg 61
2362 | ToothBrush/00035.jpg 61
2363 | ToothBrush/00036.jpg 61
2364 | ToothBrush/00037.jpg 61
2365 | ToothBrush/00038.jpg 61
2366 | ToothBrush/00039.jpg 61
2367 | ToothBrush/00040.jpg 61
2368 | ToothBrush/00041.jpg 61
2369 | ToothBrush/00042.jpg 61
2370 | ToothBrush/00043.jpg 61
2371 | Toys/00001.jpg 62
2372 | Toys/00002.jpg 62
2373 | Toys/00003.jpg 62
2374 | Toys/00004.jpg 62
2375 | Toys/00005.jpg 62
2376 | Toys/00006.jpg 62
2377 | Toys/00007.jpg 62
2378 | Toys/00008.jpg 62
2379 | Toys/00009.jpg 62
2380 | Toys/00010.jpg 62
2381 | Toys/00011.jpg 62
2382 | Toys/00012.jpg 62
2383 | Toys/00013.jpg 62
2384 | Toys/00014.jpg 62
2385 | Toys/00015.jpg 62
2386 | Toys/00016.jpg 62
2387 | Toys/00017.jpg 62
2388 | Toys/00018.jpg 62
2389 | Toys/00019.jpg 62
2390 | Toys/00020.jpg 62
2391 | Trash_Can/00001.jpg 63
2392 | Trash_Can/00002.jpg 63
2393 | Trash_Can/00003.jpg 63
2394 | Trash_Can/00004.jpg 63
2395 | Trash_Can/00005.jpg 63
2396 | Trash_Can/00006.jpg 63
2397 | Trash_Can/00007.jpg 63
2398 | Trash_Can/00008.jpg 63
2399 | Trash_Can/00009.jpg 63
2400 | Trash_Can/00010.jpg 63
2401 | Trash_Can/00011.jpg 63
2402 | Trash_Can/00012.jpg 63
2403 | Trash_Can/00013.jpg 63
2404 | Trash_Can/00014.jpg 63
2405 | Trash_Can/00015.jpg 63
2406 | Trash_Can/00016.jpg 63
2407 | Trash_Can/00017.jpg 63
2408 | Trash_Can/00018.jpg 63
2409 | Trash_Can/00019.jpg 63
2410 | Trash_Can/00020.jpg 63
2411 | Trash_Can/00021.jpg 63
2412 | Webcam/00001.jpg 64
2413 | Webcam/00002.jpg 64
2414 | Webcam/00003.jpg 64
2415 | Webcam/00004.jpg 64
2416 | Webcam/00005.jpg 64
2417 | Webcam/00006.jpg 64
2418 | Webcam/00007.jpg 64
2419 | Webcam/00008.jpg 64
2420 | Webcam/00009.jpg 64
2421 | Webcam/00010.jpg 64
2422 | Webcam/00011.jpg 64
2423 | Webcam/00012.jpg 64
2424 | Webcam/00013.jpg 64
2425 | Webcam/00014.jpg 64
2426 | Webcam/00015.jpg 64
2427 | Webcam/00016.jpg 64
--------------------------------------------------------------------------------
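
Each line of an image_unida_list.txt file pairs a relative image path with an integer class index (e.g. "Webcam/00016.jpg 64"), which SFUniDADataset in dataset/dataset.py below splits with item.strip().split(). A minimal parsing sketch; the list path used here is only an example, any of the data/*/*/image_unida_list.txt files follows the same layout:

# Minimal sketch: read an image_unida_list.txt file into (path, label) pairs.
# The example path below is an assumption for illustration; every list file in
# this repository uses the same "relative/path.jpg <int label>" format.
from pathlib import Path

def read_unida_list(list_path):
    """Return a list of (relative_image_path, class_index) tuples."""
    pairs = []
    for line in Path(list_path).read_text().splitlines():
        line = line.strip()
        if not line:
            continue
        rel_path, label = line.split()
        pairs.append((rel_path, int(label)))
    return pairs

if __name__ == "__main__":
    samples = read_unida_list("data/OfficeHome/Art/image_unida_list.txt")
    print(len(samples), samples[:3])
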
/dataset/dataset.py:
--------------------------------------------------------------------------------
1 | import os
2 | from tqdm import tqdm
3 | from PIL import Image
4 | from torch.utils.data import Dataset
5 | from torchvision import transforms
6 |
7 | def train_transform(resize_size=256, crop_size=224,):
8 |
9 | normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],
10 | std=[0.229, 0.224, 0.225])
11 |
12 | return transforms.Compose([
13 | transforms.Resize((resize_size, resize_size)),
14 | transforms.RandomCrop(crop_size),
15 | transforms.RandomHorizontalFlip(),
16 | transforms.ToTensor(),
17 | normalize
18 | ])
19 |
20 | def test_transform(resize_size=256, crop_size=224,):
21 | normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],
22 | std=[0.229, 0.224, 0.225])
23 | return transforms.Compose([
24 | transforms.Resize((resize_size, resize_size)),
25 | transforms.CenterCrop(crop_size),
26 | transforms.ToTensor(),
27 | normalize
28 | ])
29 |
30 | '''
31 | Assume classes across domains share the same label index space:
32 | [0 1 ............................................................................ N - 1]
33 | |---- common classes --||---- source private classes --||---- target private classes --|
34 |
35 | |-------------------------------------------------|
36 | | DATASET PARTITION |
37 | |-------------------------------------------------|
38 | |DATASET | class split(com/sou_pri/tar_pri) |
39 | |-------------------------------------------------|
40 | |DATASET | PDA | OSDA | UniDA |
41 | |-------------------------------------------------|
42 | |Office-31 | 10/21/0 | 10/0/11 | 10/10/11 |
43 | |-------------------------------------------------|
44 | |OfficeHome | 25/40/0 | 25/0/40 | 10/5/50 |
45 | |-------------------------------------------------|
46 | |VisDA-C | | 6/0/6 | 6/3/3 |
47 | |-------------------------------------------------|
48 | |DomainNet | | | 150/50/145 |
49 | |-------------------------------------------------|
50 | '''
51 |
52 | class SFUniDADataset(Dataset):
53 |
54 | def __init__(self, args, data_dir, data_list, d_type, preload_flg=True) -> None:
55 | super(SFUniDADataset, self).__init__()
56 |
57 | self.d_type = d_type
58 | self.dataset = args.dataset
59 | self.preload_flg = preload_flg
60 |
61 | self.shared_class_num = args.shared_class_num
62 | self.source_private_class_num = args.source_private_class_num
63 | self.target_private_class_num = args.target_private_class_num
64 |
65 | self.shared_classes = [i for i in range(args.shared_class_num)]
66 | self.source_private_classes = [i + args.shared_class_num for i in range(args.source_private_class_num)]
67 |
68 | if args.dataset == "Office" and args.target_label_type == "OSDA":
69 | self.target_private_classes = [i + args.shared_class_num + args.source_private_class_num + 10 for i in range(args.target_private_class_num)]
70 | else:
71 | self.target_private_classes = [i + args.shared_class_num + args.source_private_class_num for i in range(args.target_private_class_num)]
72 |
73 | self.source_classes = self.shared_classes + self.source_private_classes
74 | self.target_classes = self.shared_classes + self.target_private_classes
75 |
76 | self.data_dir = data_dir
77 | self.data_list = [item.strip().split() for item in data_list]
78 |
79 | # Filtering the data_list
80 | if self.d_type == "source":
81 | # self.data_dir = args.source_data_dir
82 | self.data_list = [item for item in self.data_list if int(item[1]) in self.source_classes]
83 | else:
84 | # self.data_dir = args.target_data_dir
85 | self.data_list = [item for item in self.data_list if int(item[1]) in self.target_classes]
86 |
87 | self.pre_loading()
88 |
89 | self.train_transform = train_transform()
90 | self.test_transform = test_transform()
91 |
92 | def pre_loading(self):
93 | if "Office" in self.dataset and self.preload_flg:
94 | self.resize_trans = transforms.Resize((256, 256))
95 | print("Dataset Pre-Loading Started ....")
96 | self.img_list = [self.resize_trans(Image.open(os.path.join(self.data_dir, item[0])).convert("RGB")) for item in tqdm(self.data_list, ncols=60)]
97 | print("Dataset Pre-Loading Done!")
98 | else:
99 | pass
100 |
101 | def load_img(self, img_idx):
102 | img_f, img_label = self.data_list[img_idx]
103 | if "Office" in self.dataset and self.preload_flg:
104 | img = self.img_list[img_idx]
105 | else:
106 | img = Image.open(os.path.join(self.data_dir, img_f)).convert("RGB")
107 | return img, img_label
108 |
109 | def __len__(self):
110 | return len(self.data_list)
111 |
112 | def __getitem__(self, img_idx):
113 |
114 | img, img_label = self.load_img(img_idx)
115 |
116 | if self.d_type == "source":
117 | img_label = int(img_label)
118 | else:
119 | img_label = int(img_label) if int(img_label) in self.source_classes else len(self.source_classes)
120 |
121 | img_train = self.train_transform(img)
122 | img_test = self.test_transform(img)
123 |
124 | return img_train, img_test, img_label, img_idx
125 |
--------------------------------------------------------------------------------
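
The partition table in the dataset.py docstring above maps shared, source-private, and target-private classes to consecutive label indices, and the constructor builds the corresponding index lists. A short sketch of that arithmetic, instantiated with the OfficeHome UniDA/OPDA split (10/5/50) from the table purely as an illustration:

# Sketch of the label-space partition implemented in SFUniDADataset.__init__.
# The split sizes below are illustrative values taken from the docstring table;
# in the real code they come from args.
shared_class_num = 10
source_private_class_num = 5
target_private_class_num = 50

shared_classes = list(range(shared_class_num))
source_private_classes = [shared_class_num + i for i in range(source_private_class_num)]
target_private_classes = [shared_class_num + source_private_class_num + i
                          for i in range(target_private_class_num)]

source_classes = shared_classes + source_private_classes   # labels kept for source training
target_classes = shared_classes + target_private_classes   # labels present in the target list

# In __getitem__, any target sample whose label is not in source_classes is remapped
# to the single "unknown" index len(source_classes) (15 in this example).
print(len(source_classes), len(target_classes))   # 15 60
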
/environment.yml:
--------------------------------------------------------------------------------
1 | name: con_110
2 | channels:
3 | - pytorch
4 | - defaults
5 | dependencies:
6 | - _libgcc_mutex=0.1=main
7 | - _openmp_mutex=4.5=1_gnu
8 | - backcall=0.2.0=pyhd3eb1b0_0
9 | - blas=1.0=mkl
10 | - bzip2=1.0.8=h7b6447c_0
11 | - ca-certificates=2022.07.19=h06a4308_0
12 | - certifi=2022.9.24=py37h06a4308_0
13 | - cudatoolkit=11.3.1=h2bc3f7f_2
14 | - debugpy=1.5.1=py37h295c915_0
15 | - decorator=5.1.1=pyhd3eb1b0_0
16 | - entrypoints=0.4=py37h06a4308_0
17 | - faiss-gpu=1.7.2=py3.7_h28a55e0_0_cuda11.3
18 | - ffmpeg=4.3=hf484d3e_0
19 | - freetype=2.11.0=h70c0345_0
20 | - giflib=5.2.1=h7b6447c_0
21 | - gmp=6.2.1=h2531618_2
22 | - gnutls=3.6.15=he1e5248_0
23 | - intel-openmp=2021.4.0=h06a4308_3561
24 | - ipykernel=6.15.2=py37h06a4308_0
25 | - ipython=7.31.1=py37h06a4308_1
26 | - jedi=0.18.1=py37h06a4308_1
27 | - joblib=1.1.0=pyhd3eb1b0_0
28 | - jpeg=9d=h7f8727e_0
29 | - jupyter_client=7.1.2=pyhd3eb1b0_0
30 | - jupyter_core=4.11.1=py37h06a4308_0
31 | - lame=3.100=h7b6447c_0
32 | - lcms2=2.12=h3be6417_0
33 | - ld_impl_linux-64=2.35.1=h7274673_9
34 | - libfaiss=1.7.2=hfc2d529_0_cuda11.3
35 | - libffi=3.3=he6710b0_2
36 | - libgcc-ng=9.3.0=h5101ec6_17
37 | - libgfortran-ng=7.5.0=ha8ba4b0_17
38 | - libgfortran4=7.5.0=ha8ba4b0_17
39 | - libgomp=9.3.0=h5101ec6_17
40 | - libiconv=1.15=h63c8f33_5
41 | - libidn2=2.3.2=h7f8727e_0
42 | - libpng=1.6.37=hbc83047_0
43 | - libsodium=1.0.18=h7b6447c_0
44 | - libstdcxx-ng=9.3.0=hd4cf53a_17
45 | - libtasn1=4.16.0=h27cfd23_0
46 | - libtiff=4.2.0=h85742a9_0
47 | - libunistring=0.9.10=h27cfd23_0
48 | - libuv=1.40.0=h7b6447c_0
49 | - libwebp=1.2.2=h55f646e_0
50 | - libwebp-base=1.2.2=h7f8727e_0
51 | - lz4-c=1.9.3=h295c915_1
52 | - matplotlib-inline=0.1.6=py37h06a4308_0
53 | - mkl=2021.4.0=h06a4308_640
54 | - mkl-service=2.4.0=py37h7f8727e_0
55 | - mkl_fft=1.3.1=py37hd3c417c_0
56 | - mkl_random=1.2.2=py37h51133e4_0
57 | - ncurses=6.3=h7f8727e_2
58 | - nest-asyncio=1.5.5=py37h06a4308_0
59 | - nettle=3.7.3=hbbd107a_1
60 | - numpy=1.21.2=py37h20f2e39_0
61 | - numpy-base=1.21.2=py37h79a1101_0
62 | - olefile=0.46=py37_0
63 | - openh264=2.1.1=h4ff587b_0
64 | - openssl=1.1.1q=h7f8727e_0
65 | - packaging=21.3=pyhd3eb1b0_0
66 | - parso=0.8.3=pyhd3eb1b0_0
67 | - pexpect=4.8.0=pyhd3eb1b0_3
68 | - pickleshare=0.7.5=pyhd3eb1b0_1003
69 | - pillow=8.4.0=py37h5aabda8_0
70 | - pip=21.2.2=py37h06a4308_0
71 | - prompt-toolkit=3.0.20=pyhd3eb1b0_0
72 | - ptyprocess=0.7.0=pyhd3eb1b0_2
73 | - py=1.11.0=pyhd3eb1b0_0
74 | - pygments=2.11.2=pyhd3eb1b0_0
75 | - pyparsing=3.0.9=py37h06a4308_0
76 | - python=3.7.11=h12debd9_0
77 | - python-dateutil=2.8.2=pyhd3eb1b0_0
78 | - pytorch=1.10.2=py3.7_cuda11.3_cudnn8.2.0_0
79 | - pytorch-mutex=1.0=cuda
80 | - pyzmq=22.3.0=py37h295c915_2
81 | - readline=8.1.2=h7f8727e_1
82 | - scikit-learn=1.0.2=py37h51133e4_1
83 | - scipy=1.7.3=py37hc147768_0
84 | - setuptools=58.0.4=py37h06a4308_0
85 | - six=1.16.0=pyhd3eb1b0_1
86 | - sqlite=3.37.2=hc218d9a_0
87 | - threadpoolctl=2.2.0=pyh0d69192_0
88 | - tk=8.6.11=h1ccaba5_0
89 | - torchvision=0.11.3=py37_cu113
90 | - tornado=6.1=py37h27cfd23_0
91 | - tqdm=4.62.3=pyhd3eb1b0_1
92 | - traitlets=5.1.1=pyhd3eb1b0_0
93 | - typing_extensions=3.10.0.2=pyh06a4308_0
94 | - wcwidth=0.2.5=pyhd3eb1b0_0
95 | - wheel=0.37.1=pyhd3eb1b0_0
96 | - xz=5.2.5=h7b6447c_0
97 | - zeromq=4.3.4=h2531618_0
98 | - zlib=1.2.11=h7f8727e_4
99 | - zstd=1.4.9=haebb681_0
100 | - pip:
101 | - charset-normalizer==2.0.12
102 | - click==8.0.4
103 | - docker-pycreds==0.4.0
104 | - gitdb==4.0.9
105 | - gitpython==3.1.27
106 | - idna==3.3
107 | - importlib-metadata==4.11.1
108 | - pathtools==0.1.2
109 | - promise==2.3
110 | - protobuf==3.19.4
111 | - psutil==5.9.0
112 | - pyyaml==6.0
113 | - requests==2.27.1
114 | - sentry-sdk==1.5.6
115 | - shortuuid==1.0.8
116 | - smmap==5.0.0
117 | - termcolor==1.1.0
118 | - urllib3==1.26.8
119 | - wandb==0.12.10
120 | - yaspin==2.1.0
121 | - zipp==3.7.0
122 |
--------------------------------------------------------------------------------
/figures/GLC_framework.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ispc-lab/GLC/dd1c3fe376dcc31744010bab9193437c2c1fa90c/figures/GLC_framework.png
--------------------------------------------------------------------------------
/figures/SFUNIDA.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/ispc-lab/GLC/dd1c3fe376dcc31744010bab9193437c2c1fa90c/figures/SFUNIDA.png
--------------------------------------------------------------------------------
/model/SFUniDA.py:
--------------------------------------------------------------------------------
1 | import torch
2 | import numpy as np
3 | import torch.nn as nn
4 | from torchvision import models
5 |
6 | def init_weights(m):
7 | classname = m.__class__.__name__
8 | if classname.find('Conv2d') != -1 or classname.find('ConvTranspose2d') != -1:
9 | nn.init.kaiming_uniform_(m.weight)
10 | nn.init.zeros_(m.bias)
11 | elif classname.find('BatchNorm') != -1:
12 | nn.init.normal_(m.weight, 1.0, 0.02)
13 | nn.init.zeros_(m.bias)
14 | elif classname.find('Linear') != -1:
15 | nn.init.xavier_normal_(m.weight)
16 | nn.init.zeros_(m.bias)
17 |
18 | vgg_dict = {"vgg11":models.vgg11, "vgg13":models.vgg13,
19 | "vgg16":models.vgg16, "vgg19":models.vgg19,
20 | "vgg11bn":models.vgg11_bn, "vgg13bn":models.vgg13_bn,
21 | "vgg16bn":models.vgg16_bn, "vgg19bn":models.vgg19_bn}
22 |
23 | class VGGBase(nn.Module):
24 | def __init__(self, vgg_name):
25 | super(VGGBase, self).__init__()
26 | model_vgg = vgg_dict[vgg_name](pretrained=True)
27 | self.features = model_vgg.features
28 | self.classifier = nn.Sequential()
29 | for i in range(6):
30 | self.classifier.add_module("classifier"+str(i), model_vgg.classifier[i])
31 | # self.in_features = model_vgg.classifier[6].in_features
32 | self.backbone_feat_dim = model_vgg.classifier[6].in_features
33 |
34 | def forward(self, x):
35 | x = self.features(x)
36 | x = x.view(x.size(0), -1)
37 | x = self.classifier(x)
38 | return x
39 |
40 | res_dict = {"resnet18":models.resnet18, "resnet34":models.resnet34,
41 | "resnet50":models.resnet50, "resnet101":models.resnet101,
42 | "resnet152":models.resnet152, "resnext50":models.resnext50_32x4d,
43 | "resnext101":models.resnext101_32x8d}
44 |
45 | class ResBase(nn.Module):
46 | def __init__(self, res_name):
47 | super(ResBase, self).__init__()
48 | model_resnet = res_dict[res_name](pretrained=True)
49 | self.conv1 = model_resnet.conv1
50 | self.bn1 = model_resnet.bn1
51 | self.relu = model_resnet.relu
52 | self.maxpool = model_resnet.maxpool
53 | self.layer1 = model_resnet.layer1
54 | self.layer2 = model_resnet.layer2
55 | self.layer3 = model_resnet.layer3
56 | self.layer4 = model_resnet.layer4
57 | self.avgpool = model_resnet.avgpool
58 | self.backbone_feat_dim = model_resnet.fc.in_features
59 |
60 | def forward(self, x):
61 | x = self.conv1(x)
62 | x = self.bn1(x)
63 | x = self.relu(x)
64 | x = self.maxpool(x)
65 | x = self.layer1(x)
66 | x = self.layer2(x)
67 | x = self.layer3(x)
68 | x = self.layer4(x)
69 | x = self.avgpool(x)
70 | x = x.view(x.size(0), -1)
71 | return x
72 |
73 | class Embedding(nn.Module):
74 |
75 | def __init__(self, feature_dim, embed_dim=256, type="ori"):
76 |
77 | super(Embedding, self).__init__()
78 | self.bn = nn.BatchNorm1d(embed_dim, affine=True)
79 | self.relu = nn.ReLU(inplace=True)
80 | self.dropout = nn.Dropout(p=0.5)
81 | self.bottleneck = nn.Linear(feature_dim, embed_dim)
82 | self.bottleneck.apply(init_weights)
83 | self.type = type
84 |
85 | def forward(self, x):
86 | x = self.bottleneck(x)
87 | if self.type == "bn":
88 | x = self.bn(x)
89 | return x
90 |
91 | class Classifier(nn.Module):
92 | def __init__(self, embed_dim, class_num, type="linear"):
93 | super(Classifier, self).__init__()
94 |
95 | self.type = type
96 | if type == 'wn':
97 | self.fc = nn.utils.weight_norm(nn.Linear(embed_dim, class_num), name="weight")
98 | self.fc.apply(init_weights)
99 | else:
100 | self.fc = nn.Linear(embed_dim, class_num)
101 | self.fc.apply(init_weights)
102 |
103 | def forward(self, x):
104 | x = self.fc(x)
105 | return x
106 |
107 |
108 | class SFUniDA(nn.Module):
109 | def __init__(self, args):
110 |
111 | super(SFUniDA, self).__init__()
112 | self.backbone_arch = args.backbone_arch # resnet50
113 | self.embed_feat_dim = args.embed_feat_dim # 256
114 | self.class_num = args.class_num # shared_class_num + source_private_class_num
115 |
116 | if "resnet" in self.backbone_arch:
117 | self.backbone_layer = ResBase(self.backbone_arch)
118 | elif "vgg" in self.backbone_arch:
119 | self.backbone_layer = VGGBase(self.backbone_arch)
120 | else:
121 | raise ValueError("Unknown Feature Backbone ARCH of {}".format(self.backbone_arch))
122 |
123 | self.backbone_feat_dim = self.backbone_layer.backbone_feat_dim
124 |
125 | self.feat_embed_layer = Embedding(self.backbone_feat_dim, self.embed_feat_dim, type="bn")
126 |
127 | self.class_layer = Classifier(self.embed_feat_dim, class_num=self.class_num, type="wn")
128 |
129 | def get_embed_feat(self, input_imgs):
130 | # input_imgs [B, 3, H, W]
131 | backbone_feat = self.backbone_layer(input_imgs)
132 | embed_feat = self.feat_embed_layer(backbone_feat)
133 | return embed_feat
134 |
135 | def forward(self, input_imgs, apply_softmax=True):
136 | # input_imgs [B, 3, H, W]
137 | backbone_feat = self.backbone_layer(input_imgs)
138 |
139 | embed_feat = self.feat_embed_layer(backbone_feat)
140 |
141 | cls_out = self.class_layer(embed_feat)
142 |
143 | if apply_softmax:
144 | cls_out = torch.softmax(cls_out, dim=1)
145 |
146 | return embed_feat, cls_out
--------------------------------------------------------------------------------
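
A hedged usage sketch for the SFUniDA model above: build it from a minimal argparse.Namespace (field names mirror the comments in SFUniDA.__init__) and run a dummy forward pass to check the output shapes. The value class_num=15 is an arbitrary illustrative choice, not a repository default.

# Hedged sketch: instantiate SFUniDA and verify the forward() contract.
import argparse
import torch
from model.SFUniDA import SFUniDA

args = argparse.Namespace(backbone_arch="resnet50", embed_feat_dim=256, class_num=15)
model = SFUniDA(args).eval()   # torchvision downloads ImageNet weights on first use

with torch.no_grad():
    dummy = torch.randn(2, 3, 224, 224)                 # [B, 3, H, W]
    embed_feat, cls_out = model(dummy, apply_softmax=True)

print(embed_feat.shape, cls_out.shape)   # torch.Size([2, 256]) torch.Size([2, 15])
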
/scripts/train_source_OPDA.sh:
--------------------------------------------------------------------------------
1 | gpuid=${1:-0}
2 | random_seed=${2:-2021}
3 |
4 | # export CUDA_VISIBLE_DEVICES=$gpuid
5 |
6 | echo "OPDA SOURCE TRAIN ON OFFICE"
7 | python train_source.py --dataset Office --s_idx 0 --target_label_type OPDA --epochs 50 --lr 0.01
8 | python train_source.py --dataset Office --s_idx 1 --target_label_type OPDA --epochs 50 --lr 0.01
9 | python train_source.py --dataset Office --s_idx 2 --target_label_type OPDA --epochs 50 --lr 0.01
10 |
11 | echo "OPDA SOURCE TRAIN ON OFFICEHOME"
12 | python train_source.py --dataset OfficeHome --s_idx 0 --target_label_type OPDA --epochs 50 --lr 0.01
13 | python train_source.py --dataset OfficeHome --s_idx 1 --target_label_type OPDA --epochs 50 --lr 0.01
14 | python train_source.py --dataset OfficeHome --s_idx 2 --target_label_type OPDA --epochs 50 --lr 0.01
15 | python train_source.py --dataset OfficeHome --s_idx 3 --target_label_type OPDA --epochs 50 --lr 0.01
16 |
17 | echo "OPDA SOURCE TRAIN ON VisDA"
18 | python train_source.py --backbone_arch resnet50 --dataset VisDA --s_idx 0 --target_label_type OPDA --epochs 10 --lr 0.001
19 |
20 | echo "OPDA SOURCE TRAIN ON DomainNet"
21 | python train_source.py --dataset DomainNet --s_idx 0 --target_label_type OPDA --epochs 50 --lr 0.01
22 | python train_source.py --dataset DomainNet --s_idx 1 --target_label_type OPDA --epochs 50 --lr 0.01
23 | python train_source.py --dataset DomainNet --s_idx 2 --target_label_type OPDA --epochs 50 --lr 0.01
24 |
--------------------------------------------------------------------------------
/scripts/train_source_OSDA.sh:
--------------------------------------------------------------------------------
1 | gpuid=${1:-0}
2 | random_seed=${2:-2021}
3 |
4 | # export CUDA_VISIBLE_DEVICES=$gpuid
5 |
6 | echo "OSDA SOURCE TRAIN ON OFFICE"
7 | python train_source.py --dataset Office --s_idx 0 --target_label_type OSDA --epochs 50 --lr 0.01
8 | python train_source.py --dataset Office --s_idx 1 --target_label_type OSDA --epochs 50 --lr 0.01
9 | python train_source.py --dataset Office --s_idx 2 --target_label_type OSDA --epochs 50 --lr 0.01
10 |
11 | echo "OSDA SOURCE TRAIN ON OFFICEHOME"
12 | python train_source.py --dataset OfficeHome --s_idx 0 --target_label_type OSDA --epochs 50 --lr 0.01
13 | python train_source.py --dataset OfficeHome --s_idx 1 --target_label_type OSDA --epochs 50 --lr 0.01
14 | python train_source.py --dataset OfficeHome --s_idx 2 --target_label_type OSDA --epochs 50 --lr 0.01
15 | python train_source.py --dataset OfficeHome --s_idx 3 --target_label_type OSDA --epochs 50 --lr 0.01
16 |
17 | echo "OSDA SOURCE TRAIN ON VisDA"
18 | python train_source.py --backbone_arch resnet50 --dataset VisDA --s_idx 0 --target_label_type OSDA --epochs 10 --lr 0.001
--------------------------------------------------------------------------------
/scripts/train_source_PDA.sh:
--------------------------------------------------------------------------------
1 | gpuid=${1:-0}
2 | random_seed=${2:-2021}
3 |
4 | # export CUDA_VISIBLE_DEVICES=$gpuid
5 |
6 | echo "PDA SOURCE TRAIN ON OFFICE"
7 | python train_source.py --dataset Office --s_idx 0 --target_label_type PDA --epochs 50 --lr 0.01
8 | python train_source.py --dataset Office --s_idx 1 --target_label_type PDA --epochs 50 --lr 0.01
9 | python train_source.py --dataset Office --s_idx 2 --target_label_type PDA --epochs 50 --lr 0.01
10 |
11 | echo "PDA SOURCE TRAIN ON OFFICEHOME"
12 | python train_source.py --dataset OfficeHome --s_idx 0 --target_label_type PDA --epochs 50 --lr 0.01
13 | python train_source.py --dataset OfficeHome --s_idx 1 --target_label_type PDA --epochs 50 --lr 0.01
14 | python train_source.py --dataset OfficeHome --s_idx 2 --target_label_type PDA --epochs 50 --lr 0.01
15 | python train_source.py --dataset OfficeHome --s_idx 3 --target_label_type PDA --epochs 50 --lr 0.01
16 |
17 | echo "PDA SOURCE TRAIN ON VisDA"
18 | python train_source.py --backbone_arch resnet50 --dataset VisDA --s_idx 0 --target_label_type PDA --epochs 10 --lr 0.001
--------------------------------------------------------------------------------
/scripts/train_target_OPDA.sh:
--------------------------------------------------------------------------------
1 | gpuid=${1:-0}
2 | random_seed=${2:-2021}
3 |
4 | # export CUDA_VISIBLE_DEVICES=$gpuid
5 |
6 | lam_psd=0.30
7 | echo "OPDA Adaptation ON VisDA"
8 | python train_target.py --backbone_arch resnet50 --lr 0.0001 --dataset VisDA --lam_psd $lam_psd --target_label_type OPDA --epochs 20
9 |
10 | echo "OPDA Adaptation ON Office"
11 | python train_target.py --dataset Office --s_idx 0 --t_idx 1 --lr 0.001 --lam_psd $lam_psd --target_label_type OPDA
12 | python train_target.py --dataset Office --s_idx 0 --t_idx 2 --lr 0.001 --lam_psd $lam_psd --target_label_type OPDA
13 | python train_target.py --dataset Office --s_idx 1 --t_idx 0 --lr 0.001 --lam_psd $lam_psd --target_label_type OPDA
14 | python train_target.py --dataset Office --s_idx 1 --t_idx 2 --lr 0.001 --lam_psd $lam_psd --target_label_type OPDA
15 | python train_target.py --dataset Office --s_idx 2 --t_idx 0 --lr 0.001 --lam_psd $lam_psd --target_label_type OPDA
16 | python train_target.py --dataset Office --s_idx 2 --t_idx 1 --lr 0.001 --lam_psd $lam_psd --target_label_type OPDA
17 |
18 | lam_psd=1.50
19 | echo "OPDA Adaptation ON Office-Home"
20 | python train_target.py --dataset OfficeHome --s_idx 0 --t_idx 1 --lr 0.001 --lam_psd $lam_psd --target_label_type OPDA
21 | python train_target.py --dataset OfficeHome --s_idx 0 --t_idx 2 --lr 0.001 --lam_psd $lam_psd --target_label_type OPDA
22 | python train_target.py --dataset OfficeHome --s_idx 0 --t_idx 3 --lr 0.001 --lam_psd $lam_psd --target_label_type OPDA
23 | python train_target.py --dataset OfficeHome --s_idx 1 --t_idx 0 --lr 0.001 --lam_psd $lam_psd --target_label_type OPDA
24 | python train_target.py --dataset OfficeHome --s_idx 1 --t_idx 2 --lr 0.001 --lam_psd $lam_psd --target_label_type OPDA
25 | python train_target.py --dataset OfficeHome --s_idx 1 --t_idx 3 --lr 0.001 --lam_psd $lam_psd --target_label_type OPDA
26 | python train_target.py --dataset OfficeHome --s_idx 2 --t_idx 0 --lr 0.001 --lam_psd $lam_psd --target_label_type OPDA
27 | python train_target.py --dataset OfficeHome --s_idx 2 --t_idx 1 --lr 0.001 --lam_psd $lam_psd --target_label_type OPDA
28 | python train_target.py --dataset OfficeHome --s_idx 2 --t_idx 3 --lr 0.001 --lam_psd $lam_psd --target_label_type OPDA
29 | python train_target.py --dataset OfficeHome --s_idx 3 --t_idx 0 --lr 0.001 --lam_psd $lam_psd --target_label_type OPDA
30 | python train_target.py --dataset OfficeHome --s_idx 3 --t_idx 1 --lr 0.001 --lam_psd $lam_psd --target_label_type OPDA
31 | python train_target.py --dataset OfficeHome --s_idx 3 --t_idx 2 --lr 0.001 --lam_psd $lam_psd --target_label_type OPDA
32 |
33 | echo "OPDA Adaptation ON DomainNet"
34 | python train_target.py --dataset DomainNet --s_idx 0 --t_idx 1 --lr 0.0001 --lam_psd $lam_psd --target_label_type OPDA --epochs 10
35 | python train_target.py --dataset DomainNet --s_idx 0 --t_idx 2 --lr 0.0001 --lam_psd $lam_psd --target_label_type OPDA --epochs 10
36 | python train_target.py --dataset DomainNet --s_idx 1 --t_idx 0 --lr 0.0001 --lam_psd $lam_psd --target_label_type OPDA --epochs 10
37 | python train_target.py --dataset DomainNet --s_idx 1 --t_idx 2 --lr 0.0001 --lam_psd $lam_psd --target_label_type OPDA --epochs 10
38 | python train_target.py --dataset DomainNet --s_idx 2 --t_idx 0 --lr 0.0001 --lam_psd $lam_psd --target_label_type OPDA --epochs 10
39 | python train_target.py --dataset DomainNet --s_idx 2 --t_idx 1 --lr 0.0001 --lam_psd $lam_psd --target_label_type OPDA --epochs 10
40 |
--------------------------------------------------------------------------------
/scripts/train_target_OSDA.sh:
--------------------------------------------------------------------------------
1 | gpuid=${1:-0}
2 | random_seed=${2:-2021}
3 |
4 | # export CUDA_VISIBLE_DEVICES=$gpuid
5 |
6 | lam_psd=0.30
7 | echo "OSDA Adaptation ON VisDA"
8 | python train_target.py --backbone_arch resnet50 --lr 0.0001 --dataset VisDA --lam_psd $lam_psd --target_label_type OSDA --epochs 30
9 |
10 | echo "OSDA Adaptation ON Office"
11 | python train_target.py --dataset Office --s_idx 0 --t_idx 1 --lr 0.001 --lam_psd $lam_psd --target_label_type OSDA
12 | python train_target.py --dataset Office --s_idx 0 --t_idx 2 --lr 0.001 --lam_psd $lam_psd --target_label_type OSDA
13 | python train_target.py --dataset Office --s_idx 1 --t_idx 0 --lr 0.001 --lam_psd $lam_psd --target_label_type OSDA
14 | python train_target.py --dataset Office --s_idx 1 --t_idx 2 --lr 0.001 --lam_psd $lam_psd --target_label_type OSDA
15 | python train_target.py --dataset Office --s_idx 2 --t_idx 0 --lr 0.001 --lam_psd $lam_psd --target_label_type OSDA
16 | python train_target.py --dataset Office --s_idx 2 --t_idx 1 --lr 0.001 --lam_psd $lam_psd --target_label_type OSDA
17 |
18 | lam_psd=1.50
19 | echo "OSDA Adaptation ON Office-Home"
20 | python train_target.py --dataset OfficeHome --s_idx 0 --t_idx 1 --lr 0.001 --lam_psd $lam_psd --target_label_type OSDA
21 | python train_target.py --dataset OfficeHome --s_idx 0 --t_idx 2 --lr 0.001 --lam_psd $lam_psd --target_label_type OSDA
22 | python train_target.py --dataset OfficeHome --s_idx 0 --t_idx 3 --lr 0.001 --lam_psd $lam_psd --target_label_type OSDA
23 | python train_target.py --dataset OfficeHome --s_idx 1 --t_idx 0 --lr 0.001 --lam_psd $lam_psd --target_label_type OSDA
24 | python train_target.py --dataset OfficeHome --s_idx 1 --t_idx 2 --lr 0.001 --lam_psd $lam_psd --target_label_type OSDA
25 | python train_target.py --dataset OfficeHome --s_idx 1 --t_idx 3 --lr 0.001 --lam_psd $lam_psd --target_label_type OSDA
26 | python train_target.py --dataset OfficeHome --s_idx 2 --t_idx 0 --lr 0.001 --lam_psd $lam_psd --target_label_type OSDA
27 | python train_target.py --dataset OfficeHome --s_idx 2 --t_idx 1 --lr 0.001 --lam_psd $lam_psd --target_label_type OSDA
28 | python train_target.py --dataset OfficeHome --s_idx 2 --t_idx 3 --lr 0.001 --lam_psd $lam_psd --target_label_type OSDA
29 | python train_target.py --dataset OfficeHome --s_idx 3 --t_idx 0 --lr 0.001 --lam_psd $lam_psd --target_label_type OSDA
30 | python train_target.py --dataset OfficeHome --s_idx 3 --t_idx 1 --lr 0.001 --lam_psd $lam_psd --target_label_type OSDA
31 | python train_target.py --dataset OfficeHome --s_idx 3 --t_idx 2 --lr 0.001 --lam_psd $lam_psd --target_label_type OSDA
--------------------------------------------------------------------------------
/scripts/train_target_PDA.sh:
--------------------------------------------------------------------------------
1 | gpuid=${1:-0}
2 | random_seed=${2:-2021}
3 |
4 | # export CUDA_VISIBLE_DEVICES=$gpuid
5 |
6 | lam_psd=0.30
7 | echo "PDA Adaptation ON VisDA"
8 | python train_target.py --backbone_arch resnet50 --lr 0.0001 --dataset VisDA --lam_psd $lam_psd --target_label_type PDA --epochs 20
9 |
10 | echo "PDA Adaptation ON Office"
11 | python train_target.py --dataset Office --s_idx 0 --t_idx 1 --lr 0.001 --lam_psd $lam_psd --target_label_type PDA
12 | python train_target.py --dataset Office --s_idx 0 --t_idx 2 --lr 0.001 --lam_psd $lam_psd --target_label_type PDA
13 | python train_target.py --dataset Office --s_idx 1 --t_idx 0 --lr 0.001 --lam_psd $lam_psd --target_label_type PDA
14 | python train_target.py --dataset Office --s_idx 1 --t_idx 2 --lr 0.001 --lam_psd $lam_psd --target_label_type PDA
15 | python train_target.py --dataset Office --s_idx 2 --t_idx 0 --lr 0.001 --lam_psd $lam_psd --target_label_type PDA
16 | python train_target.py --dataset Office --s_idx 2 --t_idx 1 --lr 0.001 --lam_psd $lam_psd --target_label_type PDA
17 |
18 | lam_psd=1.50
19 | echo "PDA Adaptation ON Office-Home"
20 | python train_target.py --dataset OfficeHome --s_idx 0 --t_idx 1 --lr 0.001 --lam_psd $lam_psd --target_label_type PDA
21 | python train_target.py --dataset OfficeHome --s_idx 0 --t_idx 2 --lr 0.001 --lam_psd $lam_psd --target_label_type PDA
22 | python train_target.py --dataset OfficeHome --s_idx 0 --t_idx 3 --lr 0.001 --lam_psd $lam_psd --target_label_type PDA
23 | python train_target.py --dataset OfficeHome --s_idx 1 --t_idx 0 --lr 0.001 --lam_psd $lam_psd --target_label_type PDA
24 | python train_target.py --dataset OfficeHome --s_idx 1 --t_idx 2 --lr 0.001 --lam_psd $lam_psd --target_label_type PDA
25 | python train_target.py --dataset OfficeHome --s_idx 1 --t_idx 3 --lr 0.001 --lam_psd $lam_psd --target_label_type PDA
26 | python train_target.py --dataset OfficeHome --s_idx 2 --t_idx 0 --lr 0.001 --lam_psd $lam_psd --target_label_type PDA
27 | python train_target.py --dataset OfficeHome --s_idx 2 --t_idx 1 --lr 0.001 --lam_psd $lam_psd --target_label_type PDA
28 | python train_target.py --dataset OfficeHome --s_idx 2 --t_idx 3 --lr 0.001 --lam_psd $lam_psd --target_label_type PDA
29 | python train_target.py --dataset OfficeHome --s_idx 3 --t_idx 0 --lr 0.001 --lam_psd $lam_psd --target_label_type PDA
30 | python train_target.py --dataset OfficeHome --s_idx 3 --t_idx 1 --lr 0.001 --lam_psd $lam_psd --target_label_type PDA
31 | python train_target.py --dataset OfficeHome --s_idx 3 --t_idx 2 --lr 0.001 --lam_psd $lam_psd --target_label_type PDA
--------------------------------------------------------------------------------
/train_source.py:
--------------------------------------------------------------------------------
1 | import os
2 | import shutil
3 | import torch
4 | import numpy as np
5 | from tqdm import tqdm
6 | from model.SFUniDA import SFUniDA
7 | from dataset.dataset import SFUniDADataset
8 | from torch.utils.data.dataloader import DataLoader
9 |
10 | from config.model_config import build_args
11 | from utils.net_utils import set_logger, set_random_seed
12 | from utils.net_utils import compute_h_score, CrossEntropyLabelSmooth
13 |
14 | def op_copy(optimizer):
15 | for param_group in optimizer.param_groups:
16 | param_group['lr0'] = param_group['lr']
17 | return optimizer
18 |
19 | def lr_scheduler(optimizer, iter_num, max_iter, gamma=10, power=0.75):
20 | decay = (1 + gamma * iter_num / max_iter) ** (-power)
21 | for param_group in optimizer.param_groups:
22 | param_group['lr'] = param_group['lr0'] * decay
23 | param_group['weight_decay'] = 1e-3
24 | param_group['momentum'] = 0.9
25 | param_group['nesterov'] = True
26 | return optimizer
27 |
28 | def train(args, model, dataloader, criterion, optimizer, epoch_idx=0.0):
29 | model.train()
30 | loss_stack = []
31 |
32 | iter_idx = epoch_idx * len(dataloader)
33 | iter_max = args.epochs * len(dataloader)
34 |
35 | for imgs_train, _, imgs_label, _ in tqdm(dataloader, ncols=60):
36 |
37 | iter_idx += 1
38 | imgs_train = imgs_train.cuda()
39 | imgs_label = imgs_label.cuda()
40 |
41 | _, pred_cls = model(imgs_train, apply_softmax=True)
42 | imgs_onehot_label = torch.zeros_like(pred_cls).scatter(1, imgs_label.unsqueeze(1), 1)
43 |
44 | loss = criterion(pred_cls, imgs_onehot_label)
45 |
46 | lr_scheduler(optimizer, iter_idx, iter_max)
47 | optimizer.zero_grad()
48 | loss.backward()
49 | optimizer.step()
50 |
51 | loss_stack.append(loss.cpu().item())
52 |
53 | train_loss = np.mean(loss_stack)
54 |
55 | return train_loss
56 |
57 | @torch.no_grad()
58 | def test(args, model, dataloader, src_flg=True):
59 | model.eval()
60 | gt_label_stack = []
61 | pred_cls_stack = []
62 |
63 | if src_flg:
64 | class_list = args.source_class_list
65 | open_flg = False
66 | else:
67 | class_list = args.target_class_list
68 | open_flg = args.target_private_class_num > 0
69 |
70 | for _, imgs_test, imgs_label, _ in tqdm(dataloader, ncols=60):
71 |
72 | imgs_test = imgs_test.cuda()
73 | _, pred_cls = model(imgs_test, apply_softmax=True)
74 | gt_label_stack.append(imgs_label)
75 | pred_cls_stack.append(pred_cls.cpu())
76 |
77 | gt_label_all = torch.cat(gt_label_stack, dim=0) #[N]
78 | pred_cls_all = torch.cat(pred_cls_stack, dim=0) #[N, C]
79 |
80 | h_score, known_acc,\
81 | unknown_acc, per_cls_acc = compute_h_score(args, class_list, gt_label_all, pred_cls_all, open_flg, open_thresh=0.50)
82 |
83 | return h_score, known_acc, unknown_acc, per_cls_acc
84 |
85 | def main(args):
86 |
87 | os.environ['CUDA_VISIBLE_DEVICES'] = args.gpu
88 | this_dir = os.path.join(os.path.dirname(__file__), ".")
89 |
90 | model = SFUniDA(args)
91 | if args.checkpoint is not None and os.path.isfile(args.checkpoint):
92 | save_dir = os.path.dirname(args.checkpoint)
93 | checkpoint = torch.load(args.checkpoint, map_location=torch.device("cpu"))
94 | model.load_state_dict(checkpoint["model_state_dict"])
95 | else:
96 | save_dir = os.path.join(this_dir, "checkpoints_glc", args.dataset, "source_{}".format(args.s_idx),
97 | "source_{}_{}".format(args.source_train_type, args.target_label_type))
98 |
99 | if not os.path.isdir(save_dir):
100 | os.makedirs(save_dir)
101 |
102 | model.cuda()
103 | args.save_dir = save_dir
104 | logger = set_logger(args, log_name="log_source_training.txt")
105 |
106 | params_group = []
107 | for k, v in model.backbone_layer.named_parameters():
108 | params_group += [{"params":v, 'lr':args.lr*0.1}]
109 | for k, v in model.feat_embed_layer.named_parameters():
110 | params_group += [{"params":v, 'lr':args.lr}]
111 | for k, v in model.class_layer.named_parameters():
112 | params_group += [{"params":v, 'lr':args.lr}]
113 |
114 | optimizer = torch.optim.SGD(params_group)
115 | optimizer = op_copy(optimizer)
116 |
117 | source_data_list = open(os.path.join(args.source_data_dir, "image_unida_list.txt"), "r").readlines()
118 | source_dataset = SFUniDADataset(args, args.source_data_dir, source_data_list, d_type="source", preload_flg=True)
119 | source_dataloader = DataLoader(source_dataset, batch_size=args.batch_size, shuffle=True,
120 | num_workers=args.num_workers, drop_last=True)
121 |
122 | target_dataloader_list = []
123 | for idx in range(len(args.target_domain_dir_list)):
124 | target_data_dir = args.target_domain_dir_list[idx]
125 | target_data_list = open(os.path.join(target_data_dir, "image_unida_list.txt"), "r").readlines()
126 | target_dataset = SFUniDADataset(args, target_data_dir, target_data_list, d_type="target", preload_flg=False)
127 | target_dataloader_list.append(DataLoader(target_dataset, batch_size=args.batch_size, shuffle=False,
128 | num_workers=args.num_workers, drop_last=False))
129 |
130 | if args.source_train_type == "smooth":
131 | criterion = CrossEntropyLabelSmooth(num_classes=args.class_num, epsilon=0.1, reduction=True)
132 | elif args.source_train_type == "vanilla":
133 | criterion = CrossEntropyLabelSmooth(num_classes=args.class_num, epsilon=0.0, reduction=True)
134 | else:
135 | raise ValueError("Unknown source_train_type:", args.source_train_type)
136 |
137 | notation_str = "\n=================================================\n"
138 | notation_str += " START TRAINING ON THE SOURCE:{} == {} \n".format(args.s_idx, args.target_label_type)
139 | notation_str += "================================================="
140 |
141 | logger.info(notation_str)
142 |
143 | for epoch_idx in tqdm(range(args.epochs), ncols=60):
144 |
145 | train_loss = train(args, model, source_dataloader, criterion, optimizer, epoch_idx)
146 | logger.info("Epoch:{}/{} train_loss:{:.3f}".format(epoch_idx, args.epochs, train_loss))
147 |
148 | if epoch_idx % 1 == 0:
149 | # EVALUATE ON SOURCE
150 | source_h_score, source_known_acc, source_unknown_acc, src_per_cls_acc = test(args, model, source_dataloader, src_flg=True)
151 | logger.info("EVALUATE ON SOURCE: H-Score:{:.3f}, KnownAcc:{:.3f}, UnknownAcc:{:.3f}".\
152 | format(source_h_score, source_known_acc, source_unknown_acc))
153 | if args.dataset == "VisDA":
154 | logger.info("VISDA PER_CLS_ACC:")
155 | logger.info(src_per_cls_acc)
156 |
157 | checkpoint_file = "latest_source_checkpoint.pth"
158 | torch.save({
159 | "epoch":epoch_idx,
160 | "model_state_dict":model.state_dict()}, os.path.join(save_dir, checkpoint_file))
161 |
162 | for idx_i, item in enumerate(args.target_domain_list):
163 | notation_str = "\n=================================================\n"
164 | notation_str += " EVALUATE ON THE TARGET:{} \n".format(item)
165 | notation_str += "================================================="
166 | logger.info(notation_str)
167 |
168 | hscore, knownacc, unknownacc, _ = test(args, model, target_dataloader_list[idx_i], src_flg=False)
169 | logger.info("H-Score:{:.3f}, KnownAcc:{:.3f}, UnknownACC:{:.3f}".format(hscore, knownacc, unknownacc))
170 |
171 | if __name__ == "__main__":
172 | args = build_args()
173 | set_random_seed(args.seed)
174 | main(args)
--------------------------------------------------------------------------------
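
train_source.py above and train_target.py below share the same per-iteration polynomial learning-rate schedule, lr = lr0 * (1 + gamma * iter_num / max_iter) ** (-power) with gamma=10 and power=0.75. A small standalone sketch of the resulting decay factor:

# Sketch of the decay factor produced by lr_scheduler in train_source.py / train_target.py.
def decay_factor(iter_num, max_iter, gamma=10, power=0.75):
    return (1 + gamma * iter_num / max_iter) ** (-power)

max_iter = 1000
for it in (0, 100, 500, 1000):
    print(f"iter {it:4d}: lr multiplier {decay_factor(it, max_iter):.3f}")
# iter 0 keeps the base lr (factor 1.000); iter 1000 gives (1 + 10) ** -0.75 ≈ 0.166.
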
/train_target.py:
--------------------------------------------------------------------------------
1 | import os
2 | import faiss
3 | import torch
4 | import shutil
5 | import numpy as np
6 |
7 | from tqdm import tqdm
8 | from model.SFUniDA import SFUniDA
9 | from dataset.dataset import SFUniDADataset
10 | from torch.utils.data.dataloader import DataLoader
11 |
12 | from config.model_config import build_args
13 | from utils.net_utils import set_logger, set_random_seed
14 | from utils.net_utils import compute_h_score, Entropy
15 |
16 | from sklearn.cluster import KMeans
17 | from sklearn.metrics import silhouette_score
18 | from sklearn.manifold import TSNE
19 |
20 | def op_copy(optimizer):
21 | for param_group in optimizer.param_groups:
22 | param_group['lr0'] = param_group['lr']
23 | return optimizer
24 |
25 | def lr_scheduler(optimizer, iter_num, max_iter, gamma=10, power=0.75):
26 | decay = (1 + gamma * iter_num / max_iter) ** (-power)
27 | for param_group in optimizer.param_groups:
28 | param_group['lr'] = param_group['lr0'] * decay
29 | param_group['weight_decay'] = 1e-3
30 | param_group['momentum'] = 0.9
31 | param_group['nesterov'] = True
32 | return optimizer
33 |
34 | best_score = 0.0
35 | best_coeff = 1.0
36 |
37 | @torch.no_grad()
38 | def obtain_global_pseudo_labels(args, model, dataloader, epoch_idx=0.0):
39 | model.eval()
40 |
41 | pred_cls_bank = []
42 | gt_label_bank = []
43 | embed_feat_bank = []
44 | class_list = args.target_class_list
45 |
46 | args.logger.info("Generating one-vs-all global clustering pseudo labels...")
47 |
48 | for _, imgs_test, imgs_label, _ in tqdm(dataloader, ncols=60):
49 |
50 | imgs_test = imgs_test.cuda()
51 | embed_feat, pred_cls = model(imgs_test, apply_softmax=True)
52 | pred_cls_bank.append(pred_cls)
53 | embed_feat_bank.append(embed_feat)
54 | gt_label_bank.append(imgs_label.cuda())
55 |
56 | pred_cls_bank = torch.cat(pred_cls_bank, dim=0) #[N, C]
57 | gt_label_bank = torch.cat(gt_label_bank, dim=0) #[N]
58 | embed_feat_bank = torch.cat(embed_feat_bank, dim=0) #[N, D]
59 | embed_feat_bank = embed_feat_bank / torch.norm(embed_feat_bank, p=2, dim=1, keepdim=True)
60 |
61 | global best_score
62 | global best_coeff
63 |     # At the first epoch, we need to determine the number of categories in the target domain, i.e., C_t in our paper.
64 |     # Here, we use the silhouette score to estimate it.
65 | if epoch_idx == 0.0:
66 | embed_feat_bank_cpu = embed_feat_bank.cpu().numpy()
67 |
68 | if args.dataset == "VisDA" or args.dataset == "DomainNet":
69 | # np.random.seed(2021)
70 | data_size = embed_feat_bank_cpu.shape[0]
71 | sample_idxs = np.random.choice(data_size, data_size//3, replace=False)
72 | embed_feat_bank_cpu = embed_feat_bank_cpu[sample_idxs, :]
73 |
74 | embed_feat_bank_cpu = TSNE(n_components=2, init="pca", random_state=0).fit_transform(embed_feat_bank_cpu)
75 | coeff_list = [0.25, 0.50, 1, 2, 3]
76 |
77 | for coeff in coeff_list:
78 | KK = max(int(args.class_num * coeff), 2)
79 | kmeans = KMeans(n_clusters=KK, random_state=0).fit(embed_feat_bank_cpu)
80 | cluster_labels = kmeans.labels_
81 | sil_score = silhouette_score(embed_feat_bank_cpu, cluster_labels)
82 |
83 | if sil_score > best_score:
84 | best_score = sil_score
85 | best_coeff = coeff
86 |
87 | KK = int(args.class_num * best_coeff)
88 |
89 | data_num = pred_cls_bank.shape[0]
90 | pos_topk_num = int(data_num / args.class_num / best_coeff)
91 | sorted_pred_cls, sorted_pred_cls_idxs = torch.sort(pred_cls_bank, dim=0, descending=True)
92 | pos_topk_idxs = sorted_pred_cls_idxs[:pos_topk_num, :].t() #[C, pos_topk_num]
93 | neg_topk_idxs = sorted_pred_cls_idxs[pos_topk_num:, :].t() #[C, neg_topk_num]
94 |
95 | pos_topk_idxs = pos_topk_idxs.unsqueeze(2).expand([-1, -1, args.embed_feat_dim]) #[C, pos_topk_num, D]
96 | neg_topk_idxs = neg_topk_idxs.unsqueeze(2).expand([-1, -1, args.embed_feat_dim]) #[C, neg_topk_num, D]
97 |
98 | embed_feat_bank_expand = embed_feat_bank.unsqueeze(0).expand([args.class_num, -1, -1]) #[C, N, D]
99 | pos_feat_sample = torch.gather(embed_feat_bank_expand, 1, pos_topk_idxs)
100 |
101 | pos_cls_prior = torch.mean(sorted_pred_cls[:(pos_topk_num), :], dim=0, keepdim=True).t() * (1.0 - args.rho) + args.rho
102 |
103 | args.logger.info("POS_CLS_PRIOR:\t" + "\t".join(["{:.3f}".format(item) for item in pos_cls_prior.cpu().squeeze().numpy()]))
104 |
105 | pos_feat_proto = torch.mean(pos_feat_sample, dim=1, keepdim=True) #[C, 1, D]
106 | pos_feat_proto = pos_feat_proto / torch.norm(pos_feat_proto, p=2, dim=-1, keepdim=True)
107 |
108 | faiss_kmeans = faiss.Kmeans(args.embed_feat_dim, KK, niter=100, verbose=False, min_points_per_centroid=1, gpu=False)
109 |
110 | feat_proto_pos_simi = torch.zeros((data_num, args.class_num)).cuda() #[N, C]
111 | feat_proto_max_simi = torch.zeros((data_num, args.class_num)).cuda() #[N, C]
112 | feat_proto_max_idxs = torch.zeros((data_num, args.class_num)).cuda() #[N, C]
113 |
114 | # One-vs-all class pseudo-labeling
115 | for cls_idx in range(args.class_num):
116 | neg_feat_cls_sample_np = torch.gather(embed_feat_bank, 0, neg_topk_idxs[cls_idx, :]).cpu().numpy()
117 | faiss_kmeans.train(neg_feat_cls_sample_np)
118 | cls_neg_feat_proto = torch.from_numpy(faiss_kmeans.centroids).cuda()
119 | cls_neg_feat_proto = cls_neg_feat_proto / torch.norm(cls_neg_feat_proto, p=2, dim=-1, keepdim=True)#[K, D]
120 | cls_pos_feat_proto = pos_feat_proto[cls_idx, :] #[1, D]
121 |
122 | cls_pos_feat_proto_simi = torch.einsum("nd, kd -> nk", embed_feat_bank, cls_pos_feat_proto) #[N, 1]
123 | cls_neg_feat_proto_simi = torch.einsum("nd, kd -> nk", embed_feat_bank, cls_neg_feat_proto) #[N, K]
124 | cls_pos_feat_proto_simi = cls_pos_feat_proto_simi * pos_cls_prior[cls_idx] #[N, 1]
125 |
126 | cls_feat_proto_simi = torch.cat([cls_pos_feat_proto_simi, cls_neg_feat_proto_simi], dim=1) #[N, 1+K]
127 |
128 | feat_proto_pos_simi[:, cls_idx] = cls_feat_proto_simi[:, 0]
129 | maxsimi, maxidxs = torch.max(cls_feat_proto_simi, dim=-1)
130 | feat_proto_max_simi[:, cls_idx] = maxsimi
131 | feat_proto_max_idxs[:, cls_idx] = maxidxs
132 |
133 |     # We use psd_label_prior_simi to decide whether the hard pseudo-label is one-hot or a uniform distribution.
134 | psd_label_prior_simi = torch.einsum("nd, cd -> nc", embed_feat_bank, pos_feat_proto.squeeze(1))
135 | psd_label_prior_idxs = torch.max(psd_label_prior_simi, dim=-1, keepdim=True)[1] #[N] ~ (0, class_num-1)
136 | psd_label_prior = torch.zeros_like(psd_label_prior_simi).scatter(1, psd_label_prior_idxs, 1.0) # one_hot prior #[N, C]
137 |
138 | hard_psd_label_bank = feat_proto_max_idxs # [N, C] ~ (0, K)
139 | hard_psd_label_bank = (hard_psd_label_bank == 0).float()
140 | hard_psd_label_bank = hard_psd_label_bank * psd_label_prior #[N, C]
141 |
142 | hard_label = torch.argmax(hard_psd_label_bank, dim=-1) #[N]
143 | hard_label_unk = torch.sum(hard_psd_label_bank, dim=-1)
144 | hard_label_unk = (hard_label_unk == 0)
145 | hard_label[hard_label_unk] = args.class_num
146 |
147 | hard_psd_label_bank[hard_label_unk, :] += 1.0
148 | hard_psd_label_bank = hard_psd_label_bank / (torch.sum(hard_psd_label_bank, dim=-1, keepdim=True) + 1e-4)
149 |
150 | hard_psd_label_bank = hard_psd_label_bank.cuda()
151 |
152 | per_class_num = np.zeros((len(class_list)))
153 | pre_class_num = np.zeros_like(per_class_num)
154 | per_class_correct = np.zeros_like(per_class_num)
155 | for i, label in enumerate(class_list):
156 | label_idx = torch.where(gt_label_bank == label)[0]
157 | correct_idx = torch.where(hard_label[label_idx] == label)[0]
158 | pre_class_num[i] = float(len(torch.where(hard_label == label)[0]))
159 | per_class_num[i] = float(len(label_idx))
160 | per_class_correct[i] = float(len(correct_idx))
161 | per_class_acc = per_class_correct / (per_class_num + 1e-5)
162 |
163 | args.logger.info("PSD AVG ACC:\t" + "{:.3f}".format(np.mean(per_class_acc)))
164 | args.logger.info("PSD PER ACC:\t" + "\t".join(["{:.3f}".format(item) for item in per_class_acc]))
165 | args.logger.info("PER CLS NUM:\t" + "\t".join(["{:.0f}".format(item) for item in per_class_num]))
166 | args.logger.info("PRE CLS NUM:\t" + "\t".join(["{:.0f}".format(item) for item in pre_class_num]))
167 | args.logger.info("PRE ACC NUM:\t" + "\t".join(["{:.0f}".format(item) for item in per_class_correct]))
168 |
169 | return hard_psd_label_bank, pred_cls_bank, embed_feat_bank
170 |
171 |
172 | def train(args, model, train_dataloader, test_dataloader, optimizer, epoch_idx=0.0):
173 |
174 | model.eval()
175 |     hard_psd_label_bank, pred_cls_bank, embed_feat_bank = obtain_global_pseudo_labels(args, model, test_dataloader, epoch_idx)
176 | model.train()
177 |
178 | local_KNN = args.local_K
179 | all_pred_loss_stack = []
180 | psd_pred_loss_stack = []
181 | knn_pred_loss_stack = []
182 |
183 | iter_idx = epoch_idx * len(train_dataloader)
184 | iter_max = args.epochs * len(train_dataloader)
185 |
186 | for imgs_train, _, _, imgs_idx in tqdm(train_dataloader, ncols=60):
187 |
188 | iter_idx += 1
189 | imgs_idx = imgs_idx.cuda()
190 | imgs_train = imgs_train.cuda()
191 |
192 | hard_psd_label = hard_psd_label_bank[imgs_idx] #[B, C]
193 |
194 | embed_feat, pred_cls = model(imgs_train, apply_softmax=True)
195 |
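# global loss: cross-entropy between the current prediction and the (soft) global pseudo-label.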
196 | psd_pred_loss = torch.sum(-hard_psd_label * torch.log(pred_cls + 1e-5), dim=-1).mean()
197 |
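# local loss: look up the local_K nearest neighbors of each sample in the feature bank
# (dropping the top-1 match, normally the sample itself), average their stored predictions,
# and use that average as a soft target; the banks are then refreshed with the current batch.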
198 | with torch.no_grad():
199 | embed_feat = embed_feat / torch.norm(embed_feat, p=2, dim=-1, keepdim=True)
200 | feat_dist = torch.einsum("bd, nd -> bn", embed_feat, embed_feat_bank) #[B, N]
201 | nn_feat_idx = torch.topk(feat_dist, k=local_KNN+1, dim=-1, largest=True)[-1] #[B, local_KNN+1]
202 | nn_feat_idx = nn_feat_idx[:, 1:] #[B, local_KNN]
203 | nn_pred_cls = torch.mean(pred_cls_bank[nn_feat_idx], dim=1) #[B, C]
204 | # update the pred_cls and embed_feat bank
205 | pred_cls_bank[imgs_idx] = pred_cls
206 | embed_feat_bank[imgs_idx] = embed_feat
207 |
208 | knn_pred_loss = torch.sum(-nn_pred_cls * torch.log(pred_cls + 1e-5), dim=-1).mean()
209 |
210 | loss = args.lam_psd * psd_pred_loss + args.lam_knn * knn_pred_loss
211 |
212 | lr_scheduler(optimizer, iter_idx, iter_max)
213 | optimizer.zero_grad()
214 | loss.backward()
215 | optimizer.step()
216 |
217 | all_pred_loss_stack.append(loss.cpu().item())
218 | psd_pred_loss_stack.append(psd_pred_loss.cpu().item())
219 | knn_pred_loss_stack.append(knn_pred_loss.cpu().item())
220 |
221 | train_loss_dict = {}
222 | train_loss_dict["all_pred_loss"] = np.mean(all_pred_loss_stack)
223 | train_loss_dict["psd_pred_loss"] = np.mean(psd_pred_loss_stack)
224 | train_loss_dict["knn_pred_loss"] = np.mean(knn_pred_loss_stack)
225 |
226 | return train_loss_dict
227 |
228 | @torch.no_grad()
229 | def test(args, model, dataloader, src_flg=False):
230 |
231 | model.eval()
232 | gt_label_stack = []
233 | pred_cls_stack = []
234 |
235 | if src_flg:
236 | class_list = args.source_class_list
237 | open_flg = False
238 | else:
239 | class_list = args.target_class_list
240 | open_flg = args.target_private_class_num > 0
241 |
242 | for _, imgs_test, imgs_label, _ in tqdm(dataloader, ncols=60):
243 |
244 | imgs_test = imgs_test.cuda()
245 | _, pred_cls = model(imgs_test, apply_softmax=True)
246 | gt_label_stack.append(imgs_label)
247 | pred_cls_stack.append(pred_cls.cpu())
248 |
249 | gt_label_all = torch.cat(gt_label_stack, dim=0) #[N]
250 | pred_cls_all = torch.cat(pred_cls_stack, dim=0) #[N, C]
251 |
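# args.w_0 acts as the open-set threshold: inside compute_h_score, samples whose normalized
# prediction entropy exceeds it are rejected as "unknown".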
252 | h_score, known_acc, unknown_acc, _ = compute_h_score(args, class_list, gt_label_all, pred_cls_all, open_flg, open_thresh=args.w_0)
253 | return h_score, known_acc, unknown_acc
254 |
255 | def main(args):
256 |
257 | os.environ["CUDA_VISIBLE_DEVICES"] = args.gpu
258 | this_dir = os.path.join(os.path.dirname(__file__), ".")
259 |
260 | model = SFUniDA(args)
261 |
262 | if args.checkpoint is not None and os.path.isfile(args.checkpoint):
263 | checkpoint = torch.load(args.checkpoint, map_location=torch.device("cpu"))
264 | model.load_state_dict(checkpoint["model_state_dict"])
265 | else:
266 | print(args.checkpoint)
267 | raise ValueError("YOU MUST SET THE APPROPRIATE SOURCE CHECKPOINT FOR TARGET MODEL ADAPTATION!!!")
268 |
269 | model = model.cuda()
270 | save_dir = os.path.join(this_dir, "checkpoints_glc", args.dataset, "s_{}_t_{}".format(args.s_idx, args.t_idx),
271 | args.target_label_type, args.note)
272 |
273 | if not os.path.isdir(save_dir):
274 | os.makedirs(save_dir)
275 | args.save_dir = save_dir
276 | args.logger = set_logger(args, log_name="log_target_training.txt")
277 |
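# Optimizer setup: the backbone is tuned at 0.1x the base learning rate, the feature-embedding
# layer at the full rate, and the source classifier is kept frozen during target adaptation.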
278 | param_group = []
279 | for k, v in model.backbone_layer.named_parameters():
280 | param_group += [{'params': v, 'lr': args.lr*0.1}]
281 |
282 | for k, v in model.feat_embed_layer.named_parameters():
283 | param_group += [{'params': v, 'lr': args.lr}]
284 |
285 | for k, v in model.class_layer.named_parameters():
286 | v.requires_grad = False
287 |
288 | optimizer = torch.optim.SGD(param_group)
289 | optimizer = op_copy(optimizer)
290 |
291 | target_data_list = open(os.path.join(args.target_data_dir, "image_unida_list.txt"), "r").readlines()
292 | target_dataset = SFUniDADataset(args, args.target_data_dir, target_data_list, d_type="target", preload_flg=True)
293 |
294 | target_train_dataloader = DataLoader(target_dataset, batch_size=args.batch_size, shuffle=True,
295 | num_workers=args.num_workers, drop_last=True)
296 | target_test_dataloader = DataLoader(target_dataset, batch_size=args.batch_size*2, shuffle=False,
297 | num_workers=args.num_workers, drop_last=False)
298 |
299 | notation_str = "\n=======================================================\n"
300 | notation_str += " START TRAINING ON THE TARGET:{} BASED ON SOURCE:{} \n".format(args.t_idx, args.s_idx)
301 | notation_str += "======================================================="
302 |
303 | args.logger.info(notation_str)
304 | best_h_score = 0.0
305 | best_known_acc = 0.0
306 | best_unknown_acc = 0.0
307 | best_epoch_idx = 0
308 | for epoch_idx in tqdm(range(args.epochs), ncols=60):
309 | # Train on target
310 | loss_dict = train(args, model, target_train_dataloader, target_test_dataloader, optimizer, epoch_idx)
311 | args.logger.info("Epoch: {}/{}, train_all_loss:{:.3f},\n\
312 | train_psd_loss:{:.3f}, train_knn_loss:{:.3f},".format(epoch_idx+1, args.epochs,
313 | loss_dict["all_pred_loss"], loss_dict["psd_pred_loss"], loss_dict["knn_pred_loss"]))
314 |
315 | # Evaluate on target
316 | hscore, knownacc, unknownacc = test(args, model, target_test_dataloader, src_flg=False)
317 | args.logger.info("Current: H-Score:{:.3f}, KnownAcc:{:.3f}, UnknownAcc:{:.3f}".format(hscore, knownacc, unknownacc))
318 |
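# Model selection: PDA/CLDA settings have no unknown class, so the best result is tracked
# by known-class accuracy; otherwise the H-score is used.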
319 | if args.target_label_type == 'PDA' or args.target_label_type == 'CLDA':
320 | if knownacc >= best_known_acc:
321 | best_h_score = hscore
322 | best_known_acc = knownacc
323 | best_unknown_acc = unknownacc
324 | best_epoch_idx = epoch_idx
325 |
326 | # checkpoint_file = "{}_SFDA_best_target_checkpoint.pth".format(args.dataset)
327 | # torch.save({
328 | # "epoch":epoch_idx,
329 | # "model_state_dict":model.state_dict()}, os.path.join(save_dir, checkpoint_file))
330 | else:
331 | if hscore >= best_h_score:
332 | best_h_score = hscore
333 | best_known_acc = knownacc
334 | best_unknown_acc = unknownacc
335 | best_epoch_idx = epoch_idx
336 |
337 | # checkpoint_file = "{}_SFDA_best_target_checkpoint.pth".format(args.dataset)
338 | # torch.save({
339 | # "epoch":epoch_idx,
340 | # "model_state_dict":model.state_dict()}, os.path.join(save_dir, checkpoint_file))
341 |
342 | args.logger.info("Best : H-Score:{:.3f}, KnownAcc:{:.3f}, UnknownAcc:{:.3f}".format(best_h_score, best_known_acc, best_unknown_acc))
343 |
344 | if __name__ == "__main__":
345 | args = build_args()
346 | set_random_seed(args.seed)
347 |
348 | # SET THE CHECKPOINT
349 | args.checkpoint = os.path.join("checkpoints_glc", args.dataset, "source_{}".format(args.s_idx),\
350 | "source_{}_{}".format(args.source_train_type, args.target_label_type),
351 | "latest_source_checkpoint.pth")
352 | main(args)
353 |
--------------------------------------------------------------------------------
/utils/net_utils.py:
--------------------------------------------------------------------------------
1 | import os
2 | import sys
3 | import logging
4 |
5 | import torch
6 | import random
7 | import numpy as np
8 | import torch.nn as nn
9 |
10 | def set_random_seed(seed=0):
11 | random.seed(seed)
12 | np.random.seed(seed)
13 | torch.manual_seed(seed)
14 | if torch.cuda.is_available():
15 | torch.cuda.manual_seed(seed)
16 | torch.cuda.manual_seed_all(seed)
17 |
18 | torch.backends.cudnn.benchmark = False
19 | torch.backends.cudnn.deterministic = True
20 |
21 | def Entropy(input_):
22 | bs = input_.size(0)
23 | epsilon = 1e-5
24 | entropy = -input_ * torch.log(input_ + epsilon)
25 | entropy = torch.sum(entropy, dim=1)
26 | return entropy
27 |
28 | def log_args(args):
29 | s = "\n==========================================\n"
30 |
31 | s += ("python" + " ".join(sys.argv) + "\n")
32 |
33 | for arg, content in args.__dict__.items():
34 | s += "{}:{}\n".format(arg, content)
35 |
36 | s += "==========================================\n"
37 |
38 | return s
39 |
40 | def set_logger(args, log_name="train_log.txt"):
41 |
42 | # creating logger.
43 | logger = logging.getLogger(__name__)
44 | logger.setLevel(logging.DEBUG)
45 |
46 | # file logger handler
47 | if args.test:
48 | # Append the test results on existing logging file.
49 | file_handler = logging.FileHandler(os.path.join(args.save_dir, log_name), mode="a")
50 | file_format = logging.Formatter("%(message)s")
51 | file_handler.setLevel(logging.DEBUG)
52 | file_handler.setFormatter(file_format)
53 | else:
54 | # Init the logging file.
55 | file_handler = logging.FileHandler(os.path.join(args.save_dir, log_name), mode="w")
56 |
57 | file_format = logging.Formatter("%(asctime)s [%(levelname)s] - %(message)s")
58 | file_handler.setLevel(logging.DEBUG)
59 | file_handler.setFormatter(file_format)
60 |
61 | # terminal logger handler
62 | terminal_handler = logging.StreamHandler()
63 | terminal_format = logging.Formatter("%(asctime)s [%(levelname)s] - %(message)s")
64 | terminal_handler.setLevel(logging.INFO)
65 | terminal_handler.setFormatter(terminal_format)
66 |
67 | logger.addHandler(file_handler)
68 | logger.addHandler(terminal_handler)
69 | if not args.test:
70 | logger.debug(log_args(args))
71 |
72 | return logger
73 |
74 | def compute_h_score(args, class_list, gt_label_all, pred_cls_all, open_flag=True, open_thresh=0.5, pred_unc_all=None):
75 |
76 | # class_list:
77 | # :source [0, 1, ..., N_share - 1, ..., N_share + N_src_private - 1]
78 | # :target [0, 1, ..., N_share - 1, N_share + N_src_private + N_tar_private -1]
79 | # gt_label_all [N]
80 | # pred_cls_all [N, C]
81 | # open_flag True/False
82 | # pred_unc_all [N], optional, with values in [0, 1]
83 |
84 | per_class_num = np.zeros((len(class_list)))
85 | per_class_correct = np.zeros_like(per_class_num)
86 | pred_label_all = torch.max(pred_cls_all, dim=1)[1] #[N]
87 |
88 | if open_flag:
89 | cls_num = pred_cls_all.shape[1]
90 |
91 | if pred_unc_all is None:
92 | # If no pred_unc_all tensor is given,
93 | # we normalize the Shannon entropy to [0, 1] to denote the uncertainty.
94 | pred_unc_all = Entropy(pred_cls_all) / np.log(cls_num) # [N]
95 |
96 | unc_idx = torch.where(pred_unc_all > open_thresh)[0]
97 | pred_label_all[unc_idx] = cls_num # set these pred results to unknown
98 |
99 | for i, label in enumerate(class_list):
100 | label_idx = torch.where(gt_label_all == label)[0]
101 | correct_idx = torch.where(pred_label_all[label_idx] == label)[0]
102 | per_class_num[i] = float(len(label_idx))
103 | per_class_correct[i] = float(len(correct_idx))
104 |
105 | per_class_acc = per_class_correct / (per_class_num + 1e-5)
106 |
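# H-score: harmonic mean of the mean known-class accuracy and the unknown-class accuracy,
# h_score = 2 * known_acc * unknown_acc / (known_acc + unknown_acc), with a small epsilon for stability.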
107 | if open_flag:
108 | known_acc = per_class_acc[:-1].mean()
109 | unknown_acc = per_class_acc[-1]
110 | h_score = 2 * known_acc * unknown_acc / (known_acc + unknown_acc + 1e-5)
111 | else:
112 | known_acc = per_class_correct.sum() / (per_class_num.sum() + 1e-5)
113 | unknown_acc = 0.0
114 | h_score = 0.0
115 |
116 | return h_score, known_acc, unknown_acc, per_class_acc
117 |
118 | class CrossEntropyLabelSmooth(nn.Module):
119 |
120 | """Cross entropy loss with label smoothing regularizer.
121 | Reference:
122 | Szegedy et al. Rethinking the Inception Architecture for Computer Vision. CVPR 2016.
123 | Equation: y = (1 - epsilon) * y + epsilon / K.
124 | Args:
125 | num_classes (int): number of classes.
126 | epsilon (float): smoothing weight.
127 | """
128 |
129 | def __init__(self, num_classes, epsilon=0.1, reduction=True):
130 | super(CrossEntropyLabelSmooth, self).__init__()
131 | self.num_classes = num_classes
132 | self.epsilon = epsilon
133 | self.reduction = reduction
134 | self.logsoftmax = nn.LogSoftmax(dim=-1)
135 |
136 | def forward(self, inputs, targets, applied_softmax=True):
137 | """
138 | Args:
139 | inputs: prediction matrix with shape (batch_size, num_classes); probabilities if applied_softmax is True, otherwise raw logits.
140 | targets: ground truth labels, one-hot/soft with shape (batch_size, num_classes) or hard labels with shape (batch_size,).
141 | """
142 | if applied_softmax:
143 | log_probs = torch.log(inputs)
144 | else:
145 | log_probs = self.logsoftmax(inputs)
146 |
147 | if inputs.shape != targets.shape:
148 | # this means that the target data shape is (B,)
149 | targets = torch.zeros_like(inputs).scatter(1, targets.unsqueeze(1), 1)
150 |
151 | targets = (1 - self.epsilon) * targets + self.epsilon / self.num_classes
152 | loss = (- targets * log_probs).sum(dim=1)
153 |
154 | if self.reduction:
155 | return loss.mean()
156 | else:
157 | return loss
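# Illustrative usage of CrossEntropyLabelSmooth (not part of the original file; the class count is
# only an example):
#   criterion = CrossEntropyLabelSmooth(num_classes=65, epsilon=0.1)
#   loss = criterion(logits, hard_labels, applied_softmax=False)  # hard_labels: LongTensor of shape [B]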
--------------------------------------------------------------------------------