├── LICENSE
├── README.md
├── VERSION.md
├── arm
│   └── flower_classification
│       ├── Makefile
│       ├── build
│       │   └── main.o
│       ├── image
│       │   ├── 00000001.jpg
│       │   ├── 00000002.jpg
│       │   ├── 00000003.jpg
│       │   ├── 00000004.jpg
│       │   ├── 00000005.jpg
│       │   ├── 00000006.jpg
│       │   ├── 00000007.jpg
│       │   ├── 00000008.jpg
│       │   ├── 00000009.jpg
│       │   ├── 00000010.jpg
│       │   ├── 00000770.jpg
│       │   ├── 00000771.jpg
│       │   ├── 00000772.jpg
│       │   ├── 00000773.jpg
│       │   ├── 00000774.jpg
│       │   ├── 00000775.jpg
│       │   ├── 00000776.jpg
│       │   ├── 00000777.jpg
│       │   ├── 00000778.jpg
│       │   ├── 00000779.jpg
│       │   ├── 00001822.jpg
│       │   ├── 00001823.jpg
│       │   ├── 00001824.jpg
│       │   ├── 00001825.jpg
│       │   ├── 00001826.jpg
│       │   ├── 00001827.jpg
│       │   ├── 00001828.jpg
│       │   ├── 00001829.jpg
│       │   ├── 00001830.jpg
│       │   ├── 00001831.jpg
│       │   ├── 00002606.jpg
│       │   ├── 00002607.jpg
│       │   ├── 00002608.jpg
│       │   ├── 00002609.jpg
│       │   ├── 00002610.jpg
│       │   ├── 00002611.jpg
│       │   ├── 00002612.jpg
│       │   ├── 00002613.jpg
│       │   ├── 00002614.jpg
│       │   ├── 00002615.jpg
│       │   ├── 00003340.jpg
│       │   ├── 00003341.jpg
│       │   ├── 00003342.jpg
│       │   ├── 00003343.jpg
│       │   ├── 00003344.jpg
│       │   ├── 00003345.jpg
│       │   ├── 00003346.jpg
│       │   ├── 00003347.jpg
│       │   ├── 00003348.jpg
│       │   ├── 00003349.jpg
│       │   └── word_list.txt
│       ├── model
│       │   └── dpu_flower_classification_0.elf
│       └── src
│           └── main.cc
├── pic_for_readme
│   ├── classification_flower.PNG
│   └── directory.PNG
└── x86
    ├── decent_q.sh
    ├── dnnc.sh
    ├── evaluate_frozen_model.py
    ├── evaluate_quantized_graph.py
    ├── evaluate_trained_model.py
    ├── flower_classification_input_fn.py
    ├── flowers
    │   └── .gitignore
    ├── freeze_model.py
    ├── load_data.py
    └── train_data.py
/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 |
3 | Copyright (c) 2019 gewuek
4 |
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 |
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 |
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # flower_classification_vai_tf_numpy_array
2 | This is a simple example showing how to train a ConvNet model on a labeled dataset with TensorFlow and then use the Vitis AI 1.2 tools to deploy the model onto a ZCU102 board.
3 | To keep things simple I deliberately let the model overfit the dataset: the training/validation/calibration data all come from the same dataset.
4 | I use plain numpy arrays as the data input and OpenCV functions to open the images during model training (a minimal sketch of this idea follows below). Please find the project using tf.data.Dataset here: [flower_classification_vai_tf_dataset](https://github.com/gewuek/flower_classification_vai_tf_dataset).
5 | The dataset is downloaded from: https://www.kaggle.com/alxmamaev/flowers-recognition
6 | You may also find the same dataset referenced in Keras tutorials: https://www.kaggle.com/alxmamaev/flowers-recognition
7 | The ARM deployment code is modified from the DNNDK ResNet50 example code.
8 |
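A minimal sketch of the numpy-array loading idea, assuming the same 128x128 input size and `flowers/<category>` layout that `x86/load_data.py` uses (the helper name `load_category` is just for illustration):

```python
import os
import cv2
import numpy as np

IMG_SIZE = 128  # images are resized to 128x128, matching the model input


def load_category(path, label):
    """Read every image in one category folder into a list of (array, label)."""
    samples = []
    for name in os.listdir(path):
        img = cv2.imread(os.path.join(path, name))  # BGR uint8 array, or None on failure
        if img is None:
            continue  # skip files OpenCV cannot decode
        samples.append((cv2.resize(img, (IMG_SIZE, IMG_SIZE)), label))
    return samples


# e.g. the daisy images become a (N, 128, 128, 3) uint8 numpy array plus a label vector
data = load_category("./flowers/daisy", 0)
X = np.array([s[0] for s in data]).reshape(-1, IMG_SIZE, IMG_SIZE, 3)
y = np.array([s[1] for s in data])
```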
9 | The whole design is trained and deployed using Ubuntu 18.04 + Vitis AI 1.2 + TensorFlow 1.15 + PetaLinux 2020.1.
10 |
11 |
12 | ***To be noticed:*** Although this design uses Xilinx tools to deploy the design on a Xilinx development board, this is just a personal release. No guarantee can be made here. :-) Please feel free to contact me or post your questions on:
13 | https://forums.xilinx.com/t5/Machine-Learning/bd-p/Deephi
14 |
15 |
16 | ### TensorFlow Training and Vitis AI Quantization Flow
17 | Please install the Vitis AI 1.2 according to https://github.com/Xilinx/Vitis-AI/ before starting the custom model flow.
18 | Make sure you can run Vitis AI DNNDK examples.
19 |
20 | 1. git clone the repository inside the Vitis-AI folder so that when launching the docker you can see the ***flower_classification_vai_tf_numpy_array*** folder inside the docker workspace.
21 | 2. Download kaggle flower dataset from https://www.kaggle.com/alxmamaev/flowers-recognition
22 | 3. Unzip the archive and copy the files into the ```flower_classification_vai_tf_numpy_array/x86/flowers``` folder, so that the directory looks like below:
23 | 
24 | 4. Launch the docker, call ```conda activate vitis-ai-tensorflow``` to set the TensorFlow environment and then navigate into the ```flower_classification_vai_tf_numpy_array/x86/``` folder
25 | 5. Load the images and labels into numpy arrays
26 | ```python3 ./load_data.py```
27 | 6. Train the model
28 | ```python3 ./train_data.py```
29 | 7. Evaluate the trained model (optional)
30 | ```python3 ./evaluate_trained_model.py```
31 | 8. Freeze the model
32 | ```python3 ./freeze_model.py```
33 | 9. Evaluate the frozen model (optional)
34 | ```python3 ./evaluate_frozen_model.py```
35 | 10. Quantize the graph, using ```chmod u+x ./decent_q.sh``` if necessary (see the calibration input sketch after this list)
36 | ```./decent_q.sh```
37 | 11. Evaluate the quantized graph (optional)
38 | ```python3 ./evaluate_quantized_graph.py```
39 | 12. Compile the quantized model into an ELF file using DNNC; use ```chmod u+x ./dnnc.sh``` if necessary
40 | ```./dnnc.sh```
41 | 13. Now you should get the ELF file at ```flower_classification_vai_tf_numpy_array/x86/flower_classification/dpu_flower_classification_0.elf```. Copy the file into ```flower_classification_vai_tf_numpy_array/arm/flower_classification/model``` for later use.
42 |
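For step 10, ```vai_q_tensorflow``` calls the function named by ```--input_fn``` once per calibration iteration and expects a dict mapping the graph's input node name to a batch of images. A minimal sketch, mirroring ```x86/flower_classification_input_fn.py``` (the ```conv2d_input``` node name follows that script and should match the input node recorded in ```freeze_input_output_node_name.txt``` by the freeze step):

```python
import numpy as np


def calib_input(iter):
    """Called once per --calib_iter; returns {input_node_name: image_batch}."""
    X = np.load("features.npy") / 255.0  # same normalization as training
    images = X[:100]  # this flow simply feeds the first 100 images on every iteration
    return {"conv2d_input": images}
```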
43 | ### Test on ZCU102 board
44 | To build and deploy the example on the ZCU102 board, there are two additional requirements:
45 | 1. DNNDK Libs
46 | a) Refer to the [DNNDK example](https://github.com/Xilinx/Vitis-AI/tree/v1.2/mpsoc) to set up the on-board DNNDK libs.
47 | b) Refer to the [Vitis AI Library guide](https://github.com/Xilinx/Vitis-AI/tree/v1.2/Vitis-AI-Library) to set up the on-board Vitis AI libs.
48 | c) I was told that the Vitis AI libs are optional if you are using pure DNNDK libs, but I haven't had time to try it.
49 |
50 | 2. Compile Environment
51 | a) If you are using the [ZCU102 VAI 1.2 release image](https://www.xilinx.com/bin/public/openDownload?filename=xilinx-zcu102-dpu-v2020.1-v1.2.0.img.gz), the compile environment is preinstalled in the image. So you just need to copy the ***flower_classification_vai_tf_numpy_array/arm/flower_classification*** folder onto the ZCU102 board (SD card or DDR), and when the board boots up, go to the flower_classification folder and run ***make*** to compile the application.
52 | b) For a custom platform, please refer to the configuration from the [Vitis AI custom platform creation flow](https://github.com/gewuek/vitis_ai_custom_platform_flow). If it is not up to date, please try this [VAI 1.2 tmp repo](https://github.com/gewuek/vitis_ai_custom_platform_v1.2_tmp) to configure the rootfs. Both the Vitis AI GUI compilation and on-board compilation should work.
53 |
54 | The running result on ZCU102 looks like below:
55 | 
56 |
57 | ### Reference
58 |
59 | https://www.youtube.com/watch?v=VwVg9jCtqaU&t=112s
60 |
61 | https://www.kaggle.com/alxmamaev/flowers-recognition
62 |
63 | https://www.youtube.com/watch?v=j-3vuBynnOE
64 |
65 | https://github.com/tensorflow/docs/blob/r1.12/site/en/tutorials/load_data/images.ipynb
66 |
67 | https://www.xilinx.com/support/documentation/sw_manuals/ai_inference/v1_6/ug1327-dnndk-user-guide.pdf
68 |
69 | https://github.com/Xilinx/Edge-AI-Platform-Tutorials/tree/3.1/docs/DPU-Integration
70 |
71 | https://stackoverflow.com/questions/45466020/how-to-export-keras-h5-to-tensorflow-pb
72 |
73 | https://stackoverflow.com/questions/51278213/what-is-the-use-of-a-pb-file-in-tensorflow-and-how-does-it-work/51281809#51281809
74 |
--------------------------------------------------------------------------------
/VERSION.md:
--------------------------------------------------------------------------------
1 | # Version
2 | ***OS:*** Ubuntu 18.04.5 LTS
3 | ***Vitis:*** 2020.1
4 | ***PetaLinux:*** 2020.1
5 | ***Vitis AI:*** 1.2
6 |
7 | ***pip3 list***:
8 | Vitis AI 1.2 docker image
9 | Package Version
10 | --------------------- -------------------
11 | absl-py 0.9.0
12 | argon2-cffi 20.1.0
13 | astor 0.8.0
14 | attrs 19.3.0
15 | backcall 0.2.0
16 | backports.weakref 1.0.post1
17 | bleach 3.1.5
18 | certifi 2020.6.20
19 | cffi 1.14.0
20 | ck 1.15.0
21 | cycler 0.10.0
22 | decorator 4.4.2
23 | defusedxml 0.6.0
24 | dill 0.3.2
25 | entrypoints 0.3
26 | gast 0.2.2
27 | google-pasta 0.2.0
28 | graphviz 0.14
29 | grpcio 1.27.2
30 | h5py 2.8.0
31 | importlib-metadata 1.7.0
32 | iniconfig 0.0.0
33 | ipykernel 5.3.4
34 | ipython 7.16.1
35 | ipython-genutils 0.2.0
36 | ipywidgets 7.5.1
37 | jedi 0.17.0
38 | Jinja2 2.11.2
39 | joblib 0.16.0
40 | jsonschema 3.2.0
41 | jupyter 1.0.0
42 | jupyter-client 6.1.6
43 | jupyter-console 6.1.0
44 | jupyter-core 4.6.3
45 | Keras 2.3.1
46 | Keras-Applications 1.0.8
47 | Keras-Preprocessing 1.1.0
48 | kiwisolver 1.2.0
49 | Markdown 3.1.1
50 | MarkupSafe 1.1.1
51 | marshmallow 3.7.1
52 | matplotlib 3.2.2
53 | mistune 0.8.4
54 | mkl-fft 1.1.0
55 | mkl-random 1.1.1
56 | mkl-service 2.3.0
57 | more-itertools 8.4.0
58 | nbconvert 5.6.1
59 | nbformat 5.0.7
60 | notebook 6.1.1
61 | numpy 1.16.4
62 | olefile 0.46
63 | opencv-contrib-python 4.4.0.40
64 | opt-einsum 3.1.0
65 | orderedset 2.0.3
66 | packaging 20.4
67 | pandas 1.1.0
68 | pandocfilters 1.4.2
69 | parso 0.8.0
70 | pexpect 4.8.0
71 | pickleshare 0.7.5
72 | Pillow 7.2.0
73 | pip 20.2.2
74 | pluggy 0.13.1
75 | progressbar2 3.37.1
76 | prometheus-client 0.8.0
77 | prompt-toolkit 3.0.5
78 | protobuf 3.11.2
79 | ptyprocess 0.6.0
80 | py 1.9.0
81 | pybind11 2.5.0
82 | pycparser 2.20
83 | pydot 1.4.1
84 | Pygments 2.6.1
85 | pyparsing 2.4.7
86 | pyrsistent 0.16.0
87 | pytest 6.0.1
88 | pytest-runner 5.2
89 | python-dateutil 2.8.1
90 | python-utils 2.3.0
91 | pytz 2020.1
92 | PyYAML 5.3.1
93 | pyzmq 19.0.1
94 | qtconsole 4.7.5
95 | QtPy 1.9.0
96 | scikit-learn 0.23.1
97 | scipy 1.5.0
98 | Send2Trash 1.5.0
99 | setuptools 49.2.1.post20200807
100 | six 1.15.0
101 | tensorboard 1.15.0
102 | tensorflow 1.15.0
103 | tensorflow-estimator 1.15.1
104 | tensorflow-gpu 1.15.2
105 | termcolor 1.1.0
106 | terminado 0.8.3
107 | testpath 0.4.4
108 | threadpoolctl 2.1.0
109 | toml 0.10.1
110 | tornado 6.0.4
111 | tqdm 4.48.2
112 | traitlets 4.3.3
113 | wcwidth 0.2.5
114 | webencodings 0.5.1
115 | Werkzeug 0.16.1
116 | wheel 0.34.2
117 | widgetsnbextension 3.5.1
118 | wrapt 1.12.1
119 | xnnc Dev
120 | zipp 3.1.0
121 |
122 |
--------------------------------------------------------------------------------
/arm/flower_classification/Makefile:
--------------------------------------------------------------------------------
1 | ## (c) Copyright 2019 Xilinx, Inc. All rights reserved.
2 | ##
3 | ## This file contains confidential and proprietary information
4 | ## of Xilinx, Inc. and is protected under U.S. and
5 | ## international copyright and other intellectual property
6 | ## laws.
7 | ##
8 | ## DISCLAIMER
9 | ## This disclaimer is not a license and does not grant any
10 | ## rights to the materials distributed herewith. Except as
11 | ## otherwise provided in a valid license issued to you by
12 | ## Xilinx, and to the maximum extent permitted by applicable
13 | ## law: (1) THESE MATERIALS ARE MADE AVAILABLE "AS IS" AND
14 | ## WITH ALL FAULTS, AND XILINX HEREBY DISCLAIMS ALL WARRANTIES
15 | ## AND CONDITIONS, EXPRESS, IMPLIED, OR STATUTORY, INCLUDING
16 | ## BUT NOT LIMITED TO WARRANTIES OF MERCHANTABILITY, NON-
17 | ## INFRINGEMENT, OR FITNESS FOR ANY PARTICULAR PURPOSE; and
18 | ## (2) Xilinx shall not be liable (whether in contract or tort,
19 | ## including negligence, or under any other theory of
20 | ## liability) for any loss or damage of any kind or nature
21 | ## related to, arising under or in connection with these
22 | ## materials, including for any direct, or any indirect,
23 | ## special, incidental, or consequential loss or damage
24 | ## (including loss of data, profits, goodwill, or any type of
25 | ## loss or damage suffered as a result of any action brought
26 | ## by a third party) even if such damage or loss was
27 | ## reasonably foreseeable or Xilinx had been advised of the
28 | ## possibility of the same.
29 | ##
30 | ## CRITICAL APPLICATIONS
31 | ## Xilinx products are not designed or intended to be fail-
32 | ## safe, or for use in any application requiring fail-safe
33 | ## performance, such as life-support or safety devices or
34 | ## systems, Class III medical devices, nuclear facilities,
35 | ## applications related to the deployment of airbags, or any
36 | ## other applications that could lead to death, personal
37 | ## injury, or severe property or environmental damage
38 | ## (individually and collectively, "Critical
39 | ## Applications"). Customer assumes the sole risk and
40 | ## liability of any use of Xilinx products in Critical
41 | ## Applications, subject only to applicable laws and
42 | ## regulations governing limitations on product liability.
43 | ##
44 | ## THIS COPYRIGHT NOTICE AND DISCLAIMER MUST BE RETAINED AS
45 | ## PART OF THIS FILE AT ALL TIMES.
46 |
47 | PROJECT = flower_classification
48 |
49 | CXX := g++
50 | OBJ := main.o
51 |
52 | # linking libraries of OpenCV
53 | LDFLAGS = $(shell pkg-config --libs opencv)
54 |
55 | # linking libraries of DNNDK
56 | LDFLAGS += -lhineon -ln2cube
57 |
58 | CUR_DIR = $(shell pwd)
59 | SRC = $(CUR_DIR)/src
60 | BUILD = $(CUR_DIR)/build
61 | VPATH = $(SRC)
62 | MODEL = $(CUR_DIR)/model/dpu_flower_classification_0.elf
63 |
64 | ARCH = $(shell uname -m | sed -e s/arm.*/armv71/ -e s/aarch64.*/aarch64/)
65 | CFLAGS := -O2 -Wall -Wpointer-arith -std=c++11 -ffast-math
66 | ifeq ($(ARCH),armv71)
67 | CFLAGS += -mcpu=cortex-a9 -mfloat-abi=hard -mfpu=neon
68 | endif
69 | ifeq ($(ARCH),aarch64)
70 | CFLAGS += -mcpu=cortex-a53
71 | endif
72 |
73 | .PHONY: all clean
74 |
75 | all: $(BUILD) $(PROJECT)
76 |
77 | $(PROJECT) : $(OBJ)
78 | $(CXX) $(CFLAGS) $(addprefix $(BUILD)/, $^) $(MODEL) -o $@ $(LDFLAGS)
79 |
80 | %.o : %.cc
81 | $(CXX) -c $(CFLAGS) $< -o $(BUILD)/$@
82 |
83 | clean:
84 | $(RM) -rf $(BUILD)/*.o $(BUILD)
85 | $(RM) $(PROJECT)
86 |
87 | $(BUILD) :
88 | -mkdir -p $@
89 |
--------------------------------------------------------------------------------
/arm/flower_classification/build/main.o:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/build/main.o
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000001.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000001.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000002.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000002.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000003.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000003.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000004.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000004.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000005.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000005.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000006.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000006.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000007.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000007.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000008.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000008.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000009.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000009.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000010.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000010.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000770.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000770.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000771.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000771.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000772.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000772.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000773.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000773.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000774.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000774.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000775.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000775.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000776.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000776.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000777.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000777.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000778.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000778.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00000779.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00000779.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00001822.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00001822.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00001823.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00001823.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00001824.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00001824.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00001825.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00001825.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00001826.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00001826.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00001827.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00001827.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00001828.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00001828.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00001829.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00001829.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00001830.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00001830.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00001831.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00001831.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00002606.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00002606.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00002607.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00002607.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00002608.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00002608.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00002609.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00002609.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00002610.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00002610.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00002611.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00002611.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00002612.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00002612.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00002613.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00002613.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00002614.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00002614.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00002615.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00002615.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00003340.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00003340.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00003341.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00003341.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00003342.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00003342.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00003343.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00003343.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00003344.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00003344.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00003345.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00003345.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00003346.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00003346.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00003347.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00003347.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00003348.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00003348.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/00003349.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/image/00003349.jpg
--------------------------------------------------------------------------------
/arm/flower_classification/image/word_list.txt:
--------------------------------------------------------------------------------
1 | daisy
2 | dandelion
3 | rose
4 | sunflower
5 | tulip
--------------------------------------------------------------------------------
/arm/flower_classification/model/dpu_flower_classification_0.elf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/arm/flower_classification/model/dpu_flower_classification_0.elf
--------------------------------------------------------------------------------
/arm/flower_classification/src/main.cc:
--------------------------------------------------------------------------------
1 | /*
2 | -- (c) Copyright 2019 Xilinx, Inc. All rights reserved.
3 | --
4 | -- This file contains confidential and proprietary information
5 | -- of Xilinx, Inc. and is protected under U.S. and
6 | -- international copyright and other intellectual property
7 | -- laws.
8 | --
9 | -- DISCLAIMER
10 | -- This disclaimer is not a license and does not grant any
11 | -- rights to the materials distributed herewith. Except as
12 | -- otherwise provided in a valid license issued to you by
13 | -- Xilinx, and to the maximum extent permitted by applicable
14 | -- law: (1) THESE MATERIALS ARE MADE AVAILABLE "AS IS" AND
15 | -- WITH ALL FAULTS, AND XILINX HEREBY DISCLAIMS ALL WARRANTIES
16 | -- AND CONDITIONS, EXPRESS, IMPLIED, OR STATUTORY, INCLUDING
17 | -- BUT NOT LIMITED TO WARRANTIES OF MERCHANTABILITY, NON-
18 | -- INFRINGEMENT, OR FITNESS FOR ANY PARTICULAR PURPOSE; and
19 | -- (2) Xilinx shall not be liable (whether in contract or tort,
20 | -- including negligence, or under any other theory of
21 | -- liability) for any loss or damage of any kind or nature
22 | -- related to, arising under or in connection with these
23 | -- materials, including for any direct, or any indirect,
24 | -- special, incidental, or consequential loss or damage
25 | -- (including loss of data, profits, goodwill, or any type of
26 | -- loss or damage suffered as a result of any action brought
27 | -- by a third party) even if such damage or loss was
28 | -- reasonably foreseeable or Xilinx had been advised of the
29 | -- possibility of the same.
30 | --
31 | -- CRITICAL APPLICATIONS
32 | -- Xilinx products are not designed or intended to be fail-
33 | -- safe, or for use in any application requiring fail-safe
34 | -- performance, such as life-support or safety devices or
35 | -- systems, Class III medical devices, nuclear facilities,
36 | -- applications related to the deployment of airbags, or any
37 | -- other applications that could lead to death, personal
38 | -- injury, or severe property or environmental damage
39 | -- (individually and collectively, "Critical
40 | -- Applications"). Customer assumes the sole risk and
41 | -- liability of any use of Xilinx products in Critical
42 | -- Applications, subject only to applicable laws and
43 | -- regulations governing limitations on product liability.
44 | --
45 | -- THIS COPYRIGHT NOTICE AND DISCLAIMER MUST BE RETAINED AS
46 | -- PART OF THIS FILE AT ALL TIMES.
47 | */
48 |
49 | #include <assert.h>
50 | #include <dirent.h>
51 | #include <stdio.h>
52 | #include <stdlib.h>
53 | #include <string.h>
54 | #include <sys/stat.h>
55 | #include <unistd.h>
56 | #include <algorithm>
57 | #include <cmath>
58 | #include <fstream>
59 | #include <iostream>
60 | #include <queue>
61 | #include <string>
62 | #include <utility>
63 | #include <vector>
64 |
65 | /* header file OpenCV for image processing */
66 | #include <opencv2/opencv.hpp>
67 |
68 | /* header file for DNNDK APIs */
69 | #include <dnndk/dnndk.h>
70 |
71 | using namespace std;
72 | using namespace cv;
73 |
74 | /* 7.71 GOP MAdds for ResNet50 */
75 | //#define RESNET50_WORKLOAD (7.71f)
76 | /* DPU Kernel name for flower classification */
77 | #define KERNEL_FLOWER "flower_classification_0"
78 | /* Input node of the flower classification kernel */
79 | #define INPUT_NODE "conv2d_Conv2D"
80 | /* Output node of the flower classification kernel */
81 | #define OUTPUT_NODE "dense_1_MatMul"
82 |
83 | const string baseImagePath = "./image/";
84 |
85 | /*
86 | * normalize_image: software normalization. Scales each uint8 pixel from [0, 255]
87 | * to [0, 1], multiplies by the DPU input tensor scale, and writes the result into the int8 input buffer.
88 | */
89 | void normalize_image(const Mat& image, int8_t* data, float scale)
90 | {
91 | for(int i = 0; i < 3; ++i) {
92 | for(int j = 0; j < image.rows; ++j) {
93 | for(int k = 0; k < image.cols; ++k) {
94 | //data[j*image.rows*3+k*3+2-i] = (float(image.at<Vec3b>(j,k)[i])/255) * scale;
95 | data[j*image.rows*3+k*3+i] = (float(image.at<Vec3b>(j,k)[i])/255) * scale;
96 | //data[j*image.rows*3+k*3+2-i] = 64;
97 | //printf("DATA BeforeTEST = %d\n\r", image.at<Vec3b>(j,k)[i]);
98 | //printf("DATA AfterTEST = %d\n\r", data[j*image.rows*3+k*3+2-i]);
99 | }
100 | }
101 | }
102 | }
103 |
104 | /**
105 | * @brief put image names to a vector
106 | *
107 | * @param path - path of the image directory
108 | * @param images - the vector of image name
109 | *
110 | * @return none
111 | */
112 | void ListImages(string const &path, vector<string> &images) {
113 | images.clear();
114 | struct dirent *entry;
115 |
116 | /*Check if path is a valid directory path. */
117 | struct stat s;
118 | lstat(path.c_str(), &s);
119 | if (!S_ISDIR(s.st_mode)) {
120 | fprintf(stderr, "Error: %s is not a valid directory!\n", path.c_str());
121 | exit(1);
122 | }
123 |
124 | DIR *dir = opendir(path.c_str());
125 | if (dir == nullptr) {
126 | fprintf(stderr, "Error: Open %s path failed.\n", path.c_str());
127 | exit(1);
128 | }
129 |
130 | while ((entry = readdir(dir)) != nullptr) {
131 | if (entry->d_type == DT_REG || entry->d_type == DT_UNKNOWN) {
132 | string name = entry->d_name;
133 | string ext = name.substr(name.find_last_of(".") + 1);
134 | if ((ext == "JPEG") || (ext == "jpeg") || (ext == "JPG") ||
135 | (ext == "jpg") || (ext == "PNG") || (ext == "png")) {
136 | images.push_back(name);
137 | }
138 | }
139 | }
140 |
141 | closedir(dir);
142 | sort(images.begin(), images.end());
143 | }
144 |
145 | /**
146 | * @brief load kinds from file to a vector
147 | *
148 | * @param path - path of the kinds file
149 | * @param kinds - the vector of kinds string
150 | *
151 | * @return none
152 | */
153 | void LoadWords(string const &path, vector<string> &kinds) {
154 | kinds.clear();
155 | fstream fkinds(path);
156 | if (fkinds.fail()) {
157 | fprintf(stderr, "Error : Open %s failed.\n", path.c_str());
158 | exit(1);
159 | }
160 | string kind;
161 | while (getline(fkinds, kind)) {
162 | kinds.push_back(kind);
163 | }
164 |
165 | fkinds.close();
166 | }
167 |
168 | /**
169 | * @brief calculate softmax
170 | *
171 | * @param data - pointer to input buffer
172 | * @param size - size of input buffer
173 | * @param result - calculation result
174 | *
175 | * @return none
176 | */
177 | void CPUCalcSoftmax(const float *data, size_t size, float *result) {
178 | assert(data && result);
179 | double sum = 0.0f;
180 |
181 | for (size_t i = 0; i < size; i++) {
182 | // printf("data = %f\n\r", data[i]);
183 | result[i] = exp(data[i]);
184 | sum += result[i];
185 | }
186 |
187 | for (size_t i = 0; i < size; i++) {
188 | result[i] /= sum;
189 | }
190 | }
191 |
192 | /**
193 | * @brief Get top k results according to its probability
194 | *
195 | * @param d - pointer to input data
196 | * @param size - size of input data
197 | * @param k - calculation result
198 | * @param vkinds - vector of kinds
199 | *
200 | * @return none
201 | */
202 | void TopK(const float *d, int size, int k, vector<string> &vkinds) {
203 | assert(d && size > 0 && k > 0);
204 | priority_queue<pair<float, int>> q;
205 |
206 | for (auto i = 0; i < size; ++i) {
207 | q.push(pair<float, int>(d[i], i));
208 | }
209 |
210 | for (auto i = 0; i < k; ++i) {
211 | pair<float, int> ki = q.top();
212 | printf("top[%d] prob = %-8f name = %s\n", i, d[ki.second],
213 | vkinds[ki.second].c_str());
214 | q.pop();
215 | }
216 | }
217 |
218 | /**
219 | * @brief Run DPU Task for FlowerClassification
220 | *
221 | * @param taskFlowerClassification - pointer to Flower Classification Task
222 | *
223 | * @return none
224 | */
225 | void runFlowerClassification(DPUTask *taskFlowerClassification) {
226 | assert(taskFlowerClassification);
227 |
228 | /* Vectors holding the class names and the image file names */
229 | vector<string> kinds, images;
230 |
231 | /* Load all image names.*/
232 | ListImages(baseImagePath, images);
233 | if (images.size() == 0) {
234 | cerr << "\nError: No images existing under " << baseImagePath << endl;
235 | return;
236 | }
237 |
238 | /* Load all kinds words.*/
239 | LoadWords(baseImagePath + "word_list.txt", kinds);
240 | if (kinds.size() == 0) {
241 | cerr << "\nError: No words exist in file word_list.txt." << endl;
242 | return;
243 | }
244 |
245 | /* Get channel count of the output Tensor for Flower Classification Task */
246 | int channel = dpuGetOutputTensorChannel(taskFlowerClassification, OUTPUT_NODE);
247 | float *softmax = new float[channel];
248 | float *FCResult = new float[channel];
249 | float scale_input = dpuGetInputTensorScale(taskFlowerClassification, INPUT_NODE);
250 | printf("scale_input = %f;\n\r", scale_input);
251 | DPUTensor *dpu_in = dpuGetInputTensor(taskFlowerClassification, INPUT_NODE);
252 | int8_t *data = dpuGetTensorAddress(dpu_in);
253 |
254 | for (auto &imageName : images) {
255 | cout << "\nLoad image : " << imageName << endl;
256 | /* Load image and Set image into DPU Task for Flower Classification */
257 | Mat image = imread(baseImagePath + imageName);
258 | normalize_image(image, data, scale_input);
259 |
260 | //dpuSetInputImage2(taskFlowerClassification, INPUT_NODE, image);
261 | //dpuSetInputImageWithScale(taskConv, CONV_INPUT_NODE, img, jw_mean, jw_scale);
262 |
263 | /* Launch the Flower Classification Task on the DPU */
264 | cout << "\nRun DPU Task for Flower Classification ..." << endl;
265 | dpuRunTask(taskFlowerClassification);
266 |
267 | /* Get DPU execution time (in us) of DPU Task */
268 | //long long timeProf = dpuGetTaskProfile(taskFlowerClassification);
269 | //cout << " DPU Task Execution time: " << (timeProf * 1.0f) << "us\n";
270 | //float prof = (RESNET50_WORKLOAD / timeProf) * 1000000.0f;
271 | //cout << " DPU Task Performance: " << prof << "GOPS\n";
272 |
273 | /* Get FC result and convert from INT8 to FP32 format */
274 | dpuGetOutputTensorInHWCFP32(taskFlowerClassification, OUTPUT_NODE, FCResult, channel);
275 |
276 | /* Calculate softmax on CPU and display TOP-5 classification results */
277 | CPUCalcSoftmax(FCResult, channel, softmax);
278 | TopK(softmax, channel, 5, kinds);
279 | //break;
280 |
281 | /* Display the image */
282 | cv::imshow("Classification of Flowers", image);
283 | cv::waitKey(1000);
284 | }
285 |
286 | delete[] softmax;
287 | delete[] FCResult;
288 | }
289 |
290 | /**
291 | * @brief Entry for running Flower Classification neural network
292 | *
293 | * @note DNNDK APIs prefixed with "dpu" are used to easily program &
294 | * deploy Flower Classification on DPU platform.
295 | *
296 | */
297 | int main(void) {
298 | /* DPU Kernel/Task for running Flower Classification */
299 | DPUKernel *kernelFlowerClassification;
300 | DPUTask *taskFlowerClassification;
301 |
302 | /* Attach to DPU driver and prepare for running */
303 | dpuOpen();
304 |
305 | /* Load DPU Kernel for Flower Classification */
306 | kernelFlowerClassification = dpuLoadKernel(KERNEL_FLOWER);
307 |
308 | /* Create DPU Task for Flower Classification */
309 | taskFlowerClassification = dpuCreateTask(kernelFlowerClassification, 0);
310 |
311 | /* Run Flower Classification Task */
312 | runFlowerClassification(taskFlowerClassification);
313 |
314 | /* Destroy DPU Task & free resources */
315 | dpuDestroyTask(taskFlowerClassification);
316 |
317 | /* Destroy DPU Kernel & free resources */
318 | dpuDestroyKernel(kernelFlowerClassification);
319 |
320 | /* Detach from DPU driver & free resources */
321 | dpuClose();
322 |
323 | return 0;
324 | }
325 |
--------------------------------------------------------------------------------
/pic_for_readme/classification_flower.PNG:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/pic_for_readme/classification_flower.PNG
--------------------------------------------------------------------------------
/pic_for_readme/directory.PNG:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/gewuek/flower_classification_vai_tf_numpy_array/3e7099609ce7059be189ec77c5b42b0ae3fca911/pic_for_readme/directory.PNG
--------------------------------------------------------------------------------
/x86/decent_q.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 |
3 | INPUT_NODE=$(sed '1q;d' ./freeze_input_output_node_name.txt)
4 | OUTPUT_NODE=$(sed '2q;d' ./freeze_input_output_node_name.txt)
5 |
6 | vai_q_tensorflow quantize \
7 | --input_frozen_graph ./frozen_graph.pb \
8 | --input_nodes $INPUT_NODE \
9 | --input_shapes ?,128,128,3 \
10 | --output_nodes $OUTPUT_NODE \
11 | --input_fn flower_classification_input_fn.calib_input \
12 | --method 1 \
13 | --gpu 0 \
14 | --calib_iter 10 \
15 | --output_dir ./quantize_results
16 |
--------------------------------------------------------------------------------
/x86/dnnc.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 |
3 | TARGET=ZCU102
4 | NET_NAME=flower_classification
5 | ARCH=${CONDA_PREFIX}/arch/dpuv2/${TARGET}/${TARGET}.json
6 |
7 | # DNNC command to compile pb file into elf file
8 | vai_c_tensorflow \
9 | --frozen_pb quantize_results/deploy_model.pb \
10 | --arch ${ARCH} \
11 | --output_dir flower_classification \
12 | --net_name ${NET_NAME} \
13 | --options "{'save_kernel':''}"
14 |
15 |
16 |
--------------------------------------------------------------------------------
/x86/evaluate_frozen_model.py:
--------------------------------------------------------------------------------
1 | #! /usr/bin/python3
2 | # coding=utf-8
3 | #####################
4 |
5 | import cv2
6 | import numpy as np
7 | import os
8 | import tensorflow as tf
9 | from tensorflow import keras
10 |
11 | # Must be imported when freshly loading a new pb file
12 | tf.contrib.resampler
13 |
14 | # Set the class number, input/output node names
15 | CLASS_NUM = 5
16 | # INPUT_NODE = 'conv2d_input_2' # Get from "Freeze the model" flow
17 | # OUTPUT_NODE = 'dense_1_2/Softmax' # Get from "Freeze the model" flow
18 | f = open("./freeze_input_output_node_name.txt", "r+")
19 | curline = f.readline()
20 | INPUT_NODE = curline.strip()
21 | curline = f.readline()
22 | OUTPUT_NODE = curline.strip()
23 | f.close()
24 |
25 | # Load images and Normalization
26 | X = np.load("features.npy") / 255.0
27 | # Load labels and change y label to one hot
28 | y = np.load("label.npy")
29 | y_one_hot = np.squeeze(np.eye(CLASS_NUM)[np.array(y).reshape(-1)])
30 |
31 | # Function to load pb file as a graph
32 | def load_pb(path_to_pb):
33 | with tf.gfile.GFile(path_to_pb, "rb") as f:
34 | graph_def = tf.GraphDef()
35 | graph_def.ParseFromString(f.read())
36 | with tf.Graph().as_default() as graph:
37 | tf.import_graph_def(graph_def, name='')
38 | return graph
39 |
40 | # Load pb file into graph
41 | graph = load_pb('./frozen_graph.pb')
42 |
43 | # Name the input/output node
44 | input_node = graph.get_tensor_by_name(INPUT_NODE + ':0')
45 | output_node = graph.get_tensor_by_name(OUTPUT_NODE + ':0')
46 | print(input_node.shape)
47 | print(output_node.shape)
48 |
49 | # Evaluate the graph
50 | with tf.Session(graph=graph) as sess:
51 | predict = sess.run(output_node, feed_dict={input_node: X[:10]})
52 | print(predict)
53 | print(y[:10])
54 | loss = tf.losses.softmax_cross_entropy(y_one_hot[:10], predict)
55 | sess_loss = tf.Session()
56 | loss = sess_loss.run(loss)
57 | print(loss)
58 |
--------------------------------------------------------------------------------
/x86/evaluate_quantized_graph.py:
--------------------------------------------------------------------------------
1 | #! /usr/bin/python3
2 | # coding=utf-8
3 | #####################
4 |
5 | import cv2
6 | import numpy as np
7 | import os
8 | import tensorflow as tf
9 | from tensorflow import keras
10 |
11 | # May hit a problem when loading a new pb file without this
12 | tf.contrib.resampler
13 |
14 | # Set the class number, input/output node names
15 | CLASS_NUM = 5
16 | # INPUT_NODE = 'conv2d_input_2' # Get from "Freeze the model" flow
17 | # OUTPUT_NODE = 'dense_1_2/Softmax' # Get from "Freeze the model" flow
18 | f = open("./freeze_input_output_node_name.txt", "r+")
19 | curline = f.readline()
20 | INPUT_NODE = curline.strip()
21 | curline = f.readline()
22 | OUTPUT_NODE = curline.strip()
23 | f.close()
24 |
25 | # Load images and Normalization
26 | X = np.load("features.npy") / 255.0
27 | # Load labels and change y label to one hot
28 | y = np.load("label.npy")
29 | y_one_hot = np.squeeze(np.eye(CLASS_NUM)[np.array(y).reshape(-1)])
30 |
31 | # Function to load pb file as a graph
32 | def load_pb(path_to_pb):
33 | with tf.gfile.GFile(path_to_pb, "rb") as f:
34 | graph_def = tf.GraphDef()
35 | graph_def.ParseFromString(f.read())
36 | with tf.Graph().as_default() as graph:
37 | tf.import_graph_def(graph_def, name='')
38 | return graph
39 |
40 | # Load quantized pb file
41 | graph = load_pb('./quantize_results/quantize_eval_model.pb')
42 |
43 | # Name the input/output node
44 | input_node = graph.get_tensor_by_name(INPUT_NODE + ':0')
45 | output_node = graph.get_tensor_by_name(OUTPUT_NODE + ':0')
46 | print(input_node.shape)
47 | print(output_node.shape)
48 |
49 | # Evaluate the graph
50 | with tf.Session(graph=graph) as sess:
51 | predict = sess.run(output_node, feed_dict={input_node: X[:10]})
52 | print(predict)
53 | print(y[:10])
54 | loss = tf.losses.softmax_cross_entropy(y_one_hot[:10], predict)
55 | sess_loss = tf.Session()
56 | loss = sess_loss.run(loss)
57 | print(loss)
58 |
--------------------------------------------------------------------------------
/x86/evaluate_trained_model.py:
--------------------------------------------------------------------------------
1 | #! /usr/bin/python3
2 | # coding=utf-8
3 | #####################
4 |
5 |
6 | import cv2
7 | import numpy as np
8 | import os
9 | import tensorflow as tf
10 | from tensorflow import keras
11 |
12 | # Load images and labels, do the normalization
13 | X = np.load("features.npy") / 255.0
14 | y = np.load("label.npy")
15 |
16 | # Load the model and check the model summary
17 | loaded_model = keras.models.load_model("./flower_classification_weights.h5")
18 | loaded_model.summary()
19 |
20 | # Get the loss and accuracy
21 | loss,acc = loaded_model.evaluate(X, y)
22 | print("Restored model, accuracy: {:5.2f}%".format(100*acc))
--------------------------------------------------------------------------------
/x86/flower_classification_input_fn.py:
--------------------------------------------------------------------------------
1 | #import cv2
2 | import numpy as np
3 | #import os
4 | #import tensorflow as tf
5 | #from tensorflow import keras
6 |
7 |
8 | def calib_input(iter):
9 | X = np.load("features.npy") / 255.0
10 | images = X[:100]
11 | return {"conv2d_input": images}
12 |
13 |
--------------------------------------------------------------------------------
/x86/flowers/.gitignore:
--------------------------------------------------------------------------------
1 | # Ignore everything in this directory
2 | *
3 | # Except this file
4 | !.gitignore
5 |
--------------------------------------------------------------------------------
/x86/freeze_model.py:
--------------------------------------------------------------------------------
1 | #! /usr/bin/python3
2 | # coding=utf-8
3 | #####################
4 |
5 |
6 | import cv2
7 | import numpy as np
8 | import os
9 | import tensorflow as tf
10 | from tensorflow import keras
11 |
12 | # Define the freeze function, output the frozen graph
13 | def freeze_session(session, keep_var_names=None, output_names=None, clear_devices=True):
14 | """
15 | Freezes the state of a session into a pruned computation graph.
16 | Creates a new computation graph where variable nodes are replaced by
17 | constants taking their current value in the session. The new graph will be
18 | pruned so subgraphs that are not necessary to compute the requested
19 | outputs are removed.
20 | @param session The TensorFlow session to be frozen.
21 | @param keep_var_names A list of variable names that should not be frozen,
22 | or None to freeze all the variables in the graph.
23 | @param output_names Names of the relevant graph outputs.
24 | @param clear_devices Remove the device directives from the graph for better portability.
25 | @return The frozen graph definition.
26 | """
27 | graph = session.graph
28 | with graph.as_default():
29 | freeze_var_names = list(set(v.op.name for v in tf.global_variables()).difference(keep_var_names or []))
30 | output_names = output_names or []
31 | output_names += [v.op.name for v in tf.global_variables()]
32 | input_graph_def = graph.as_graph_def()
33 | if clear_devices:
34 | for node in input_graph_def.node:
35 | node.device = ""
36 | frozen_graph = tf.graph_util.convert_variables_to_constants(
37 | session, input_graph_def, output_names, freeze_var_names)
38 | return frozen_graph
39 |
40 | # Load trained model, set learning phase to 0
41 | keras.backend.set_learning_phase(0)
42 | loaded_model= keras.models.load_model('./flower_classification_weights.h5')
43 |
44 | # make list of output and input node names
45 | input_names=[out.op.name for out in loaded_model.inputs]
46 | output_names=[out.op.name for out in loaded_model.outputs]
47 | print('input node is{}'.format(input_names))
48 | print('output node is{}'.format(output_names))
49 |
50 | f = open("freeze_input_output_node_name.txt", "w+")
51 | # f.write('{}'.format(input_names[input_names.find("'")+1:input_names.find("'")]) + "\n")
52 | f.write('{}'.format(input_names[0]) + "\n")
53 | f.write('{}'.format(output_names[0]) + "\n")
54 | f.close()
55 |
56 | # Freeze graph
57 | frozen_graph = freeze_session(keras.backend.get_session(), output_names=output_names)
58 | # Store graph to pb(Protocol Buffers) file
59 | tf.train.write_graph(frozen_graph, "./", "frozen_graph.pb", as_text=False)
60 |
--------------------------------------------------------------------------------
/x86/load_data.py:
--------------------------------------------------------------------------------
1 | #! /usr/bin/python3
2 | # coding=utf-8
3 | #####################
4 |
5 |
6 | import numpy as np
7 | #import matplotlib.pyplot as plt
8 | import os
9 | import cv2
10 | import random
11 |
12 | DATADIR = './flowers/'
13 | CATEGORIES = ['daisy', 'dandelion', 'rose', 'sunflower', 'tulip']
14 | IMG_SIZE = 128
15 | training_data = []
16 |
17 | # Function to go through the category folders
18 | def create_training_data():
19 | for category in CATEGORIES:
20 | path = os.path.join(DATADIR, category) # path to each flower category dir
21 | label_num = CATEGORIES.index(category)
22 | for img in os.listdir(path):
23 | try:
24 | img_array = cv2.imread(os.path.join(path, img))
25 | resized_array = cv2.resize(img_array, (IMG_SIZE, IMG_SIZE))
26 | training_data.append([resized_array, label_num])
27 | except Exception as e:
28 | print("Exception loading image:", e) #pass
29 |
30 | create_training_data()
31 |
32 | # Print out the picture numbers
33 | print(len(training_data))
34 |
35 | # Random shuffle the dataset
36 | random.shuffle(training_data)
37 |
38 | # Define the features(images) and labels
39 | X = []
40 | y = []
41 |
42 | for features, label in training_data:
43 | X.append(features)
44 | y.append(label)
45 |
46 | # Check the X dataset if necessary
47 | #from PIL import Image
48 | #img = Image.fromarray(X[0])
49 | #img.show()
50 |
51 | # Store data into NPY files
52 | X = np.array(X).reshape(-1, IMG_SIZE, IMG_SIZE, 3)
53 | np.save("features.npy", X)
54 | np.save("label.npy", y)
55 |
56 |
--------------------------------------------------------------------------------
/x86/train_data.py:
--------------------------------------------------------------------------------
1 | #! /usr/bin/python3
2 | # coding=utf-8
3 | #####################
4 |
5 |
6 | import cv2
7 | import numpy as np
8 | import os
9 | import tensorflow as tf
10 | from tensorflow import keras
11 |
12 | # Load images and labels from npy files
13 | X = np.load("features.npy")
14 | y = np.load("label.npy")
15 |
16 | # Normalization
17 | X = X / 255.0
18 |
19 | # Define the network model
20 | model = keras.Sequential([
21 | keras.layers.Conv2D(32, (3,3), padding="same", activation=tf.nn.relu, input_shape=(128, 128, 3)),
22 | keras.layers.MaxPool2D(pool_size=(2,2)),
23 | keras.layers.Conv2D(64, (3,3), padding="same", activation=tf.nn.relu),
24 | keras.layers.MaxPool2D(pool_size=(2,2)),
25 | keras.layers.Conv2D(128, (3,3), padding="same", activation=tf.nn.relu),
26 | keras.layers.MaxPool2D(pool_size=(2,2)),
27 | keras.layers.Flatten(),
28 | keras.layers.Dense(100, activation=tf.nn.relu),
29 | keras.layers.Dense(5, activation=tf.nn.softmax)
30 | ])
31 |
32 | # Compile the model
33 | model.compile(optimizer='adam',
34 | loss='sparse_categorical_crossentropy',
35 | metrics=['accuracy'])
36 |
37 | # Check the summary, not necessary
38 | model.summary()
39 |
40 | # Training the model
41 | model.fit(X, y, epochs=10)
42 |
43 | # Save the model to H5 file
44 | model.save("flower_classification_weights.h5")
--------------------------------------------------------------------------------