├── .gitignore
├── README.md
├── build_prj.tcl
├── firmware
│   ├── lenet5.cpp
│   ├── lenet5.h
│   ├── parameters.h
│   └── weights
│       ├── b1.h
│       ├── b3.h
│       ├── b5.h
│       ├── b6.h
│       ├── b7.h
│       ├── w1.h
│       ├── w3.h
│       ├── w5.h
│       ├── w6.h
│       └── w7.h
├── gpu
│   ├── Makefile
│   ├── conv.cu
│   ├── conv.h
│   ├── lenet5.cu
│   ├── lenet5.h
│   ├── lenet5_host.cpp
│   ├── nnet.h
│   └── parameters.h
├── keras_lenet.py
├── keras_lenet_infer.py
├── lenet5_test.cpp
├── misc
│   ├── cosim.png
│   ├── fpga_latency.png
│   ├── keras_lenet_infer.png
│   ├── plot.py
│   ├── resource_usage.png
│   ├── saved_model.json
│   └── saved_weights.h5
├── nnet_utils
│   ├── README.md
│   ├── nnet_activation.h
│   ├── nnet_batchnorm.h
│   ├── nnet_common.h
│   ├── nnet_conv.h
│   ├── nnet_conv2d.h
│   ├── nnet_helpers.h
│   ├── nnet_layer.h
│   └── nnet_pooling.h
└── test_images
    ├── 0.txt
    ├── 1.txt
    ├── 10.txt
    ├── 100.txt
    ├── 101.txt
    ├── 11.txt
    ├── 12.txt
    ├── 13.txt
    ├── 14.txt
    ├── 15.txt
    ├── 16.txt
    ├── 17.txt
    ├── 18.txt
    ├── 19.txt
    ├── 2.txt
    ├── 20.txt
    ├── 21.txt
    ├── 22.txt
    ├── 23.txt
    ├── 24.txt
    ├── 25.txt
    ├── 26.txt
    ├── 27.txt
    ├── 28.txt
    ├── 29.txt
    ├── 3.txt
    ├── 30.txt
    ├── 31.txt
    ├── 32.txt
    ├── 33.txt
    ├── 34.txt
    ├── 35.txt
    ├── 36.txt
    ├── 37.txt
    ├── 38.txt
    ├── 39.txt
    ├── 4.txt
    ├── 40.txt
    ├── 41.txt
    ├── 42.txt
    ├── 43.txt
    ├── 44.txt
    ├── 45.txt
    ├── 46.txt
    ├── 47.txt
    ├── 48.txt
    ├── 49.txt
    ├── 5.txt
    ├── 50.txt
    ├── 51.txt
    ├── 52.txt
    ├── 53.txt
    ├── 54.txt
    ├── 55.txt
    ├── 56.txt
    ├── 57.txt
    ├── 58.txt
    ├── 59.txt
    ├── 6.txt
    ├── 60.txt
    ├── 61.txt
    ├── 62.txt
    ├── 63.txt
    ├── 64.txt
    ├── 65.txt
    ├── 66.txt
    ├── 67.txt
    ├── 68.txt
    ├── 69.txt
    ├── 7.txt
    ├── 70.txt
    ├── 71.txt
    ├── 72.txt
    ├── 73.txt
    ├── 74.txt
    ├── 75.txt
    ├── 76.txt
    ├── 77.txt
    ├── 78.txt
    ├── 79.txt
    ├── 8.txt
    ├── 80.txt
    ├── 81.txt
    ├── 82.txt
    ├── 83.txt
    ├── 84.txt
    ├── 85.txt
    ├── 86.txt
    ├── 87.txt
    ├── 88.txt
    ├── 89.txt
    ├── 9.txt
    ├── 90.txt
    ├── 91.txt
    ├── 92.txt
    ├── 93.txt
    ├── 94.txt
    ├── 95.txt
    ├── 96.txt
    ├── 97.txt
    ├── 98.txt
    ├── 99.txt
    └── save_images.py
/.gitignore:
--------------------------------------------------------------------------------
1 | # Prerequisites
2 | *.d
3 |
4 | # Compiled Object files
5 | *.slo
6 | *.lo
7 | *.o
8 | *.obj
9 |
10 | # Precompiled Headers
11 | *.gch
12 | *.pch
13 |
14 | # Compiled Dynamic libraries
15 | *.so
16 | *.dylib
17 | *.dll
18 |
19 | # Fortran module files
20 | *.mod
21 | *.smod
22 |
23 | # Compiled Static libraries
24 | *.lai
25 | *.la
26 | *.a
27 | *.lib
28 |
29 | # Executables
30 | *.exe
31 | *.out
32 | *.app
33 |
34 | .vs*
35 | .vscode/*
36 | *.log
37 |
38 | hls_prj*
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # lenet5-accelerator
2 | This is the final project for [Special Course on Computer Architecture](http://www.am.ics.keio.ac.jp/comparc/), in which an FPGA and a GPU are used to accelerate a simple CNN, LeNet-5. The FPGA is programmed with HLS and the GPU with CUDA.
3 |
4 | Hardware info:
5 | - CPU: Intel(R) Core(TM) i7-2600 CPU @ 3.40GHz
6 |   - latency = 1.59ms/inference
7 | - GPU: GeForce GTX 970
8 |   - latency = 3.3ms (with a batch size of 100)
9 | - FPGA: Xilinx Kintex Ultrascale
10 |   - latency = 0.54ms/inference
11 |
12 | For real-time applications or small-batch inference, the FPGA is the fastest.
13 |
14 | ## HLS
15 | ### Use the prepared code to generate HLS project
16 | - run `vivado_hls -f build_prj.tcl` in this repo, and you'll have a synthesizable project.
17 | ### Or prepare the model from scratch...
18 | - Train the model by running `keras_lenet.py`, which generates two files: `saved_model.json` and `saved_weights.h5`
19 | - Clone [this repo](https://github.com/sherylll/hls4ml), which is a fork of the great [hls4ml](https://github.com/hls-fpga-machine-learning/hls4ml) project
20 | - `cd hls4ml/keras-to-hls`, create a new directory `lenet5-model-files`, and put the generated `saved_model.json` and `saved_weights.h5` there
21 | - `python keras-to-hls.py -c keras-config.yml`, which generates the C source files for the HLS project
22 | - Build the HLS project by running `vivado_hls -f build_prj.tcl` (30~40 min)
23 | - A stream of 1's will be printed, each indicating a correct inference; a 0 indicates an incorrect one.
24 | More information can be found in the [hls4ml](https://github.com/hls-fpga-machine-learning/hls4ml) repo. However, without manual optimization the generated project may be unsynthesizable or perform very poorly.
25 | ### Testbench and test data
26 | - testbench: `lenet5_test.cpp`
27 | - test data: `test_images/` (generated by the script `test_images/save_images.py`)
28 | ### Optimization tricks
29 | - wrap inner loops into functions, such as the elementwise multiplication of a weight filter with a block of the image. In 2D convolution, the two innermost loops are wrapped into one function so that the filtering logic can be reused and pipelined; see the sketch below.
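A sketch of such a wrapped filter function (`filter_mult` is a hypothetical name; the actual code lives in `nnet_utils/nnet_conv2d.h`):
```
// dot product of one filter window with one block of the image;
// wrapping the innermost loops lets HLS pipeline and reuse this as a unit
template<class data_T, typename CONFIG_T>
typename CONFIG_T::accum_t filter_mult(
    data_T window[CONFIG_T::filt_height * CONFIG_T::filt_width],
    typename CONFIG_T::weight_t weights[CONFIG_T::filt_height * CONFIG_T::filt_width])
{
    typename CONFIG_T::accum_t acc = 0;
    FiltLoop: for (int i = 0; i < CONFIG_T::filt_height * CONFIG_T::filt_width; i++)
    {
        #pragma HLS PIPELINE
        acc += window[i] * weights[i];
    }
    return acc;
}
```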
30 | - pipeline and partition at the same time:
31 | ```
32 | for (int oh = 0; oh < CONFIG_T::out_height; oh++)
33 | for (int ow = 0; ow < CONFIG_T::out_width; ow++)
34 | #pragma HLS pipeline
35 | for (int ff = 0; ff < CONFIG_T::n_filt; ff++)
36 | acc[oh * CONFIG_T::out_width + ow][ff] = biases[ff]; // partition acc in dim 2 and biases in dim 1
37 | ```
38 | - use a temporary variable to store the accumulation result to reduce memory accesses; see the sketch below
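For instance (a sketch, assuming a 2D `window`/`weights` layout like the conv loops above):
```
// read-modify-write the partitioned array once per output element
// instead of once per multiply-accumulate
typename CONFIG_T::accum_t temp = acc[oh * CONFIG_T::out_width + ow][ff];
for (int fh = 0; fh < CONFIG_T::filt_height; fh++)
    for (int fw = 0; fw < CONFIG_T::filt_width; fw++)
        temp += window[fh][fw] * weights[fh][fw];
acc[oh * CONFIG_T::out_width + ow][ff] = temp;
```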
39 | - for fully-connected layers (matrix-vector multiplication), pipeline the first-level loop:
40 | ```
41 | // keras (tf backend) uses transposed weight matrices
42 | Product1: for(int ii = 0; ii < CONFIG_T::n_in; ii++) {
43 | #pragma HLS PIPELINE
44 | cache = data[ii];
45 | Product2: for(int jj = 0; jj < CONFIG_T::n_out; jj++) {
46 | int index = ii*CONFIG_T::n_out+jj;
47 | typename CONFIG_T::accum_t mult = cache * weights[index];
48 | acc[jj] += mult;
49 | }
50 | }
51 | ```
52 | - remove unnecessary flattening and unflattening; match the output dimensions of one layer to the input of the next.
53 | - don't declare local arrays too deep in the loop structure, which can result in large resource consumption; see the sketch below.
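For example (a sketch; `load_window` is a hypothetical helper):
```
// declare the window buffer once at function scope and reuse it
data_T window[CONFIG_T::filt_height * CONFIG_T::filt_width];
for (int oh = 0; oh < CONFIG_T::out_height; oh++)
    for (int ow = 0; ow < CONFIG_T::out_width; ow++)
    {
        // declaring `window` here instead, deep in the nest, can get
        // replicated by unrolling/pipelining and inflate resource usage
        load_window(data, window, oh, ow);
    }
```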
54 |
55 | ### Results
56 | The original accuracy is 98.89%; using 16-bit fixed point (w/o the softmax layer), the accuracy becomes 98.87%. The softmax layer introduces a further tiny drop in accuracy.
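For reference, the fixed-point types are declared in `firmware/parameters.h` with `ap_fixed<W,I>`, where `W` is the total bit width and `I` the number of integer bits (including sign):
```
// 16 bits total, 6 integer bits (incl. sign), 10 fraction bits:
// range is roughly [-32, 32) with a resolution of 2^-10
typedef ap_fixed<16,6> input_t;
typedef ap_fixed<16,6> layer1_t;
// weights/biases use fewer integer bits since they are small in magnitude
typedef ap_fixed<16,4> weight_default_t;
typedef ap_fixed<16,4> bias_default_t;
```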
57 |
58 | - xcku095-ffvb2104-1-c
59 |   - latency = interval = 54064 cycles * 10 ns/cycle ≈ 0.54 ms
60 |   - resource usage
61 |     - BRAM 49%
62 |     - DSP 96%
63 |     - FF 12%
64 |     - LUT 46%
65 | - xcku115-flvb2104-2-e:
66 |   - latency: same as above
67 |
68 | ![fpga latency](misc/fpga_latency.png)
69 |
70 | - resource usage
71 |
72 | ![resource usage](misc/resource_usage.png)
73 |
74 | - co-simulation
75 |
76 | ![cosim](misc/cosim.png)
77 | ## GPU
78 | The GPU code is located under `gpu`; in that directory, run `make cpu` or `make gpu` to build the CPU or GPU version of the code. As this project shows, the CPU/GPU coding style differs considerably from HLS, especially for the convolution.
79 | The most expensive part, i.e. the second conv layer, is accelerated on the GPU by mapping the three outermost loops onto a 3D grid, as sketched below.
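The index mapping inside the kernel (see `gpu/conv.cu`; the launch configuration lives in `gpu/lenet5.cu`):
```
// each thread computes one output element res[oh][ow][ff]
int oh = blockIdx.y * blockDim.y + threadIdx.y; // output row
int ow = blockIdx.x * blockDim.x + threadIdx.x; // output column
int ff = blockIdx.z * blockDim.z + threadIdx.z; // filter index
// out-of-range threads return early; the loops over input channels and
// the 5x5 filter window remain sequential within each thread

// launched as:
conv_2d_2<<<grid, block>>>(d_pool2d_layer2_out, d_conv2d_layer3_out, w3_copy, b3_copy);
```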
80 |
81 | ### Results
82 | Everything except the second conv layer stays on the host.
83 | Since the memory allocation and kernel launches add a large overhead, the overall latency averaged over 100 inferences is slightly worse than the CPU's (3.50 vs 1.59 ms/inference). Comparing the kernels alone, however, the GPU is about 20 times faster than the CPU (0.042 ms vs 0.9 ms). This advantage only manifests itself when a large amount of data shares the same parameters and can be prefetched onto the GPU.
84 |
85 | ### Future work
86 | It is also possible to implement the CUDA kernels as C++ templates, as explained in [this blog post](https://devblogs.nvidia.com/power-cpp11-cuda-7/); a sketch follows.
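A minimal sketch of that idea (hypothetical; not implemented in this repo): the layer dimensions become compile-time template parameters, so a single kernel definition can be instantiated for every conv layer:
```
// hypothetical templated version of conv_2d_2 from gpu/conv.cu
template <int IN_H, int IN_W, int N_CHAN, int N_FILT, int FILT_H, int FILT_W>
__global__ void conv_2d_t(const float *data, float *res,
                          const float *weights, const float *biases)
{
    int oh = blockIdx.y * blockDim.y + threadIdx.y;
    int ow = blockIdx.x * blockDim.x + threadIdx.x;
    int ff = blockIdx.z * blockDim.z + threadIdx.z;
    if (oh > IN_H - FILT_H || ow > IN_W - FILT_W || ff >= N_FILT)
        return;
    float acc = biases[ff];
    for (int cc = 0; cc < N_CHAN; cc++)
        for (int fh = 0; fh < FILT_H; fh++)
            for (int fw = 0; fw < FILT_W; fw++)
                acc += data[((oh + fh) * IN_W + (ow + fw)) * N_CHAN + cc] *
                       weights[((fh * FILT_W + fw) * N_CHAN + cc) * N_FILT + ff];
    const int out_w = IN_W - FILT_W + 1;
    res[(oh * out_w + ow) * N_FILT + ff] = acc > 0 ? acc : 0; // fused ReLU
}

// instantiated per layer, e.g. for the second conv layer:
// conv_2d_t<12, 12, 6, 16, 5, 5><<<grid, block>>>(d_in, d_out, d_w3, d_b3);
```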
87 |
88 | ## LeNet5 in Keras
89 | For comparison, the evaluation is also done with Keras + TF:
90 |
91 | ![keras lenet inference](misc/keras_lenet_infer.png)
92 |
93 | The reduction in latency flattens out mainly due to the limited memory on my laptop; if the CPU/GPU memory were large enough, the framework implementation would show even better scalability.
94 |
95 | Note that the CPU and GPU used here (Intel i7-7500 and GeForce 940MX) differ from those above, because this experiment was run on my own laptop.
96 |
--------------------------------------------------------------------------------
/build_prj.tcl:
--------------------------------------------------------------------------------
1 | #################
2 | # HLS4ML
3 | #################
4 | array set opt {
5 | csim 1
6 | synth 1
7 | cosim 1
8 | export 1
9 | }
10 |
11 | foreach arg $::argv {
12 | foreach o [lsort [array names opt]] {
13 | regexp "$o +(\\w+)" $arg unused opt($o)
14 | }
15 | }
16 |
17 | open_project -reset hls_prj
18 | set_top lenet5
19 | add_files firmware/lenet5.cpp -cflags "-I[file normalize ./nnet_utils] -std=c++0x"
20 | add_files -tb lenet5_test.cpp -cflags "-I[file normalize ./nnet_utils] -std=c++0x"
21 | add_files -tb firmware/weights
22 | add_files -tb test_images
23 | #add_files -tb tb_data
24 | open_solution -reset "solution1"
25 | catch {config_array_partition -maximum_size 4096}
26 | #set_part {xcku095-ffvb2104-1-c}
27 | set_part {xcku115-flvb2104-2-e}
28 | create_clock -period 10 -name default
29 |
30 | if {$opt(csim)} {
31 | puts "***** C SIMULATION *****"
32 | csim_design
33 | }
34 |
35 | if {$opt(synth)} {
36 | puts "***** C/RTL SYNTHESIS *****"
37 | csynth_design
38 | if {$opt(cosim)} {
39 | puts "***** C/RTL SIMULATION *****"
40 | cosim_design -trace_level all
41 | }
42 | if {$opt(export)} {
43 | puts "***** EXPORT IP *****"
44 | export_design -format ip_catalog
45 | }
46 | }
47 |
48 | exit
49 |
--------------------------------------------------------------------------------
/firmware/lenet5.cpp:
--------------------------------------------------------------------------------
1 | //
2 | // rfnoc-hls-neuralnet: Vivado HLS code for neural-net building blocks
3 | //
4 | // Copyright (C) 2017 EJ Kreinar
5 | //
6 | // This program is free software: you can redistribute it and/or modify
7 | // it under the terms of the GNU General Public License as published by
8 | // the Free Software Foundation, either version 3 of the License, or
9 | // (at your option) any later version.
10 | //
11 | // This program is distributed in the hope that it will be useful,
12 | // but WITHOUT ANY WARRANTY; without even the implied warranty of
13 | // MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
14 | // GNU General Public License for more details.
15 | //
16 | // You should have received a copy of the GNU General Public License
17 | // along with this program. If not, see <http://www.gnu.org/licenses/>.
18 | /////////////////////////////////////////////////////////////////////////////
19 | // modified by Yuxi Sun
20 | // Keras trained accuracy 98.89%
21 | // 16-bit 98.87%
22 |
23 | #include <iostream>
24 |
25 | #include "parameters.h"
26 | #include "lenet5.h"
27 |
28 | #include "nnet_layer.h"
29 | #include "nnet_conv.h"
30 | #include "nnet_conv2d.h"
31 | #include "nnet_batchnorm.h"
32 | #include "nnet_activation.h"
33 | #include "nnet_pooling.h"
34 |
35 | //hls-fpga-machine-learning insert weights
36 | #include "weights/w1.h"
37 | #include "weights/b1.h"
38 | #include "weights/w3.h"
39 | #include "weights/b3.h"
40 | #include "weights/w5.h"
41 | #include "weights/b5.h"
42 | #include "weights/w6.h"
43 | #include "weights/b6.h"
44 | #include "weights/w7.h"
45 | #include "weights/b7.h"
46 |
47 | void lenet5(
48 | input_t data[IN_HEIGHT_1*IN_WIDTH_1*N_CHAN_1],
49 | result_t res[N_OUTPUTS])
50 | {
51 |
52 | //hls-fpga-machine-learning insert IO
53 | #pragma HLS INTERFACE axis port=data,res
54 |
55 | // ****************************************
56 | // NETWORK INSTANTIATION
57 | // ****************************************
58 |
59 | //hls-fpga-machine-learning insert layers
60 |
61 | layer1_t conv2d_layer1_out[OUT_HEIGHT_1][OUT_WIDTH_1][N_FILT_1];
62 | #pragma HLS array_partition variable=conv2d_layer1_out dim=3
63 |
64 | nnet::conv_2d<input_t, layer1_t, config1>(data, conv2d_layer1_out, w1, b1);
65 |
66 | layer1_t pool2d_layer2_out[OUT_HEIGHT_2*OUT_WIDTH_2*N_FILT_2];
67 | nnet::pooling2d<layer1_t, config2>(conv2d_layer1_out, pool2d_layer2_out);
68 |
69 | layer3_t conv2d_layer3_out[OUT_HEIGHT_3][OUT_WIDTH_3][N_FILT_3];
70 | #pragma HLS array_partition variable=conv2d_layer3_out dim=3
71 |
72 | nnet::conv_2d<layer1_t, layer3_t, config3>(pool2d_layer2_out, conv2d_layer3_out, w3, b3);
73 |
74 | layer3_t layer4_out[OUT_HEIGHT_4*OUT_WIDTH_4*N_FILT_4];
75 | nnet::pooling2d<layer3_t, config4>(conv2d_layer3_out, layer4_out);
76 |
77 | layer5_t layer5_out[N_LAYER_5];
78 | nnet::compute_layer<layer3_t, layer5_t, config5>(layer4_out, layer5_out, w5, b5);
79 |
80 | layer6_t layer6_out[N_LAYER_6];
81 | nnet::compute_layer<layer5_t, layer6_t, config6>(layer5_out, layer6_out, w6, b6);
82 |
83 | layer6_t logits7[N_OUTPUTS];
84 |
85 | nnet::compute_layer<layer6_t, layer6_t, config7>(layer6_out, logits7, w7, b7);
86 |
87 | nnet::softmax<layer6_t, result_t, softmax_config7>(logits7, res);
88 |
89 | }
90 |
--------------------------------------------------------------------------------
/firmware/lenet5.h:
--------------------------------------------------------------------------------
1 | //
2 | // rfnoc-hls-neuralnet: Vivado HLS code for neural-net building blocks
3 | //
4 | // Copyright (C) 2017 EJ Kreinar
5 | //
6 | // This program is free software: you can redistribute it and/or modify
7 | // it under the terms of the GNU General Public License as published by
8 | // the Free Software Foundation, either version 3 of the License, or
9 | // (at your option) any later version.
10 | //
11 | // This program is distributed in the hope that it will be useful,
12 | // but WITHOUT ANY WARRANTY; without even the implied warranty of
13 | // MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
14 | // GNU General Public License for more details.
15 | //
16 | // You should have received a copy of the GNU General Public License
17 | // along with this program. If not, see <http://www.gnu.org/licenses/>.
18 | //
19 |
20 | #ifndef LENET5_H_
21 | #define LENET5_H_
22 |
23 | #include <complex>
24 | #include "ap_int.h"
25 | #include "ap_fixed.h"
26 |
27 | #include "parameters.h"
28 |
29 |
30 | // Prototype of top level function for C-synthesis
31 | void lenet5(
32 | input_t data[IN_HEIGHT_1*IN_WIDTH_1*N_CHAN_1],
33 | result_t res[N_OUTPUTS]);
34 |
35 |
36 | #endif
37 |
38 |
--------------------------------------------------------------------------------
/firmware/parameters.h:
--------------------------------------------------------------------------------
1 | #ifndef PARAMETERS_H_
2 | #define PARAMETERS_H_
3 | #include "/home/cad/xilinx/Vivado-2017.2/Vivado_HLS/2017.2/include/gmp.h"
4 | #include <complex>
5 | #include "ap_int.h"
6 | #include "ap_fixed.h"
7 | #include "nnet_layer.h"
8 | #include "nnet_conv.h"
9 | #include "nnet_conv2d.h"
10 | #include "nnet_activation.h"
11 | #include "nnet_common.h"
12 | #include "nnet_batchnorm.h"
13 | #include "nnet_pooling.h"
14 |
15 | //hls-fpga-machine-learning insert numbers
16 | typedef ap_fixed<16,8> accum_default_t;
17 | typedef ap_fixed<16,4> weight_default_t;
18 | typedef ap_fixed<16,4> bias_default_t;
19 | typedef ap_fixed<16,6> input_t;
20 | typedef ap_fixed<16,6> result_t;
21 |
22 | //typedef float accum_default_t;
23 | //typedef float weight_default_t;
24 | //typedef float bias_default_t;
25 | //typedef float input_t;
26 | //typedef float result_t;
27 |
28 |
29 | #define IN_HEIGHT_1 28
30 | #define IN_WIDTH_1 28
31 | #define N_CHAN_1 1
32 | #define OUT_HEIGHT_1 24
33 | #define OUT_WIDTH_1 24
34 | #define N_FILT_1 6
35 | #define IN_HEIGHT_2 24
36 | #define IN_WIDTH_2 24
37 | #define OUT_HEIGHT_2 12
38 | #define OUT_WIDTH_2 12
39 | #define POOL_HEIGHT_2 2
40 | #define POOL_WIDTH_2 2
41 | #define N_FILT_2 6
42 | #define N_LAYER_2 864
43 | #define IN_HEIGHT_3 12
44 | #define IN_WIDTH_3 12
45 | #define N_CHAN_3 6
46 | #define OUT_HEIGHT_3 8
47 | #define OUT_WIDTH_3 8
48 | #define N_FILT_3 16
49 | #define IN_HEIGHT_4 8
50 | #define IN_WIDTH_4 8
51 | #define OUT_HEIGHT_4 4
52 | #define OUT_WIDTH_4 4
53 | #define POOL_HEIGHT_4 2
54 | #define POOL_WIDTH_4 2
55 | #define N_FILT_4 16
56 | #define N_LAYER_4 256
57 | #define N_LAYER_5 120
58 | #define N_LAYER_6 84
59 | #define N_OUTPUTS 10
60 |
61 | //hls-fpga-machine-learning insert layer-precision
62 | typedef ap_fixed<16,6> layer1_t;
63 | typedef ap_fixed<16,6> layer2_t;
64 | typedef ap_fixed<16,6> layer3_t;
65 | typedef ap_fixed<16,6> layer4_t;
66 | typedef ap_fixed<16,6> layer5_t;
67 | typedef ap_fixed<16,6> layer6_t;
68 |
69 | //typedef float layer1_t;
70 | //typedef float layer2_t;
71 | //typedef float layer3_t;
72 | //typedef float layer4_t;
73 | //typedef float layer5_t;
74 | //typedef float layer6_t;
75 |
76 | //hls-fpga-machine-learning insert layer-config
77 | struct config1 : nnet::conv2d_config {
78 | static const unsigned pad_top = 0;
79 | static const unsigned pad_bottom = 0;
80 | static const unsigned pad_left = 0;
81 | static const unsigned pad_right = 0;
82 | static const unsigned in_height = IN_HEIGHT_1;
83 | static const unsigned in_width = IN_WIDTH_1;
84 | static const unsigned n_chan = N_CHAN_1;
85 | static const unsigned filt_height = 5;
86 | static const unsigned filt_width = 5;
87 | static const unsigned n_filt = N_FILT_1;
88 | static const unsigned stride_height = 1;
89 | static const unsigned stride_width = 1;
90 | static const unsigned out_height = OUT_HEIGHT_1;
91 | static const unsigned out_width = OUT_WIDTH_1;
92 | static const unsigned reuse_factor = 50;
93 | static const unsigned n_zeros = 0;
94 | static const bool store_weights_in_bram = false;
95 | typedef accum_default_t accum_t;
96 | typedef bias_default_t bias_t;
97 | typedef weight_default_t weight_t;
98 | };
99 | struct relu_config1 : nnet::activ_config {
100 | static const unsigned n_in = OUT_HEIGHT_1*OUT_WIDTH_1*N_FILT_1;
101 | static const unsigned table_size = 1024;
102 | static const unsigned io_type = nnet::io_serial;
103 | };
104 | struct config2 : nnet::pooling2d_config {
105 | static const unsigned in_height = IN_HEIGHT_2;
106 | static const unsigned in_width = IN_WIDTH_2;
107 | static const unsigned n_filt = N_FILT_2;
108 | static const unsigned stride_height = 2;
109 | static const unsigned stride_width = 2;
110 | static const unsigned pool_height = 2;
111 | static const unsigned pool_width = 2;
112 | static const unsigned out_height = OUT_HEIGHT_2;
113 | static const unsigned out_width = OUT_WIDTH_2;
114 | static const unsigned pad_top = 0;
115 | static const unsigned pad_bottom = 0;
116 | static const unsigned pad_left = 0;
117 | static const unsigned pad_right = 0;
118 | // static const nnet::Pool_Op pool_op = nnet::Max;
119 | static const unsigned reuse = 50;
120 | };
121 |
122 | struct config3 : nnet::conv2d_config {
123 | static const unsigned pad_top = 0;
124 | static const unsigned pad_bottom = 0;
125 | static const unsigned pad_left = 0;
126 | static const unsigned pad_right = 0;
127 | static const unsigned in_height = IN_HEIGHT_3;
128 | static const unsigned in_width = IN_WIDTH_3;
129 | static const unsigned n_chan = N_CHAN_3;
130 | static const unsigned filt_height = 5;
131 | static const unsigned filt_width = 5;
132 | static const unsigned n_filt = N_FILT_3;
133 | static const unsigned stride_height = 1;
134 | static const unsigned stride_width = 1;
135 | static const unsigned out_height = OUT_HEIGHT_3;
136 | static const unsigned out_width = OUT_WIDTH_3;
137 | static const unsigned reuse_factor = 50;
138 | static const unsigned n_zeros = 0;
139 | static const bool store_weights_in_bram = false;
140 | typedef accum_default_t accum_t;
141 | typedef bias_default_t bias_t;
142 | typedef weight_default_t weight_t;
143 | };
144 | struct relu_config3 : nnet::activ_config {
145 | static const unsigned n_in = OUT_HEIGHT_3*OUT_WIDTH_3*N_FILT_3;
146 | static const unsigned table_size = 1024;
147 | static const unsigned io_type = nnet::io_serial;
148 | };
149 | struct config4 : nnet::pooling2d_config {
150 | static const unsigned in_height = IN_HEIGHT_4;
151 | static const unsigned in_width = IN_WIDTH_4;
152 | static const unsigned n_filt = N_FILT_4;
153 | static const unsigned stride_height = 2;
154 | static const unsigned stride_width = 2;
155 | static const unsigned pool_height = 2;
156 | static const unsigned pool_width = 2;
157 | static const unsigned out_height = OUT_HEIGHT_4;
158 | static const unsigned out_width = OUT_WIDTH_4;
159 | static const unsigned pad_top = 0;
160 | static const unsigned pad_bottom = 0;
161 | static const unsigned pad_left = 0;
162 | static const unsigned pad_right = 0;
163 | // static const nnet::Pool_Op pool_op = nnet::Max;
164 | static const unsigned reuse = 50;
165 | };
166 |
167 | struct config5 : nnet::layer_config {
168 | static const unsigned n_in = N_LAYER_4;
169 | static const unsigned n_out = N_LAYER_5;
170 | static const unsigned io_type = nnet::io_serial;
171 | static const unsigned reuse_factor = 24;
172 | static const unsigned n_zeros = 0;
173 | static const bool store_weights_in_bram = false;
174 | typedef accum_default_t accum_t;
175 | typedef bias_default_t bias_t;
176 | typedef weight_default_t weight_t;
177 | };
178 | struct relu_config5 : nnet::activ_config {
179 | static const unsigned n_in = N_LAYER_5;
180 | static const unsigned table_size = 1024;
181 | static const unsigned io_type = nnet::io_serial;
182 | };
183 | struct config6 : nnet::layer_config {
184 | static const unsigned n_in = N_LAYER_5;
185 | static const unsigned n_out = N_LAYER_6;
186 | static const unsigned io_type = nnet::io_serial;
187 | static const unsigned reuse_factor = 12;
188 | static const unsigned n_zeros = 0;
189 | static const bool store_weights_in_bram = false;
190 | typedef accum_default_t accum_t;
191 | typedef bias_default_t bias_t;
192 | typedef weight_default_t weight_t;
193 | };
194 | struct relu_config6 : nnet::activ_config {
195 | static const unsigned n_in = N_LAYER_6;
196 | static const unsigned table_size = 1024;
197 | static const unsigned io_type = nnet::io_serial;
198 | };
199 | struct config7 : nnet::layer_config {
200 | static const unsigned n_in = N_LAYER_6;
201 | static const unsigned n_out = N_OUTPUTS;
202 | static const unsigned io_type = nnet::io_serial;
203 | static const unsigned reuse_factor = 2;
204 | static const unsigned n_zeros = 0;
205 | static const bool store_weights_in_bram = false;
206 | typedef accum_default_t accum_t;
207 | typedef bias_default_t bias_t;
208 | typedef weight_default_t weight_t;
209 | };
210 | struct softmax_config7 : nnet::activ_config {
211 | static const unsigned n_in = N_OUTPUTS;
212 | static const unsigned table_size = 2048;
213 | static const unsigned io_type = nnet::io_serial;
214 | };
215 |
216 | #endif
217 |
--------------------------------------------------------------------------------
/firmware/weights/b1.h:
--------------------------------------------------------------------------------
1 | //Numpy array shape (6,)
2 | //Min -0.073600828648
3 | //Max 0.025575041771
4 | //Number of zeros 0
5 |
6 | bias_default_t b1[6] = {0.025575041771, -0.002153477632, -0.028862904757, 0.003258966608, -0.073600828648, -0.040061172098};
7 |
--------------------------------------------------------------------------------
/firmware/weights/b3.h:
--------------------------------------------------------------------------------
1 | //Numpy array shape (16,)
2 | //Min -0.126525104046
3 | //Max 0.138503521681
4 | //Number of zeros 0
5 |
6 | bias_default_t b3[16] = {0.019073441625, -0.025135762990, 0.138503521681, -0.052502352744, -0.108942948282, -0.054933834821, -0.029950363562, -0.001227431814, -0.050029836595, -0.005677441135, 0.016388963908, 0.051268998533, -0.126525104046, 0.051677554846, 0.063460767269, -0.061336446553};
7 |
--------------------------------------------------------------------------------
/firmware/weights/b5.h:
--------------------------------------------------------------------------------
1 | //Numpy array shape (120,)
2 | //Min -0.079266294837
3 | //Max 0.081733293831
4 | //Number of zeros 0
5 |
6 | bias_default_t b5[120] = {0.039675883949, 0.047414675355, -0.031112968922, 0.004870147444, 0.012527243234, 0.014310999773, 0.013606126420, -0.010654700920, -0.006378280465, 0.030494974926, -0.021389497444, -0.006791594438, -0.004903488327, -0.009693176486, -0.000801904185, 0.056749109179, 0.007086376194, 0.005952423904, 0.018398677930, 0.012842475437, 0.006151590496, -0.021435260773, 0.011887852103, 0.007385193370, -0.014455468394, 0.036256261170, 0.025524726138, -0.004292924888, 0.028110796586, -0.010723481886, -0.011720723473, -0.005894732662, 0.035589549690, -0.015315731056, -0.024434080347, 0.031847618520, -0.011871568859, -0.031746041030, -0.079266294837, -0.036366876215, 0.009773124941, -0.033121760935, 0.017605395988, 0.001996837324, -0.013033658266, -0.035522438586, 0.000150386943, -0.014253868721, -0.004027070478, 0.017896959558, 0.015300537460, -0.027950471267, -0.028436947614, 0.014966204762, 0.059657003731, -0.059325706214, -0.031564500183, -0.022113179788, -0.027355778962, -0.028248533607, 0.026831803843, 0.001774800825, 0.008102251217, -0.029552709311, -0.019350159913, -0.012236636132, -0.021983670071, 0.007654092275, 0.027464279905, 0.004667683970, 0.081733293831, 0.042503248900, -0.007881767116, -0.028477057815, 0.065044887364, -0.021476658061, 0.004067112692, -0.008710767142, -0.009215495549, -0.009196995758, -0.036572128534, 0.000836184947, 0.006920416839, -0.045609600842, -0.011299545877, -0.022592708468, 0.038450844586, -0.040808301419, 0.002801277675, 0.040155984461, -0.004923629574, 0.002174060093, -0.000011097349, 0.046220581979, 0.000668383786, 0.007120844442, 0.001532270457, -0.002347494941, -0.001802830724, -0.051685601473, 0.016726061702, -0.030072432011, 0.058989658952, -0.029157351702, 0.013907114044, -0.012826438062, 0.071609564126, 0.057116344571, -0.012214791030, 0.004092651419, -0.013728966936, 0.022518446669, -0.000248747179, 0.027226334438, -0.015682749450, 0.002545378637, 0.029434086755, 0.018800765276, 0.002803800395, 0.011859890074};
7 |
--------------------------------------------------------------------------------
/firmware/weights/b6.h:
--------------------------------------------------------------------------------
1 | //Numpy array shape (84,)
2 | //Min -0.054779332131
3 | //Max 0.070629991591
4 | //Number of zeros 0
5 |
6 | bias_default_t b6[84] = {-0.032966867089, -0.006983916275, -0.038047999144, -0.009882482700, 0.060959815979, -0.007536238991, 0.020086284727, -0.041775535792, -0.009785956703, -0.005947063211, 0.035524152219, -0.041092224419, -0.019280964509, 0.031784139574, -0.048983544111, -0.015788450837, -0.023783938959, -0.012035987340, -0.017229881138, -0.007876897231, 0.069782279432, 0.047852888703, -0.003501100000, -0.016193069518, -0.010035633110, 0.070629991591, 0.021696722135, 0.032718691975, -0.037517290562, 0.008626065217, 0.004888438620, -0.040267899632, 0.012565228157, 0.024001777172, -0.023923447356, 0.050600674003, 0.027741624042, 0.029601853341, 0.038732375950, -0.027232430875, 0.008991451934, -0.027070021257, -0.054779332131, -0.029436258599, 0.017098240554, 0.046566776931, 0.040031053126, -0.028923820704, 0.040764026344, 0.001271030167, 0.030982332304, -0.004330385476, -0.011399333365, 0.024214068428, 0.053599197417, -0.005508277100, -0.000602343120, 0.014781150967, 0.003338275244, -0.006147520151, 0.030861420557, 0.002868968062, 0.022815028206, 0.062813736498, -0.015112570487, -0.012712499127, -0.032406125218, 0.046431940049, 0.000838646258, -0.018054954708, -0.017339974642, 0.035298682749, 0.026389500126, -0.012466176413, 0.038199126720, 0.037746813148, -0.005005435087, -0.004659494385, -0.019553482533, 0.026205413043, 0.043333437294, -0.018712880090, -0.019705660641, -0.007884482853};
7 |
--------------------------------------------------------------------------------
/firmware/weights/b7.h:
--------------------------------------------------------------------------------
1 | //Numpy array shape (10,)
2 | //Min -0.055797688663
3 | //Max 0.048208173364
4 | //Number of zeros 0
5 |
6 | bias_default_t b7[10] = {-0.018838690594, -0.055797688663, 0.001113062957, 0.024918084964, -0.007798998151, 0.016095835716, -0.029257413000, -0.029023772106, 0.048208173364, 0.002171966480};
7 |
--------------------------------------------------------------------------------
/firmware/weights/w1.h:
--------------------------------------------------------------------------------
1 | //Numpy array shape (5, 5, 1, 6)
2 | //Min -0.556057095528
3 | //Max 0.515402376652
4 | //Number of zeros 0
5 |
6 | static weight_default_t w1[150] = {-0.337356805801, -0.166020527482, -0.188039079309, 0.221865758300, 0.118748374283, 0.205742001534, -0.341677039862, 0.113768704236, -0.255915433168, 0.379037320614, 0.399055749178, -0.032215941697, -0.161198616028, 0.085498139262, -0.293242841959, -0.072428748012, 0.034931562841, 0.194200292230, -0.428016811609, 0.173208594322, 0.209636434913, -0.013107879087, -0.220337077975, 0.130188524723, -0.258916825056, -0.141600295901, 0.047698359936, 0.145651534200, -0.067987322807, 0.212970897555, -0.272383660078, 0.144383355975, 0.164759427309, 0.297457188368, 0.359256207943, 0.179455235600, -0.178463071585, 0.010230756365, 0.083344623446, 0.299191415310, 0.231550797820, 0.121012106538, 0.003026356455, 0.295901000500, 0.168225944042, 0.123389333487, -0.043575864285, 0.133341312408, 0.290596395731, -0.323337644339, 0.309673756361, -0.041217245162, -0.150600522757, 0.268381357193, 0.274468153715, -0.556057095528, 0.210920631886, 0.145993307233, -0.370939105749, 0.198138505220, 0.346928298473, 0.114450491965, 0.190081506968, -0.105560749769, 0.189418435097, -0.299595803022, 0.398651927710, 0.076693527400, 0.349568158388, -0.062701240182, 0.318182200193, -0.344217628241, 0.250854939222, 0.268899887800, -0.007958161645, -0.053904056549, 0.002088906709, -0.210342153907, 0.349439471960, -0.133583903313, 0.403515696526, 0.023825678974, -0.188490077853, 0.370289236307, 0.443564862013, -0.285099953413, 0.054614417255, -0.366709500551, -0.058639660478, 0.284780204296, 0.515402376652, 0.203695401549, -0.238887205720, -0.104580439627, 0.063162237406, -0.398537576199, 0.422673583031, 0.176001831889, 0.064747899771, -0.101257503033, 0.329994797707, -0.133417576551, 0.407677322626, 0.198134168983, 0.088186025620, 0.081647217274, -0.106153331697, 0.113117016852, 0.014117936604, 0.431491523981, 0.198787570000, -0.116621375084, -0.143898248672, 0.143751874566, -0.120333328843, 0.215597599745, 0.325393885374, -0.194255679846, -0.112421609461, 0.332679033279, 0.311206549406, 0.011099584401, 0.035101063550, -0.026785384864, 0.104625143111, -0.140990629792, 0.167671322823, -0.021893363446, 0.195220470428, -0.230315566063, 0.156861975789, -0.062336876988, 0.104853853583, 0.153059750795, 0.102880015969, 0.023699967191, 0.103424213827, -0.195286318660, -0.121365040541, 0.346287786961, 0.268168479204, 0.210108593106, -0.067693904042, 0.137459874153, -0.278216630220, 0.244398459792, 0.380732327700, 0.121165528893, -0.032080478966, -0.022464865819};
7 |
--------------------------------------------------------------------------------
/firmware/weights/w7.h:
--------------------------------------------------------------------------------
1 | //Numpy array shape (84, 10)
2 | //Min -0.445905119181
3 | //Max 0.383387386799
4 | //Number of zeros 0
5 |
6 | weight_default_t w7[840] = {0.158005610108, -0.043376982212, 0.106894016266, -0.249968707561, 0.296450734138, 0.151257947087, 0.034446492791, 0.163941845298, 0.130415678024, -0.129540920258, -0.134187310934, 0.070226028562, -0.380243420601, 0.000828472548, -0.240292623639, -0.162254884839, -0.318907499313, 0.074252367020, -0.202085882425, 0.201116263866, -0.341637492180, 0.255519777536, 0.271833598614, -0.002569842385, -0.142709329724, -0.393035233021, -0.096032135189, 0.112649045885, -0.228785306215, 0.049684077501, 0.269195169210, 0.268016368151, 0.161419898272, 0.052899282426, -0.254042267799, -0.248251482844, 0.027554528788, -0.019254451618, -0.199356436729, 0.085713505745, -0.410697966814, -0.122219592333, 0.017889268696, 0.123167231679, 0.114054910839, -0.180167570710, -0.121954932809, -0.364478617907, 0.305438578129, 0.208923056722, -0.103394448757, 0.247982323170, -0.249631151557, -0.182495117188, 0.245056390762, 0.057899601758, 0.135269001126, -0.162857502699, -0.010265144520, -0.226797193289, -0.166845038533, -0.128351867199, 0.230591610074, 0.291824877262, 0.148195669055, 0.144319713116, 0.108214668930, 0.296887457371, 0.047924801707, -0.154055327177, -0.071609556675, 0.099036298692, 0.112075723708, -0.262341707945, 0.131417915225, -0.006981641054, 0.141048863530, 0.358584761620, -0.183069974184, -0.099435947835, -0.215709537268, 0.158688947558, 0.026317330077, 0.048952221870, 0.337412536144, 0.050247833133, -0.102598793805, 0.145618751645, -0.141868233681, 0.301115721464, -0.260697335005, 0.286581814289, -0.104362778366, -0.118936590850, -0.133404329419, 0.215725213289, 0.108833484352, 0.194671913981, 0.059085410088, -0.363948047161, 0.260638147593, -0.318627595901, 0.061759497970, 0.137524962425, -0.073511809111, 0.014380044304, 0.209948122501, -0.087677299976, -0.001158903004, 0.059979405254, 0.085524603724, 0.151237607002, 0.182627916336, -0.172186538577, 0.224599972367, -0.016450982541, -0.005781541113, 0.031966693699, -0.076405473053, 0.207429498434, -0.064441233873, 0.145792469382, -0.100332334638, 0.185925602913, 0.315058708191, 0.154848232865, -0.252343416214, -0.044687792659, -0.073273263872, -0.063575178385, -0.318646013737, 0.035995956510, -0.115238443017, -0.030483899638, 0.093785822392, 0.295889168978, -0.288887679577, -0.130339384079, 0.044930744916, 0.184055224061, 0.080510988832, -0.137271746993, 0.061044402421, -0.196326240897, 0.193863436580, 0.001045982586, 0.230225756764, 0.101309031248, -0.246748000383, 0.009048961103, -0.080060176551, 0.194932967424, 0.095723286271, 0.073188789189, -0.226636469364, -0.239742860198, -0.247538164258, -0.221084207296, -0.077816098928, 0.046228244901, -0.022668950260, 0.290773987770, -0.133565083146, -0.074981085956, -0.112711749971, -0.171477600932, 0.193956851959, 0.160796061158, -0.032885845751, -0.129469186068, -0.065076895058, 0.046860903502, 0.227259755135, -0.013438279741, -0.223281547427, 0.152131438255, -0.004285010975, 0.314181268215, -0.224093437195, 0.009349434637, 0.019527383149, 0.190265953541, -0.202562376857, -0.209192559123, -0.064274422824, 0.182464137673, 0.202432140708, -0.330512583256, -0.266898751259, 0.225405246019, 0.264710426331, 0.137148588896, -0.061155617237, -0.017475735396, -0.179177612066, 0.118388324976, 0.242468342185, -0.259593635798, -0.213437095284, -0.240455135703, 0.000052735035, -0.212073028088, -0.261629581451, 0.245380371809, -0.349220395088, 0.366210103035, 0.123602554202, -0.001075451495, 0.110198594630, -0.058287613094, -0.199615150690, -0.183991581202, 0.223627552390, 
0.171774715185, 0.159030035138, 0.300830423832, -0.191813975573, -0.265622347593, 0.061548296362, -0.102706335485, 0.089507900178, 0.183504924178, -0.136830046773, -0.110319599509, 0.022936495021, -0.253026902676, 0.137884676456, 0.010794902220, 0.169555619359, -0.348827362061, -0.244527891278, -0.245820179582, 0.173554226756, 0.033750545233, 0.078170515597, -0.233431816101, -0.279433727264, -0.112630710006, -0.177222266793, -0.339410156012, -0.190735071898, -0.145809650421, 0.053238347173, -0.140944704413, -0.078355178237, -0.027454897761, 0.070022962987, -0.192877262831, 0.069557800889, 0.035686235875, -0.060768496245, -0.180283844471, -0.192280635238, 0.164245396852, 0.212596893311, 0.023750033230, -0.303838223219, -0.246320441365, 0.212766736746, 0.200781017542, 0.024458182976, -0.080818712711, 0.111811608076, 0.213724359870, 0.119685322046, -0.169394373894, 0.294687896967, -0.231704428792, 0.221804305911, -0.238489240408, 0.045679114759, -0.025214683264, 0.147322371602, 0.177135929465, -0.188021838665, -0.049298811704, -0.025713238865, -0.128057926893, -0.093972332776, -0.436797559261, 0.220227971673, 0.046384599060, 0.260471940041, -0.153533041477, -0.182784914970, -0.153854265809, 0.150007352233, 0.161255031824, -0.277274698019, -0.120791301131, 0.056680072099, 0.257442831993, 0.212988600135, -0.072707355022, -0.060521796346, 0.014859840274, -0.110486790538, 0.102671124041, 0.187223419547, -0.093237452209, -0.388978749514, -0.128342494369, 0.019626913592, 0.050754208118, -0.195395052433, -0.028055172414, -0.097035504878, -0.014613196254, -0.173595055938, -0.050540667027, -0.171120584011, 0.032859876752, -0.099307060242, -0.250188291073, 0.051582530141, 0.059866689146, 0.071477413177, 0.277393430471, -0.329656243324, 0.178486794233, 0.024647815153, -0.235502496362, -0.212460905313, 0.138294443488, 0.033274888992, 0.011759853922, 0.147524192929, -0.325585424900, -0.073094502091, -0.265140205622, 0.075583636761, -0.307868957520, 0.162386417389, 0.272691220045, 0.218081191182, -0.227166384459, 0.059947069734, -0.116349518299, 0.071129485965, -0.030066791922, 0.161338701844, 0.197275057435, 0.111550465226, -0.121453173459, 0.115724831820, -0.038669165224, -0.203299120069, -0.445905119181, -0.110864706337, 0.002372163581, -0.069025769830, -0.054567210376, -0.048503924161, -0.164584815502, -0.021305995062, -0.012959050946, 0.034474845976, -0.192813023925, 0.087766118348, 0.242882579565, -0.042394418269, 0.168772786856, -0.219003096223, 0.169869884849, -0.002946683671, 0.307337999344, -0.346390962601, 0.156434014440, 0.140110328794, -0.023036377504, 0.035302270204, 0.112601108849, 0.177803516388, 0.314446657896, -0.319686174393, -0.220984265208, -0.338261961937, -0.034914594144, 0.153967320919, -0.055721294135, 0.030506292358, -0.260777711868, 0.044716630131, -0.200132906437, 0.192225247622, -0.321678161621, 0.213373616338, -0.038198407739, 0.257468968630, -0.237947806716, -0.045104809105, 0.195259764791, -0.092188380659, -0.403736293316, 0.123156256974, 0.121016740799, 0.163569733500, -0.025999309495, -0.061096452177, 0.097517631948, 0.291860729456, -0.399122506380, 0.089154683053, 0.061081398278, 0.225489929318, -0.242583587766, -0.140340417624, 0.038864091039, 0.033871170133, 0.146254912019, 0.269019067287, -0.096504352987, -0.032481841743, -0.297918349504, -0.223599553108, -0.114010393620, -0.098116107285, 0.211236208677, 0.044439602643, 0.197085425258, 0.376601040363, 0.165402606130, 0.042070850730, -0.029734658077, -0.298287004232, -0.119295492768, 0.035733819008, 0.272028714418, 
-0.132256671786, -0.082522429526, 0.109776720405, 0.272146224976, -0.185996308923, -0.397890925407, 0.197470590472, -0.065758593380, 0.096350677311, -0.108620285988, -0.155874118209, 0.168332636356, -0.347300887108, -0.071722187102, -0.218186587095, -0.257285952568, 0.054029218853, -0.013523399830, 0.097584098577, -0.273228794336, 0.030551925302, -0.028827151284, -0.183592677116, -0.044914804399, -0.266385436058, 0.192293971777, -0.002694291063, -0.296956807375, -0.061620336026, -0.163921192288, 0.102376550436, 0.155157446861, -0.024387074634, -0.267213970423, -0.257187217474, 0.311536341906, -0.116234950721, 0.225877016783, -0.070594310760, 0.082555517554, -0.048209968954, 0.045186694711, 0.350552618504, -0.012207509018, -0.146033570170, -0.186824470758, -0.153115957975, 0.152044594288, 0.032624639571, 0.207444339991, -0.052152544260, -0.181798428297, -0.140989899635, -0.296825557947, 0.157774552703, -0.000830598932, -0.269895493984, -0.227217078209, -0.003318316769, 0.013978756033, 0.159334227443, 0.062820971012, -0.233940780163, 0.215134114027, 0.248396009207, -0.288663536310, 0.250986844301, -0.163136988878, -0.263857871294, -0.130175307393, 0.165610581636, -0.192558839917, -0.219191819429, 0.023926926777, -0.166473537683, 0.271100789309, -0.278952091932, -0.197180613875, -0.368530273438, 0.240831926465, 0.106002546847, -0.044784355909, -0.152890205383, -0.177970007062, 0.240476667881, 0.128601387143, -0.032415930182, -0.103920817375, 0.086150206625, -0.274742007256, -0.151481047273, -0.167055368423, -0.248485505581, 0.109575651586, -0.162529706955, 0.230849057436, -0.079145438969, -0.011703374796, 0.211511075497, 0.119734928012, -0.079603835940, -0.332313269377, 0.032247547060, -0.012120667845, -0.362646162510, -0.137939035892, 0.150869429111, 0.210852041841, 0.119124419987, 0.234067142010, -0.044997043908, 0.021575605497, -0.175984650850, -0.343068420887, -0.225205719471, -0.155486732721, -0.172722712159, -0.067401103675, -0.278219848871, 0.112901538610, 0.233279675245, 0.236374720931, 0.244900166988, 0.274418205023, -0.064050070941, -0.190515026450, 0.067824035883, -0.157201156020, -0.168657094240, 0.022386441007, -0.163128599524, 0.036755729467, 0.136954069138, 0.019107209519, -0.331635564566, -0.026404134929, 0.061567984521, 0.061398770660, -0.115359261632, -0.275036066771, -0.152811706066, -0.074625514448, -0.217804074287, -0.121398515999, 0.256813645363, 0.068942978978, -0.150695994496, -0.211128503084, 0.188574567437, -0.360933154821, -0.222834318876, 0.267977237701, 0.149768203497, -0.350394636393, -0.357828915119, -0.277044624090, -0.025714412332, 0.109299816191, 0.025583855808, -0.314507573843, -0.108975008130, 0.028974866495, -0.198088318110, 0.160630270839, -0.075293757021, -0.259277343750, -0.249163061380, 0.226878166199, 0.299971997738, -0.175861015916, 0.017034573480, -0.039896417409, 0.042056702077, 0.152374371886, -0.009771457873, -0.342800319195, -0.101639628410, 0.117194980383, -0.255077362061, 0.226063251495, 0.194345831871, -0.049160912633, -0.183724343777, 0.067707933486, -0.045591268688, 0.181437164545, 0.220919907093, -0.375059545040, -0.093646302819, 0.112584285438, 0.157490432262, -0.148578420281, -0.077959142625, 0.311475157738, -0.361154288054, 0.073752440512, -0.011407220736, -0.349642068148, 0.144253209233, -0.004464268684, 0.245798960328, -0.165745005012, -0.321158319712, -0.160049259663, -0.178620994091, -0.244308292866, -0.084604308009, 0.203368589282, 0.289690196514, -0.018505565822, 0.260394454002, 0.000337988196, -0.179760605097, 0.003792365314, 
-0.150734946132, 0.199853584170, 0.064363755286, -0.104567810893, -0.254022210836, 0.159618481994, 0.229181602597, -0.023608073592, 0.037437938154, -0.000157138347, 0.169553428888, -0.097415901721, -0.098412707448, -0.031324110925, -0.207799628377, -0.124172091484, 0.035410530865, 0.066098168492, -0.150404527783, -0.282023400068, -0.288219571114, 0.165787026286, 0.235664814711, -0.153448000550, -0.245849326253, 0.078770317137, -0.276756942272, 0.116182968020, 0.283113926649, -0.261170923710, -0.168590009212, -0.407440453768, 0.162056982517, -0.166565790772, 0.025232007727, -0.186004951596, 0.160855710506, 0.210922256112, -0.226862773299, 0.076268889010, -0.319795846939, 0.205050691962, 0.137376964092, 0.165163949132, -0.105034716427, -0.217951491475, -0.291276454926, -0.175804361701, 0.073313929141, -0.196216717362, -0.157686382532, -0.249372571707, -0.125801086426, -0.186810821295, -0.173892840743, -0.270474523306, 0.184177100658, 0.268230944872, -0.140015184879, 0.205908656120, -0.204647570848, 0.239352226257, 0.221009418368, 0.093239627779, 0.158652424812, -0.095584459603, 0.165544316173, -0.002468364546, -0.352052301168, -0.084287457168, -0.006289057899, 0.063150301576, -0.170709446073, 0.161365360022, -0.333020806313, -0.204615935683, 0.030905889347, 0.204769492149, -0.183419749141, 0.127580538392, 0.121558472514, 0.234977483749, 0.214635968208, -0.131959348917, -0.331795871258, -0.065733976662, 0.085864633322, 0.006946905982, 0.088918641210, 0.022682929412, -0.000869035022, 0.229488953948, 0.086863830686, 0.097655527294, -0.285977900028, -0.287755906582, -0.370591938496, 0.383387386799, -0.294436186552, -0.055748067796, -0.394963234663, -0.195783078671, 0.000755572459, 0.136454373598, -0.087302826345, 0.069270886481, 0.163180887699, 0.076977014542, 0.020523773506, 0.097405582666, 0.006537368521, 0.029427420348, -0.282494753599, 0.228390917182, 0.233547747135, -0.305596798658, 0.162043794990, 0.011607275344, 0.113826155663, -0.098404228687, -0.150766149163, 0.154346674681, -0.334413796663, -0.119058541954, 0.143397673965, 0.185208886862, -0.131660014391, -0.017511233687, 0.077866204083, 0.007963258773, 0.247726365924, -0.099277429283, 0.027487894520, 0.043860789388, -0.164540559053, -0.333265841007, -0.233560323715, -0.263900548220, 0.076998822391, 0.081198081374, -0.233389273286, 0.060631241649, 0.123736232519, 0.002071761992, -0.270743876696, 0.127908140421, -0.007172602229, -0.082190617919, -0.071081317961, -0.048988349736, 0.251201987267, -0.240965500474, -0.207804515958, 0.062345087528, 0.037245746702, -0.141822785139, 0.243198126554, -0.355336606503, 0.110465757549, -0.401544302702, 0.137455731630, -0.124284781516, -0.162930145860, 0.072034522891, -0.207930624485, -0.012695144862, -0.239940658212, 0.194995835423, -0.196436733007, 0.074165649712, 0.049043230712, 0.057562600821, -0.228023678064, 0.324214309454, -0.217967867851, 0.134682476521, -0.148161962628, -0.324226737022, 0.288212627172, 0.243486091495, 0.259924024343, -0.222500756383, -0.266496092081, 0.095479860902, 0.205138936639, -0.232806637883, -0.209738358855, -0.220745816827, 0.124704904854, -0.247577100992, 0.186943992972, -0.031060704961, -0.107571028173, -0.147245094180, -0.380679398775, 0.281640470028, -0.114630438387, 0.150456905365};
7 |
--------------------------------------------------------------------------------
/gpu/Makefile:
--------------------------------------------------------------------------------
1 | cpu: lenet5_host.cpp lenet5.cu conv.cu lenet5.h conv.h
2 | 	nvcc -o lenet5_cpu lenet5_host.cpp lenet5.cu conv.cu -DUSE_CPU
3 | gpu: lenet5_host.cpp lenet5.cu conv.cu lenet5.h conv.h
4 | 	nvcc -o lenet5_gpu lenet5_host.cpp lenet5.cu conv.cu
5 |
--------------------------------------------------------------------------------
/gpu/conv.cu:
--------------------------------------------------------------------------------
1 |
2 | #include "conv.h"
3 | #include <cstdio>
4 |
5 | __global__ void conv_2d_2(
6 | float data[IN_HEIGHT_3 * IN_WIDTH_3 * N_CHAN_3],
7 | float res[OUT_HEIGHT_3 * OUT_WIDTH_3 * N_FILT_3],
8 | float weights[FILT_HEIGHT * FILT_WIDTH * N_CHAN_3 * N_FILT_3],
9 | float biases[N_FILT_3])
10 | {
11 | int oh= blockIdx.y * blockDim.y + threadIdx.y;
12 | int ow= blockIdx.x * blockDim.x + threadIdx.x;
13 | if (oh>=IN_HEIGHT_3-FILT_HEIGHT+1 || ow>=IN_WIDTH_3-FILT_WIDTH+1)
14 | return;
15 | int ff = blockIdx.z * blockDim.z + threadIdx.z;
16 | if (ff >= N_FILT_3)
17 | return;
18 | int offset = (oh * OUT_WIDTH_3 + ow)*N_FILT_3;
19 | float temp = biases[ff];
20 | for (int cc = 0; cc < N_CHAN_3; cc++)
21 | {
22 | for (int fh = 0; fh < FILT_HEIGHT; fh++)
23 | {
24 | for (int fw = 0; fw < FILT_WIDTH; fw++)
25 | {
26 | int index_weight = fh * FILT_WIDTH * N_CHAN_3 * N_FILT_3 + fw * N_CHAN_3 * N_FILT_3 + cc * N_FILT_3 + ff;
27 | // assuming there is no padding
28 | temp += data[((oh + fh) * IN_WIDTH_3 + (ow + fw)) * N_CHAN_3 + cc] * weights[index_weight];
29 |
30 | } //end filter width loop
31 | } //end filter height loop
32 | } //end channel loop
33 | res[offset + ff] = (temp > 0)?temp:0;
34 | } //end conv2d
35 |
--------------------------------------------------------------------------------
/gpu/conv.h:
--------------------------------------------------------------------------------
1 | #include "parameters.h"
2 | #include "nnet.h"
3 |
4 | __global__ void conv_2d_2(
5 | float data[IN_HEIGHT_3 * IN_WIDTH_3 * N_CHAN_3],
6 | float res[OUT_HEIGHT_3 * OUT_WIDTH_3*N_FILT_3],
7 | float weights[FILT_HEIGHT * FILT_WIDTH * N_CHAN_3 * N_FILT_3],
8 | float biases[N_FILT_3]);
9 |
--------------------------------------------------------------------------------
/gpu/lenet5.cu:
--------------------------------------------------------------------------------
1 |
2 | // modified by Yuxi Sun
3 | // Keras trained accuracy 98.89%
4 |
5 | #include "parameters.h"
6 | #include "lenet5.h"
7 | #include "conv.h"
8 | #include "stdio.h"
9 |
10 | //hls-fpga-machine-learning insert weights
11 | #include "../firmware/weights/w1.h"
12 | #include "../firmware/weights/b1.h"
13 | #include "../firmware/weights/w3.h"
14 | #include "../firmware/weights/b3.h"
15 | #include "../firmware/weights/w5.h"
16 | #include "../firmware/weights/b5.h"
17 | #include "../firmware/weights/w6.h"
18 | #include "../firmware/weights/b6.h"
19 | #include "../firmware/weights/w7.h"
20 | #include "../firmware/weights/b7.h"
21 |
22 | #ifndef USE_CPU
23 | static bool initialized = 0;
24 | static float *d_pool2d_layer2_out;
25 | static float *d_conv2d_layer3_out;
26 | static float *w3_copy, *b3_copy;
27 | static dim3 block(4, 1, 1);
28 | static dim3 grid (OUT_WIDTH_3, OUT_HEIGHT_3, N_FILT_3); // x: width, y: height, z: filters
29 | #endif
30 |
31 | void lenet5(input_t data[IN_HEIGHT_1*IN_WIDTH_1*N_CHAN_1],
32 | result_t res[N_OUTPUTS], bool cleanup)
33 | {
34 |
35 | float conv2d_layer1_out[OUT_HEIGHT_1*OUT_WIDTH_1*N_FILT_1];
36 | nnet::conv_2d<config1>(data, conv2d_layer1_out, w1, b1);
37 |
38 | float pool2d_layer2_out[OUT_HEIGHT_2*OUT_WIDTH_2*N_FILT_2];
39 | nnet::pooling2d<config2>(conv2d_layer1_out, pool2d_layer2_out);
40 | #ifdef USE_CPU
41 | // start timer
42 | clock_t begin_time = clock();
43 |
44 | float conv2d_layer3_out[OUT_HEIGHT_3 * OUT_WIDTH_3 * N_FILT_3];
45 | nnet::conv_2d<config3>(pool2d_layer2_out, conv2d_layer3_out, w3, b3);
46 |
47 | // end timer
48 | clock_t end_time = clock();
49 | printf("CPU kernel time: %f ms\n", double(end_time - begin_time) / CLOCKS_PER_SEC * 1000);
50 | #else
51 | // prepare memory
52 | if (!initialized)
53 | {
54 | cudaMalloc(&d_pool2d_layer2_out, sizeof(float)*OUT_HEIGHT_2*OUT_WIDTH_2*N_FILT_2);
55 | cudaMalloc(&d_conv2d_layer3_out, sizeof(float)*OUT_HEIGHT_3 * OUT_WIDTH_3 * N_FILT_3);
56 | cudaMalloc(&w3_copy, 2400 * sizeof(float));
57 | cudaMalloc(&b3_copy, 16 * sizeof(float));
58 | cudaMemcpy(w3_copy, w3, sizeof(float)*2400, cudaMemcpyHostToDevice);
59 | cudaMemcpy(b3_copy, b3, sizeof(float)*16, cudaMemcpyHostToDevice);
60 | initialized = 1;
61 | }
62 |
63 | cudaMemcpy(d_pool2d_layer2_out, pool2d_layer2_out, sizeof(float)*OUT_HEIGHT_2*OUT_WIDTH_2*N_FILT_2, cudaMemcpyHostToDevice);
64 |
65 | // start timer
66 | cudaEvent_t start, stop;
67 | float elapsedTime;
68 | cudaEventCreate(&start);
69 | cudaEventRecord(start,0);
70 |
71 | // gpu kernel
72 | conv_2d_2<<<grid, block>>>(d_pool2d_layer2_out, d_conv2d_layer3_out, w3_copy, b3_copy);
73 |
74 | // end timer
75 | cudaEventCreate(&stop);
76 | cudaEventRecord(stop,0);
77 | cudaEventSynchronize(stop);
78 | cudaEventElapsedTime(&elapsedTime, start,stop);
79 | printf("GPU kernel time: %f ms\n" ,elapsedTime);
80 |
81 | // device to host
82 | float conv2d_layer3_out[OUT_HEIGHT_3 * OUT_WIDTH_3 * N_FILT_3];
83 | cudaMemcpy(conv2d_layer3_out, d_conv2d_layer3_out, sizeof(float)*OUT_HEIGHT_3 * OUT_WIDTH_3 * N_FILT_3, cudaMemcpyDeviceToHost);
84 | #endif
85 |
86 | float layer4_out[OUT_HEIGHT_4*OUT_WIDTH_4*N_FILT_4];
87 | nnet::pooling2d<config4>(conv2d_layer3_out, layer4_out);
88 |
89 | float layer5_out[N_LAYER_5];
90 | nnet::compute_layer<config5>(layer4_out, layer5_out, w5, b5);
91 |
92 | float layer6_out[N_LAYER_6];
93 | nnet::compute_layer<config6>(layer5_out, layer6_out, w6, b6);
94 |
95 | // float logits7[N_OUTPUTS];
96 | nnet::compute_layer<config7>(layer6_out, res, w7, b7);
97 |
98 | // todo change to the non-table version of softmax
99 | // nnet::softmax(logits7, res);
100 | #ifndef USE_CPU
101 | if (cleanup)
102 | {
103 | cudaFree(d_pool2d_layer2_out);
104 | cudaFree(d_conv2d_layer3_out);
105 | cudaFree(w3_copy);
106 | cudaFree(b3_copy);
107 | }
108 | #endif
109 | }
110 |
--------------------------------------------------------------------------------
/gpu/lenet5.h:
--------------------------------------------------------------------------------
1 |
2 | #ifndef LENET5_H_
3 | #define LENET5_H_
4 |
5 | #include "parameters.h"
6 | #include "nnet.h"
7 |
8 | void lenet5(
9 | input_t data[IN_HEIGHT_1*IN_WIDTH_1*N_CHAN_1],
10 | result_t res[N_OUTPUTS], bool cleanup = false);
11 |
12 |
13 | #endif
14 |
15 |
--------------------------------------------------------------------------------
/gpu/lenet5_host.cpp:
--------------------------------------------------------------------------------
1 |
2 | #include <cstdio>
3 | #include <cstdlib>
4 | #include <cstring>
5 | #include <ctime>
6 | #include <cmath>
7 | #include <fstream>
8 | #include <iostream>
9 | #include <string>
10 | #include "parameters.h"
11 | #include "lenet5.h"
12 |
13 | #define IMAGE_WIDTH 28
14 | #define TEST_SIZE 100 // full test set has 10000 samples
15 |
16 | int max_likelihood(result_t y[N_OUTPUTS])
17 | {
18 | int i_likely = 0;
19 | result_t y_max = 0;
20 | for (int i = 0; i < N_OUTPUTS; i++)
21 | {
22 | if (y[i] > y_max)
23 | {
24 | y_max = y[i];
25 | i_likely = i;
26 | }
27 | }
28 | return i_likely;
29 | }
30 |
31 | int read_to_array(char *path, input_t x_test[IMAGE_WIDTH*IMAGE_WIDTH*1], int *y_test)
32 | {
33 | std::ifstream inFile;
34 | inFile.open(path);
35 | if (!inFile)
36 | return -1;
37 | if (inFile.get() == '#')
38 | inFile >> *y_test;
39 | for (int i = 0; i < IMAGE_WIDTH; i++)
40 | {
41 | for (int j = 0; j < IMAGE_WIDTH; j++)
42 | {
43 | inFile >> x_test[i*IMAGE_WIDTH+j+0];
44 | }
45 | }
46 | inFile.close();
47 | return 0;
48 | }
49 |
50 | int main(int argc, char **argv)
51 | {
52 | input_t data_str[IN_HEIGHT_1*IN_WIDTH_1*N_CHAN_1];
53 | result_t probs[N_OUTPUTS] = {0};
54 | int y_test, counter = 0;
55 |
56 | char x_str[10] = "";
57 | char path_cstr[30];
58 | clock_t begin_time, end_time;
59 | double total_time = 0;
60 |
61 | for (int im=0; im < TEST_SIZE; im ++){
62 | sprintf(x_str, "%d.txt", im);
63 | std::string image_path = "../test_images/";
64 | image_path += std::string(x_str);
65 | strcpy(path_cstr, image_path.c_str());
66 | if (read_to_array(path_cstr, data_str, &y_test) == 0){
67 | unsigned short size_in, size_out;
68 | begin_time = clock();
69 | if (im == TEST_SIZE - 1)
70 | lenet5(data_str, probs, true); // cleanup
71 | else
72 | lenet5(data_str, probs);
73 | end_time = clock();
74 | total_time += double(end_time - begin_time) / CLOCKS_PER_SEC;
75 | int y_pred = max_likelihood(probs);
76 | if (y_pred == y_test)
77 | counter++;
78 | }
79 | else
80 | std::cout << "failed to read file" << std::endl;
81 | }
82 |
83 | std::cout << "(partial) accuracy: " << counter/(float)TEST_SIZE << std::endl;
84 | std::cout << "average inference time including memory access: " << total_time/TEST_SIZE * 1000 << "ms" << std::endl;
85 | return 0;
86 | }
87 |
--------------------------------------------------------------------------------
/gpu/nnet.h:
--------------------------------------------------------------------------------
1 | //
2 | // rfnoc-hls-neuralnet: Vivado HLS code for neural-net building blocks
3 | //
4 | // Copyright (C) 2017 EJ Kreinar
5 | //
6 | // This program is free software: you can redistribute it and/or modify
7 | // it under the terms of the GNU General Public License as published by
8 | // the Free Software Foundation, either version 3 of the License, or
9 | // (at your option) any later version.
10 | //
11 | // This program is distributed in the hope that it will be useful,
12 | // but WITHOUT ANY WARRANTY; without even the implied warranty of
13 | // MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
14 | // GNU General Public License for more details.
15 | //
16 | // You should have received a copy of the GNU General Public License
17 | // along with this program. If not, see <http://www.gnu.org/licenses/>.
18 | //
19 |
20 | #ifndef NNET_H_
21 | #define NNET_H_
22 |
23 | #include <cmath>
24 |
25 | namespace nnet
26 | {
27 |
28 | struct conv2d_config
29 | {
30 | // Internal data type definitions
31 | typedef float bias_t;
32 | typedef float weight_t;
33 | typedef float accum_t;
34 | };
35 |
36 | template<class CONFIG_T>
37 | void conv_2d(
38 | float data[CONFIG_T::in_height * CONFIG_T::in_width * CONFIG_T::n_chan],
39 | float res[CONFIG_T::out_height * CONFIG_T::out_width * CONFIG_T::n_filt],
40 | float weights[CONFIG_T::filt_height * CONFIG_T::filt_width * CONFIG_T::n_chan * CONFIG_T::n_filt],
41 | float biases[CONFIG_T::n_filt])
42 | {
43 | for (int oh = 0; oh < CONFIG_T::out_height; oh++)
44 | {
45 | for (int ow = 0; ow < CONFIG_T::out_width; ow++)
46 | {
47 | for (int ff = 0; ff < CONFIG_T::n_filt; ff++)
48 | {
49 | float temp = 0;
50 | for (int cc = 0; cc < CONFIG_T::n_chan; cc++)
51 | {
52 | for (int fh = 0; fh < CONFIG_T::filt_height; fh++)
53 | {
54 | for (int fw = 0; fw < CONFIG_T::filt_width; fw++)
55 | {
56 | int index_weight = fh * CONFIG_T::filt_width * CONFIG_T::n_chan * CONFIG_T::n_filt + fw * CONFIG_T::n_chan * CONFIG_T::n_filt + cc * CONFIG_T::n_filt + ff;
57 | // assuming there is no padding
58 | if ((oh + fh) < CONFIG_T::in_height && (ow + fw) < CONFIG_T::in_width)
59 | temp += data[((oh + fh) * CONFIG_T::in_width + (ow + fw)) * CONFIG_T::n_chan + cc] * weights[index_weight];
60 | } //end filter width loop
61 | } //end filter height loop
62 | } //end channel loop
63 | float res_ = temp + biases[ff];
64 | res[(oh * CONFIG_T::out_width + ow) * CONFIG_T::n_filt + ff] = (res_ > 0)?res_:0;
65 |
66 | } //end filter loop
67 | } //end output width loop
68 | } //end output height loop
69 | } //end conv2d
70 |
71 | //////////// pool2d ////////////////
72 | struct pooling2d_config
73 | {
74 | // IO size
75 | static const unsigned in_height = 10;
76 | static const unsigned in_width = 10;
77 | static const unsigned n_filt = 4;
78 | static const unsigned stride_height = 2;
79 | static const unsigned stride_width = 2;
80 | static const unsigned pool_height = 2;
81 | static const unsigned pool_width = 2;
82 | static const unsigned out_height = (in_height - pool_height) / stride_height + 1;
83 | static const unsigned out_width = (in_width - pool_width) / stride_width + 1;
84 | // Padding
85 | static const unsigned pad_top = 0;
86 | static const unsigned pad_bottom = 0;
87 | static const unsigned pad_left = 0;
88 | static const unsigned pad_right = 0;
89 | };
90 |
91 | template<typename CONFIG_T>
92 | void pooling2d(float data[CONFIG_T::in_height * CONFIG_T::in_width * CONFIG_T::n_filt],
93 | float res[CONFIG_T::out_height * CONFIG_T::out_width * CONFIG_T::n_filt])
94 | {
95 |
96 | // Add any necessary padding
97 | const unsigned padded_height = CONFIG_T::in_height + CONFIG_T::pad_top + CONFIG_T::pad_bottom;
98 | const unsigned padded_width = CONFIG_T::in_width + CONFIG_T::pad_left + CONFIG_T::pad_right;
99 |
100 | for (int ff = 0; ff < CONFIG_T::n_filt; ff++)
101 | {
102 | float pool[CONFIG_T::pool_height * CONFIG_T::pool_width];
103 | // Loop over input image y in steps of stride
104 | for (int ii = 0; ii < padded_height; ii += CONFIG_T::stride_height)
105 | {
106 | // Loop over input image x in steps of stride
107 | for (int jj = 0; jj < padded_width; jj += CONFIG_T::stride_width)
108 | {
109 | // Loop over pool window y
110 | for (int kk = 0; kk < CONFIG_T::stride_height; kk++)
111 | {
112 | // Loop over pool window x
113 | for (int ll = 0; ll < CONFIG_T::stride_width; ll++)
114 | {
115 | if (ii + kk < CONFIG_T::pad_top || ii + kk >= (padded_height - CONFIG_T::pad_bottom) || jj + ll < CONFIG_T::pad_left || jj + ll >= (padded_width - CONFIG_T::pad_right))
116 | {
117 | // Add padding
118 | pool[kk * CONFIG_T::stride_width + ll] = 0;
119 | }
120 | else
121 | {
122 | pool[kk * CONFIG_T::stride_width + ll] = data[((ii + kk)*CONFIG_T::in_width + (jj + ll))*CONFIG_T::n_filt + ff];
123 | }
124 | }
125 | }
126 | // do the pooling
127 | float max_pool = pool[0];
128 | for (int i = 1; i < CONFIG_T::pool_height*CONFIG_T::pool_width; i++)
129 | {
130 | max_pool = pool[i] > max_pool ? pool[i] : max_pool;
131 | }
132 | res[(ii / CONFIG_T::stride_height) * CONFIG_T::out_width * CONFIG_T::n_filt + (jj / CONFIG_T::stride_width) * CONFIG_T::n_filt + ff] = max_pool;
133 |
134 | }
135 | }
136 | }
137 | }
138 |
139 | ////////////// fully connected ///////////////////
140 | struct layer_config
141 | {
142 | // Layer Sizes
143 | static const unsigned n_in = 10;
144 | static const unsigned n_out = 10;
145 | };
146 |
147 | template<typename CONFIG_T>
148 | void compute_layer(
149 | float data[CONFIG_T::n_in],
150 | float res[CONFIG_T::n_out],
151 | float weights[CONFIG_T::n_in * CONFIG_T::n_out],
152 | float biases[CONFIG_T::n_out])
153 | {
154 | float cache;
155 | float acc[CONFIG_T::n_out];
156 |
157 | // Initialize accumulator with input biases
158 | for (int iacc = 0; iacc < CONFIG_T::n_out; iacc++)
159 | {
160 | acc[iacc] = (float)biases[iacc];
161 | }
162 |
163 | // Do the matrix-multiply
164 | for (int ii = 0; ii < CONFIG_T::n_in; ii++)
165 | {
166 | cache = data[ii];
167 | for (int jj = 0; jj < CONFIG_T::n_out; jj++)
168 | {
169 | int index = ii * CONFIG_T::n_out + jj;
170 | float mult = cache * weights[index];
171 | acc[jj] += mult;
172 | }
173 | }
174 | // Apply the ReLU activation to the accumulated result
175 | for (int ires = 0; ires < CONFIG_T::n_out; ires++)
176 | {
177 | res[ires] = acc[ires] > 0 ? acc[ires] : 0;
178 | }
179 | }
180 |
181 | } // namespace nnet
182 |
183 | #endif
184 |
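// Illustrative usage sketch (an addition, not from the original sources): the
// struct below is hypothetical and lists only the fields conv_2d reads. Data
// and results are flattened row-major height x width x channel; weights are
// flattened filt_height x filt_width x n_chan x n_filt, matching index_weight
// in the loop nest above.
//
//     struct toy_conv : nnet::conv2d_config {
//         static const unsigned in_height = 4, in_width = 4, n_chan = 1;
//         static const unsigned filt_height = 3, filt_width = 3, n_filt = 2;
//         static const unsigned out_height = 2, out_width = 2; // 4 - 3 + 1
//     };
//     float data[4 * 4 * 1];        // input image (HWC)
//     float res[2 * 2 * 2];         // output feature map (HWC)
//     float weights[3 * 3 * 1 * 2]; // filter bank (HWCF)
//     float biases[2];
//     // ... fill data, weights, biases ...
//     nnet::conv_2d<toy_conv>(data, res, weights, biases);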
--------------------------------------------------------------------------------
/gpu/parameters.h:
--------------------------------------------------------------------------------
1 | #ifndef PARAMETERS_H_
2 | #define PARAMETERS_H_
3 |
4 | #include "nnet.h"
5 |
6 | typedef float accum_default_t;
7 | typedef float weight_default_t;
8 | typedef float bias_default_t;
9 | typedef float input_t;
10 | typedef float result_t;
11 |
12 | #define IN_HEIGHT_1 28
13 | #define IN_WIDTH_1 28
14 | #define N_CHAN_1 1
15 | #define OUT_HEIGHT_1 24
16 | #define OUT_WIDTH_1 24
17 | #define N_FILT_1 6
18 | #define IN_HEIGHT_2 24
19 | #define IN_WIDTH_2 24
20 | #define OUT_HEIGHT_2 12
21 | #define OUT_WIDTH_2 12
22 | #define POOL_HEIGHT_2 2
23 | #define POOL_WIDTH_2 2
24 | #define N_FILT_2 6
25 | #define N_LAYER_2 864
26 | #define IN_HEIGHT_3 12
27 | #define IN_WIDTH_3 12
28 | #define N_CHAN_3 6
29 | #define OUT_HEIGHT_3 8
30 | #define OUT_WIDTH_3 8
31 | #define N_FILT_3 16
32 | #define IN_HEIGHT_4 8
33 | #define IN_WIDTH_4 8
34 | #define OUT_HEIGHT_4 4
35 | #define OUT_WIDTH_4 4
36 | #define POOL_HEIGHT_4 2
37 | #define POOL_WIDTH_4 2
38 | #define N_FILT_4 16
39 | #define N_LAYER_4 256
40 | #define N_LAYER_5 120
41 | #define N_LAYER_6 84
42 | #define N_OUTPUTS 10
43 |
44 | #define FILT_HEIGHT 5
45 | #define FILT_WIDTH 5
46 |
47 | typedef float layer1_t;
48 | typedef float layer2_t;
49 | typedef float layer3_t;
50 | typedef float layer4_t;
51 | typedef float layer5_t;
52 | typedef float layer6_t;
53 |
54 | //hls-fpga-machine-learning insert layer-config
55 | struct config1 : nnet::conv2d_config {
56 | static const unsigned pad_top = 0;
57 | static const unsigned pad_bottom = 0;
58 | static const unsigned pad_left = 0;
59 | static const unsigned pad_right = 0;
60 | static const unsigned in_height = IN_HEIGHT_1;
61 | static const unsigned in_width = IN_WIDTH_1;
62 | static const unsigned n_chan = N_CHAN_1;
63 | static const unsigned filt_height = 5;
64 | static const unsigned filt_width = 5;
65 | static const unsigned n_filt = N_FILT_1;
66 | static const unsigned stride_height = 1;
67 | static const unsigned stride_width = 1;
68 | static const unsigned out_height = OUT_HEIGHT_1;
69 | static const unsigned out_width = OUT_WIDTH_1;
70 | static const unsigned reuse_factor = 50;
71 | static const unsigned n_zeros = 0;
72 | static const bool store_weights_in_bram = false;
73 | typedef accum_default_t accum_t;
74 | typedef bias_default_t bias_t;
75 | typedef weight_default_t weight_t;
76 | };
77 |
78 | struct config2 : nnet::pooling2d_config {
79 | static const unsigned in_height = IN_HEIGHT_2;
80 | static const unsigned in_width = IN_WIDTH_2;
81 | static const unsigned n_filt = N_FILT_2;
82 | static const unsigned stride_height = 2;
83 | static const unsigned stride_width = 2;
84 | static const unsigned pool_height = 2;
85 | static const unsigned pool_width = 2;
86 | static const unsigned out_height = OUT_HEIGHT_2;
87 | static const unsigned out_width = OUT_WIDTH_2;
88 | static const unsigned pad_top = 0;
89 | static const unsigned pad_bottom = 0;
90 | static const unsigned pad_left = 0;
91 | static const unsigned pad_right = 0;
92 | // static const nnet::Pool_Op pool_op = nnet::Max;
93 | static const unsigned reuse = 50;
94 | };
95 |
96 | struct config3 : nnet::conv2d_config {
97 | static const unsigned pad_top = 0;
98 | static const unsigned pad_bottom = 0;
99 | static const unsigned pad_left = 0;
100 | static const unsigned pad_right = 0;
101 | static const unsigned in_height = IN_HEIGHT_3;
102 | static const unsigned in_width = IN_WIDTH_3;
103 | static const unsigned n_chan = N_CHAN_3;
104 | static const unsigned filt_height = 5;
105 | static const unsigned filt_width = 5;
106 | static const unsigned n_filt = N_FILT_3;
107 | static const unsigned stride_height = 1;
108 | static const unsigned stride_width = 1;
109 | static const unsigned out_height = OUT_HEIGHT_3;
110 | static const unsigned out_width = OUT_WIDTH_3;
111 | static const unsigned reuse_factor = 50;
112 | static const unsigned n_zeros = 0;
113 | static const bool store_weights_in_bram = false;
114 | typedef accum_default_t accum_t;
115 | typedef bias_default_t bias_t;
116 | typedef weight_default_t weight_t;
117 | };
118 |
119 | struct config4 : nnet::pooling2d_config {
120 | static const unsigned in_height = IN_HEIGHT_4;
121 | static const unsigned in_width = IN_WIDTH_4;
122 | static const unsigned n_filt = N_FILT_4;
123 | static const unsigned stride_height = 2;
124 | static const unsigned stride_width = 2;
125 | static const unsigned pool_height = 2;
126 | static const unsigned pool_width = 2;
127 | static const unsigned out_height = OUT_HEIGHT_4;
128 | static const unsigned out_width = OUT_WIDTH_4;
129 | static const unsigned pad_top = 0;
130 | static const unsigned pad_bottom = 0;
131 | static const unsigned pad_left = 0;
132 | static const unsigned pad_right = 0;
133 | static const unsigned reuse = 50;
134 | };
135 |
136 | struct config5 : nnet::layer_config {
137 | static const unsigned n_in = N_LAYER_4;
138 | static const unsigned n_out = N_LAYER_5;
139 | static const unsigned reuse_factor = 24;
140 | static const unsigned n_zeros = 0;
141 | static const bool store_weights_in_bram = false;
142 | typedef accum_default_t accum_t;
143 | typedef bias_default_t bias_t;
144 | typedef weight_default_t weight_t;
145 | };
146 |
147 | struct config6 : nnet::layer_config {
148 | static const unsigned n_in = N_LAYER_5;
149 | static const unsigned n_out = N_LAYER_6;
150 | static const unsigned reuse_factor = 12;
151 | static const unsigned n_zeros = 0;
152 | static const bool store_weights_in_bram = false;
153 | typedef accum_default_t accum_t;
154 | typedef bias_default_t bias_t;
155 | typedef weight_default_t weight_t;
156 | };
157 |
158 | struct config7 : nnet::layer_config {
159 | static const unsigned n_in = N_LAYER_6;
160 | static const unsigned n_out = N_OUTPUTS;
161 | static const unsigned reuse_factor = 2;
162 | static const unsigned n_zeros = 0;
163 | static const bool store_weights_in_bram = false;
164 | typedef accum_default_t accum_t;
165 | typedef bias_default_t bias_t;
166 | typedef weight_default_t weight_t;
167 | };
168 | #endif
169 |
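// Compile-time sanity checks (an illustrative sketch, not from the original
// sources): the constants above form a chain in which each valid 5x5
// convolution shrinks the image by FILT_HEIGHT - 1 = 4 and each 2x2, stride-2
// pooling halves it, ending in the flattened dense-layer sizes.
//
//     static_assert(OUT_HEIGHT_1 == IN_HEIGHT_1 - FILT_HEIGHT + 1, "conv1: 28 - 5 + 1 = 24");
//     static_assert(OUT_HEIGHT_2 == IN_HEIGHT_2 / POOL_HEIGHT_2,   "pool1: 24 / 2 = 12");
//     static_assert(OUT_HEIGHT_3 == IN_HEIGHT_3 - FILT_HEIGHT + 1, "conv2: 12 - 5 + 1 = 8");
//     static_assert(OUT_HEIGHT_4 == IN_HEIGHT_4 / POOL_HEIGHT_4,   "pool2: 8 / 2 = 4");
//     static_assert(N_LAYER_2 == OUT_HEIGHT_2 * OUT_WIDTH_2 * N_FILT_2, "12 * 12 * 6 = 864");
//     static_assert(N_LAYER_4 == OUT_HEIGHT_4 * OUT_WIDTH_4 * N_FILT_4, "4 * 4 * 16 = 256");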
--------------------------------------------------------------------------------
/keras_lenet.py:
--------------------------------------------------------------------------------
1 | import tensorflow as tf
2 | import keras
3 | from keras.datasets import mnist
4 | from keras.models import Model
5 | from keras.layers import Input, Dense, Dropout, Flatten
6 | from keras.layers import Conv2D, MaxPooling2D
7 | saved_model_dir = 'saved_model.json'
8 | saved_weights_dir = 'saved_weights.h5'
9 | batch_size = 128
10 | num_classes = 10
11 | epochs = 10
12 |
13 | # input image dimensions
14 | img_rows, img_cols = 28, 28
15 |
16 | # the data, split between train and test sets
17 | (x_train, y_train), (x_test, y_test) = mnist.load_data()
18 |
19 | x_train = x_train.reshape(x_train.shape[0], img_rows, img_cols, 1)
20 | x_test = x_test.reshape(x_test.shape[0], img_rows, img_cols, 1)
21 | input_shape = (img_rows, img_cols, 1)
22 |
23 | x_train = x_train.astype('float32')
24 | x_test = x_test.astype('float32')
25 | x_train /= 255
26 | x_test /= 255
27 | print('x_train shape:', x_train.shape)
28 | print(x_train.shape[0], 'train samples')
29 | print(x_test.shape[0], 'test samples')
30 |
31 | # convert class vectors to binary class matrices
32 | y_train = keras.utils.to_categorical(y_train, num_classes)
33 | y_test = keras.utils.to_categorical(y_test, num_classes)
34 |
35 | inputs = Input(shape=input_shape)
36 | layer = Conv2D(filters=6,
37 | kernel_size=5,
38 | strides=1,
39 | activation='relu',
40 | input_shape=input_shape)(inputs)
41 |
42 | # Pooling layer 1
43 | layer = MaxPooling2D(pool_size=2, strides=2)(layer)
44 |
45 | # Layer 2
46 | # Conv Layer 2
47 | layer = Conv2D(filters=16,
48 | kernel_size=5,
49 | strides=1,
50 | activation='relu',
51 | input_shape=(12, 12, 6))(layer)
52 | # Pooling Layer 2
53 | layer = MaxPooling2D(pool_size=2, strides=2)(layer)
54 | # Flatten
55 | layer = Flatten()(layer)
56 | # Layer 3
57 | # Fully connected layer 1
58 | layer = Dense(units=120, activation='relu')(layer)
59 | # Layer 4
60 | # Fully connected layer 2
61 | layer = Dense(units=84, activation='relu')(layer)
62 | # Layer 5
63 | # Output Layer
64 | predictions = Dense(units=10, activation='softmax')(layer)
65 |
66 | model = Model(inputs=inputs, outputs=predictions)
67 |
68 | model.compile(loss=keras.losses.categorical_crossentropy,
69 | optimizer=keras.optimizers.Adadelta(),
70 | metrics=['accuracy'])
71 |
72 | model.fit(x_train, y_train,
73 | batch_size=batch_size,
74 | epochs=epochs,
75 | verbose=1,
76 | validation_data=(x_test, y_test))
77 | score = model.evaluate(x_test, y_test, verbose=0)
78 |
79 | json_string = model.to_json()
80 | with open(saved_model_dir, "w+") as f:
81 | f.write(json_string)
82 | model.save_weights(saved_weights_dir)
83 |
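# Shape check (an illustrative note, not from the original sources): the layer
# outputs of this model line up with the constants in gpu/parameters.h, which
# model.summary() can confirm after training:
#   conv2d_1        (None, 24, 24, 6)  -> OUT_HEIGHT_1 x OUT_WIDTH_1 x N_FILT_1
#   max_pooling2d_1 (None, 12, 12, 6)  -> IN_HEIGHT_3 x IN_WIDTH_3 x N_CHAN_3
#   conv2d_2        (None, 8, 8, 16)   -> OUT_HEIGHT_3 x OUT_WIDTH_3 x N_FILT_3
#   max_pooling2d_2 (None, 4, 4, 16)   -> flattens to N_LAYER_4 = 256
#   dense_1/2/3     120 / 84 / 10      -> N_LAYER_5 / N_LAYER_6 / N_OUTPUTS
# model.summary()
# print('test accuracy:', score[1])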
--------------------------------------------------------------------------------
/keras_lenet_infer.py:
--------------------------------------------------------------------------------
1 | '''
2 | CPU (Intel i7-7500 CPU @ 2.0GHz)
3 | N = [10, 100, 1000, 10000]
4 | latency = [0.0078, 0.00047, 0.000219, 0.000199], acceleration flattens out due to limited memory on a mobile cpu
5 | GPU (GeForce 940MX)
6 | latency = [0.2383,0.0128, 0.00132, 0.0002]
7 | '''
8 |
9 | import tensorflow as tf
10 | import keras
11 | from keras.datasets import mnist
12 | from keras.models import Model
13 | from keras.layers import Input, Dense, Dropout, Flatten
14 | from keras.layers import Conv2D, MaxPooling2D
15 | from keras.models import model_from_json
16 | import time
17 | import sys
18 | saved_model_dir = 'misc/saved_model.json'
19 | saved_weights_dir = 'misc/saved_weights.h5'
20 |
21 | if __name__ == "__main__":
22 | with open(saved_model_dir) as f:
23 | json_str = f.read()
24 | model = model_from_json(json_str)
25 | model.load_weights(saved_weights_dir)
26 |
27 | num_classes = 10
28 | # input image dimensions
29 | img_rows, img_cols = 28, 28
30 | # the data, split between train and test sets
31 | (x_train, y_train), (x_test, y_test) = mnist.load_data()
32 | x_train = x_train.reshape(x_train.shape[0], img_rows, img_cols, 1)
33 | x_test = x_test.reshape(x_test.shape[0], img_rows, img_cols, 1)
34 | input_shape = (img_rows, img_cols, 1)
35 | x_train = x_train.astype('float32')
36 | x_test = x_test.astype('float32')
37 | x_train /= 255
38 | x_test /= 255
39 | print('x_train shape:', x_train.shape)
40 | print(x_train.shape[0], 'train samples')
41 | print(x_test.shape[0], 'test samples')
42 |
43 | # convert class vectors to binary class matrices
44 | y_train = keras.utils.to_categorical(y_train, num_classes)
45 | y_test = keras.utils.to_categorical(y_test, num_classes)
46 |
47 | num_samples = int(sys.argv[1])
48 | start = time.time()
49 | model.predict(x_test[:num_samples])
50 | end = time.time()
51 | print((end-start)/num_samples)
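# Usage (an illustrative note, not from the original sources): the positional
# argument is the number of test images to run, e.g. to average the per-sample
# latency over the first 100 MNIST test images:
#
#     python keras_lenet_infer.py 100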
--------------------------------------------------------------------------------
/lenet5_test.cpp:
--------------------------------------------------------------------------------
1 | //
2 | // rfnoc-hls-neuralnet: Vivado HLS code for neural-net building blocks
3 | //
4 | // Copyright (C) 2017 EJ Kreinar
5 | //
6 | // This program is free software: you can redistribute it and/or modify
7 | // it under the terms of the GNU General Public License as published by
8 | // the Free Software Foundation, either version 3 of the License, or
9 | // (at your option) any later version.
10 | //
11 | // This program is distributed in the hope that it will be useful,
12 | // but WITHOUT ANY WARRANTY; without even the implied warranty of
13 | // MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
14 | // GNU General Public License for more details.
15 | //
16 | // You should have received a copy of the GNU General Public License
17 | // along with this program. If not, see <http://www.gnu.org/licenses/>.
18 | //
19 | #include <cstdio>
20 | #include <cstdlib>
21 | #include <cstring>
22 | #include <cmath>
23 | #include <fstream>
24 | #include <iostream>
25 | #include <string>
26 | #include "firmware/parameters.h"
27 | #include "firmware/lenet5.h"
28 | #include "nnet_helpers.h"
29 |
30 | #define IMAGE_WIDTH 28
31 | #ifdef C_COSIM
32 | #define TEST_SIZE 10
33 | #else
34 | #define TEST_SIZE 100 // full test set has 10000 samples
35 | #endif
36 |
37 |
38 | int max_likelihood(result_t y[N_OUTPUTS])
39 | {
40 | int i_likely = 0;
41 | result_t y_max = 0;
42 | for (int i = 0; i < N_OUTPUTS; i++)
43 | {
44 | if (y[i] > y_max)
45 | {
46 | y_max = y[i];
47 | i_likely = i;
48 | }
49 | }
50 | return i_likely;
51 | }
52 |
53 | int read_to_array(char *path, input_t x_test[IMAGE_WIDTH*IMAGE_WIDTH*1], int *y_test)
54 | {
55 | std::ifstream inFile;
56 | inFile.open(path);
57 | if (!inFile)
58 | return -1;
59 | if (inFile.get() == '#')
60 | inFile >> *y_test;
61 | // std::cout << *y_test;
62 | for (int i = 0; i < IMAGE_WIDTH; i++)
63 | {
64 | for (int j = 0; j < IMAGE_WIDTH; j++)
65 | {
66 | inFile >> x_test[i*IMAGE_WIDTH+j+0];
67 | }
68 | }
69 | inFile.close();
70 | return 0;
71 | }
72 |
73 | int main(int argc, char **argv)
74 | {
75 |
76 | input_t data_str[IN_HEIGHT_1*IN_WIDTH_1*N_CHAN_1];
77 |
78 | result_t probs[N_OUTPUTS] = {0};
79 | int y_test, counter = 0;
80 |
81 | char x_str[10] = "";
82 | char path_cstr[30];
83 |
84 | for (int im=0; im < TEST_SIZE; im ++){
85 | sprintf(x_str, "%d.txt", im);
86 | std::string image_path = "test_images/";
87 | image_path += std::string(x_str);
88 | strcpy(path_cstr, image_path.c_str());
89 | if (read_to_array(path_cstr, data_str, &y_test) == 0){
90 |
91 | lenet5(data_str, probs);
92 |
93 | int y_pred = max_likelihood(probs);
94 | std::cout << im << " " << (y_pred == y_test)<< std::endl;
95 | if (y_pred == y_test)
96 | counter++;
97 | }
98 | else
99 | std::cout << "failed to read file" << std::endl;
100 | }
101 | std::cout << "accuracy: " << counter / (float)TEST_SIZE << std::endl;
102 |
103 |
104 | return 0;
105 | }
106 |
--------------------------------------------------------------------------------
/misc/cosim.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sherylll/lenet5-accelerator/3cc3021ad060155d1e3f96b18353a15f13f2c88d/misc/cosim.png
--------------------------------------------------------------------------------
/misc/fpga_latency.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sherylll/lenet5-accelerator/3cc3021ad060155d1e3f96b18353a15f13f2c88d/misc/fpga_latency.png
--------------------------------------------------------------------------------
/misc/keras_lenet_infer.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sherylll/lenet5-accelerator/3cc3021ad060155d1e3f96b18353a15f13f2c88d/misc/keras_lenet_infer.png
--------------------------------------------------------------------------------
/misc/plot.py:
--------------------------------------------------------------------------------
1 | import matplotlib.pyplot as plt
2 | saved_graph_dir = 'keras_lenet_infer.png'
3 |
4 | # CPU (Intel i7-7500 CPU @ 2.0GHz)
5 | N = [10, 100, 1000, 10000]
6 | cpu = [0.0078, 0.00047, 0.000219, 0.000199] # acceleration flattens out due to limited memory on a mobile cpu
7 | # GPU (GeForce 940MX)
8 | gpu = [0.2383,0.0128, 0.00132, 0.0002]
9 |
10 | plt.plot(N, cpu, N, gpu)
11 | plt.xscale('log')
12 | plt.legend(['CPU','GPU'])
13 | # plt.show()
14 | plt.savefig(saved_graph_dir)
15 |
--------------------------------------------------------------------------------
/misc/resource_usage.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sherylll/lenet5-accelerator/3cc3021ad060155d1e3f96b18353a15f13f2c88d/misc/resource_usage.png
--------------------------------------------------------------------------------
/misc/saved_model.json:
--------------------------------------------------------------------------------
1 | {"class_name": "Model", "config": {"name": "model_1", "layers": [{"name": "input_1", "class_name": "InputLayer", "config": {"batch_input_shape": [null, 28, 28, 1], "dtype": "float32", "sparse": false, "name": "input_1"}, "inbound_nodes": []}, {"name": "conv2d_1", "class_name": "Conv2D", "config": {"name": "conv2d_1", "trainable": true, "batch_input_shape": [null, 28, 28, 1], "dtype": "float32", "filters": 6, "kernel_size": [5, 5], "strides": [1, 1], "padding": "valid", "data_format": "channels_last", "dilation_rate": [1, 1], "activation": "relu", "use_bias": true, "kernel_initializer": {"class_name": "VarianceScaling", "config": {"scale": 1.0, "mode": "fan_avg", "distribution": "uniform", "seed": null}}, "bias_initializer": {"class_name": "Zeros", "config": {}}, "kernel_regularizer": null, "bias_regularizer": null, "activity_regularizer": null, "kernel_constraint": null, "bias_constraint": null}, "inbound_nodes": [[["input_1", 0, 0, {}]]]}, {"name": "max_pooling2d_1", "class_name": "MaxPooling2D", "config": {"name": "max_pooling2d_1", "trainable": true, "pool_size": [2, 2], "padding": "valid", "strides": [2, 2], "data_format": "channels_last"}, "inbound_nodes": [[["conv2d_1", 0, 0, {}]]]}, {"name": "conv2d_2", "class_name": "Conv2D", "config": {"name": "conv2d_2", "trainable": true, "batch_input_shape": [null, 12, 12, 6], "dtype": "float32", "filters": 16, "kernel_size": [5, 5], "strides": [1, 1], "padding": "valid", "data_format": "channels_last", "dilation_rate": [1, 1], "activation": "relu", "use_bias": true, "kernel_initializer": {"class_name": "VarianceScaling", "config": {"scale": 1.0, "mode": "fan_avg", "distribution": "uniform", "seed": null}}, "bias_initializer": {"class_name": "Zeros", "config": {}}, "kernel_regularizer": null, "bias_regularizer": null, "activity_regularizer": null, "kernel_constraint": null, "bias_constraint": null}, "inbound_nodes": [[["max_pooling2d_1", 0, 0, {}]]]}, {"name": "max_pooling2d_2", "class_name": "MaxPooling2D", "config": {"name": "max_pooling2d_2", "trainable": true, "pool_size": [2, 2], "padding": "valid", "strides": [2, 2], "data_format": "channels_last"}, "inbound_nodes": [[["conv2d_2", 0, 0, {}]]]}, {"name": "flatten_1", "class_name": "Flatten", "config": {"name": "flatten_1", "trainable": true, "data_format": "channels_last"}, "inbound_nodes": [[["max_pooling2d_2", 0, 0, {}]]]}, {"name": "dense_1", "class_name": "Dense", "config": {"name": "dense_1", "trainable": true, "units": 120, "activation": "relu", "use_bias": true, "kernel_initializer": {"class_name": "VarianceScaling", "config": {"scale": 1.0, "mode": "fan_avg", "distribution": "uniform", "seed": null}}, "bias_initializer": {"class_name": "Zeros", "config": {}}, "kernel_regularizer": null, "bias_regularizer": null, "activity_regularizer": null, "kernel_constraint": null, "bias_constraint": null}, "inbound_nodes": [[["flatten_1", 0, 0, {}]]]}, {"name": "dense_2", "class_name": "Dense", "config": {"name": "dense_2", "trainable": true, "units": 84, "activation": "relu", "use_bias": true, "kernel_initializer": {"class_name": "VarianceScaling", "config": {"scale": 1.0, "mode": "fan_avg", "distribution": "uniform", "seed": null}}, "bias_initializer": {"class_name": "Zeros", "config": {}}, "kernel_regularizer": null, "bias_regularizer": null, "activity_regularizer": null, "kernel_constraint": null, "bias_constraint": null}, "inbound_nodes": [[["dense_1", 0, 0, {}]]]}, {"name": "dense_3", "class_name": "Dense", "config": {"name": "dense_3", "trainable": true, "units": 10, "activation": 
"softmax", "use_bias": true, "kernel_initializer": {"class_name": "VarianceScaling", "config": {"scale": 1.0, "mode": "fan_avg", "distribution": "uniform", "seed": null}}, "bias_initializer": {"class_name": "Zeros", "config": {}}, "kernel_regularizer": null, "bias_regularizer": null, "activity_regularizer": null, "kernel_constraint": null, "bias_constraint": null}, "inbound_nodes": [[["dense_2", 0, 0, {}]]]}], "input_layers": [["input_1", 0, 0]], "output_layers": [["dense_3", 0, 0]]}, "keras_version": "2.2.4", "backend": "tensorflow"}
--------------------------------------------------------------------------------
/misc/saved_weights.h5:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/sherylll/lenet5-accelerator/3cc3021ad060155d1e3f96b18353a15f13f2c88d/misc/saved_weights.h5
--------------------------------------------------------------------------------
/nnet_utils/README.md:
--------------------------------------------------------------------------------
1 | Adapted from https://github.com/Xilinx/RFNoC-HLS-NeuralNet/tree/master/rfnoc/hls/nnet_lib
--------------------------------------------------------------------------------
/nnet_utils/nnet_batchnorm.h:
--------------------------------------------------------------------------------
1 | //
2 | // rfnoc-hls-neuralnet: Vivado HLS code for neural-net building blocks
3 | //
4 | // Copyright (C) 2017 EJ Kreinar
5 | //
6 | // This program is free software: you can redistribute it and/or modify
7 | // it under the terms of the GNU General Public License as published by
8 | // the Free Software Foundation, either version 3 of the License, or
9 | // (at your option) any later version.
10 | //
11 | // This program is distributed in the hope that it will be useful,
12 | // but WITHOUT ANY WARRANTY; without even the implied warranty of
13 | // MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
14 | // GNU General Public License for more details.
15 | //
16 | // You should have received a copy of the GNU General Public License
17 | // along with this program. If not, see <http://www.gnu.org/licenses/>.
18 | //
19 |
20 | #ifndef NNET_BATCHNORM_H_
21 | #define NNET_BATCHNORM_H_
22 |
23 | #include "nnet_common.h"
24 | #include "hls_stream.h"
25 | #include <cmath>
26 |
27 | namespace nnet {
28 |
29 | struct batchnorm_config
30 | {
31 | // Internal data type definitions
32 | typedef float beta_t;
33 | typedef float scale_t;
34 | typedef float mean_t;
35 |
36 | // Layer Sizes
37 | static const unsigned n_in = 10;
38 | static const unsigned n_filt = -1;
39 |
40 | // Resource reuse info
41 | static const unsigned io_type = io_parallel;
42 | static const unsigned reuse_factor = 1;
43 | static const bool store_weights_in_bram = false;
44 | static const unsigned n_zeros = 0;
45 | // partitioning arrays cyclically to go with roll factors?
46 | };
47 |
48 | template<class data_T, class res_T, typename CONFIG_T>
49 | void normalize(
50 | data_T data[CONFIG_T::n_in],
51 | res_T res[CONFIG_T::n_in],
52 | typename CONFIG_T::scale_t scale[CONFIG_T::n_in],
53 | typename CONFIG_T::beta_t beta[CONFIG_T::n_in],
54 | typename CONFIG_T::mean_t mean[CONFIG_T::n_in])
55 | {
56 | data_T cache;
57 |
58 | // Use a function_instantiate in case it helps to explicitly optimize unchanging weights/biases
59 | #pragma HLS function_instantiate variable=scale,beta,mean
60 |
61 | if (CONFIG_T::io_type == io_parallel){
62 | // For parallel inputs:
63 | // - completely partition arrays -- target fabric
64 | // - if we have an unroll factor, limit number of multipliers
65 | #pragma HLS PIPELINE II=CONFIG_T::reuse_factor
66 |
67 | // #pragma HLS ARRAY_PARTITION variable=weights complete // remove this line for now, it breaks compression sometimes
68 | #pragma HLS ARRAY_PARTITION variable=scale complete
69 | #pragma HLS ARRAY_PARTITION variable=beta complete
70 | #pragma HLS ARRAY_PARTITION variable=mean complete
71 |
72 | int multiplier_limit = ceil(float(CONFIG_T::n_in) / float(CONFIG_T::reuse_factor));
73 | #pragma HLS ALLOCATION instances=mul limit=multiplier_limit operation
74 |
75 | } else if (CONFIG_T::io_type == io_serial){
76 | #pragma HLS ARRAY_RESHAPE variable=scale complete dim=1
77 | #pragma HLS ARRAY_RESHAPE variable=beta complete dim=1
78 | #pragma HLS ARRAY_RESHAPE variable=mean complete dim=1
79 | #pragma HLS DATAFLOW
80 | }
81 |
82 | // Calculate result
83 | Result: for(int ires = 0; ires < CONFIG_T::n_in; ires++){
84 | if (CONFIG_T::io_type == io_serial){
85 | #pragma HLS UNROLL
86 | #pragma HLS PIPELINE
87 | }
88 | if(CONFIG_T::n_filt==-1) res[ires] = (res_T) (data[ires]-mean[ires])*scale[ires]+beta[ires];
89 | else{
90 | int norm_index = ires%CONFIG_T::n_filt;
91 | res[ires] = (res_T) (data[ires]-mean[norm_index])*scale[norm_index]+beta[norm_index];
92 | }
93 | }
94 |
95 | }
96 |
97 | }
98 |
99 | #endif
100 |
--------------------------------------------------------------------------------
/nnet_utils/nnet_common.h:
--------------------------------------------------------------------------------
1 | //
2 | // rfnoc-hls-neuralnet: Vivado HLS code for neural-net building blocks
3 | //
4 | // Copyright (C) 2017 EJ Kreinar
5 | //
6 | // This program is free software: you can redistribute it and/or modify
7 | // it under the terms of the GNU General Public License as published by
8 | // the Free Software Foundation, either version 3 of the License, or
9 | // (at your option) any later version.
10 | //
11 | // This program is distributed in the hope that it will be useful,
12 | // but WITHOUT ANY WARRANTY; without even the implied warranty of
13 | // MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
14 | // GNU General Public License for more details.
15 | //
16 | // You should have received a copy of the GNU General Public License
17 | // along with this program. If not, see <http://www.gnu.org/licenses/>.
18 | //
19 |
20 | #ifndef NNET_COMMON_H_
21 | #define NNET_COMMON_H_
22 |
23 | #include "ap_fixed.h"
24 |
25 | namespace nnet {
26 |
27 | // Common type definitions
28 | enum io_type {io_parallel = 0, io_serial};
29 |
30 | // Default data types (??) TODO: Deprecate
31 | typedef ap_fixed<16,4> weight_t_def;
32 | typedef ap_fixed<16,4> bias_t_def;
33 | typedef ap_fixed<32,10> accum_t_def;
34 |
35 | template<class data_T, int NIN1, int NIN2>
36 | void merge(
37 | data_T data1[NIN1],
38 | data_T data2[NIN2],
39 | data_T res[NIN1+NIN2])
40 | {
41 | for(int ii=0; ii<NIN1; ii++){
42 | res[ii] = data1[ii];
43 | }
44 | for(int ii=0; ii<NIN2; ii++){
45 | res[NIN1+ii] = data2[ii];
46 | }
47 | }
48 |
49 | }
50 |
51 | #endif
52 |
--------------------------------------------------------------------------------
/nnet_utils/nnet_conv.h:
--------------------------------------------------------------------------------
1 | //
2 | // rfnoc-hls-neuralnet: Vivado HLS code for neural-net building blocks
3 | //
4 | // Copyright (C) 2017 EJ Kreinar
5 | //
6 | // This program is free software: you can redistribute it and/or modify
7 | // it under the terms of the GNU General Public License as published by
8 | // the Free Software Foundation, either version 3 of the License, or
9 | // (at your option) any later version.
10 | //
11 | // This program is distributed in the hope that it will be useful,
12 | // but WITHOUT ANY WARRANTY; without even the implied warranty of
13 | // MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
14 | // GNU General Public License for more details.
15 | //
16 | // You should have received a copy of the GNU General Public License
17 | // along with this program. If not, see <http://www.gnu.org/licenses/>.
18 | //
19 |
20 | #ifndef NNET_CONV_H_
21 | #define NNET_CONV_H_
22 |
23 | #include "nnet_common.h"
24 | #include <cmath>
25 |
26 | namespace nnet {
27 |
28 | struct conv_config
29 | {
30 | // Internal data type definitions
31 | typedef float bias_t;
32 | typedef float weight_t;
33 | typedef float accum_t;
34 |
35 | // Convolutional parameters
36 | static const unsigned pad_left = 4;
37 | static const unsigned pad_right = 5;
38 | static const unsigned y_in = 128;
39 | static const unsigned n_chan = 9;
40 | static const unsigned y_filt = 10;
41 | static const unsigned n_filt = 4;
42 | static const unsigned stride = 1;
43 | static const unsigned y_out = 128;
44 |
45 | static const unsigned reuse_factor = 1;
46 | static const bool store_weights_in_bram = false;
47 | static const unsigned n_zeros = 0; // not used yet
48 | };
49 |
50 |
51 | //Computes multiplier limit
52 | //This function should not be synthesized into firmware
53 | template<typename CONFIG_T>
54 | int compute_multiplier_limit(
55 | typename CONFIG_T::weight_t weights[CONFIG_T::y_filt * CONFIG_T::n_chan * CONFIG_T::n_filt]
56 | )
57 | {
58 | int n_mult = 0;
59 | for(int ii = 0; ii < CONFIG_T::y_out; ii++) {
60 | for(int ff = 0; ff < CONFIG_T::n_filt; ff++){
61 | for(int cc = 0; cc < CONFIG_T::n_chan; cc++){
62 | for(int jj = 0; jj < CONFIG_T::y_filt; jj++){
63 |
64 | int index_weight = jj*CONFIG_T::n_chan*CONFIG_T::n_filt + cc*CONFIG_T::n_filt + ff;
65 |
66 | if((ii*CONFIG_T::stride+jj) < CONFIG_T::pad_left || (ii*CONFIG_T::stride+jj) >= (CONFIG_T::pad_left + CONFIG_T::y_in)){
67 | //padded -- do nothing
68 | continue;
69 | }
70 | else {
71 | //need to tune this cut?
72 | if( weights[index_weight] > 1e-20 || weights[index_weight] < -1e-20 ){
73 | n_mult++;
74 | }//end if nonzero weight
75 | }//end not padding
76 | }//end loop accross filter
77 | }//end channel loop
78 | }//end filter loop
79 | }//end output loop
80 |
81 | return ceil( float(n_mult) / float(CONFIG_T::reuse_factor) );
82 |
83 | }//end compute_n_mult
84 |
85 |
86 | template<class data_T, class res_T, typename CONFIG_T>
87 | void conv_1d(
88 | data_T data[CONFIG_T::y_in][CONFIG_T::n_chan],
89 | res_T res[CONFIG_T::y_out][CONFIG_T::n_filt],
90 | typename CONFIG_T::weight_t weights[CONFIG_T::y_filt * CONFIG_T::n_chan * CONFIG_T::n_filt],
91 | typename CONFIG_T::bias_t biases[CONFIG_T::n_filt])
92 | {
93 |
94 | typename CONFIG_T::accum_t mult[CONFIG_T::y_out * CONFIG_T::n_filt * CONFIG_T::n_chan * CONFIG_T::y_filt];
95 | typename CONFIG_T::accum_t acc[CONFIG_T::y_out][CONFIG_T::n_filt];
96 |
97 | #pragma HLS ARRAY_PARTITION variable=mult complete dim=0
98 | #pragma HLS ARRAY_PARTITION variable=acc complete dim=0
99 |
100 | // Use a function_instantiate in case it helps to explicitly optimize unchanging weights/biases
101 | #pragma HLS function_instantiate variable=weights,biases
102 |
103 | // Parallel mode
104 | #pragma HLS PIPELINE
105 | #pragma HLS ARRAY_PARTITION variable=biases complete dim=0
106 |
107 | // Limit multipliers to control parallelization
108 | const int multiplier_limit = compute_multiplier_limit<CONFIG_T>(weights);
109 | #pragma HLS ALLOCATION instances=mul limit=multiplier_limit operation
110 |
111 | // Convolve, saving all multiplication results to accumulate later
112 | ConvOut: for(int ii = 0; ii < CONFIG_T::y_out; ii++) {
113 | ConvFilt: for(int ff = 0; ff < CONFIG_T::n_filt; ff++){
114 | ConvChan: for(int cc = 0; cc < CONFIG_T::n_chan; cc++){
115 | ConvMult: for(int jj = 0; jj < CONFIG_T::y_filt; jj++){
116 |
117 | int index_mult = ii*CONFIG_T::n_filt*CONFIG_T::n_chan*CONFIG_T::y_filt + ff*CONFIG_T::n_chan*CONFIG_T::y_filt + cc*CONFIG_T::y_filt + jj;
118 | int index_weight = jj*CONFIG_T::n_chan*CONFIG_T::n_filt + cc*CONFIG_T::n_filt + ff;
119 |
120 | if((ii*CONFIG_T::stride+jj) < CONFIG_T::pad_left || (ii*CONFIG_T::stride+jj) >= (CONFIG_T::pad_left + CONFIG_T::y_in)){
121 | mult[index_mult] = 0;
122 | }
123 | else {
124 | mult[index_mult] = data[ii*CONFIG_T::stride+jj-CONFIG_T::pad_left][cc] * weights[index_weight];
125 | }
126 | }
127 | }//end channel loop
128 | }//end filter loop
129 | }//end output loop
130 |
131 |
132 | // Initialize accumulator with input biases
133 | for(int ii = 0; ii < CONFIG_T::y_out; ii++) {
134 | for(int ff = 0; ff < CONFIG_T::n_filt; ff++) {
135 | acc[ii][ff]=biases[ff];
136 | }
137 | }
138 |
139 |
140 | // Accumulate multiplication result
141 | AccumOut: for(int ii = 0; ii < CONFIG_T::y_out; ii++) {
142 | AccumFilt: for(int ff = 0; ff < CONFIG_T::n_filt; ff++) {
143 | //Do "dot product" sum within filter and sum over channels
144 | AccumChan: for(int cc = 0; cc < CONFIG_T::n_chan; cc++){
145 | AccumDot: for(int jj = 0; jj < CONFIG_T::y_filt; jj++){
146 | int index_mult = ii*CONFIG_T::n_filt*CONFIG_T::n_chan*CONFIG_T::y_filt + ff*CONFIG_T::n_chan*CONFIG_T::y_filt + cc*CONFIG_T::y_filt + jj;
147 | acc[ii][ff] += mult[index_mult];
148 | }//end dot product loop
149 | }//end channel loop
150 | }//end filter loop
151 | }//end output loop
152 |
153 |
154 | // Cast to "res_t" type
155 | for(int ii = 0; ii < CONFIG_T::y_out; ii++) {
156 | for(int ff = 0; ff < CONFIG_T::n_filt; ff++) {
157 | res[ii][ff] = (res_T)(acc[ii][ff]);
158 | }
159 | }
160 | }
161 |
162 |
163 | template<class data_T, int NROWS, int NCOLS>
164 | void flatten(
165 | data_T data[NROWS][NCOLS],
166 | data_T res[NROWS*NCOLS])
167 | {
168 |
169 | //Initialize
170 | //for(int i=0; i<NROWS*NCOLS; i++) res[i] = 0;
171 |
172 | for(int r=0; r<NROWS; r++){
173 | for(int c=0; c<NCOLS; c++){
174 | res[r*NCOLS+c] = data[r][c];
175 | }
176 | }
177 | }
178 |
179 |
180 |
181 |
182 | template<class data_T, int NROWS, int NCOLS>
183 | void unflatten(
184 | data_T data[NROWS*NCOLS],
185 | data_T res[NROWS][NCOLS])
186 | {
187 | for(int r=0; r<NROWS; r++){
188 | for(int c=0; c<NCOLS; c++){
189 | res[r][c] = data[r*NCOLS+c];
190 | }
191 | }
192 | }
193 |
194 | }
195 |
196 | #endif
197 |
--------------------------------------------------------------------------------
/nnet_utils/nnet_conv2d.h:
--------------------------------------------------------------------------------
1 | //
2 | // rfnoc-hls-neuralnet: Vivado HLS code for neural-net building blocks
3 | //
4 | // Copyright (C) 2017 EJ Kreinar
5 | //
6 | // This program is free software: you can redistribute it and/or modify
7 | // it under the terms of the GNU General Public License as published by
8 | // the Free Software Foundation, either version 3 of the License, or
9 | // (at your option) any later version.
10 | //
11 | // This program is distributed in the hope that it will be useful,
12 | // but WITHOUT ANY WARRANTY; without even the implied warranty of
13 | // MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
14 | // GNU General Public License for more details.
15 | //
16 | // You should have received a copy of the GNU General Public License
17 | // along with this program. If not, see <http://www.gnu.org/licenses/>.
18 | //
19 |
20 | #ifndef NNET_CONV2D_H_
21 | #define NNET_CONV2D_H_
22 |
23 | #include "nnet_common.h"
24 | #include <cmath>
25 |
26 | namespace nnet
27 | {
28 |
29 | struct conv2d_config
30 | {
31 | // Internal data type definitions
32 | typedef float bias_t;
33 | typedef float weight_t;
34 | typedef float accum_t;
35 |
36 | // Convolutional parameters
37 | static const unsigned pad_top = 4;
38 | static const unsigned pad_bottom = 5;
39 | static const unsigned pad_left = 4;
40 | static const unsigned pad_right = 5;
41 | static const unsigned in_height = 128;
42 | static const unsigned in_width = 128;
43 | static const unsigned n_chan = 9;
44 | static const unsigned filt_height = 10;
45 | static const unsigned filt_width = 10;
46 | static const unsigned n_filt = 4;
47 | static const unsigned stride_height = 1;
48 | static const unsigned stride_width = 1;
49 | static const unsigned out_height = 128;
50 | static const unsigned out_width = 128;
51 |
52 | static const unsigned reuse_factor = 1;
53 | static const bool store_weights_in_bram = false;
54 | static const unsigned n_zeros = 0; // not used yet
55 | };
56 |
57 | template<class data_T, class res_T, typename CONFIG_T>
58 | void apply_filter(
59 | data_T data_block[CONFIG_T::filt_height * CONFIG_T::filt_width],
60 | res_T mult[CONFIG_T::filt_height * CONFIG_T::filt_width],
61 | typename CONFIG_T::weight_t weights[CONFIG_T::filt_height * CONFIG_T::filt_width * CONFIG_T::n_chan * CONFIG_T::n_filt],
62 | int oh, int ow, int ff, int cc)
63 | {
64 | #pragma HLS inline off
65 | #pragma HLS PIPELINE
66 | int oh_offset = oh * CONFIG_T::stride_height;
67 | int ow_offset = ow * CONFIG_T::stride_width;
68 | #pragma HLS ARRAY_PARTITION variable=mult
69 | #pragma HLS array_partition variable=data_block
70 |
71 | ConvFiltHeight:
72 | for (int fh = 0; fh < CONFIG_T::filt_height; fh++)
73 | {
74 | ConvFiltWidth:
75 | for (int fw = 0; fw < CONFIG_T::filt_width; fw++)
76 | {
77 | int index_weight = fh * CONFIG_T::filt_width * CONFIG_T::n_chan * CONFIG_T::n_filt + fw * CONFIG_T::n_chan * CONFIG_T::n_filt + cc * CONFIG_T::n_filt + ff;
78 |
79 | if ((oh * CONFIG_T::stride_height + fh) < CONFIG_T::pad_top || (oh * CONFIG_T::stride_height + fh) >= (CONFIG_T::pad_top + CONFIG_T::in_height) || (ow * CONFIG_T::stride_width + fw) < CONFIG_T::pad_left || (ow * CONFIG_T::stride_width + fw) >= (CONFIG_T::pad_left + CONFIG_T::in_width))
80 | {
81 | mult[fh * CONFIG_T::filt_width + fw] = 0;
82 | }
83 | else
84 | {
85 | mult[fh * CONFIG_T::filt_width + fw] = data_block[fh * CONFIG_T::filt_width + fw] * weights[index_weight];
86 | }
87 | } //end filter width loop
88 | } //end filter height loop
89 | }
90 |
91 | template<class data_T, class res_T, typename CONFIG_T>
92 | void conv_2d(
93 | data_T data[CONFIG_T::in_height * CONFIG_T::in_width * CONFIG_T::n_chan],
94 | res_T res[CONFIG_T::out_height][CONFIG_T::out_width][CONFIG_T::n_filt],
95 | typename CONFIG_T::weight_t weights[CONFIG_T::filt_height * CONFIG_T::filt_width * CONFIG_T::n_chan * CONFIG_T::n_filt],
96 | typename CONFIG_T::bias_t biases[CONFIG_T::n_filt])
97 | {
98 | //Convert data to 1D
99 | data_T data_2d[CONFIG_T::in_height * CONFIG_T::in_width][CONFIG_T::n_chan];
100 | #pragma HLS ARRAY_PARTITION variable=data_2d dim = 1
101 | for (int ih = 0; ih < CONFIG_T::in_height; ih++)
102 | {
103 | for (int iw = 0; iw < CONFIG_T::in_width; iw++)
104 | {
105 | #pragma HLS pipeline
106 | for (int cc = 0; cc < CONFIG_T::n_chan; cc++)
107 | {
108 | data_2d[ih * CONFIG_T::in_width + iw][cc] = data[ih * CONFIG_T::in_width * CONFIG_T::n_chan + iw * CONFIG_T::n_chan + cc];
109 | }
110 | }
111 | }
112 |
113 | typename CONFIG_T::accum_t mult[CONFIG_T::out_height * CONFIG_T::out_width][CONFIG_T::n_filt][CONFIG_T::n_chan][CONFIG_T::filt_height * CONFIG_T::filt_width];
114 | #pragma HLS ARRAY_PARTITION variable=mult dim = 2
115 | #pragma HLS ARRAY_PARTITION variable=mult dim = 3
116 | #pragma HLS ARRAY_PARTITION variable=mult dim = 4
117 |
118 | typename CONFIG_T::accum_t acc[CONFIG_T::out_height * CONFIG_T::out_width][CONFIG_T::n_filt];
119 | #pragma HLS ARRAY_PARTITION variable=acc dim = 2
120 | // Convolve, saving all multiplication results to accumulate later
121 | ConvOutHeight:
122 | for (int oh = 0; oh < CONFIG_T::out_height; oh++)
123 | {
124 | ConvOutWidth:
125 | for (int ow = 0; ow < CONFIG_T::out_width; ow++)
126 | {
127 | ConvFilt:
128 | for (int ff = 0; ff < CONFIG_T::n_filt; ff++)
129 | {
130 | //#pragma HLS PIPELINE
131 | ConvChan:
132 | for (int cc = 0; cc < CONFIG_T::n_chan; cc++)
133 | {
134 | data_T data_block[CONFIG_T::filt_height * CONFIG_T::filt_width];
135 | for (int fh = 0; fh < CONFIG_T::filt_height; fh++)
136 | {
137 | #pragma HLS pipeline
138 | for (int fw = 0; fw < CONFIG_T::filt_width; fw++)
139 | {
140 | data_block[fh * CONFIG_T::filt_width + fw] = data_2d[(oh * CONFIG_T::stride_height + fh - CONFIG_T::pad_top) * CONFIG_T::in_width + (ow * CONFIG_T::stride_width + fw - CONFIG_T::pad_left)][cc];
141 | }
142 | }
143 | apply_filter<data_T, typename CONFIG_T::accum_t, CONFIG_T>(data_block, mult[oh * CONFIG_T::out_width + ow][ff][cc], weights, oh, ow, ff, cc);
144 | } //end channel loop
145 | } //end filter loop
146 | } //end output width loop
147 | } //end output height loop
148 |
149 | // Initialize accumulator with input biases
150 | for (int oh = 0; oh < CONFIG_T::out_height; oh++)
151 | {
152 | for (int ow = 0; ow < CONFIG_T::out_width; ow++)
153 | {
154 | #pragma HLS pipeline
155 | for (int ff = 0; ff < CONFIG_T::n_filt; ff++)
156 | {
157 | acc[oh * CONFIG_T::out_width + ow][ff] = biases[ff];
158 | }
159 | }
160 | }
161 |
162 | // Accumulate multiplication result
163 | AccumOutHeight:
164 | for (int oh = 0; oh < CONFIG_T::out_height; oh++)
165 | {
166 | AccumOutWidth:
167 | for (int ow = 0; ow < CONFIG_T::out_width; ow++)
168 | {
169 | #pragma HLS pipeline
170 | AccumFilt:
171 | for (int ff = 0; ff < CONFIG_T::n_filt; ff++)
172 | {
173 | typename CONFIG_T::accum_t temp = 0;
174 | //Do "dot product" sum within filter and sum over channels
175 | AccumChan:
176 | for (int cc = 0; cc < CONFIG_T::n_chan; cc++)
177 | {
178 | AccumDotHeight:
179 | for (int fh = 0; fh < CONFIG_T::filt_height; fh++)
180 | {
181 | AccumDotWidth:
182 | for (int fw = 0; fw < CONFIG_T::filt_width; fw++)
183 | {
184 | temp += mult[oh * CONFIG_T::out_width + ow][ff][cc][fh * CONFIG_T::filt_width + fw];
185 | } //end dot product filter width loop
186 | } //end dot product filter height loop
187 | } //end n channel loop
188 | acc[oh * CONFIG_T::out_width + ow][ff] = temp;
189 | } //end n filter loop
190 | } //end output width loop
191 | } //end output height loop
192 |
193 | // relu
194 | for (int oh = 0; oh < CONFIG_T::out_height; oh++)
195 | {
196 | for (int ow = 0; ow < CONFIG_T::out_width; ow++)
197 | {
198 | #pragma HLS pipeline
199 | for (int ff = 0; ff < CONFIG_T::n_filt; ff++)
200 | {
201 | int index = oh * CONFIG_T::out_width + ow;
202 | if (acc[index][ff] > 0)
203 | res[oh][ow][ff] = (res_T)acc[index][ff];
204 | else
205 | res[oh][ow][ff] = 0;
206 | }
207 | }
208 | }
209 |
210 | } //end conv2d
211 |
212 | } // namespace nnet
213 |
214 | #endif
215 |
--------------------------------------------------------------------------------
/nnet_utils/nnet_helpers.h:
--------------------------------------------------------------------------------
1 | //
2 | // rfnoc-hls-neuralnet: Vivado HLS code for neural-net building blocks
3 | //
4 | // Copyright (C) 2017 EJ Kreinar
5 | //
6 | // This program is free software: you can redistribute it and/or modify
7 | // it under the terms of the GNU General Public License as published by
8 | // the Free Software Foundation, either version 3 of the License, or
9 | // (at your option) any later version.
10 | //
11 | // This program is distributed in the hope that it will be useful,
12 | // but WITHOUT ANY WARRANTY; without even the implied warranty of
13 | // MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
14 | // GNU General Public License for more details.
15 | //
16 | // You should have received a copy of the GNU General Public License
17 | // along with this program. If not, see <http://www.gnu.org/licenses/>.
18 | //
19 |
20 | #ifndef NNET_HELPERS_H
21 | #define NNET_HELPERS_H
22 |
23 | #include <stdio.h>
24 | #include <stdlib.h>
25 | #include <iostream>
26 | #include "hls_stream.h"
27 |
28 | namespace nnet {
29 |
30 | template <class dataType, int nrows>
31 | int read_file_1D(const char * filename, dataType data[nrows])
32 | {
33 | FILE *fp;
34 | fp = fopen(filename, "r");
35 | if (fp == 0) {
36 | return -1;
37 | }
38 | // Read data from file
39 | float newval;
40 | for (int ii = 0; ii < nrows; ii++){
41 | if (fscanf(fp, "%f\n", &newval) == 1){
42 | data[ii] = newval;
43 | } else {
44 | return -2;
45 | }
46 | }
47 | fclose(fp);
48 | return 0;
49 | }
50 |
51 | template <class dataType, int nrows, int ncols>
52 | int read_file_2D(const char * filename, dataType data[nrows][ncols])
53 | {
54 | FILE *fp;
55 | fp = fopen(filename, "r");
56 | if (fp == 0) {
57 | return -1;
58 | }
59 | // Read data from file
60 | float newval;
61 | for (int ii = 0; ii < nrows; ii++) {
62 | for (int jj = 0; jj < ncols; jj++){
63 | if (fscanf(fp, "%f\n", &newval) == 1){
64 | data[ii][jj] = newval;
65 | } else {
66 | return -2;
67 | }
68 | }
69 | }
70 | fclose(fp);
71 | return 0;
72 | }
73 |
74 | template <class in_T, class out_T, int N_IN>
75 | void change_type(hls::stream<in_T> &in, hls::stream<out_T> &out)
76 | {
77 | in_T datareg;
78 | hls::stream<out_T> input_trunc;
79 | for (int ii=0; ii<N_IN; ii++){
80 | out << (out_T) in.read();
81 | }
82 | }
83 |
84 | template <class data_T, int N_IN>
85 | void hls_stream_debug(hls::stream<data_T> &data, hls::stream<data_T> &res)
86 | {
87 | data_T datareg;
88 | for (int ii=0; ii<N_IN; ii++) {
89 | datareg = data.read();
90 | std::cout << "[" << ii << "]: " << datareg << std::endl;
91 | res << datareg;
92 | }
93 | }
94 |
95 | }
96 |
97 | #endif
98 |
--------------------------------------------------------------------------------
/nnet_utils/nnet_layer.h:
--------------------------------------------------------------------------------
1 | //
2 | // rfnoc-hls-neuralnet: Vivado HLS code for neural-net building blocks
3 | //
4 | // Copyright (C) 2017 EJ Kreinar
5 | //
6 | // This program is free software: you can redistribute it and/or modify
7 | // it under the terms of the GNU General Public License as published by
8 | // the Free Software Foundation, either version 3 of the License, or
9 | // (at your option) any later version.
10 | //
11 | // This program is distributed in the hope that it will be useful,
12 | // but WITHOUT ANY WARRANTY; without even the implied warranty of
13 | // MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
14 | // GNU General Public License for more details.
15 | //
16 | // You should have received a copy of the GNU General Public License
17 | // along with this program. If not, see <http://www.gnu.org/licenses/>.
18 | //
19 |
20 | #ifndef NNET_LAYER_H_
21 | #define NNET_LAYER_H_
22 |
23 | #include "nnet_common.h"
24 | #include "hls_stream.h"
25 | #include <cmath>
26 |
27 | namespace nnet {
28 |
29 | struct layer_config
30 | {
31 | // Internal data type definitions
32 | typedef float bias_t;
33 | typedef float weight_t;
34 | typedef float accum_t;
35 |
36 | // Layer Sizes
37 | static const unsigned n_in = 10;
38 | static const unsigned n_out = 10;
39 |
40 | // Resource reuse info
41 | static const unsigned io_type = io_parallel;
42 | static const unsigned reuse_factor = 1;
43 | static const bool store_weights_in_bram = false;
44 | static const unsigned n_zeros = 0;
45 | // partitioning arrays cyclically to go with roll factors?
46 | };
47 |
48 | template<class data_T, class res_T, typename CONFIG_T>
49 | void compute_layer(
50 | data_T data[CONFIG_T::n_in],
51 | res_T res[CONFIG_T::n_out],
52 | typename CONFIG_T::weight_t weights[CONFIG_T::n_in*CONFIG_T::n_out],
53 | typename CONFIG_T::bias_t biases[CONFIG_T::n_out])
54 | {
55 | data_T cache;
56 | // typename CONFIG_T::accum_t mult[CONFIG_T::n_in*CONFIG_T::n_out];
57 | typename CONFIG_T::accum_t acc[CONFIG_T::n_out];
58 |
59 | // Use a function_instantiate in case it helps to explicitly optimize unchanging weights/biases
60 | // #pragma HLS function_instantiate variable=weights,biases
61 | int cycle_factor = CONFIG_T::n_out/CONFIG_T::reuse_factor;
62 | #pragma HLS ARRAY_PARTITION variable=weights cyclic factor=cycle_factor
63 | // #pragma HLS ARRAY_PARTITION variable=mult cyclic factor=cycle_factor
64 | #pragma HLS ARRAY_PARTITION variable=acc complete
65 | // #pragma HLS DATAFLOW
66 | // #pragma HLS STREAM variable=mult depth=1
67 | // #pragma HLS STREAM variable=acc depth=1
68 | // if (CONFIG_T::store_weights_in_bram){
69 | // #pragma HLS RESOURCE variable=weights core=ROM_2P_BRAM
70 | // }
71 |
72 | // Initialize accumulator with input biases
73 | ResetAccum: for(int iacc = 0; iacc < CONFIG_T::n_out; iacc++) {
74 | if (CONFIG_T::io_type == io_serial){
75 | #pragma HLS UNROLL
76 | }
77 | acc[iacc] = (typename CONFIG_T::accum_t) biases[iacc];
78 | }
79 |
80 | // Do the matrix-multiply
81 | Product1: for(int ii = 0; ii < CONFIG_T::n_in; ii++) {
82 | if (CONFIG_T::io_type == io_serial){
83 | #pragma HLS PIPELINE
84 | }
85 | cache = data[ii];
86 | Product2: for(int jj = 0; jj < CONFIG_T::n_out; jj++) {
87 | // if (CONFIG_T::io_type == io_serial) {
88 | // int multiplier_limit = ceil(float(CONFIG_T::n_out) / float(CONFIG_T::reuse_factor));
89 | // #pragma HLS ALLOCATION instances=mul limit=multiplier_limit operation
90 | // }
91 | int index = ii*CONFIG_T::n_out+jj;
92 | // mult[index] = cache * weights[index];
93 | typename CONFIG_T::accum_t mult = cache * weights[index];
94 | acc[jj] += mult;
95 | }
96 | }
97 |
98 | // Accumulate multiplication result
99 | // Accum1: for(int ii = 0; ii < CONFIG_T::n_in; ii++) {
100 | // if (CONFIG_T::io_type == io_serial){
101 | // #pragma HLS PIPELINE
102 | // }
103 | // Accum2: for(int jj = 0; jj < CONFIG_T::n_out; jj++) {
104 | // int index = ii*CONFIG_T::n_out+jj;
105 | // acc[jj] += mult[index];
106 | // }
107 | // }
108 |
109 | // Cast to "res_t" type
110 | Result: for(int ires = 0; ires < CONFIG_T::n_out; ires++){
111 | if (CONFIG_T::io_type == io_serial){
112 | #pragma HLS UNROLL
113 | }
114 | if (acc[ires]>0) res[ires] = (res_T) (acc[ires]);
115 | else res[ires] = 0;
116 | }
117 | }
118 |
119 | }
120 |
121 | #endif
122 |
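// Worked example of the partitioning arithmetic above (an illustrative note,
// not from the original sources): with the fc1 settings of config5 in
// parameters.h (n_in = 256, n_out = 120, reuse_factor = 24), we get
// cycle_factor = n_out / reuse_factor = 120 / 24 = 5, so the weight array is
// split cyclically into 5 banks; roughly, the 120 per-input products are
// spread over 24 reuse cycles with about 5 weight reads per cycle.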
--------------------------------------------------------------------------------
/nnet_utils/nnet_pooling.h:
--------------------------------------------------------------------------------
1 | #ifndef NNET_POOLING_H_
2 | #define NNET_POOLING_H_
3 |
4 | #include <iostream>
5 | #include "nnet_helpers.h"
6 |
7 | namespace nnet
8 | {
9 |
10 | // Return the maximum value from an array
11 | template<class T, int N>
12 | T max(T x[N])
13 | {
14 | #pragma HLS inline off
15 | T y = x[0];
16 | for (int i = 1; i < N; i++)
17 | {
18 | y = x[i] > y ? x[i] : y;
19 | }
20 | return y;
21 | }
22 |
23 | template<int W, int N>
24 | ap_int<W> avg(ap_int<W> (&x)[N])
25 | {
26 | // Use a wider accumulator than the input to avoid overflow
27 | ap_int<W + ceillog2(N)> tmp = 0;
28 | for (int i = 0; i < N; i++)
29 | {
30 | tmp += x[i];
31 | }
32 | tmp /= N;
33 | // Now cast back to original type
34 | ap_int<W> y = tmp;
35 | return y;
36 | }
36 | }
37 |
38 | template<int W, int I, int N>
39 | ap_fixed<W, I> avg(ap_fixed<W, I> (&x)[N])
40 | {
41 | // Use a wider accumulator than the input to avoid overflow
42 | ap_fixed<W + ceillog2(N), I + ceillog2(N)> tmp = 0;
43 | for (int i = 0; i < N; i++)
44 | {
45 | tmp += x[i];
46 | }
47 | tmp /= N;
48 | // Now cast back to original type
49 | ap_fixed<W, I> y = tmp;
50 | return y;
51 | }
52 |
53 | // Return the mean value of an array
54 | template<class T, int N>
55 | T avg(T (&x)[N])
56 | {
57 | T y = 0;
58 | for (int i = 0; i < N; i++)
59 | {
60 | y += x[i];
61 | }
62 | y /= N;
63 | return y;
64 | }
65 |
66 | // Enumeration for pooling operation (max, avg, l2norm pooling)
67 | enum Pool_Op
68 | {
69 | Max,
70 | Average
71 | }; // L2Norm };
72 | template<class T, int N, Pool_Op op>
73 | T pool_op(T (&x)[N])
74 | {
75 | switch (op)
76 | {
77 | case Max:
78 | return max<T, N>(x);
79 | case Average:
80 | return avg(x);
81 | // case L2Norm: return l2norm(x);
82 | }
83 | }
84 |
85 | template<class T, Pool_Op op>
86 | T pad_val()
87 | {
88 | /*---
89 | *- In Tensorflow, pooling ignores the value in the padded cells
90 | *- For Avg pooling, return 0 (the divisor is modified to the
91 | *- area overlapping the unpadded image).
92 | *- For max pooling, return the most negative value for the type.
93 | *- TODO this is not really generic, it assumes fixed point or integer T
94 | ---*/
95 | switch (op)
96 | {
97 | case Max:
98 | {
99 | T x = 0;
100 | x[x.width - 1] = 1;
101 | return x;
102 | break;
103 | }
104 | case Average:
105 | return 0;
106 | }
107 | }
108 |
109 | struct pooling1d_config
110 | {
111 | // IO size
112 | static const unsigned n_in = 10;
113 | static const unsigned pool_size = 2;
114 | static const unsigned n_out = n_in / pool_size;
115 | static const unsigned pad_left = 0;
116 | static const unsigned pad_right = 0;
117 | // Pooling function
118 | static const Pool_Op pool_op = Max;
119 | };
120 |
121 | template<class data_T, typename CONFIG_T>
122 | void pooling1d(data_T data[CONFIG_T::n_in], data_T res[CONFIG_T::n_out])
123 | {
124 | for (int ii = 0; ii < CONFIG_T::n_out; ii++)
125 | {
126 | data_T pool[CONFIG_T::pool_size];
127 | for (int jj = 0; jj < CONFIG_T::pool_size; jj++)
128 | {
129 | pool[jj] = data[ii * CONFIG_T::pool_size + jj];
130 | }
131 | res[ii] = pool_op<data_T, CONFIG_T::pool_size, CONFIG_T::pool_op>(pool);
132 | }
133 | }
134 |
135 | struct pooling2d_config
136 | {
137 | // IO size
138 | static const unsigned in_height = 10;
139 | static const unsigned in_width = 10;
140 | static const unsigned n_filt = 4;
141 | static const unsigned stride_height = 2;
142 | static const unsigned stride_width = 2;
143 | static const unsigned pool_height = 2;
144 | static const unsigned pool_width = 2;
145 | static const unsigned out_height = (in_height - pool_height) / stride_height + 1;
146 | static const unsigned out_width = (in_width - pool_width) / stride_width + 1;
147 | // Padding
148 | static const unsigned pad_top = 0;
149 | static const unsigned pad_bottom = 0;
150 | static const unsigned pad_left = 0;
151 | static const unsigned pad_right = 0;
152 | // Pooling function
153 | static const Pool_Op pool_op = Max;
154 | // Reuse
155 | static const unsigned reuse = 1;
156 | };
157 |
158 | template<class data_T, typename CONFIG_T>
159 | constexpr int pool_op_limit()
160 | {
161 | return (CONFIG_T::out_height * CONFIG_T::out_width) * CONFIG_T::n_filt / CONFIG_T::reuse;
162 | }
163 |
164 | template
165 | void pooling2d(data_T data[CONFIG_T::in_height][CONFIG_T::in_width][CONFIG_T::n_filt],
166 | data_T res[CONFIG_T::out_height * CONFIG_T::out_width * CONFIG_T::n_filt])
167 | {
168 |
169 | // Add any necessary padding
170 | const unsigned padded_height = CONFIG_T::in_height + CONFIG_T::pad_top + CONFIG_T::pad_bottom;
171 | const unsigned padded_width = CONFIG_T::in_width + CONFIG_T::pad_left + CONFIG_T::pad_right;
172 |
173 | for (int ff = 0; ff < CONFIG_T::n_filt; ff++)
174 | {
175 | data_T pool[CONFIG_T::pool_height * CONFIG_T::pool_width];
176 | #pragma HLS array_partition variable = pool complete
177 | // Loop over input image y in steps of stride
178 | for (int ii = 0; ii < padded_height; ii += CONFIG_T::stride_height)
179 | {
180 | // Loop over input image x in steps of stride
181 | for (int jj = 0; jj < padded_width; jj += CONFIG_T::stride_width)
182 | {
183 | #pragma HLS pipeline
184 | // Loop over pool window y
185 | for (int kk = 0; kk < CONFIG_T::stride_height; kk++)
186 | {
187 | // Loop over pool window x
188 | for (int ll = 0; ll < CONFIG_T::stride_width; ll++)
189 | {
190 | if (ii + kk < CONFIG_T::pad_top || ii + kk >= (padded_height - CONFIG_T::pad_bottom) || jj + ll < CONFIG_T::pad_left || jj + ll >= (padded_width - CONFIG_T::pad_right))
191 | {
192 | // Add padding
193 | pool[kk * CONFIG_T::stride_width + ll] = 0;
194 | }
195 | else
196 | {
197 | pool[kk * CONFIG_T::stride_width + ll] = data[ii + kk][jj + ll][ff];
198 | }
199 | }
200 | }
201 | // do the pooling
202 | res[(ii / CONFIG_T::stride_height) * CONFIG_T::out_width * CONFIG_T::n_filt + (jj / CONFIG_T::stride_width) * CONFIG_T::n_filt + ff] =
203 | max<data_T, CONFIG_T::pool_height * CONFIG_T::pool_width>(pool);
204 | }
205 | }
206 | }
207 | }
208 |
209 | } // namespace nnet
210 |
211 | #endif
212 |
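// Illustrative usage sketch (an addition, not from the original sources):
// pool_op dispatches on the Pool_Op template argument, e.g. a max over a
// hypothetical 2x2 window buffer:
//
//     ap_fixed<16,4> window[4];
//     // ... fill window ...
//     ap_fixed<16,4> m = nnet::pool_op<ap_fixed<16,4>, 4, nnet::Max>(window);
//
// Note that pooling2d assumes pool size equals stride (both 2 here): the
// kk/ll loops fill pool[] over stride_height x stride_width while max()
// reduces pool_height * pool_width elements, so the two must match.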
--------------------------------------------------------------------------------
/test_images/0.txt:
--------------------------------------------------------------------------------
1 | # 7
2 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
3 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
4 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
5 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
6 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
7 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
8 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
9 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.294117748737335205e-01 7.254902124404907227e-01 6.235294342041015625e-01 5.921568870544433594e-01 2.352941185235977173e-01 1.411764770746231079e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
10 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 8.705882430076599121e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.450980424880981445e-01 7.764706015586853027e-01 7.764706015586853027e-01 7.764706015586853027e-01 7.764706015586853027e-01 7.764706015586853027e-01 7.764706015586853027e-01 7.764706015586853027e-01 7.764706015586853027e-01 6.666666865348815918e-01 2.039215713739395142e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
11 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.627451121807098389e-01 4.470588266849517822e-01 2.823529541492462158e-01 4.470588266849517822e-01 6.392157077789306641e-01 8.901960849761962891e-01 9.960784316062927246e-01 8.823529481887817383e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.803921580314636230e-01 8.980392217636108398e-01 9.960784316062927246e-01 9.960784316062927246e-01 5.490196347236633301e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
12 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 6.666667014360427856e-02 2.588235437870025635e-01 5.490196123719215393e-02 2.627451121807098389e-01 2.627451121807098389e-01 2.627451121807098389e-01 2.313725501298904419e-01 8.235294371843338013e-02 9.254902005195617676e-01 9.960784316062927246e-01 4.156862795352935791e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
13 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.254902064800262451e-01 9.921568632125854492e-01 8.196078538894653320e-01 7.058823853731155396e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
14 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 8.627451211214065552e-02 9.137254953384399414e-01 1.000000000000000000e+00 3.254902064800262451e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
15 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 5.058823823928833008e-01 9.960784316062927246e-01 9.333333373069763184e-01 1.725490242242813110e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
16 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.313725501298904419e-01 9.764705896377563477e-01 9.960784316062927246e-01 2.431372553110122681e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
17 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 5.215686559677124023e-01 9.960784316062927246e-01 7.333333492279052734e-01 1.960784383118152618e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
18 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.529411926865577698e-02 8.039215803146362305e-01 9.725490212440490723e-01 2.274509817361831665e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
19 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.941176474094390869e-01 9.960784316062927246e-01 7.137255072593688965e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
20 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.941176593303680420e-01 9.843137264251708984e-01 9.411764740943908691e-01 2.235294133424758911e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
21 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 7.450980693101882935e-02 8.666666746139526367e-01 9.960784316062927246e-01 6.509804129600524902e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
22 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.176470611244440079e-02 7.960784435272216797e-01 9.960784316062927246e-01 8.588235378265380859e-01 1.372549086809158325e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
23 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.490196138620376587e-01 9.960784316062927246e-01 9.960784316062927246e-01 3.019607961177825928e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
24 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.215686276555061340e-01 8.784313797950744629e-01 9.960784316062927246e-01 4.509803950786590576e-01 3.921568859368562698e-03 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
25 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 5.215686559677124023e-01 9.960784316062927246e-01 9.960784316062927246e-01 2.039215713739395142e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
26 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.392156869173049927e-01 9.490196108818054199e-01 9.960784316062927246e-01 9.960784316062927246e-01 2.039215713739395142e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
27 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.745098054409027100e-01 9.960784316062927246e-01 9.960784316062927246e-01 8.588235378265380859e-01 1.568627506494522095e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
28 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.745098054409027100e-01 9.960784316062927246e-01 8.117647171020507812e-01 7.058823853731155396e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
29 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
30 |
--------------------------------------------------------------------------------
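Note on the format above: each `test_images/*.txt` file starts with a comment line `# <label>` giving the ground-truth digit, followed by 28 rows of 28 space-separated grayscale values normalized to [0, 1] (a flattened 28x28 MNIST image). Below is a minimal parsing sketch for convenience; it assumes NumPy is available, and `load_test_image` is a hypothetical helper for illustration, not a function shipped in this repo.

```python
import numpy as np

def load_test_image(path):
    """Hypothetical helper: parse one test_images/*.txt file.

    Line 1 holds the ground-truth digit as '# <label>'; the next
    28 lines each hold 28 space-separated pixel values in [0, 1].
    """
    with open(path) as f:
        label = int(f.readline().lstrip('#').strip())
        pixels = np.loadtxt(f)  # remaining lines -> shape (28, 28)
    return label, pixels

label, img = load_test_image('test_images/0.txt')
assert img.shape == (28, 28)
print(label)  # e.g. 7 for test_images/0.txt
```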
/test_images/1.txt:
--------------------------------------------------------------------------------
1 | # 2
2 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
3 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
4 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
5 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.549019634723663330e-01 4.901960790157318115e-01 6.705882549285888672e-01 1.000000000000000000e+00 1.000000000000000000e+00 5.882353186607360840e-01 3.647058904170989990e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
6 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 6.627451181411743164e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 8.549019694328308105e-01 1.176470592617988586e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
7 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 6.627451181411743164e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 8.352941274642944336e-01 5.568627715110778809e-01 6.901960968971252441e-01 9.921568632125854492e-01 9.921568632125854492e-01 4.784313738346099854e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
8 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.039215713739395142e-01 9.803921580314636230e-01 9.921568632125854492e-01 8.235294222831726074e-01 1.254902034997940063e-01 4.705882444977760315e-02 0.000000000000000000e+00 2.352941222488880157e-02 8.078431487083435059e-01 9.921568632125854492e-01 5.490196347236633301e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
9 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.019607961177825928e-01 9.843137264251708984e-01 8.235294222831726074e-01 9.803921729326248169e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.784313738346099854e-01 9.725490212440490723e-01 9.921568632125854492e-01 2.549019753932952881e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
10 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.215686276555061340e-01 7.058823853731155396e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 8.196078538894653320e-01 9.921568632125854492e-01 9.921568632125854492e-01 2.549019753932952881e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
11 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.588235318660736084e-01 9.686274528503417969e-01 9.921568632125854492e-01 7.764706015586853027e-01 3.921568766236305237e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
12 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.980392277240753174e-01 9.686274528503417969e-01 9.921568632125854492e-01 9.058823585510253906e-01 2.470588237047195435e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
13 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 5.019608139991760254e-01 9.921568632125854492e-01 9.921568632125854492e-01 5.647059082984924316e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
14 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 6.901960968971252441e-01 9.647058844566345215e-01 9.921568632125854492e-01 6.235294342041015625e-01 4.705882444977760315e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
15 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 9.803921729326248169e-02 9.176470637321472168e-01 9.921568632125854492e-01 9.137254953384399414e-01 1.372549086809158325e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
16 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 7.764706015586853027e-01 9.921568632125854492e-01 9.921568632125854492e-01 5.529412031173706055e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
17 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.058823645114898682e-01 9.725490212440490723e-01 9.921568632125854492e-01 7.411764860153198242e-01 4.705882444977760315e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
18 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 7.450980693101882935e-02 7.843137383460998535e-01 9.921568632125854492e-01 9.921568632125854492e-01 5.529412031173706055e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
19 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 5.254902243614196777e-01 9.921568632125854492e-01 9.921568632125854492e-01 6.784313917160034180e-01 4.705882444977760315e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
20 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 9.725490212440490723e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.803921729326248169e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
21 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 9.725490212440490723e-01 9.921568632125854492e-01 9.921568632125854492e-01 1.686274558305740356e-01 7.843137532472610474e-02 7.843137532472610474e-02 7.843137532472610474e-02 7.843137532472610474e-02 1.960784383118152618e-02 0.000000000000000000e+00 1.960784383118152618e-02 7.843137532472610474e-02 7.843137532472610474e-02 1.450980454683303833e-01 5.882353186607360840e-01 5.882353186607360840e-01 5.882353186607360840e-01 5.764706134796142578e-01 3.921568766236305237e-02 0.000000000000000000e+00
22 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 9.725490212440490723e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 6.588235497474670410e-01 5.607843399047851562e-01 6.509804129600524902e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 4.823529422283172607e-01 0.000000000000000000e+00
23 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 6.823529601097106934e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.764705896377563477e-01 9.686274528503417969e-01 9.686274528503417969e-01 6.627451181411743164e-01 4.588235318660736084e-01 4.588235318660736084e-01 2.235294133424758911e-01 0.000000000000000000e+00
24 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.627451002597808838e-01 4.823529422283172607e-01 4.823529422283172607e-01 4.823529422283172607e-01 6.509804129600524902e-01 9.921568632125854492e-01 9.921568632125854492e-01 9.921568632125854492e-01 6.078431606292724609e-01 4.823529422283172607e-01 4.823529422283172607e-01 1.607843190431594849e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
25 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
26 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
27 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
28 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
29 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
30 |
--------------------------------------------------------------------------------
/test_images/10.txt:
--------------------------------------------------------------------------------
1 | # 0
2 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
3 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
4 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
5 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
6 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.392156869173049927e-01 1.176470611244440079e-02 1.647058874368667603e-01 4.627451002597808838e-01 7.568627595901489258e-01 4.627451002597808838e-01 4.627451002597808838e-01 2.392156869173049927e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
7 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 5.490196123719215393e-02 7.019608020782470703e-01 9.607843160629272461e-01 9.254902005195617676e-01 9.490196108818054199e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.607843160629272461e-01 9.215686321258544922e-01 3.294117748737335205e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
8 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 5.921568870544433594e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.960784316062927246e-01 8.352941274642944336e-01 7.529411911964416504e-01 6.980392336845397949e-01 6.980392336845397949e-01 7.058823704719543457e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.450980424880981445e-01 1.803921610116958618e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
9 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.686274558305740356e-01 9.215686321258544922e-01 9.960784316062927246e-01 8.862745165824890137e-01 2.509804069995880127e-01 1.098039224743843079e-01 4.705882444977760315e-02 0.000000000000000000e+00 0.000000000000000000e+00 7.843137718737125397e-03 5.019608139991760254e-01 9.882352948188781738e-01 1.000000000000000000e+00 6.784313917160034180e-01 6.666667014360427856e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
10 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.196078449487686157e-01 9.960784316062927246e-01 9.921568632125854492e-01 4.196078479290008545e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 5.254902243614196777e-01 9.803921580314636230e-01 9.960784316062927246e-01 2.941176593303680420e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
11 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.470588237047195435e-01 9.960784316062927246e-01 6.196078658103942871e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 8.666666746139526367e-01 9.960784316062927246e-01 6.156862974166870117e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
12 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 7.607843279838562012e-01 9.960784316062927246e-01 4.039215743541717529e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 5.882353186607360840e-01 9.960784316062927246e-01 8.352941274642944336e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
13 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.333333402872085571e-01 8.627451062202453613e-01 9.372549057006835938e-01 2.274509817361831665e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.294117748737335205e-01 9.960784316062927246e-01 8.352941274642944336e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
14 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.941176474094390869e-01 9.960784316062927246e-01 6.705882549285888672e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.294117748737335205e-01 9.960784316062927246e-01 8.352941274642944336e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
15 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 8.392156958580017090e-01 9.372549057006835938e-01 2.352941185235977173e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.294117748737335205e-01 9.960784316062927246e-01 8.352941274642944336e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
16 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 8.392156958580017090e-01 7.803921699523925781e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.294117748737335205e-01 9.960784316062927246e-01 8.352941274642944336e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
17 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.313725605607032776e-02 8.588235378265380859e-01 7.803921699523925781e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.294117748737335205e-01 9.960784316062927246e-01 8.352941274642944336e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
18 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.843137323856353760e-01 9.960784316062927246e-01 7.803921699523925781e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 6.352941393852233887e-01 9.960784316062927246e-01 8.196078538894653320e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
19 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.843137323856353760e-01 9.960784316062927246e-01 7.803921699523925781e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.000000029802322388e-01 9.333333373069763184e-01 9.960784316062927246e-01 2.941176593303680420e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
20 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.843137323856353760e-01 9.960784316062927246e-01 7.803921699523925781e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.000000029802322388e-01 6.470588445663452148e-01 9.960784316062927246e-01 7.647058963775634766e-01 1.568627543747425079e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
21 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.588235437870025635e-01 9.450980424880981445e-01 7.803921699523925781e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.176470611244440079e-02 6.549019813537597656e-01 9.960784316062927246e-01 8.901960849761962891e-01 2.156862765550613403e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
22 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 8.392156958580017090e-01 8.352941274642944336e-01 7.843137532472610474e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.803921610116958618e-01 5.960784554481506348e-01 7.921568751335144043e-01 9.960784316062927246e-01 9.960784316062927246e-01 2.470588237047195435e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
23 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 8.392156958580017090e-01 9.960784316062927246e-01 8.000000119209289551e-01 7.058823704719543457e-01 7.058823704719543457e-01 7.058823704719543457e-01 7.058823704719543457e-01 7.058823704719543457e-01 9.215686321258544922e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.176470637321472168e-01 6.117647290229797363e-01 3.921568766236305237e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
24 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.176470696926116943e-01 8.039215803146362305e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.882352948188781738e-01 9.176470637321472168e-01 4.705882370471954346e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
25 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.019607856869697571e-01 8.235294222831726074e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.960784316062927246e-01 9.960784316062927246e-01 6.000000238418579102e-01 4.078431427478790283e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
26 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
27 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
28 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
29 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
30 |
--------------------------------------------------------------------------------
/test_images/11.txt:
--------------------------------------------------------------------------------
1 | # 6
2 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
3 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
4 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
5 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.745098061859607697e-02 8.000000119209289551e-01 9.921568632125854492e-01 6.901960968971252441e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
6 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.745098061859607697e-02 5.882353186607360840e-01 9.882352948188781738e-01 9.882352948188781738e-01 4.901960790157318115e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
7 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.588235318660736084e-01 9.882352948188781738e-01 7.294117808341979980e-01 2.196078449487686157e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
8 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 5.529412031173706055e-01 9.882352948188781738e-01 4.627451002597808838e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
9 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 6.039215922355651855e-01 9.686274528503417969e-01 1.960784345865249634e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
10 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.019607856869697571e-01 9.921568632125854492e-01 7.686274647712707520e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
11 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 5.882353186607360840e-01 9.921568632125854492e-01 7.686274647712707520e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.235294133424758911e-01 3.333333432674407959e-01 3.333333432674407959e-01 1.490196138620376587e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
12 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 8.823529481887817383e-01 9.921568632125854492e-01 3.764705955982208252e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 5.921568870544433594e-01 8.862745165824890137e-01 9.529411792755126953e-01 9.882352948188781738e-01 9.882352948188781738e-01 9.333333373069763184e-01 4.901960790157318115e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
13 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.921568766236305237e-02 8.980392217636108398e-01 8.862745165824890137e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.568627543747425079e-02 2.117647081613540649e-01 8.980392217636108398e-01 9.921568632125854492e-01 1.000000000000000000e+00 9.176470637321472168e-01 6.862745285034179688e-01 8.823529481887817383e-01 1.000000000000000000e+00 8.941176533699035645e-01 1.215686276555061340e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
14 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.313725531101226807e-01 9.882352948188781738e-01 5.882353186607360840e-01 0.000000000000000000e+00 0.000000000000000000e+00 1.019607856869697571e-01 5.019608139991760254e-01 9.882352948188781738e-01 9.882352948188781738e-01 8.901960849761962891e-01 5.254902243614196777e-01 1.098039224743843079e-01 0.000000000000000000e+00 0.000000000000000000e+00 6.980392336845397949e-01 9.882352948188781738e-01 2.196078449487686157e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
15 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 6.235294342041015625e-01 9.882352948188781738e-01 4.431372582912445068e-01 0.000000000000000000e+00 0.000000000000000000e+00 5.882353186607360840e-01 9.921568632125854492e-01 9.882352948188781738e-01 7.294117808341979980e-01 1.686274558305740356e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 5.529412031173706055e-01 9.882352948188781738e-01 2.196078449487686157e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
16 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 7.254902124404907227e-01 9.882352948188781738e-01 4.431372582912445068e-01 0.000000000000000000e+00 1.490196138620376587e-01 9.294117689132690430e-01 9.921568632125854492e-01 5.921568870544433594e-01 2.352941222488880157e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 5.529412031173706055e-01 7.921568751335144043e-01 2.352941222488880157e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
17 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 7.764706015586853027e-01 9.921568632125854492e-01 4.470588266849517822e-01 0.000000000000000000e+00 5.764706134796142578e-01 9.921568632125854492e-01 6.392157077789306641e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 6.039215922355651855e-01 7.725490331649780273e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
18 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 7.725490331649780273e-01 9.882352948188781738e-01 4.431372582912445068e-01 0.000000000000000000e+00 6.745098233222961426e-01 9.882352948188781738e-01 7.372549176216125488e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.019607856869697571e-01 9.921568632125854492e-01 6.705882549285888672e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
19 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 7.725490331649780273e-01 9.882352948188781738e-01 4.431372582912445068e-01 0.000000000000000000e+00 7.450980693101882935e-02 9.058823585510253906e-01 9.686274528503417969e-01 4.784313738346099854e-01 7.450980693101882935e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 7.843137383460998535e-01 9.568627476692199707e-01 2.196078449487686157e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
20 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.019607856869697571e-01 8.705882430076599121e-01 9.882352948188781738e-01 4.431372582912445068e-01 0.000000000000000000e+00 0.000000000000000000e+00 9.803921729326248169e-02 7.960784435272216797e-01 9.882352948188781738e-01 7.568627595901489258e-01 5.098039284348487854e-02 0.000000000000000000e+00 2.980392277240753174e-01 7.843137383460998535e-01 9.764705896377563477e-01 4.901960790157318115e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
21 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 7.254902124404907227e-01 9.921568632125854492e-01 7.019608020782470703e-01 3.921568766236305237e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.980392277240753174e-01 1.372549086809158325e-01 1.137254908680915833e-01 6.039215922355651855e-01 9.921568632125854492e-01 9.568627476692199707e-01 4.901960790157318115e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
22 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.098039224743843079e-01 8.196078538894653320e-01 9.921568632125854492e-01 7.686274647712707520e-01 3.215686380863189697e-01 2.235294133424758911e-01 2.235294133424758911e-01 5.137255191802978516e-01 7.725490331649780273e-01 9.882352948188781738e-01 9.921568632125854492e-01 8.392156958580017090e-01 3.176470696926116943e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
23 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 9.803921729326248169e-02 8.470588326454162598e-01 9.882352948188781738e-01 9.882352948188781738e-01 9.882352948188781738e-01 9.921568632125854492e-01 9.882352948188781738e-01 9.882352948188781738e-01 9.882352948188781738e-01 6.117647290229797363e-01 7.450980693101882935e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
24 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 6.274510174989700317e-02 4.039215743541717529e-01 5.450980663299560547e-01 9.411764740943908691e-01 5.490196347236633301e-01 5.450980663299560547e-01 5.450980663299560547e-01 1.568627506494522095e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
25 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
26 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
27 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
28 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
29 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
30 |
--------------------------------------------------------------------------------
/test_images/12.txt:
--------------------------------------------------------------------------------
1 | # 9
2 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
3 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
4 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
5 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
6 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
7 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
8 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
9 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.921568661928176880e-01 7.058823704719543457e-01 9.921568632125854492e-01 1.000000000000000000e+00 9.921568632125854492e-01 6.627451181411743164e-01 1.411764770746231079e-01 4.313725605607032776e-02 2.980392277240753174e-01 3.529411926865577698e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
10 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.960784383118152618e-02 2.666666805744171143e-01 8.941176533699035645e-01 9.882352948188781738e-01 9.882352948188781738e-01 9.921568632125854492e-01 9.882352948188781738e-01 9.882352948188781738e-01 6.274510025978088379e-01 7.411764860153198242e-01 9.921568632125854492e-01 3.607843220233917236e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
11 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.156862765550613403e-01 9.882352948188781738e-01 9.882352948188781738e-01 8.901960849761962891e-01 3.098039329051971436e-01 2.705882489681243896e-01 2.705882489681243896e-01 3.921568691730499268e-01 3.529411852359771729e-01 9.254902005195617676e-01 9.686274528503417969e-01 2.627451121807098389e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
12 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.686274558305740356e-01 9.137254953384399414e-01 9.882352948188781738e-01 7.254902124404907227e-01 1.960784345865249634e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.019607856869697571e-01 7.960784435272216797e-01 9.882352948188781738e-01 5.294117927551269531e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
13 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 6.588235497474670410e-01 9.921568632125854492e-01 6.980392336845397949e-01 1.450980454683303833e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.745098173618316650e-01 9.882352948188781738e-01 9.882352948188781738e-01 2.470588237047195435e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
14 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 6.078431606292724609e-01 9.921568632125854492e-01 9.490196108818054199e-01 1.647058874368667603e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.960784383118152618e-02 7.490196228027343750e-01 9.921568632125854492e-01 7.450980544090270996e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
15 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 8.117647171020507812e-01 9.882352948188781738e-01 9.019607901573181152e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.960784383118152618e-02 5.333333611488342285e-01 9.882352948188781738e-01 9.882352948188781738e-01 2.509804069995880127e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
16 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 8.117647171020507812e-01 9.882352948188781738e-01 9.019607901573181152e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.254902034997940063e-01 5.411764979362487793e-01 9.882352948188781738e-01 9.882352948188781738e-01 8.901960849761962891e-01 6.274510174989700317e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
17 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 6.470588445663452148e-01 9.882352948188781738e-01 9.764705896377563477e-01 8.117647171020507812e-01 8.117647171020507812e-01 8.117647171020507812e-01 8.941176533699035645e-01 9.921568632125854492e-01 9.882352948188781738e-01 9.882352948188781738e-01 6.274510025978088379e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
18 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.529411926865577698e-02 7.019608020782470703e-01 9.921568632125854492e-01 9.882352948188781738e-01 9.882352948188781738e-01 9.882352948188781738e-01 9.882352948188781738e-01 2.941176593303680420e-01 6.627451181411743164e-01 9.882352948188781738e-01 2.196078449487686157e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
19 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.509804069995880127e-01 4.549019634723663330e-01 4.549019634723663330e-01 2.901960909366607666e-01 0.000000000000000000e+00 5.843137502670288086e-01 9.921568632125854492e-01 8.431372642517089844e-01 8.235294371843338013e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
20 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 9.921568632125854492e-01 9.882352948188781738e-01 6.352941393852233887e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
21 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.254902034997940063e-01 9.921568632125854492e-01 9.411764740943908691e-01 1.960784345865249634e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
22 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 6.156862974166870117e-01 9.921568632125854492e-01 6.431372761726379395e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
23 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.686274558305740356e-01 9.411764740943908691e-01 9.921568632125854492e-01 3.607843220233917236e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
24 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.647058904170989990e-01 9.921568632125854492e-01 9.882352948188781738e-01 3.294117748737335205e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
25 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.470588266849517822e-01 9.882352948188781738e-01 8.196078538894653320e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
26 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 8.117647171020507812e-01 9.882352948188781738e-01 4.549019634723663330e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
27 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 6.470588445663452148e-01 9.882352948188781738e-01 4.549019634723663330e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
28 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.647058904170989990e-01 7.843137383460998535e-01 2.470588237047195435e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
29 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
30 |
--------------------------------------------------------------------------------
/test_images/13.txt:
--------------------------------------------------------------------------------
1 | # 0
2 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
3 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
4 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
5 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
6 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
7 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 6.666667014360427856e-02 2.588235437870025635e-01 5.411764979362487793e-01 1.000000000000000000e+00 9.921568632125854492e-01 6.627451181411743164e-01 5.411764979362487793e-01 9.019608050584793091e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
8 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.960784383118152618e-02 4.705882370471954346e-01 8.941176533699035645e-01 9.882352948188781738e-01 9.882352948188781738e-01 9.921568632125854492e-01 9.882352948188781738e-01 9.882352948188781738e-01 9.882352948188781738e-01 6.196078658103942871e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
9 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.235294163227081299e-01 9.882352948188781738e-01 9.882352948188781738e-01 9.882352948188781738e-01 9.882352948188781738e-01 7.450980544090270996e-01 9.882352948188781738e-01 9.882352948188781738e-01 9.882352948188781738e-01 9.882352948188781738e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
10 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.686274558305740356e-01 9.137254953384399414e-01 9.882352948188781738e-01 9.882352948188781738e-01 9.882352948188781738e-01 4.549019634723663330e-01 1.960784383118152618e-02 5.294117927551269531e-01 9.882352948188781738e-01 9.882352948188781738e-01 9.882352948188781738e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
11 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.686274558305740356e-01 6.980392336845397949e-01 9.921568632125854492e-01 9.882352948188781738e-01 8.666666746139526367e-01 1.686274558305740356e-01 7.843137718737125397e-03 0.000000000000000000e+00 1.960784383118152618e-02 2.117647081613540649e-01 9.098039269447326660e-01 9.882352948188781738e-01 8.235294222831726074e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
12 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.647058904170989990e-01 9.921568632125854492e-01 1.000000000000000000e+00 9.764705896377563477e-01 4.509803950786590576e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 5.333333611488342285e-01 9.843137264251708984e-01 1.000000000000000000e+00 6.039215922355651855e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
13 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 6.509804129600524902e-01 9.882352948188781738e-01 9.921568632125854492e-01 7.254902124404907227e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 8.196078538894653320e-01 9.921568632125854492e-01 8.078431487083435059e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
14 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 7.450980693101882935e-02 8.627451062202453613e-01 9.882352948188781738e-01 9.921568632125854492e-01 3.607843220233917236e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.549019634723663330e-01 9.921568632125854492e-01 8.078431487083435059e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
15 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.745098173618316650e-01 9.882352948188781738e-01 9.882352948188781738e-01 7.529411911964416504e-01 6.666667014360427856e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.549019634723663330e-01 9.921568632125854492e-01 8.745098114013671875e-01 9.803921729326248169e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
16 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.784313738346099854e-01 9.882352948188781738e-01 9.882352948188781738e-01 2.470588237047195435e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.549019634723663330e-01 9.921568632125854492e-01 9.882352948188781738e-01 2.705882489681243896e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
17 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 5.176470875740051270e-01 9.921568632125854492e-01 9.921568632125854492e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.549019634723663330e-01 1.000000000000000000e+00 9.921568632125854492e-01 2.705882489681243896e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
18 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 7.215686440467834473e-01 9.882352948188781738e-01 9.882352948188781738e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.549019634723663330e-01 9.921568632125854492e-01 9.882352948188781738e-01 2.705882489681243896e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
19 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 7.215686440467834473e-01 9.882352948188781738e-01 9.882352948188781738e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 4.549019634723663330e-01 9.921568632125854492e-01 9.411764740943908691e-01 1.960784345865249634e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
20 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 7.215686440467834473e-01 9.882352948188781738e-01 9.882352948188781738e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 8.235294222831726074e-01 9.921568632125854492e-01 4.392156898975372314e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
21 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.882352977991104126e-01 9.098039269447326660e-01 9.882352948188781738e-01 6.196078658103942871e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 9.019607901573181152e-01 9.098039269447326660e-01 3.137255087494850159e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
22 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 3.647058904170989990e-01 9.921568632125854492e-01 9.568627476692199707e-01 1.960784345865249634e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 6.078431606292724609e-01 9.921568632125854492e-01 6.588235497474670410e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
23 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.333333402872085571e-01 6.431372761726379395e-01 9.921568632125854492e-01 4.431372582912445068e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 2.588235437870025635e-01 9.254902005195617676e-01 9.058823585510253906e-01 1.647058874368667603e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
24 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 1.254902034997940063e-01 8.705882430076599121e-01 9.411764740943908691e-01 5.254902243614196777e-01 0.000000000000000000e+00 0.000000000000000000e+00 1.490196138620376587e-01 3.568627536296844482e-01 9.176470637321472168e-01 9.882352948188781738e-01 5.372549295425415039e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
25 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 9.803921729326248169e-02 6.941176652908325195e-01 9.411764740943908691e-01 8.117647171020507812e-01 4.039215743541717529e-01 9.137254953384399414e-01 9.882352948188781738e-01 9.882352948188781738e-01 6.901960968971252441e-01 1.372549086809158325e-01 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
26 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 5.882352963089942932e-02 2.117647081613540649e-01 7.019608020782470703e-01 9.882352948188781738e-01 5.372549295425415039e-01 5.372549295425415039e-01 2.117647081613540649e-01 1.568627543747425079e-02 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
27 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
28 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
29 | 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00 0.000000000000000000e+00
30 |
--------------------------------------------------------------------------------
/test_images/save_images.py:
--------------------------------------------------------------------------------
1 | '''
2 | Saves MNIST test images to txt files: each file starts with a "# <label>" header line, followed by the 28x28 pixel matrix (floats in [0, 1])
3 | '''
4 |
5 | import numpy as np
6 | import keras
7 |
8 | from keras.datasets import mnist
9 |
10 | (x_train, y_train), (x_test, y_test) = mnist.load_data()
11 |
12 | x_test = x_test.astype('float32')
13 | x_test /= 255
14 |
15 | # print('x_test shape:', x_test.shape)
16 | # print(x_test.shape[0], 'test samples')
17 | x_test_len = x_test.shape[0]
18 |
19 | for i in range(x_test_len):
20 |     with open('test_images/'+str(i)+'.txt', 'w') as outfile:
21 |         outfile.write('# '+str(y_test[i])+'\n')
22 |         np.savetxt(outfile, x_test[i])
--------------------------------------------------------------------------------
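
For reference, each file written by `save_images.py` can be read back into a label and a 28x28 pixel array. The sketch below is hypothetical (it is not part of this repo, and the helper name `load_test_image` is made up); it only assumes the format shown above: a `# <label>` header line followed by the pixel matrix written with `np.savetxt`.

```python
import numpy as np

def load_test_image(path):
    # The first line is the ground-truth label, written as "# <label>".
    with open(path) as f:
        label = int(f.readline().lstrip('#').strip())
    # np.loadtxt skips '#' comment lines by default, so this returns
    # just the 28x28 matrix of floats in [0, 1].
    pixels = np.loadtxt(path)
    return pixels, label

pixels, label = load_test_image('test_images/0.txt')
print(pixels.shape, label)  # (28, 28) and the digit label
```

This mirrors how the C++ and CUDA test benches in this repo consume the same files: the header gives the expected digit, and the remaining 28 lines give the normalized input image.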