├── .gitmodules ├── LICENSE.md ├── README.md ├── app ├── README.md └── tflite_yolov5_test │ ├── .gitignore │ ├── .idea │ ├── .gitignore │ ├── compiler.xml │ ├── gradle.xml │ ├── jarRepositories.xml │ ├── misc.xml │ ├── runConfigurations.xml │ └── vcs.xml │ ├── app │ ├── .gitignore │ ├── CMakeLists.txt │ ├── build.gradle │ ├── proguard-rules.pro │ └── src │ │ ├── androidTest │ │ └── java │ │ │ └── com │ │ │ └── example │ │ │ └── tflite_yolov5_test │ │ │ └── ExampleInstrumentedTest.java │ │ ├── main │ │ ├── AndroidManifest.xml │ │ ├── cpp │ │ │ ├── nms.h │ │ │ └── postprocess.cpp │ │ ├── java │ │ │ └── com │ │ │ │ └── example │ │ │ │ └── tflite_yolov5_test │ │ │ │ ├── ImageProcess.java │ │ │ │ ├── MainActivity.java │ │ │ │ ├── PathUtils.java │ │ │ │ ├── TfliteRunMode.java │ │ │ │ ├── TfliteRunner.java │ │ │ │ ├── camera │ │ │ │ ├── CameraActivity.java │ │ │ │ ├── CameraConnectionFragment.java │ │ │ │ ├── DetectorActivity.java │ │ │ │ ├── LegacyCameraConnectionFragment.java │ │ │ │ ├── env │ │ │ │ │ ├── BorderedText.java │ │ │ │ │ └── ImageUtils.java │ │ │ │ └── tracker │ │ │ │ │ └── MultiBoxTracker.java │ │ │ │ └── customview │ │ │ │ ├── AutoFitTextureView.java │ │ │ │ └── OverlayView.java │ │ └── res │ │ │ ├── drawable-v24 │ │ │ └── ic_launcher_foreground.xml │ │ │ ├── drawable │ │ │ ├── ic_dashboard_black_24dp.xml │ │ │ ├── ic_home_black_24dp.xml │ │ │ ├── ic_launcher_background.xml │ │ │ └── ic_notifications_black_24dp.xml │ │ │ ├── layout │ │ │ ├── activity_camera.xml │ │ │ ├── activity_main.xml │ │ │ └── tfe_od_camera_connection_fragment_tracking.xml │ │ │ ├── menu │ │ │ └── bottom_nav_menu.xml │ │ │ ├── mipmap-anydpi-v26 │ │ │ ├── ic_launcher.xml │ │ │ └── ic_launcher_round.xml │ │ │ ├── mipmap-hdpi │ │ │ ├── ic_launcher.png │ │ │ └── ic_launcher_round.png │ │ │ ├── mipmap-mdpi │ │ │ ├── ic_launcher.png │ │ │ └── ic_launcher_round.png │ │ │ ├── mipmap-xhdpi │ │ │ ├── ic_launcher.png │ │ │ └── ic_launcher_round.png │ │ │ ├── mipmap-xxhdpi │ │ │ ├── 
ic_launcher.png │ │ │ └── ic_launcher_round.png │ │ │ ├── mipmap-xxxhdpi │ │ │ ├── ic_launcher.png │ │ │ └── ic_launcher_round.png │ │ │ ├── navigation │ │ │ └── mobile_navigation.xml │ │ │ ├── values-night │ │ │ └── themes.xml │ │ │ └── values │ │ │ ├── colors.xml │ │ │ ├── dimens.xml │ │ │ ├── strings.xml │ │ │ └── themes.xml │ │ └── test │ │ └── java │ │ └── com │ │ └── example │ │ └── tflite_yolov5_test │ │ └── ExampleUnitTest.java │ ├── build.gradle │ ├── gradle.properties │ ├── gradle │ └── wrapper │ │ ├── gradle-wrapper.jar │ │ └── gradle-wrapper.properties │ ├── gradlew │ ├── gradlew.bat │ └── settings.gradle ├── benchmark └── README.md ├── convert_model ├── README.md └── quantize.py ├── docker └── Dockerfile ├── host ├── README.md ├── cococlass.txt ├── detect.py ├── detector_head.py ├── dog.jpg ├── evaluate.py ├── metric.py ├── postprocess.py ├── results │ ├── android_result.json │ └── host_result.json └── runner.py └── tflite_model ├── yolov5s_fp32_320.tflite ├── yolov5s_fp32_640.tflite ├── yolov5s_int8_320.tflite └── yolov5s_int8_640.tflite /.gitmodules: -------------------------------------------------------------------------------- 1 | [submodule "yolov5"] 2 | path = yolov5 3 | url = https://github.com/ultralytics/yolov5 4 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # yolov5s_android:rocket: 2 |
3 | 4 | 5 |
6 | 7 | The implementation of yolov5s on android for the [yolov5s export contest](https://github.com/ultralytics/yolov5/discussions/3213). 8 | Download the latest android apk from [release](https://github.com/lp6m/yolov5s_android/releases) and install it on your device. 9 | 10 | **UPDATE:rocket: 2022/06/25** Added a tutorial on how to integrate models trained with custom data. [Custom Model Integration Tutorial](https://github.com/lp6m/yolov5s_android/issues/14) 11 | 12 | ## Environment 13 | - Host: Ubuntu 18.04 14 | - Docker 15 | * TensorFlow 2.4.0 16 | * PyTorch 1.7.0 17 | * OpenVino 2021.3 18 | - Android App 19 | * Android Studio 4.2.1 20 | * minSdkVersion 28 21 | * targetSdkVersion 29 22 | * TfLite 2.4.0 23 | - Android Device 24 | * Xiaomi Mi11 (Storage 128GB / RAM 8GB) 25 | * OS MIUI 12.5.8 26 | 27 | We use a Docker container for host evaluation and model conversion. 28 | ```sh 29 | git clone --recursive https://github.com/lp6m/yolov5s_android 30 | cd yolov5s_android 31 | docker build ./ -f ./docker/Dockerfile -t yolov5s_android 32 | docker run -it --gpus all -v `pwd`:/workspace yolov5s_android bash 33 | ``` 34 | 35 | ## Files 36 | - `./app` 37 | * Android application. 38 | * To build the application yourself, copy `./tflite_model/*.tflite` to `app/tflite_yolov5_test/app/src/main/assets/`, and build on Android Studio. 39 | * The app can perform inference with various configurations of input size, precision, and delegate. 40 | * In 'Open Directory Mode', the app saves the detected bounding box results as a JSON file in COCO format. 41 | * Realtime detection from the camera image (precision and input size are fixed to int8/320). Achieved FPS is about **15 FPS** on Mi11. 42 | * **NOTE** Please select the image/directory by its absolute path from 'Device'. The app does not support selecting an image/directory from 'Recent' on some devices. 
43 | - `./benchmark` 44 | * Benchmark script and results from the [TFLite Model Benchmark Tool with C++ Binary](https://github.com/tensorflow/tensorflow/tree/master/tensorflow/lite/tools/benchmark#profiling-model-operators). 45 | - `./convert_model` 46 | * Model conversion guide and model quantization script. 47 | - `./docker` 48 | * Dockerfile for the evaluation and model conversion environment. 49 | - `./host` 50 | * `detect.py` : Runs detection on an image with the TfLite model in the host environment. 51 | * `evaluate.py`: Runs evaluation with the COCO validation dataset and the inference results. 52 | - `./tflite_model` 53 | * Converted TfLite models. 54 | 55 | ## Performance 56 | ### Latency 57 | These results are measured on `Xiaomi Mi11`. 58 | Please refer to [`benchmark/README.md`](https://github.com/lp6m/yolov5s_android/tree/master/benchmark) for details of the benchmark commands. 59 | The latency does not include pre/post-processing time or data transfer time. 60 | #### float32 model 61 | 62 | | delegate | 640x640 [ms] | 320x320 [ms] | 63 | | :-------------------- | -----------: | -----------: | 64 | | None (CPU) | 249 | 61 | 65 | | NNAPI (qti-gpu, fp32) | 156 | 112 | 66 | | NNAPI (qti-gpu, fp16) | 92 | 79 | 67 | 68 | #### int8 model 69 | We tried to accelerate the inference process by using `NNAPI (qti-dsp)` and offloading the calculation to the Hexagon DSP, but it does not work for now. Please see [here](https://github.com/lp6m/yolov5s_android/tree/dev/benchmark#nnapi-qti-dsp-not-working) for details. 70 | 71 | | delegate | 640x640 [ms] | 320x320 [ms] | 72 | | :------------------- | -----------: | -----------: | 73 | | None (CPU) | 95 | 23 | 74 | | NNAPI (qti-default) | Not working | Not working | 75 | | NNAPI (qti-dsp) | Not working | Not working | 76 | 77 | ## Accuracy 78 | Please refer to [host/README.md](https://github.com/lp6m/yolov5s_android/tree/master/host#example2) for the evaluation method. 79 | We set `conf_thresh=0.25` and `iou_thresh=0.45` as the NMS parameters. 
80 | | device, model, delegate | 640x640 mAP | 320x320 mAP | 81 | | :-------------------------------- | ----------: | ----------: | 82 | | host GPU (Tflite + PyTorch, fp32) | 27.8 | 26.6 | 83 | | host CPU (Tflite + PyTorch, int8) | 26.6 | 25.5 | 84 | | NNAPI (qti-gpu, fp16) | 28.5 | 26.8 | 85 | | CPU (int8) | 27.2 | 25.8 | 86 | 87 | 88 | ## Model conversion 89 | This project focuses on obtaining a tflite model by **converting the original PyTorch implementation, rather than writing our own implementation in tflite**. 90 | We convert models in this way: `PyTorch -> ONNX -> OpenVino -> TfLite`. 91 | To convert the model from OpenVino to TfLite, we use [openvino2tensorflow](https://github.com/PINTO0309/openvino2tensorflow). 92 | Please refer to [convert_model/README.md](https://github.com/lp6m/yolov5s_android/tree/master/convert_model) for details of the model conversion. 93 | 94 | -------------------------------------------------------------------------------- /app/README.md: -------------------------------------------------------------------------------- 1 | # Android App 2 | This application uses [TFLite Android Support](https://www.tensorflow.org/lite/guide/android). 3 | 4 | ### TfliteRunner 5 | [`TfliteRunner.java`](https://github.com/lp6m/yolov5s_android/blob/master/app/tflite_yolov5_test/app/src/main/java/com/example/tflite_yolov5_test/TfliteRunner.java) is the main class for running the TfLite model. It applies a delegate according to the specified running mode. 6 | 7 | ### postprocess.cpp 8 | [`postprocess.cpp`](https://github.com/lp6m/yolov5s_android/blob/master/app/tflite_yolov5_test/app/src/main/cpp/postprocess.cpp) corresponds to the [detect layer module](https://github.com/ultralytics/yolov5/blob/master/models/yolo.py#L33) and `non_max_suppression` of the original yolov5. This C++ code is called by `TfliteRunner` via the JNI interface. 
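For reference, the box decode that this detect-layer replacement performs (per grid cell, per anchor) can be sketched in Python. This is a minimal illustration of the arithmetic found in `postprocess.cpp`; the function name `decode_cell` and its argument layout are illustrative only, not part of the app:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def decode_cell(pred, gx, gy, stride, anchor_w, anchor_h):
    """Decode one raw (85,) prediction vector for grid cell (gx, gy).

    Mirrors the xywh decode in postprocess.cpp:
      cx = (2*sigmoid(tx) - 0.5 + gx) * stride
      w  = (2*sigmoid(tw))**2 * anchor_w
    Returns (x1, y1, x2, y2, conf, class_idx).
    """
    tx, ty, tw, th, tobj = pred[:5]
    cls_scores = sigmoid(pred[5:])                       # per-class scores
    class_idx = int(np.argmax(cls_scores))
    conf = float(cls_scores[class_idx] * sigmoid(tobj))  # class conf * objectness
    cx = (2.0 * sigmoid(tx) - 0.5 + gx) * stride
    cy = (2.0 * sigmoid(ty) - 0.5 + gy) * stride
    w = (2.0 * sigmoid(tw)) ** 2 * anchor_w
    h = (2.0 * sigmoid(th)) ** 2 * anchor_h
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2, conf, class_idx)
```

The C++ implementation additionally skips candidates early by comparing the raw objectness against the inverse sigmoid of `conf_thresh`, so the sigmoids are only evaluated for surviving cells, and it then runs a single NMS pass over all classes by offsetting each box by `class_idx * max_wh`.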
9 | 10 | ### Realtime inference 11 | For realtime inference from the camera image, we copied a lot of code from the [TensorFlow Lite Object Detection Android Demo](https://github.com/tensorflow/examples/tree/master/lite/examples/object_detection/android) to `src/main/java/com/example/tflite_yolov5_test/customview, camera`. 12 | Realtime inference is not the main topic of the contest, so please forgive me for porting the code in a messy way! 13 | 14 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/.gitignore: -------------------------------------------------------------------------------- 1 | *.iml 2 | .gradle 3 | /local.properties 4 | /.idea/caches 5 | /.idea/libraries 6 | /.idea/modules.xml 7 | /.idea/workspace.xml 8 | /.idea/navEditor.xml 9 | /.idea/assetWizardSettings.xml 10 | .DS_Store 11 | /build 12 | /captures 13 | .externalNativeBuild 14 | .cxx 15 | local.properties 16 | 17 | app/src/main/assets/*.tflite 18 | app/release -------------------------------------------------------------------------------- /app/tflite_yolov5_test/.idea/.gitignore: -------------------------------------------------------------------------------- 1 | # Default ignored files 2 | /shelf/ 3 | /workspace.xml 4 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/.idea/compiler.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/.idea/gradle.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 19 | 20 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/.idea/jarRepositories.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 9 | 10 | 14 | 15 | 19 | 20 | 24 | 25 | 29 | 
30 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/.idea/misc.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | 9 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/.idea/runConfigurations.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 9 | 10 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/.idea/vcs.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | 7 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/.gitignore: -------------------------------------------------------------------------------- 1 | /build -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/CMakeLists.txt: -------------------------------------------------------------------------------- 1 | cmake_minimum_required(VERSION 3.4.1) 2 | 3 | add_library( # Sets the name of the library. 4 | native-lib 5 | 6 | # Sets the library as a shared library. 7 | SHARED 8 | 9 | # Provides a relative path to your source file(s). 10 | src/main/cpp/postprocess.cpp ) 11 | 12 | # Searches for a specified prebuilt library and stores the path as a 13 | # variable. Because CMake includes system libraries in the search path by 14 | # default, you only need to specify the name of the public NDK library 15 | # you want to add. CMake verifies that the library exists before 16 | # completing its build. 17 | 18 | find_library( # Sets the name of the path variable. 19 | log-lib 20 | 21 | # Specifies the name of the NDK library that 22 | # you want CMake to locate. 23 | log ) 24 | 25 | # Specifies libraries CMake should link to your target library. 
You 26 | # can link multiple libraries, such as libraries you define in this 27 | # build script, prebuilt third-party libraries, or system libraries. 28 | 29 | target_link_libraries( # Specifies the target library. 30 | native-lib 31 | 32 | # Links the target library to the log library 33 | # included in the NDK. 34 | ${log-lib} ) 35 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/build.gradle: -------------------------------------------------------------------------------- 1 | plugins { 2 | id 'com.android.application' 3 | } 4 | 5 | android { 6 | compileSdkVersion 29 7 | buildToolsVersion "30.0.3" 8 | 9 | defaultConfig { 10 | ndk { 11 | abiFilters 'armeabi-v7a', 'arm64-v8a' 12 | } 13 | applicationId "com.example.tflite_yolov5_test" 14 | minSdkVersion 28 15 | targetSdkVersion 29 16 | versionCode 1 17 | versionName "1.0" 18 | 19 | testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner" 20 | externalNativeBuild { 21 | cmake { 22 | cppFlags "-O3" 23 | } 24 | } 25 | } 26 | 27 | buildTypes { 28 | release { 29 | minifyEnabled false 30 | proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro' 31 | } 32 | } 33 | compileOptions { 34 | sourceCompatibility JavaVersion.VERSION_1_8 35 | targetCompatibility JavaVersion.VERSION_1_8 36 | } 37 | externalNativeBuild { 38 | cmake { 39 | path "CMakeLists.txt" 40 | } 41 | } 42 | buildFeatures { 43 | viewBinding true 44 | } 45 | } 46 | 47 | dependencies { 48 | implementation 'org.nanohttpd:nanohttpd:2.3.1' 49 | implementation 'androidx.appcompat:appcompat:1.2.0' 50 | implementation 'com.google.android.material:material:1.2.1' 51 | implementation 'androidx.constraintlayout:constraintlayout:2.0.1' 52 | implementation 'androidx.lifecycle:lifecycle-livedata-ktx:2.2.0' 53 | implementation 'androidx.lifecycle:lifecycle-viewmodel-ktx:2.2.0' 54 | implementation 'androidx.navigation:navigation-fragment:2.3.0' 55 | implementation 
'androidx.navigation:navigation-ui:2.3.0' 56 | 57 | implementation 'org.tensorflow:tensorflow-lite:2.4.0' 58 | //implementation 'org.tensorflow:tensorflow-lite:0.0.0-nightly-SNAPSHOT' 59 | //implementation 'org.tensorflow:tensorflow-lite:0.0.0-nightly' 60 | //implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:0.0.0-nightly' 61 | //implementation 'org.tensorflow:tensorflow-lite-gpu:0.0.0-nightly-SNAPSHOT' 62 | //implementation 'org.tensorflow:tensorflow-lite-support:0.0.0-nightly-SNAPSHOT' 63 | 64 | testImplementation 'junit:junit:4.+' 65 | androidTestImplementation 'androidx.test.ext:junit:1.1.2' 66 | androidTestImplementation 'androidx.test.espresso:espresso-core:3.3.0' 67 | } -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/proguard-rules.pro: -------------------------------------------------------------------------------- 1 | # Add project specific ProGuard rules here. 2 | # You can control the set of applied configuration files using the 3 | # proguardFiles setting in build.gradle. 4 | # 5 | # For more details, see 6 | # http://developer.android.com/guide/developing/tools/proguard.html 7 | 8 | # If your project uses WebView with JS, uncomment the following 9 | # and specify the fully qualified class name to the JavaScript interface 10 | # class: 11 | #-keepclassmembers class fqcn.of.javascript.interface.for.webview { 12 | # public *; 13 | #} 14 | 15 | # Uncomment this to preserve the line number information for 16 | # debugging stack traces. 17 | #-keepattributes SourceFile,LineNumberTable 18 | 19 | # If you keep the line number information, uncomment this to 20 | # hide the original source file name. 
21 | #-renamesourcefileattribute SourceFile -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/androidTest/java/com/example/tflite_yolov5_test/ExampleInstrumentedTest.java: -------------------------------------------------------------------------------- 1 | package com.example.tflite_yolov5_test; 2 | 3 | import android.content.Context; 4 | 5 | import androidx.test.platform.app.InstrumentationRegistry; 6 | import androidx.test.ext.junit.runners.AndroidJUnit4; 7 | 8 | import org.junit.Test; 9 | import org.junit.runner.RunWith; 10 | 11 | import static org.junit.Assert.*; 12 | 13 | /** 14 | * Instrumented test, which will execute on an Android device. 15 | * 16 | * @see Testing documentation 17 | */ 18 | @RunWith(AndroidJUnit4.class) 19 | public class ExampleInstrumentedTest { 20 | @Test 21 | public void useAppContext() { 22 | // Context of the app under test. 23 | Context appContext = InstrumentationRegistry.getInstrumentation().getTargetContext(); 24 | assertEquals("com.example.tflite_yolov5_test", appContext.getPackageName()); 25 | } 26 | } -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/AndroidManifest.xml: -------------------------------------------------------------------------------- 1 | 2 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 22 | 23 | 24 | 25 | 26 | 27 | 28 | 29 | 30 | 31 | 32 | 33 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/cpp/nms.h: -------------------------------------------------------------------------------- 1 | //https://github.com/martinkersner/non-maximum-suppression-cpp/blob/master/nms.cpp 2 | #include 3 | #include 4 | #include 5 | 6 | using namespace std; 7 | 8 | struct bbox{ 9 | float x1; 10 | float y1; 11 | float x2; 12 | float y2; 13 | float conf; 14 | int class_idx; 15 | bbox(float x1, float y1, float x2, float y2, float conf, 
int class_idx) : 16 | x1(x1), y1(y1), x2(x2), y2(y2), conf(conf), class_idx(class_idx){ 17 | } 18 | }; 19 | 20 | #define Point_XMIN 0 21 | #define Point_XMAX 1 22 | #define Point_YMIN 2 23 | #define Point_YMAX 3 24 | 25 | vector<float> GetPointFromRect(const vector<bbox> &rect, 26 | const int pos) 27 | { 28 | vector<float> points; 29 | 30 | for (const auto & p: rect) { 31 | float point; 32 | if (pos == Point_XMIN) point = p.x1; 33 | else if (pos == Point_XMAX) point = p.x2; 34 | else if (pos == Point_YMIN) point = p.y1; 35 | else if (pos == Point_YMAX) point = p.y2; 36 | 37 | points.push_back(point); 38 | } 39 | 40 | return points; 41 | } 42 | 43 | vector<float> ComputeArea(const vector<float> & x1, 44 | const vector<float> & y1, 45 | const vector<float> & x2, 46 | const vector<float> & y2) 47 | { 48 | vector<float> area; 49 | auto len = x1.size(); 50 | 51 | for (decltype(len) idx = 0; idx < len; ++idx) { 52 | auto tmpArea = (x2[idx] - x1[idx] + 1) * (y2[idx] - y1[idx] + 1); 53 | area.push_back(tmpArea); 54 | } 55 | 56 | return area; 57 | } 58 | 59 | vector<int> argsort_byscore(const vector<bbox> & v) 60 | { 61 | // initialize original index locations 62 | vector<int> idx(v.size()); 63 | std::iota(idx.begin(), idx.end(), 0); 64 | 65 | // sort indexes based on comparing values in v 66 | sort(idx.begin(), idx.end(), 67 | [&v](int i1, int i2) {return v[i1].conf < v[i2].conf;}); 68 | 69 | return idx; 70 | } 71 | 72 | vector<float> Maximum(const float & num, 73 | const vector<float> & vec) 74 | { 75 | auto maxVec = vec; 76 | auto len = vec.size(); 77 | 78 | for (decltype(len) idx = 0; idx < len; ++idx) 79 | if (vec[idx] < num) 80 | maxVec[idx] = num; 81 | 82 | return maxVec; 83 | } 84 | 85 | vector<float> Minimum(const float & num, 86 | const vector<float> & vec) 87 | { 88 | auto minVec = vec; 89 | auto len = vec.size(); 90 | 91 | for (decltype(len) idx = 0; idx < len; ++idx) 92 | if (vec[idx] > num) 93 | minVec[idx] = num; 94 | 95 | return minVec; 96 | } 97 | 98 | vector<float> CopyByIndexes(const vector<float> & vec, 99 | const vector<int> & idxs) 100 | { 101 | vector<float> resultVec; 102 | 103 | for (const auto & idx : idxs) 104 | resultVec.push_back(vec[idx]); 105 | 106 | return resultVec; 107 | } 108 | 109 | vector<int> RemoveLast(const vector<int> & vec) 110 | { 111 | auto resultVec = vec; 112 | resultVec.erase(resultVec.end()-1); 113 | return resultVec; 114 | } 115 | 116 | vector<float> Subtract(const vector<float> & vec1, 117 | const vector<float> & vec2) 118 | { 119 | vector<float> result; 120 | auto len = vec1.size(); 121 | 122 | for (decltype(len) idx = 0; idx < len; ++idx) 123 | result.push_back(vec1[idx] - vec2[idx] + 1); 124 | 125 | return result; 126 | } 127 | 128 | vector<float> Multiply(const vector<float> & vec1, 129 | const vector<float> & vec2) 130 | { 131 | vector<float> resultVec; 132 | auto len = vec1.size(); 133 | 134 | for (decltype(len) idx = 0; idx < len; ++idx) 135 | resultVec.push_back(vec1[idx] * vec2[idx]); 136 | 137 | return resultVec; 138 | } 139 | 140 | vector<float> Divide(const vector<float> & vec1, 141 | const vector<float> & vec2) 142 | { 143 | vector<float> resultVec; 144 | auto len = vec1.size(); 145 | 146 | for (decltype(len) idx = 0; idx < len; ++idx) 147 | resultVec.push_back(vec1[idx] / vec2[idx]); 148 | 149 | return resultVec; 150 | } 151 | 152 | vector<int> WhereLarger(const vector<float> & vec, 153 | const float & threshold) 154 | { 155 | vector<int> resultVec; 156 | auto len = vec.size(); 157 | 158 | for (decltype(len) idx = 0; idx < len; ++idx) 159 | if (vec[idx] > threshold) 160 | resultVec.push_back(idx); 161 | 162 | return resultVec; 163 | } 164 | 165 | vector<int> RemoveByIndexes(const vector<int> & vec, 166 | const vector<int> & idxs) 167 | { 168 | auto resultVec = vec; 169 | auto offset = 0; 170 | 171 | for (const auto & idx : idxs) { 172 | resultVec.erase(resultVec.begin() + idx + offset); 173 | offset -= 1; 174 | } 175 | 176 | return resultVec; 177 | } 178 | 179 | 180 | template <typename T> 181 | vector<T> FilterVector(const vector<T> & vec, 182 | const vector<int> & idxs) 183 | { 184 | vector<T> resultVec; 185 | 186 | for (const auto & idx: idxs) 187 | resultVec.push_back(vec[idx]); 188 | 189 | return resultVec; 190 | } 191 | 192 | 193 | vector<bbox> nms(const vector<bbox> & candidates, 194 | const float &iou_threshold) 195 | { 196 | if (candidates.empty()) return vector<bbox>(); 197 | 198 | // grab the coordinates of the bounding boxes 199 | auto x1 = GetPointFromRect(candidates, Point_XMIN); 200 | auto y1 = GetPointFromRect(candidates, Point_YMIN); 201 | auto x2 = GetPointFromRect(candidates, Point_XMAX); 202 | auto y2 = GetPointFromRect(candidates, Point_YMAX); 203 | 204 | // compute the area of the bounding boxes and sort the bounding 205 | // boxes by confidence score 206 | auto area = ComputeArea(x1, y1, x2, y2); 207 | auto idxs = argsort_byscore(candidates); 208 | 209 | int last; 210 | int i; 211 | vector<int> pick; 212 | 213 | // keep looping while some indexes still remain in the indexes list 214 | while (idxs.size() > 0) { 215 | // grab the last index in the indexes list and add the 216 | // index value to the list of picked indexes 217 | last = idxs.size() - 1; 218 | i = idxs[last]; 219 | pick.push_back(i); 220 | 221 | // find the largest (x, y) coordinates for the start of 222 | // the bounding box and the smallest (x, y) coordinates 223 | // for the end of the bounding box 224 | auto idxsWoLast = RemoveLast(idxs); 225 | 226 | auto xx1 = Maximum(x1[i], CopyByIndexes(x1, idxsWoLast)); 227 | auto yy1 = Maximum(y1[i], CopyByIndexes(y1, idxsWoLast)); 228 | auto xx2 = Minimum(x2[i], CopyByIndexes(x2, idxsWoLast)); 229 | auto yy2 = Minimum(y2[i], CopyByIndexes(y2, idxsWoLast)); 230 | 231 | // compute the width and height of the bounding box 232 | auto w = Maximum(0, Subtract(xx2, xx1)); 233 | auto h = Maximum(0, Subtract(yy2, yy1)); 234 | 235 | // compute the ratio of overlap 236 | auto overlap = Divide(Multiply(w, h), CopyByIndexes(area, idxsWoLast)); 237 | 238 | // delete all indexes from the index list that have large overlap 239 | auto deleteIdxs = WhereLarger(overlap, iou_threshold); 240 | deleteIdxs.push_back(last); 241 | idxs = RemoveByIndexes(idxs, deleteIdxs); 242 | } 243 | 244 | return
FilterVector(candidates, pick); 245 | } 246 | 247 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/cpp/postprocess.cpp: -------------------------------------------------------------------------------- 1 | #include 2 | #include 3 | #include 4 | #include 5 | #include "nms.h" 6 | 7 | using namespace std; 8 | 9 | float sigmoid(float f){ 10 | return (float)(1.0f / (1.0f + exp(-f))); 11 | } 12 | 13 | float revsigmoid(float f){ 14 | const float eps = 1e-8; 15 | return -1.0f * (float)log((1.0f / (f + eps)) - 1.0f); 16 | } 17 | 18 | #define CLASS_NUM 80 19 | #define max_wh 4096 20 | void detector( 21 | vector* bbox_candidates, 22 | JNIEnv *env, 23 | jobjectArray input, 24 | const int gridnum, 25 | const int strides, 26 | const int anchorgrid[3][2], 27 | const float conf_thresh){ 28 | float revsigmoid_conf = revsigmoid(conf_thresh); 29 | //Warning: For now, we assume batch_size is always 1. 30 | for(int bi = 0; bi < 1; bi++){ 31 | jobjectArray ptr_d0 = (jobjectArray)env->GetObjectArrayElement(input , bi); 32 | for(int gy = 0; gy < gridnum; gy++){ 33 | jobjectArray ptr_d1 = (jobjectArray)env->GetObjectArrayElement(ptr_d0 ,gy); 34 | for(int gx = 0; gx < gridnum; gx++){ 35 | jobjectArray ptr_d2 = (jobjectArray)env->GetObjectArrayElement(ptr_d1 ,gx); 36 | auto elmptr = env->GetFloatArrayElements((jfloatArray)ptr_d2 , nullptr); 37 | for(int ch = 0; ch < 3; ch++){ 38 | int offset = 85 * ch; 39 | auto elmptr_ch = elmptr + offset; 40 | //don't apply sigmoid to all bbox candidates for efficiency 41 | float obj_conf_unsigmoid = elmptr_ch[4]; 42 | //if (sigmoid(obj_conf_unsigmoid) < conf_thresh) continue; 43 | if (obj_conf_unsigmoid >= revsigmoid_conf) { 44 | //get maximum conf class 45 | float max_class_conf = elmptr_ch[5]; 46 | int max_class_idx = 0; 47 | for(int class_idx = 1; class_idx < CLASS_NUM; class_idx++){ 48 | float class_conf = elmptr_ch[class_idx + 5]; 49 | if (class_conf > max_class_conf){ 50 | 
max_class_conf = class_conf; 51 | max_class_idx = class_idx; 52 | } 53 | } 54 | // class conf filter 55 | float bbox_conf = sigmoid(max_class_conf) * sigmoid(obj_conf_unsigmoid); 56 | //if (bbox_conf < conf_thresh) continue; 57 | // xywh2xyxy 58 | // batched nms (by adding class * max_wh to coordinates, 59 | // we can get nms result for all classes by just one nms call) 60 | //grid[gridnum][gy][gx][0] = gx 61 | //grid[gridnum][gy][gx][1] = gy 62 | float cx = ((sigmoid(elmptr_ch[0]) * 2.0f) - 0.5f + (float)gx) * (float)strides; 63 | float cy = ((sigmoid(elmptr_ch[1]) * 2.0f) - 0.5f + (float)gy) * (float)strides; 64 | float w = (sigmoid(elmptr_ch[2]) * sigmoid(elmptr_ch[2])) * 4.0f * (float)anchorgrid[ch][0]; 65 | float h = (sigmoid(elmptr_ch[3]) * sigmoid(elmptr_ch[3])) * 4.0f * (float)anchorgrid[ch][1]; 66 | float x1 = cx - w / 2.0f + max_wh * max_class_idx; 67 | float y1 = cy - h / 2.0f + max_wh * max_class_idx; 68 | float x2 = cx + w / 2.0f + max_wh * max_class_idx; 69 | float y2 = cy + h / 2.0f + max_wh * max_class_idx; 70 | bbox box = bbox(x1, y1, x2, y2, bbox_conf, max_class_idx); 71 | bbox_candidates->push_back(box); 72 | } 73 | } 74 | env->ReleaseFloatArrayElements((jfloatArray)ptr_d2, elmptr, 0); 75 | env->DeleteLocalRef(ptr_d2); 76 | } 77 | env->DeleteLocalRef(ptr_d1); 78 | } 79 | env->DeleteLocalRef(ptr_d0); 80 | } 81 | env->DeleteLocalRef(input); 82 | } 83 | 84 | extern "C" jobjectArray Java_com_example_tflite_1yolov5_1test_TfliteRunner_postprocess ( 85 | JNIEnv *env, 86 | jobject /* this */, 87 | jobjectArray input1,//80x80 or 40x40 88 | jobjectArray input2, //40x40 or 20x20, 89 | jobjectArray input3, //20x20 or 10x10 90 | jint input_size, 91 | jfloat conf_thresh, 92 | jfloat iou_thresh){ 93 | //conf 94 | const int anchorgrids[3][3][2] = { 95 | {{10, 13}, {16, 30}, {33, 23}}, //80 96 | {{30, 61}, {62, 45}, {59, 119}}, //40 97 | {{116, 90}, {156, 198}, {373, 326}} //20 98 | }; 99 | const int strides[3] = {8, 16, 32}; 100 | 101 | vector bbox_candidates; 
//TODO: reserve 102 | //Detector 103 | detector(&bbox_candidates, env, input1, input_size / 8, strides[0], anchorgrids[0], conf_thresh); 104 | detector(&bbox_candidates, env, input2, input_size / 16, strides[1], anchorgrids[1], conf_thresh); 105 | detector(&bbox_candidates, env, input3, input_size / 32, strides[2], anchorgrids[2], conf_thresh); 106 | //non-max-suppression 107 | vector nms_results = nms(bbox_candidates, iou_thresh); 108 | 109 | //return 2-dimension array [detected_box][6(x1, y1, x2, y2, conf, class)] 110 | jobjectArray objArray; 111 | jclass floatArray = env->FindClass("[F"); 112 | if (floatArray == NULL) return NULL; 113 | int size = nms_results.size(); 114 | objArray = env->NewObjectArray(size, floatArray, NULL); 115 | if (objArray == NULL) return NULL; 116 | for(int i = 0; i < nms_results.size(); i++){ 117 | int class_idx = nms_results[i].class_idx; 118 | float x1 = nms_results[i].x1 - class_idx * max_wh; 119 | float y1 = nms_results[i].y1 - class_idx * max_wh; 120 | float x2 = nms_results[i].x2 - class_idx * max_wh; 121 | float y2 = nms_results[i].y2 - class_idx * max_wh; 122 | float conf = nms_results[i].conf; 123 | float boxres[6] = {x1, y1, x2, y2, conf, (float)class_idx}; 124 | jfloatArray iarr = env->NewFloatArray((jsize)6); 125 | if (iarr == NULL) return NULL; 126 | env->SetFloatArrayRegion(iarr, 0, 6, boxres); 127 | env->SetObjectArrayElement(objArray, i, iarr); 128 | env->DeleteLocalRef(iarr); 129 | } 130 | return objArray; 131 | } -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/java/com/example/tflite_yolov5_test/ImageProcess.java: -------------------------------------------------------------------------------- 1 | package com.example.tflite_yolov5_test; 2 | 3 | import android.graphics.Bitmap; 4 | import android.graphics.Canvas; 5 | import android.graphics.Color; 6 | import android.graphics.Paint; 7 | import android.graphics.RectF; 8 | import android.text.TextUtils; 9 
| 10 | import com.example.tflite_yolov5_test.camera.env.BorderedText; 11 | 12 | import java.util.List; 13 | 14 | public class ImageProcess { 15 | private static final int[] COLORS = { 16 | Color.BLUE, 17 | Color.RED, 18 | Color.GREEN, 19 | Color.YELLOW, 20 | Color.CYAN, 21 | Color.MAGENTA, 22 | Color.WHITE, 23 | Color.parseColor("#55FF55"), 24 | Color.parseColor("#FFA500"), 25 | Color.parseColor("#FF8888"), 26 | Color.parseColor("#AAAAFF"), 27 | Color.parseColor("#FFFFAA"), 28 | Color.parseColor("#55AAAA"), 29 | Color.parseColor("#AA33AA"), 30 | Color.parseColor("#0D0068") 31 | }; 32 | static public Bitmap drawBboxes(List<TfliteRunner.Recognition> bboxes, Bitmap bitmap, int inputSize) { 33 | Bitmap mutableBitmap = bitmap.copy(Bitmap.Config.ARGB_8888, true); 34 | bitmap.recycle(); 35 | final Canvas canvas = new Canvas(mutableBitmap); 36 | final Paint paint = new Paint(); 37 | BorderedText borderedText = new BorderedText(25); 38 | paint.setStyle(Paint.Style.STROKE); 39 | paint.setStrokeWidth(3.0f); 40 | for (TfliteRunner.Recognition bbox : bboxes) { 41 | int color_idx = bbox.getClass_idx() % COLORS.length; 42 | paint.setColor(COLORS[color_idx]); 43 | RectF location = bbox.getLocation(); 44 | float left = location.left * bitmap.getWidth() / inputSize; 45 | float right = location.right * bitmap.getWidth() / inputSize; 46 | float top = location.top * bitmap.getHeight() / inputSize; 47 | float bottom = location.bottom * bitmap.getHeight() / inputSize; 48 | RectF drawBoxRect = new RectF(left, top, right, bottom); 49 | canvas.drawRect(drawBoxRect, paint); 50 | String labelString = String.format("%s %.2f", bbox.getTitle(), (100 * bbox.getConfidence())); 51 | borderedText.drawText( 52 | canvas, left - 10, top - 10, labelString, paint); 53 | } 54 | 55 | return mutableBitmap; 56 | } 57 | } 58 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/java/com/example/tflite_yolov5_test/MainActivity.java: 
-------------------------------------------------------------------------------- 1 | package com.example.tflite_yolov5_test; 2 | 3 | import android.content.Context; 4 | import android.graphics.RectF; 5 | import android.os.Bundle; 6 | 7 | import com.example.tflite_yolov5_test.camera.DetectorActivity; 8 | 9 | import androidx.appcompat.app.AppCompatActivity; 10 | 11 | import android.Manifest; 12 | import android.app.AlertDialog; 13 | import android.content.Intent; 14 | import android.content.pm.PackageManager; 15 | import android.graphics.BitmapFactory; 16 | import android.net.Uri; 17 | import android.os.Handler; 18 | import android.os.HandlerThread; 19 | import android.provider.DocumentsContract; 20 | import android.view.View; 21 | import android.widget.Button; 22 | import android.widget.ImageView; 23 | import android.widget.ProgressBar; 24 | import android.widget.RadioButton; 25 | import android.widget.SeekBar; 26 | import android.widget.TextView; 27 | import android.widget.Toast; 28 | 29 | import java.io.IOException; 30 | 31 | import org.json.JSONArray; 32 | //import org.tensorflow.lite.gpu.GpuDelegate; 33 | 34 | import java.io.*; 35 | import java.util.ArrayList; 36 | import java.util.Arrays; 37 | import java.util.HashMap; 38 | import java.util.List; 39 | import java.util.Map; 40 | 41 | import android.graphics.Bitmap; 42 | 43 | import java.lang.Math; 44 | public class MainActivity extends AppCompatActivity { 45 | 46 | final int REQUEST_OPEN_FILE = 1; 47 | final int REQUEST_OPEN_DIRECTORY = 9999; 48 | //permission 49 | private int inputSize = -1; 50 | private File[] process_files = null; 51 | private final int REQUEST_PERMISSION = 1000; 52 | private final String[] PERMISSIONS = { 53 | Manifest.permission.READ_EXTERNAL_STORAGE, 54 | Manifest.permission.WRITE_EXTERNAL_STORAGE, 55 | }; 56 | //background task 57 | private boolean handler_stop_request; 58 | private Handler handler; 59 | private HandlerThread handlerThread; 60 | 61 | private void checkPermission(){ 62 | 
if (!isGranted()){ 63 | requestPermissions(PERMISSIONS, REQUEST_PERMISSION); 64 | } 65 | } 66 | @Override 67 | public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) { 68 | super.onRequestPermissionsResult(requestCode, permissions, grantResults); 69 | Toast.makeText(this, "onRequestPermissionResult", Toast.LENGTH_LONG).show(); 70 | if (requestCode == REQUEST_PERMISSION){ 71 | boolean result = isGranted(); 72 | Toast.makeText(getApplicationContext(), result ? "OK" : "NG", Toast.LENGTH_SHORT).show(); 73 | } 74 | } 75 | private boolean isGranted(){ 76 | for (int i = 0; i < PERMISSIONS.length; i++){ 77 | if (checkSelfPermission(PERMISSIONS[i]) != PackageManager.PERMISSION_GRANTED) { 78 | if (shouldShowRequestPermissionRationale(PERMISSIONS[i])) { 79 | Toast.makeText(this, "permission is required", Toast.LENGTH_LONG).show(); 80 | } 81 | return false; 82 | } 83 | } 84 | return true; 85 | } 86 | 87 | @Override 88 | protected void onCreate(Bundle savedInstanceState) { 89 | super.onCreate(savedInstanceState); 90 | 91 | setContentView(R.layout.activity_main); 92 | SeekBar conf_seekBar = (SeekBar)findViewById(R.id.conf_seekBar); 93 | conf_seekBar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener(){ 94 | @Override 95 | public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) { 96 | TextView conf_textView = (TextView)findViewById(R.id.conf_TextView); 97 | conf_textView.setText(String.format("Confidence Threshold: %.2f", (float)progress / 100)); 98 | } 99 | @Override 100 | public void onStopTrackingTouch(SeekBar seekBar) { 101 | } 102 | @Override 103 | public void onStartTrackingTouch(SeekBar seekBar) { 104 | } 105 | }); 106 | conf_seekBar.setMax(100); 107 | conf_seekBar.setProgress(25);//0.25 108 | SeekBar iou_seekBar = (SeekBar)findViewById(R.id.iou_seekBar); 109 | iou_seekBar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener(){ 110 | @Override 111 | public void onProgressChanged(SeekBar 
seekBar, int progress, boolean fromUser) { 112 | TextView iou_textView = (TextView)findViewById(R.id.iou_TextView); 113 | iou_textView.setText(String.format("IoU Threshold: %.2f", (float)progress / 100)); 114 | } 115 | @Override 116 | public void onStopTrackingTouch(SeekBar seekBar) { 117 | } 118 | @Override 119 | public void onStartTrackingTouch(SeekBar seekBar) { 120 | } 121 | }); 122 | iou_seekBar.setMax(100); 123 | iou_seekBar.setProgress(45);//0.45 124 | } 125 | public void OnOpenImageButtonClick(View view){ 126 | checkPermission(); 127 | Intent intent = new Intent(Intent.ACTION_GET_CONTENT); 128 | intent.setType("image/*"); 129 | intent.addCategory(Intent.CATEGORY_OPENABLE); 130 | startActivityForResult(Intent.createChooser(intent, "Open an image"), REQUEST_OPEN_FILE); 131 | } 132 | public void OnOpenDirButtonClick(View view){ 133 | checkPermission(); 134 | Intent intent = new Intent(Intent.ACTION_OPEN_DOCUMENT_TREE); 135 | intent.addCategory(Intent.CATEGORY_DEFAULT); 136 | startActivityForResult(Intent.createChooser(intent, "Open directory"), REQUEST_OPEN_DIRECTORY); 137 | } 138 | public void setResultImage(Bitmap bitmap){ 139 | ImageView imageview = (ImageView)findViewById(R.id.resultImageView); 140 | imageview.setImageBitmap(bitmap); 141 | } 142 | ArrayList<HashMap<String, Object>> bboxesToMap(File file, List<TfliteRunner.Recognition> bboxes, int orig_h, int orig_w){ 143 | ArrayList<HashMap<String, Object>> resList = new ArrayList<>(); 144 | String basename = file.getName(); 145 | basename = basename.substring(0, basename.lastIndexOf('.')); 146 | Object image_id; 147 | try{ 148 | image_id = Integer.parseInt(basename); 149 | } catch (Exception e){ 150 | image_id = basename; 151 | } 152 | for(TfliteRunner.Recognition bbox : bboxes){ 153 | //clamp and scale to original image size 154 | RectF location = bbox.getLocation(); 155 | float x1 = Math.min(Math.max(0, location.left), this.inputSize) * orig_w / (float)this.inputSize; 156 | float y1 = Math.min(Math.max(0, location.top), this.inputSize) * orig_h / (float)this.inputSize; 157 |
float x2 = Math.min(Math.max(0, location.right), this.inputSize) * orig_w / (float)this.inputSize; 158 | float y2 = Math.min(Math.max(0, location.bottom), this.inputSize) * orig_h / (float)this.inputSize; 159 | float x = x1; 160 | float y = y1; 161 | float w = x2 - x1; 162 | float h = y2 - y1; 163 | float conf = bbox.getConfidence(); 164 | int class_idx = TfliteRunner.get_coco91_from_coco80(bbox.getClass_idx()); 165 | HashMap<String, Object> mapbox = new HashMap<>(); 166 | mapbox.put("image_id", image_id); 167 | mapbox.put("bbox", new float[]{x, y, w, h}); 168 | mapbox.put("score", conf); 169 | mapbox.put("category_id", class_idx); 170 | resList.add(mapbox); 171 | } 172 | return resList; 173 | } 174 | private boolean isBackgroundTaskRunning() { 175 | return this.handlerThread != null && this.handlerThread.isAlive(); 176 | } 177 | public void OnInferenceTaskCompleted() { 178 | Button button = (Button) findViewById(R.id.runInferenceButton); 179 | button.setText("Run Inference"); 180 | SeekBar conf_seekBar = (SeekBar) findViewById(R.id.conf_seekBar); 181 | conf_seekBar.setEnabled(true); 182 | SeekBar iou_seekBar = (SeekBar) findViewById(R.id.iou_seekBar); 183 | iou_seekBar.setEnabled(true); 184 | } 185 | public void OnInferenceTaskStart(){ 186 | Button button = (Button)findViewById(R.id.runInferenceButton); 187 | button.setText("Stop Inference"); 188 | SeekBar conf_seekBar = (SeekBar) findViewById(R.id.conf_seekBar); 189 | conf_seekBar.setEnabled(false); 190 | SeekBar iou_seekBar = (SeekBar) findViewById(R.id.iou_seekBar); 191 | iou_seekBar.setEnabled(false); 192 | } 193 | public float getConfThreshFromGUI(){ return ((float)((SeekBar)findViewById(R.id.conf_seekBar)).getProgress()) / 100.0f;} 194 | public float getIoUThreshFromGUI(){ return ((float)((SeekBar)findViewById(R.id.iou_seekBar)).getProgress()) / 100.0f;} 195 | public void OnRunInferenceButtonClick(View view){ 196 | TfliteRunner runner; 197 | TfliteRunMode.Mode runmode = getRunModeFromGUI(); 198 | this.inputSize =
getInputSizeFromGUI(); 199 | //validation 200 | if (this.process_files == null || this.process_files.length == 0){ 201 | showErrorDialog("Please select image or directory."); 202 | return; 203 | } 204 | if (runmode == null) { 205 | showErrorDialog("Please select valid configurations."); 206 | return; 207 | } 208 | 209 | //open model 210 | try { 211 | Context context = getApplicationContext(); 212 | runner = new TfliteRunner(context, runmode, this.inputSize, getConfThreshFromGUI(), getIoUThreshFromGUI()); 213 | } catch (Exception e) { 214 | showErrorDialog("Model load failed: " + e.getMessage()); 215 | return; 216 | } 217 | //check background task status 218 | if(isBackgroundTaskRunning()){ 219 | //inference is already running, so stop it 220 | this.handler_stop_request = true; 221 | this.handlerThread.quitSafely(); 222 | try { 223 | handlerThread.join(); 224 | handlerThread = null; 225 | handler = null; 226 | } catch (final InterruptedException e) { 227 | addLog(e.getMessage() + "Exception!"); 228 | } 229 | OnInferenceTaskCompleted(); 230 | return; 231 | } else { 232 | //start inference task 233 | this.handler_stop_request = false; 234 | OnInferenceTaskStart(); 235 | } 236 | 237 | //run inference in background 238 | this.handlerThread = new HandlerThread("inference"); 239 | this.handlerThread.start(); 240 | this.handler = new Handler(this.handlerThread.getLooper()); 241 | ProgressBar pbar = (ProgressBar)findViewById(R.id.progressBar); 242 | File[] process_files = this.process_files; 243 | pbar.setProgress(0); 244 | ArrayList<HashMap<String, Object>> resList = new ArrayList<>(); 245 | runInBackground( 246 | new Runnable() { 247 | @Override 248 | public void run() { 249 | try { 250 | for(int i = 0; i < process_files.length; i++){ 251 | if (handler_stop_request) break; 252 | File file = process_files[i]; 253 | InputStream is = new FileInputStream(file); 254 | Bitmap bitmap = BitmapFactory.decodeStream(is); 255 | Bitmap resized = TfliteRunner.getResizedImage(bitmap, inputSize); 256 |
runner.setInput(resized); 257 | List<TfliteRunner.Recognition> bboxes = runner.runInference(); 258 | Bitmap resBitmap = ImageProcess.drawBboxes(bboxes, bitmap, getInputSizeFromGUI()); 259 | ArrayList<HashMap<String, Object>> bboxmaps = bboxesToMap(file, bboxes, bitmap.getHeight(), bitmap.getWidth()); 260 | resList.addAll(bboxmaps); 261 | int ii = i; 262 | runOnUiThread( 263 | new Runnable() { 264 | @Override 265 | public void run () { 266 | pbar.setProgress(Math.min(100, (ii+1) * 100 / process_files.length)); 267 | setResultImage(resBitmap); 268 | } 269 | }); 270 | bitmap.recycle(); 271 | } 272 | } catch (Exception e) { 273 | runOnUiThread( 274 | new Runnable() { 275 | @Override 276 | public void run() { 277 | showErrorDialog("Inference failed : " + e.getMessage()); 278 | } 279 | } 280 | ); 281 | } 282 | //completed 283 | runOnUiThread( 284 | new Runnable() { 285 | @Override 286 | public void run() { 287 | handler_stop_request = false; 288 | OnInferenceTaskCompleted(); 289 | //output json if directory mode 290 | if (process_files.length > 1) { 291 | try { 292 | String jsonpath = saveBboxesToJson(resList, process_files[0], "result.json"); 293 | showInfoDialog("result json is saved : " + jsonpath); 294 | } catch (Exception e){ 295 | showErrorDialog("json output failed : " + e.getMessage()); 296 | } 297 | } 298 | addLog(runner.getLastElapsedTimeLog()); 299 | } 300 | } 301 | ); 302 | } 303 | } 304 | ); 305 | 306 | } 307 | String saveBboxesToJson(ArrayList<HashMap<String, Object>> resList, File file, String output_filename) 308 | throws org.json.JSONException, IOException{ 309 | // HashMap[] resArr = (HashMap[])resList.toArray(); 310 | JSONArray jarr = new JSONArray(resList); 311 | String jstr = jarr.toString(); 312 | 313 | String filepath = file.getParent() + "/" + output_filename; 314 | FileOutputStream fileOutputStream = new FileOutputStream(filepath, false); 315 | fileOutputStream.write(jstr.getBytes()); fileOutputStream.close(); 316 | return filepath; 317 | } 318 | private void showErrorDialog(String text){ showDialog("Error", text);} 319 | private void
showInfoDialog(String text){ showDialog("Info", text);} 320 | private void showDialog(String title, String text){ 321 | new AlertDialog.Builder(this) 322 | .setTitle(title) 323 | .setMessage(text) 324 | .setPositiveButton("OK" , null ) 325 | .create().show(); 326 | } 327 | private void addLog(String logtxt){ 328 | TextView logtext = findViewById(R.id.logTextView); 329 | logtext.setText(logtext.getText() + logtxt + "\n"); 330 | } 331 | private void setOneLineLog(String text){ 332 | TextView onelinetextview = findViewById(R.id.oneLineLabel); 333 | onelinetextview.setText(text); 334 | } 335 | public void OnClearLogButton(View view) { 336 | TextView logtext = findViewById(R.id.logTextView); 337 | logtext.setText(""); 338 | } 339 | void setImageView(Bitmap bitmap){ 340 | ImageView imageview = (ImageView)findViewById(R.id.resultImageView); 341 | imageview.setImageBitmap(bitmap); 342 | } 343 | public void OnOpenCameraButtonClick(View view){ 344 | if (isBackgroundTaskRunning()) { 345 | showErrorDialog("Please stop inference task"); 346 | return; 347 | } 348 | Intent intent = new Intent(MainActivity.this, DetectorActivity.class); 349 | startActivity(intent); 350 | } 351 | @Override 352 | protected void onActivityResult(int requestCode, int resultCode, Intent data) { 353 | super.onActivityResult(requestCode, resultCode, data); 354 | if (requestCode == REQUEST_OPEN_FILE) { 355 | // one image file is selected 356 | if (resultCode == RESULT_OK && data != null) { 357 | Uri uri = data.getData(); 358 | if (uri != null) { 359 | String fullpath = PathUtils.getPath(getApplicationContext(), uri); 360 | this.process_files = new File[]{new File(fullpath)}; 361 | } 362 | } 363 | } else if (requestCode == REQUEST_OPEN_DIRECTORY) { 364 | // image directory is selected 365 | if (resultCode == RESULT_OK && data != null) { 366 | Uri uri = data.getData(); 367 | if (uri != null) { 368 | Uri docUri = DocumentsContract.buildDocumentUriUsingTree(uri, 369 | 
DocumentsContract.getTreeDocumentId(uri)); 370 | String fullpath = PathUtils.getPath(getApplicationContext(), docUri); 371 | File directory = new File(fullpath); 372 | this.process_files = directory.listFiles(new ImageFilenameFilter()); 373 | } 374 | } 375 | } 376 | if (this.process_files != null && this.process_files.length > 0){ 377 | setOneLineLog(String.valueOf(this.process_files.length) + " images loaded."); 378 | try{ 379 | InputStream is = new FileInputStream(this.process_files[0]); 380 | Bitmap bitmap = BitmapFactory.decodeStream(is); 381 | setResultImage(bitmap); 382 | } catch(Exception ex){ 383 | setOneLineLog(ex.getMessage()); 384 | } 385 | } 386 | } 387 | class ImageFilenameFilter implements FilenameFilter { 388 | public boolean accept(File dir, String name) { 389 | if (name.toLowerCase().matches(".*\\.jpg$|.*\\.jpeg$|.*\\.png$|.*\\.bmp$")) { 390 | return true; 391 | } 392 | return false; 393 | } 394 | } 395 | protected synchronized void runInBackground(final Runnable r) { 396 | if (this.handler != null) { 397 | this.handler.post(r); 398 | } 399 | } 400 | private TfliteRunMode.Mode getRunModeFromGUI(){ 401 | boolean model_float = ((RadioButton)findViewById(R.id.radioButton_modelFloat)).isChecked(); 402 | boolean model_int8 = ((RadioButton)findViewById(R.id.radioButton_modelInt)).isChecked(); 403 | boolean precision_fp32 = ((RadioButton)findViewById(R.id.radioButton_runFP32)).isChecked(); 404 | boolean precision_fp16 = ((RadioButton)findViewById(R.id.radioButton_runFP16)).isChecked(); 405 | boolean precision_int8 = ((RadioButton)findViewById(R.id.radioButton_runInt8)).isChecked(); 406 | boolean delegate_none = ((RadioButton)findViewById(R.id.radioButton_delegateNone)).isChecked(); 407 | boolean delegate_nnapi = ((RadioButton)findViewById(R.id.radioButton_delegateNNAPI)).isChecked(); 408 | boolean[] gui_selected = {model_float, model_int8, precision_fp32, precision_fp16, precision_int8, delegate_none, delegate_nnapi}; 409 | final Map<TfliteRunMode.Mode, boolean[]> candidates = new
HashMap<TfliteRunMode.Mode, boolean[]>(){{ 410 | put(TfliteRunMode.Mode.NONE_FP32, new boolean[]{true, false, true, false, false, true, false}); 411 | put(TfliteRunMode.Mode.NONE_FP16, new boolean[]{true, false, false, true, false, true, false}); 412 | put(TfliteRunMode.Mode.NNAPI_GPU_FP32, new boolean[]{true, false, true, false, false, false, true}); 413 | put(TfliteRunMode.Mode.NNAPI_GPU_FP16, new boolean[]{true, false, false, true, false, false, true}); 414 | put(TfliteRunMode.Mode.NONE_INT8, new boolean[]{false, true, false, false, true, true, false}); 415 | put(TfliteRunMode.Mode.NNAPI_DSP_INT8, new boolean[]{false, true, false, false, true, false, true}); 416 | }}; 417 | for(Map.Entry<TfliteRunMode.Mode, boolean[]> entry : candidates.entrySet()){ 418 | if (Arrays.equals(gui_selected, entry.getValue())) return entry.getKey(); 419 | } 420 | //not found 421 | return null; 422 | } 423 | public int getInputSizeFromGUI(){ 424 | RadioButton input_640 = findViewById(R.id.radioButton_640); 425 | if (input_640.isChecked()) return 640; 426 | else return 320; 427 | } 428 | //Eliminate infeasible run configurations (model, precision) 429 | public void onModelFloatClick(View view) { 430 | RadioButton precision_int8 = findViewById(R.id.radioButton_runInt8); 431 | if (precision_int8.isChecked()){ 432 | RadioButton precision_fp32 = findViewById(R.id.radioButton_runFP32); 433 | precision_fp32.setChecked(true); 434 | } 435 | } 436 | public void onModelIntClick(View view) { 437 | RadioButton precision_fp32 = findViewById(R.id.radioButton_runFP32); 438 | RadioButton precision_fp16 = findViewById(R.id.radioButton_runFP16); 439 | if (precision_fp32.isChecked() || precision_fp16.isChecked()){ 440 | RadioButton precision_int8 = findViewById(R.id.radioButton_runInt8); 441 | precision_int8.setChecked(true); 442 | } 443 | } 444 | public void onPrecisionFPClick(View view){ 445 | RadioButton model_int = findViewById(R.id.radioButton_modelInt); 446 | if (model_int.isChecked()) { 447 | RadioButton model_fp = findViewById(R.id.radioButton_modelFloat);
448 | model_fp.setChecked(true); 449 | } 450 | } 451 | public void onPrecisionIntClick(View view){ 452 | RadioButton model_fp = findViewById(R.id.radioButton_modelFloat); 453 | if (model_fp.isChecked()) { 454 | RadioButton model_int = findViewById(R.id.radioButton_modelInt); 455 | model_int.setChecked(true); 456 | } 457 | } 458 | } -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/java/com/example/tflite_yolov5_test/PathUtils.java: -------------------------------------------------------------------------------- 1 | package com.example.tflite_yolov5_test; 2 | import android.content.ContentUris; 3 | import android.content.Context; 4 | import android.database.Cursor; 5 | import android.graphics.Bitmap; 6 | import android.graphics.Canvas; 7 | import android.graphics.Color; 8 | import android.graphics.Paint; 9 | import android.graphics.RectF; 10 | import android.net.Uri; 11 | import android.os.Build; 12 | import android.os.Environment; 13 | import android.provider.DocumentsContract; 14 | import android.provider.MediaStore; 15 | 16 | import java.util.List; 17 | 18 | public class PathUtils { 19 | 20 | public static String getPath(final Context context, final Uri uri) { 21 | 22 | // DocumentProvider 23 | if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT && DocumentsContract.isDocumentUri(context, uri)) { 24 | 25 | if (isExternalStorageDocument(uri)) {// ExternalStorageProvider 26 | final String docId = DocumentsContract.getDocumentId(uri); 27 | final String[] split = docId.split(":"); 28 | final String type = split[0]; 29 | String storageDefinition; 30 | 31 | 32 | if("primary".equalsIgnoreCase(type)){ 33 | 34 | return Environment.getExternalStorageDirectory() + "/" + split[1]; 35 | 36 | } else { 37 | 38 | if(Environment.isExternalStorageRemovable()){ 39 | storageDefinition = "EXTERNAL_STORAGE"; 40 | 41 | } else{ 42 | storageDefinition = "SECONDARY_STORAGE"; 43 | } 44 | 45 | return 
System.getenv(storageDefinition) + "/" + split[1]; 46 | } 47 | 48 | } else if (isDownloadsDocument(uri)) {// DownloadsProvider 49 | 50 | final String id = DocumentsContract.getDocumentId(uri); 51 | final Uri contentUri = ContentUris.withAppendedId( 52 | Uri.parse("content://downloads/public_downloads"), Long.valueOf(id)); 53 | 54 | return getDataColumn(context, contentUri, null, null); 55 | 56 | } else if (isMediaDocument(uri)) {// MediaProvider 57 | final String docId = DocumentsContract.getDocumentId(uri); 58 | final String[] split = docId.split(":"); 59 | final String type = split[0]; 60 | 61 | Uri contentUri = null; 62 | if ("image".equals(type)) { 63 | contentUri = MediaStore.Images.Media.EXTERNAL_CONTENT_URI; 64 | } else if ("video".equals(type)) { 65 | contentUri = MediaStore.Video.Media.EXTERNAL_CONTENT_URI; 66 | } else if ("audio".equals(type)) { 67 | contentUri = MediaStore.Audio.Media.EXTERNAL_CONTENT_URI; 68 | } 69 | 70 | final String selection = "_id=?"; 71 | final String[] selectionArgs = new String[]{ 72 | split[1] 73 | }; 74 | 75 | return getDataColumn(context, contentUri, selection, selectionArgs); 76 | } 77 | 78 | } else if ("content".equalsIgnoreCase(uri.getScheme())) {// MediaStore (and general) 79 | 80 | // Return the remote address 81 | if (isGooglePhotosUri(uri)) 82 | return uri.getLastPathSegment(); 83 | 84 | return getDataColumn(context, uri, null, null); 85 | 86 | } else if ("file".equalsIgnoreCase(uri.getScheme())) {// File 87 | return uri.getPath(); 88 | } 89 | 90 | return null; 91 | } 92 | 93 | public static String getDataColumn(Context context, Uri uri, String selection, String[] selectionArgs) { 94 | 95 | Cursor cursor = null; 96 | final String column = "_data"; 97 | final String[] projection = { 98 | column 99 | }; 100 | 101 | try { 102 | cursor = context.getContentResolver().query(uri, projection, selection, selectionArgs, null); 103 | if (cursor != null && cursor.moveToFirst()) { 104 | final int column_index = 
cursor.getColumnIndexOrThrow(column); 105 | return cursor.getString(column_index); 106 | } 107 | } finally { 108 | if (cursor != null) 109 | cursor.close(); 110 | } 111 | return null; 112 | } 113 | 114 | 115 | public static boolean isExternalStorageDocument(Uri uri) { 116 | return "com.android.externalstorage.documents".equals(uri.getAuthority()); 117 | } 118 | 119 | 120 | public static boolean isDownloadsDocument(Uri uri) { 121 | return "com.android.providers.downloads.documents".equals(uri.getAuthority()); 122 | } 123 | 124 | public static boolean isMediaDocument(Uri uri) { 125 | return "com.android.providers.media.documents".equals(uri.getAuthority()); 126 | } 127 | 128 | public static boolean isGooglePhotosUri(Uri uri) { 129 | return "com.google.android.apps.photos.content".equals(uri.getAuthority()); 130 | } 131 | 132 | } -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/java/com/example/tflite_yolov5_test/TfliteRunMode.java: -------------------------------------------------------------------------------- 1 | package com.example.tflite_yolov5_test; 2 | 3 | public class TfliteRunMode { 4 | public enum Mode{ 5 | NONE_FP32, 6 | NONE_FP16, 7 | NONE_INT8, 8 | NNAPI_GPU_FP32, 9 | NNAPI_GPU_FP16, 10 | NNAPI_DSP_INT8 11 | } 12 | static public boolean isQuantizedMode(Mode mode){ 13 | return mode == Mode.NONE_INT8 || mode == Mode.NNAPI_DSP_INT8; 14 | } 15 | } 16 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/java/com/example/tflite_yolov5_test/TfliteRunner.java: -------------------------------------------------------------------------------- 1 | package com.example.tflite_yolov5_test; 2 | 3 | import android.content.Context; 4 | import android.content.res.AssetFileDescriptor; 5 | import android.content.res.AssetManager; 6 | import android.graphics.Bitmap; 7 | import android.graphics.BitmapFactory; 8 | import android.graphics.RectF; 
9 | import android.widget.ImageView; 10 | 11 | import org.tensorflow.lite.Interpreter; 12 | 13 | import org.tensorflow.lite.nnapi.NnApiDelegate; 14 | 15 | import java.io.File; 16 | import java.io.FileInputStream; 17 | import java.io.FileNotFoundException; 18 | import java.io.IOException; 19 | import java.io.InputStream; 20 | import java.nio.ByteBuffer; 21 | import java.nio.ByteOrder; 22 | import java.nio.MappedByteBuffer; 23 | import java.nio.channels.FileChannel; 24 | import java.util.ArrayList; 25 | import java.util.HashMap; 26 | import java.util.List; 27 | import java.util.Map; 28 | import com.example.tflite_yolov5_test.TfliteRunMode.*; 29 | 30 | public class TfliteRunner { 31 | final int numBytesPerChannel_float = 4; 32 | final int numBytesPerChannel_int = 1; 33 | static { 34 | System.loadLibrary("native-lib"); 35 | } 36 | public native float[][] postprocess(float[][][][] out1, float[][][][] out2, float[][][][] out3, int inputSize, float conf_thresh, float iou_thresh); 37 | private Interpreter tfliteInterpreter; 38 | Mode runmode; 39 | int inputSize; 40 | class InferenceRawResult{ 41 | public float[][][][] out1; 42 | public float[][][][] out2; 43 | public float[][][][] out3; 44 | 45 | public InferenceRawResult(int inputSize){ 46 | this.out1 = new float[1][inputSize/8][inputSize/8][3*85]; 47 | this.out2 = new float[1][inputSize/16][inputSize/16][3*85]; 48 | this.out3 = new float[1][inputSize/32][inputSize/32][3*85]; 49 | } 50 | } 51 | Object[] inputArray; 52 | Map<Integer, Object> outputMap; 53 | InferenceRawResult rawres; 54 | float conf_thresh; 55 | float iou_thresh; 56 | 57 | public TfliteRunner(Context context, Mode runmode, int inputSize, float conf_thresh, float iou_thresh) throws Exception{ 58 | this.runmode = runmode; 59 | this.rawres = new InferenceRawResult(inputSize); 60 | this.inputSize = inputSize; 61 | this.conf_thresh = conf_thresh; 62 | this.iou_thresh = iou_thresh; 63 | loadModel(context, runmode, inputSize, 4); 64 | } 65 | private static MappedByteBuffer
loadModelFile(AssetManager assets, String modelFilename) 66 | throws IOException { 67 | AssetFileDescriptor fileDescriptor = assets.openFd(modelFilename); 68 | FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor()); 69 | FileChannel fileChannel = inputStream.getChannel(); 70 | long startOffset = fileDescriptor.getStartOffset(); 71 | long declaredLength = fileDescriptor.getDeclaredLength(); 72 | return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength); 73 | } 74 | public void loadModel(Context context, Mode runmode, int inputSize, int num_threads) throws Exception{ 75 | Interpreter.Options options = new Interpreter.Options(); 76 | NnApiDelegate.Options nnapi_options = new NnApiDelegate.Options(); 77 | options.setNumThreads(num_threads); 78 | nnapi_options.setExecutionPreference(1);//sustained speed 79 | switch (runmode){ 80 | case NONE_FP32: 81 | options.setUseXNNPACK(true); 82 | break; 83 | case NONE_FP16: 84 | //NOTE: setAllowFp16PrecisionForFp32 is deprecated in recent TFLite versions 85 | options.setAllowFp16PrecisionForFp32(true); 86 | break; 87 | case NNAPI_GPU_FP32: 88 | nnapi_options.setAcceleratorName("qti-gpu"); 89 | nnapi_options.setAllowFp16(false); 90 | options.addDelegate(new NnApiDelegate(nnapi_options)); 91 | break; 92 | case NNAPI_GPU_FP16: 93 | nnapi_options.setAcceleratorName("qti-gpu"); 94 | nnapi_options.setAllowFp16(true); 95 | options.addDelegate(new NnApiDelegate(nnapi_options)); 96 | break; 97 | case NONE_INT8: 98 | options.setUseXNNPACK(true); 99 | break; 100 | case NNAPI_DSP_INT8: 101 | nnapi_options.setAcceleratorName("qti-dsp"); 102 | options.addDelegate(new NnApiDelegate(nnapi_options)); 103 | break; 104 | default: 105 | throw new RuntimeException("Unknown runmode!"); 106 | } 107 | boolean quantized_mode = TfliteRunMode.isQuantizedMode(runmode); 108 | String precision_str = quantized_mode ?
"int8" : "fp32"; 109 | String modelname = "yolov5s_" + precision_str + "_" + String.valueOf(inputSize) + ".tflite"; 110 | MappedByteBuffer tflite_model_buf = TfliteRunner.loadModelFile(context.getAssets(), modelname); 111 | this.tfliteInterpreter = new Interpreter(tflite_model_buf, options); 112 | } 113 | static public Bitmap getResizedImage(Bitmap bitmap, int inputSize) { 114 | Bitmap resized = Bitmap.createScaledBitmap(bitmap, inputSize, inputSize, true); 115 | return resized; 116 | } 117 | public void setInput(Bitmap resizedbitmap){ 118 | boolean quantized_mode = TfliteRunMode.isQuantizedMode(this.runmode); 119 | int numBytesPerChannel = quantized_mode ? numBytesPerChannel_int : numBytesPerChannel_float; 120 | ByteBuffer imgData = ByteBuffer.allocateDirect(1 * inputSize * inputSize * 3 * numBytesPerChannel); 121 | 122 | int[] intValues = new int[inputSize * inputSize]; 123 | resizedbitmap.getPixels(intValues, 0, resizedbitmap.getWidth(), 0, 0, resizedbitmap.getWidth(), resizedbitmap.getHeight()); 124 | 125 | imgData.order(ByteOrder.nativeOrder()); 126 | imgData.rewind(); 127 | for (int i = 0; i < inputSize; ++i) { 128 | for (int j = 0; j < inputSize; ++j) { 129 | int pixelValue = intValues[i * inputSize + j]; 130 | if (quantized_mode) { 131 | // Quantized model 132 | imgData.put((byte) ((pixelValue >> 16) & 0xFF)); 133 | imgData.put((byte) ((pixelValue >> 8) & 0xFF)); 134 | imgData.put((byte) (pixelValue & 0xFF)); 135 | } else { // Float model 136 | float r = (((pixelValue >> 16) & 0xFF)) / 255.0f; 137 | float g = (((pixelValue >> 8) & 0xFF)) / 255.0f; 138 | float b = ((pixelValue & 0xFF)) / 255.0f; 139 | imgData.putFloat(r); 140 | imgData.putFloat(g); 141 | imgData.putFloat(b); 142 | } 143 | } 144 | } 145 | this.inputArray = new Object[]{imgData}; 146 | this.outputMap = new HashMap<>(); 147 | outputMap.put(0, this.rawres.out1); 148 | outputMap.put(1, this.rawres.out2); 149 | outputMap.put(2, this.rawres.out3); 150 | } 151 | private int inference_elapsed; 152 | 
private int postprocess_elapsed; 153 | public String getLastElapsedTimeLog() { 154 | return String.format("inference: %dms postprocess: %dms", this.inference_elapsed, this.postprocess_elapsed); 155 | } 156 | public List<Recognition> runInference(){ 157 | List<Recognition> bboxes = new ArrayList<>(); 158 | long start = System.currentTimeMillis(); 159 | this.tfliteInterpreter.runForMultipleInputsOutputs(inputArray, outputMap); 160 | long end = System.currentTimeMillis(); 161 | this.inference_elapsed = (int)(end - start); 162 | 163 | //float[bbox_num][6] 164 | // (x1, y1, x2, y2, conf, class_idx) 165 | float[][] bbox_arrs = postprocess(this.rawres.out1, 166 | this.rawres.out2, 167 | this.rawres.out3, 168 | this.inputSize, 169 | this.conf_thresh, 170 | this.iou_thresh); 171 | long end2 = System.currentTimeMillis(); 172 | this.postprocess_elapsed = (int)(end2 - end); 173 | for(float[] bbox_arr: bbox_arrs){ 174 | bboxes.add(new Recognition(bbox_arr)); 175 | } 176 | return bboxes; 177 | } 178 | static int[] coco80_to_91class_map = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 27, 28, 31, 32, 33, 34, 179 | 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 180 | 64, 65, 67, 70, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 84, 85, 86, 87, 88, 89, 90}; 181 | static public int get_coco91_from_coco80(int idx){ 182 | //assume idx < 80 183 | return coco80_to_91class_map[idx]; 184 | } 185 | public void setConfThresh(float thresh){ this.conf_thresh = thresh;} 186 | public void setIoUThresh(float thresh) {this.iou_thresh = thresh;} 187 | 188 | //ported from the TFLite Object Detection example 189 | /** An immutable result returned by a Detector describing what was recognized.
*/ 190 | public class Recognition { 191 | private final String[] coco_class_names = new String[]{"person", "bicycle", "car", "motorbike", "aeroplane", "bus", "train", "truck", "boat", "traffic light", "fire hydrant", "stop sign", "parking meter", "bench", "bird", "cat", "dog", "horse", "sheep", "cow", "elephant", "bear", "zebra", "giraffe", "backpack", "umbrella", "handbag", "tie", "suitcase", "frisbee", "skis", "snowboard", "sports ball", "kite", "baseball bat", "baseball glove", "skateboard", "surfboard", "tennis racket", "bottle", "wine glass", "cup", "fork", "knife", "spoon", "bowl", "banana", "apple", "sandwich", "orange", "broccoli", "carrot", "hot dog", "pizza", "donut", "cake", "chair", "sofa", "pottedplant", "bed", "diningtable", "toilet", "tvmonitor", "laptop", "mouse", "remote", "keyboard", "cell phone", "microwave", "oven", "toaster", "sink", "refrigerator", "book", "clock", "vase", "scissors", "teddy bear", "hair drier", "toothbrush"}; 192 | private final Integer class_idx; 193 | /** 194 | * A unique identifier for what has been recognized. Specific to the class, not the instance of 195 | * the object. 196 | */ 197 | //private final String id; 198 | 199 | /** Display name for the recognition. */ 200 | private final String title; 201 | 202 | /** 203 | * A sortable score for how good the recognition is relative to others. Higher should be better. 204 | */ 205 | private final Float confidence; 206 | 207 | /** Optional location within the source image for the location of the recognized object. 
*/ 208 | private RectF location; 209 | 210 | public Recognition( 211 | float[] bbox_array) { 212 | float x1 = bbox_array[0]; 213 | float y1 = bbox_array[1]; 214 | float x2 = bbox_array[2]; 215 | float y2 = bbox_array[3]; 216 | //this.id = (int)bbox_array[5]; 217 | int class_id = (int)bbox_array[5]; 218 | this.class_idx = class_id; 219 | this.title = coco_class_names[class_id]; 220 | this.confidence = bbox_array[4]; 221 | this.location = new RectF(x1, y1, x2, y2); 222 | } 223 | public Integer getClass_idx(){ 224 | return class_idx; 225 | } 226 | /*public String getId() { 227 | return id; 228 | }*/ 229 | 230 | public String getTitle() { 231 | return title; 232 | } 233 | 234 | public Float getConfidence() { 235 | return confidence; 236 | } 237 | 238 | public RectF getLocation() { 239 | return new RectF(location); 240 | } 241 | 242 | public void setLocation(RectF location) { 243 | this.location = location; 244 | } 245 | 246 | @Override 247 | public String toString() { 248 | String resultString = ""; 249 | /*if (id != null) { 250 | resultString += "[" + id + "] "; 251 | }*/ 252 | 253 | if (title != null) { 254 | resultString += title + " "; 255 | } 256 | 257 | if (confidence != null) { 258 | resultString += String.format("(%.1f%%) ", confidence * 100.0f); 259 | } 260 | 261 | if (location != null) { 262 | resultString += location + " "; 263 | } 264 | 265 | return resultString.trim(); 266 | } 267 | } 268 | } 269 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/java/com/example/tflite_yolov5_test/camera/CameraActivity.java: -------------------------------------------------------------------------------- 1 | package com.example.tflite_yolov5_test.camera; 2 | 3 | import androidx.appcompat.app.AppCompatActivity; 4 | import androidx.appcompat.widget.SwitchCompat; 5 | 6 | import android.Manifest; 7 | import android.app.Fragment; 8 | import android.content.Context; 9 | import 
android.content.pm.PackageManager; 10 | import android.hardware.Camera; 11 | import android.hardware.camera2.CameraAccessException; 12 | import android.hardware.camera2.CameraCharacteristics; 13 | import android.hardware.camera2.CameraManager; 14 | import android.hardware.camera2.params.StreamConfigurationMap; 15 | import android.media.Image; 16 | import android.media.ImageReader; 17 | import android.os.Build; 18 | import android.os.Bundle; 19 | import android.os.Handler; 20 | import android.os.HandlerThread; 21 | import android.os.Trace; 22 | import android.util.Size; 23 | import android.view.Surface; 24 | import android.view.View; 25 | import android.widget.CompoundButton; 26 | import android.widget.ImageView; 27 | import android.widget.LinearLayout; 28 | import android.widget.SeekBar; 29 | import android.widget.TextView; 30 | import android.widget.Toast; 31 | 32 | import com.example.tflite_yolov5_test.R; 33 | import com.google.android.material.bottomsheet.BottomSheetBehavior; 34 | import com.example.tflite_yolov5_test.camera.env.ImageUtils; 35 | import java.nio.ByteBuffer; 36 | 37 | public abstract class CameraActivity extends AppCompatActivity 38 | implements ImageReader.OnImageAvailableListener, 39 | Camera.PreviewCallback, 40 | CompoundButton.OnCheckedChangeListener, 41 | View.OnClickListener { 42 | private static final int PERMISSIONS_REQUEST = 1; 43 | 44 | private static final String PERMISSION_CAMERA = Manifest.permission.CAMERA; 45 | protected int previewWidth = 0; 46 | protected int previewHeight = 0; 47 | private boolean debug = false; 48 | private Handler handler; 49 | private HandlerThread handlerThread; 50 | private boolean useCamera2API; 51 | private boolean isProcessingFrame = false; 52 | private byte[][] yuvBytes = new byte[3][]; 53 | private int[] rgbBytes = null; 54 | private int yRowStride; 55 | private Runnable postInferenceCallback; 56 | private Runnable imageConverter; 57 | 58 | private LinearLayout bottomSheetLayout; 59 | private 
LinearLayout gestureLayout; 60 | private BottomSheetBehavior sheetBehavior; 61 | 62 | protected TextView frameValueTextView, cropValueTextView, inferenceTimeTextView; 63 | protected ImageView bottomSheetArrowImageView; 64 | private ImageView plusImageView, minusImageView; 65 | private SwitchCompat apiSwitchCompat; 66 | private TextView threadsTextView; 67 | @Override 68 | protected void onCreate(Bundle savedInstanceState) { 69 | super.onCreate(savedInstanceState); 70 | setContentView(R.layout.activity_camera); 71 | if (hasPermission()) { 72 | setFragment(); 73 | } else { 74 | requestPermission(); 75 | } 76 | 77 | } 78 | private String chooseCamera() { 79 | final CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE); 80 | try { 81 | for (final String cameraId : manager.getCameraIdList()) { 82 | final CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId); 83 | 84 | // We don't use a front facing camera in this sample. 85 | final Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING); 86 | if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT) { 87 | continue; 88 | } 89 | 90 | final StreamConfigurationMap map = 91 | characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP); 92 | 93 | if (map == null) { 94 | continue; 95 | } 96 | 97 | // Fallback to camera1 API for internal cameras that don't have full support. 98 | // This should help with legacy situations where using the camera2 API causes 99 | // distorted or otherwise broken previews. 
100 | useCamera2API = 101 | (facing == CameraCharacteristics.LENS_FACING_EXTERNAL) 102 | || isHardwareLevelSupported( 103 | characteristics, CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_FULL); 104 | return cameraId; 105 | } 106 | } catch (CameraAccessException e) { 107 | } 108 | 109 | return null; 110 | } 111 | protected void fillBytes(final Image.Plane[] planes, final byte[][] yuvBytes) { 112 | // Because of the variable row stride it's not possible to know in 113 | // advance the actual necessary dimensions of the yuv planes. 114 | for (int i = 0; i < planes.length; ++i) { 115 | final ByteBuffer buffer = planes[i].getBuffer(); 116 | if (yuvBytes[i] == null) { 117 | yuvBytes[i] = new byte[buffer.capacity()]; 118 | } 119 | buffer.get(yuvBytes[i]); 120 | } 121 | } 122 | protected int getScreenOrientation() { 123 | switch (getWindowManager().getDefaultDisplay().getRotation()) { 124 | case Surface.ROTATION_270: 125 | return 270; 126 | case Surface.ROTATION_180: 127 | return 180; 128 | case Surface.ROTATION_90: 129 | return 90; 130 | default: 131 | return 0; 132 | } 133 | } 134 | /** Callback for android.hardware.Camera API */ 135 | @Override 136 | public void onPreviewFrame(final byte[] bytes, final Camera camera) { 137 | if (isProcessingFrame) { 138 | return; 139 | } 140 | 141 | try { 142 | // Initialize the storage bitmaps once when the resolution is known. 
143 | if (rgbBytes == null) { 144 | Camera.Size previewSize = camera.getParameters().getPreviewSize(); 145 | previewHeight = previewSize.height; 146 | previewWidth = previewSize.width; 147 | rgbBytes = new int[previewWidth * previewHeight]; 148 | onPreviewSizeChosen(new Size(previewSize.width, previewSize.height), 90); 149 | } 150 | } catch (final Exception e) { 151 | return; 152 | } 153 | } 154 | /** Callback for Camera2 API */ 155 | @Override 156 | public void onImageAvailable(final ImageReader reader) { 157 | // We need to wait until we have some size from onPreviewSizeChosen 158 | if (previewWidth == 0 || previewHeight == 0) { 159 | return; 160 | } 161 | if (rgbBytes == null) { 162 | rgbBytes = new int[previewWidth * previewHeight]; 163 | } 164 | try { 165 | final Image image = reader.acquireLatestImage(); 166 | 167 | if (image == null) { 168 | return; 169 | } 170 | 171 | if (isProcessingFrame) { 172 | image.close(); 173 | return; 174 | } 175 | isProcessingFrame = true; 176 | Trace.beginSection("imageAvailable"); 177 | final Image.Plane[] planes = image.getPlanes(); 178 | fillBytes(planes, yuvBytes); 179 | yRowStride = planes[0].getRowStride(); 180 | final int uvRowStride = planes[1].getRowStride(); 181 | final int uvPixelStride = planes[1].getPixelStride(); 182 | 183 | imageConverter = 184 | new Runnable() { 185 | @Override 186 | public void run() { 187 | ImageUtils.convertYUV420ToARGB8888( 188 | yuvBytes[0], 189 | yuvBytes[1], 190 | yuvBytes[2], 191 | previewWidth, 192 | previewHeight, 193 | yRowStride, 194 | uvRowStride, 195 | uvPixelStride, 196 | rgbBytes); 197 | } 198 | }; 199 | 200 | postInferenceCallback = 201 | new Runnable() { 202 | @Override 203 | public void run() { 204 | image.close(); 205 | isProcessingFrame = false; 206 | } 207 | }; 208 | 209 | processImage(); 210 | } catch (final Exception e) { 211 | Trace.endSection(); 212 | return; 213 | } 214 | Trace.endSection(); 215 | } 216 | 217 | @Override 218 | public void onCheckedChanged(CompoundButton
buttonView, boolean isChecked) { 219 | } 220 | @Override 221 | public void onClick(View v) { 222 | } 223 | @Override 224 | public synchronized void onResume() { 225 | super.onResume(); 226 | 227 | handlerThread = new HandlerThread("inference"); 228 | handlerThread.start(); 229 | handler = new Handler(handlerThread.getLooper()); 230 | } 231 | 232 | @Override 233 | public synchronized void onPause() { 234 | handlerThread.quitSafely(); 235 | try { 236 | handlerThread.join(); 237 | handlerThread = null; 238 | handler = null; 239 | } catch (final InterruptedException e) { 240 | } 241 | 242 | super.onPause(); 243 | } 244 | protected synchronized void runInBackground(final Runnable r) { 245 | if (handler != null) { 246 | handler.post(r); 247 | } 248 | } 249 | protected int[] getRgbBytes() { 250 | imageConverter.run(); 251 | return rgbBytes; 252 | } 253 | protected void readyForNextImage() { 254 | if (postInferenceCallback != null) { 255 | postInferenceCallback.run(); 256 | } 257 | } 258 | private boolean isHardwareLevelSupported( 259 | CameraCharacteristics characteristics, int requiredLevel) { 260 | int deviceLevel = characteristics.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL); 261 | if (deviceLevel == CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY) { 262 | return requiredLevel == deviceLevel; 263 | } 264 | // deviceLevel is not LEGACY, can use numerical sort 265 | return requiredLevel <= deviceLevel; 266 | } 267 | protected void setFragment() { 268 | String cameraId = chooseCamera(); 269 | 270 | Fragment fragment; 271 | if (useCamera2API) { 272 | CameraConnectionFragment camera2Fragment = 273 | CameraConnectionFragment.newInstance( 274 | new CameraConnectionFragment.ConnectionCallback() { 275 | @Override 276 | public void onPreviewSizeChosen(final Size size, final int rotation) { 277 | previewHeight = size.getHeight(); 278 | previewWidth = size.getWidth(); 279 | CameraActivity.this.onPreviewSizeChosen(size, rotation); 280 | } 281 | }, 282 | this, 
283 | getLayoutId(), 284 | getDesiredPreviewFrameSize()); 285 | 286 | camera2Fragment.setCamera(cameraId); 287 | fragment = camera2Fragment; 288 | } else { 289 | fragment = 290 | new LegacyCameraConnectionFragment(this, getLayoutId(), getDesiredPreviewFrameSize()); 291 | } 292 | 293 | getFragmentManager().beginTransaction().replace(R.id.container, fragment).commit(); 296 | } 297 | protected abstract void processImage(); 298 | protected abstract void onPreviewSizeChosen(final Size size, final int rotation); 299 | 300 | protected abstract int getLayoutId(); 301 | 302 | protected abstract Size getDesiredPreviewFrameSize(); 303 | 304 | protected abstract void setNumThreads(int numThreads); 305 | 306 | protected abstract void setUseNNAPI(boolean isChecked); 307 | 308 | private static boolean allPermissionsGranted(final int[] grantResults) { 309 | for (int result : grantResults) { 310 | if (result != PackageManager.PERMISSION_GRANTED) { 311 | return false; 312 | } 313 | } 314 | return true; 315 | } 316 | private boolean hasPermission() { 317 | if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) { 318 | return checkSelfPermission(PERMISSION_CAMERA) == PackageManager.PERMISSION_GRANTED; 319 | } else { 320 | return true; 321 | } 322 | } 323 | 324 | private void requestPermission() { 325 | if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) { 326 | if (shouldShowRequestPermissionRationale(PERMISSION_CAMERA)) { 327 | Toast.makeText( 328 | CameraActivity.this, 329 | "Camera permission is required for this demo", 330 | Toast.LENGTH_LONG) 331 | .show(); 332 | } 333 | requestPermissions(new String[] {PERMISSION_CAMERA}, PERMISSIONS_REQUEST); 334 | } 335 | } 336 | @Override 337 | public void onRequestPermissionsResult( 338 | final int requestCode, final String[] permissions, final int[] grantResults) { 339 | super.onRequestPermissionsResult(requestCode, permissions, grantResults); 340 | if (requestCode == PERMISSIONS_REQUEST) { 341 | if
(allPermissionsGranted(grantResults)) { 342 | setFragment(); 343 | } else { 344 | requestPermission(); 345 | } 346 | } 347 | } 348 | 349 | } -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/java/com/example/tflite_yolov5_test/camera/DetectorActivity.java: -------------------------------------------------------------------------------- 1 | package com.example.tflite_yolov5_test.camera; 2 | 3 | import android.graphics.Bitmap; 4 | import android.graphics.Canvas; 5 | import android.graphics.Color; 6 | import android.graphics.Matrix; 7 | import android.graphics.Paint; 8 | import android.graphics.RectF; 9 | import android.graphics.Typeface; 10 | import android.media.ImageReader; 11 | import android.os.Bundle; 12 | import android.os.SystemClock; 13 | import android.util.Log; 14 | import android.util.Size; 15 | import android.util.TypedValue; 16 | import android.widget.SeekBar; 17 | import android.widget.TextView; 18 | import android.widget.Toast; 19 | 20 | import com.example.tflite_yolov5_test.R; 21 | import com.example.tflite_yolov5_test.camera.env.BorderedText; 22 | import com.example.tflite_yolov5_test.camera.env.ImageUtils; 23 | import com.example.tflite_yolov5_test.camera.tracker.MultiBoxTracker; 24 | import com.example.tflite_yolov5_test.customview.OverlayView; 25 | import com.example.tflite_yolov5_test.TfliteRunner; 26 | import com.example.tflite_yolov5_test.TfliteRunMode; 27 | import java.io.IOException; 28 | import java.util.ArrayList; 29 | import java.util.List; 30 | 31 | public class DetectorActivity extends CameraActivity implements ImageReader.OnImageAvailableListener { 32 | 33 | 34 | private static final int TF_OD_API_INPUT_SIZE = 320; 35 | private static final boolean TF_OD_API_IS_QUANTIZED = true; 36 | private static final String TF_OD_API_MODEL_FILE = "detect.tflite"; 37 | private static final String TF_OD_API_LABELS_FILE = "labelmap.txt"; 38 | private static final TfliteRunMode.Mode 
MODE = TfliteRunMode.Mode.NONE_INT8; 39 | // Minimum detection confidence to track a detection. 40 | private static final float MINIMUM_CONFIDENCE_TF_OD_API = 0.5f; 41 | private static final boolean MAINTAIN_ASPECT = false; 42 | private static final Size DESIRED_PREVIEW_SIZE = new Size(640, 480); 43 | private static final boolean SAVE_PREVIEW_BITMAP = false; 44 | private static final float TEXT_SIZE_DIP = 10; 45 | OverlayView trackingOverlay; 46 | private Integer sensorOrientation; 47 | 48 | private TfliteRunner detector; 49 | 50 | private long lastProcessingTimeMs = 0; 51 | private Bitmap rgbFrameBitmap = null; 52 | private Bitmap croppedBitmap = null; 53 | private Bitmap cropCopyBitmap = null; 54 | 55 | private boolean computingDetection = false; 56 | 57 | private long timestamp = 0; 58 | 59 | private Matrix frameToCropTransform; 60 | private Matrix cropToFrameTransform; 61 | 62 | private MultiBoxTracker tracker; 63 | 64 | private BorderedText borderedText; 65 | 66 | protected Size getDesiredPreviewFrameSize() { 67 | return DESIRED_PREVIEW_SIZE; 68 | } 69 | 70 | protected int getLayoutId() { 71 | return R.layout.tfe_od_camera_connection_fragment_tracking; 72 | } 73 | 74 | 75 | public float getConfThreshFromGUI(){ return ((float)((SeekBar)findViewById(R.id.conf_seekBar2)).getProgress()) / 100.0f;} 76 | public float getIoUThreshFromGUI(){ return ((float)((SeekBar)findViewById(R.id.iou_seekBar2)).getProgress()) / 100.0f;} 77 | @Override 78 | protected void onCreate(Bundle savedInstanceState) { 79 | super.onCreate(savedInstanceState); 80 | SeekBar conf_seekBar = (SeekBar)findViewById(R.id.conf_seekBar2); 81 | conf_seekBar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener(){ 82 | @Override 83 | public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) { 84 | TextView conf_textView = (TextView)findViewById(R.id.conf_TextView2); 85 | float thresh = (float)progress / 100.0f; 86 | conf_textView.setText(String.format("Confidence Threshold: 
%.2f", thresh)); 87 | if (detector != null) detector.setConfThresh(thresh); 88 | } 89 | @Override 90 | public void onStopTrackingTouch(SeekBar seekBar) { 91 | } 92 | @Override 93 | public void onStartTrackingTouch(SeekBar seekBar) { 94 | } 95 | }); 96 | conf_seekBar.setMax(100); 97 | conf_seekBar.setProgress(25);//0.25 98 | SeekBar iou_seekBar = (SeekBar)findViewById(R.id.iou_seekBar2); 99 | iou_seekBar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener(){ 100 | @Override 101 | public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) { 102 | TextView iou_textView = (TextView)findViewById(R.id.iou_TextView2); 103 | float thresh = (float)progress / 100.0f; 104 | iou_textView.setText(String.format("IoU Threshold: %.2f", thresh)); 105 | if (detector != null) detector.setIoUThresh(thresh); 106 | } 107 | @Override 108 | public void onStopTrackingTouch(SeekBar seekBar) { 109 | } 110 | @Override 111 | public void onStartTrackingTouch(SeekBar seekBar) { 112 | } 113 | }); 114 | iou_seekBar.setMax(100); 115 | iou_seekBar.setProgress(45);//0.45 116 | } 117 | 118 | @Override 119 | protected void setUseNNAPI(final boolean isChecked) { 120 | 121 | } 122 | 123 | @Override 124 | protected void setNumThreads(final int numThreads) { 125 | //runInBackground(() -> detector.setNumThreads(numThreads)); 126 | } 127 | @Override 128 | public void onPreviewSizeChosen(final Size size, final int rotation) { 129 | final float textSizePx = 130 | TypedValue.applyDimension( 131 | TypedValue.COMPLEX_UNIT_DIP, TEXT_SIZE_DIP, getResources().getDisplayMetrics()); 132 | borderedText = new BorderedText(textSizePx); 133 | borderedText.setTypeface(Typeface.MONOSPACE); 134 | 135 | tracker = new MultiBoxTracker(this); 136 | 137 | int cropSize = TF_OD_API_INPUT_SIZE; 138 | 139 | try { 140 | detector = new TfliteRunner(this, MODE, TF_OD_API_INPUT_SIZE, 0.25f, 0.45f); 141 | cropSize = TF_OD_API_INPUT_SIZE; 142 | } catch (final Exception e) { 143 | e.printStackTrace(); 144 | 
Toast toast = 145 | Toast.makeText( 146 | getApplicationContext(), "Detector could not be initialized", Toast.LENGTH_SHORT); 147 | toast.show(); 148 | finish(); 149 | } 150 | 151 | previewWidth = size.getWidth(); 152 | previewHeight = size.getHeight(); 154 | sensorOrientation = rotation - getScreenOrientation(); 155 | rgbFrameBitmap = Bitmap.createBitmap(previewWidth, previewHeight, Bitmap.Config.ARGB_8888); 156 | croppedBitmap = Bitmap.createBitmap(cropSize, cropSize, Bitmap.Config.ARGB_8888); 157 | 158 | frameToCropTransform = 159 | ImageUtils.getTransformationMatrix( 160 | previewWidth, previewHeight, 161 | cropSize, cropSize, 162 | sensorOrientation, MAINTAIN_ASPECT); 163 | 164 | cropToFrameTransform = new Matrix(); 165 | frameToCropTransform.invert(cropToFrameTransform); 166 | 167 | trackingOverlay = (OverlayView) findViewById(R.id.tracking_overlay); 168 | trackingOverlay.addCallback( 169 | new OverlayView.DrawCallback() { 170 | @Override 171 | public void drawCallback(final Canvas canvas) { 172 | tracker.draw(canvas); 173 | } 174 | }); 175 | 176 | tracker.setFrameConfiguration(getDesiredPreviewFrameSize(), TF_OD_API_INPUT_SIZE, sensorOrientation); 177 | } 178 | 179 | @Override 180 | protected void processImage() { 181 | trackingOverlay.postInvalidate(); 182 | 183 | // No mutex needed as this method is not reentrant.
184 | if (computingDetection) { 185 | readyForNextImage(); 186 | return; 187 | } 188 | computingDetection = true; 189 | 190 | rgbFrameBitmap.setPixels(getRgbBytes(), 0, previewWidth, 0, 0, previewWidth, previewHeight); 191 | 192 | readyForNextImage(); 193 | 194 | final Canvas canvas = new Canvas(croppedBitmap); 195 | canvas.drawBitmap(rgbFrameBitmap, frameToCropTransform, null); 196 | 197 | runInBackground( 198 | new Runnable() { 199 | @Override 200 | public void run() { 201 | final long nowTime = SystemClock.uptimeMillis(); 202 | float fps = (float)1000 / (float)(nowTime - lastProcessingTimeMs); 203 | lastProcessingTimeMs = nowTime; 204 | 205 | 206 | //ImageUtils.saveBitmap(croppedBitmap); 207 | detector.setInput(croppedBitmap); 208 | final List<TfliteRunner.Recognition> results = detector.runInference(); 209 | 210 | cropCopyBitmap = Bitmap.createBitmap(croppedBitmap); 211 | final Canvas canvas = new Canvas(cropCopyBitmap); 212 | final Paint paint = new Paint(); 213 | paint.setColor(Color.RED); 214 | paint.setStyle(Paint.Style.STROKE); 215 | paint.setStrokeWidth(2.0f); 216 | 217 | for (final TfliteRunner.Recognition result : results) { 218 | final RectF location = result.getLocation(); 219 | //canvas.drawRect(location, paint); 220 | } 221 | 222 | tracker.trackResults(results); 223 | trackingOverlay.postInvalidate(); 224 | 225 | computingDetection = false; 226 | 227 | runOnUiThread( 228 | new Runnable() { 229 | @Override 230 | public void run() { 231 | TextView fpsTextView = (TextView)findViewById(R.id.textViewFPS); 232 | String fpsText = String.format("FPS: %.2f", fps); 233 | fpsTextView.setText(fpsText); 234 | TextView latencyTextView = (TextView)findViewById(R.id.textViewLatency); 235 | latencyTextView.setText(detector.getLastElapsedTimeLog()); 236 | } 237 | }); 238 | } 239 | }); 240 | } 241 | 242 | } 243 | --------------------------------------------------------------------------------
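A note on the geometry DetectorActivity sets up in onPreviewSizeChosen: with MAINTAIN_ASPECT set to false, frameToCropTransform reduces to a plain non-uniform scale from the 640x480 preview into the 320x320 model input, and cropToFrameTransform (its matrix inverse) maps detection boxes back. A minimal standalone sketch of that mapping, under that assumption — the class and method names below are illustrative and not part of the app:

```java
// Illustrative sketch of the scale-only mapping used when MAINTAIN_ASPECT is
// false: the preview frame is stretched to the square model input, and
// detection boxes are mapped back with the inverse scale.
public class CropTransformSketch {
    // Forward scale factors from frame space to model-input space.
    public static float[] frameToCropScale(int frameW, int frameH, int cropSize) {
        return new float[] {(float) cropSize / frameW, (float) cropSize / frameH};
    }

    // Map an (x1, y1, x2, y2) box from model-input space back to frame space
    // by applying the inverse scale on each axis.
    public static float[] cropToFrame(float[] xyxy, int frameW, int frameH, int cropSize) {
        float sx = (float) frameW / cropSize;
        float sy = (float) frameH / cropSize;
        return new float[] {xyxy[0] * sx, xyxy[1] * sy, xyxy[2] * sx, xyxy[3] * sy};
    }
}
```

For the app's 640x480 preview and 320-pixel input, a box covering the lower-right quadrant of the model input maps back to the lower-right quadrant of the frame, stretched by 2x horizontally and 1.5x vertically.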
/app/tflite_yolov5_test/app/src/main/java/com/example/tflite_yolov5_test/camera/LegacyCameraConnectionFragment.java: -------------------------------------------------------------------------------- 1 | package com.example.tflite_yolov5_test.camera; 2 | 3 | import android.annotation.SuppressLint; 4 | import android.app.Fragment; 5 | import android.graphics.SurfaceTexture; 6 | import android.hardware.Camera; 7 | import android.os.Bundle; 8 | import android.os.Handler; 9 | import android.os.HandlerThread; 10 | import android.util.Size; 11 | import android.util.SparseIntArray; 12 | import android.view.LayoutInflater; 13 | import android.view.Surface; 14 | import android.view.TextureView; 15 | import android.view.View; 16 | import android.view.ViewGroup; 17 | 18 | import java.io.IOException; 19 | import java.util.List; 20 | 21 | 22 | import com.example.tflite_yolov5_test.R; 23 | import com.example.tflite_yolov5_test.customview.AutoFitTextureView; 24 | import com.example.tflite_yolov5_test.camera.env.ImageUtils; 25 | 26 | public class LegacyCameraConnectionFragment extends Fragment { 27 | /** Conversion from screen rotation to JPEG orientation. */ 28 | private static final SparseIntArray ORIENTATIONS = new SparseIntArray(); 29 | 30 | static { 31 | ORIENTATIONS.append(Surface.ROTATION_0, 90); 32 | ORIENTATIONS.append(Surface.ROTATION_90, 0); 33 | ORIENTATIONS.append(Surface.ROTATION_180, 270); 34 | ORIENTATIONS.append(Surface.ROTATION_270, 180); 35 | } 36 | 37 | private Camera camera; 38 | private Camera.PreviewCallback imageListener; 39 | private Size desiredSize; 40 | /** The layout identifier to inflate for this Fragment. */ 41 | private int layout; 42 | /** An {@link AutoFitTextureView} for camera preview. */ 43 | private AutoFitTextureView textureView; 44 | private SurfaceTexture availableSurfaceTexture = null; 45 | 46 | /** 47 | * {@link TextureView.SurfaceTextureListener} handles several lifecycle events on a {@link 48 | * TextureView}. 
49 | */ 50 | private final TextureView.SurfaceTextureListener surfaceTextureListener = 51 | new TextureView.SurfaceTextureListener() { 52 | @Override 53 | public void onSurfaceTextureAvailable( 54 | final SurfaceTexture texture, final int width, final int height) { 55 | availableSurfaceTexture = texture; 56 | startCamera(); 57 | } 58 | 59 | @Override 60 | public void onSurfaceTextureSizeChanged( 61 | final SurfaceTexture texture, final int width, final int height) {} 62 | 63 | @Override 64 | public boolean onSurfaceTextureDestroyed(final SurfaceTexture texture) { 65 | return true; 66 | } 67 | 68 | @Override 69 | public void onSurfaceTextureUpdated(final SurfaceTexture texture) {} 70 | }; 71 | /** An additional thread for running tasks that shouldn't block the UI. */ 72 | private HandlerThread backgroundThread; 73 | 74 | @SuppressLint("ValidFragment") 75 | public LegacyCameraConnectionFragment( 76 | final Camera.PreviewCallback imageListener, final int layout, final Size desiredSize) { 77 | this.imageListener = imageListener; 78 | this.layout = layout; 79 | this.desiredSize = desiredSize; 80 | } 81 | 82 | @Override 83 | public View onCreateView( 84 | final LayoutInflater inflater, final ViewGroup container, final Bundle savedInstanceState) { 85 | return inflater.inflate(layout, container, false); 86 | } 87 | 88 | @Override 89 | public void onViewCreated(final View view, final Bundle savedInstanceState) { 90 | textureView = (AutoFitTextureView) view.findViewById(R.id.texture); 91 | } 92 | 93 | @Override 94 | public void onActivityCreated(final Bundle savedInstanceState) { 95 | super.onActivityCreated(savedInstanceState); 96 | } 97 | 98 | @Override 99 | public void onResume() { 100 | super.onResume(); 101 | startBackgroundThread(); 102 | // When the screen is turned off and turned back on, the SurfaceTexture is already 103 | // available, and "onSurfaceTextureAvailable" will not be called. 
In that case, we can open 104 | // a camera and start preview from here (otherwise, we wait until the surface is ready in 105 | // the SurfaceTextureListener). 106 | 107 | if (textureView.isAvailable()) { 108 | startCamera(); 109 | } else { 110 | textureView.setSurfaceTextureListener(surfaceTextureListener); 111 | } 112 | } 113 | 114 | @Override 115 | public void onPause() { 116 | stopCamera(); 117 | stopBackgroundThread(); 118 | super.onPause(); 119 | } 120 | 121 | /** Starts a background thread and its {@link Handler}. */ 122 | private void startBackgroundThread() { 123 | backgroundThread = new HandlerThread("CameraBackground"); 124 | backgroundThread.start(); 125 | } 126 | 127 | /** Stops the background thread and its {@link Handler}. */ 128 | private void stopBackgroundThread() { 129 | backgroundThread.quitSafely(); 130 | try { 131 | backgroundThread.join(); 132 | backgroundThread = null; 133 | } catch (final InterruptedException e) { 134 | } 135 | } 136 | 137 | private void startCamera() { 138 | int index = getCameraId(); 139 | camera = Camera.open(index); 140 | 141 | try { 142 | Camera.Parameters parameters = camera.getParameters(); 143 | List<String> focusModes = parameters.getSupportedFocusModes(); 144 | if (focusModes != null 145 | && focusModes.contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE)) { 146 | parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE); 147 | } 148 | List<Camera.Size> cameraSizes = parameters.getSupportedPreviewSizes(); 149 | Size[] sizes = new Size[cameraSizes.size()]; 150 | int i = 0; 151 | for (Camera.Size size : cameraSizes) { 152 | sizes[i++] = new Size(size.width, size.height); 153 | } 154 | Size previewSize = 155 | CameraConnectionFragment.chooseOptimalSize( 156 | sizes, desiredSize.getWidth(), desiredSize.getHeight()); 157 | parameters.setPreviewSize(previewSize.getWidth(), previewSize.getHeight()); 158 | camera.setDisplayOrientation(90); 159 | camera.setParameters(parameters); 160 |
camera.setPreviewTexture(availableSurfaceTexture); 161 | } catch (IOException exception) { 162 | camera.release(); 163 | } 164 | 165 | camera.setPreviewCallbackWithBuffer(imageListener); 166 | Camera.Size s = camera.getParameters().getPreviewSize(); 167 | camera.addCallbackBuffer(new byte[ImageUtils.getYUVByteSize(s.height, s.width)]); 168 | 169 | textureView.setAspectRatio(s.height, s.width); 170 | 171 | camera.startPreview(); 172 | } 173 | 174 | protected void stopCamera() { 175 | if (camera != null) { 176 | camera.stopPreview(); 177 | camera.setPreviewCallback(null); 178 | camera.release(); 179 | camera = null; 180 | } 181 | } 182 | 183 | private int getCameraId() { 184 | Camera.CameraInfo ci = new Camera.CameraInfo(); 185 | for (int i = 0; i < Camera.getNumberOfCameras(); i++) { 186 | Camera.getCameraInfo(i, ci); 187 | if (ci.facing == Camera.CameraInfo.CAMERA_FACING_BACK) return i; 188 | } 189 | return -1; // No camera found 190 | } 191 | } 192 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/java/com/example/tflite_yolov5_test/camera/env/BorderedText.java: -------------------------------------------------------------------------------- 1 | /* Copyright 2019 The TensorFlow Authors. All Rights Reserved. 2 | 3 | Licensed under the Apache License, Version 2.0 (the "License"); 4 | you may not use this file except in compliance with the License. 5 | You may obtain a copy of the License at 6 | 7 | http://www.apache.org/licenses/LICENSE-2.0 8 | 9 | Unless required by applicable law or agreed to in writing, software 10 | distributed under the License is distributed on an "AS IS" BASIS, 11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | See the License for the specific language governing permissions and 13 | limitations under the License. 
14 | ==============================================================================*/ 15 | 16 | package com.example.tflite_yolov5_test.camera.env; 17 | 18 | import android.graphics.Canvas; 19 | import android.graphics.Color; 20 | import android.graphics.Paint; 21 | import android.graphics.Paint.Align; 22 | import android.graphics.Paint.Style; 23 | import android.graphics.Rect; 24 | import android.graphics.Typeface; 25 | import java.util.Vector; 26 | 27 | /** A class that encapsulates the tedious bits of rendering legible, bordered text onto a canvas. */ 28 | public class BorderedText { 29 | private final Paint interiorPaint; 30 | private final Paint exteriorPaint; 31 | 32 | private final float textSize; 33 | 34 | /** 35 | * Creates a left-aligned bordered text object with a white interior, and a black exterior with 36 | * the specified text size. 37 | * 38 | * @param textSize text size in pixels 39 | */ 40 | public BorderedText(final float textSize) { 41 | this(Color.WHITE, Color.BLACK, textSize); 42 | } 43 | 44 | /** 45 | * Create a bordered text object with the specified interior and exterior colors, text size and 46 | * alignment. 
47 | * 48 | * @param interiorColor the interior text color 49 | * @param exteriorColor the exterior text color 50 | * @param textSize text size in pixels 51 | */ 52 | public BorderedText(final int interiorColor, final int exteriorColor, final float textSize) { 53 | interiorPaint = new Paint(); 54 | interiorPaint.setTextSize(textSize); 55 | interiorPaint.setColor(interiorColor); 56 | interiorPaint.setStyle(Style.FILL); 57 | interiorPaint.setAntiAlias(false); 58 | interiorPaint.setAlpha(255); 59 | 60 | exteriorPaint = new Paint(); 61 | exteriorPaint.setTextSize(textSize); 62 | exteriorPaint.setColor(exteriorColor); 63 | exteriorPaint.setStyle(Style.FILL_AND_STROKE); 64 | exteriorPaint.setStrokeWidth(textSize / 8); 65 | exteriorPaint.setAntiAlias(false); 66 | exteriorPaint.setAlpha(255); 67 | 68 | this.textSize = textSize; 69 | } 70 | 71 | public void setTypeface(Typeface typeface) { 72 | interiorPaint.setTypeface(typeface); 73 | exteriorPaint.setTypeface(typeface); 74 | } 75 | 76 | public void drawText(final Canvas canvas, final float posX, final float posY, final String text) { 77 | canvas.drawText(text, posX, posY, exteriorPaint); 78 | canvas.drawText(text, posX, posY, interiorPaint); 79 | } 80 | 81 | public void drawText( 82 | final Canvas canvas, final float posX, final float posY, final String text, Paint bgPaint) { 83 | 84 | float width = exteriorPaint.measureText(text); 85 | float textSize = exteriorPaint.getTextSize(); 86 | Paint paint = new Paint(bgPaint); 87 | paint.setStyle(Paint.Style.FILL); 88 | paint.setAlpha(160); 89 | canvas.drawRect(posX, (posY + (int) (textSize)), (posX + (int) (width)), posY, paint); 90 | 91 | canvas.drawText(text, posX, (posY + textSize), interiorPaint); 92 | } 93 | 94 | public void drawLines(Canvas canvas, final float posX, final float posY, Vector<String> lines) { 95 | int lineNum = 0; 96 | for (final String line : lines) { 97 | drawText(canvas, posX, posY - getTextSize() * (lines.size() - lineNum - 1), line); 98 | ++lineNum; 99 | } 100
| } 101 | 102 | public void setInteriorColor(final int color) { 103 | interiorPaint.setColor(color); 104 | } 105 | 106 | public void setExteriorColor(final int color) { 107 | exteriorPaint.setColor(color); 108 | } 109 | 110 | public float getTextSize() { 111 | return textSize; 112 | } 113 | 114 | public void setAlpha(final int alpha) { 115 | interiorPaint.setAlpha(alpha); 116 | exteriorPaint.setAlpha(alpha); 117 | } 118 | 119 | public void getTextBounds( 120 | final String line, final int index, final int count, final Rect lineBounds) { 121 | interiorPaint.getTextBounds(line, index, count, lineBounds); 122 | } 123 | 124 | public void setTextAlign(final Align align) { 125 | interiorPaint.setTextAlign(align); 126 | exteriorPaint.setTextAlign(align); 127 | } 128 | } 129 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/java/com/example/tflite_yolov5_test/camera/env/ImageUtils.java: -------------------------------------------------------------------------------- 1 | package com.example.tflite_yolov5_test.camera.env; 2 | import android.graphics.Bitmap; 3 | import android.graphics.Matrix; 4 | import android.os.Environment; 5 | 6 | import java.io.File; 7 | import java.io.FileOutputStream; 8 | 15 | /** Utility class for manipulating images. */ 16 | public class ImageUtils { 17 | // This value is 2 ^ 18 - 1, and is used to clamp the RGB values before their ranges 18 | // are normalized to eight bits. 19 | static final int kMaxChannelValue = 262143; 20 | 21 | 22 | /** 23 | * Utility method to compute the allocated size in bytes of a YUV420SP image of the given 24 | * dimensions. 25 | */ 26 | public static int getYUVByteSize(final int width, final int height) { 27 | // The luminance plane requires 1 byte per pixel.
28 | final int ySize = width * height; 29 | 30 | // The UV plane works on 2x2 blocks, so dimensions with odd size must be rounded up. 31 | // Each 2x2 block takes 2 bytes to encode, one each for U and V. 32 | final int uvSize = ((width + 1) / 2) * ((height + 1) / 2) * 2; 33 | 34 | return ySize + uvSize; 35 | } 36 | 37 | /** 38 | * Saves a Bitmap object to disk for analysis. 39 | * 40 | * @param bitmap The bitmap to save. 41 | */ 42 | public static void saveBitmap(final Bitmap bitmap) { 43 | saveBitmap(bitmap, "preview.png"); 44 | } 45 | 46 | /** 47 | * Saves a Bitmap object to disk for analysis. 48 | * 49 | * @param bitmap The bitmap to save. 50 | * @param filename The location to save the bitmap to. 51 | */ 52 | public static void saveBitmap(final Bitmap bitmap, final String filename) { 53 | final String root = 54 | Environment.getExternalStorageDirectory().getAbsolutePath() + File.separator + "tensorflow"; 55 | final File myDir = new File(root); 56 | 57 | if (!myDir.mkdirs()) { // the directory may already exist; if it truly could not be created, the write below will fail 58 | } 59 | 60 | final String fname = filename; 61 | final File file = new File(myDir, fname); 62 | if (file.exists()) { 63 | file.delete(); 64 | } 65 | try { 66 | final FileOutputStream out = new FileOutputStream(file); 67 | bitmap.compress(Bitmap.CompressFormat.PNG, 99, out); // the quality parameter is ignored for PNG 68 | out.flush(); 69 | out.close(); 70 | } catch (final Exception e) { 71 | e.printStackTrace(); } 72 | } 73 | 74 | public static void convertYUV420SPToARGB8888(byte[] input, int width, int height, int[] output) { 75 | final int frameSize = width * height; 76 | for (int j = 0, yp = 0; j < height; j++) { 77 | int uvp = frameSize + (j >> 1) * width; 78 | int u = 0; 79 | int v = 0; 80 | 81 | for (int i = 0; i < width; i++, yp++) { 82 | int y = 0xff & input[yp]; 83 | if ((i & 1) == 0) { 84 | v = 0xff & input[uvp++]; 85 | u = 0xff & input[uvp++]; 86 | } 87 | 88 | output[yp] = YUV2RGB(y, u, v); 89 | } 90 | } 91 | } 92 | 93 | private static int YUV2RGB(int y, int u, int v) { 94 | // Adjust and check YUV values 95 | y = (y - 16) < 0 ?
0 : (y - 16); 96 | u -= 128; 97 | v -= 128; 98 | 99 | // This is the floating point equivalent. We do the conversion in integer 100 | // because some Android devices do not have floating point in hardware. 101 | // nR = (int)(1.164 * nY + 1.596 * nV); 102 | // nG = (int)(1.164 * nY - 0.813 * nV - 0.391 * nU); 103 | // nB = (int)(1.164 * nY + 2.018 * nU); 104 | int y1192 = 1192 * y; 105 | int r = (y1192 + 1634 * v); 106 | int g = (y1192 - 833 * v - 400 * u); 107 | int b = (y1192 + 2066 * u); 108 | 109 | // Clipping RGB values to be inside boundaries [ 0 , kMaxChannelValue ] 110 | r = r > kMaxChannelValue ? kMaxChannelValue : (r < 0 ? 0 : r); 111 | g = g > kMaxChannelValue ? kMaxChannelValue : (g < 0 ? 0 : g); 112 | b = b > kMaxChannelValue ? kMaxChannelValue : (b < 0 ? 0 : b); 113 | 114 | return 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff); 115 | } 116 | 117 | public static void convertYUV420ToARGB8888( 118 | byte[] yData, 119 | byte[] uData, 120 | byte[] vData, 121 | int width, 122 | int height, 123 | int yRowStride, 124 | int uvRowStride, 125 | int uvPixelStride, 126 | int[] out) { 127 | int yp = 0; 128 | for (int j = 0; j < height; j++) { 129 | int pY = yRowStride * j; 130 | int pUV = uvRowStride * (j >> 1); 131 | 132 | for (int i = 0; i < width; i++) { 133 | int uv_offset = pUV + (i >> 1) * uvPixelStride; 134 | 135 | out[yp++] = YUV2RGB(0xff & yData[pY + i], 0xff & uData[uv_offset], 0xff & vData[uv_offset]); 136 | } 137 | } 138 | } 139 | 140 | /** 141 | * Returns a transformation matrix from one reference frame into another. Handles cropping (if 142 | * maintaining aspect ratio is desired) and rotation. 143 | * 144 | * @param srcWidth Width of source frame. 145 | * @param srcHeight Height of source frame. 146 | * @param dstWidth Width of destination frame. 147 | * @param dstHeight Height of destination frame. 148 | * @param applyRotation Amount of rotation to apply from one frame to another. Must be a multiple 149 | * of 90.
150 | * @param maintainAspectRatio If true, will ensure that scaling in x and y remains constant, 151 | * cropping the image if necessary. 152 | * @return The transformation fulfilling the desired requirements. 153 | */ 154 | public static Matrix getTransformationMatrix( 155 | final int srcWidth, 156 | final int srcHeight, 157 | final int dstWidth, 158 | final int dstHeight, 159 | final int applyRotation, 160 | final boolean maintainAspectRatio) { 161 | final Matrix matrix = new Matrix(); 162 | 163 | if (applyRotation != 0) { 164 | if (applyRotation % 90 != 0) { // rotations that are not a multiple of 90 are unsupported; proceed anyway 165 | } 166 | 167 | // Translate so center of image is at origin. 168 | matrix.postTranslate(-srcWidth / 2.0f, -srcHeight / 2.0f); 169 | 170 | // Rotate around origin. 171 | matrix.postRotate(applyRotation); 172 | } 173 | 174 | // Account for the already applied rotation, if any, and then determine how 175 | // much scaling is needed for each axis. 176 | final boolean transpose = (Math.abs(applyRotation) + 90) % 180 == 0; 177 | 178 | final int inWidth = transpose ? srcHeight : srcWidth; 179 | final int inHeight = transpose ? srcWidth : srcHeight; 180 | 181 | // Apply scaling if necessary. 182 | if (inWidth != dstWidth || inHeight != dstHeight) { 183 | final float scaleFactorX = dstWidth / (float) inWidth; 184 | final float scaleFactorY = dstHeight / (float) inHeight; 185 | 186 | if (maintainAspectRatio) { 187 | // Scale by the larger factor so that dst is filled completely while 188 | // maintaining the aspect ratio. Some of the image may fall off the edge. 189 | final float scaleFactor = Math.max(scaleFactorX, scaleFactorY); 190 | matrix.postScale(scaleFactor, scaleFactor); 191 | } else { 192 | // Scale exactly to fill dst from src. 193 | matrix.postScale(scaleFactorX, scaleFactorY); 194 | } 195 | } 196 | 197 | if (applyRotation != 0) { 198 | // Translate back from origin centered reference to destination frame.
199 | matrix.postTranslate(dstWidth / 2.0f, dstHeight / 2.0f); 200 | } 201 | 202 | return matrix; 203 | } 204 | } 205 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/java/com/example/tflite_yolov5_test/camera/tracker/MultiBoxTracker.java: -------------------------------------------------------------------------------- 1 | package com.example.tflite_yolov5_test.camera.tracker; 2 | 3 | import android.content.Context; 4 | import android.graphics.Canvas; 5 | import android.graphics.Color; 6 | import android.graphics.Matrix; 7 | import android.graphics.Paint; 8 | import android.graphics.RectF; 9 | import android.text.TextUtils; 10 | import android.util.Log; 11 | import android.util.Pair; 12 | import android.util.Size; 13 | import android.util.TypedValue; 14 | 15 | import java.util.LinkedList; 16 | import java.util.List; 17 | import java.util.Queue; 18 | 24 | import android.graphics.Paint.Cap; 25 | import android.graphics.Paint.Join; 26 | import android.graphics.Paint.Style; 33 | 34 | import com.example.tflite_yolov5_test.camera.env.BorderedText; 35 | import com.example.tflite_yolov5_test.camera.env.ImageUtils; 36 | import com.example.tflite_yolov5_test.TfliteRunner.Recognition; 37 | 38 | /** A tracker that handles non-max suppression and matches existing objects to new detections.
*/ 39 | public class MultiBoxTracker { 40 | private static final float TEXT_SIZE_DIP = 18; 41 | private static final float MIN_SIZE = 16.0f; 42 | private static final int[] COLORS = { 43 | Color.BLUE, 44 | Color.RED, 45 | Color.GREEN, 46 | Color.YELLOW, 47 | Color.CYAN, 48 | Color.MAGENTA, 49 | Color.WHITE, 50 | Color.parseColor("#55FF55"), 51 | Color.parseColor("#FFA500"), 52 | Color.parseColor("#FF8888"), 53 | Color.parseColor("#AAAAFF"), 54 | Color.parseColor("#FFFFAA"), 55 | Color.parseColor("#55AAAA"), 56 | Color.parseColor("#AA33AA"), 57 | Color.parseColor("#0D0068") 58 | }; 59 | final List<Pair<Float, RectF>> screenRects = new LinkedList<Pair<Float, RectF>>(); 60 | private final Queue<Integer> availableColors = new LinkedList<Integer>(); 61 | private final List<TrackedRecognition> trackedObjects = new LinkedList<TrackedRecognition>(); 62 | private final Paint boxPaint = new Paint(); 63 | private final float textSizePx; 64 | private final BorderedText borderedText; 65 | private Matrix frameToCanvasMatrix; 66 | private int detectorInputSize; 67 | private int frameWidth; 68 | private int frameHeight; 69 | private int sensorOrientation; 70 | 71 | public MultiBoxTracker(final Context context) { 72 | for (final int color : COLORS) { 73 | availableColors.add(color); 74 | } 75 | 76 | boxPaint.setColor(Color.RED); 77 | boxPaint.setStyle(Paint.Style.STROKE); 78 | boxPaint.setStrokeWidth(10.0f); 79 | boxPaint.setStrokeCap(Paint.Cap.ROUND); 80 | boxPaint.setStrokeJoin(Paint.Join.ROUND); 81 | boxPaint.setStrokeMiter(100); 82 | 83 | textSizePx = 84 | TypedValue.applyDimension( 85 | TypedValue.COMPLEX_UNIT_DIP, TEXT_SIZE_DIP, context.getResources().getDisplayMetrics()); 86 | borderedText = new BorderedText(textSizePx); 87 | } 88 | 89 | public synchronized void setFrameConfiguration( 90 | final Size frameSize, final int detectorInputSize, final int sensorOrientation) { 91 | this.detectorInputSize = detectorInputSize; 92 | // The camera sensor delivers landscape frames, so width and height are swapped here. 93 | frameWidth = frameSize.getHeight(); 94 | frameHeight = frameSize.getWidth(); 95 | this.sensorOrientation = sensorOrientation; 96 | }
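The frame configuration stored above feeds the box-scaling arithmetic in draw() below: boxes in detector input space (inputSize x inputSize) are scaled through the camera frame dimensions to preview-canvas pixels. A minimal standalone sketch of that mapping (the class and method names here are hypothetical, not part of the app; only the arithmetic mirrors the code):

```java
// Sketch of the detector-space -> canvas-space mapping used by MultiBoxTracker.draw().
public class BoxScaleSketch {
    /** Maps one (x, y) point from detector input space to preview-canvas pixels. */
    static float[] toCanvas(float x, float y,
                            int inputSize, int frameWidth, int frameHeight,
                            int canvasWidth, int canvasHeight) {
        // Letterbox scale from camera frame to canvas (smaller ratio wins).
        float frameToCanvas = Math.min(
                (float) canvasHeight / frameHeight, (float) canvasWidth / frameWidth);
        // Combined per-axis scale from detector input space to canvas.
        float scaleX = frameToCanvas * ((float) frameWidth / inputSize);
        float scaleY = frameToCanvas * ((float) frameHeight / inputSize);
        return new float[] {x * scaleX, y * scaleY};
    }

    public static void main(String[] args) {
        // 640x640 detector input, 1080x1920 portrait frame, same-size canvas.
        float[] p = toCanvas(320f, 320f, 640, 1080, 1920, 1080, 1920);
        System.out.println(p[0] + ", " + p[1]);
    }
}
```

With a 1080x1920 frame shown on a matching canvas, the detector-space center (320, 320) of a 640x640 input lands at (540, 960), the canvas center, as expected.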
97 | 98 | public synchronized void drawDebug(final Canvas canvas) { 99 | final Paint textPaint = new Paint(); 100 | textPaint.setColor(Color.WHITE); 101 | textPaint.setTextSize(60.0f); 102 | 103 | final Paint boxPaint = new Paint(); 104 | boxPaint.setColor(Color.RED); 105 | boxPaint.setAlpha(200); 106 | boxPaint.setStyle(Paint.Style.STROKE); 107 | 108 | for (final Pair<Float, RectF> detection : screenRects) { 109 | final RectF rect = detection.second; 110 | canvas.drawRect(rect, boxPaint); 111 | canvas.drawText("" + detection.first, rect.left, rect.top, textPaint); 112 | borderedText.drawText(canvas, rect.centerX(), rect.centerY(), "" + detection.first); 113 | } 114 | } 115 | 116 | public synchronized void trackResults(final List<Recognition> results) { 117 | processResults(results); 118 | } 119 | 120 | private Matrix getFrameToCanvasMatrix() { 121 | return frameToCanvasMatrix; 122 | } 123 | 124 | public synchronized void draw(final Canvas canvas) { 125 | // WARNING(from lp6m): the correct handling of sensorOrientation is unclear, so it is fixed to 0 for now.
126 | 127 | for (final TrackedRecognition recognition : trackedObjects) { 128 | float frameToCanvasScale = Math.min((float)canvas.getHeight() / frameHeight, (float)canvas.getWidth() / frameWidth); 129 | float scale_width = frameToCanvasScale * ((float)frameWidth / detectorInputSize); 130 | float scale_height = frameToCanvasScale * ((float)frameHeight / detectorInputSize); 131 | float x1 = recognition.location.left * scale_width; 132 | float y1 = recognition.location.top * scale_height; 133 | float x2 = recognition.location.right * scale_width; 134 | float y2 = recognition.location.bottom * scale_height; 135 | 136 | final RectF trackedPos = new RectF(x1, y1, x2, y2); 137 | boxPaint.setColor(recognition.color); 138 | 139 | float cornerSize = Math.min(trackedPos.width(), trackedPos.height()) / 8.0f; 140 | canvas.drawRoundRect(trackedPos, cornerSize, cornerSize, boxPaint); 141 | 142 | final String labelString = 143 | !TextUtils.isEmpty(recognition.title) 144 | ? String.format("%s %.2f", recognition.title, (100 * recognition.detectionConfidence)) 145 | : String.format("%.2f", (100 * recognition.detectionConfidence)); 146 | // borderedText.drawText(canvas, trackedPos.left + cornerSize, trackedPos.top, 147 | // labelString); 148 | borderedText.drawText( 149 | canvas, trackedPos.left + cornerSize, trackedPos.top, labelString + "%", boxPaint); 150 | } 151 | } 152 | 153 | private void processResults(final List<Recognition> results) { 154 | final List<Pair<Float, Recognition>> rectsToTrack = new LinkedList<Pair<Float, Recognition>>(); 155 | 156 | screenRects.clear(); 157 | final Matrix rgbFrameToScreen = new Matrix(getFrameToCanvasMatrix()); 158 | 159 | for (final Recognition result : results) { 160 | if (result.getLocation() == null) { 161 | continue; 162 | } 163 | final RectF detectionFrameRect = new RectF(result.getLocation()); 164 | 165 | final RectF detectionScreenRect = new RectF(); 166 | rgbFrameToScreen.mapRect(detectionScreenRect, detectionFrameRect); 167 | screenRects.add(new Pair<Float, RectF>(result.getConfidence(), detectionScreenRect));
168 | 169 | if (detectionFrameRect.width() < MIN_SIZE || detectionFrameRect.height() < MIN_SIZE) { 170 | continue; 171 | } 172 | 173 | rectsToTrack.add(new Pair<Float, Recognition>(result.getConfidence(), result)); 174 | } 175 | 176 | trackedObjects.clear(); 177 | if (rectsToTrack.isEmpty()) { 178 | return; 179 | } 180 | 181 | for (final Pair<Float, Recognition> potential : rectsToTrack) { 182 | final TrackedRecognition trackedRecognition = new TrackedRecognition(); 183 | trackedRecognition.detectionConfidence = potential.first; 184 | trackedRecognition.location = new RectF(potential.second.getLocation()); 185 | trackedRecognition.title = potential.second.getTitle(); 186 | trackedRecognition.color = COLORS[trackedObjects.size()]; 187 | trackedObjects.add(trackedRecognition); 188 | 189 | if (trackedObjects.size() >= COLORS.length) { 190 | break; 191 | } 192 | } 193 | } 194 | 195 | private static class TrackedRecognition { 196 | RectF location; 197 | float detectionConfidence; 198 | int color; 199 | String title; 200 | } 201 | } 202 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/java/com/example/tflite_yolov5_test/customview/AutoFitTextureView.java: -------------------------------------------------------------------------------- 1 | package com.example.tflite_yolov5_test.customview; 2 | 3 | import android.content.Context; 4 | import android.util.AttributeSet; 5 | import android.view.TextureView; 6 | 7 | 8 | /** A {@link TextureView} that can be adjusted to a specified aspect ratio.
*/ 9 | public class AutoFitTextureView extends TextureView { 10 | private int ratioWidth = 0; 11 | private int ratioHeight = 0; 12 | 13 | public AutoFitTextureView(final Context context) { 14 | this(context, null); 15 | } 16 | 17 | public AutoFitTextureView(final Context context, final AttributeSet attrs) { 18 | this(context, attrs, 0); 19 | } 20 | 21 | public AutoFitTextureView(final Context context, final AttributeSet attrs, final int defStyle) { 22 | super(context, attrs, defStyle); 23 | } 24 | 25 | /** 26 | * Sets the aspect ratio for this view. The size of the view will be measured based on the ratio 27 | * calculated from the parameters. Note that the actual sizes of parameters don't matter, that is, 28 | * calling setAspectRatio(2, 3) and setAspectRatio(4, 6) produce the same result. 29 | * 30 | * @param width Relative horizontal size 31 | * @param height Relative vertical size 32 | */ 33 | public void setAspectRatio(final int width, final int height) { 34 | if (width < 0 || height < 0) { 35 | throw new IllegalArgumentException("Size cannot be negative."); 36 | } 37 | ratioWidth = width; 38 | ratioHeight = height; 39 | requestLayout(); 40 | } 41 | 42 | @Override 43 | protected void onMeasure(final int widthMeasureSpec, final int heightMeasureSpec) { 44 | super.onMeasure(widthMeasureSpec, heightMeasureSpec); 45 | final int width = MeasureSpec.getSize(widthMeasureSpec); 46 | final int height = MeasureSpec.getSize(heightMeasureSpec); 47 | if (0 == ratioWidth || 0 == ratioHeight) { 48 | setMeasuredDimension(width, height); 49 | } else { 50 | if (width < height * ratioWidth / ratioHeight) { 51 | setMeasuredDimension(width, width * ratioHeight / ratioWidth); 52 | } else { 53 | setMeasuredDimension(height * ratioWidth / ratioHeight, height); 54 | } 55 | } 56 | } 57 | } 58 | 59 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/java/com/example/tflite_yolov5_test/customview/OverlayView.java:
-------------------------------------------------------------------------------- 1 | package com.example.tflite_yolov5_test.customview; 2 | 3 | import android.content.Context; 4 | import android.graphics.Canvas; 5 | import android.util.AttributeSet; 6 | import android.view.View; 7 | 8 | import java.util.LinkedList; 9 | import java.util.List; 10 | 18 | /** A simple View providing a render callback to other classes. */ 19 | public class OverlayView extends View { 20 | private final List<DrawCallback> callbacks = new LinkedList<DrawCallback>(); 21 | 22 | public OverlayView(final Context context, final AttributeSet attrs) { 23 | super(context, attrs); 24 | } 25 | 26 | public void addCallback(final DrawCallback callback) { 27 | callbacks.add(callback); 28 | } 29 | 30 | @Override 31 | public synchronized void draw(final Canvas canvas) { 32 | for (final DrawCallback callback : callbacks) { 33 | callback.drawCallback(canvas); 34 | } 35 | } 36 | 37 | /** Interface defining the callback for client classes.
*/ 38 | public interface DrawCallback { 39 | public void drawCallback(final Canvas canvas); 40 | } 41 | } 42 | -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/res/drawable-v24/ic_launcher_foreground.xml: -------------------------------------------------------------------------------- (XML content not preserved in this export) -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/res/drawable/ic_dashboard_black_24dp.xml: -------------------------------------------------------------------------------- (XML content not preserved in this export) -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/res/drawable/ic_home_black_24dp.xml: -------------------------------------------------------------------------------- (XML content not preserved in this export) -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/res/drawable/ic_launcher_background.xml: -------------------------------------------------------------------------------- (XML content not preserved in this export) -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/res/drawable/ic_notifications_black_24dp.xml: -------------------------------------------------------------------------------- (XML content not preserved in this export) -------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/res/layout/activity_camera.xml: -------------------------------------------------------------------------------- (XML content not preserved in this export)
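As an aside on ImageUtils above: getYUVByteSize and the fixed-point YUV2RGB conversion can be checked with a small standalone sketch (the class name is hypothetical; the arithmetic mirrors the app code):

```java
// Sketch checking ImageUtils-style YUV420SP buffer sizing and the integer YUV->ARGB path.
public class YuvSketch {
    static final int kMaxChannelValue = 262143; // 2^18 - 1

    /** Byte size of a YUV420SP buffer: 1 byte/pixel luma + 2 bytes per 2x2 chroma block. */
    static int getYUVByteSize(int width, int height) {
        int ySize = width * height;
        // Odd dimensions round up to the next 2x2 chroma block.
        int uvSize = ((width + 1) / 2) * ((height + 1) / 2) * 2;
        return ySize + uvSize;
    }

    /** Integer YUV->ARGB conversion using the same fixed-point constants as the app. */
    static int yuv2rgb(int y, int u, int v) {
        y = Math.max(y - 16, 0);
        u -= 128;
        v -= 128;
        int y1192 = 1192 * y;
        int r = y1192 + 1634 * v;
        int g = y1192 - 833 * v - 400 * u;
        int b = y1192 + 2066 * u;
        // Clamp to [0, kMaxChannelValue] before narrowing to 8 bits per channel.
        r = Math.min(Math.max(r, 0), kMaxChannelValue);
        g = Math.min(Math.max(g, 0), kMaxChannelValue);
        b = Math.min(Math.max(b, 0), kMaxChannelValue);
        return 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
    }

    public static void main(String[] args) {
        System.out.println(getYUVByteSize(640, 480));
        System.out.println(Integer.toHexString(yuv2rgb(255, 128, 128)));
    }
}
```

For a 640x480 frame this gives 307200 luma bytes plus 153600 chroma bytes; full-scale luma with neutral chroma maps to opaque white (0xffffffff), and y = 16 maps to black.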
-------------------------------------------------------------------------------- /app/tflite_yolov5_test/app/src/main/res/layout/activity_main.xml: -------------------------------------------------------------------------------- (XML content not preserved in this export)
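The measurement rule in AutoFitTextureView.onMeasure above can be isolated as a pure function for illustration (a minimal sketch; the class and method names below are hypothetical and not part of the app):

```java
// Sketch of AutoFitTextureView's aspect-ratio measurement rule: given the available
// width/height and a requested ratio, pick the largest size that fits while
// preserving ratioWidth:ratioHeight.
public class AspectRatioSketch {
    /** Returns {measuredWidth, measuredHeight}; a zero ratio means "fill the available space". */
    static int[] measure(int width, int height, int ratioWidth, int ratioHeight) {
        if (ratioWidth == 0 || ratioHeight == 0) {
            return new int[] {width, height};
        }
        if (width < height * ratioWidth / ratioHeight) {
            // Width is the limiting dimension.
            return new int[] {width, width * ratioHeight / ratioWidth};
        } else {
            // Height is the limiting dimension.
            return new int[] {height * ratioWidth / ratioHeight, height};
        }
    }

    public static void main(String[] args) {
        // A 4:3 preview on a 1080x1920 portrait surface is width-limited: 1080x810.
        int[] m = measure(1080, 1920, 4, 3);
        System.out.println(m[0] + "x" + m[1]);
    }
}
```

It also illustrates the doc comment's point that only the ratio matters: measure(1080, 1920, 2, 3) and measure(1080, 1920, 4, 6) agree (integer-division rounding aside).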